Multi-TB Migration Assistance

Hi All,

I have been looking for a better way to do multi-terabyte migrations in the future, as using WebDAV / the Drive client and having to download > copy > sync takes a very long time.

I’ve been looking into alternative solutions, such as FTP: connecting it directly to a folder and then using an online sync tool to transfer the data, for example from Dropbox to FTP (currently this is done through WebDAV, but it is extremely slow).
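Something like this with rclone is roughly what I mean (untested on my side; the remote names are placeholders for remotes configured via rclone config):

# Copy straight from Dropbox to the FTP server, skipping the local
# download > copy > sync round trip. Both remotes need to be set up
# first with: rclone config
rclone copy dropbox:/migration ftp-server:/incoming --transfers 8 --progress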

Would anyone have any advice on how to set up an FTP connection that I could then use with SeaF-Import.sh to import into Seafile? (With SeaF-Import, which libraries would the files be put into?)

Thank you in advance :smiley:

Kind regards,
Mitch

Hello Buddy,

This is Johnson from KBS Technologies; please check my process below.

Run SeaF-Import.sh:

  • Execute the SeaF-Import.sh script, specifying the source directory on your FTP server and the target Seafile library into which you want to import the data.

The command looks like this:
./SeaF-Import.sh -s ftp://ftp_server/source_directory -d seafile://username:password@seafile_server/target_library
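If your copy of the script only accepts local paths, a common workaround is to mount the FTP server as a local folder first. A minimal sketch, assuming curlftpfs from the Ubuntu repositories; the paths are placeholders and the import flags vary by script version, so check ./SeaF-Import.sh --help:

# Mount the FTP server as a local directory via FUSE.
sudo apt-get install curlftpfs
mkdir -p /mnt/ftp_source
curlftpfs ftp://ftp_user:ftp_pass@ftp_server /mnt/ftp_source

# Point the import script at the mount instead of an ftp:// URL.
./SeaF-Import.sh -s /mnt/ftp_source -d seafile://username:password@seafile_server/target_library

# Unmount once the import has finished.
fusermount -u /mnt/ftp_source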

Please also check the documentation.

Thanks


Hey Johnson,

So just to confirm: I would set up the FTP server locally on the Ubuntu server, and then just use SeaF-Import to import the files by specifying their location? (I presumed this would be the case.)

My only query is: once a file has been imported into Seafile, does the script delete the source file? I only ask because I use cloud sync software for this (and I know FTP is, on average, around 2x faster than WebDAV); or would it be better to set up a dedicated file-transfer Ubuntu server to speed up the process?
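For reference, the safer pattern I had in mind is roughly this (untested; the paths are placeholders), so the originals are never touched until I've verified the import:

# rsync only copies by default; it does not delete the source unless
# --remove-source-files is passed, so the originals stay intact.
rsync -a --progress /mnt/ftp_source/ /srv/staging/migration/

# Import from the staging copy, then clean up manually once verified.
./SeaF-Import.sh -s /srv/staging/migration -d seafile://username:password@seafile_server/target_library
rm -rf /srv/staging/migration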

Kind regards,
Mitch