Trying to sync a large number of files results in an error


#1

I’ve created a library for syncing my development source files. When the client tries to sync the vendor or node_modules folders (roughly 40k files combined per project), it fails with Error: Error occured in upload.

The client reports:
[01/11/18 13:58:34] http-tx-mgr.c(2980): Bad response code for POST https://***.com/seafhttp/repo/****/recv-fs/: 502.
[01/11/18 13:58:34] http-tx-mgr.c(3616): Failed to send fs objects for repo 0fcba839.
[01/11/18 13:58:34] http-tx-mgr.c(1132): Transfer repo '0fcba839': ('normal', 'fs') --> ('error', 'finished')
[01/11/18 13:58:34] sync-mgr.c(801): Repo 'Development' sync state transition from uploading to 'error': 'Error occured in upload.'.
I can’t find any errors in the logs folder; controller.log only shows that seaf-server needed a restart:
[01/11/18 13:58:35] seafile-controller.c(473): seaf-server need restart...
[01/11/18 13:58:35] seafile-controller.c(199): starting seaf-server ...
[01/11/18 13:58:35] seafile-controller.c(86): spawn_process: seaf-server -F /usr/local/www/haiwen/conf -c /usr/local/www/haiwen/ccnet -d /usr/local/www/haiwen/seafile-data -l /usr/local/www/haiwen/logs/seafile.log -P /usr/local/www/haiwen/pids/seaf-server.pid
Is there a setting or config option that allows syncing such a large number of files?


#2

I think it’s failing at your proxy server (Apache, nginx). The POST request is just too big; look at the proxy’s log files, there may be more info there.

A possible solution is to increase the max request size and max file upload size in the proxy settings, but please confirm my idea first.
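For nginx, those limits map to the `client_max_body_size` directive and the proxy timeouts. A minimal sketch (the values are only illustrative, adjust to your deployment):

```nginx
# Inside the server/location block that proxies to Seafile:
client_max_body_size 0;       # 0 disables nginx's request-body size limit
proxy_read_timeout   1200s;   # allow long-running transfer requests
proxy_send_timeout   1200s;
```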

I don’t know if I understand you correctly, but Seafile supports an ignore file, similar to Git:
https://www.seafile.com/en/help/ignore/
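A minimal seafile-ignore.txt, placed at the top level of the library, could look like this (the folder names are just examples from this thread; check the linked help page for the exact pattern rules, e.g. how directories are matched):

```
# seafile-ignore.txt - gitignore-style patterns
node_modules/
vendor/
```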


#3

I’ve looked at the nginx error logs and this shows up:
2018/01/14 20:46:11 [error] 84467#101884: *3 upstream prematurely closed connection while reading response header from upstream, client: 10.0.10.100, server: ***.***.com, request: "POST /seafhttp/repo/c6a9776b-12e9-4f87-bb93-173b95d73636/recv-fs/ HTTP/1.1", upstream: "http://127.0.0.1:8082/repo/c6a9776b-12e9-4f87-bb93-173b95d73636/recv-fs/", host: "***.***.com"
2018/01/14 20:46:13 [error] 84467#101884: *2 kevent() reported that connect() failed (61: Connection refused) while connecting to upstream, client: 10.0.10.100, server: ***.***.com, request: "POST /seafhttp/repo/head-commits-multi/ HTTP/1.1", upstream: "http://127.0.0.1:8082/repo/head-commits-multi/", host: "***.***.com"

Then right at the same time:
[01/14/18 20:46:13] seafile-controller.c(473): seaf-server need restart...
[01/14/18 20:46:13] seafile-controller.c(199): starting seaf-server ...
[01/14/18 20:46:13] seafile-controller.c(86): spawn_process: seaf-server -F /usr/local/www/haiwen/conf -c /usr/local/www/haiwen/ccnet -d /usr/local/www/haiwen/seafile-data -l /usr/local/www/haiwen/logs/seafile.log -P /usr/local/www/haiwen/pids/seaf-server.pid
[01/14/18 20:46:13] seafile-controller.c(101): spawned seaf-server, pid 84599

I don’t think it’s the proxy, as the issue was already happening before I set up the nginx proxy for SSL.


#4

I’ve been syncing my dev folder with roughly 80k files without any issue for more than a year now, so I don’t think there’s a general limitation.

Does your proxy do request buffering? If not, it might be worth trying: as I recall, Seafile only handles X requests at a time, and syncing such a large library may involve more concurrent requests than Seafile accepts.
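With nginx, request buffering is controlled per location. A sketch of what turning it on looks like (the location and upstream port here are the usual Seafile defaults, adjust as needed):

```nginx
location /seafhttp {
    rewrite ^/seafhttp(.*)$ $1 break;
    proxy_pass http://127.0.0.1:8082;
    # Read the whole client request body before forwarding it, so slow
    # clients don't hold fileserver connections open for entire uploads.
    proxy_request_buffering on;
}
```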


#5

I used the settings from this manual when setting up the proxy for SSL:
https://manual.seafile.com/deploy/https_with_nginx.html

So I’ve set the /seafhttp block to the following values, as per the manual:
location /seafhttp {
    rewrite ^/seafhttp(.*)$ $1 break;
    proxy_pass http://127.0.0.1:8082;
    client_max_body_size 0;
    proxy_connect_timeout 36000s;
    proxy_read_timeout 36000s;
    proxy_request_buffering off;
    proxy_send_timeout 36000s;
    send_timeout 36000s;
}

I’m open to suggestions. Removing proxy_request_buffering off; doesn’t seem to have any beneficial effect.

I’ve used a seafile-ignore.txt to exclude some folders that don’t strictly need to be synced (though I’d like them to be), and my development files do sync now. However, I’m now running into the same issue syncing my photo library: folders with large 4-6 GB videos and JPEGs I haven’t archived yet work, but it fails with the same error when it gets to my macOS Photos.photoslibrary package.
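As a temporary workaround while debugging, a single ignore entry for the photo library package would skip it (assuming it sits at the library root and that directory patterns work as described on the Seafile ignore help page):

```
# seafile-ignore.txt at the library root
Photos.photoslibrary/
```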


#6

Turn off gzip in nginx.conf and try that.
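In nginx that is a single directive; gzip is typically enabled in the http block of nginx.conf, so disabling it looks like:

```nginx
http {
    gzip off;   # disable response compression while testing
    # ... rest of the existing configuration ...
}
```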


#7

Unfortunately gzip was already off.