Sorry for taking this long to reply. I gave up! No matter what I tried in terms of cleaning up, I was not able to resolve the issue.
So I decided to start all over - once again! - and this time I made sure NOT to start seafile/seahub before every single S3-related config step was in place.
At the same time, I decided to go with version 11 instead.
The only two repos/libs that currently exist are the admin user's My Library, created when seahub did its initial launch, and of course the Template system library.
So at this point I have no “lingering” users or libraries left over from the move from local disks to S3 buckets, since I never started seafile until all of the S3 config was in place.
As for virus_scan, it failed to scan because it assumes it can read the buckets via http://127.0.0.1, while my MinIO S3 is listening on port 9000.
However, I already specify port 9000 in seafile.conf (I have intentionally removed key_id and key before posting it here):
[commit_object_backend]
name = s3
bucket = seafile-commit-objects
host = 127.0.0.1:9000
path_style_request = true

[fs_object_backend]
name = s3
bucket = seafile-fs-objects
host = 127.0.0.1:9000
path_style_request = true

[block_backend]
name = s3
bucket = seafile-block-objects
host = 127.0.0.1:9000
path_style_request = true

[virus_scan]
scan_command = clamscan
virus_code = 1
nonvirus_code = 0
scan_interval = 120
scan_size_limit = 7
But no problem. I simply added the following to the default virtual host in my Apache config to work around the port issue:
RewriteEngine on
RewriteRule ^/(.*)$ http://127.0.0.1:9000/$1 [R,L]
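(Side note on this workaround: the [R] flag makes Apache answer with an external 302 redirect, which the S3 client then has to follow on its own. If mod_proxy and mod_proxy_http are enabled - an assumption on my part, and I have not verified this changes the scan result - a proxy rule would instead forward the request to MinIO internally, so the client never sees a redirect at all:)

```apache
# Sketch: proxy internally instead of redirecting, so the S3 client
# never receives a 302 (requires mod_proxy + mod_proxy_http).
RewriteEngine on
RewriteRule ^/(.*)$ http://127.0.0.1:9000/$1 [P,L]
```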
NOW things really took off! It also produced about a million DEBUG output lines (is there an easy way to reduce the verbosity to just INFO or WARNING, perhaps?).
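In case the DEBUG flood comes from botocore/boto3 itself rather than from Seafile, one generic way to quiet it would be raising the AWS SDK loggers' levels - this assumes the scanner's Python logging setup can be adjusted somewhere, which I have not confirmed for seafile's virus_scan:

```python
import logging

# Raise the chatty AWS SDK (and urllib3) loggers to WARNING;
# the virus-scan WARNING lines themselves remain visible.
for name in ("boto3", "botocore", "urllib3"):
    logging.getLogger(name).setLevel(logging.WARNING)
```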
The virus scan now ends with one warning per repo:
[04/10/2024 22:29:30] [WARNING] Failed to scan virus for repo 2d2b4ac0: Failed to read object 2d2b4ac0-268e-4d91-8f69-a7960c86ed51/e9850cbfc7f9eabfdbe1b832003bc0fb83728eb4: maximum recursion depth exceeded.
[04/10/2024 22:29:30] [WARNING] Failed to scan virus for repo caab0677: Failed to read object caab0677-d479-402e-ba57-1f023fa05919/d28ffdec771e9e676394ef8f7be00d78df41186f: maximum recursion depth exceeded.
More output here:
[...]
[04/10/2024 22:29:30] [DEBUG] Event before-endpoint-resolution.s3: calling handler <bound method S3RegionRedirectorv2.redirect_from_cache of <botocore.utils.S3RegionRedirectorv2 object at 0x7fd11aa474d0>>
[04/10/2024 22:29:30] [DEBUG] Event request-created.s3.HeadBucket: calling handler <function add_retry_headers at 0x7fd12a288d60>
[04/10/2024 22:29:30] [DEBUG] Calling endpoint provider with parameters: {'Bucket': 'seafile-commit-objects', 'Region': 'us-east-1', 'UseFIPS': False, 'UseDualStack': False, 'Endpoint': 'http://127.0.0.1', 'ForcePathStyle': True, 'Accelerate': False, 'UseGlobalEndpoint': True, 'DisableMultiRegionAccessPoints': False, 'UseArnRegion': True}
[04/10/2024 22:29:30] [DEBUG] Sending http request: <AWSPreparedRequest stream_output=False, method=HEAD, url=http://127.0.0.1/seafile-commit-objects, headers={'User-Agent': b'Boto3/1.34.29 md/Botocore#1.34.29 ua/2.0 os/linux#6.1.0-13-amd64 md/arch#x86_64 lang/python#3.11.2 md/pyimpl#CPython cfg/retry-mode#legacy Botocore/1.34.29', 'Date': b'Wed, 10 Apr 2024 20:29:30 GMT', 'Authorization': b'AWS seafile:KGRlZtgGXoXPQprcFx1azsVSoD8=', 'amz-sdk-invocation-id': b'9268375c-f796-49a9-8dd4-7d0d187403e8', 'amz-sdk-request': b'attempt=1'}>
[04/10/2024 22:29:30] [DEBUG] Endpoint provider result: http://127.0.0.1/seafile-commit-objects
[04/10/2024 22:29:30] [DEBUG] Selecting from endpoint provider's list of auth schemes: "sigv4". User selected auth scheme is: "s3"
[04/10/2024 22:29:30] [DEBUG] Event before-call.s3.HeadBucket: calling handler <function add_expect_header at 0x7fd12a243060>
[04/10/2024 22:29:30] [DEBUG] Event before-call.s3.HeadBucket: calling handler <bound method S3ExpressIdentityResolver.apply_signing_cache_key of <botocore.utils.S3ExpressIdentityResolver object at 0x7fd11aa4b710>>
[04/10/2024 22:29:30] [DEBUG] http://127.0.0.1:80 "HEAD /seafile-commit-objects HTTP/1.1" 302 0
[04/10/2024 22:29:30] [DEBUG] Event before-call.s3.HeadBucket: calling handler <function add_recursion_detection_header at 0x7fd12a241a80>
[04/10/2024 22:29:30] [DEBUG] Response headers: {'Date': 'Wed, 10 Apr 2024 20:29:30 GMT', 'Server': 'Apache/2.4.57 (Debian)', 'Location': 'http://127.0.0.1:9000/seafile-commit-objects', 'Content-Type': 'text/html; charset=iso-8859-1'}
[04/10/2024 22:29:30] [DEBUG] Event before-call.s3.HeadBucket: calling handler <function inject_api_version_header_if_needed at 0x7fd12a2885e0>
[04/10/2024 22:29:30] [DEBUG] Response body:
b''
[04/10/2024 22:29:30] [DEBUG] Making request for OperationModel(name=HeadBucket) with params: {'url_path': '', 'query_string': {}, 'method': 'HEAD', 'headers': {'User-Agent': 'Boto3/1.34.29 md/Botocore#1.34.29 ua/2.0 os/linux#6.1.0-13-amd64 md/arch#x86_64 lang/python#3.11.2 md/pyimpl#CPython cfg/retry-mode#legacy Botocore/1.34.29'}, 'body': b'', 'auth_path': '/seafile-commit-objects/', 'url': 'http://127.0.0.1/seafile-commit-objects', 'context': {'client_region': 'us-east-1', 'client_config': <botocore.config.Config object at 0x7fd11aa28a10>, 'has_streaming_input': False, 'auth_type': None, 's3_redirect': {'redirected': False, 'bucket': 'seafile-commit-objects', 'params': {'Bucket': 'seafile-commit-objects'}}, 'S3Express': {'bucket_name': 'seafile-commit-objects'}, 'signing': {}, 'endpoint_properties': {'authSchemes': [{'disableDoubleEncoding': True, 'name': 'sigv4', 'signingName': 's3', 'signingRegion': 'us-east-1'}]}}}
[04/10/2024 22:29:30] [DEBUG] Event needs-retry.s3.HeadBucket: calling handler <botocore.retryhandler.RetryHandler object at 0x7fd11b618090>
[04/10/2024 22:29:30] [DEBUG] Event request-created.s3.HeadBucket: calling handler <bound method RequestSigner.handler of <botocore.signers.RequestSigner object at 0x7fd11aa289d0>>
[04/10/2024 22:29:30] [DEBUG] No retry needed.
[04/10/2024 22:29:30] [DEBUG] Event needs-retry.s3.HeadBucket: calling handler <bound method S3RegionRedirectorv2.redirect_from_error of <botocore.utils.S3RegionRedirectorv2 object at 0x7fd11aa3c810>>
[04/10/2024 22:29:30] [WARNING] Failed to scan virus for repo 2d2b4ac0: Failed to read object 2d2b4ac0-268e-4d91-8f69-a7960c86ed51/e9850cbfc7f9eabfdbe1b832003bc0fb83728eb4: maximum recursion depth exceeded.
[04/10/2024 22:29:30] [DEBUG] Event before-parameter-build.s3.HeadBucket: calling handler <function validate_bucket_name at 0x7fd12a242ca0>
[04/10/2024 22:29:30] [DEBUG] Event before-parameter-build.s3.HeadBucket: calling handler <function remove_bucket_from_url_paths_from_model at 0x7fd12a288e00>
[04/10/2024 22:29:30] [DEBUG] Event before-parameter-build.s3.HeadBucket: calling handler <bound method S3RegionRedirectorv2.annotate_request_context of <botocore.utils.S3RegionRedirectorv2 object at 0x7fd11aa3c810>>
[04/10/2024 22:29:30] [DEBUG] Event before-parameter-build.s3.HeadBucket: calling handler <bound method S3ExpressIdentityResolver.inject_signing_cache_key of <botocore.utils.S3ExpressIdentityResolver object at 0x7fd11b618050>>
[04/10/2024 22:29:30] [DEBUG] Event before-parameter-build.s3.HeadBucket: calling handler <function generate_idempotent_uuid at 0x7fd12a242ac0>
[04/10/2024 22:29:30] [DEBUG] Event before-endpoint-resolution.s3: calling handler <function customize_endpoint_resolver_builtins at 0x7fd12a288fe0>
[04/10/2024 22:29:30] [DEBUG] Event before-endpoint-resolution.s3: calling handler <bound method S3RegionRedirectorv2.redirect_from_cache of <botocore.utils.S3RegionRedirectorv2 object at 0x7fd11aa3c810>>
[04/10/2024 22:29:30] [DEBUG] Calling endpoint provider with parameters: {'Bucket': 'seafile-commit-objects', 'Region': 'us-east-1', 'UseFIPS': False, 'UseDualStack': False, 'Endpoint': 'http://127.0.0.1', 'ForcePathStyle': True, 'Accelerate': False, 'UseGlobalEndpoint': True, 'DisableMultiRegionAccessPoints': False, 'UseArnRegion': True}
[04/10/2024 22:29:30] [DEBUG] Endpoint provider result: http://127.0.0.1/seafile-commit-objects
[04/10/2024 22:29:30] [DEBUG] Selecting from endpoint provider's list of auth schemes: "sigv4". User selected auth scheme is: "s3"
[04/10/2024 22:29:30] [DEBUG] Event before-call.s3.HeadBucket: calling handler <function add_expect_header at 0x7fd12a243060>
[04/10/2024 22:29:30] [DEBUG] Event before-call.s3.HeadBucket: calling handler <bound method S3ExpressIdentityResolver.apply_signing_cache_key of <botocore.utils.S3ExpressIdentityResolver object at 0x7fd11b618050>>
[04/10/2024 22:29:30] [DEBUG] Event before-call.s3.HeadBucket: calling handler <function add_recursion_detection_header at 0x7fd12a241a80>
[04/10/2024 22:29:30] [DEBUG] Event before-call.s3.HeadBucket: calling handler <function inject_api_version_header_if_needed at 0x7fd12a2885e0>
[04/10/2024 22:29:30] [DEBUG] Making request for OperationModel(name=HeadBucket) with params: {'url_path': '', 'query_string': {}, 'method': 'HEAD', 'headers': {'User-Agent': 'Boto3/1.34.29 md/Botocore#1.34.29 ua/2.0 os/linux#6.1.0-13-amd64 md/arch#x86_64 lang/python#3.11.2 md/pyimpl#CPython cfg/retry-mode#legacy Botocore/1.34.29'}, 'body': b'', 'auth_path': '/seafile-commit-objects/', 'url': 'http://127.0.0.1/seafile-commit-objects', 'context': {'client_region': 'us-east-1', 'client_config': <botocore.config.Config object at 0x7fd11aec0990>, 'has_streaming_input': False, 'auth_type': None, 's3_redirect': {'redirected': False, 'bucket': 'seafile-commit-objects', 'params': {'Bucket': 'seafile-commit-objects'}}, 'S3Express': {'bucket_name': 'seafile-commit-objects'}, 'signing': {}, 'endpoint_properties': {'authSchemes': [{'disableDoubleEncoding': True, 'name': 'sigv4', 'signingName': 's3', 'signingRegion': 'us-east-1'}]}}}
[04/10/2024 22:29:30] [DEBUG] Event request-created.s3.HeadBucket: calling handler <bound method RequestSigner.handler of <botocore.signers.RequestSigner object at 0x7fd11aa122d0>>
[04/10/2024 22:29:30] [WARNING] Failed to scan virus for repo caab0677: Failed to read object caab0677-d479-402e-ba57-1f023fa05919/d28ffdec771e9e676394ef8f7be00d78df41186f: maximum recursion depth exceeded.