Local to S3 upload limitation: 5 GB #2
Comments
Although I am not familiar with the PHP language, ChatGPT and NewBing helped me complete the following changes to S3put(), and I am running them now. I am not sure if there are any other issues.
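(For reference, a minimal sketch of what a multipart-capable S3put() might look like with the AWS SDK for PHP's MultipartUploader; the function name S3put(), the $s3 client, and the $bucket/$key variables are assumptions for illustration here, not the script's actual code. AWS generally recommends multipart uploads for anything over ~100 MB, and requires them above the 5 GB single PutObject limit.)

```php
<?php
// Hypothetical sketch only: single-request upload for small files,
// MultipartUploader for anything near/over the 5 GB PutObject limit.
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

function S3put($s3, $bucket, $key, $localPath) {
    $fiveGiB = 5 * 1024 * 1024 * 1024;

    if (filesize($localPath) < $fiveGiB) {
        // Below the PutObject size limit a single request is fine.
        return $s3->putObject([
            'Bucket'     => $bucket,
            'Key'        => $key,
            'SourceFile' => $localPath,
        ]);
    }

    // Large file: let the SDK split it into parts and upload them.
    $uploader = new MultipartUploader($s3, $localPath, [
        'bucket' => $bucket,
        'key'    => $key,
    ]);

    try {
        return $uploader->upload();
    } catch (MultipartUploadException $e) {
        // Surface the failure to the caller; a retry could resume
        // from $e->getState() if desired.
        throw $e;
    }
}
```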
Added support for 'MultipartUploader', could you test? (Un-comment lines 17 & 18 and keep/set the rest of your config as usual.) PS: let me know how it went (in general)
Thanks for the update, I'm testing it on a 2 TB Nextcloud project.
Yes, it'll be a serious wait! When the script runs and you at most see a warning here and there and (eventually :P) it completes, then you should be able to simply set $TEST to zero; the database settings will then be converted to S3 usage. It shouldn't fail, but if it does, simply restore the SQL backup (do go to maintenance mode ON first!) and you are reset to "local usage", so your data should be safe :) And, well, yeah.. you will only know if it all went well once the run with $TEST=1 has completed (and only then re-run with $TEST=0).
Let me know if you run into any (more) trouble.
Why does running this script on a suspended project report "... on S3, but is older then local, upload..." for some files?
Hmm, if the upload succeeded then there must be a discrepancy in your database? The script uploads the file again if the timestamp of the file on disk is more recent than the timestamp of that file in your database.. have you set $SHOWINFO = 1 and $TEST to anything but 0? I suspect that's it, since the message is "XX/YY.zz on S3, but is older then local, upload...". You could do 'occ files:scan --all' (or set $DO_FILES_SCAN = 1); also setting $DO_FILES_CLEAN = 1 (occ files:cleanup) might be a good idea.. those are "check & clean" operations that can help in cleaning up discrepancies.. AD: running those two "set to 1" once should be enough.. it'll fix discrepancies.. once fixed..
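(Illustrative only: the re-upload decision described above roughly amounts to comparing the local file's mtime with the mtime Nextcloud has recorded for that file; the variable names and the $row lookup below are assumptions, not the script's actual code.)

```php
<?php
// Hypothetical illustration of the "newer than database" check described above.
// $row is assumed to come from oc_filecache (Nextcloud's file metadata table).
$localMtime = filemtime($localPath);   // timestamp of the file on disk
$dbMtime    = (int) $row['mtime'];     // timestamp Nextcloud has recorded

if ($localMtime > $dbMtime) {
    // The local copy looks newer than what the database knows about,
    // so it is uploaded again ("... on S3, but is older then local, upload...").
    S3put($s3, $bucket, $key, $localPath);
}
```

An 'occ files:scan --all' refreshes those recorded timestamps, which is why the scan/cleanup steps above help clear these messages.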
I didn't pay too much attention to this message, because it would only make me waste some traffic.
Ehm, I don't understand that question completely.. If everything went well with $TEST=1, you can set $TEST=0.. it'll then do a final check & sync (which should go quickly) and perform all the database changes to point your Nextcloud from local to S3, and the migration is completed!
AD: do keep in mind that the script will put your instance in maintenance mode! Users can not use the instance while it is in maintenance mode! Don't do this "by hand"; the order "of things" is important, and the script does that for you! There are a few ways you can use the script, so do carefully read what the script tells you.. sometimes you need to do something by hand (again, the order of things is important!!).
AD: this is "the big one".. if by any chance it fails, simply restore the backup and you will have reverted to a working local setup (and tell me what went wrong ;)
I have verified that the MultipartUpload works correctly. Thank you again for providing this project.
Migration completed? You are now "live with S3"? |
Yes, the migration is now complete, and the data is in S3.
Thank you very much for providing this project. I have encountered an issue where it tells me that the file I am trying to upload exceeds the size limit. After reviewing the AWS documentation, I found that uploading with the SDK supports a single file size of up to 5 GB, but it also provides a multipart upload method. I am looking forward to this feature being implemented in the project. Cheers!
https://github.com/awsdocs/aws-doc-sdk-examples/blob/main/php/example_code/s3/MultipartUpload.php