port to boto3 #11
base: master
Conversation
- streaming downloads, including from multi-part files
- streaming uploads, without file size limit (multi-part upload supported; see the sketch below)
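A rough sketch of what these two features look like with boto3 (the bucket name, keys, and credentials below are placeholders for illustration, not taken from this patch):

```python
import boto3

# Placeholder credentials and names, for illustration only.
session = boto3.Session(aws_access_key_id='AKIA...', aws_secret_access_key='...')
bucket = session.resource('s3').Bucket('my-bucket')

# Streaming download: the response body is a botocore StreamingBody,
# so the object never has to fit in memory (multi-part files included).
obj = bucket.Object('uploads/big-file.bin')
with open('big-file.bin', 'wb') as out:
    for chunk in obj.get()['Body'].iter_chunks(chunk_size=1024 * 1024):
        out.write(chunk)

# Streaming upload: upload_fileobj() transparently switches to a
# multi-part upload for large files, so there is no single-request
# size limit.
with open('big-file.bin', 'rb') as src:
    bucket.upload_fileobj(src, 'uploads/big-file.bin')
```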
@grahame great work! Thank you very much for fixing these issues. I hope we will have time to review and test it soon. I'll let you know!
@nibecker hey, just prodding for a review on this (I know I need to fix the tests up, but a review of just the basic concept would be good). I've been running this in prod, uploading and downloading terabytes of data, so I'm pretty sure it's solid and could be merged.
```python
return bucket

session = boto3.Session(aws_access_key_id=p_key, aws_secret_access_key=s_key)
s3 = session.resource('s3')
return s3.Bucket(bucket_name)
```
```python
def upload_to_key(self, filepath, upload_file, make_public=False):
```
I don't see any separation between public and private files (e.g. group image files should be public, and at least the private resources should be made private).
From my initial reading, I'm under the impression that this will depend on the bucket policies, and that all uploaded files will end up with those permissions. @grahame am I correct?
If so, everything being publicly accessible from the S3 links could be a security issue here.
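For reference, a minimal sketch of how per-object ACLs could make this explicit rather than bucket-wide (the helper signature here is hypothetical, and this assumes the bucket still accepts object ACLs; buckets with ACLs disabled would need a bucket policy instead):

```python
import boto3

def upload_to_key(bucket, filepath, key, make_public=False):
    # Hypothetical helper: set a canned ACL per object instead of
    # relying on whatever the bucket-wide policy happens to allow.
    acl = 'public-read' if make_public else 'private'
    bucket.upload_file(filepath, key, ExtraArgs={'ACL': acl})

# Placeholder bucket and key names, for illustration only.
bucket = boto3.Session().resource('s3').Bucket('my-bucket')
upload_to_key(bucket, 'group-image.png', 'storage/group-image.png', make_public=True)
```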
This patch addresses some issues encountered using ckanext-s3filestore in production.
I haven't updated the tests yet, as I can't figure out how to run them. If you could let me know whether this PR is likely to be merged, I'll update the tests.