[1017] Upload and download all folders in an Amazon S3 bucket
To upload all folders to an Amazon S3 bucket while preserving the local folder structure, you can use the Boto3 library. Here’s how:
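Both snippets below assume that Boto3 is installed (`pip install boto3`) and that AWS credentials are available to it, for example via `aws configure` or the standard `AWS_ACCESS_KEY_ID`/`AWS_SECRET_ACCESS_KEY` environment variables.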
- Uploading Folders:
  - Use the `upload_file()` method to upload files to your S3 bucket.
  - Iterate through the local folders and files, and upload them to the corresponding S3 paths.
  - Example code snippet:
```python
import os

import boto3

s3 = boto3.resource('s3')
LOCAL_FOLDER_PATH = 'I:\\IMAGERY'
BUCKET_NAME = 'li.imagery'
S3_PREFIX = 'desired-prefix/'  # Optional: prefix prepended to every S3 key

def upload_folder(local_path, s3_prefix=''):
    for root, dirs, files in os.walk(local_path):
        print(root)
        for file in files:
            local_file_path = os.path.join(root, file)
            # Build the S3 key from the path relative to the local root,
            # using forward slashes so the folder structure is kept in S3.
            relative_path = os.path.relpath(local_file_path, local_path)
            s3_key = (s3_prefix + relative_path).replace(os.sep, '/')
            # Skip paths longer than the classic Windows 256-character limit
            if len(local_file_path) < 256:
                s3.Bucket(BUCKET_NAME).upload_file(local_file_path, s3_key)

upload_folder(LOCAL_FOLDER_PATH, S3_PREFIX)
```
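As a quick, optional sanity check after the upload, you can list a few of the keys that landed under the prefix. This is a minimal sketch assuming the `s3`, `BUCKET_NAME`, and `S3_PREFIX` names from the snippet above:

```python
# Optional sanity check: print the first few keys uploaded under the prefix.
# Assumes s3, BUCKET_NAME, and S3_PREFIX from the upload snippet above.
for obj in s3.Bucket(BUCKET_NAME).objects.filter(Prefix=S3_PREFIX).limit(5):
    print(obj.key, obj.size)
```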
- Downloading with Same Structure:
  - To download files from S3 with the same structure, use the `download_file()` method.
  - Example code snippet:
```python
# Reuses the `s3` resource and `os` import from the upload snippet above.
LOCAL_FOLDER_PATH = r'D:\...\2024\AWS_uploading_downloading\test'

def download_all_files(bucket_name, local_path):
    my_bucket = s3.Bucket(bucket_name)
    for s3_object in my_bucket.objects.all():
        filename = s3_object.key
        print(filename)
        # S3 keys use forward slashes; recreate any folder hierarchy locally
        if "/" in filename:
            folder = os.path.dirname(filename)
            os.makedirs(os.path.join(local_path, folder), exist_ok=True)
        my_bucket.download_file(s3_object.key, os.path.join(local_path, filename))

# Usage:
download_all_files('li.imagery', LOCAL_FOLDER_PATH)
```
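If you only need part of the bucket rather than everything, one variant is to filter by key prefix instead of iterating over all objects. This is a minimal sketch, not from the original answer; `download_prefix` is a hypothetical helper name:

```python
import os

import boto3

s3 = boto3.resource('s3')

# Hypothetical helper: download only the objects under a given key prefix.
def download_prefix(bucket_name, prefix, local_path):
    my_bucket = s3.Bucket(bucket_name)
    for s3_object in my_bucket.objects.filter(Prefix=prefix):
        if s3_object.key.endswith('/'):
            continue  # skip zero-byte "folder" placeholder objects
        target = os.path.join(local_path, s3_object.key)
        os.makedirs(os.path.dirname(target), exist_ok=True)
        my_bucket.download_file(s3_object.key, target)

# Usage with a hypothetical prefix and local path:
# download_prefix('li.imagery', 'desired-prefix/', r'D:\downloads')
```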
Remember to replace `LOCAL_FOLDER_PATH`, `BUCKET_NAME`, and other placeholders with your actual values. Happy coding! 😊