Created March 27, 2013 21:26
Quick Python script for multipart S3 uploads, for when you need to upload a file larger than 5 GB. Split the file with something like /usr/bin/split, then invoke this as s3_multipart_upload.py targetbucket targetfilename part1 part2 part3 (or a glob like part*). AWS credentials are taken from the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
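As a sketch of that workflow (file sizes shrunk for illustration; the bucket name and filenames are placeholders, not part of the gist):

```shell
# Create a stand-in for a large dump file, then split it into fixed-size pieces.
# For a real multi-GB file you would use something like: split -b 1G bigfile.dump part_
head -c 1048576 /dev/zero > bigfile.dump
split -b 262144 bigfile.dump part_
ls part_*        # part_aa part_ab part_ac part_ad
# Upload the pieces as a single S3 object (bucket name is a placeholder):
#   python s3_multipart_upload.py my-bucket bigfile.dump part_*
```

Note that S3 requires every part except the last to be at least 5 MiB, so pick a split size of 5 MiB or more for real uploads.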
import boto  # boto 2.x; reads AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY from the environment
import sys

bucketname = sys.argv[1]
filename = sys.argv[2]
parts = sys.argv[3:]

print('target=s3://{0}/{1}'.format(bucketname, filename))

conn = boto.connect_s3()
bucket = conn.lookup(bucketname)

# Start the multipart upload; one part is uploaded per local file, in argument order.
mp = bucket.initiate_multipart_upload(filename)

partnum = 0
for localfile in parts:
    with open(localfile, 'rb') as fp:
        partnum += 1  # S3 part numbers start at 1
        print('uploading part #{0} from {1}...'.format(partnum, localfile))
        mp.upload_part_from_file(fp, partnum)

# List the parts S3 has received before completing the upload.
print('\nsummary...')
print('part number: bytes')
for part in mp:
    print('{0}: {1}'.format(part.part_number, part.size))

mp.complete_upload()