I tried this:
import boto3
from boto3.s3.transfer import TransferConfig, S3Transfer

path = "/temp/"
fileName = "bigFile.gz"  # this happens to be a 5.9 Gig file
client = boto3.client('s3', region)
config = TransferConfig(
    multipart_threshold=4*1024,  # number of bytes
    max_concurrency=10,
    num_download_attempts=10,
)
transfer = S3Transfer(client, config)
transfer.upload_file(path + fileName, 'bucket', 'key')
Result: a 5.9 gig file on S3. It doesn't appear to consist of multiple parts.
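For what it's worth, a completed multipart upload is still stored and listed as a single object, so the console won't show separate parts either way. One common heuristic (not an official guarantee) is that S3 reports an ETag of the form `<hash>-<part count>` for objects that were uploaded in parts. A small sketch of that check:

```python
def etag_is_multipart(etag):
    """Heuristic: S3 typically reports ETags like '<md5>' for single-part
    uploads and '<hash>-<N>' (N = number of parts) for multipart uploads."""
    return '-' in etag.strip('"')

# Usage sketch (assumes AWS credentials are configured; bucket/key are placeholders):
# import boto3
# s3 = boto3.client('s3')
# etag = s3.head_object(Bucket='bucket', Key='key')['ETag']
# print(etag_is_multipart(etag))
```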
I found this example, but the variable part is not defined.
import boto3

bucket = 'bucket'
path = "/temp/"
fileName = "bigFile.gz"
key = 'key'
s3 = boto3.client('s3')

# Initiate the multipart upload and send the part(s)
mpu = s3.create_multipart_upload(Bucket=bucket, Key=key)
with open(path + fileName, 'rb') as data:
    part1 = s3.upload_part(Bucket=bucket,
                           Key=key,
                           PartNumber=1,
                           UploadId=mpu['UploadId'],
                           Body=data)

# Next, we need to gather information about each part to complete
# the upload. Needed are the part number and ETag.
part_info = {
    'Parts': [
        {
            'PartNumber': 1,
            'ETag': part['ETag']
        }
    ]
}

# Now the upload works!
s3.complete_multipart_upload(Bucket=bucket,
                             Key=key,
                             UploadId=mpu['UploadId'],
                             MultipartUpload=part_info)
Question: does anyone know how to do a multipart upload with boto3?