boto3: Upload or put object in S3 failing silently

I’ve been trying to upload files from a local folder into folders on S3 using Boto3, and it’s failing silently, with no indication of why the upload isn’t happening.

key_name = folder + '/'
s3_connect = boto3.client('s3', s3_bucket_region,)
# upload File to S3
for filename in os.listdir(folder):
    s3_name = key_name + filename
    print folder, filename, key_name, s3_name
    upload = s3_connect.upload_file(
        s3_name, s3_bucket, key_name,
    )

Printing upload just says “None”, with no other information. No upload happens. I’ve also tried using put_object:

put_obj = s3_connect.put_object(
    Bucket=s3_bucket,
    Key=key_name,
    Body=s3_name,
)

and I get an HTTP response code of 200 - but no files upload.

First, I’d love to solve this problem. Second, this doesn’t seem like the right behavior: if an upload doesn’t happen, there should be more information about why (although I imagine this might be a limitation of the API?)

About this issue

  • State: closed
  • Created 7 years ago
  • Reactions: 14
  • Comments: 37 (3 by maintainers)

Most upvoted comments

Having a leading slash in your path to the object will create a folder with a blank name in your bucket root, which holds the uploads. e.g.:

path = "/uploads/"
s3.Object(my_bucket, path + filename).put(Body=data)

will create an empty-named folder at the bucket root containing uploads (note the blank folder) and put the object in there rather than in /uploads. Since boto was also spamming the bucket root with odd files, I didn’t notice the empty folder until much later and lost a lot of time. Could be your problem, too?
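If leading slashes are the culprit, a small normalizer keeps them out of keys entirely. This is just a sketch - normalize_key is a made-up helper, not part of boto3:

```python
def normalize_key(path, filename):
    # S3 keys must not start with '/'; a leading slash creates an
    # empty-named "folder" at the bucket root in the console view.
    # Collapse empty segments from leading/trailing/doubled slashes.
    key = "/".join(p for p in path.split("/") if p)
    return key + "/" + filename if key else filename
```

Then `s3.Object(my_bucket, normalize_key(path, filename)).put(Body=data)` lands the object under uploads/ as intended.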

So just for your information (and perhaps this will help someone figure out what’s up), using the Service Resource instead of the client works fine. The upload into S3 “folders” works perfectly. Here’s some example code:

import boto3

testfile = 'test.csv'
bucket_name = 'bucket-name-here'
folder_name = 'folder-name'

key = folder_name + '/' + testfile
s3 = boto3.resource('s3')
bucket = s3.Bucket(bucket_name)

bucket.upload_file(testfile, key)

So at least there is a workaround for the problem with the client.

Having an issue with this now. It seems I had this before and referenced this issue. At the time, changing from the client to the resource resolved the issue for me.

But now I’m experiencing the same thing with the bucket Resource.

I’m trying to upload a single file with no directory in the key name - just {uuid4}.zip.

Any update on this?

If the region is not specified or is misconfigured, upload_fileobj fails silently.

conn = boto3.client('s3')
conn.upload_fileobj(content, bucket_name, filename)
return True

method PUT, HTTP response status code 400:

<?xml version="1.0" encoding="UTF-8"?>
<Error><Code>AuthorizationHeaderMalformed</Code><Message>The authorization
header is malformed; the region 'us-east-1' is wrong; expecting 'eu-west-1'</Message>
<Region>eu-west-1</Region><RequestId>AAAAA</RequestId><HostId>BBBBB</HostId></Error>
boto3==1.9.151
botocore==1.12.151

Could this be raised as an S3 exception or ClientError, please?

Same problem here - is there any solution to this?

Same issue here! I also tried s3_resource.Object(destination_bucket, destination_key).copy_from(CopySource=f'{source_bucket}/{source_key}'), but it didn’t work. s3_client.copy_object and s3_client.put_object didn’t work either.

does anybody have an update here?

My issue was caused by Lambda disabling internet access when an explicit VPC is configured. It has nothing to do with S3 or boto3. Sorry for the confusion!

https://medium.com/@philippholly/aws-lambda-enable-outgoing-internet-access-within-vpc-8dd250e11e12

I’m experiencing the same behavior. Please let me know if you would like my debug logs as well.

Could you share a sanitized version of your debug logs? That should have more information about what’s going on. You can enable debug logs by calling boto3.set_stream_logger('').

Why am I back here? Seeing the same issue now with s3client.upload_file(), and this isn’t new code.
I suspect this is because I just updated boto3 to 1.26.146.

Changed from upload_file() to upload_fileobj(), but I’m seeing the same silent failing.

Also tried setting the region via AWS_DEFAULT_REGION, but that had no effect on the silent failing.

I’m experiencing the same problem when using both client and resource objects. Have you experienced a similar issue with your workaround as well, @maxpearl?