
I have a bucket on S3, and a user who has been given full access to that bucket.

I can perform an ls command and see the files in the bucket, but downloading them fails with:

A client error (403) occurred when calling the HeadObject operation: Forbidden

I also attempted this with a user granted full S3 permissions through the IAM console. Same problem.

For reference, here is the IAM policy I have:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:ListAllMyBuckets",
            "Resource": "arn:aws:s3:::*"
        },
        {
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::mybucket",
                "arn:aws:s3:::mybucket/*"
            ]
        }
    ]
}

I also tried adding a bucket policy, and even making the bucket public, with no luck. From the console, I then tried to set individual permissions on the files in the bucket and got an error saying I cannot view the bucket, which is strange, since I was viewing it from the console when the message appeared and can ls anything in it.

EDIT: The files in my bucket were copied there from another bucket belonging to a different account, using credentials from my account. May or may not be relevant...

2nd EDIT: I just tried uploading, downloading, and copying my own files between this bucket and other buckets, and it works fine. The issue is specifically with the files placed there from another account's bucket.

Thanks!

MrSilverSnorkel

3 Answers


I think you need to make sure that the permissions are applied to the objects when moving/copying them between buckets, by using the bucket-owner-full-control canned ACL.

Here are the details about how to do this when moving or copying files as well as retroactively: https://aws.amazon.com/premiumsupport/knowledge-center/s3-bucket-owner-access/

Also, you can read about the various predefined grants here: http://docs.aws.amazon.com/AmazonS3/latest/dev/acl-overview.html#canned-acl
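
If you want to script the retroactive fix, here is a minimal boto3 sketch (my own, assuming the bucket name from the question; it must run under the uploading account's credentials, since only the object owner can change the object ACL):

import boto3

# Run as the account that uploaded the objects (the object owner):
# re-grant the bucket owner full control on every existing object.
s3 = boto3.client('s3')
bucket = 'mybucket'  # bucket name from the question

paginator = s3.get_paginator('list_objects_v2')
for page in paginator.paginate(Bucket=bucket):
    for obj in page.get('Contents', []):
        s3.put_object_acl(
            Bucket=bucket,
            Key=obj['Key'],
            ACL='bucket-owner-full-control',
        )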

wrj

The problem here stems from how you got the files into the bucket, specifically the credentials you used and/or the privileges you granted at upload time. I ran into a similar permissions issue when I had multiple AWS accounts, even though my bucket policy was quite open (as yours is here). I had accidentally used credentials from one account (call it A1) when uploading to a bucket owned by a different account (A2). Because of this, A1 kept the permissions on the objects and the bucket owner never got them. There are at least three ways to fix this at upload time:

  • Switch accounts. Run export AWS_DEFAULT_PROFILE=A2 or, for a more permanent change, edit ~/.aws/credentials and ~/.aws/config to move the correct credentials and configuration under [default]. Then re-upload.
  • Specify the other profile at time of upload: aws s3 cp foo s3://mybucket --profile A2
  • Open up the permissions to the bucket owner (doesn't require changing profiles): aws s3 cp foo s3://mybucket --acl bucket-owner-full-control

Note that the first two approaches require having a separate AWS profile. If you want to keep two sets of account credentials available to you, this is the way to go. You can set up a profile with your keys, region, etc. by running aws configure --profile Foo. See here for more info on Named Profiles.
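
If you upload from Python rather than the CLI, the same named profile can be selected with boto3. A small sketch (the profile name A2 and the file/key names are placeholders):

import boto3

# Equivalent of passing --profile A2 to the CLI; assumes a profile
# named "A2" exists in ~/.aws/credentials / ~/.aws/config.
session = boto3.Session(profile_name='A2')
s3 = session.client('s3')
s3.upload_file('foo', 'mybucket', 'foo')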

There are also slightly more involved ways to do this retroactively (post upload) which you can read about here.

watsonic

To ensure the appropriate permissions are set on newly added files, add a statement like this to your bucket policy:

[...]
{
    "Effect": "Allow",
    "Principal": {
        "AWS": "arn:aws:iam::123456789012::user/their-user"
    },
    "Action": [
        "s3:PutObject",
        "s3:PutObjectAcl"
    ],
    "Resource": "arn:aws:s3:::my-bucket/*"
}

Your bucket policy is even more open, so that's not what's blocking you.

However, the uploader needs to set the ACL for newly created files. Python example:

import boto3

client = boto3.client('s3')
local_file_path = '/home/me/data.csv'
bucket_name = 'my-bucket'
bucket_file_path = 'exports/data.csv'

# Upload the file and grant the bucket owner full control of the new object.
client.upload_file(
    local_file_path,
    bucket_name,
    bucket_file_path,
    ExtraArgs={'ACL': 'bucket-owner-full-control'}
)
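
As an optional sanity check (a sketch of mine, assuming the same bucket and key as above), the uploader can read the ACL back to confirm the bucket-owner grant is present:

import boto3

client = boto3.client('s3')
# Requires READ_ACP on the object, which the uploader has as its owner.
acl = client.get_object_acl(Bucket='my-bucket', Key='exports/data.csv')
for grant in acl['Grants']:
    print(grant['Permission'], grant['Grantee'])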

source: https://medium.com/artificial-industry/how-to-download-files-that-others-put-in-your-aws-s3-bucket-2269e20ed041 (disclaimer: written by me)

Nino van Hooff