
I'm trying to copy files from a vendor's S3 bucket to my S3 bucket using boto3. I'm using the STS service to assume a role to access the vendor's S3 bucket. I'm able to connect to the vendor bucket and get a listing of the bucket, but I run into a CopyObject operation: Access Denied error when copying to my bucket. Here is my script:

import boto3

session = boto3.session.Session(profile_name="s3_transfer")
sts_client = session.client("sts", verify=False)
assumed_role_object = sts_client.assume_role(
    RoleArn="arn:aws:iam::<accountid>:role/assumedrole",
    RoleSessionName="transfer_session",
    ExternalId="<ID>",
    DurationSeconds=18000,
)

creds = assumed_role_object["Credentials"]
src_s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
    verify=False,
)
paginator = src_s3.get_paginator("list_objects_v2")
# testing with just 2 items.
# TODO: Remove MaxItems once script works.
pages = paginator.paginate(
    Bucket="ven_bucket", Prefix="client", PaginationConfig={"MaxItems": 2, "PageSize": 1000}
)
dest_s3 = session.client("s3", verify=False)
src_prefix = "client"   # assumed; defined elsewhere in the original script
dest_prefix = "client"  # assumed; defined elsewhere in the original script
for page in pages:
    for obj in page["Contents"]:
        src_key = obj["Key"]
        des_key = dest_prefix + src_key[len(src_prefix):]
        src = {"Bucket": "ven_bucket", "Key": src_key}
        print(src)
        print(des_key)
        dest_s3.copy(src, "my-bucket", des_key, SourceClient=src_s3)

The line dest_s3.copy(...) is where I get the error. My AWS user has the following policy to allow copying to my bucket:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": [
                "s3:*"
            ],
            "Resource": [
                "arn:aws:s3:::my-bucket/*",
                "arn:aws:s3:::my-bucket/"
            ]
        }
    ]
}

I get the following error when running the above script.

botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the CopyObject operation: Access Denied

Satish
  • The credentials that you are using to invoke the copy operation need to be able to access the object in the source bucket and the object in the destination bucket. – jarmod May 30 '19 at 15:15
  • @jarmod I'm able to access the object in the source bucket using those credentials. And from the policy on the user account I should be able to copy to the destination account as well, unless I'm missing something. I'm able to download the files from the source to my local computer and then upload them to the destination using these credentials. The error occurs when trying to copy directly from source to destination. – Satish May 30 '19 at 15:21
  • If the credentials you are using for the copy have access to the source objects then why are you using temporary STS creds to list the source objects (using src_s3) ? – jarmod May 30 '19 at 16:35
  • @jarmod access to the source bucket is through the sts assumed role. – Satish May 30 '19 at 16:57
  • Unless I mis-read your code, access to the source bucket for the copy operation is not using the STS creds. It's using the s3_transfer creds. The copy operation uses one set of creds (and only one), and that's the creds associated with the dest_s3 client. – jarmod May 30 '19 at 19:26
  • @jarmod I'm using the sts creds to assume the role and [request temp creds](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_temp_request.html). Then I'm using those temp creds to create the source s3 client. If that's not the right way can you show how I should rewrite this? Thanks. – Satish May 30 '19 at 19:32
  • Let us [continue this discussion in chat](https://chat.stackoverflow.com/rooms/194194/discussion-between-jarmod-and-satish). – jarmod May 30 '19 at 20:27
  • did the accepted answer work for you? can you post the updated code, please? – saadi Mar 23 '20 at 18:23
  • @saadi What I did was create 2 s3 clients, one connected to the src account and one connected to the target account. Then I download the file from src using the src client and upload it to the target using the target client. – Satish Mar 24 '20 at 13:46

1 Answer


The CopyObject() command can be used to copy objects between buckets without having to upload/download. Basically, the two S3 buckets communicate with each other and transfer the data.

This command can also be used to copy between buckets that are in different regions and different AWS accounts.

If you wish to copy between buckets that belong to different AWS accounts, then you will need to use a single set of credentials that have:

  • GetObject permission on the source bucket
  • PutObject permission on the destination bucket

Also, please note that the CopyObject() command is sent to the destination account. The destination bucket effectively pulls the objects from the source bucket.

From your description, your code is assuming a role from the other account to gain read permission on the source bucket. Unfortunately, this is not sufficient for the CopyObject() command because the command must be sent to the destination bucket. (Yes, it is a little hard to discern this from the documentation. That is why the source bucket is specifically named, rather than the destination bucket.)

Therefore, in your situation, to be able to copy the objects, you will need to use a set of credentials from Account-B (the destination) that also has permission to read from Bucket-A (the source). This will require the vendor to modify the Bucket Policy associated with Bucket-A.
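
As an illustration, the vendor's Bucket Policy on Bucket-A would need to grant your account read access, roughly like this (the account ID and bucket name are placeholders):

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowDestinationAccountRead",
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::111122223333:root"},
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::ven_bucket",
                "arn:aws:s3:::ven_bucket/*"
            ]
        }
    ]
}
```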

If they do not wish to do this, then your only option is to download the objects using the assumed role and then separately upload the files to your own bucket using credentials from your own Account-B.

John Rotenstein