
I use a data processing pipeline constructed of

S3 + SNS + Lambda

because S3 cannot send notifications outside its storage region, so I used SNS to forward the S3 notifications to a Lambda function in another region.
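
For reference, the wiring is along these lines (the bucket name and topic ARN here are placeholders, not my real ones):

import boto3

# Point the bucket's event notifications at an SNS topic; the topic
# then invokes the Lambda function subscribed to it in the other region.
s3 = boto3.client("s3")
s3.put_bucket_notification_configuration(
    Bucket="my-input-bucket",
    NotificationConfiguration={
        "TopicConfigurations": [{
            "TopicArn": "arn:aws:sns:us-east-1:123456789012:s3-events",
            "Events": ["s3:ObjectCreated:*"],
        }]
    },
)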

The Lambda function is coded as follows:

from __future__ import print_function
import boto3


def lambda_handler(event, context):
    # Pull the bucket name and object key out of the S3 event record
    input_file_bucket = event["Records"][0]["s3"]["bucket"]["name"]
    input_file_key = event["Records"][0]["s3"]["object"]["key"]

    input_file_name = input_file_bucket + "/" + input_file_key

    # Fetch the object; this is the call that fails with AccessDenied
    s3 = boto3.resource("s3")
    obj = s3.Object(bucket_name=input_file_bucket, key=input_file_key)
    response = obj.get()

    return event  # echo the event back for testing

When I ran Save and Test, I got the following error:

{
  "stackTrace": [
    [
      "/var/task/lambda_function.py",
      20,
      "lambda_handler",
      "response = obj.get()"
    ],
    [
      "/var/runtime/boto3/resources/factory.py",
      394,
      "do_action",
      "response = action(self, *args, **kwargs)"
    ],
    [
      "/var/runtime/boto3/resources/action.py",
      77,
      "__call__",
      "response = getattr(parent.meta.client, operation_name)(**params)"
    ],
    [
      "/var/runtime/botocore/client.py",
      310,
      "_api_call",
      "return self._make_api_call(operation_name, kwargs)"
    ],
    [
      "/var/runtime/botocore/client.py",
      395,
      "_make_api_call",
      "raise ClientError(parsed_response, operation_name)"
    ]
  ],
  "errorType": "ClientError",
  "errorMessage": "An error occurred (AccessDenied) when calling the GetObject operation: Access Denied"
}

I configured the Lambda role with

full S3 access

and set the bucket policy on my target bucket to

everyone can do anything (list, delete, etc.)

It seems that I haven't set the policy correctly.

Hello lad

6 Answers


I had a similar problem; I solved it by attaching the appropriate policy to my user:

IAM -> Users -> Username -> Permissions -> Attach policy.

Also make sure you have added the correct access key and secret access key; you can do so using the AWS CLI.
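
If you prefer to do it programmatically, here is a minimal sketch, assuming a placeholder user name and the AWS managed AmazonS3FullAccess policy (scope it down in real use):

import boto3

# Attach the managed S3 policy to the IAM user whose credentials
# the code runs under (user name is hypothetical).
iam = boto3.client("iam")
iam.attach_user_policy(
    UserName="my-user",
    PolicyArn="arn:aws:iam::aws:policy/AmazonS3FullAccess",
)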

Amri

Omuthu's answer correctly identified my problem, but it didn't provide a solution, so I thought I'd do that.

It's possible that when you set up your permissions in IAM, you made something like this:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::test"
            ]
        }
    ]
}

Unfortunately, that's not correct. You need to apply the Object permissions to the objects in the bucket. So it has to look like this:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::test"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject"
            ],
            "Resource": [
                "arn:aws:s3:::test/*"
            ]
        }
    ]
}

Note the second ARN with the /* at the end of it.
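
If it helps, here is a sketch of attaching that corrected document to the Lambda's execution role with boto3 (the role and policy names are placeholders):

import json

import boto3

# s3:ListBucket applies to the bucket ARN itself, while the object
# actions apply to arn:aws:s3:::test/* (the objects inside it).
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": ["arn:aws:s3:::test"],
        },
        {
            "Effect": "Allow",
            "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
            "Resource": ["arn:aws:s3:::test/*"],
        },
    ],
}

boto3.client("iam").put_role_policy(
    RoleName="my-lambda-role",
    PolicyName="s3-test-bucket-access",
    PolicyDocument=json.dumps(policy),
)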

Rob Rose

It's possible that the specific S3 object you are looking for has limited permissions (a small diagnostic sketch follows the list):

  1. Object-level read permission on the S3 object is denied
  2. The role attached to the Lambda does not have permission to get/read S3 objects
  3. If access is granted using an S3 bucket policy, verify that read permissions are provided
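
A minimal diagnostic sketch, with placeholder bucket/key names, that dumps the object ACL and the bucket policy so you can see where read access is missing (the calls themselves require s3:GetObjectAcl and s3:GetBucketPolicy):

import boto3

s3 = boto3.client("s3")

# Case 1: object-level ACL, i.e. who can read this specific object?
print(s3.get_object_acl(Bucket="test", Key="path/to/object"))

# Case 3: does the bucket policy actually grant s3:GetObject?
print(s3.get_bucket_policy(Bucket="test")["Policy"])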
omuthu
    This is pretty vague. Could you possibly point out how one might address this issue? – Patrick Perini Jul 05 '16 at 21:09
    Two possibilities 1. S3 object level permission for read is denied 2. The role attached to lambda does not have permission to get/read S3 objects – omuthu Jul 08 '16 at 07:58
  • For me, adding `s3:GetObject` to the policy helped. – Vitaly Zdanevich Oct 04 '17 at 15:29
  • I had to write a bucket policy to grant access. I added: ` { "Version": "2012-10-17", "Statement": [ { "Sid": "Allow All", "Effect": "Allow", "Principal": { "AWS": [ "arn:aws:iam:::user/" ] }, "Action": [ "s3:GetObject", "s3:PutObject", "s3:PutObjectAcl" ], "Resource": "arn:aws:s3:::/*" } ] } ` – Pedro Dec 01 '18 at 18:20
  • Also worth pointing out that AWS S3 returns Access Denied for objects that don't exist so as to not reveal whether an object exists or not... – DrCord Aug 27 '20 at 22:21

Adding to Amri's answer: if your bucket is private and you have the credentials to access it, you can use `boto3.client`:

import boto3
s3 = boto3.client('s3', aws_access_key_id='ACCESS_KEY', aws_secret_access_key='SECRET_KEY')
response = s3.get_object(Bucket='BUCKET', Key='KEY')

*For the file s3://bucket/a/b/c/some.text, Bucket is 'bucket' and Key is 'a/b/c/some.text'.

---EDIT---

You can easily change the script to read the keys from environment variables, for instance, so they are not hardcoded; I left them inline for simplicity.
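
That variant might look like this (the environment variable names are just examples; boto3 also picks up the standard AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY variables automatically):

import os

import boto3

# Read the credentials from hypothetical environment variables
# instead of hardcoding them in the script.
s3 = boto3.client(
    's3',
    aws_access_key_id=os.environ['MY_ACCESS_KEY'],
    aws_secret_access_key=os.environ['MY_SECRET_KEY'],
)
response = s3.get_object(Bucket='BUCKET', Key='KEY')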

Tal Joffe

I had a similar problem; the difference was that the bucket was encrypted with a KMS key.

Fixed with: IAM -> Encryption keys -> YOUR_AWS_KMS_KEY -> grant access to your role or account.
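
On the role side, a minimal sketch of the extra permission the Lambda role needs for a KMS-encrypted bucket (role name and key ARN are placeholders):

import json

import boto3

# Reading SSE-KMS objects requires kms:Decrypt on the key,
# in addition to the usual s3:GetObject permission.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["kms:Decrypt"],
        "Resource": "arn:aws:kms:us-east-1:123456789012:key/YOUR_KEY_ID",
    }],
}

boto3.client("iam").put_role_policy(
    RoleName="my-lambda-role",
    PolicyName="allow-kms-decrypt",
    PolicyDocument=json.dumps(policy),
)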

MiaeKim

In my case, the Lambda I was running had a role blahblahRole, and blahblahRole didn't have permission on the S3 bucket.
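
A one-line sketch of granting such a role read access via the AWS managed policy:

import boto3

# Attach the AWS managed read-only S3 policy to the Lambda's
# execution role (role name taken from this answer).
boto3.client("iam").attach_role_policy(
    RoleName="blahblahRole",
    PolicyArn="arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess",
)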

VIPIN KUMAR