I'm writing a DAG that makes use of RedShiftToS3Operator, but I'm implementing the operator's code myself instead of using the operator directly; the only modification is specifying the schema in PostgresHook. Airflow is running on an EC2 instance and has the aws_default connection defined. This is properly configured in AWS with a proper IAM role, and yet I get the following error:

error:  S3ServiceException:The AWS Access Key Id you provided does not exist in our records.,Status 403,Error InvalidAccessKeyId.

The region is specified in the connection's extras field, if anyone's wondering. I've tried the solutions mentioned here and here, but I still keep facing this issue. I don't want to expose the credentials in the UI or put them in a file on the EC2 instance. Any help is much appreciated, thank you!
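
For reference, the task does roughly the following (a minimal sketch with placeholder table, bucket, and connection names, not the exact code):

```python
from airflow.contrib.hooks.aws_hook import AwsHook
from airflow.hooks.postgres_hook import PostgresHook


def unload_table_to_s3(**context):
    # Resolve temporary credentials for aws_default (instance-profile role on the EC2 box)
    credentials = AwsHook(aws_conn_id="aws_default").get_credentials()

    # Same UNLOAD the operator builds, run through PostgresHook with an explicit schema
    unload_query = """
        UNLOAD ('SELECT * FROM my_schema.my_table')
        TO 's3://my-bucket/my_table_'
        CREDENTIALS 'aws_access_key_id={access_key};aws_secret_access_key={secret_key}'
        DELIMITER ',' ADDQUOTES ALLOWOVERWRITE PARALLEL OFF;
    """.format(access_key=credentials.access_key, secret_key=credentials.secret_key)

    PostgresHook(postgres_conn_id="redshift_default", schema="my_db").run(unload_query)
```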


1 Answer

This happens not because of an incorrect aws_conn_id for the operator or a misconfigured connection, but because of a bug in the RedshiftToS3Transfer operator: it leaves the session token out of the credentials it passes to Redshift. You need to supply token=credentials.token along with the other credentials in the UNLOAD query.
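
A minimal sketch of how the credentials string can be built so the session token makes it into the query (placeholder table and bucket names; adapt to your own task):

```python
from airflow.contrib.hooks.aws_hook import AwsHook

credentials = AwsHook(aws_conn_id="aws_default").get_credentials()

# Temporary credentials issued for an instance-profile role are only valid
# together with their session token, so the token has to be part of the
# CREDENTIALS string that Redshift hands to S3.
credentials_block = (
    "aws_access_key_id={access_key};"
    "aws_secret_access_key={secret_key};"
    "token={token}"
).format(
    access_key=credentials.access_key,
    secret_key=credentials.secret_key,
    token=credentials.token,
)

unload_query = """
    UNLOAD ('SELECT * FROM my_schema.my_table')
    TO 's3://my-bucket/my_table_'
    CREDENTIALS '{credentials}'
    DELIMITER ',' ADDQUOTES ALLOWOVERWRITE PARALLEL OFF;
""".format(credentials=credentials_block)
```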

More here
