I'm writing a DAG that makes use of RedshiftToS3Operator, but instead of using the operator itself I've inlined its code; the only modification is that I specify the schema in PostgresHook.
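Roughly, the inlined logic boils down to building an UNLOAD statement from the connection's credentials and running it through PostgresHook. This is a minimal sketch of that shape, not my actual code; the schema, table, bucket, and helper name are placeholders:

```python
# Hypothetical sketch of the inlined RedshiftToS3Operator logic.
# All names here (build_unload_query, schema/table/path values) are
# illustrative placeholders, not the real DAG code.
def build_unload_query(schema: str, table: str, s3_path: str,
                       access_key: str, secret_key: str) -> str:
    """Build the Redshift UNLOAD statement the operator would execute."""
    credentials = (
        f"aws_access_key_id={access_key};aws_secret_access_key={secret_key}"
    )
    return (
        f"UNLOAD ('SELECT * FROM {schema}.{table}') "
        f"TO '{s3_path}' "
        f"CREDENTIALS '{credentials}';"
    )

# In the task itself, the credentials come from the aws_default connection
# (e.g. via S3Hook(aws_conn_id="aws_default").get_credentials()) and the
# query is run with PostgresHook(postgres_conn_id=..., schema=...).run(query).
```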
Airflow is running on an EC2 instance and has the aws_default connection defined. The instance is properly configured in AWS with an appropriate IAM role, and yet I get the following error:
error: S3ServiceException:The AWS Access Key Id you provided does not exist in our records.,Status 403,Error InvalidAccessKeyId.
In case anyone's wondering, the region is specified in the extras field of the connection.
I've tried the solutions mentioned here and here, but I still keep running into this issue.
I don't want to expose the credentials in the UI, nor store them in a file on the EC2 instance.
Any help is much appreciated, thank you!