
We're using Boto3 to connect to an SQS queue. In testing/production this is fine, but for local development I'm using a version of fake_sqs.

Rather than somehow faking the real AWS endpoints (e.g. by overriding the hostname), I've opted to use a fake region (imaginatively called test).

Other parts of our code use the earlier boto library (as opposed to Boto3). With boto it's possible to override a region's endpoint hostnames by setting an environment variable named BOTO_ENDPOINTS, which points to a file containing Boto's region configuration. For local testing, ours contains just one entry:

{ "sqs": { "test": "fake_sqs" } }

Boto3, however, ignores that environment variable. It loads the data/endpoints.json file bundled with the botocore package, and when connecting to our fake region we get the error EndpointConnectionError: Could not connect to the endpoint URL: "https://test.queue.amazonaws.com/".
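A minimal reproduction of the failure, assuming some (possibly dummy) AWS credentials are configured so the request is actually sent:

    import boto3

    # Boto3 builds the endpoint for the unknown "test" region from botocore's
    # bundled endpoint data, yielding https://test.queue.amazonaws.com/.
    sqs = boto3.client('sqs', region_name='test')
    sqs.list_queues()
    # botocore.exceptions.EndpointConnectionError: Could not connect to the
    # endpoint URL: "https://test.queue.amazonaws.com/"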

The codebase we've inherited has embedded the configuration of the queues pretty deep, so we can't simply tell the connection to use a specific endpoint (like this thread suggests).
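For context, this is the kind of per-connection change we're trying to avoid; the local URL and port are assumptions:

    import boto3

    # Works, but requires touching every place a client/resource is created.
    sqs = boto3.client(
        'sqs',
        region_name='test',
        endpoint_url='http://localhost:4568',  # assumed fake_sqs address
    )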

So, does anyone know of a way to get Boto3 to use a different hostname without changing the code we have to set up the SQS connection?

otupman

1 Answer


You can drop in a custom endpoint file in ~/.aws/models to replace the default one provided by botocore.
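A minimal sketch of that, assuming a botocore version that resolves its endpoint data through the loader search path (which includes ~/.aws/models); the exact structure of endpoints.json varies between botocore releases:

    import os
    import shutil

    import botocore

    # Copy the bundled endpoint data to the customer override location...
    src = os.path.join(os.path.dirname(botocore.__file__), 'data', 'endpoints.json')
    dst_dir = os.path.expanduser('~/.aws/models')
    if not os.path.isdir(dst_dir):
        os.makedirs(dst_dir)
    shutil.copy(src, os.path.join(dst_dir, 'endpoints.json'))

    # ...then edit ~/.aws/models/endpoints.json so the SQS entry for the fake
    # "test" region points at the local fake_sqs endpoint.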

Additionally, you can use the AWS_DATA_PATH environment variable if you have data overrides in another location.
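A sketch of the AWS_DATA_PATH route; the directory is hypothetical, and the variable must be set before the session's data loader is created (e.g. before importing boto3, or in the shell that starts the app):

    import os

    # Extra directories for botocore's data loader, searched ahead of the
    # package's bundled data files.
    os.environ['AWS_DATA_PATH'] = '/path/to/endpoint-overrides'  # hypothetical

    import boto3

    sqs = boto3.client('sqs', region_name='test')  # endpoint resolved from the override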

Steve Buzonas