Questions tagged [boto3]

Boto 3 - The Amazon Web Services (AWS) SDK for Python

About

Boto3 is the Amazon Web Services (AWS) software development kit (SDK) for Python, which allows Python developers to write software that makes use of services like Amazon S3 and Amazon EC2. Boto3 provides an easy-to-use, object-oriented API as well as low-level direct service access.
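A minimal sketch of those two access styles, assuming default credentials and a placeholder bucket name ("my-bucket"):

```python
import boto3

# Object-oriented resource API
s3_resource = boto3.resource("s3")
for obj in s3_resource.Bucket("my-bucket").objects.limit(10):
    print(obj.key, obj.size)

# Low-level client API for the same service
s3_client = boto3.client("s3")
response = s3_client.list_objects_v2(Bucket="my-bucket", MaxKeys=10)
for item in response.get("Contents", []):
    print(item["Key"], item["Size"])
```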

5549 questions
1 vote, 1 answer

How to pull logs from an API and store them in AWS S3

I am trying to achieve this through a Python program. I am able to pull logs from the API to a local drive. However, I am struggling to copy them to AWS S3. I appreciate your help on this. I am using the code below to copy files to a local…
kris • 11 • 2
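A minimal sketch of the usual pattern for this, assuming the log files are already on disk and using placeholder bucket and key names:

```python
import boto3

s3 = boto3.client("s3")  # uses the default credential chain

def upload_log(local_path: str, bucket: str, key: str) -> None:
    """Upload a single local log file to S3."""
    s3.upload_file(Filename=local_path, Bucket=bucket, Key=key)

# Placeholder names for illustration only
upload_log("logs/api-2021-03-13.log", "my-log-bucket", "api-logs/2021-03-13.log")
```

For data that is still in memory (e.g. the raw API response), `s3.put_object(Bucket=..., Key=..., Body=data)` avoids the temporary file.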
1 vote, 0 answers

How to access secret manager with boto3 and rds-data

In a chalice route, I'm using boto3 to execute queries against my RDS Aurora Serverless DB cluster. This works as expected locally ($ chalice local), but when deployed on Lambda I receive the error: An error occurred (BadRequestException) when…
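For context, a sketch of the call shape this question is about, with placeholder ARNs; a BadRequestException from the Data API often points at the secret or cluster access rather than the SQL, so the deployed role needs both rds-data:ExecuteStatement and secretsmanager:GetSecretValue:

```python
import boto3

# Placeholder ARNs; the Lambda execution role must be allowed to use both.
CLUSTER_ARN = "arn:aws:rds:us-east-1:123456789012:cluster:my-cluster"
SECRET_ARN = "arn:aws:secretsmanager:us-east-1:123456789012:secret:my-db-secret"

rds_data = boto3.client("rds-data")

def run_query(sql: str):
    return rds_data.execute_statement(
        resourceArn=CLUSTER_ARN,
        secretArn=SECRET_ARN,
        database="mydb",   # placeholder database name
        sql=sql,
    )

print(run_query("SELECT 1"))
```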
1 vote, 1 answer

Can't write to writable folder /tmp/ on S3 bucket

I've tried writing a file to the writable /tmp/ folder inside my bucket with a lambda function but got an AccessDenied error. This is weird since I can do it by calling the lambda function locally. Below is the code for the lambda function: import…
An Nguyen • 87 • 1 • 11
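A small sketch of the usual shape of this handler (placeholder bucket/key): /tmp is the Lambda function's local writable filesystem, not a folder in the bucket, and copying the file into S3 afterwards needs s3:PutObject in the execution role:

```python
import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    # /tmp is local scratch space on the Lambda instance, not a path in the bucket.
    local_path = "/tmp/output.txt"
    with open(local_path, "w") as f:
        f.write("hello from lambda\n")

    # AccessDenied here usually means the role lacks s3:PutObject on the bucket.
    s3.upload_file(local_path, "my-bucket", "output/output.txt")  # placeholder names
    return {"statusCode": 200}
```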
1 vote, 2 answers

boto3: config profile could not be found

I'm testing my lambda function wrapped in a Docker image and provided the environment variable AWS_PROFILE=my-profile for the lambda function. However, I got an error: "The config profile (my-profile) could not be found", while this information is there…
An Nguyen • 87 • 1 • 11
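One hedged sketch of a common workaround: inside Lambda or a container that has no ~/.aws/config mounted, AWS_PROFILE cannot resolve, so fall back to the default credential chain (the execution role) when the profile is missing:

```python
import os
import boto3
from botocore.exceptions import ProfileNotFound

profile = os.environ.get("AWS_PROFILE")
try:
    # Works locally, where ~/.aws/config defines the profile.
    session = boto3.session.Session(profile_name=profile) if profile else boto3.session.Session()
except ProfileNotFound:
    # No config file inside the container/Lambda: use the execution role instead.
    session = boto3.session.Session()

s3 = session.client("s3")
```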
1 vote, 0 answers

Error code for try/catch with boto3 S3 upload

I have a function that I use to upload files/folders to S3 using boto3. I want to have a try/catch that handles errors if something fails to upload: import os import boto3 import botocore def upload_files(key, secret, bucket, s3_path,…
mlenthusiast • 790 • 5 • 21
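A minimal sketch of error handling around an upload; the function and bucket names are placeholders:

```python
import boto3
from botocore.exceptions import ClientError
from boto3.exceptions import S3UploadFailedError

s3 = boto3.client("s3")

def safe_upload(local_path: str, bucket: str, key: str) -> bool:
    """Upload a file and report failure instead of raising."""
    try:
        s3.upload_file(local_path, bucket, key)
        return True
    except (ClientError, S3UploadFailedError) as err:
        # ClientError carries the service error code, e.g. AccessDenied or NoSuchBucket.
        print(f"Upload of {local_path} failed: {err}")
        return False
```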
1 vote, 0 answers

I get a 0 Byte file when I upload to S3

When I programmatically upload a file to my S3 bucket I end up with a 0-byte file, but the name of the file and the folder are good. myfile = request.files["myfile"] myfile.seek(0, os.SEEK_END) s3.upload_fileobj( myfile, …
John doe • 3,025 • 4 • 20 • 44
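A sketch of the usual fix for this symptom (placeholder bucket/key): seeking to the end of the stream, e.g. to measure its size, leaves the file pointer there, so the upload then reads zero bytes; rewind before uploading:

```python
import os
import boto3

s3 = boto3.client("s3")

def upload_stream(fileobj, bucket: str, key: str) -> int:
    """Upload a file-like object to S3 and return its size in bytes."""
    fileobj.seek(0, os.SEEK_END)
    size = fileobj.tell()
    fileobj.seek(0)  # rewind so upload_fileobj reads the whole stream
    s3.upload_fileobj(fileobj, bucket, key)
    return size

# e.g. in a Flask view: upload_stream(request.files["myfile"], "my-bucket", "uploads/myfile")
```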
1 vote, 2 answers

Python: installing boto3 - could not find a version even though PyPI shows the index

I got the following error when trying to install boto3 from Python on my Mac: pip3 install boto Looking in indexes: https://pypi.org ERROR: Could not find a version that satisfies the requirement boto (from versions: none) ERROR: No matching…
Rajesh Chamarthi • 17,862 • 2 • 35 • 65
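For reference, the command in the excerpt installs the legacy "boto" package rather than "boto3", and an index URL of https://pypi.org (instead of https://pypi.org/simple) also produces "no matching distribution" errors. A small sketch of installing the intended package from within Python:

```python
# Equivalent to running "python3 -m pip install boto3 --index-url https://pypi.org/simple" in a shell.
import subprocess
import sys

subprocess.check_call(
    [sys.executable, "-m", "pip", "install", "boto3",
     "--index-url", "https://pypi.org/simple"]
)

import boto3
print(boto3.__version__)
```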
1 vote, 1 answer

Get an AWS Table Resource using the low-level Client

My goal is to get a similar object to the one I get through: dynamodb = boto3.resource(service_name='dynamodb') self._table = dynamodb.Table(name) but through the Client object. How can I reference a DynamoDB table through the low-level Client…
Michael Sidoroff • 1,331 • 4 • 13 • 25
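A sketch of the low-level equivalent, with placeholder table and key names: the client has no Table wrapper, so every call takes TableName and DynamoDB-typed attribute values, which TypeDeserializer can convert back to plain Python values:

```python
import boto3
from boto3.dynamodb.types import TypeDeserializer

client = boto3.client("dynamodb")
deserializer = TypeDeserializer()

resp = client.get_item(
    TableName="my-table",                 # placeholder table name
    Key={"DeviceId": {"S": "BSMD002"}},   # placeholder key schema
)

item = resp.get("Item", {})
plain = {k: deserializer.deserialize(v) for k, v in item.items()}
print(plain)
```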
1 vote, 2 answers

What is the best approach to sync data from an AWS S3 bucket to Azure Data Lake Gen 2

Currently, I download CSV files from AWS S3 to my local computer using: aws s3 sync s3:// c:/ --profile aws_profile. Now, I would like to use the same process to sync the files from AWS to Azure Data Lake Storage…
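One hedged sketch of doing this directly in Python, assuming the azure-storage-file-datalake package and placeholder account, credential, bucket, and filesystem names; it streams each S3 object and writes it into ADLS Gen2 without a local copy:

```python
import boto3
from azure.storage.filedatalake import DataLakeServiceClient  # pip install azure-storage-file-datalake

s3 = boto3.Session(profile_name="aws_profile").client("s3")
adls = DataLakeServiceClient(
    account_url="https://myaccount.dfs.core.windows.net",
    credential="my-storage-account-key",
)
fs = adls.get_file_system_client("my-filesystem")

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="my-bucket", Prefix="exports/"):
    for obj in page.get("Contents", []):
        body = s3.get_object(Bucket="my-bucket", Key=obj["Key"])["Body"].read()
        fs.get_file_client(obj["Key"]).upload_data(body, overwrite=True)
```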
1 vote, 1 answer

Boto3 and DynamoDB - How to mimic Aggregation

I have a table in DynamoDB in the below format: DeviceId (PK) SensorDataType SensorValue CurrentTime (SK) BSMD002 HeartRate 86 2021-03-13 14:50:17.292663 BSMD002 HeartRate 106 2021-03-13 14:50:17.564644 BSMD002 HeartRate 97 2021-03-13…
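A minimal sketch of the usual approach, with a placeholder table name: DynamoDB has no built-in AVG/SUM, so query the partition and aggregate client-side:

```python
import boto3
from boto3.dynamodb.conditions import Key

table = boto3.resource("dynamodb").Table("SensorData")  # placeholder table name

# Pull all readings for one device, then aggregate in Python.
resp = table.query(KeyConditionExpression=Key("DeviceId").eq("BSMD002"))
items = resp["Items"]

heart_rates = [float(i["SensorValue"]) for i in items if i.get("SensorDataType") == "HeartRate"]
if heart_rates:
    print("avg heart rate:", sum(heart_rates) / len(heart_rates))
```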
1 vote, 2 answers

Python - How to query DynamoDB on GSI using primary and sort key

Well, as the title suggests, I want to query my DynamoDB table using a GSI with the primary key and sort key (both from the GSI). I tried some ways to do it, but without any success. I have a table with the url-date-index; the url is the primary key from…
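A sketch of the query shape, reusing the index and attribute names from the question (the table name and values are placeholders):

```python
import boto3
from boto3.dynamodb.conditions import Key

table = boto3.resource("dynamodb").Table("my-table")  # placeholder table name

# Query the GSI by its own partition key (url) and sort key (date).
resp = table.query(
    IndexName="url-date-index",
    KeyConditionExpression=(
        Key("url").eq("https://example.com")
        & Key("date").between("2021-03-01", "2021-03-31")
    ),
)
for item in resp["Items"]:
    print(item)
```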
1 vote, 1 answer

Issues while working with Amazon Aurora Database

My Requirements: I want to store real-time events data coming from e-commerce websites into a database. In parallel to storing the data, I want to access the events data from the database. I want to perform some sort of ad-hoc analysis (SQL). Using some…
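For the ingestion part of those requirements, one hedged sketch using the Aurora Data API with placeholder ARNs and an illustrative table schema:

```python
import boto3

rds_data = boto3.client("rds-data")

# Placeholder ARNs and database name.
CLUSTER_ARN = "arn:aws:rds:us-east-1:123456789012:cluster:events-cluster"
SECRET_ARN = "arn:aws:secretsmanager:us-east-1:123456789012:secret:events-db"

def store_event(event_type: str, payload: str) -> None:
    rds_data.execute_statement(
        resourceArn=CLUSTER_ARN,
        secretArn=SECRET_ARN,
        database="events",
        sql="INSERT INTO events (event_type, payload) VALUES (:t, :p)",
        parameters=[
            {"name": "t", "value": {"stringValue": event_type}},
            {"name": "p", "value": {"stringValue": payload}},
        ],
    )

store_event("page_view", '{"sku": "ABC-123"}')
```

Ad-hoc SQL can then run against the same cluster, either through execute_statement or any standard MySQL/PostgreSQL client.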
1 vote, 1 answer

How to read a Txt file from an S3 Bucket using Python and Boto3

I am using the below script, which is working very well; I am able to see the instance name which is in the S3 bucket. import boto3 import codecs access_key = "XXXXXXXXXXXX" secret_key =…
Rahul • 107 • 9
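A minimal sketch of reading a text object (placeholder bucket/key), using the default credential chain rather than hard-coded keys:

```python
import boto3

s3 = boto3.client("s3")

obj = s3.get_object(Bucket="my-bucket", Key="instances.txt")
text = obj["Body"].read().decode("utf-8")

for line in text.splitlines():
    print(line)
```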
1 vote, 1 answer

How to iterate through a nested Dictionary in python boto3

I want to find only stopped instances and their ids; below is the nested JSON: { "Reservations": [ { "Groups": [], "Instances": [ { "AmiLaunchIndex": 0, "Architecture": "x86_64", "BlockDeviceMappings": [ { …
Rahul • 107 • 9
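A short sketch of the usual way to get there: let EC2 filter on instance state, then walk the Reservations/Instances nesting for the ids:

```python
import boto3

ec2 = boto3.client("ec2")

resp = ec2.describe_instances(
    Filters=[{"Name": "instance-state-name", "Values": ["stopped"]}]
)

stopped_ids = [
    instance["InstanceId"]
    for reservation in resp["Reservations"]
    for instance in reservation["Instances"]
]
print(stopped_ids)
```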
1 vote, 1 answer

AWS boto3 InvalidAccessKeyId when using IAM role

I upload to and download from S3 with a presigned post/url. The presigned url/post are generated with boto3 in the Lambda function (it is deployed with Zappa). While adding my AWS_SECRET_ACCESS_KEY and AWS_ACCESS_KEY_ID as env variables works…
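A hedged sketch of generating the presigned URL/POST from the role's own credentials (placeholder bucket, key, and region): URLs signed with temporary role credentials stop working once that session expires, so sign with the default chain, use SigV4, and keep ExpiresIn within the session lifetime:

```python
import boto3
from botocore.client import Config

s3 = boto3.client("s3", config=Config(signature_version="s3v4"), region_name="us-east-1")

download_url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "my-bucket", "Key": "uploads/file.bin"},
    ExpiresIn=900,
)

upload_post = s3.generate_presigned_post(
    Bucket="my-bucket",
    Key="uploads/file.bin",
    ExpiresIn=900,
)
print(download_url, upload_post["url"])
```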