I have a bunch of s3 folders for different projects/clients and I would like to estimate total size (so I can for instance consider reducing sizes/cost). What is a good way to determine this?
- Similar to: [How to find size of a folder inside an S3 bucket?](https://stackoverflow.com/q/49759940/174777) – John Rotenstein Mar 30 '20 at 03:33
- Sub-folders too, or just top-level folders? – John Rotenstein Mar 30 '20 at 03:34
3 Answers
I can do this with a combination of Python and the AWS CLI:

```python
import os

# List all buckets; each output row ends with the bucket name.
bucket_rows = os.popen('aws s3 ls').read().splitlines()

sizes = dict()
for bucket in bucket_rows:
    buck = bucket.split(' ')[-1]  # the full row contains additional information
    cmd = f"aws s3 ls --summarize --human-readable --recursive s3://{buck}/ | grep 'Total'"
    sizes[buck] = os.popen(cmd).read()
```
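Note that `--human-readable` leaves each entry of `sizes` as text like `Total Size: 4.5 GiB`, which doesn't sort or sum numerically. A small sketch of a parser to convert that line back to bytes (the function name and unit table are my own, not part of the AWS CLI):

```python
# Binary units as printed by `aws s3 ls --human-readable`.
UNITS = {"Bytes": 1, "KiB": 1024, "MiB": 1024**2, "GiB": 1024**3, "TiB": 1024**4}

def total_size_bytes(summary_line: str) -> int:
    """Parse a '   Total Size: 4.5 GiB' summary line into a byte count."""
    value, unit = summary_line.split(":")[1].split()
    return int(float(value) * UNITS[unit])

print(total_size_bytes("   Total Size: 4.5 GiB"))  # 4831838208
```

With that you can, for example, `sorted(sizes, key=...)` to find the largest buckets.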
Sergio Lucero
As stated here, the AWS CLI natively supports a `--query` parameter which can sum the size of every object in an S3 bucket:

```shell
aws s3api list-objects --bucket BUCKETNAME --output json --query "[sum(Contents[].Size), length(Contents[])]"
```

I hope it helps.
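The `--query` value is a JMESPath expression evaluated over the `list-objects` response: it returns a two-element array of the summed `Size` fields and the object count. A pure-Python equivalent over a sample response (the keys and sizes here are made up for illustration) shows what it computes:

```python
# Minimal stand-in for an s3api list-objects response body.
response = {
    "Contents": [
        {"Key": "a.txt", "Size": 1024},
        {"Key": "b.txt", "Size": 2048},
        {"Key": "logs/c.log", "Size": 512},
    ]
}

# Equivalent of the JMESPath query "[sum(Contents[].Size), length(Contents[])]".
total_bytes = sum(obj["Size"] for obj in response["Contents"])
object_count = len(response["Contents"])
print([total_bytes, object_count])  # [3584, 3]
```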
Vaibhav Jain
If you want to check via the console:

- If you mean a single folder rather than the whole bucket, select that object, open the "Actions" drop-down, and choose "Get total size".
- If you mean the whole bucket, go to the "Management" tab and open "Metrics"; it will show the entire bucket size.
Abhishek Kumar