Prepare a CSV file listing the size of each S3 bucket, using the CloudWatch BucketSizeBytes metric.
#!/usr/bin/env bash
# List every S3 bucket in the account and write its size (in bytes) to buckets_storage.csv.
# The size comes from the daily CloudWatch metric BucketSizeBytes for StandardStorage.

buckets=($(aws --region us-east-1 s3api list-buckets --output text --query 'Buckets[].Name'))

echo "bucket name,size(Bytes)" > buckets_storage.csv

for bucket in "${buckets[@]}"; do
  # Fetch the daily BucketSizeBytes datapoint for this bucket.
  size=$(aws cloudwatch get-metric-statistics \
    --region us-east-1 \
    --metric-name BucketSizeBytes \
    --namespace AWS/S3 \
    --start-time 2019-09-12T00:00:00Z \
    --end-time 2019-09-12T00:01:00Z \
    --statistics Maximum \
    --unit Bytes \
    --dimensions Name=BucketName,Value="${bucket}" Name=StorageType,Value=StandardStorage \
    --period 86400 \
    --output text \
    --query 'Datapoints[].Maximum')
  echo "${bucket}: ${size} Bytes"
  echo "${bucket},${size}" >> buckets_storage.csv
done
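The hard-coded --start-time/--end-time values go stale quickly. A minimal sketch of deriving the one-day window from yesterday's date instead, assuming GNU date is available (macOS/BSD date would need `date -v-1d` instead):

# Sketch: compute the metric window from yesterday's date (assumes GNU date).
start_time=$(date -u -d 'yesterday' +%Y-%m-%dT00:00:00Z)
end_time=$(date -u -d 'yesterday' +%Y-%m-%dT00:01:00Z)
# ...then pass --start-time "${start_time}" --end-time "${end_time}" to get-metric-statistics.

A one-minute window with --statistics Maximum is enough here because CloudWatch publishes BucketSizeBytes roughly once per day; the script prints progress to stdout and appends one "bucket,size" row per bucket to buckets_storage.csv.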