$ sudo pip install awscli (or: sudo apt-get install awscli)
$ aws configure
You'll need to fill in the following settings:
AWS Access Key ID [None]:
AWS Secret Access Key [None]:
Default region name [None]:
Default output format [None]:
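Once `aws configure` finishes, the values are written to plain-text files under `~/.aws/`. A typical result looks like this (the keys below are AWS's documented example keys, and the region/output values are just placeholders):

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY

# ~/.aws/config
[default]
region = us-east-1
output = json
```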
$ aws s3 sync s3://bucket_name .
$ aws s3 sync s3://bucket_name/some/path/ local/path/ --exclude "*" --include "*string*" --dryrun
$ aws s3api list-objects --bucket "bucket_name" --prefix "some/prefix/path/" --query "Contents[?LastModified>='yyyy-mm-dd'].{Key: Key}"
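The `--query` expression is JMESPath: it filters the `Contents` array client-side by `LastModified` and projects only the `Key`. A minimal Python sketch of the same filter over an invented sample listing (the data and function name are illustrative, not part of the CLI):

```python
# Sample of what list-objects returns in its Contents array (invented data).
sample_contents = [
    {"Key": "some/prefix/path/a.csv", "LastModified": "2019-01-15T10:00:00Z"},
    {"Key": "some/prefix/path/b.csv", "LastModified": "2018-06-01T09:30:00Z"},
]

def keys_modified_since(contents, cutoff):
    # Comparing the ISO-8601 timestamp strings lexicographically works
    # because that format sorts chronologically.
    return [{"Key": obj["Key"]} for obj in contents if obj["LastModified"] >= cutoff]

print(keys_modified_since(sample_contents, "2019-01-01"))
```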
$ aws s3 ls s3://bucket_name/some/path/to/files/ --recursive --human-readable --summarize > output_file.txt
$ aws s3 sync files/to/upload/path s3://bucket_name
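`sync` only uploads files that are new or have changed, rather than re-uploading everything. A rough Python sketch of that decision, comparing a local directory against an index of what's already remote (the function name and the exact comparison rule are simplifications of what the CLI actually does):

```python
import os

def files_to_upload(local_dir, remote_index):
    """remote_index maps relative path -> (size, mtime) for objects already in S3.
    Returns relative paths a sync would (roughly) upload: files missing remotely,
    differing in size, or newer locally."""
    to_upload = []
    for root, _dirs, names in os.walk(local_dir):
        for name in names:
            full = os.path.join(root, name)
            rel = os.path.relpath(full, local_dir)
            stat = os.stat(full)
            remote = remote_index.get(rel)
            if remote is None or remote[0] != stat.st_size or stat.st_mtime > remote[1]:
                to_upload.append(rel)
    return sorted(to_upload)
```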
Source bucket policy (the statements wrapped in a full policy document):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Stmt1357935647218",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::XXXXXXXXXXXX:user/username"
      },
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::SourceBucket"
    },
    {
      "Sid": "Stmt1357935676138",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::XXXXXXXXXXXX:user/username"
      },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::SourceBucket/*"
    }
  ]
}
Destination bucket policy (the statements wrapped in a full policy document):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Stmt1357935647218",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::XXXXXXXXXXXX:user/username"
      },
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::DestinationBucket"
    },
    {
      "Sid": "Stmt1357935676138",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::XXXXXXXXXXXX:user/username"
      },
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::DestinationBucket/*"
    }
  ]
}
Once you set these policies, just run:
$ aws s3 cp s3://SourceBucket/ s3://DestinationBucket/ --recursive
- The --recursive argument makes the command operate on everything under the specified prefix. Without it, you'd have to copy each file one by one.