@sirtawast
Last active May 15, 2024 09:48
AWS: Restore and change S3 storage class
#!/bin/bash
# (don't run this file as a script; the restore step takes hours, so run the commands manually)
# Restore AWS S3 files from the GLACIER storage class and copy them in-place as
# STANDARD or STANDARD_IA objects, so the restored copies never expire.
# I couldn't find any clear instructions on how to do this easily, so I decided to write them,
# in case you screw things up with lifecycle options etc.
# The procedure might take a while if you have many, many files.
# There are three steps:
# 1. Install software
# 2. Restore files (and wait for them to be available)
# 3. Copy restored files in-place
# 1. Install software
brew install aws-cli
aws configure
brew install s3cmd
s3cmd --configure
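# Optional sanity check before committing to a long run: confirm both tools are
# on PATH. Credentials can then be verified manually with `aws sts
# get-caller-identity` and `s3cmd ls`, which are cheap read-only calls.
for tool in aws s3cmd; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: missing"
  fi
done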
# 2. Restore files
# Disable energy saving and the screensaver, and buckle up for a looong operation.
# Start by restoring the GLACIER storage class files, which unlocks downloads and other actions.
# Give yourself a little room to play by setting a one-week expiry on the restored files.
s3cmd restore --restore-days=7 --restore-priority=STANDARD s3://my-bucket/
# I haven't tried this, but if you have many files under predictable prefixes it
# might help to run these in parallel in `screen` windows or terminal tabs:
# s3cmd restore --restore-days=7 --restore-priority=STANDARD s3://my-bucket/2010/
# s3cmd restore --restore-days=7 --restore-priority=STANDARD s3://my-bucket/.../
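# A hypothetical sketch of that parallel trick: emit one restore command per
# top-level prefix, then paste each into its own `screen` window or tab. The
# bucket name and the year prefixes are placeholders, adjust to your layout.
BUCKET="s3://my-bucket"
for prefix in 2010 2011 2012; do
  echo "s3cmd restore --restore-days=7 --restore-priority=STANDARD ${BUCKET}/${prefix}/"
done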
# Now wait 0.5-12 hours, depending on the restore priority you set ...
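# To check whether a given object has finished restoring, you can inspect its
# Restore header (bucket and key below are placeholders):
#   aws s3api head-object --bucket my-bucket --key 2010/example.jpg --query Restore
# The header reads ongoing-request="true" while the restore is in progress and
# flips to "false" once the object is ready. A simulated value to show the check:
RESTORE='ongoing-request="false", expiry-date="Wed, 22 May 2024 00:00:00 GMT"'
case "$RESTORE" in
  *'ongoing-request="false"'*) echo "restore complete" ;;
  *)                           echo "still restoring" ;;
esac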
# 3. Copy files in-place
# Copy the restored files in-place to replace them with normal bucket objects that never expire.
# You can also `aws s3 sync` the files to a temporary bucket if you'd rather not do it in-place.
# Add --acl=public-read if the files are to be public.
aws s3 cp s3://my-bucket/ s3://my-bucket/ --recursive --storage-class=STANDARD --force-glacier-transfer
# The same parallel per-prefix trick can be used here to speed things up, and this one I have tested.
# First make sure the files have actually been restored; you should no longer see warnings like this:
# warning: Skipping file s3://my-bucket/2010/summery-aws-keyboard-cat-playing-rick-roll.jpg. Object is of storage class GLACIER. Unable to perform download operations on GLACIER objects. You must restore the object to be able to the perform operation. See aws s3 download help for additional parameter options to ignore or force these transfers.
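# One way to verify: capture the copy output and grep it for skipped GLACIER
# objects. In a real run you would produce cp.log with something like:
#   aws s3 cp s3://my-bucket/ s3://my-bucket/ --recursive \
#     --storage-class=STANDARD --force-glacier-transfer 2>&1 | tee cp.log
# Here cp.log is faked with one warning line just to demonstrate the check.
printf 'warning: Skipping file s3://my-bucket/a.jpg. Object is of storage class GLACIER.\n' > cp.log
if grep -q 'storage class GLACIER' cp.log; then
  echo "some objects are still in GLACIER; wait or restore again, then re-run the copy"
else
  echo "no GLACIER skips found"
fi
rm -f cp.log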
# Finally, remember to configure, update, or delete your lifecycle rules if they were moving files to Glacier too early.
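# To inspect the current rules, run:
#   aws s3api get-bucket-lifecycle-configuration --bucket my-bucket
# and if a rule was transitioning objects too early, replace it with something
# saner via put-bucket-lifecycle-configuration. A sketch of one possible rule,
# with a placeholder ID and a hypothetical one-year delay:
#
# {
#   "Rules": [
#     {
#       "ID": "archive-after-a-year",
#       "Status": "Enabled",
#       "Filter": {"Prefix": ""},
#       "Transitions": [
#         {"Days": 365, "StorageClass": "GLACIER"}
#       ]
#     }
#   ]
# }
#
# Applied with:
#   aws s3api put-bucket-lifecycle-configuration --bucket my-bucket \
#     --lifecycle-configuration file://lifecycle.json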