@jjtroberts
Last active October 12, 2022
Route53 zone export
#!/usr/bin/env bash
# Enter Bash "strict mode"
set -o errexit # Exit immediately on any non-zero error exit status
set -o nounset # Trigger error when expanding unset variables
set -o pipefail # Prevent errors in a pipeline from being masked
IFS=$'\n\t' # Restrict word splitting to newlines and tabs
# Declare backup path & master zone files
BACKUP_PATH="$(date +%F)"
ZONES_FILE="all-zones.txt"
DNS_FILE="all-dns.txt"
echo "Backing up Route53: ${BACKUP_PATH}"
# Create date-stamped backup directory and enter it
mkdir -p "$BACKUP_PATH"
cd "$BACKUP_PATH"
# Create a list of all hosted zones
cli53 list --debug --format text > "$ZONES_FILE" 2>&1
# Extract the bare domain names: keep the Name: lines, take the text after
# the colon, then strip the surrounding quotes, trailing dot, and comma
sed '/Name:/!d' "$ZONES_FILE" | cut -d: -f2 | sed 's/^..//' | sed 's/.\{3\}$//' > "$DNS_FILE"
# Export a zone file for each domain, throttling to stay under API rate limits
while read -r line; do
  sleep 5
  cli53 export --debug --full "$line" > "$line.txt"
done < "$DNS_FILE"
cd ../
# Compress the backup directory, ship it to S3, then remove the local copy
tar czvf "${BACKUP_PATH}.tgz" "$BACKUP_PATH"
aws s3 cp "${BACKUP_PATH}.tgz" "s3://${BACKUP_PATH}.tgz"
rm -rf "$BACKUP_PATH"
# Prune any tgz files older than 30 days
find . -maxdepth 1 -name '*.tgz' -type f -mtime +30 -exec rm -f {} \;
# Exit Bash "strict mode"
set +o errexit
set +o nounset
set +o pipefail
exit 0
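The sed/cut pipeline that extracts domain names is terse, so here is a minimal sketch of what it does to a single listing line. The sample line format (`Name: "example.com.",`) is an assumption about cli53's text output, not something confirmed by this gist:

```shell
#!/usr/bin/env bash
# Hypothetical sample of one line from `cli53 list --format text`
# (the exact format, including the trailing comma, is an assumption).
sample='Name: "example.com.",'

# Same pipeline as the backup script: keep Name: lines, take the text
# after the colon, drop the leading space+quote (2 chars) and the
# trailing dot+quote+comma (3 chars).
echo "$sample" \
  | sed '/Name:/!d' \
  | cut -d: -f2 \
  | sed 's/^..//' \
  | sed 's/.\{3\}$//'
# Prints: example.com
```

If cli53's output format ever changes (different quoting, no trailing comma), the fixed-width `s/^..//` and `s/.\{3\}$//` strips will silently corrupt the domain list, so it is worth spot-checking `all-dns.txt` after a run.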
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment
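Before pointing the 30-day prune step at real backups, it can be rehearsed in a scratch directory. This is a sketch assuming GNU `touch -d` for back-dating files; the filenames are throwaway examples:

```shell
#!/usr/bin/env bash
set -o errexit -o nounset -o pipefail

# Rehearse the prune logic somewhere disposable.
workdir="$(mktemp -d)"
cd "$workdir"

touch fresh.tgz                    # modified now: should survive
touch -d '40 days ago' stale.tgz   # modified 40 days ago: should be pruned

# Same prune command as the backup script, scoped to this directory:
find . -maxdepth 1 -name '*.tgz' -type f -mtime +30 -exec rm -f {} \;

ls   # only fresh.tgz remains
```

Scoping the `find` with `.` and `-maxdepth 1` matters: the original `find *.tgz` form passes glob matches as starting points, which errors out when no `.tgz` files exist and would descend into any directory that happened to match.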