@ChrisCinelli
Last active September 29, 2023 16:30
Back up all indexes in Elasticsearch (ES) into an archive
#!/bin/bash
# See https://gist.github.com/593100533a7dbb4474612f9adb8f8ff6 for more info
# This script backs up data from an ES instance to disk and creates an xz archive together with the commands to restore it
# You need to have jq ( https://stedolan.github.io/jq/ ) and elasticdump ( https://www.npmjs.com/package/elasticdump ) installed
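# Install hints (a sketch; adjust for your package manager):
#   npm install -g elasticdump
#   apt-get install jq   # or: brew install jq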
# Source ES instance
DEFAULT_ESR='http://localhost:9200' # Edit this
ESR=${1:-$DEFAULT_ESR} # Or just pass the Elasticsearch URL as $1
# Tune this regular expression to filter the indexes you want to back up
DEFAULT_REGEX='.*'
REGEX=${2:-$DEFAULT_REGEX}
DEFAULT_CMD='elasticdump' # You can use your own clone of elasticdump. Ex: '/Users/youruser/my-projects/elasticsearch-dump/bin/elasticdump'
CMD=${3:-$DEFAULT_CMD}
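# Example run (hypothetical script name and arguments):
#   ./es-backup.sh http://localhost:9200 '^logs-' /usr/local/bin/elasticdump
#   ($1 -> source URL, $2 -> index-name regex, $3 -> elasticdump command)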
######################
# Backup!
######################
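# mktemp -d with no template works with GNU mktemp; the -t 'es-backup' fallback is for BSD/macOS mktemp, which expects a template.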
BACKUP_FOLDER_BASE=$(mktemp -d 2>/dev/null || mktemp -d -t 'es-backup')
BACKUP_FOLDER="${BACKUP_FOLDER_BASE}/backup"
RESTORE_FILE="$BACKUP_FOLDER/restore.sh"
mkdir -p "$BACKUP_FOLDER"
echo "# $0 [<elastic_search_url>]" >> "$RESTORE_FILE"
echo "# If <elastic_search_url> is not specified, it will use '${ESR}'" >> "$RESTORE_FILE"
echo "# elasticdump ( https://www.npmjs.com/package/elasticdump ) needs to be installed." >> "$RESTORE_FILE"
echo "# See https://gist.github.com/593100533a7dbb4474612f9adb8f8ff6 for more info" >> "$RESTORE_FILE"
echo "" >> "$RESTORE_FILE"
echo "# set -x # Uncomment this to debug" >> "$RESTORE_FILE"
echo "ES_DEFAULT_INSTANCE='${ESR}';" >> "$RESTORE_FILE"
echo "ES_INSTANCE=\"\${1:-\$ES_DEFAULT_INSTANCE}\";" >> "$RESTORE_FILE"
echo "" >> "$RESTORE_FILE"
chmod 755 "$RESTORE_FILE"
######################
# Backup indexes
######################
duplicate_index() {
  local in=$1
  local out=$2
  local index=$3
  shift 3
  for i in "$@"; do
    local folder="${out}/${index}"
    local full_in="${in}/${index}"
    local full_out="${folder}/${i}"
    mkdir -p "$folder"
    echo " ----> Backing up '${in}/${index}' ('$i')"
    $CMD \
      --limit=9999 \
      --input="${full_in}" \
      --output="${full_out}" \
      --type="$i"
    if [ $? -eq 0 ]; then
      echo "$CMD --input=\"./${index}/${i}\" --output=\"\$ES_INSTANCE/${index}\" --type=$i;" >> "$RESTORE_FILE"
    else
      echo " !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!"
      echo " !!! Failed to back up '${index}' ('$i')"
      echo " !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!"
    fi
  done
}
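# The types passed to duplicate_index below (analyzer, settings, mapping, alias, data) are standard elasticdump --type values;
# restore.sh replays them in the same order, so settings and mappings are recreated before the documents.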
indexes=$(curl "$ESR/_aliases?pretty=1" | jq --raw-output 'keys | .[]')
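# GET /_aliases returns a JSON object keyed by index name, so $indexes now holds one index name per line.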
echo "$indexes" | while read -r current_index; do
if [[ $current_index =~ $REGEX ]]
then
echo " --> Coping '$current_index'";
duplicate_index "$ESR" "$BACKUP_FOLDER" "$current_index" analyzer settings mapping alias data
else
echo " !!--> Skipping '$current_index' that does not match the REGEX '$REGEX'";
fi
done;
echo "" >> $RESTORE_FILE
############
# Archive
############
TODAY=$(date +%Y-%m-%d-%H-%M)
CURRENT_FOLDER=$(pwd)
ARCHIVE_NAME="es-backup-${TODAY}.tar.xz"
echo "Done dumping from ES. Creating a compressed file '$ARCHIVE_NAME'."
cd "$BACKUP_FOLDER_BASE"
XZ_OPT=-9 tar cJf "${CURRENT_FOLDER}/$ARCHIVE_NAME" .
cd "$CURRENT_FOLDER"
echo "Done archiving. You can decompress it with 'tar PxJf $ARCHIVE_NAME' and use 'restore.sh [<elastic_search_url>]' to restore it"
rm -rf "$BACKUP_FOLDER_BASE"
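
For reference, the restore.sh that this script generates looks roughly like the sketch below (assuming the default http://localhost:9200 source and a single hypothetical index named my-index; the real file starts with a few header comments and contains one elasticdump line per index and per type):

ES_DEFAULT_INSTANCE='http://localhost:9200';
ES_INSTANCE="${1:-$ES_DEFAULT_INSTANCE}";
elasticdump --input="./my-index/analyzer" --output="$ES_INSTANCE/my-index" --type=analyzer;
elasticdump --input="./my-index/settings" --output="$ES_INSTANCE/my-index" --type=settings;
elasticdump --input="./my-index/mapping" --output="$ES_INSTANCE/my-index" --type=mapping;
elasticdump --input="./my-index/alias" --output="$ES_INSTANCE/my-index" --type=alias;
elasticdump --input="./my-index/data" --output="$ES_INSTANCE/my-index" --type=data;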
@sabbir28

Can you explain this code?
