Compressing a big folder and wanting progress
sudo su
# du -sb reports the folder size in bytes, so pv can show a percentage and ETA
tar cf - <folder> -P | pv -s $(du -sb <folder> | awk '{print $1}') | gzip > data.tar.gz
# Outputs something like this
# 3.06GiB 0:03:01 [15.8MiB/s] [========> ] 5% ETA 0:57:05
#
# I recently needed to perform a data recovery after a bad migration where I deleted the wrong stuff. I tried to recover data
# from a MySQL 5.5 dump on a MySQL 5.7 Ubuntu 16.04 Google Cloud box, and even though I could edit the data and fix the deleted
# rows in the tables, I couldn't export it cleanly because of incompatibilities between 5.5 and 5.7 :(
#
# I am using this command to copy the fixed data out, so I don't have to redo 2 days of work. I will save the archive somewhere,
# restore it on an Ubuntu 14.04 box running MySQL 5.5, and use that to produce the mysqldump .sql file.
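#
# A rough sketch of that restore-and-dump step (the hostname, paths and database
# name below are placeholders, and it assumes the archive contains the MySQL data
# directory):
#
scp data.tar.gz user@recovery-box:~/               # copy the archive to the 14.04 box
sudo service mysql stop                            # stop MySQL 5.5 before swapping the datadir
sudo mv /var/lib/mysql /var/lib/mysql.bak          # keep the old data directory around
sudo mkdir /var/lib/mysql
pv ~/data.tar.gz | sudo tar xzf - -C /var/lib/mysql --strip-components=1
sudo chown -R mysql:mysql /var/lib/mysql
sudo service mysql start
mysqldump -u root -p <database> > recovered.sql    # the .sql dump I actually need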
# Extracting side
sudo apt-get install pv # progress bar dependency
# pv reads the archive itself here, so the progress bar tracks the compressed size
pv /path/to/data.tar.gz | tar xzf - -C .
# Outputs something like this
# 3.06GiB 0:03:01 [15.8MiB/s] [========> ] 5% ETA 0:57:05
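# One way to sanity-check the archive before extracting, assuming it lives at /path/to/data.tar.gz:
gzip -t /path/to/data.tar.gz && echo "archive OK"  # test gzip integrity
tar tzf /path/to/data.tar.gz | head                # list the first few entries without extracting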
Also, don't work in /tmp: if you reboot the server you lose your files. :(