#!/bin/sh
# Restore gzipped per-table MySQL dumps in parallel.
# Run it from the backup directory.
# Expected dump file name format: database-table.sql.gz
for FILE in *.sql.gz ; do
    DATABASE=$(echo "$FILE" | cut -d- -f1)
    TABLE=$(echo "$FILE" | cut -d- -f2 | cut -d. -f1)
    # Throttle: wait while the server already has more than 100 client threads.
    PS=$(mysql -e 'show processlist\G' | grep -c '^Command')
    while [ "$PS" -gt 100 ]; do
        sleep 5
        PS=$(mysql -e 'show processlist\G' | grep -c '^Command')
    done
    # Empty the target table, then reload it from the dump in the background.
    mysql -e "TRUNCATE TABLE $DATABASE.$TABLE"
    zcat "$FILE" | mysql "$DATABASE" &
done
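The gist doesn't show how the per-table dumps are produced; here is a minimal sketch of a matching export step, assuming mysqldump runs with the same credentials as the restore and that DB names the database to export (both are assumptions, not part of the original gist):

#!/bin/sh
# Hypothetical companion export: one gzipped dump per table,
# named database-table.sql.gz so the restore loop above can parse it.
DB=mydatabase
for TABLE in $(mysql -N -e 'SHOW TABLES' "$DB"); do
    mysqldump "$DB" "$TABLE" | gzip > "${DB}-${TABLE}.sql.gz"
done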
Why 100 processes? What is the limiting factor or how did you come to that number?
to avoid overloading the server
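The 100 is just the hard-coded threshold in the while loop; a quick sketch of lifting it into a variable so it can be tuned per server (MAX_THREADS is a made-up name, not in the original gist):

# Same throttle as in the script, with the limit made configurable.
MAX_THREADS=${MAX_THREADS:-100}
PS=$(mysql -e 'show processlist\G' | grep -c '^Command')
while [ "$PS" -gt "$MAX_THREADS" ]; do
    sleep 5
    PS=$(mysql -e 'show processlist\G' | grep -c '^Command')
done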
This is more like one table per background process, not multiple processes importing the same table.
It's an old gist, but if you add wait at the end of the script, it does not finish until all tables have been imported. Without the wait there could still be background processes running that you don't know about. Also, if you run this over an SSH session, the subprocesses will be stopped when you disconnect.
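A minimal sketch of what that change looks like (restore.sh is a placeholder name for the gist script, not something it defines):

# Background imports plus a final wait, so the script exits only
# after every mysql job has finished.
for FILE in *.sql.gz ; do
    DATABASE=$(echo "$FILE" | cut -d- -f1)
    zcat "$FILE" | mysql "$DATABASE" &
done
wait

# To survive an SSH disconnect, run the script detached, e.g.:
#   nohup ./restore.sh > restore.log 2>&1 &
# (screen or tmux work just as well)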
yes, you can :)