Note: this page was originally written in pmWiki for the wiki of the Biomedia group.
It was converted to markdown using an HTML-to-markdown converter and a markdown editor.
See the official documentation for more info about SLURM.
The --wrap option allows you to pass one-liners to sbatch (see sbatch -h):

    --wrap[=command string] wrap command string in a sh script and submit
It can be used to call sbatch for each subject within a for loop:
for i in {0..54}
do
sbatch --mem=5G -c 10 --wrap="python leaveoneout_segmentation.py $i --n_jobs=10"
done
Alternatively, this could have been done with a job array: sbatch --array=0-54,
relying on the SLURM_ARRAY_TASK_ID environment variable inside each task to retrieve your parameter value.
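As a rough sketch of that approach (the batch script below and its filename are illustrative, not taken from the original page), the per-subject command can live in a small script:

```bash
#!/bin/bash
#SBATCH --mem=5G
#SBATCH -c 10

# SLURM sets SLURM_ARRAY_TASK_ID to the index of the current array task,
# so each task processes one subject.
python leaveoneout_segmentation.py ${SLURM_ARRAY_TASK_ID} --n_jobs=10
```

submitted once with sbatch --array=0-54 leaveoneout_array.sh (the script name is a placeholder).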
To submit jobs from Python, you can use a function such as:
import subprocess

__all__ = ["sbatch"]

def sbatch(cmd, mem=5, c=10, verbose=False, dryrun=False):
    sbatch_cmd = 'sbatch --mem=' + str(mem) + 'G -c ' + str(c) + ' --wrap="' + cmd + '"'
    if verbose or dryrun:
        print(sbatch_cmd)
    if dryrun:
        return
    # submit the job by running the sbatch command line through a shell
    subprocess.call(sbatch_cmd, shell=True)
Launch an interactive session through SLURM:
function ssh_slurm {
    ntasks=${1:-8}
    mem=${2:-32}
    ssh [email protected] -t "salloc -p interactive -c${ntasks} --mem=${mem}G srun -c${ntasks} --mem=${mem}G --pty /bin/bash"
}
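For example, assuming the function above is defined in your shell, the following requests an interactive session with 4 CPUs and 16 GB of memory (the values are just an illustration):

```bash
ssh_slurm 4 16
```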
Launch the IPython notebook web server, retrieve the name of the machine assigned to it, set up an SSH tunnel, and finally open a web browser:
function notebook {
    folder=${1:-/vol/medic02/users/kpk09}
    ncpus=${2:-8}
    ntasks=${3:-10}
    mem=${4:-32}
    port1=${5:-8888}
    port2=${6:-8890}
    # beware of [ -z "$PS1" ] && return at the beginning of your .bashrc
    # (hence .bash_setup)
    # see https://github.com/ipython/ipython/issues/2426/
    # for IPYTHONDIR and the crash on a network folder
    job_id=`echo "source ~/.bash_setup && sbatch -p interactive --mem=${mem}G -c${ncpus} --wrap='cd $folder && export IPYTHONDIR=/tmp && ipython notebook --no-browser --port=${port1}'" | ssh [email protected] "/bin/bash -s" | perl -ne '@a =split(" ");print \$a[-1]'`
    if [ -z "${job_id}" ]; then
        echo "sbatch failed"
        return
    fi
    echo "job id: ${job_id}"
    # sleep to give ipython the time to get started
    sleep 10
    node_id=`ssh [email protected] "squeue -u kpk09 -o %N -j ${job_id}" | tail -1`
    if [ -z "${node_id}" ]; then
        echo "could not obtain node id"
        return
    fi
    echo "node id: ${node_id}:${port1}"
    # setup SSH tunnel
    ssh -N -f -L localhost:${port2}:localhost:${port1} kpk09@${node_id}.doc.ic.ac.uk
    echo "url: http://localhost:${port2}/"
    chromium-browser http://localhost:${port2}/
}
On a Mac, you can replace the last line by:
open -a Safari http://localhost:${port2}/
Run scancel with or without arguments:
function scancel {
    ssh [email protected] "scancel $@"
}
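For example, assuming the wrapper above is defined locally, you can cancel a single job by its id or all of your jobs at once (the job id below is a placeholder):

```bash
scancel 123456     # cancel one job by its id (placeholder)
scancel -u kpk09   # cancel all jobs belonging to user kpk09
```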
A simple wrapper that passes the whole command line to sbatch:
function mysbatch {
    sbatch --mem=5G -c 10 --wrap="$*"
}
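For example, reusing the segmentation command from the loop above (subject index 3 is arbitrary):

```bash
mysbatch python leaveoneout_segmentation.py 3 --n_jobs=10
```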
Move all pending jobs (state PD) to the long partition:

for i in `squeue -u kpk09 | grep PD | perl -ne 's/\s+/ /g; @a=split(" ",$_); print $a[0]." ";'`;
do
    scontrol update JobId=$i Partition=long;
done