@satra
Created March 15, 2011 01:23
Runs a simple SPM job on an SGE cluster
"""Example nipype script that runs an SPM segmentation on an SGE
cluster by submitting jobs via qsub.
"""
import nipype.pipeline.engine as pe
import nipype.interfaces.matlab as mlab
import nipype.interfaces.spm as spm
import os
mlab.MatlabCommand.set_default_matlab_cmd("matlab -nodesktop -nosplash")
mlab.MatlabCommand.set_default_paths('/software/spm8')
segment = pe.Node(interface=spm.Segment(), name="segment")
# ensure that a folder named "data" exists in the current directory
# and that it contains a T1.nii file
segment.inputs.data = os.path.abspath("data/T1.nii")
segment.inputs.gm_output_type = [True, True, True]
segment.inputs.wm_output_type = [True, True, True]
smooth_gm = pe.Node(interface=spm.Smooth(), name="smooth_gm")
workflow = pe.Workflow(name="workflow_cleanup_test")
workflow.base_dir = os.getcwd()
workflow.connect(segment, 'native_gm_image', smooth_gm, 'in_files')
# -V exports the current environment variables to the jobs and
# -q selects the queue to submit to
# run the workflow - assumes qsub can be executed from the current
# machine and that the filesystem is shared between the current
# machine and the SGE cluster
workflow.run(plugin='SGE', plugin_args={'qsub_args': '-V -q testqueue'})
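The comment in the script notes that `data/T1.nii` must exist before the workflow runs. A minimal pre-flight check (a hypothetical helper, not part of nipype) can fail fast with a clear message instead of a mid-workflow SPM error:

```python
import os

def check_inputs(base_dir, required=("data/T1.nii",)):
    """Return the required input paths (relative to base_dir) that are missing."""
    return [p for p in required
            if not os.path.exists(os.path.join(base_dir, p))]

# Report missing inputs before submitting anything to the cluster.
missing = check_inputs(os.getcwd())
if missing:
    print("missing inputs: %s" % ", ".join(missing))
```

Calling `check_inputs` before `workflow.run` keeps the failure on the submitting machine rather than inside an SGE job.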