nipype.interfaces.ants.registration module

The ants module provides interfaces to the ANTs registration commands.

ANTS

Bases: ANTSCommand

Wrapped executable: ANTS.

ANTS wrapper for registration of images (old, use Registration instead)

Examples

>>> from nipype.interfaces.ants import ANTS
>>> ants = ANTS()
>>> ants.inputs.dimension = 3
>>> ants.inputs.output_transform_prefix = 'MY'
>>> ants.inputs.metric = ['CC']
>>> ants.inputs.fixed_image = ['T1.nii']
>>> ants.inputs.moving_image = ['resting.nii']
>>> ants.inputs.metric_weight = [1.0]
>>> ants.inputs.radius = [5]
>>> ants.inputs.transformation_model = 'SyN'
>>> ants.inputs.gradient_step_length = 0.25
>>> ants.inputs.number_of_iterations = [50, 35, 15]
>>> ants.inputs.use_histogram_matching = True
>>> ants.inputs.mi_option = [32, 16000]
>>> ants.inputs.regularization = 'Gauss'
>>> ants.inputs.regularization_gradient_field_sigma = 3
>>> ants.inputs.regularization_deformation_field_sigma = 0
>>> ants.inputs.number_of_affine_iterations = [10000,10000,10000,10000,10000]
>>> ants.cmdline
'ANTS 3 --MI-option 32x16000 --image-metric CC[ T1.nii, resting.nii, 1, 5 ] --number-of-affine-iterations 10000x10000x10000x10000x10000 --number-of-iterations 50x35x15 --output-naming MY --regularization Gauss[3.0,0.0] --transformation-model SyN[0.25] --use-Histogram-Matching 1'
fixed_image : a list of items which are a pathlike object or string representing an existing file

Image to which the moving image is warped.

metric : a list of items which are ‘CC’ or ‘MI’ or ‘SMI’ or ‘PR’ or ‘SSD’ or ‘MSQ’ or ‘PSE’

metric_weight : a list of items which are a float

The metric weight(s) for each stage. The weights must sum to 1 per stage. Requires inputs: metric. (Nipype default value: [1.0])

moving_image : a list of items which are a pathlike object or string representing an existing file

Image to apply transformation to (generally a coregistered functional). Maps to a command-line argument: %s.

output_transform_prefix : a unicode string

Maps to a command-line argument: --output-naming %s. (Nipype default value: out)

radius : a list of items which are an integer (int or long)

Radius of the region (i.e. number of layers around a voxel/pixel) that is used for computing cross correlation. Requires inputs: metric.

transformation_model : ‘Diff’ or ‘Elast’ or ‘Exp’ or ‘Greedy Exp’ or ‘SyN’

Maps to a command-line argument: %s.

affine_gradient_descent_option : a list of items which are a float

Maps to a command-line argument: %s.

args : a unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

delta_time : a float

Requires inputs: number_of_time_steps.

dimension : 3 or 2

Image dimension (2 or 3). Maps to a command-line argument: %d (position: 1).

environ : a dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

gradient_step_length : a float

Requires inputs: transformation_model.

mi_option : a list of items which are an integer (int or long)

Maps to a command-line argument: --MI-option %s.

num_threads : an integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

number_of_affine_iterations : a list of items which are an integer (int or long)

Maps to a command-line argument: --number-of-affine-iterations %s.

number_of_iterations : a list of items which are an integer (int or long)

Maps to a command-line argument: --number-of-iterations %s.

number_of_time_steps : an integer (int or long)

Requires inputs: gradient_step_length.

regularization : ‘Gauss’ or ‘DMFFD’

Maps to a command-line argument: %s.

regularization_deformation_field_sigma : a float

Requires inputs: regularization.

regularization_gradient_field_sigma : a float

Requires inputs: regularization.

smoothing_sigmas : a list of items which are an integer (int or long)

Maps to a command-line argument: --gaussian-smoothing-sigmas %s.

subsampling_factors : a list of items which are an integer (int or long)

Maps to a command-line argument: --subsampling-factors %s.

symmetry_type : a float

Requires inputs: delta_time.

use_histogram_matching : a boolean

Maps to a command-line argument: %s. (Nipype default value: True)

affine_transform : a pathlike object or string representing an existing file

Affine transform file.

inverse_warp_transform : a pathlike object or string representing an existing file

Inverse warping deformation field.

metaheader : a pathlike object or string representing an existing file

VTK metaheader .mhd file.

metaheader_raw : a pathlike object or string representing an existing file

VTK metaheader .raw file.

warp_transform : a pathlike object or string representing an existing file

Warping deformation field.

CompositeTransformUtil

Bases: ANTSCommand

Wrapped executable: CompositeTransformUtil.

ANTs utility which can combine or break apart transform files into their individual constituent components.

Examples

>>> from nipype.interfaces.ants import CompositeTransformUtil
>>> tran = CompositeTransformUtil()
>>> tran.inputs.process = 'disassemble'
>>> tran.inputs.in_file = 'output_Composite.h5'
>>> tran.cmdline
'CompositeTransformUtil --disassemble output_Composite.h5 transform'
>>> tran.run()  

example for assembling transformation files

>>> from nipype.interfaces.ants import CompositeTransformUtil
>>> tran = CompositeTransformUtil()
>>> tran.inputs.process = 'assemble'
>>> tran.inputs.out_file = 'my.h5'
>>> tran.inputs.in_file = ['AffineTransform.mat', 'DisplacementFieldTransform.nii.gz']
>>> tran.cmdline
'CompositeTransformUtil --assemble my.h5 AffineTransform.mat DisplacementFieldTransform.nii.gz '
>>> tran.run()  
in_file : a list of items which are a pathlike object or string representing an existing file

Input transform file(s). Maps to a command-line argument: %s... (position: 3).

args : a unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

environ : a dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

num_threads : an integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

out_file : a pathlike object or string representing a file

Output file path (only used for assembly). Maps to a command-line argument: %s (position: 2).

output_prefix : a unicode string

A prefix that is prepended to all output files (only used for disassembly). Maps to a command-line argument: %s (position: 4). (Nipype default value: transform)

process : ‘assemble’ or ‘disassemble’

What to do with the transform inputs (assemble or disassemble). Maps to a command-line argument: --%s (position: 1). (Nipype default value: assemble)

affine_transform : a pathlike object or string representing a file

Affine transform component.

displacement_field : a pathlike object or string representing a file

Displacement field component.

out_file : a pathlike object or string representing a file

Compound transformation file.

MeasureImageSimilarity

Bases: ANTSCommand

Wrapped executable: MeasureImageSimilarity.

Examples

>>> from nipype.interfaces.ants import MeasureImageSimilarity
>>> sim = MeasureImageSimilarity()
>>> sim.inputs.dimension = 3
>>> sim.inputs.metric = 'MI'
>>> sim.inputs.fixed_image = 'T1.nii'
>>> sim.inputs.moving_image = 'resting.nii'
>>> sim.inputs.metric_weight = 1.0
>>> sim.inputs.radius_or_number_of_bins = 5
>>> sim.inputs.sampling_strategy = 'Regular'
>>> sim.inputs.sampling_percentage = 1.0
>>> sim.inputs.fixed_image_mask = 'mask.nii'
>>> sim.inputs.moving_image_mask = 'mask.nii.gz'
>>> sim.cmdline
'MeasureImageSimilarity --dimensionality 3 --masks ["mask.nii","mask.nii.gz"] --metric MI["T1.nii","resting.nii",1.0,5,Regular,1.0]'
fixed_image : a pathlike object or string representing an existing file

Image to which the moving image is warped.

metric : ‘CC’ or ‘MI’ or ‘Mattes’ or ‘MeanSquares’ or ‘Demons’ or ‘GC’

Maps to a command-line argument: %s.

moving_image : a pathlike object or string representing an existing file

Image to apply transformation to (generally a coregistered functional).

radius_or_number_of_bins : an integer (int or long)

The number of bins in each stage for the MI and Mattes metric, or the radius for other metrics. Requires inputs: metric.

sampling_percentage : 0.0 <= a floating point number <= 1.0

Percentage of points accessible to the sampling strategy over which to optimize the metric. Requires inputs: metric.

args : a unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

dimension : 2 or 3 or 4

Dimensionality of the fixed/moving image pair. Maps to a command-line argument: --dimensionality %d (position: 1).

environ : a dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

fixed_image_mask : a pathlike object or string representing an existing file

Mask used to limit metric sampling region of the fixed image. Maps to a command-line argument: %s.

metric_weight : a float

The “metricWeight” variable is not used. Requires inputs: metric. (Nipype default value: 1.0)

moving_image_mask : a pathlike object or string representing an existing file

Mask used to limit metric sampling region of the moving image. Requires inputs: fixed_image_mask.

num_threads : an integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

sampling_strategy : ‘None’ or ‘Regular’ or ‘Random’

Manner of choosing point set over which to optimize the metric. Defaults to “None” (i.e. a dense sampling of one sample per voxel). Requires inputs: metric. (Nipype default value: None)

similarity : a float

MeasureImageSimilarity.aggregate_outputs(runtime=None, needed_outputs=None)

Collate expected outputs and apply output traits validation.
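
As a brief, hedged usage sketch (not part of the original doctests, and assuming ANTs is installed and the example images above exist on disk), the computed value is read from the similarity output after running the interface:

>>> res = sim.run()  
>>> res.outputs.similarity  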

Registration

Bases: ANTSCommand

Wrapped executable: antsRegistration.

ANTs Registration command for registration of images

antsRegistration registers a moving_image to a fixed_image, using a predefined (sequence of) cost function(s) and transformation operations. The cost function is defined using one or more ‘metrics’, specifically local cross-correlation (CC), Mean Squares (MeanSquares), Demons (Demons), global correlation (GC), or Mutual Information (Mattes or MI).

ANTS can use both linear (Translation, Rigid, Affine, CompositeAffine, or Similarity) and non-linear transformations (BSpline, GaussianDisplacementField, TimeVaryingVelocityField, TimeVaryingBSplineVelocityField, SyN, BSplineSyN, Exponential, or BSplineExponential). Registration is usually done in multiple stages: for example, first a Rigid, then an Affine, and finally a non-linear (SyN) transformation.

antsRegistration can be initialized using one or more transforms from moving_image to fixed_image via the initial_moving_transform input. For example, if you already have a warpfield that corrects for geometrical distortions in an EPI (functional) image and want to apply it before an Affine registration to a structural image, you can pass that transform in initial_moving_transform.

The Registration interface can output the resulting transform(s) that map moving_image to fixed_image either as a single composite_transform file (if write_composite_transform is set to True) or as a list of transforms in forward_transforms. It can also output inverse transforms (from fixed_image to moving_image) in a similar fashion using inverse_composite_transform. Note that forward_transforms is in ‘natural’ order: the first element should be applied first, the last element should be applied last.

Note, however, that ANTS tools always apply lists of transformations in reverse order (the last transformation in the list is applied first). Therefore, if the output forward_transforms is a list, one cannot feed it directly into, for example, ants.ApplyTransforms. To make ants.ApplyTransforms apply the transformations in the same order as ants.Registration, you have to provide the list of transformations in reverse order from forward_transforms. The reverse_forward_transforms output provides forward_transforms in reverse order and can be used for this purpose. Note also that, because composite_transform is always a single file, this output is preferred for most use cases.
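
As a brief, hedged sketch (not part of the original doctests), the reverse_forward_transforms output can be handed to ants.ApplyTransforms; this assumes reg is the Registration interface configured in the Examples below and that the referenced image files exist:

>>> from nipype.interfaces.ants import ApplyTransforms
>>> res = reg.run()  
>>> at = ApplyTransforms()
>>> at.inputs.input_image = 'moving1.nii'
>>> at.inputs.reference_image = 'fixed1.nii'
>>> # reverse_forward_transforms lists the transforms last-applied-first,
>>> # which is the order antsApplyTransforms expects
>>> at.inputs.transforms = res.outputs.reverse_forward_transforms  
>>> at.run()  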

More information can be found in the ANTS manual.

See below for some useful examples.

Examples

Set up a Registration node with some default settings. This node registers ‘moving1.nii’ to ‘fixed1.nii’ by first fitting a linear ‘Affine’ transformation and then a non-linear ‘SyN’ transformation, both using the Mutual Information cost metric.

The registration is initialized by first applying the (linear) transform trans.mat.

>>> import copy, pprint
>>> from nipype.interfaces.ants import Registration
>>> reg = Registration()
>>> reg.inputs.fixed_image = 'fixed1.nii'
>>> reg.inputs.moving_image = 'moving1.nii'
>>> reg.inputs.output_transform_prefix = "output_"
>>> reg.inputs.initial_moving_transform = 'trans.mat'
>>> reg.inputs.transforms = ['Affine', 'SyN']
>>> reg.inputs.transform_parameters = [(2.0,), (0.25, 3.0, 0.0)]
>>> reg.inputs.number_of_iterations = [[1500, 200], [100, 50, 30]]
>>> reg.inputs.dimension = 3
>>> reg.inputs.write_composite_transform = True
>>> reg.inputs.collapse_output_transforms = False
>>> reg.inputs.initialize_transforms_per_stage = False
>>> reg.inputs.metric = ['Mattes']*2
>>> reg.inputs.metric_weight = [1]*2 # Default (value ignored currently by ANTs)
>>> reg.inputs.radius_or_number_of_bins = [32]*2
>>> reg.inputs.sampling_strategy = ['Random', None]
>>> reg.inputs.sampling_percentage = [0.05, None]
>>> reg.inputs.convergence_threshold = [1.e-8, 1.e-9]
>>> reg.inputs.convergence_window_size = [20]*2
>>> reg.inputs.smoothing_sigmas = [[1,0], [2,1,0]]
>>> reg.inputs.sigma_units = ['vox'] * 2
>>> reg.inputs.shrink_factors = [[2,1], [3,2,1]]
>>> reg.inputs.use_estimate_learning_rate_once = [True, True]
>>> reg.inputs.use_histogram_matching = [True, True] # This is the default
>>> reg.inputs.output_warped_image = 'output_warped_image.nii.gz'
>>> reg.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --initial-moving-transform [ trans.mat, 0 ] --initialize-transforms-per-stage 0 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 1'
>>> reg.run()  

Same as the previous example, but the initial transform (‘trans.mat’) is now inverted before it is applied, and the lower image intensities are winsorized.

>>> reg.inputs.invert_initial_moving_transform = True
>>> reg1 = copy.deepcopy(reg)
>>> reg1.inputs.winsorize_lower_quantile = 0.025
>>> reg1.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 0 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.025, 1.0 ]  --write-composite-transform 1'
>>> reg1.run()  

Clip extremely high intensity data points using winsorize_upper_quantile. All data points higher than the 0.975 quantile are set to the value of the 0.975 quantile.

>>> reg2 = copy.deepcopy(reg)
>>> reg2.inputs.winsorize_upper_quantile = 0.975
>>> reg2.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 0 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 0.975 ]  --write-composite-transform 1'

Clip extremely low intensity data points using winsorize_lower_quantile. All data points lower than the 0.025 quantile are set to the original value at the 0.025 quantile.

>>> reg3 = copy.deepcopy(reg)
>>> reg3.inputs.winsorize_lower_quantile = 0.025
>>> reg3.inputs.winsorize_upper_quantile = 0.975
>>> reg3.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 0 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.025, 0.975 ]  --write-composite-transform 1'

Use float instead of double for computations (reduces memory usage).

>>> reg3a = copy.deepcopy(reg)
>>> reg3a.inputs.float = True
>>> reg3a.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --float 1 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 0 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 1'

Force the use of double instead of float for computations (more precision, higher memory usage).

>>> reg3b = copy.deepcopy(reg)
>>> reg3b.inputs.float = False
>>> reg3b.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --float 0 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 0 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 1'

‘collapse_output_transforms’ can be used to put all transformations into a single ‘composite_transform’ file. Note that forward_transforms will then be an empty list.

>>> # Test collapse transforms flag
>>> reg4 = copy.deepcopy(reg)
>>> reg4.inputs.save_state = 'trans.mat'
>>> reg4.inputs.restore_state = 'trans.mat'
>>> reg4.inputs.initialize_transforms_per_stage = True
>>> reg4.inputs.collapse_output_transforms = True
>>> outputs = reg4._list_outputs()
>>> pprint.pprint(outputs)  
{'composite_transform': '...data/output_Composite.h5',
 'elapsed_time': <undefined>,
 'forward_invert_flags': [],
 'forward_transforms': [],
 'inverse_composite_transform': '...data/output_InverseComposite.h5',
 'inverse_warped_image': <undefined>,
 'metric_value': <undefined>,
 'reverse_forward_invert_flags': [],
 'reverse_forward_transforms': [],
 'reverse_invert_flags': [],
 'reverse_transforms': [],
 'save_state': '...data/trans.mat',
 'warped_image': '...data/output_warped_image.nii.gz'}
>>> reg4.cmdline
'antsRegistration --collapse-output-transforms 1 --dimensionality 3 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 1 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --restore-state trans.mat --save-state trans.mat --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 1'
>>> # Test collapse transforms flag
>>> reg4b = copy.deepcopy(reg4)
>>> reg4b.inputs.write_composite_transform = False
>>> outputs = reg4b._list_outputs()
>>> pprint.pprint(outputs)  
{'composite_transform': <undefined>,
 'elapsed_time': <undefined>,
 'forward_invert_flags': [False, False],
 'forward_transforms': ['...data/output_0GenericAffine.mat',
 '...data/output_1Warp.nii.gz'],
 'inverse_composite_transform': <undefined>,
 'inverse_warped_image': <undefined>,
 'metric_value': <undefined>,
 'reverse_forward_invert_flags': [False, False],
 'reverse_forward_transforms': ['...data/output_1Warp.nii.gz',
 '...data/output_0GenericAffine.mat'],
 'reverse_invert_flags': [True, False],
 'reverse_transforms': ['...data/output_0GenericAffine.mat',     '...data/output_1InverseWarp.nii.gz'],
 'save_state': '...data/trans.mat',
 'warped_image': '...data/output_warped_image.nii.gz'}
>>> reg4b.aggregate_outputs()  
>>> reg4b.cmdline
'antsRegistration --collapse-output-transforms 1 --dimensionality 3 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 1 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --restore-state trans.mat --save-state trans.mat --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 0'

One can use multiple similarity metrics in a single registration stage. The Node below first performs a linear registration using only the Mutual Information (‘Mattes’) metric. In a second stage, it performs a non-linear registration (‘SyN’) using both a Mutual Information and a local cross-correlation (‘CC’) metric. Both metrics are weighted equally (‘metric_weight’ is .5 for both). The Mutual Information metric uses 32 bins. The local cross-correlation (correlation between every voxel’s neighborhood) is computed with a radius of 4.

>>> # Test multiple metrics per stage
>>> reg5 = copy.deepcopy(reg)
>>> reg5.inputs.fixed_image = 'fixed1.nii'
>>> reg5.inputs.moving_image = 'moving1.nii'
>>> reg5.inputs.metric = ['Mattes', ['Mattes', 'CC']]
>>> reg5.inputs.metric_weight = [1, [.5,.5]]
>>> reg5.inputs.radius_or_number_of_bins = [32, [32, 4] ]
>>> reg5.inputs.sampling_strategy = ['Random', None] # use default strategy in second stage
>>> reg5.inputs.sampling_percentage = [0.05, [0.05, 0.10]]
>>> reg5.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 0 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 0.5, 32, None, 0.05 ] --metric CC[ fixed1.nii, moving1.nii, 0.5, 4, None, 0.1 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 1'

ANTS Registration can also use multiple modalities to perform the registration. Here it is assumed that fixed1.nii and fixed2.nii are in the same space, and so are moving1.nii and moving2.nii. First, a linear registration is performed matching fixed1.nii to moving1.nii, then a non-linear registration is performed to match fixed2.nii to moving2.nii, starting from the transformation of the first step.

>>> # Test multiple inputs
>>> reg6 = copy.deepcopy(reg5)
>>> reg6.inputs.fixed_image = ['fixed1.nii', 'fixed2.nii']
>>> reg6.inputs.moving_image = ['moving1.nii', 'moving2.nii']
>>> reg6.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 0 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 0.5, 32, None, 0.05 ] --metric CC[ fixed2.nii, moving2.nii, 0.5, 4, None, 0.1 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 1'

Different methods can be used for the interpolation when applying transformations.

>>> # Test Interpolation Parameters (BSpline)
>>> reg7a = copy.deepcopy(reg)
>>> reg7a.inputs.interpolation = 'BSpline'
>>> reg7a.inputs.interpolation_parameters = (3,)
>>> reg7a.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 0 --interpolation BSpline[ 3 ] --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 1'
>>> # Test Interpolation Parameters (MultiLabel/Gaussian)
>>> reg7b = copy.deepcopy(reg)
>>> reg7b.inputs.interpolation = 'Gaussian'
>>> reg7b.inputs.interpolation_parameters = (1.0, 1.0)
>>> reg7b.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 0 --interpolation Gaussian[ 1.0, 1.0 ] --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 1'

BSplineSyN non-linear registration with custom parameters.

>>> # Test Extended Transform Parameters
>>> reg8 = copy.deepcopy(reg)
>>> reg8.inputs.transforms = ['Affine', 'BSplineSyN']
>>> reg8.inputs.transform_parameters = [(2.0,), (0.25, 26, 0, 3)]
>>> reg8.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 0 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform BSplineSyN[ 0.25, 26, 0, 3 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 1'

Mask the fixed image in the second stage of the registration (but not the first).

>>> # Test masking
>>> reg9 = copy.deepcopy(reg)
>>> reg9.inputs.fixed_image_masks = ['NULL', 'fixed1.nii']
>>> reg9.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 0 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --masks [ NULL, NULL ] --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --masks [ fixed1.nii, NULL ] --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 1'

Here we use both a warpfield and a linear transformation, before registration commences. Note that the first transformation that needs to be applied (‘ants_Warp.nii.gz’) is last in the list of ‘initial_moving_transform’.

>>> # Test initialization with multiple transforms matrices (e.g., unwarp and affine transform)
>>> reg10 = copy.deepcopy(reg)
>>> reg10.inputs.initial_moving_transform = ['func_to_struct.mat', 'ants_Warp.nii.gz']
>>> reg10.inputs.invert_initial_moving_transform = [False, False]
>>> reg10.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --initial-moving-transform [ func_to_struct.mat, 0 ] [ ants_Warp.nii.gz, 0 ] --initialize-transforms-per-stage 0 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 1'
fixed_image : a list of items which are a pathlike object or string representing an existing file

Image to which the moving_image should be transformed (usually a structural image).

metric : a list of items which are ‘CC’ or ‘MeanSquares’ or ‘Demons’ or ‘GC’ or ‘MI’ or ‘Mattes’ or a list of items which are ‘CC’ or ‘MeanSquares’ or ‘Demons’ or ‘GC’ or ‘MI’ or ‘Mattes’

The metric(s) to use for each stage. Note that multiple metrics per stage are not supported in ANTS 1.9.1 and earlier.

metric_weight : a list of items which are a float or a list of items which are a float

The metric weight(s) for each stage. The weights must sum to 1 per stage. Requires inputs: metric. (Nipype default value: [1.0])

moving_image : a list of items which are a pathlike object or string representing an existing file

Image that will be registered to the space of fixed_image. This is the image to which the transformations will be applied.

shrink_factors : a list of items which are a list of items which are an integer (int or long)

smoothing_sigmas : a list of items which are a list of items which are a float

transforms : a list of items which are ‘Rigid’ or ‘Affine’ or ‘CompositeAffine’ or ‘Similarity’ or ‘Translation’ or ‘BSpline’ or ‘GaussianDisplacementField’ or ‘TimeVaryingVelocityField’ or ‘TimeVaryingBSplineVelocityField’ or ‘SyN’ or ‘BSplineSyN’ or ‘Exponential’ or ‘BSplineExponential’

Maps to a command-line argument: %s.

args : a unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

collapse_output_transforms : a boolean

Collapse output transforms. Specifically, enabling this option combines all adjacent linear transforms and composes all adjacent displacement field transforms before writing the results to disk. Maps to a command-line argument: --collapse-output-transforms %d. (Nipype default value: True)

convergence_threshold : a list of at least 1 items which are a float

Requires inputs: number_of_iterations. (Nipype default value: [1e-06])

convergence_window_size : a list of at least 1 items which are an integer (int or long)

Requires inputs: convergence_threshold. (Nipype default value: [10])

dimension : 3 or 2

Image dimension (2 or 3). Maps to a command-line argument: --dimensionality %d. (Nipype default value: 3)

environ : a dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

fixed_image_mask : a pathlike object or string representing an existing file

Mask used to limit metric sampling region of the fixed image in all stages. Maps to a command-line argument: %s. Mutually exclusive with inputs: fixed_image_masks.

fixed_image_masks : a list of items which are a pathlike object or string representing an existing file or ‘NULL’

Masks used to limit metric sampling region of the fixed image, defined per registration stage (use “NULL” to omit a mask at a given stage). Mutually exclusive with inputs: fixed_image_mask.

float : a boolean

Use float instead of double for computations. Maps to a command-line argument: --float %d.

initial_moving_transform : a list of items which are a pathlike object or string representing an existing file

A transform or a list of transforms that should be applied before the registration begins. Note that, when a list is given, the transformations are applied in reverse order. Maps to a command-line argument: %s. Mutually exclusive with inputs: initial_moving_transform_com.

initial_moving_transform_com : 0 or 1 or 2

Align the moving_image and fixed_image before registration using the geometric center of the images (=0), the image intensities (=1), or the origin of the images (=2). Maps to a command-line argument: %s. Mutually exclusive with inputs: initial_moving_transform.
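
A brief, hedged sketch (not from the original docstring) of center-of-mass initialization; the remaining stage inputs (transforms, metrics, iterations, and so on) would still have to be configured as in the Examples above:

>>> reg_com = Registration()
>>> reg_com.inputs.fixed_image = 'fixed1.nii'
>>> reg_com.inputs.moving_image = 'moving1.nii'
>>> # 1 = align the centers of mass of the image intensities before registration
>>> reg_com.inputs.initial_moving_transform_com = 1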

initialize_transforms_per_stage : a boolean

Initialize linear transforms from the previous stage. By enabling this option, the current linear stage transform is directly initialized from the previous stage’s linear transform; this allows multiple linear stages to be run where each stage directly updates the estimated linear transform from the previous stage (e.g. Translation -> Rigid -> Affine). Maps to a command-line argument: --initialize-transforms-per-stage %d. (Nipype default value: False)

interpolation : ‘Linear’ or ‘NearestNeighbor’ or ‘CosineWindowedSinc’ or ‘WelchWindowedSinc’ or ‘HammingWindowedSinc’ or ‘LanczosWindowedSinc’ or ‘BSpline’ or ‘MultiLabel’ or ‘Gaussian’

Maps to a command-line argument: %s. (Nipype default value: Linear)

interpolation_parameters : a tuple of the form: (an integer (int or long)) or a tuple of the form: (a float, a float)

invert_initial_moving_transform : a list of items which are a boolean

One boolean or a list of booleans that indicate whether the inverse(s) of the transform(s) defined in initial_moving_transform should be used. Mutually exclusive with inputs: initial_moving_transform_com. Requires inputs: initial_moving_transform.

metric_item_trait : ‘CC’ or ‘MeanSquares’ or ‘Demons’ or ‘GC’ or ‘MI’ or ‘Mattes’

metric_stage_trait : ‘CC’ or ‘MeanSquares’ or ‘Demons’ or ‘GC’ or ‘MI’ or ‘Mattes’ or a list of items which are ‘CC’ or ‘MeanSquares’ or ‘Demons’ or ‘GC’ or ‘MI’ or ‘Mattes’

metric_weight_item_trait : a float

(Nipype default value: 1.0)

metric_weight_stage_trait : a float or a list of items which are a float

moving_image_mask : a pathlike object or string representing an existing file

Mask used to limit metric sampling region of the moving image in all stages. Mutually exclusive with inputs: moving_image_masks. Requires inputs: fixed_image_mask.

moving_image_masks : a list of items which are a pathlike object or string representing an existing file or ‘NULL’

Masks used to limit metric sampling region of the moving image, defined per registration stage (use “NULL” to omit a mask at a given stage). Mutually exclusive with inputs: moving_image_mask.

num_threads : an integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

number_of_iterations : a list of items which are a list of items which are an integer (int or long)

output_inverse_warped_image : a boolean or a pathlike object or string representing a file

Requires inputs: output_warped_image.

output_transform_prefix : a unicode string

Maps to a command-line argument: %s. (Nipype default value: transform)

output_warped_image : a boolean or a pathlike object or string representing a file

radius_bins_item_trait : an integer (int or long)

(Nipype default value: 5)

radius_bins_stage_trait : an integer (int or long) or a list of items which are an integer (int or long)

radius_or_number_of_bins : a list of items which are an integer (int or long) or a list of items which are an integer (int or long)

The number of bins in each stage for the MI and Mattes metric, or the radius for other metrics. Requires inputs: metric_weight. (Nipype default value: [5])

restore_state : a pathlike object or string representing an existing file

Filename for restoring the internal restorable state of the registration. Maps to a command-line argument: --restore-state %s.

restrict_deformation : a list of items which are a list of items which are 0 or 1

This option allows the user to restrict the optimization of the displacement field, translation, rigid or affine transform on a per-component basis. For example, if one wants to limit the deformation or rotation of a 3-D volume to the first two dimensions, this is possible by specifying a weight vector of ‘1x1x0’ for a deformation field or ‘1x1x0x1x1x0’ for a rigid transformation. Low-dimensional restriction only works if there are no preceding transformations.
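
For illustration, a minimal, hedged sketch (not in the original docstring) of restricting both stages of the reg example above to the first two components; the required length of each weight vector depends on the transform type of the corresponding stage:

>>> reg_restrict = copy.deepcopy(reg)
>>> # one weight list per stage; 0 disables optimization along that component
>>> reg_restrict.inputs.restrict_deformation = [[1, 1, 0], [1, 1, 0]]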

sampling_percentage : a list of items which are 0.0 <= a floating point number <= 1.0 or None or a list of items which are 0.0 <= a floating point number <= 1.0 or None

The metric sampling percentage(s) to use for each stage. Requires inputs: sampling_strategy.

sampling_percentage_item_trait : 0.0 <= a floating point number <= 1.0 or None

sampling_percentage_stage_trait : 0.0 <= a floating point number <= 1.0 or None or a list of items which are 0.0 <= a floating point number <= 1.0 or None

sampling_strategy : a list of items which are ‘None’ or ‘Regular’ or ‘Random’ or None or a list of items which are ‘None’ or ‘Regular’ or ‘Random’ or None

The metric sampling strategy (strategies) for each stage. Requires inputs: metric_weight.

sampling_strategy_item_trait : ‘None’ or ‘Regular’ or ‘Random’ or None

sampling_strategy_stage_trait : ‘None’ or ‘Regular’ or ‘Random’ or None or a list of items which are ‘None’ or ‘Regular’ or ‘Random’ or None

save_state : a pathlike object or string representing a file

Filename for saving the internal restorable state of the registration. Maps to a command-line argument: --save-state %s.

sigma_units : a list of items which are ‘mm’ or ‘vox’

Units for smoothing sigmas. Requires inputs: smoothing_sigmas.

transform_parameters : a list of items which are a tuple of the form: (a float) or a tuple of the form: (a float, a float, a float) or a tuple of the form: (a float, an integer (int or long), an integer (int or long), an integer (int or long)) or a tuple of the form: (a float, an integer (int or long), a float, a float, a float, a float) or a tuple of the form: (a float, a float, a float, an integer (int or long)) or a tuple of the form: (a float, an integer (int or long), an integer (int or long), an integer (int or long), an integer (int or long))

use_estimate_learning_rate_once : a list of items which are a boolean

use_histogram_matching : a boolean or a list of items which are a boolean

Histogram match the images before registration. (Nipype default value: True)

verbose : a boolean

Maps to a command-line argument: -v. (Nipype default value: False)

winsorize_lower_quantile : 0.0 <= a floating point number <= 1.0

The lower quantile to clip image ranges. Maps to a command-line argument: %s. (Nipype default value: 0.0)

winsorize_upper_quantile : 0.0 <= a floating point number <= 1.0

The upper quantile to clip image ranges. Maps to a command-line argument: %s. (Nipype default value: 1.0)

write_composite_transform : a boolean

Maps to a command-line argument: --write-composite-transform %d. (Nipype default value: False)

composite_transform : a pathlike object or string representing an existing file

Composite transform file.

elapsed_time : a float

The total elapsed time as reported by ANTs.

forward_invert_flags : a list of items which are a boolean

List of flags corresponding to the forward transforms.

forward_transforms : a list of items which are a pathlike object or string representing an existing file

List of output transforms for forward registration.

inverse_composite_transform : a pathlike object or string representing a file

Inverse composite transform file.

inverse_warped_image : a pathlike object or string representing a file

Outputs the inverse of the warped image.

metric_value : a float

The final value of metric.

reverse_forward_invert_flags : a list of items which are a boolean

List of flags corresponding to the forward transforms reversed for antsApplyTransform.

reverse_forward_transforms : a list of items which are a pathlike object or string representing an existing file

List of output transforms for forward registration reversed for antsApplyTransform.

reverse_invert_flags : a list of items which are a boolean

List of flags corresponding to the reverse transforms.

reverse_transforms : a list of items which are a pathlike object or string representing an existing file

List of output transforms for reverse registration.

save_state : a pathlike object or string representing a file

The saved registration state to be restored.

warped_image : a pathlike object or string representing a file

Outputs warped image.

Registration.DEF_SAMPLING_STRATEGY = 'None'

The default sampling strategy argument.

RegistrationSynQuick

Bases: ANTSCommand

Wrapped executable: antsRegistrationSyNQuick.sh.

Registration using a symmetric image normalization method (SyN). You can read more in Avants et al.; Med Image Anal., 2008 (https://www.ncbi.nlm.nih.gov/pubmed/17659998).

Examples

>>> from nipype.interfaces.ants import RegistrationSynQuick
>>> reg = RegistrationSynQuick()
>>> reg.inputs.fixed_image = 'fixed1.nii'
>>> reg.inputs.moving_image = 'moving1.nii'
>>> reg.inputs.num_threads = 2
>>> reg.cmdline
'antsRegistrationSyNQuick.sh -d 3 -f fixed1.nii -r 32 -m moving1.nii -n 2 -o transform -p d -s 26 -t s'
>>> reg.run()  

example for multiple images

>>> from nipype.interfaces.ants import RegistrationSynQuick
>>> reg = RegistrationSynQuick()
>>> reg.inputs.fixed_image = ['fixed1.nii', 'fixed2.nii']
>>> reg.inputs.moving_image = ['moving1.nii', 'moving2.nii']
>>> reg.inputs.num_threads = 2
>>> reg.cmdline
'antsRegistrationSyNQuick.sh -d 3 -f fixed1.nii -f fixed2.nii -r 32 -m moving1.nii -m moving2.nii -n 2 -o transform -p d -s 26 -t s'
>>> reg.run()  
fixed_image : a list of items which are a pathlike object or string representing an existing file

Fixed image or source image or reference image. Maps to a command-line argument: -f %s....

moving_image : a list of items which are a pathlike object or string representing an existing file

Moving image or target image. Maps to a command-line argument: -m %s....

args : a unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

dimension : 3 or 2

Image dimension (2 or 3). Maps to a command-line argument: -d %d. (Nipype default value: 3)

environ : a dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

histogram_bins : an integer (int or long)

Histogram bins for mutual information in SyN stage (default = 32). Maps to a command-line argument: -r %d. (Nipype default value: 32)

num_threads : an integer (int or long)

Number of threads (default = 1). Maps to a command-line argument: -n %d. (Nipype default value: 1)

output_prefix : a unicode string

A prefix that is prepended to all output files. Maps to a command-line argument: -o %s. (Nipype default value: transform)

precision_type : ‘double’ or ‘float’

Precision type (default = double). Maps to a command-line argument: -p %s. (Nipype default value: double)

spline_distance : an integer (int or long)

Spline distance for deformable B-spline SyN transform (default = 26). Maps to a command-line argument: -s %d. (Nipype default value: 26)

transform_type : ‘s’ or ‘t’ or ‘r’ or ‘a’ or ‘sr’ or ‘b’ or ‘br’

Transform type

  • t: translation

  • r: rigid

  • a: rigid + affine

  • s: rigid + affine + deformable syn (default)

  • sr: rigid + deformable syn

  • b: rigid + affine + deformable b-spline syn

  • br: rigid + deformable b-spline syn

Maps to a command-line argument: -t %s. (Nipype default value: s)
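
As a hedged illustration (not one of the original doctests), selecting the B-spline SyN pipeline with a custom spline distance could look like this:

>>> reg_bspline = RegistrationSynQuick()
>>> reg_bspline.inputs.fixed_image = 'fixed1.nii'
>>> reg_bspline.inputs.moving_image = 'moving1.nii'
>>> reg_bspline.inputs.transform_type = 'b'  # rigid + affine + deformable b-spline syn
>>> reg_bspline.inputs.spline_distance = 40  # hypothetical value; the default is 26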

use_histogram_matching : a boolean

Use histogram matching. Maps to a command-line argument: -j %d.

forward_warp_field : a pathlike object or string representing an existing file

Forward warp field.

inverse_warp_field : a pathlike object or string representing an existing file

Inverse warp field.

inverse_warped_image : a pathlike object or string representing an existing file

Inverse warped image.

out_matrix : a pathlike object or string representing an existing file

Affine matrix.

warped_image : a pathlike object or string representing an existing file

Warped image.