nipype.interfaces.ants.segmentation module

Wrappers for segmentation utilities within ANTs.

AntsJointFusion


alias of JointFusion
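
Since AntsJointFusion is only an alias, both names resolve to the same interface class. A minimal check, offered as a sketch (it assumes both names are importable from this module):

>>> from nipype.interfaces.ants.segmentation import AntsJointFusion, JointFusion
>>> AntsJointFusion is JointFusion
True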

Atropos


Bases: ANTSCommand

Wrapped executable: Atropos.

A multivariate n-class segmentation algorithm.

A finite mixture modeling (FMM) segmentation approach with possibilities for specifying prior constraints. These prior constraints include the specification of a prior label image, prior probability images (one for each class), and/or an MRF prior to enforce spatial smoothing of the labels. Similar algorithms include FAST and SPM.

Examples

>>> from nipype.interfaces.ants import Atropos
>>> at = Atropos(
...     dimension=3, intensity_images='structural.nii', mask_image='mask.nii',
...     number_of_tissue_classes=2, likelihood_model='Gaussian', save_posteriors=True,
...     mrf_smoothing_factor=0.2, mrf_radius=[1, 1, 1], icm_use_synchronous_update=True,
...     maximum_number_of_icm_terations=1, n_iterations=5, convergence_threshold=0.000001,
...     posterior_formulation='Socrates', use_mixture_model_proportions=True)
>>> at.inputs.initialization = 'Random'
>>> at.cmdline
'Atropos --image-dimensionality 3 --icm [1,1]
--initialization Random[2] --intensity-image structural.nii
--likelihood-model Gaussian --mask-image mask.nii --mrf [0.2,1x1x1] --convergence [5,1e-06]
--output [structural_labeled.nii,POSTERIOR_%02d.nii.gz] --posterior-formulation Socrates[1]
--use-random-seed 1'
>>> at = Atropos(
...     dimension=3, intensity_images='structural.nii', mask_image='mask.nii',
...     number_of_tissue_classes=2, likelihood_model='Gaussian', save_posteriors=True,
...     mrf_smoothing_factor=0.2, mrf_radius=[1, 1, 1], icm_use_synchronous_update=True,
...     maximum_number_of_icm_terations=1, n_iterations=5, convergence_threshold=0.000001,
...     posterior_formulation='Socrates', use_mixture_model_proportions=True)
>>> at.inputs.initialization = 'KMeans'
>>> at.inputs.kmeans_init_centers = [100, 200]
>>> at.cmdline
'Atropos --image-dimensionality 3 --icm [1,1]
--initialization KMeans[2,100,200] --intensity-image structural.nii
--likelihood-model Gaussian --mask-image mask.nii --mrf [0.2,1x1x1] --convergence [5,1e-06]
--output [structural_labeled.nii,POSTERIOR_%02d.nii.gz] --posterior-formulation Socrates[1]
--use-random-seed 1'
>>> at = Atropos(
...     dimension=3, intensity_images='structural.nii', mask_image='mask.nii',
...     number_of_tissue_classes=2, likelihood_model='Gaussian', save_posteriors=True,
...     mrf_smoothing_factor=0.2, mrf_radius=[1, 1, 1], icm_use_synchronous_update=True,
...     maximum_number_of_icm_terations=1, n_iterations=5, convergence_threshold=0.000001,
...     posterior_formulation='Socrates', use_mixture_model_proportions=True)
>>> at.inputs.initialization = 'PriorProbabilityImages'
>>> at.inputs.prior_image = 'BrainSegmentationPrior%02d.nii.gz'
>>> at.inputs.prior_weighting = 0.8
>>> at.inputs.prior_probability_threshold = 0.0000001
>>> at.cmdline
'Atropos --image-dimensionality 3 --icm [1,1]
--initialization PriorProbabilityImages[2,BrainSegmentationPrior%02d.nii.gz,0.8,1e-07]
--intensity-image structural.nii --likelihood-model Gaussian --mask-image mask.nii
--mrf [0.2,1x1x1] --convergence [5,1e-06]
--output [structural_labeled.nii,POSTERIOR_%02d.nii.gz]
--posterior-formulation Socrates[1] --use-random-seed 1'
>>> at = Atropos(
...     dimension=3, intensity_images='structural.nii', mask_image='mask.nii',
...     number_of_tissue_classes=2, likelihood_model='Gaussian', save_posteriors=True,
...     mrf_smoothing_factor=0.2, mrf_radius=[1, 1, 1], icm_use_synchronous_update=True,
...     maximum_number_of_icm_terations=1, n_iterations=5, convergence_threshold=0.000001,
...     posterior_formulation='Socrates', use_mixture_model_proportions=True)
>>> at.inputs.initialization = 'PriorLabelImage'
>>> at.inputs.prior_image = 'segmentation0.nii.gz'
>>> at.inputs.number_of_tissue_classes = 2
>>> at.inputs.prior_weighting = 0.8
>>> at.cmdline
'Atropos --image-dimensionality 3 --icm [1,1]
--initialization PriorLabelImage[2,segmentation0.nii.gz,0.8] --intensity-image structural.nii
--likelihood-model Gaussian --mask-image mask.nii --mrf [0.2,1x1x1] --convergence [5,1e-06]
--output [structural_labeled.nii,POSTERIOR_%02d.nii.gz] --posterior-formulation Socrates[1]
--use-random-seed 1'
initialization : ‘Random’ or ‘Otsu’ or ‘KMeans’ or ‘PriorProbabilityImages’ or ‘PriorLabelImage’

Maps to a command-line argument: %s. Requires inputs: number_of_tissue_classes.

intensity_images : a list of items which are a pathlike object or string representing an existing file

Maps to a command-line argument: --intensity-image %s....

mask_image : a pathlike object or string representing an existing file

Maps to a command-line argument: --mask-image %s.

number_of_tissue_classes : an integer

args : a string

Additional parameters to the command. Maps to a command-line argument: %s.

convergence_threshold : a float

Requires inputs: n_iterations.

dimension : 3 or 2 or 4

Image dimension (2, 3, or 4). Maps to a command-line argument: --image-dimensionality %d. (Nipype default value: 3)

environ : a dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

icm_use_synchronous_update : a boolean

Maps to a command-line argument: %s.

kmeans_init_centers : a list of at least 1 items which are an integer or a float

likelihood_model : a string

Maps to a command-line argument: --likelihood-model %s.

maximum_number_of_icm_terations : an integer

Requires inputs: icm_use_synchronous_update.

mrf_radius : a list of items which are an integer

Requires inputs: mrf_smoothing_factor.

mrf_smoothing_factor : a float

Maps to a command-line argument: %s.

n_iterations : an integer

Maps to a command-line argument: %s.

num_threads : an integer

Number of ITK threads to use. (Nipype default value: 1)

out_classified_image_name : a pathlike object or string representing a file

Maps to a command-line argument: %s.

output_posteriors_name_template : a string

(Nipype default value: POSTERIOR_%02d.nii.gz)

posterior_formulation : a string

Maps to a command-line argument: %s.

prior_image : a pathlike object or string representing an existing file or a string

Either a string pattern (e.g., ‘prior%02d.nii’) or an existing vector-image file.

prior_probability_threshold : a float

Requires inputs: prior_weighting.

prior_weighting : a float

save_posteriors : a boolean

use_mixture_model_proportions : a boolean

Requires inputs: posterior_formulation.

use_random_seed : a boolean

Use random seed value over constant. Maps to a command-line argument: --use-random-seed %d. (Nipype default value: True)

classified_image : a pathlike object or string representing an existing file

posteriors : a list of items which are a pathlike object or string representing a file
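
The doctests above only render the command line. A minimal sketch of actually executing the interface and collecting its outputs (assuming ANTs is installed and the input files exist in the working directory):

>>> at = Atropos(dimension=3, intensity_images='structural.nii', mask_image='mask.nii',
...              number_of_tissue_classes=2, initialization='KMeans',
...              kmeans_init_centers=[100, 200], save_posteriors=True)
>>> res = at.run()  # doctest: +SKIP
>>> res.outputs.classified_image  # doctest: +SKIP
>>> res.outputs.posteriors  # doctest: +SKIP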

BrainExtraction


Bases: ANTSCommand

Wrapped executable: antsBrainExtraction.sh.

Atlas-based brain extraction.

Examples

>>> from nipype.interfaces.ants.segmentation import BrainExtraction
>>> brainextraction = BrainExtraction()
>>> brainextraction.inputs.dimension = 3
>>> brainextraction.inputs.anatomical_image ='T1.nii.gz'
>>> brainextraction.inputs.brain_template = 'study_template.nii.gz'
>>> brainextraction.inputs.brain_probability_mask ='ProbabilityMaskOfStudyTemplate.nii.gz'
>>> brainextraction.cmdline
'antsBrainExtraction.sh -a T1.nii.gz -m ProbabilityMaskOfStudyTemplate.nii.gz
-e study_template.nii.gz -d 3 -s nii.gz -o highres001_'
anatomical_image : a pathlike object or string representing an existing file

Structural image, typically T1. If more than one anatomical image is specified, subsequently specified images are used during the segmentation process. However, only the first image is used in the registration of priors. Our suggestion would be to specify the T1 as the first image. Anatomical template created using e.g. LPBA40 data set with buildtemplateparallel.sh in ANTs. Maps to a command-line argument: -a %s.

brain_probability_mask : a pathlike object or string representing an existing file

Brain probability mask created using e.g. LPBA40 data set which have brain masks defined, and warped to anatomical template and averaged resulting in a probability image. Maps to a command-line argument: -m %s.

brain_template : a pathlike object or string representing an existing file

Anatomical template created using e.g. LPBA40 data set with buildtemplateparallel.sh in ANTs. Maps to a command-line argument: -e %s.

args : a string

Additional parameters to the command. Maps to a command-line argument: %s.

debug : a boolean

If > 0, runs a faster version of the script. Only for testing. Implies -u 0. Requires single thread computation for complete reproducibility. Maps to a command-line argument: -z 1.

dimension : 3 or 2

Image dimension (2 or 3). Maps to a command-line argument: -d %d. (Nipype default value: 3)

environ : a dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

extraction_registration_mask : a pathlike object or string representing an existing file

Mask (defined in the template space) used during registration for brain extraction. To limit the metric computation to a specific region. Maps to a command-line argument: -f %s.

image_suffix : a string

Any of standard ITK formats, nii.gz is default. Maps to a command-line argument: -s %s. (Nipype default value: nii.gz)

keep_temporary_files : an integer

Keep brain extraction/segmentation warps, etc (default = 0). Maps to a command-line argument: -k %d.

num_threads : an integer

Number of ITK threads to use. (Nipype default value: 1)

out_prefix : a string

Prefix that is prepended to all output files. Maps to a command-line argument: -o %s. (Nipype default value: highres001_)

use_floatingpoint_precision : 0 or 1

Use floating point precision in registrations (default = 0). Maps to a command-line argument: -q %d.

use_random_seeding : 0 or 1

Use random number generated from system clock in Atropos (default = 1). Maps to a command-line argument: -u %d.

BrainExtractionBrain : a pathlike object or string representing an existing file

Brain extraction image.

BrainExtractionCSF : a pathlike object or string representing an existing file

Segmentation mask with only CSF.

BrainExtractionGM : a pathlike object or string representing an existing file

Segmentation mask with only grey matter.

BrainExtractionInitialAffine : a pathlike object or string representing an existing file

BrainExtractionInitialAffineFixed : a pathlike object or string representing an existing file

BrainExtractionInitialAffineMoving : a pathlike object or string representing an existing file

BrainExtractionLaplacian : a pathlike object or string representing an existing file

BrainExtractionMask : a pathlike object or string representing an existing file

Brain extraction mask.

BrainExtractionPrior0GenericAffine : a pathlike object or string representing an existing file

BrainExtractionPrior1InverseWarp : a pathlike object or string representing an existing file

BrainExtractionPrior1Warp : a pathlike object or string representing an existing file

BrainExtractionPriorWarped : a pathlike object or string representing an existing file

BrainExtractionSegmentation : a pathlike object or string representing an existing file

Segmentation mask with CSF, GM, and WM.

BrainExtractionTemplateLaplacian : a pathlike object or string representing an existing file

BrainExtractionTmp : a pathlike object or string representing an existing file

BrainExtractionWM : a pathlike object or string representing an existing file

Segmentation mask with only white matter.

N4Corrected0 : a pathlike object or string representing an existing file

N4 bias field corrected image.

N4Truncated0 : a pathlike object or string representing an existing file
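
A minimal sketch of wrapping the interface in a workflow node (assuming ANTs is installed and the template files are available); the node exposes the outputs listed above after execution:

>>> from nipype import Node
>>> be = Node(BrainExtraction(dimension=3,
...                           brain_template='study_template.nii.gz',
...                           brain_probability_mask='ProbabilityMaskOfStudyTemplate.nii.gz'),
...           name='brain_extraction')
>>> be.inputs.anatomical_image = 'T1.nii.gz'
>>> res = be.run()  # doctest: +SKIP
>>> res.outputs.BrainExtractionBrain  # doctest: +SKIP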

CorticalThickness


Bases: ANTSCommand

Wrapped executable: antsCorticalThickness.sh.

Examples

>>> from nipype.interfaces.ants.segmentation import CorticalThickness
>>> corticalthickness = CorticalThickness()
>>> corticalthickness.inputs.dimension = 3
>>> corticalthickness.inputs.anatomical_image ='T1.nii.gz'
>>> corticalthickness.inputs.brain_template = 'study_template.nii.gz'
>>> corticalthickness.inputs.brain_probability_mask ='ProbabilityMaskOfStudyTemplate.nii.gz'
>>> corticalthickness.inputs.segmentation_priors = ['BrainSegmentationPrior01.nii.gz',
...                                                 'BrainSegmentationPrior02.nii.gz',
...                                                 'BrainSegmentationPrior03.nii.gz',
...                                                 'BrainSegmentationPrior04.nii.gz']
>>> corticalthickness.inputs.t1_registration_template = 'brain_study_template.nii.gz'
>>> corticalthickness.cmdline
'antsCorticalThickness.sh -a T1.nii.gz -m ProbabilityMaskOfStudyTemplate.nii.gz
-e study_template.nii.gz -d 3 -s nii.gz -o antsCT_
-p nipype_priors/BrainSegmentationPrior%02d.nii.gz -t brain_study_template.nii.gz'
anatomical_image : a pathlike object or string representing an existing file

Structural intensity image, typically T1. If more than one anatomical image is specified, subsequently specified images are used during the segmentation process. However, only the first image is used in the registration of priors. Our suggestion would be to specify the T1 as the first image. Maps to a command-line argument: -a %s.

brain_probability_mask : a pathlike object or string representing an existing file

Brain probability mask in template space. Maps to a command-line argument: -m %s.

brain_template : a pathlike object or string representing an existing file

Anatomical intensity template (possibly created using a population data set with buildtemplateparallel.sh in ANTs). This template is not skull-stripped. Maps to a command-line argument: -e %s.

segmentation_priors : a list of items which are a pathlike object or string representing an existing file

Maps to a command-line argument: -p %s.

t1_registration_template : a pathlike object or string representing an existing file

Anatomical intensity template (assumed to be skull-stripped). A common case would be where this would be the same template as specified in the -e option which is not skull stripped. Maps to a command-line argument: -t %s.

args : a string

Additional parameters to the command. Maps to a command-line argument: %s.

b_spline_smoothing : a boolean

Use B-spline SyN for registrations and B-spline exponential mapping in DiReCT. Maps to a command-line argument: -v.

cortical_label_image : a pathlike object or string representing an existing file

Cortical ROI labels to use as a prior for ATITH.

debug : a boolean

If > 0, runs a faster version of the script. Only for testing. Implies -u 0. Requires single thread computation for complete reproducibility. Maps to a command-line argument: -z 1.

dimension : 3 or 2

Image dimension (2 or 3). Maps to a command-line argument: -d %d. (Nipype default value: 3)

environ : a dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

extraction_registration_mask : a pathlike object or string representing an existing file

Mask (defined in the template space) used during registration for brain extraction. Maps to a command-line argument: -f %s.

image_suffix : a string

Any of standard ITK formats, nii.gz is default. Maps to a command-line argument: -s %s. (Nipype default value: nii.gz)

keep_temporary_files : an integer

Keep brain extraction/segmentation warps, etc (default = 0). Maps to a command-line argument: -k %d.

label_propagation : a string

Incorporate a distance prior on the posterior formulation. Should be of the form ‘label[lambda,boundaryProbability]’ where label is a value of 1,2,3,… denoting the label ID. The label probability for anything outside the current label = boundaryProbability * exp( -lambda * distanceFromBoundary ). Intuitively, smaller lambda values increase the spatial capture range of the distance prior. To apply to all label values, simply omit the label, i.e. -l [lambda,boundaryProbability]. Maps to a command-line argument: -l %s.

max_iterations : an integer

ANTS registration max iterations (default = 100x100x70x20). Maps to a command-line argument: -i %d.

num_threads : an integer

Number of ITK threads to use. (Nipype default value: 1)

out_prefix : a string

Prefix that is prepended to all output files. Maps to a command-line argument: -o %s. (Nipype default value: antsCT_)

posterior_formulation : a string

Atropos posterior formulation and whether or not to use mixture model proportions, e.g. ‘Socrates[1]’ (default) or ‘Aristotle[1]’. Choose the latter if you want to use the distance priors (see also the -l option for label propagation control). Maps to a command-line argument: -b %s.

prior_segmentation_weight : a float

Atropos spatial prior probability weight for the segmentation. Maps to a command-line argument: -w %f.

quick_registration : a boolean

If = 1, use antsRegistrationSyNQuick.sh as the basis for registration during brain extraction, brain segmentation, and (optional) normalization to a template. Otherwise use antsRegistrationSyN.sh (default = 0). Maps to a command-line argument: -q 1.

segmentation_iterations : an integer

N4 -> Atropos -> N4 iterations during segmentation (default = 3). Maps to a command-line argument: -n %d.

use_floatingpoint_precision : 0 or 1

Use floating point precision in registrations (default = 0). Maps to a command-line argument: -j %d.

use_random_seeding : 0 or 1

Use random number generated from system clock in Atropos (default = 1). Maps to a command-line argument: -u %d.

BrainExtractionMask : a pathlike object or string representing an existing file

Brain extraction mask.

BrainSegmentation : a pathlike object or string representing an existing file

Brain segmentation image.

BrainSegmentationN4 : a pathlike object or string representing an existing file

N4 corrected image.

BrainSegmentationPosteriors : a list of items which are a pathlike object or string representing an existing file

Posterior probability images.

BrainVolumes : a pathlike object or string representing an existing file

Brain volumes as text.

CorticalThickness : a pathlike object or string representing an existing file

Cortical thickness file.

CorticalThicknessNormedToTemplate : a pathlike object or string representing an existing file

Normalized cortical thickness.

ExtractedBrainN4 : a pathlike object or string representing an existing file

Extracted brain from N4 image.

SubjectToTemplate0GenericAffine : a pathlike object or string representing an existing file

Template to subject inverse affine.

SubjectToTemplate1Warp : a pathlike object or string representing an existing file

Template to subject inverse warp.

SubjectToTemplateLogJacobian : a pathlike object or string representing an existing file

Template to subject log jacobian.

TemplateToSubject0Warp : a pathlike object or string representing an existing file

Template to subject warp.

TemplateToSubject1GenericAffine : a pathlike object or string representing an existing file

Template to subject affine.
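
Continuing the example above, a minimal sketch of running the pipeline and reading back a few outputs (assumes ANTs is installed and the template and prior files exist):

>>> corticalthickness.inputs.num_threads = 4
>>> res = corticalthickness.run()  # doctest: +SKIP
>>> res.outputs.CorticalThickness  # doctest: +SKIP
>>> res.outputs.BrainSegmentationPosteriors  # doctest: +SKIP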

DenoiseImage


Bases: ANTSCommand

Wrapped executable: DenoiseImage.

Examples

>>> import copy
>>> from nipype.interfaces.ants import DenoiseImage
>>> denoise = DenoiseImage()
>>> denoise.inputs.dimension = 3
>>> denoise.inputs.input_image = 'im1.nii'
>>> denoise.cmdline
'DenoiseImage -d 3 -i im1.nii -n Gaussian -o im1_noise_corrected.nii -s 1'
>>> denoise_2 = copy.deepcopy(denoise)
>>> denoise_2.inputs.output_image = 'output_corrected_image.nii.gz'
>>> denoise_2.inputs.noise_model = 'Rician'
>>> denoise_2.inputs.shrink_factor = 2
>>> denoise_2.cmdline
'DenoiseImage -d 3 -i im1.nii -n Rician -o output_corrected_image.nii.gz -s 2'
>>> denoise_3 = DenoiseImage()
>>> denoise_3.inputs.input_image = 'im1.nii'
>>> denoise_3.inputs.save_noise = True
>>> denoise_3.cmdline
'DenoiseImage -i im1.nii -n Gaussian -o [ im1_noise_corrected.nii, im1_noise.nii ] -s 1'
input_image : a pathlike object or string representing an existing file

A scalar image is expected as input for noise correction. Maps to a command-line argument: -i %s.

save_noise : a boolean

True if the estimated noise should be saved to file. Mutually exclusive with inputs: noise_image. (Nipype default value: False)

args : a string

Additional parameters to the command. Maps to a command-line argument: %s.

dimension : 2 or 3 or 4

This option forces the image to be treated as a specified-dimensional image. If not specified, the program tries to infer the dimensionality from the input image. Maps to a command-line argument: -d %d.

environ : a dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

noise_image : a pathlike object or string representing a file

Filename for the estimated noise.

noise_model : ‘Gaussian’ or ‘Rician’

Employ a Rician or Gaussian noise model. Maps to a command-line argument: -n %s. (Nipype default value: Gaussian)

num_threads : an integer

Number of ITK threads to use. (Nipype default value: 1)

output_image : a pathlike object or string representing a file

The output consists of the noise corrected version of the input image. Maps to a command-line argument: -o %s.

shrink_factor : an integer

Running noise correction on large images can be time consuming. To lessen computation time, the input image can be resampled. The shrink factor, specified as a single integer, describes this resampling. Shrink factor = 1 is the default. Maps to a command-line argument: -s %s. (Nipype default value: 1)

verbose : a boolean

Verbose output. Maps to a command-line argument: -v.

noise_image : a pathlike object or string representing a file

output_image : a pathlike object or string representing an existing file
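
A minimal sketch of combining save_noise with execution (assumes ANTs is installed and im1.nii exists); when save_noise is enabled, both the corrected image and the noise estimate are returned:

>>> denoise_4 = DenoiseImage()
>>> denoise_4.inputs.input_image = 'im1.nii'
>>> denoise_4.inputs.noise_model = 'Rician'
>>> denoise_4.inputs.save_noise = True
>>> res = denoise_4.run()  # doctest: +SKIP
>>> res.outputs.output_image  # doctest: +SKIP
>>> res.outputs.noise_image  # doctest: +SKIP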

JointFusion


Bases: ANTSCommand

Wrapped executable: antsJointFusion.

An image fusion algorithm.

Developed by Hongzhi Wang and Paul Yushkevich, this method won segmentation challenges at MICCAI 2012 and MICCAI 2013. The original label fusion framework was extended to accommodate intensities by Brian Avants. This implementation is based on Paul’s original ITK-style implementation and Brian’s ANTsR implementation.

References include 1) H. Wang, J. W. Suh, S. Das, J. Pluta, C. Craige, P. Yushkevich, Multi-atlas segmentation with joint label fusion IEEE Trans. on Pattern Analysis and Machine Intelligence, 35(3), 611-623, 2013. and 2) H. Wang and P. A. Yushkevich, Multi-atlas segmentation with joint label fusion and corrective learning–an open source implementation, Front. Neuroinform., 2013.

Examples

>>> from nipype.interfaces.ants import JointFusion
>>> jf = JointFusion()
>>> jf.inputs.out_label_fusion = 'ants_fusion_label_output.nii'
>>> jf.inputs.atlas_image = [ ['rc1s1.nii','rc1s2.nii'] ]
>>> jf.inputs.atlas_segmentation_image = ['segmentation0.nii.gz']
>>> jf.inputs.target_image = ['im1.nii']
>>> jf.cmdline
"antsJointFusion -a 0.1 -g ['rc1s1.nii', 'rc1s2.nii'] -l segmentation0.nii.gz
-b 2.0 -o ants_fusion_label_output.nii -s 3x3x3 -t ['im1.nii']"
>>> jf.inputs.target_image = [ ['im1.nii', 'im2.nii'] ]
>>> jf.cmdline
"antsJointFusion -a 0.1 -g ['rc1s1.nii', 'rc1s2.nii'] -l segmentation0.nii.gz
-b 2.0 -o ants_fusion_label_output.nii -s 3x3x3 -t ['im1.nii', 'im2.nii']"
>>> jf.inputs.atlas_image = [ ['rc1s1.nii','rc1s2.nii'],
...                                        ['rc2s1.nii','rc2s2.nii'] ]
>>> jf.inputs.atlas_segmentation_image = ['segmentation0.nii.gz',
...                                                    'segmentation1.nii.gz']
>>> jf.cmdline
"antsJointFusion -a 0.1 -g ['rc1s1.nii', 'rc1s2.nii'] -g ['rc2s1.nii', 'rc2s2.nii']
-l segmentation0.nii.gz -l segmentation1.nii.gz -b 2.0 -o ants_fusion_label_output.nii
-s 3x3x3 -t ['im1.nii', 'im2.nii']"
>>> jf.inputs.dimension = 3
>>> jf.inputs.alpha = 0.5
>>> jf.inputs.beta = 1.0
>>> jf.inputs.patch_radius = [3,2,1]
>>> jf.inputs.search_radius = [3]
>>> jf.cmdline
"antsJointFusion -a 0.5 -g ['rc1s1.nii', 'rc1s2.nii'] -g ['rc2s1.nii', 'rc2s2.nii']
-l segmentation0.nii.gz -l segmentation1.nii.gz -b 1.0 -d 3 -o ants_fusion_label_output.nii
-p 3x2x1 -s 3 -t ['im1.nii', 'im2.nii']"
>>> jf.inputs.search_radius = ['mask.nii']
>>> jf.inputs.verbose = True
>>> jf.inputs.exclusion_image = ['roi01.nii', 'roi02.nii']
>>> jf.inputs.exclusion_image_label = ['1','2']
>>> jf.cmdline
"antsJointFusion -a 0.5 -g ['rc1s1.nii', 'rc1s2.nii'] -g ['rc2s1.nii', 'rc2s2.nii']
-l segmentation0.nii.gz -l segmentation1.nii.gz -b 1.0 -d 3 -e 1[roi01.nii] -e 2[roi02.nii]
-o ants_fusion_label_output.nii -p 3x2x1 -s mask.nii -t ['im1.nii', 'im2.nii'] -v"
>>> jf.inputs.out_label_fusion = 'ants_fusion_label_output.nii'
>>> jf.inputs.out_intensity_fusion_name_format = 'ants_joint_fusion_intensity_%d.nii.gz'
>>> jf.inputs.out_label_post_prob_name_format = 'ants_joint_fusion_posterior_%d.nii.gz'
>>> jf.inputs.out_atlas_voting_weight_name_format = 'ants_joint_fusion_voting_weight_%d.nii.gz'
>>> jf.cmdline
"antsJointFusion -a 0.5 -g ['rc1s1.nii', 'rc1s2.nii'] -g ['rc2s1.nii', 'rc2s2.nii']
-l segmentation0.nii.gz -l segmentation1.nii.gz -b 1.0 -d 3 -e 1[roi01.nii] -e 2[roi02.nii]
-o [ants_fusion_label_output.nii, ants_joint_fusion_intensity_%d.nii.gz,
ants_joint_fusion_posterior_%d.nii.gz, ants_joint_fusion_voting_weight_%d.nii.gz]
-p 3x2x1 -s mask.nii -t ['im1.nii', 'im2.nii'] -v"
atlas_image : a list of items which are a list of items which are a pathlike object or string representing an existing file

The atlas image (or multimodal atlas images) assumed to be aligned to a common image domain. Maps to a command-line argument: -g %s....

atlas_segmentation_image : a list of items which are a pathlike object or string representing an existing file

The atlas segmentation images. For performing label fusion the number of specified segmentations should be identical to the number of atlas image sets. Maps to a command-line argument: -l %s....

target_image : a list of items which are a list of items which are a pathlike object or string representing an existing file

The target image (or multimodal target images) assumed to be aligned to a common image domain. Maps to a command-line argument: -t %s.

alpha : a float

Regularization term added to matrix Mx for calculating the inverse. Default = 0.1. Maps to a command-line argument: -a %s. (Nipype default value: 0.1)

args : a string

Additional parameters to the command. Maps to a command-line argument: %s.

beta : a float

Exponent for mapping intensity difference to the joint error. Default = 2.0. Maps to a command-line argument: -b %s. (Nipype default value: 2.0)

constrain_nonnegative : a boolean

Constrain solution to non-negative weights. Maps to a command-line argument: -c. (Nipype default value: False)

dimension : 3 or 2 or 4

This option forces the image to be treated as a specified-dimensional image. If not specified, the program tries to infer the dimensionality from the input image. Maps to a command-line argument: -d %d.

environ : a dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

exclusion_image : a list of items which are a pathlike object or string representing an existing file

Specify an exclusion region for the given label.

exclusion_image_label : a list of items which are a string

Specify a label for the exclusion region. Maps to a command-line argument: -e %s. Requires inputs: exclusion_image.

mask_image : a pathlike object or string representing an existing file

If a mask image is specified, fusion is only performed in the mask region. Maps to a command-line argument: -x %s.

num_threads : an integer

Number of ITK threads to use. (Nipype default value: 1)

out_atlas_voting_weight_name_format : a string

Optional atlas voting weight image file name format. Requires inputs: out_label_fusion, out_intensity_fusion_name_format, out_label_post_prob_name_format.

out_intensity_fusion_name_format : a string

Optional intensity fusion image file name format. (e.g. “antsJointFusionIntensity_%d.nii.gz”).

out_label_fusion : a pathlike object or string representing a file

The output label fusion image. Maps to a command-line argument: %s.

out_label_post_prob_name_format : a string

Optional label posterior probability image file name format. Requires inputs: out_label_fusion, out_intensity_fusion_name_format.

patch_metric : ‘PC’ or ‘MSQ’

Metric to be used in determining the most similar neighborhood patch. Options include Pearson’s correlation (PC) and mean squares (MSQ). Default = PC (Pearson correlation). Maps to a command-line argument: -m %s.

patch_radius : a list of from 3 to 3 items which are an integer

Patch radius for similarity measures. Default: 2x2x2. Maps to a command-line argument: -p %s.

retain_atlas_voting_images : a boolean

Retain atlas voting images. Default = false. Maps to a command-line argument: -f. (Nipype default value: False)

retain_label_posterior_images : a boolean

Retain label posterior probability images. Requires atlas segmentations to be specified. Default = false. Maps to a command-line argument: -r. Requires inputs: atlas_segmentation_image. (Nipype default value: False)

search_radius : a list of from 1 to 3 items which are any value

Search radius for similarity measures. Default = 3x3x3. One can also specify an image where the value at the voxel specifies the isotropic search radius at that voxel. Maps to a command-line argument: -s %s. (Nipype default value: [3, 3, 3])

verbose : a boolean

Verbose output. Maps to a command-line argument: -v.

out_atlas_voting_weight : a list of items which are a pathlike object or string representing an existing file

out_intensity_fusion : a list of items which are a pathlike object or string representing an existing file

out_label_fusion : a pathlike object or string representing an existing file

out_label_post_prob : a list of items which are a pathlike object or string representing an existing file
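
A minimal sketch of executing a two-atlas fusion (assumes ANTs is installed and the atlas, segmentation, and target files exist); one segmentation image must be provided per atlas image set:

>>> jf2 = JointFusion()
>>> jf2.inputs.atlas_image = [['rc1s1.nii'], ['rc2s1.nii']]
>>> jf2.inputs.atlas_segmentation_image = ['segmentation0.nii.gz', 'segmentation1.nii.gz']
>>> jf2.inputs.target_image = [['im1.nii']]
>>> jf2.inputs.out_label_fusion = 'fused_labels.nii.gz'
>>> res = jf2.run()  # doctest: +SKIP
>>> res.outputs.out_label_fusion  # doctest: +SKIP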

KellyKapowski


Bases: ANTSCommand

Wrapped executable: KellyKapowski.

Nipype Interface to ANTs’ KellyKapowski, also known as DiReCT.

DiReCT is a registration based estimate of cortical thickness. It was published in S. R. Das, B. B. Avants, M. Grossman, and J. C. Gee, Registration based cortical thickness measurement, Neuroimage 2009, 45:867–879.

Examples

>>> from nipype.interfaces.ants.segmentation import KellyKapowski
>>> kk = KellyKapowski()
>>> kk.inputs.dimension = 3
>>> kk.inputs.segmentation_image = "segmentation0.nii.gz"
>>> kk.inputs.convergence = "[45,0.0,10]"
>>> kk.inputs.thickness_prior_estimate = 10
>>> kk.cmdline
'KellyKapowski --convergence "[45,0.0,10]"
--output "[segmentation0_cortical_thickness.nii.gz,segmentation0_warped_white_matter.nii.gz]"
--image-dimensionality 3 --gradient-step 0.025000
--maximum-number-of-invert-displacement-field-iterations 20 --number-of-integration-points 10
--segmentation-image "[segmentation0.nii.gz,2,3]" --smoothing-variance 1.000000
--smoothing-velocity-field-parameter 1.500000 --thickness-prior-estimate 10.000000'
segmentation_image : a pathlike object or string representing an existing file

A segmentation image must be supplied labeling the gray and white matter. Default values = 2 and 3, respectively. Maps to a command-line argument: --segmentation-image "%s".

args : a string

Additional parameters to the command. Maps to a command-line argument: %s.

convergence : a string

Convergence is determined by fitting a line to the normalized energy profile of the last N iterations (where N is specified by the window size) and determining the slope which is then compared with the convergence threshold. Maps to a command-line argument: --convergence "%s". (Nipype default value: [50,0.001,10])

cortical_thickness : a pathlike object or string representing a file

Filename for the cortical thickness. Maps to a command-line argument: --output "%s".

dimension : 3 or 2

Image dimension (2 or 3). Maps to a command-line argument: --image-dimensionality %d. (Nipype default value: 3)

environ : a dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

gradient_step : a float

Gradient step size for the optimization. Maps to a command-line argument: --gradient-step %f. (Nipype default value: 0.025)

gray_matter_label : an integer

The label value for the gray matter label in the segmentation_image. (Nipype default value: 2)

gray_matter_prob_image : a pathlike object or string representing an existing file

In addition to the segmentation image, a gray matter probability image can be used. If no such image is supplied, one is created using the segmentation image and a variance of 1.0 mm. Maps to a command-line argument: --gray-matter-probability-image "%s".

max_invert_displacement_field_iters : an integer

Maximum number of iterations for estimating the inverse displacement field. Maps to a command-line argument: --maximum-number-of-invert-displacement-field-iterations %d. (Nipype default value: 20)

num_threads : an integer

Number of ITK threads to use. (Nipype default value: 1)

number_integration_points : an integer

Number of compositions of the diffeomorphism per iteration. Maps to a command-line argument: --number-of-integration-points %d. (Nipype default value: 10)

smoothing_variance : a float

Defines the Gaussian smoothing of the hit and total images. Maps to a command-line argument: --smoothing-variance %f. (Nipype default value: 1.0)

smoothing_velocity_field : a float

Defines the Gaussian smoothing of the velocity field (default = 1.5). If the b-spline smoothing option is chosen, then this defines the isotropic mesh spacing for the smoothing spline (default = 15). Maps to a command-line argument: --smoothing-velocity-field-parameter %f. (Nipype default value: 1.5)

thickness_prior_estimate : a float

Provides a prior constraint on the final thickness measurement in mm. Maps to a command-line argument: --thickness-prior-estimate %f. (Nipype default value: 10)

thickness_prior_image : a pathlike object or string representing an existing file

An image containing spatially varying prior thickness values. Maps to a command-line argument: --thickness-prior-image "%s".

use_bspline_smoothing : a boolean

Sets the option for B-spline smoothing of the velocity field. Maps to a command-line argument: --use-bspline-smoothing 1.

warped_white_matter : a pathlike object or string representing a file

Filename for the warped white matter file.

white_matter_label : an integer

The label value for the white matter label in the segmentation_image. (Nipype default value: 3)

white_matter_prob_image : a pathlike object or string representing an existing file

In addition to the segmentation image, a white matter probability image can be used. If no such image is supplied, one is created using the segmentation image and a variance of 1.0 mm. Maps to a command-line argument: --white-matter-probability-image "%s".

cortical_thickness : a pathlike object or string representing a file

A thickness map defined in the segmented gray matter.

warped_white_matter : a pathlike object or string representing a file

A warped white matter image.
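
Continuing the example above, a minimal sketch of making the gray/white labels explicit and running the interface (assumes ANTs is installed and segmentation0.nii.gz exists):

>>> kk.inputs.gray_matter_label = 2
>>> kk.inputs.white_matter_label = 3
>>> res = kk.run()  # doctest: +SKIP
>>> res.outputs.cortical_thickness  # doctest: +SKIP
>>> res.outputs.warped_white_matter  # doctest: +SKIP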

LaplacianThickness


Bases: ANTSCommand

Wrapped executable: LaplacianThickness.

Calculates the cortical thickness from an anatomical image.

Examples

>>> from nipype.interfaces.ants import LaplacianThickness
>>> cort_thick = LaplacianThickness()
>>> cort_thick.inputs.input_wm = 'white_matter.nii.gz'
>>> cort_thick.inputs.input_gm = 'gray_matter.nii.gz'
>>> cort_thick.cmdline
'LaplacianThickness white_matter.nii.gz gray_matter.nii.gz white_matter_thickness.nii.gz'
>>> cort_thick.inputs.output_image = 'output_thickness.nii.gz'
>>> cort_thick.cmdline
'LaplacianThickness white_matter.nii.gz gray_matter.nii.gz output_thickness.nii.gz'
input_gm : a pathlike object or string representing a file

Gray matter segmentation image. Maps to a command-line argument: %s (position: 2).

input_wm : a pathlike object or string representing a file

White matter segmentation image. Maps to a command-line argument: %s (position: 1).

args : a string

Additional parameters to the command. Maps to a command-line argument: %s.

dT : a float

Time delta used during integration (defaults to 0.01). Maps to a command-line argument: %s (position: 6). Requires inputs: prior_thickness.

environ : a dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

num_threads : an integer

Number of ITK threads to use. (Nipype default value: 1)

output_image : a string

Name of output file. Maps to a command-line argument: %s (position: 3).

prior_thickness : a float

Prior thickness (defaults to 500). Maps to a command-line argument: %s (position: 5). Requires inputs: smooth_param.

smooth_param : a float

Sigma of the Laplacian Recursive Image Filter (defaults to 1). Maps to a command-line argument: %s (position: 4).

sulcus_prior : a float

Positive floating point number for sulcus prior. Authors said that 0.15 might be a reasonable value. Maps to a command-line argument: %s (position: 7). Requires inputs: dT.

tolerance : a float

Tolerance to reach during optimization (defaults to 0.001). Maps to a command-line argument: %s (position: 8). Requires inputs: sulcus_prior.

output_image : a pathlike object or string representing an existing file

Cortical thickness.
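
Continuing the example above, a minimal sketch of running the command (assumes ANTs is installed and the segmentation images exist). Note that the optional positional arguments must be supplied in order: dT requires prior_thickness, which in turn requires smooth_param.

>>> cort_thick.inputs.smooth_param = 4.5
>>> cort_thick.inputs.prior_thickness = 5.9
>>> cort_thick.inputs.dT = 0.01
>>> res = cort_thick.run()  # doctest: +SKIP
>>> res.outputs.output_image  # doctest: +SKIP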

N4BiasFieldCorrection


Bases: ANTSCommand, CopyHeaderInterface

Wrapped executable: N4BiasFieldCorrection.

Bias field correction.

N4 is a variant of the popular N3 (nonparametric nonuniform normalization) retrospective bias correction algorithm. Based on the assumption that the corruption of the low frequency bias field can be modeled as a convolution of the intensity histogram by a Gaussian, the basic algorithmic protocol is to iterate between deconvolving the intensity histogram by a Gaussian, remapping the intensities, and then spatially smoothing this result by a B-spline modeling of the bias field itself. The modifications from and improvements obtained over the original N3 algorithm are described in [Tustison2010].

[Tustison2010]

N. Tustison et al., N4ITK: Improved N3 Bias Correction, IEEE Transactions on Medical Imaging, 29(6):1310-1320, June 2010.

Examples

>>> import copy
>>> from nipype.interfaces.ants import N4BiasFieldCorrection
>>> n4 = N4BiasFieldCorrection()
>>> n4.inputs.dimension = 3
>>> n4.inputs.input_image = 'structural.nii'
>>> n4.inputs.bspline_fitting_distance = 300
>>> n4.inputs.shrink_factor = 3
>>> n4.inputs.n_iterations = [50,50,30,20]
>>> n4.cmdline
'N4BiasFieldCorrection --bspline-fitting [ 300 ]
-d 3 --input-image structural.nii
--convergence [ 50x50x30x20 ] --output structural_corrected.nii
--shrink-factor 3'
>>> n4_2 = copy.deepcopy(n4)
>>> n4_2.inputs.convergence_threshold = 1e-6
>>> n4_2.cmdline
'N4BiasFieldCorrection --bspline-fitting [ 300 ]
-d 3 --input-image structural.nii
--convergence [ 50x50x30x20, 1e-06 ] --output structural_corrected.nii
--shrink-factor 3'
>>> n4_3 = copy.deepcopy(n4_2)
>>> n4_3.inputs.bspline_order = 5
>>> n4_3.cmdline
'N4BiasFieldCorrection --bspline-fitting [ 300, 5 ]
-d 3 --input-image structural.nii
--convergence [ 50x50x30x20, 1e-06 ] --output structural_corrected.nii
--shrink-factor 3'
>>> n4_4 = N4BiasFieldCorrection()
>>> n4_4.inputs.input_image = 'structural.nii'
>>> n4_4.inputs.save_bias = True
>>> n4_4.inputs.dimension = 3
>>> n4_4.cmdline
'N4BiasFieldCorrection -d 3 --input-image structural.nii
--output [ structural_corrected.nii, structural_bias.nii ]'
>>> n4_5 = N4BiasFieldCorrection()
>>> n4_5.inputs.input_image = 'structural.nii'
>>> n4_5.inputs.dimension = 3
>>> n4_5.inputs.histogram_sharpening = (0.12, 0.02, 200)
>>> n4_5.cmdline
'N4BiasFieldCorrection -d 3  --histogram-sharpening [0.12,0.02,200]
--input-image structural.nii --output structural_corrected.nii'
copy_header : a boolean

Copy headers of the original image into the output (corrected) file. (Nipype default value: False)

input_image : a pathlike object or string representing a file

Input for bias correction. Negative values or values close to zero should be processed prior to correction. Maps to a command-line argument: --input-image %s.

save_bias : a boolean

True if the estimated bias should be saved to file. Mutually exclusive with inputs: bias_image. (Nipype default value: False)

args : a string

Additional parameters to the command. Maps to a command-line argument: %s.

bias_image : a pathlike object or string representing a file

Filename for the estimated bias.

bspline_fitting_distance : a float

Maps to a command-line argument: --bspline-fitting %s.

bspline_order : an integer

Requires inputs: bspline_fitting_distance.

convergence_threshold : a float

Requires inputs: n_iterations.

dimension : 3 or 2 or 4

Image dimension (2, 3 or 4). Maps to a command-line argument: -d %d. (Nipype default value: 3)

environ : a dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

histogram_sharpening : a tuple of the form: (a float, a float, an integer)

Three-value tuple of histogram sharpening parameters (FWHM, wienerNoise, numberOfHistogramBins). These options describe the histogram sharpening parameters, i.e. the deconvolution step parameters described in the original N3 algorithm. The default values have been shown to work fairly well. Maps to a command-line argument: --histogram-sharpening [%g,%g,%d].

mask_image : a pathlike object or string representing a file

Image to specify region to perform final bias correction in. Maps to a command-line argument: --mask-image %s.

n_iterations : a list of items which are an integer

Maps to a command-line argument: --convergence %s.

num_threads : an integer

Number of ITK threads to use. (Nipype default value: 1)

output_image : a string

Output file name. Maps to a command-line argument: --output %s.

rescale_intensities : a boolean

[NOTE: Only ANTs>=2.1.0] At each iteration, a new intensity mapping is calculated and applied but there is nothing which constrains the new intensity range to be within certain values. The result is that the range can “drift” from the original at each iteration. This option rescales to the [min,max] range of the original image intensities within the user-specified mask. Maps to a command-line argument: -r. (Nipype default value: False)

shrink_factor : an integer

Maps to a command-line argument: --shrink-factor %d.

weight_image : a pathlike object or string representing a file

Image for relative weighting (e.g. probability map of the white matter) of voxels during the B-spline fitting. Maps to a command-line argument: --weight-image %s.

bias_image : a pathlike object or string representing an existing file

Estimated bias.

output_image : a pathlike object or string representing an existing file

Bias-corrected output image.
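
A minimal sketch of running the correction with a mask and saving the estimated bias field (assumes ANTs is installed and the input files exist):

>>> n4_6 = N4BiasFieldCorrection()
>>> n4_6.inputs.input_image = 'structural.nii'
>>> n4_6.inputs.mask_image = 'mask.nii'
>>> n4_6.inputs.save_bias = True
>>> res = n4_6.run()  # doctest: +SKIP
>>> res.outputs.output_image  # doctest: +SKIP
>>> res.outputs.bias_image  # doctest: +SKIP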