workflows#

Module: workflows.align#

ResliceFlow(*[, output_strategy, mix_names, ...])

Methods

SlrWithQbxFlow(*[, output_strategy, ...])

Methods

ImageRegistrationFlow(*[, output_strategy, ...])

The registration workflow allows the user to use only one type of registration (such as center of mass or rigid body registration only).

ApplyTransformFlow(*[, output_strategy, ...])

Methods

SynRegistrationFlow(*[, output_strategy, ...])

Methods

MotionCorrectionFlow(*[, output_strategy, ...])

The Motion Correction workflow allows the user to align the volumes of a DWI dataset.

BundleWarpFlow(*[, output_strategy, ...])

Methods

check_dimensions(static, moving)

Check the dimensions of the input images.

Module: workflows.base#

IntrospectiveArgumentParser([prog, usage, ...])

Attributes:

add_default_args_to_docstring(npds, func)

Add default arguments to the docstring of a function.

get_args_default(func)

none_or_dtype(dtype)

Check None presence before type casting.

Module: workflows.cli#

run()

Run the scripts declared in pyproject.toml.

Module: workflows.combined_workflow#

CombinedWorkflow(*[, output_strategy, ...])

Methods

Module: workflows.denoise#

Patch2SelfFlow(*[, output_strategy, ...])

Methods

NLMeansFlow(*[, output_strategy, mix_names, ...])

Methods

LPCAFlow(*[, output_strategy, mix_names, ...])

Methods

MPPCAFlow(*[, output_strategy, mix_names, ...])

Methods

GibbsRingingFlow(*[, output_strategy, ...])

Methods

Module: workflows.docstring_parser#

This was taken directly from the file docscrape.py of the numpydoc package.

Copyright (C) 2008 Stefan van der Walt <stefan@mentat.za.net>, Pauli Virtanen <pav@iki.fi>

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:

  1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.

  2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.

THIS SOFTWARE IS PROVIDED BY THE AUTHOR “AS IS” AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

Reader(data)

A line-based string reader.

NumpyDocString(docstring, *[, config])

dedent_lines(lines)

Deindent a list of lines maximally.

Module: workflows.flow_runner#

get_level(lvl)

Transforms the logging level passed on the command line into a proper logging level name.

run_flow(flow, *[, extra_args])

Wraps the process of building an argparser that reflects the workflow that we want to run along with some generic parameters like logging, force and output strategies.

Module: workflows.io#

IoInfoFlow(*[, output_strategy, mix_names, ...])

Methods

FetchFlow(*[, output_strategy, mix_names, ...])

Methods

SplitFlow(*[, output_strategy, mix_names, ...])

Methods

ExtractB0Flow(*[, output_strategy, ...])

Methods

ExtractShellFlow(*[, output_strategy, ...])

Methods

ExtractVolumeFlow(*[, output_strategy, ...])

Methods

ConcatenateTractogramFlow(*[, ...])

Methods

ConvertSHFlow(*[, output_strategy, ...])

Methods

ConvertTensorsFlow(*[, output_strategy, ...])

Methods

ConvertTractogramFlow(*[, output_strategy, ...])

Methods

NiftisToPamFlow(*[, output_strategy, ...])

Methods

TensorToPamFlow(*[, output_strategy, ...])

Methods

PamToNiftisFlow(*[, output_strategy, ...])

Methods

MathFlow(*[, output_strategy, mix_names, ...])

Methods

Module: workflows.mask#

MaskFlow(*[, output_strategy, mix_names, ...])

Methods

Module: workflows.multi_io#

IOIterator(*[, output_strategy, mix_names])

Create output filenames that work nicely with multiple input files from multiple directories (processing multiple subjects with one command).

common_start(sa, sb)

Return the longest common substring from the beginning of sa and sb.

slash_to_under(dir_str)

connect_output_paths(inputs, out_dir, ...[, ...])

Generate a list of output file paths based on input files and output strategies.

concatenate_inputs(multi_inputs)

Concatenate list of inputs.

basename_without_extension(fname)

io_iterator(inputs, out_dir, fnames, *[, ...])

Create an IOIterator from the parameters.

Module: workflows.nn#

EVACPlusFlow(*[, output_strategy, ...])

Methods

BiasFieldCorrectionFlow(*[, ...])

Methods

Module: workflows.reconst#

ReconstMAPMRIFlow(*[, output_strategy, ...])

Methods

ReconstDtiFlow(*[, output_strategy, ...])

Methods

ReconstDsiFlow(*[, output_strategy, ...])

Methods

ReconstCSDFlow(*[, output_strategy, ...])

Methods

ReconstQBallBaseFlow(*[, output_strategy, ...])

Methods

ReconstDkiFlow(*[, output_strategy, ...])

Methods

ReconstIvimFlow(*[, output_strategy, ...])

Methods

ReconstRUMBAFlow(*[, output_strategy, ...])

Methods

ReconstSDTFlow(*[, output_strategy, ...])

Methods

ReconstSFMFlow(*[, output_strategy, ...])

Methods

ReconstGQIFlow(*[, output_strategy, ...])

Methods

ReconstForecastFlow(*[, output_strategy, ...])

Methods

Module: workflows.segment#

MedianOtsuFlow(*[, output_strategy, ...])

Methods

RecoBundlesFlow(*[, output_strategy, ...])

Methods

LabelsBundlesFlow(*[, output_strategy, ...])

Methods

ClassifyTissueFlow(*[, output_strategy, ...])

Methods

Module: workflows.stats#

SNRinCCFlow(*[, output_strategy, mix_names, ...])

Methods

BundleAnalysisTractometryFlow(*[, ...])

Methods

LinearMixedModelsFlow(*[, output_strategy, ...])

Methods

BundleShapeAnalysis(*[, output_strategy, ...])

Methods

buan_bundle_profiles(model_bundle_folder, ...)

Applies statistical analysis on bundles and saves the results in a directory specified by out_dir.

Module: workflows.tracking#

LocalFiberTrackingPAMFlow(*[, ...])

Methods

PFTrackingPAMFlow(*[, output_strategy, ...])

Methods

Module: workflows.utils#

Module for utility functions.

handle_vol_idx(vol_idx)

Handle user input for volume index.

Module: workflows.viz#

HorizonFlow(*[, output_strategy, mix_names, ...])

Methods

Module: workflows.workflow#

Workflow(*[, output_strategy, mix_names, ...])

Methods

ResliceFlow#

class dipy.workflows.align.ResliceFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow used to subdivide.

get_sub_runs()

Return no sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files, new_vox_size[, order, ...])

Reslice data with new voxel resolution defined by new_vox_size.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with DTI reconstruction and CSD reconstruction might end up with the b0_threshold parameter in both. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(input_files, new_vox_size, order=1, mode='constant', cval=0, num_processes=1, out_dir='', out_resliced='resliced.nii.gz')[source]#

Reslice data with new voxel resolution defined by new_vox_size.

Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

new_vox_size : variable float

New voxel size.

order : int, optional

Order of interpolation, from 0 to 5, for resampling/reslicing: 0 is nearest-neighbor interpolation, 1 is trilinear, and so on. If you do not want any smoothing, 0 is the option you need.

mode : string, optional

Points outside the boundaries of the input are filled according to the given mode: ‘constant’, ‘nearest’, ‘reflect’ or ‘wrap’.

cval : float, optional

Value used for points outside the boundaries of the input if mode=’constant’.

num_processes : int, optional

Split the calculation to a pool of children processes. This only applies to 4D data arrays. Default is 1. If < 0, the maximal number of cores minus num_processes + 1 is used (enter -1 to use as many cores as possible). 0 raises an error.

out_dir : string, optional

Output directory.

out_resliced : string, optional

Name of the resliced dataset to be saved.
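A minimal sketch of driving this workflow from a Python script rather than the command line (the input file name is hypothetical):

    from dipy.workflows.align import ResliceFlow

    # Reslice a diffusion volume to 2 mm isotropic voxels with
    # trilinear interpolation (order=1, the default).
    flow = ResliceFlow()
    flow.run("dwi.nii.gz", [2.0, 2.0, 2.0], order=1,
             out_dir="out", out_resliced="resliced.nii.gz")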

SlrWithQbxFlow#

class dipy.workflows.align.SlrWithQbxFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow used to subdivide.

get_sub_runs()

Return no sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(static_files, moving_files[, x0, ...])

Streamline-based linear registration.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with DTI reconstruction and CSD reconstruction might end up with the b0_threshold parameter in both. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(static_files, moving_files, x0='affine', rm_small_clusters=50, qbx_thr=(40, 30, 20, 15), num_threads=None, greater_than=50, less_than=250, nb_pts=20, progressive=True, out_dir='', out_moved='moved.trk', out_affine='affine.txt', out_stat_centroids='static_centroids.trk', out_moving_centroids='moving_centroids.trk', out_moved_centroids='moved_centroids.trk')[source]#

Streamline-based linear registration.

For efficiency we apply the registration on cluster centroids and remove small clusters.

See [1], [2], [3] for further details.

Parameters:
static_files : string

List of reference/fixed bundle tractograms.

moving_files : string

List of target bundle tractograms that will be moved/registered to match the static bundles.

x0 : string, optional

Rigid, similarity or affine transformation model.

rm_small_clusters : int, optional

Remove clusters that have fewer than rm_small_clusters streamlines.

qbx_thr : variable int, optional

Thresholds for QuickBundlesX.

num_threads : int, optional

Number of threads to be used for OpenMP parallelization. If None (default) the value of the OMP_NUM_THREADS environment variable is used if it is set, otherwise all available threads are used. If < 0, the maximal number of threads minus \(|num_threads + 1|\) is used (enter -1 to use as many threads as possible). 0 raises an error. Only metrics using OpenMP will use this variable.

greater_than : int, optional

Keep streamlines that have length greater than this value.

less_than : int, optional

Keep streamlines that have length less than this value.

nb_pts : int, optional

Number of points for discretizing each streamline.

progressive : boolean, optional

True to enable progressive registration.

out_dir : string, optional

Output directory.

out_moved : string, optional

Filename of moved tractogram.

out_affine : string, optional

Filename of affine for SLR transformation.

out_stat_centroids : string, optional

Filename of static centroids.

out_moving_centroids : string, optional

Filename of moving centroids.

out_moved_centroids : string, optional

Filename of moved centroids.

Notes

The order of operations is the following. First, short or long streamlines are removed. Second, the tractogram, or a random selection of the tractogram, is clustered with QuickBundlesX. Then SLR [2] is applied.

References
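A hedged sketch of invoking the workflow on a pair of bundles (file names hypothetical; both inputs are .trk tractograms):

    from dipy.workflows.align import SlrWithQbxFlow

    # Register a moving bundle to a static one; registration runs on
    # QuickBundlesX centroids for efficiency, as described above.
    flow = SlrWithQbxFlow()
    flow.run("static_bundle.trk", "moving_bundle.trk", x0="affine",
             greater_than=50, less_than=250, out_dir="slr_out")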

ImageRegistrationFlow#

class dipy.workflows.align.ImageRegistrationFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

The registration workflow allows the user to use only one type of registration (such as center of mass or rigid body registration only).

Alternatively, a registration can be done in a progressive manner. For example, using affine registration with progressive set to True will involve center of mass, translation, rigid body and full affine registration, whereas with progressive set to False the registration will include only center of mass and affine registration. Progressive registration is slower but improves the quality of the result.

This behavior is controlled by the progressive flag (True by default).

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow used to subdivide.

get_sub_runs()

Return no sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(static_image_files, moving_image_files)

Parameters:

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

run(static_image_files, moving_image_files, transform='affine', nbins=32, sampling_prop=None, metric='mi', level_iters=(10000, 1000, 100), sigmas=(3.0, 1.0, 0.0), factors=(4, 2, 1), progressive=True, save_metric=False, static_vol_idx=None, moving_vol_idx=None, out_dir='', out_moved='moved.nii.gz', out_affine='affine.txt', out_quality='quality_metric.txt')[source]#
Parameters:
static_image_files : string

Path to the static image file.

moving_image_files : string

Path to the moving image file.

transform : string, optional

'com': center of mass; 'trans': translation; 'rigid': rigid body; 'rigid_isoscaling': rigid body + isotropic scaling; 'rigid_scaling': rigid body + scaling; 'affine': full affine, including translation, rotation, shearing and scaling.

nbins : int, optional

Number of bins to discretize the joint and marginal PDF.

sampling_prop : int, optional

Number ([0-100]) of voxels for calculating the PDF. None implies all voxels.

metric : string, optional

Similarity metric for gathering mutual information.

level_iters : variable int, optional

The number of iterations at each scale of the scale space. level_iters[0] corresponds to the coarsest scale and level_iters[-1] to the finest.

sigmas : variable floats, optional

Custom smoothing parameter to build the scale space (one parameter for each scale).

factors : variable floats, optional

Custom scale factors to build the scale space (one factor for each scale).

progressive : boolean, optional

Enable/Disable the progressive registration.

save_metric : boolean, optional

If true, the quality assessment metric is saved in ‘quality_metric.txt’.

static_vol_idx : str, optional

1D array representing indices of axis=-1 of a 4D static input volume. From the command line use something like 3 4 5 6. From a script use something like [3, 4, 5, 6]. This input is required for 4D volumes.

moving_vol_idx : str, optional

1D array representing indices of axis=-1 of a 4D moving input volume. From the command line use something like 3 4 5 6. From a script use something like [3, 4, 5, 6]. This input is required for 4D volumes.

out_dir : string, optional

Directory to save the transformed image and the affine matrix.

out_moved : string, optional

Name for the saved transformed image.

out_affine : string, optional

Name for the saved affine matrix.

out_quality : string, optional

Name of the file containing the saved quality metric.
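A minimal sketch of a progressive affine registration between two (hypothetical) T1-weighted images:

    from dipy.workflows.align import ImageRegistrationFlow

    # Progressive affine: center of mass -> translation -> rigid -> affine.
    flow = ImageRegistrationFlow()
    flow.run("static_t1.nii.gz", "moving_t1.nii.gz",
             transform="affine", progressive=True,
             out_dir="reg_out", out_moved="moved.nii.gz",
             out_affine="affine.txt")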

ApplyTransformFlow#

class dipy.workflows.align.ApplyTransformFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow used to subdivide.

get_sub_runs()

Return no sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(static_image_files, moving_image_files, ...)

Parameters:

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

run(static_image_files, moving_image_files, transform_map_file, transform_type='affine', out_dir='', out_file='transformed.nii.gz')[source]#
Parameters:
static_image_files : string

Path of the static image file.

moving_image_files : string

Path of the moving image(s). It can be a single image or a folder containing multiple images.

transform_map_file : string

For the affine case, it should be a text (*.txt) file containing the affine matrix. For the diffeomorphic case, it should be a NIfTI file containing the mapping displacement field in each voxel, with shape (x, y, z, 3, 2).

transform_type : string, optional

Select the transformation type to apply, either ‘affine’ or ‘diffeomorphic’.

out_dir : string, optional

Directory to save the transformed files.

out_file : string, optional

Name of the transformed file. It is recommended to use the --mix_names flag to prevent the output files from being overwritten.
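Continuing the hypothetical example above, the saved affine can be re-applied to other images with this workflow:

    from dipy.workflows.align import ApplyTransformFlow

    # Apply the affine produced by ImageRegistrationFlow
    # (reg_out/affine.txt) to one or more moving images.
    flow = ApplyTransformFlow()
    flow.run("static_t1.nii.gz", "moving_t1.nii.gz",
             "reg_out/affine.txt", transform_type="affine",
             out_dir="reg_out", out_file="transformed.nii.gz")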

SynRegistrationFlow#

class dipy.workflows.align.SynRegistrationFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow used to subdivide.

get_sub_runs()

Return no sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(static_image_files, moving_image_files)

Parameters:

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

run(static_image_files, moving_image_files, prealign_file='', inv_static=False, level_iters=(10, 10, 5), metric='cc', mopt_sigma_diff=2.0, mopt_radius=4, mopt_smooth=0.0, mopt_inner_iter=0, mopt_q_levels=256, mopt_double_gradient=True, mopt_step_type='', step_length=0.25, ss_sigma_factor=0.2, opt_tol=1e-05, inv_iter=20, inv_tol=0.001, out_dir='', out_warped='warped_moved.nii.gz', out_inv_static='inc_static.nii.gz', out_field='displacement_field.nii.gz')[source]#
Parameters:
static_image_files : string

Path of the static image file.

moving_image_files : string

Path to the moving image file.

prealign_file : string, optional

The text file containing pre-alignment information via an affine matrix.

inv_static : boolean, optional

Apply the inverse mapping to the static image.

level_iters : variable int, optional

The number of iterations at each level of the Gaussian pyramid.

metric : string, optional

The metric to be used. Available metrics: cc (Cross Correlation), ssd (Sum Squared Difference), em (Expectation-Maximization).

mopt_sigma_diff : float, optional

Metric option applied on Cross Correlation (CC). The standard deviation of the Gaussian smoothing kernel to be applied to the update field at each iteration.

mopt_radius : int, optional

Metric option applied on Cross Correlation (CC). The radius of the squared (cubic) neighborhood at each voxel to be considered to compute the cross correlation.

mopt_smooth : float, optional

Metric option applied on Sum Squared Difference (SSD) and Expectation-Maximization (EM). Smoothness parameter; the larger the value the smoother the deformation field. (Default 1.0 for EM, 4.0 for SSD.)

mopt_inner_iter : int, optional

Metric option applied on Sum Squared Difference (SSD) and Expectation-Maximization (EM). This is the number of iterations to be performed at each level of the multi-resolution Gauss-Seidel optimization algorithm (this is not the number of steps per Gaussian pyramid level; that parameter must be set for the optimizer, not the metric). Default 5 for EM, 10 for SSD.

mopt_q_levels : int, optional

Metric option applied on Expectation-Maximization (EM). Number of quantization levels. (Default: 256 for EM.)

mopt_double_gradient : bool, optional

Metric option applied on Expectation-Maximization (EM). If True, the gradient of the expected static image under the moving modality will be added to the gradient of the moving image; similarly, the gradient of the expected moving image under the static modality will be added to the gradient of the static image.

mopt_step_type : string, optional

Metric option applied on Sum Squared Difference (SSD) and Expectation-Maximization (EM). The optimization schedule to be used in the multi-resolution Gauss-Seidel optimization algorithm (not used if Demons Step is selected). Possible values: ‘gauss_newton’, ‘demons’. Default: ‘gauss_newton’ for EM, ‘demons’ for SSD.

step_length : float, optional

The length of the maximum displacement vector of the update displacement field at each iteration.

ss_sigma_factor : float, optional

Parameter of the scale-space smoothing kernel. For example, the std. dev. of the kernel will be factor*(2^i) in the isotropic case, where i = 0, 1, …, n_scales is the scale.

opt_tol : float, optional

The optimization will stop when the estimated derivative of the energy profile w.r.t. time falls below this threshold.

inv_iter : int, optional

The number of iterations to be performed by the displacement field inversion algorithm.

inv_tol : float, optional

The displacement field inversion algorithm will stop iterating when the inversion error falls below this threshold.

out_dir : string, optional

Directory to save the transformed files.

out_warped : string, optional

Name of the warped file.

out_inv_static : string, optional

Name of the file to save the static image after applying the inverse mapping.

out_field : string, optional

Name of the file to save the diffeomorphic map.
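A hedged sketch of a SyN registration seeded with the affine from a prior linear step (file names hypothetical):

    from dipy.workflows.align import SynRegistrationFlow

    # Nonlinear (diffeomorphic) registration with the CC metric;
    # prealign_file carries the affine from a previous linear run.
    flow = SynRegistrationFlow()
    flow.run("static_t1.nii.gz", "moving_t1.nii.gz",
             prealign_file="reg_out/affine.txt",
             metric="cc", level_iters=[10, 10, 5],
             out_dir="syn_out")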

MotionCorrectionFlow#

class dipy.workflows.align.MotionCorrectionFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

The Motion Correction workflow allows the user to align the volumes of a DWI dataset.

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow used to subdivide.

get_sub_runs()

Return no sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files, bvalues_files, bvectors_files)

Parameters:

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

run(input_files, bvalues_files, bvectors_files, b0_threshold=50, bvecs_tol=0.01, out_dir='', out_moved='moved.nii.gz', out_affine='affine.txt')[source]#
Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

bvalues_files : string

Path to the bvalues files. This path may contain wildcards to use multiple bvalues files at once.

bvectors_files : string

Path to the bvectors files. This path may contain wildcards to use multiple bvectors files at once.

b0_threshold : float, optional

Threshold used to find b0 volumes.

bvecs_tol : float, optional

Threshold used to check that norm(bvec) = 1 +/- bvecs_tol, i.e. that the b-vectors are unit vectors.

out_dir : string, optional

Directory to save the transformed image and the affine matrix.

out_moved : string, optional

Name for the saved transformed image.

out_affine : string, optional

Name for the saved affine matrix.
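A minimal sketch with a hypothetical DWI dataset and its gradient files:

    from dipy.workflows.align import MotionCorrectionFlow

    # Register each volume of the 4D series, using b0_threshold to
    # identify the b0 reference volumes.
    flow = MotionCorrectionFlow()
    flow.run("dwi.nii.gz", "dwi.bval", "dwi.bvec",
             b0_threshold=50, out_dir="moco_out")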

BundleWarpFlow#

class dipy.workflows.align.BundleWarpFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow used to subdivide.

get_sub_runs()

Return no sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(static_file, moving_file[, dist, alpha, ...])

BundleWarp: streamline-based nonlinear registration.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with DTI reconstruction and CSD reconstruction might end up with the b0_threshold parameter in both. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(static_file, moving_file, dist=None, alpha=0.3, beta=20, max_iter=15, affine=True, out_dir='', out_linear_moved='linearly_moved.trk', out_nonlinear_moved='nonlinearly_moved.trk', out_warp_transform='warp_transform.npy', out_warp_kernel='warp_kernel.npy', out_dist='distance_matrix.npy', out_matched_pairs='matched_pairs.npy')[source]#

BundleWarp: streamline-based nonlinear registration.

BundleWarp [4] is a nonrigid registration method for deformable registration of white matter tracts.

Parameters:
static_file : string

Path to the static (reference) .trk file.

moving_file : string

Path to the moving (target to be registered) .trk file.

dist : string, optional

Path to the precalculated distance matrix file.

alpha : float, optional

Represents the trade-off between regularizing the deformation and having points match very closely. A lower value of alpha means higher deformation. It is represented with λ in the BundleWarp paper. NOTE: setting alpha <= 0.01 will result in a highly deformable registration that could extremely modify the original anatomy of the moving bundle.

beta : int, optional

Represents the strength of the interaction between points (Gaussian kernel size).

max_iter : int, optional

Maximum number of iterations for the deformation process in the ml-CPD method.

affine : boolean, optional

If False, use rigid registration as the starting point. (Default True.)

out_dir : string, optional

Output directory.

out_linear_moved : string, optional

Filename of linearly moved bundle.

out_nonlinear_moved : string, optional

Filename of nonlinearly moved (warped) bundle.

out_warp_transform : string, optional

Filename of warp transformations generated by BundleWarp.

out_warp_kernel : string, optional

Filename of the regularization Gaussian kernel generated by BundleWarp.

out_dist : string, optional

Filename of the MDF distance matrix.

out_matched_pairs : string, optional

Filename of matched pairs: streamline correspondences between the two bundles.

References
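A hedged sketch with hypothetical .trk bundles; alpha is left at its default to avoid the overly aggressive deformation warned about above:

    from dipy.workflows.align import BundleWarpFlow

    # Linear pre-registration (affine=True by default) followed by
    # the nonlinear BundleWarp deformation.
    flow = BundleWarpFlow()
    flow.run("static_bundle.trk", "moving_bundle.trk",
             alpha=0.3, beta=20, out_dir="bw_out")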

check_dimensions#

dipy.workflows.align.check_dimensions(static, moving)[source]#

Check the dimensions of the input images.

Parameters:
static : 2D or 3D array

The image to be used as reference during optimization.

moving : 2D or 3D array

The image to be used as “moving” during optimization. It is necessary to pre-align the moving image to ensure its domain lies inside the domain of the deformation fields. This is assumed to be accomplished by “pre-aligning” the moving image towards the static using an affine transformation given by the ‘starting_affine’ matrix.

IntrospectiveArgumentParser#

class dipy.workflows.base.IntrospectiveArgumentParser(prog=None, usage=None, description=None, epilog=None, parents=(), formatter_class=<class 'argparse.RawTextHelpFormatter'>, prefix_chars='-', fromfile_prefix_chars=None, argument_default=None, conflict_handler='resolve', add_help=True)[source]#

Bases: ArgumentParser

Attributes:
optional_parameters
output_parameters
positional_parameters

Methods

add_argument(add_argument)

add_sub_flow_args(sub_flows)

Take an array of workflow objects and use introspection to extract the parameters, types and docstrings of their run method.

add_subparsers(**kwargs)

add_workflow(workflow)

Take a workflow object and use introspection to extract the parameters, types and docstrings of its run method.

error(message)

Prints a usage message incorporating the message to stderr and exits.

exit([status, message])

format_usage()

get_flow_args([args, namespace])

Return the parsed arguments as a dictionary that will be used as a workflow's run method arguments.

parse_args([args, namespace])

print_usage([file])

register(registry_name, value, object)

set_defaults(**kwargs)

add_argument_group

add_description

add_epilogue

add_mutually_exclusive_group

convert_arg_line_to_args

format_help

get_default

parse_intermixed_args

parse_known_args

parse_known_intermixed_args

print_help

show_argument

update_argument

add_description()[source]#
add_epilogue()[source]#
add_sub_flow_args(sub_flows)[source]#

Take an array of workflow objects and use introspection to extract the parameters, types and docstrings of their run methods. Only the optional input parameters are extracted, as these objects are treated as sub-workflows.

Parameters:
sub_flows : array of dipy.workflows.workflow.Workflow

Workflows to inspect.

Returns:
sub_flow_optionals : dictionary of all sub-workflow optional parameters
add_workflow(workflow)[source]#

Take a workflow object and use introspection to extract the parameters, types and docstrings of its run method. Then add these parameters to the current argparser’s own params to parse. If the workflow is of type combined_workflow, the optional input parameters of its sub-workflows will also be added.

Parameters:
workflow : dipy.workflows.workflow.Workflow

Workflow from which to infer parameters.

Returns:
sub_flow_optionals : dictionary of all sub-workflow optional parameters
get_flow_args(args=None, namespace=None)[source]#

Return the parsed arguments as a dictionary that will be used as a workflow’s run method arguments.

property optional_parameters#
property output_parameters#
property positional_parameters#
show_argument(dest)[source]#
update_argument(*args, **kargs)[source]#
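The parser is typically paired with a single workflow, as in the minimal sketch below; the docstring of run() drives the generated command-line options:

    from dipy.workflows.base import IntrospectiveArgumentParser
    from dipy.workflows.align import ResliceFlow

    # Build a CLI by introspecting the workflow's run() method.
    parser = IntrospectiveArgumentParser()
    flow = ResliceFlow()
    parser.add_workflow(flow)

    # Parse sys.argv into a dict keyed by run() parameter names.
    run_args = parser.get_flow_args()
    flow.run(**run_args)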

add_default_args_to_docstring#

dipy.workflows.base.add_default_args_to_docstring(npds, func)[source]#

Add default arguments to the docstring of a function.

Parameters:
npds : list

List of parameters from the docstring.

func : function

Function to inspect.

get_args_default#

dipy.workflows.base.get_args_default(func)[source]#

none_or_dtype#

dipy.workflows.base.none_or_dtype(dtype)[source]#

Check None presence before type casting.

run#

dipy.workflows.cli.run()[source]#

Run the scripts declared in pyproject.toml.

CombinedWorkflow#

class dipy.workflows.combined_workflow.CombinedWorkflow(*, output_strategy='append', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_optionals(flow, **kwargs)

Returns the sub flow's optional arguments merged with those passed as params in kwargs.

get_short_name()

Return a short name for the workflow used to subdivide.

get_sub_runs()

Returns a list of tuples (sub flow name, sub flow run method, sub flow short name) to be used in the sub flow parameters extraction.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(*args, **kwargs)

Execute the workflow.

run_sub_flow(flow, *args, **kwargs)

Runs the sub flow with the optional parameters passed via the command line.

set_sub_flows_optionals(opts)

Sets the self._optionals variable with all sub flow arguments that were passed on the command line.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

get_optionals(flow, **kwargs)[source]#

Returns the sub flow’s optional arguments merged with those passed as params in kwargs.

get_sub_runs()[source]#

Returns a list of tuples (sub flow name, sub flow run method, sub flow short name) to be used in the sub flow parameters extraction.

run_sub_flow(flow, *args, **kwargs)[source]#

Runs the sub flow with the optional parameters passed via the command line. This is a convenience method to make sub flow running more intuitive on the concrete CombinedWorkflow side.

set_sub_flows_optionals(opts)[source]#

Sets the self._optionals variable with all sub flow arguments that were passed on the command line.
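A sketch of a concrete subclass following the pattern described above. The _get_sub_flows override is an assumption about the expected hook; the sub-workflow choice and parameter names are illustrative:

    from dipy.workflows.combined_workflow import CombinedWorkflow
    from dipy.workflows.denoise import NLMeansFlow

    class DenoisePipelineFlow(CombinedWorkflow):
        def _get_sub_flows(self):
            # Sub-workflows whose optional parameters are exposed
            # on the command line as nlmeans.<param>.
            return [NLMeansFlow]

        def run(self, input_files, out_dir=""):
            # run_sub_flow forwards any dotted command-line options
            # collected by set_sub_flows_optionals.
            self.run_sub_flow(NLMeansFlow(), input_files,
                              out_dir=out_dir)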

Patch2SelfFlow#

class dipy.workflows.denoise.Patch2SelfFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow used to subdivide.

get_sub_runs()

Return no sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files, bval_files[, model, ...])

Workflow for Patch2Self denoising method.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with DTI reconstruction and CSD reconstruction might end up with the b0_threshold parameter in both. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(input_files, bval_files, model='ols', b0_threshold=50, alpha=1.0, verbose=False, patch_radius=0, skip_b0_denoising=False, clip_negative_vals=False, skip_shift_intensity=False, ver=3, out_dir='', out_denoised='dwi_patch2self.nii.gz')[source]#

Workflow for Patch2Self denoising method.

See [5] for further details about the method. See [6] for further details about the new method.

It applies patch2self denoising [5] on each file found by ‘globbing’ input_files and bval_files. It saves the results in a directory specified by out_dir.

Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

bval_files : string

bval file associated with the diffusion data.

model : string or initialized linear model object, optional

This will determine the algorithm used to solve the set of linear equations underlying this model. If it is a string it needs to be one of the following: {‘ols’, ‘ridge’, ‘lasso’}. Otherwise, it can be an object that inherits from dipy.optimize.SKLearnLinearSolver or an object with a similar interface from scikit-learn: sklearn.linear_model.LinearRegression, sklearn.linear_model.Lasso or sklearn.linear_model.Ridge, and other objects that inherit from sklearn.base.RegressorMixin. Default: ‘ols’.

b0_threshold : int, optional

Threshold for considering volumes as b0.

alpha : float, optional

Regularization parameter; only used for the ridge regression model.

verbose : bool, optional

Show progress of Patch2Self and time taken.

patch_radius : variable int, optional

The radius of the local patch to be taken around each voxel.

skip_b0_denoising : bool, optional

Skips denoising b0 volumes if set to True.

clip_negative_vals : bool, optional

Sets negative values after denoising to 0 using np.clip.

skip_shift_intensity : bool, optional

Skips shifting the distribution of intensities per volume to give non-negative values if set to True.

ver : int, optional

Version of the Patch2Self algorithm to use, either 1 or 3.

out_dir : string, optional

Output directory.

out_denoised : string, optional

Name of the resulting denoised volume (default: dwi_patch2self.nii.gz).

References
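A minimal sketch with a hypothetical DWI/bval pair, using the default OLS model:

    from dipy.workflows.denoise import Patch2SelfFlow

    # Patch2Self needs the b-values to separate b0 from DWI volumes.
    flow = Patch2SelfFlow()
    flow.run("dwi.nii.gz", "dwi.bval", model="ols",
             out_dir="denoised")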

NLMeansFlow#

class dipy.workflows.denoise.NLMeansFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow used to subdivide.

get_sub_runs()

Return no sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files[, sigma, patch_radius, ...])

Workflow wrapping the nlmeans denoising method.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with DTI reconstruction and CSD reconstruction might end up with the b0_threshold parameter in both. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(input_files, sigma=0, patch_radius=1, block_radius=5, rician=True, out_dir='', out_denoised='dwi_nlmeans.nii.gz')[source]#

Workflow wrapping the nlmeans denoising method.

It applies nlmeans denoising [7] on each file found by ‘globbing’ input_files and saves the results in a directory specified by out_dir.

Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

sigma : float, optional

Sigma parameter to pass to the nlmeans algorithm.

patch_radius : int, optional

Patch size is 2 x patch_radius + 1.

block_radius : int, optional

Block size is 2 x block_radius + 1.

rician : bool, optional

If True the noise is estimated as Rician, otherwise Gaussian noise is assumed.

out_dir : string, optional

Output directory.

out_denoised : string, optional

Name of the resulting denoised volume.

References
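A minimal sketch (input name hypothetical); sigma is left at 0, which defers the noise estimate to the workflow:

    from dipy.workflows.denoise import NLMeansFlow

    flow = NLMeansFlow()
    flow.run("dwi.nii.gz", sigma=0, patch_radius=1,
             block_radius=5, rician=True, out_dir="denoised")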

LPCAFlow#

class dipy.workflows.denoise.LPCAFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow used to subdivide.

get_sub_runs()

Return no sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files, bvalues_files, bvectors_files)

Workflow wrapping LPCA denoising method.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with DTI reconstruction and CSD reconstruction might end up with the b0_threshold parameter in both. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(input_files, bvalues_files, bvectors_files, sigma=0, b0_threshold=50, bvecs_tol=0.01, patch_radius=2, pca_method='eig', tau_factor=2.3, out_dir='', out_denoised='dwi_lpca.nii.gz')[source]#

Workflow wrapping LPCA denoising method.

See [8] for further details about the method.

Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

bvalues_files : string

Path to the bvalues files. This path may contain wildcards to use multiple bvalues files at once.

bvectors_files : string

Path to the bvectors files. This path may contain wildcards to use multiple bvectors files at once.

sigma : float, optional

Standard deviation of the noise estimated from the data. 0 means sigma value estimation following the algorithm in Manjón et al. [9].

b0_threshold : float, optional

Threshold used to find b0 volumes.

bvecs_tol : float, optional

Threshold used to check that norm(bvec) = 1 +/- bvecs_tol, i.e. that the b-vectors are unit vectors.

patch_radius : int, optional

The radius of the local patch to be taken around each voxel (in voxels). For example, for a patch radius of 2, and assuming the input image is 3D, the denoising will take place in blocks of 5x5x5 voxels.

pca_method : string, optional

Use either eigenvalue decomposition (‘eig’) or singular value decomposition (‘svd’) for principal component analysis. The default method is ‘eig’, which is faster. However, occasionally ‘svd’ might be more accurate.

tau_factor : float, optional

Thresholding of PCA eigenvalues is done by nulling out eigenvalues that are smaller than

\[\tau = (\tau_{factor} \sigma)^2\]

\(\tau_{factor}\) can be changed to adjust the relationship between the noise standard deviation and the threshold \(\tau\). If \(\tau_{factor}\) is set to None, it will be automatically calculated using the Marcenko-Pastur distribution [10].

out_dir : string, optional

Output directory.

out_denoised : string, optional

Name of the resulting denoised volume.

References

MPPCAFlow#

class dipy.workflows.denoise.MPPCAFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow used to subdivide.

get_sub_runs()

Return no sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files[, patch_radius, pca_method, ...])

Workflow wrapping Marcenko-Pastur PCA denoising method.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with DTI reconstruction and CSD reconstruction might end up with the b0_threshold parameter in both. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(input_files, patch_radius=2, pca_method='eig', return_sigma=False, out_dir='', out_denoised='dwi_mppca.nii.gz', out_sigma='dwi_sigma.nii.gz')[source]#

Workflow wrapping Marcenko-Pastur PCA denoising method.

See [8] for further details about the method.

Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

patch_radius : variable int, optional

The radius of the local patch to be taken around each voxel (in voxels). For example, for a patch radius of 2, and assuming the input image is 3D, the denoising will take place in blocks of 5x5x5 voxels.

pca_method : string, optional

Use either eigenvalue decomposition (‘eig’) or singular value decomposition (‘svd’) for principal component analysis. The default method is ‘eig’, which is faster. However, occasionally ‘svd’ might be more accurate.

return_sigma : bool, optional

If true, a noise standard deviation estimate based on the Marcenko-Pastur distribution is returned [10].

out_dir : string, optional

Output directory.

out_denoised : string, optional

Name of the resulting denoised volume.

out_sigma : string, optional

Name of the resulting sigma volume.

References
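A minimal sketch (input name hypothetical) that also writes the noise map:

    from dipy.workflows.denoise import MPPCAFlow

    # return_sigma=True additionally saves the Marcenko-Pastur
    # noise estimate under out_sigma.
    flow = MPPCAFlow()
    flow.run("dwi.nii.gz", patch_radius=2, return_sigma=True,
             out_dir="denoised")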

GibbsRingingFlow#

class dipy.workflows.denoise.GibbsRingingFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow used to subdivide.

get_sub_runs()

Return no sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files[, slice_axis, n_points, ...])

Workflow for applying Gibbs Ringing method.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with DTI reconstruction and CSD reconstruction might end up with the b0_threshold parameter in both. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(input_files, slice_axis=2, n_points=3, num_processes=1, out_dir='', out_unring='dwi_unring.nii.gz')[source]#

Workflow for applying Gibbs Ringing method.

See [11] and [12] for further details about the method.

Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

slice_axis : int, optional

Data axis corresponding to the number of acquired slices. Could be 0, 1, or 2: for example, a value of 2 would mean the third axis.

n_points : int, optional

Number of neighbour points to access local TV (see note).

num_processes : int or None, optional

Split the calculation to a pool of children processes. Only applies to 3D or 4D data arrays. Default is 1. If < 0, the maximal number of cores minus num_processes + 1 is used (enter -1 to use as many cores as possible). 0 raises an error.

out_dir : string, optional

Output directory.

out_unring : string, optional

Name of the resulting denoised volume.

References
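A minimal sketch (input name hypothetical):

    from dipy.workflows.denoise import GibbsRingingFlow

    # num_processes=-1 uses as many cores as possible, per the
    # parameter description above.
    flow = GibbsRingingFlow()
    flow.run("dwi.nii.gz", slice_axis=2, n_points=3,
             num_processes=-1, out_dir="unring")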

Reader#

class dipy.workflows.docstring_parser.Reader(data)[source]#

Bases: object

A line-based string reader.

Methods

eof

is_empty

peek

read

read_to_condition

read_to_next_empty_line

read_to_next_unindented_line

reset

seek_next_non_empty_line

eof()[source]#
is_empty()[source]#
peek(n=0)[source]#
read()[source]#
read_to_condition(condition_func)[source]#
read_to_next_empty_line()[source]#
read_to_next_unindented_line()[source]#
reset()[source]#
seek_next_non_empty_line()[source]#

NumpyDocString#

class dipy.workflows.docstring_parser.NumpyDocString(docstring, *, config=None)[source]#

Bases: object

dedent_lines#

dipy.workflows.docstring_parser.dedent_lines(lines)[source]#

Deindent a list of lines maximally.

get_level#

dipy.workflows.flow_runner.get_level(lvl)[source]#

Transforms the logging level passed on the command line into a proper logging level name.

run_flow#

dipy.workflows.flow_runner.run_flow(flow, *, extra_args=None)[source]#

Wraps the process of building an argparser that reflects the workflow that we want to run along with some generic parameters like logging, force and output strategies. The resulting parameters are then fed to the workflow’s run method.
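This is the usual entry point for turning a workflow into a command-line script:

    from dipy.workflows.align import ResliceFlow
    from dipy.workflows.flow_runner import run_flow

    # run_flow builds the argparser from the workflow, parses
    # sys.argv along with the generic logging/force/output options,
    # and feeds the result to the workflow's run() method.
    if __name__ == "__main__":
        run_flow(ResliceFlow())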

IoInfoFlow#

class dipy.workflows.io.IoInfoFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow used to subdivide.

get_sub_runs()

Return no sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files[, b0_threshold, bvecs_tol, ...])

Provides useful information about different files used in medical imaging.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with DTI reconstruction and CSD reconstruction might end up with the b0_threshold parameter in both. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(input_files, b0_threshold=50, bvecs_tol=0.01, bshell_thr=100, reference=None)[source]#

Provides useful information about different files used in medical imaging. Any number of input files can be provided. The program identifies the type of file by its extension.

Parameters:
input_files : variable string

Any number of Nifti1, bvals or bvecs files.

b0_threshold : float, optional

Threshold used to find b0 volumes.

bvecs_tol : float, optional

Threshold used to check that norm(bvec) = 1 +/- bvecs_tol, i.e. that the b-vectors are unit vectors.

bshell_thr : float, optional

Threshold for distinguishing b-values in different shells.

reference : string, optional

Reference anatomy for tck/vtk/fib/dpy files. Supported formats: .nii or .nii.gz.

FetchFlow#

class dipy.workflows.io.FetchFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_fetcher_datanames()

Gets available dataset and function names.

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow used to subdivide.

get_sub_runs()

Return no sub runs since this is a simple workflow.

load_module(module_path)

Load / reload an external module.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(data_names[, subjects, ...])

Download files to a folder and check their md5 checksums.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

static get_fetcher_datanames()[source]#

Gets available dataset and function names.

Returns:
available_data: dict

Available dataset and function names.

classmethod get_short_name()[source]#

Return a short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with DTI reconstruction and CSD reconstruction might end up with the b0_threshold parameter in both. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

static load_module(module_path)[source]#

Load / reload an external module.

Parameters:
module_path : string

The path to the module, relative to the main script.

Returns:
module: module object
run(data_names, subjects=None, include_optional=False, include_afq=False, hcp_bucket='hcp-openaccess', hcp_profile_name='hcp', hcp_study='HCP_1200', hcp_aws_access_key_id=None, hcp_aws_secret_access_key=None, out_dir='')[source]#

Download files to a folder and check their md5 checksums.

To see all available datasets, please type “list” in data_names.

Parameters:
data_names : variable string

Any number of dataset names to download.

subjects : variable string, optional

Identifiers of the subjects to download. Used only by the HBN & HCP datasets. For example, with the HBN dataset: --subject NDARAA948VFH NDAREK918EC2.

include_optional : bool, optional

Include optional datasets.

include_afq : bool, optional

Whether to include pyAFQ derivatives. Used only by the HBN dataset.

hcp_bucket : string, optional

The name of the HCP S3 bucket.

hcp_profile_name : string, optional

The name of the AWS profile used for access.

hcp_study : string, optional

Which HCP study to grab.

hcp_aws_access_key_id : string, optional

AWS credentials for HCP AWS S3. Will only be used if the profile name is set to False.

hcp_aws_secret_access_key : string, optional

AWS credentials for HCP AWS S3. Will only be used if the profile name is set to False.

out_dir : string, optional

Output directory.
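A minimal sketch; “sherbrooke_3shell” is one of DIPY’s standard example datasets, but any name reported by “list” works:

    from dipy.workflows.io import FetchFlow

    # Print the available dataset names.
    FetchFlow().run(["list"])

    # Download a dataset and verify its md5 checksums.
    FetchFlow().run(["sherbrooke_3shell"], out_dir="data")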

SplitFlow#

class dipy.workflows.io.SplitFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow used to subdivide.

get_sub_runs()

Return no sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files[, vol_idx, out_dir, out_split])

Splits the input 4D file and extracts the required 3D volume.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with DTI reconstruction and CSD reconstruction might end up with the b0_threshold parameter in both. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(input_files, vol_idx=0, out_dir='', out_split='split.nii.gz')[source]#

Splits the input 4D file and extracts the required 3D volume.

Parameters:
input_files : variable string

Any number of Nifti1 files.

vol_idx : int, optional

Index of the 3D volume to extract.

out_dir : string, optional

Output directory.

out_split : string, optional

Name of the resulting split volume.

ExtractB0Flow#

class dipy.workflows.io.ExtractB0Flow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow used to subdivide.

get_sub_runs()

Return no sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files, bvalues_files[, ...])

Extract one or multiple b0 volumes from the input 4D file.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with DTI reconstruction and CSD reconstruction might end up with the b0_threshold parameter in both. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(input_files, bvalues_files, b0_threshold=50, group_contiguous_b0=False, strategy='mean', out_dir='', out_b0='b0.nii.gz')[source]#

Extract one or multiple b0 volumes from the input 4D file.

Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

bvalues_files : string

Path to the bvalues files. This path may contain wildcards to use multiple bvalues files at once.

b0_threshold : float, optional

Threshold used to find b0 volumes.

group_contiguous_b0 : bool, optional

If True, contiguous b0 volumes are grouped together.

strategy : str, optional

The extraction strategy, one of:

  • first: select the first b0 found.

  • all: select them all.

  • mean: average them.

When used in conjunction with group_contiguous_b0 set to True, the strategy is applied individually on each contiguous set found.

out_dir : string, optional

Output directory.

out_b0 : string, optional

Name of the resulting b0 volume.
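A minimal sketch (file names hypothetical) that averages all detected b0 volumes into a single image:

    from dipy.workflows.io import ExtractB0Flow

    flow = ExtractB0Flow()
    flow.run("dwi.nii.gz", "dwi.bval", b0_threshold=50,
             strategy="mean", out_dir="b0_out")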

ExtractShellFlow#

class dipy.workflows.io.ExtractShellFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow used to subdivide.

get_sub_runs()

Return no sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files, bvalues_files, bvectors_files)

Extract shells from the input 4D file.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with DTI reconstruction and CSD reconstruction might end up with the b0_threshold parameter in both. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(input_files, bvalues_files, bvectors_files, bvals_to_extract=None, b0_threshold=50, bvecs_tol=0.01, tol=20, group_shells=True, out_dir='', out_shell='shell.nii.gz')[source]#

Extract shells from the input 4D file.

Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

bvalues_files : string

Path to the bvalues files. This path may contain wildcards to use multiple bvalues files at once.

bvectors_files : string

Path to the bvectors files. This path may contain wildcards to use multiple bvectors files at once.

bvals_to_extract : string, optional

List of b-values to extract. You can provide a single b-value or a range of b-values separated by a dash. For example, to extract b-values 0, 1, and 2, you can use ‘0-2’. You can also provide a list of b-values separated by a comma. For example, to extract b-values 0, 1, 2, 8, 10, 11 and 12, you can use ‘0-2,8,10-12’. See the sketch after this parameter list.

b0_threshold : float, optional

Threshold used to find b0 volumes.

bvecs_tol : float, optional

Threshold used to check that norm(bvec) = 1 +/- bvecs_tol

tol : int, optional

Tolerance range for b-value selection. A value of 20 means volumes with b-values within ±20 units of the specified b-values will be extracted.

group_shells : bool, optional

If True, extracted volumes are grouped into a single array. If False, returns a list of separate volumes.

out_dir : string, optional

Output directory.

out_shell : string, optional

Name of the resulting shell volume.
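
A minimal sketch of the b-value selection syntax; the file names are hypothetical:

from dipy.workflows.io import ExtractShellFlow

# Keep the b=0 and b=1000 shells, accepting b-values within +/- 20 units.
ExtractShellFlow().run('dwi.nii.gz', 'dwi.bval', 'dwi.bvec',
                       bvals_to_extract='0,1000', tol=20, out_dir='out')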

ExtractVolumeFlow#

class dipy.workflows.io.ExtractVolumeFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return A short name for the workflow used to subdivide.

get_sub_runs()

Return No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files[, vol_idx, grouped, ...])

Extracts the required volume from the input 4D file.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return A short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having subworkflow parameters with the same name.

For example, in a combined workflow with dti reconstruction and csd reconstruction, both might end up with a b0_threshold parameter. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter that is easier to write on the command line.

run(input_files, vol_idx=0, grouped=True, out_dir='', out_vol='volume.nii.gz')[source]#

Extracts the required volume from the input 4D file.

Parameters:
input_files : string

Any number of Nifti1 files

vol_idx : string, optional

Indices of the 3D volumes to extract. Indices start from 0. You can provide a single index or a range of indices separated by a dash. For example, to extract volumes 0, 1, and 2, you can use ‘0-2’. You can also provide a list of indices separated by a comma. For example, to extract volumes 0, 1, 2, 8, 10, 11 and 12, you can use ‘0-2,8,10-12’. See the sketch after this parameter list.

grouped : bool, optional

If True, extracted volumes are grouped into a single array. If False, a list of separate volumes is saved.

out_dir : string, optional

Output directory.

out_vol : string, optional

Name of the resulting volume.
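
A minimal sketch of the index-range syntax; the file name is hypothetical:

from dipy.workflows.io import ExtractVolumeFlow

# Extract volumes 0, 1, 2 and 8 into a single grouped 4D array.
ExtractVolumeFlow().run('dwi.nii.gz', vol_idx='0-2,8', grouped=True, out_dir='out')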

ConcatenateTractogramFlow#

class dipy.workflows.io.ConcatenateTractogramFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return A short name for the workflow used to subdivide.

get_sub_runs()

Return No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(tractogram_files[, reference, ...])

Concatenate multiple tractograms into one.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return A short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having subworkflow parameters with the same name.

For example, in a combined workflow with dti reconstruction and csd reconstruction, both might end up with a b0_threshold parameter. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter that is easier to write on the command line.

run(tractogram_files, reference=None, delete_dpv=False, delete_dps=False, delete_groups=False, check_space_attributes=True, preallocation=False, out_dir='', out_extension='trx', out_tractogram='concatenated_tractogram')[source]#

Concatenate multiple tractograms into one.

Parameters:
tractogram_files : variable string

The stateful tractogram filenames to concatenate

reference : string, optional

Reference anatomy for tck/vtk/fib/dpy files. Supported extensions: .nii or .nii.gz.

delete_dpv : bool, optional

Delete dpv keys that do not exist in all the provided TrxFiles

delete_dps : bool, optional

Delete dps keys that do not exist in all the provided TrxFile

delete_groups : bool, optional

Delete all the groups that currently exist in the TrxFiles

check_space_attributes : bool, optional

Verify that dimensions and size of data are similar between all the TrxFiles

preallocation : bool, optional

Preallocated TrxFile has already been generated and is the first element in trx_list (Note: delete_groups must be set to True as well)

out_dir : string, optional

Output directory.

out_extension : string, optional

Extension of the resulting tractogram

out_tractogram : string, optional

Name of the resulting tractogram
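
A minimal usage sketch, assuming the variable-string input accepts a list of paths when called from Python; the file names are hypothetical:

from dipy.workflows.io import ConcatenateTractogramFlow

# Merge two tractograms into a single TRX file.
ConcatenateTractogramFlow().run(['bundle_a.trx', 'bundle_b.trx'],
                                out_dir='out', out_extension='trx')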

ConvertSHFlow#

class dipy.workflows.io.ConvertSHFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return A short name for the workflow used to subdivide.

get_sub_runs()

Return No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files[, out_dir, out_file])

Converts SH basis representation between DIPY and MRtrix3 formats.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return A short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having subworkflow parameters with the same name.

For example, in a combined workflow with dti reconstruction and csd reconstruction, both might end up with a b0_threshold parameter. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter that is easier to write on the command line.

run(input_files, out_dir='', out_file='sh_convert_dipy_mrtrix_out.nii.gz')[source]#

Converts SH basis representation between DIPY and MRtrix3 formats. Because this conversion is its own inverse, it can be used to convert in either direction: DIPY to MRtrix3 or vice versa.

Parameters:
input_files : string

Path to the input files. This path may contain wildcards to process multiple inputs at once.

out_dir : string, optional

Where the resulting file will be saved. (default ‘’)

out_file : string, optional

Name of the result file to be saved. (default ‘sh_convert_dipy_mrtrix_out.nii.gz’)
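
A minimal usage sketch; the file name is hypothetical, and the same call performs either direction of the conversion:

from dipy.workflows.io import ConvertSHFlow

# DIPY -> MRtrix3 or MRtrix3 -> DIPY: the reordering is its own inverse.
ConvertSHFlow().run('sh_coeffs.nii.gz', out_dir='out')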

ConvertTensorsFlow#

class dipy.workflows.io.ConvertTensorsFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return A short name for the workflow used to subdivide.

get_sub_runs()

Return No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(tensor_files[, from_format, to_format, ...])

Converts tensor representation between different formats.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return A short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having subworkflow parameters with the same name.

For example, in a combined workflow with dti reconstruction and csd reconstruction, both might end up with a b0_threshold parameter. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter that is easier to write on the command line.

run(tensor_files, from_format='mrtrix', to_format='dipy', out_dir='.', out_tensor='converted_tensor')[source]#

Converts tensor representation between different formats.

Parameters:
tensor_files : variable string

Any number of tensor files

from_format : string, optional

Format of the input tensor files. Valid options are ‘dipy’, ‘mrtrix’, ‘ants’, ‘fsl’.

to_format : string, optional

Format of the output tensor files. Valid options are ‘dipy’, ‘mrtrix’, ‘ants’, ‘fsl’.

out_dir : string, optional

Output directory.

out_tensor : string, optional

Name of the resulting tensor file

ConvertTractogramFlow#

class dipy.workflows.io.ConvertTractogramFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return A short name for the workflow used to subdivide.

get_sub_runs()

Return No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files[, reference, pos_dtype, ...])

Converts tractogram between different formats.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return A short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having subworkflow parameters with the same name.

For example, in a combined workflow with dti reconstruction and csd reconstruction, both might end up with a b0_threshold parameter. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter that is easier to write on the command line.

run(input_files, reference=None, pos_dtype='float32', offsets_dtype='uint32', out_dir='', out_tractogram='converted_tractogram.trk')[source]#

Converts tractogram between different formats.

Parameters:
input_files : variable string

Any number of tractogram files

reference : string, optional

Reference anatomy for tck/vtk/fib/dpy files. Supported extensions: .nii or .nii.gz.

pos_dtype : string, optional

Data type of the tractogram points, used for vtk files.

offsets_dtype : string, optional

Data type of the tractogram offsets, used for vtk files.

out_dir : string, optional

Output directory.

out_tractogram : string, optional

Name of the resulting tractogram

NiftisToPamFlow#

class dipy.workflows.io.NiftisToPamFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return A short name for the workflow used to subdivide.

get_sub_runs()

Return No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(peaks_dir_files, peaks_values_files, ...)

Convert multiple nifti files to a single pam5 file.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return A short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having subworkflow parameters with the same name.

For example, in a combined workflow with dti reconstruction and csd reconstruction, both might end up with a b0_threshold parameter. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter that is easier to write on the command line.

run(peaks_dir_files, peaks_values_files, peaks_indices_files, shm_files=None, gfa_files=None, sphere_files=None, default_sphere_name='repulsion724', out_dir='', out_pam='peaks.pam5')[source]#

Convert multiple nifti files to a single pam5 file.

Parameters:
peaks_dir_files : string

Path to the input peaks directions volume. This path may contain wildcards to process multiple inputs at once.

peaks_values_files : string

Path to the input peaks values volume. This path may contain wildcards to process multiple inputs at once.

peaks_indices_files : string

Path to the input peaks indices volume. This path may contain wildcards to process multiple inputs at once.

shm_files : string, optional

Path to the input spherical harmonics volume. This path may contain wildcards to process multiple inputs at once.

gfa_files : string, optional

Path to the input generalized FA volume. This path may contain wildcards to process multiple inputs at once.

sphere_files : string, optional

Path to the input sphere vertices. This path may contain wildcards to process multiple inputs at once. If it is not defined, the default_sphere_name option will be used.

default_sphere_name : string, optional

Specify the default sphere to use for the spherical harmonics representation. This option can be superseded by the sphere_files option. Possible options: [‘symmetric362’, ‘symmetric642’, ‘symmetric724’, ‘repulsion724’, ‘repulsion100’, ‘repulsion200’].

out_dir : string, optional

Output directory (default input file directory).

out_pam : string, optional

Name of the peaks volume to be saved.

TensorToPamFlow#

class dipy.workflows.io.TensorToPamFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return A short name for the workflow used to subdivide.

get_sub_runs()

Return No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(evals_files, evecs_files[, ...])

Convert multiple tensor files (evals, evecs) to pam5 files.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return A short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having subworkflow parameters with the same name.

For example, in a combined workflow with dti reconstruction and csd reconstruction, both might end up with a b0_threshold parameter. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter that is easier to write on the command line.

run(evals_files, evecs_files, sphere_files=None, default_sphere_name='repulsion724', out_dir='', out_pam='peaks.pam5')[source]#

Convert multiple tensor files (evals, evecs) to pam5 files.

Parameters:
evals_files : string

Path to the input eigen values volumes. This path may contain wildcards to process multiple inputs at once.

evecs_files : string

Path to the input eigen vectors volumes. This path may contain wildcards to process multiple inputs at once.

sphere_files : string, optional

Path to the input sphere vertices. This path may contain wildcards to process multiple inputs at once. If it is not defined, the default_sphere_name option will be used.

default_sphere_name : string, optional

Specify the default sphere to use for the spherical harmonics representation. This option can be superseded by the sphere_files option. Possible options: [‘symmetric362’, ‘symmetric642’, ‘symmetric724’, ‘repulsion724’, ‘repulsion100’, ‘repulsion200’].

out_dir : string, optional

Output directory (default input file directory).

out_pam : string, optional

Name of the peaks volume to be saved.

PamToNiftisFlow#

class dipy.workflows.io.PamToNiftisFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return A short name for the workflow used to subdivide.

get_sub_runs()

Return No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(pam_files[, out_dir, out_peaks_dir, ...])

Convert pam5 files to multiple nifti files.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return A short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having subworkflow parameters with the same name.

For example, in a combined workflow with dti reconstruction and csd reconstruction, both might end up with a b0_threshold parameter. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter that is easier to write on the command line.

run(pam_files, out_dir='', out_peaks_dir='peaks_dirs.nii.gz', out_peaks_values='peaks_values.nii.gz', out_peaks_indices='peaks_indices.nii.gz', out_shm='shm.nii.gz', out_gfa='gfa.nii.gz', out_sphere='sphere.txt', out_b='B.nii.gz', out_qa='qa.nii.gz')[source]#

Convert pam5 files to multiple nifti files.

Parameters:
pam_files : string

Path to the input peaks volumes. This path may contain wildcards to process multiple inputs at once.

out_dir : string, optional

Output directory (default input file directory).

out_peaks_dir : string, optional

Name of the peaks directions volume to be saved.

out_peaks_values : string, optional

Name of the peaks values volume to be saved.

out_peaks_indices : string, optional

Name of the peaks indices volume to be saved.

out_shm : string, optional

Name of the spherical harmonics volume to be saved.

out_gfa : string, optional

Generalized FA volume name to be saved.

out_sphere : string, optional

Sphere vertices name to be saved.

out_b : string, optional

Name of the B Matrix to be saved.

out_qa : string, optional

Name of the Quantitative Anisotropy file to be saved.

MathFlow#

class dipy.workflows.io.MathFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return A short name for the workflow used to subdivide.

get_sub_runs()

Return No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(operation, input_files[, dtype, ...])

Perform mathematical operations on volume input files.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return A short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having subworkflow parameters with the same name.

For example, in a combined workflow with dti reconstruction and csd reconstruction, both might end up with a b0_threshold parameter. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter that is easier to write on the command line.

run(operation, input_files, dtype=None, out_dir='', out_file='math_out.nii.gz')[source]#

Perform mathematical operations on volume input files.

This workflow allows the user to perform mathematical operations on multiple input files. For example, to add two volumes together and subtract a third: dipy_math "vol1 + vol2 - vol3" t1.nii.gz t1_a.nii.gz t1_b.nii.gz. The input files must be in Nifti format and have the same shape. See the sketch after this parameter list.

Parameters:
operation : string
Mathematical operation to perform. Supported operators are:
  • Bitwise operators (and, or, not, xor): &, |, ~, ^

  • Comparison operators: <, <=, ==, !=, >=, >

  • Unary arithmetic operators: -

  • Binary arithmetic operators: +, -, *, /, **, <<, >>

Supported functions are:
  • where(bool, number1, number2) -> number: number1 if the bool condition is true, number2 otherwise.

  • {sin,cos,tan}(float|complex) -> float|complex: trigonometric sine, cosine or tangent.

  • {arcsin,arccos,arctan}(float|complex) -> float|complex: trigonometric inverse sine, cosine or tangent.

  • arctan2(float1, float2) -> float: trigonometric inverse tangent of float1/float2.

  • {sinh,cosh,tanh}(float|complex) -> float|complex: hyperbolic sine, cosine or tangent.

  • {arcsinh,arccosh,arctanh}(float|complex) -> float|complex: hyperbolic inverse sine, cosine or tangent.

  • {log,log10,log1p}(float|complex) -> float|complex: natural, base-10 and log(1+x) logarithms.

  • {exp,expm1}(float|complex) -> float|complex: exponential and exponential minus one.

  • sqrt(float|complex) -> float|complex: square root.

  • abs(float|complex) -> float|complex: absolute value.

  • conj(complex) -> complex: conjugate value.

  • {real,imag}(complex) -> float: real or imaginary part of complex.

  • complex(float, float) -> complex: complex from real and imaginary parts.

  • contains(np.str, np.str) -> bool: returns True for every string in op1 that contains op2.

input_files : variable string

Any number of Nifti1 files

dtype : string, optional

Data type of the resulting file.

out_dir : string, optional

Output directory

out_file : string, optional

Name of the resulting file to be saved.
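
A minimal sketch mirroring the dipy_math example above, assuming a list of paths is accepted for the variable-string input when called from Python; the file names are hypothetical:

from dipy.workflows.io import MathFlow

# vol1, vol2, ... refer to the input files in the order they are given.
MathFlow().run('(vol1 + vol2) / 2', ['t1_a.nii.gz', 't1_b.nii.gz'],
               dtype='float32', out_dir='out')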

MaskFlow#

class dipy.workflows.mask.MaskFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return A short name for the workflow used to subdivide.

get_sub_runs()

Return No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files, lb[, ub, out_dir, out_mask])

Workflow for creating a binary mask

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return A short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having subworkflow parameters with the same name.

For example, in a combined workflow with dti reconstruction and csd reconstruction, both might end up with a b0_threshold parameter. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter that is easier to write on the command line.

run(input_files, lb, ub=inf, out_dir='', out_mask='mask.nii.gz')[source]#

Workflow for creating a binary mask

Parameters:
input_files : string

Path to image to be masked.

lb : float

Lower bound value.

ub : float, optional

Upper bound value.

out_dir : string, optional

Output directory.

out_mask : string, optional

Name of the masked file.

IOIterator#

class dipy.workflows.multi_io.IOIterator(*, output_strategy='absolute', mix_names=False)[source]#

Bases: object

Create output filenames that work nicely with multiple input files from multiple directories (processing multiple subjects with one command)

Use information from input files, out_dir and out_fnames to generate correct outputs which can come from long lists of multiple or single inputs.

Methods

create_directories

create_outputs

file_existence_check

set_inputs

set_out_dir

set_out_fnames

set_output_keys

create_directories()[source]#
create_outputs()[source]#
file_existence_check(args)[source]#
set_inputs(*args)[source]#
set_out_dir(out_dir)[source]#
set_out_fnames(*args)[source]#
set_output_keys(*args)[source]#

common_start#

dipy.workflows.multi_io.common_start(sa, sb)[source]#

Return the longest common substring from the beginning of sa and sb.

slash_to_under#

dipy.workflows.multi_io.slash_to_under(dir_str)[source]#

connect_output_paths#

dipy.workflows.multi_io.connect_output_paths(inputs, out_dir, out_files, *, output_strategy='absolute', mix_names=True)[source]#

Generate a list of output files paths based on input files and output strategies.

Parameters:
inputs : array

List of input paths.

out_dir : string

The output directory.

out_files : array

List of output files.

output_strategy : string, optional
Which strategy to use to generate the output paths.

  • ‘append’: Add out_dir to the path of the input.

  • ‘prepend’: Add the input path directory tree to out_dir.

  • ‘absolute’: Put directly in out_dir.

mix_names : bool, optional

Whether or not to prepend a string composed of a mix of the input names to the final output name.

Returns:
A list of output file paths.

concatenate_inputs#

dipy.workflows.multi_io.concatenate_inputs(multi_inputs)[source]#

Concatenate list of inputs.

basename_without_extension#

dipy.workflows.multi_io.basename_without_extension(fname)[source]#

io_iterator#

dipy.workflows.multi_io.io_iterator(inputs, out_dir, fnames, *, output_strategy='absolute', mix_names=False, out_keys=None)[source]#

Create an IOIterator from the parameters.

Parameters:
inputs : array

List of input files.

out_dir : string

Output directory.

fnames : array

File names of all outputs to be created.

output_strategy : string, optional

Controls the behavior of the IOIterator for output paths.

mix_names : bool, optional

Whether or not to prepend a mix of the input names to the output names.

out_keys : list, optional

Output parameter names.

Returns:
Properly instantiated IOIterator object.
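
A minimal sketch of how a workflow typically consumes the iterator, assuming each element of inputs corresponds to one input parameter and iteration yields matched input/output paths; the paths are hypothetical:

from dipy.workflows.multi_io import io_iterator

# One input parameter (a DWI path pattern) and one output name.
io_it = io_iterator(['sub1/dwi.nii.gz'], 'out', ['fa.nii.gz'])

for dwi_path, fa_path in io_it:
    pass  # process dwi_path, then write the result to fa_path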

EVACPlusFlow#

class dipy.workflows.nn.EVACPlusFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return A short name for the workflow used to subdivide.

get_sub_runs()

Return No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files[, save_masked, out_dir, ...])

Extract brain using EVAC+.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return A short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having subworkflow parameters with the same name.

For example, in a combined workflow with dti reconstruction and csd reconstruction, both might end up with a b0_threshold parameter. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter that is easier to write on the command line.

run(input_files, save_masked=False, out_dir='', out_mask='brain_mask.nii.gz', out_masked='dwi_masked.nii.gz')[source]#

Extract brain using EVAC+.

See [13] for further details about EVAC+.

Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

save_masked : bool, optional

Whether to save the masked volume in addition to the mask.

out_dir : string, optional

Output directory.

out_mask : string, optional

Name of the mask volume to be saved.

out_masked : string, optional

Name of the masked volume to be saved.
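
A minimal usage sketch; the file name is hypothetical:

from dipy.workflows.nn import EVACPlusFlow

# Brain extraction with EVAC+, also saving the masked volume.
EVACPlusFlow().run('t1.nii.gz', save_masked=True, out_dir='out')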

References

BiasFieldCorrectionFlow#

class dipy.workflows.nn.BiasFieldCorrectionFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return A short name for the workflow used to subdivide.

get_sub_runs()

Return No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files[, bval, bvec, method, ...])

Correct bias field.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return A short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having subworkflow parameters with the same name.

For example, in a combined workflow with dti reconstruction and csd reconstruction, both might end up with a b0_threshold parameter. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter that is easier to write on the command line.

run(input_files, bval=None, bvec=None, method='n4', threshold=0.5, use_cuda=False, verbose=False, out_dir='', out_corrected='biasfield_corrected.nii.gz')[source]#

Correct bias field.

Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

bval : string, optional

Path to the b-value file.

bvec : string, optional

Path to the b-vector file.

method : string, optional
Bias field correction method. Choose from:
  • ‘n4’: DeepN4 bias field correction. See [14] for more details.

  • ‘b0’: B0 bias field correction via normalization.

The ‘n4’ method is recommended for T1-weighted images, whereas the ‘b0’ method is recommended for diffusion-weighted images. See the sketch after this parameter list.

threshold : float, optional

Threshold for cleaning the final correction field in the DeepN4 method.

use_cuda : bool, optional

Use CUDA for DeepN4 bias field correction.

verbose : bool, optional

Print verbose output.

out_dir : string, optional

Output directory.

out_corrected : string, optional

Name of the corrected volume to be saved.
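
A short sketch of choosing between the two methods; the file names are hypothetical:

from dipy.workflows.nn import BiasFieldCorrectionFlow

# DeepN4 correction, recommended for T1-weighted images.
BiasFieldCorrectionFlow().run('t1.nii.gz', method='n4', out_dir='out')

# b0-based correction, recommended for diffusion-weighted images.
BiasFieldCorrectionFlow().run('dwi.nii.gz', bval='dwi.bval', bvec='dwi.bvec',
                              method='b0', out_dir='out')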

References

ReconstMAPMRIFlow#

class dipy.workflows.reconst.ReconstMAPMRIFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return A short name for the workflow used to subdivide.

get_sub_runs()

Return No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(data_files, bvals_files, bvecs_files, ...)

Workflow for fitting the MAPMRI model (with optional Laplacian regularization).

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return A short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having subworkflow parameters with the same name.

For example, in a combined workflow with dti reconstruction and csd reconstruction, both might end up with a b0_threshold parameter. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter that is easier to write on the command line.

run(data_files, bvals_files, bvecs_files, small_delta, big_delta, b0_threshold=50.0, laplacian=True, positivity=True, bval_threshold=2000, save_metrics=(), laplacian_weighting=0.05, radial_order=6, sphere_name=None, relative_peak_threshold=0.5, min_separation_angle=25, npeaks=5, normalize_peaks=False, extract_pam_values=False, out_dir='', out_rtop='rtop.nii.gz', out_lapnorm='lapnorm.nii.gz', out_msd='msd.nii.gz', out_qiv='qiv.nii.gz', out_rtap='rtap.nii.gz', out_rtpp='rtpp.nii.gz', out_ng='ng.nii.gz', out_perng='perng.nii.gz', out_parng='parng.nii.gz', out_pam='mapmri_peaks.pam5', out_peaks_dir='mapmri_peaks_dirs.nii.gz', out_peaks_values='mapmri_peaks_values.nii.gz', out_peaks_indices='mapmri_peaks_indices.nii.gz')[source]#

Workflow for fitting the MAPMRI model (with optional Laplacian regularization). Generates rtop, lapnorm, msd, qiv, rtap, rtpp, non-Gaussianity (ng), parallel ng and perpendicular ng maps from the input files provided by data_files, and saves them in Nifti format to the output directory specified by out_dir.

In order for the MAPMRI workflow to work as intended, either laplacian or positivity (or both) must be set to True.

Parameters:
data_files : string

Path to the input volume.

bvals_files : string

Path to the bval files.

bvecs_files : string

Path to the bvec files.

small_delta : float

Small delta value used in the generation of the gradient table from the provided bvals and bvecs.

big_delta : float

Big delta value used in the generation of the gradient table from the provided bvals and bvecs.

b0_threshold : float, optional

Threshold used to find b0 volumes.

laplacian : bool, optional

Regularize using the Laplacian of the MAP-MRI basis.

positivity : bool, optional

Constrain the propagator to be positive.

bval_threshold : float, optional

Sets the b-value threshold to be used in the scale factor estimation. In order for the estimated non-Gaussianity to have meaning, this value should be set to a low value (b < 2000 s/mm^2) so that the scale factors are estimated from signal points that reasonably represent Gaussian diffusion.

save_metrics : variable string, optional

List of metrics to save. Possible values: rtop, laplacian_signal, msd, qiv, rtap, rtpp, ng, perng, parng

laplacian_weighting : float, optional

Weighting value used when fitting the MAPMRI model with Laplacian regularization (alone or combined with positivity).

radial_order : unsigned int, optional

Even value used to set the order of the basis.

sphere_name : string, optional

Sphere name on which to reconstruct the fODFs.

relative_peak_threshold : float, optional

Only return peaks greater than relative_peak_threshold * m where m is the largest peak.

min_separation_angle : float, optional

The minimum distance between directions. If two peaks are too close only the larger of the two is returned.

npeaks : int, optional

Maximum number of peaks found.

normalize_peaks : bool, optional

If true, all peak values are calculated relative to max(odf).

extract_pam_values : bool, optional

Whether or not to save pam volumes as single nifti files.

out_dir : string, optional

Output directory.

out_rtop : string, optional

Name of the rtop to be saved.

out_lapnorm : string, optional

Name of the norm of Laplacian signal to be saved.

out_msd : string, optional

Name of the msd to be saved.

out_qiv : string, optional

Name of the qiv to be saved.

out_rtap : string, optional

Name of the rtap to be saved.

out_rtpp : string, optional

Name of the rtpp to be saved.

out_ng : string, optional

Name of the Non-Gaussianity to be saved.

out_perng : string, optional

Name of the Non-Gaussianity perpendicular to be saved.

out_parng : string, optional

Name of the Non-Gaussianity parallel to be saved.

out_pam : string, optional

Name of the peaks volume to be saved.

out_peaks_dir : string, optional

Name of the peaks directions volume to be saved.

out_peaks_values : string, optional

Name of the peaks values volume to be saved.

out_peaks_indices : string, optional

Name of the peaks indices volume to be saved.
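
A minimal usage sketch; the file names and the small_delta/big_delta pulse timings are illustrative assumptions for a typical acquisition:

from dipy.workflows.reconst import ReconstMAPMRIFlow

# Fit MAP-MRI with both Laplacian and positivity regularization.
ReconstMAPMRIFlow().run('dwi.nii.gz', 'dwi.bval', 'dwi.bvec', 0.0129, 0.0218,
                        laplacian=True, positivity=True, out_dir='out')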

ReconstDtiFlow#

class dipy.workflows.reconst.ReconstDtiFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return A short name for the workflow used to subdivide.

get_sub_runs()

Return No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files, bvalues_files, ...[, ...])

Workflow for tensor reconstruction and for computing DTI metrics using Weighted Least-Squares.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

get_fitted_tensor

get_fitted_tensor(data, mask, bval, bvec, b0_threshold=50, bvecs_tol=0.01, fit_method='WLS', optional_args=None)[source]#
classmethod get_short_name()[source]#

Return A short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having subworkflow parameters with the same name.

For example, in a combined workflow with dti reconstruction and csd reconstruction, both might end up with a b0_threshold parameter. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter that is easier to write on the command line.

run(input_files, bvalues_files, bvectors_files, mask_files, fit_method='WLS', b0_threshold=50, bvecs_tol=0.01, npeaks=1, sigma=None, save_metrics=None, nifti_tensor=True, extract_pam_values=False, out_dir='', out_tensor='tensors.nii.gz', out_fa='fa.nii.gz', out_ga='ga.nii.gz', out_rgb='rgb.nii.gz', out_md='md.nii.gz', out_ad='ad.nii.gz', out_rd='rd.nii.gz', out_mode='mode.nii.gz', out_evec='evecs.nii.gz', out_eval='evals.nii.gz', out_pam='peaks.pam5', out_peaks_dir='peaks_dirs.nii.gz', out_peaks_values='peaks_values.nii.gz', out_peaks_indices='peaks_indices.nii.gz', out_sphere='sphere.txt', out_qa='qa.nii.gz')[source]#

Workflow for tensor reconstruction and for computing DTI metrics using Weighted Least-Squares.

Performs a tensor reconstruction [15], [16] by globbing the input_files and saves the DTI metrics in a directory specified by out_dir.

Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

bvalues_files : string

Path to the bvalues files. This path may contain wildcards to use multiple bvalues files at once.

bvectors_files : string

Path to the bvectors files. This path may contain wildcards to use multiple bvectors files at once.

mask_files : string

Path to the input masks. This path may contain wildcards to use multiple masks at once.

fit_method : string, optional

Can be one of the following: ‘WLS’ for weighted least squares [17]; ‘LS’ or ‘OLS’ for ordinary least squares [17]; ‘NLLS’ for non-linear least-squares; ‘RT’, ‘restore’ or ‘RESTORE’ for RESTORE robust tensor fitting [18].

b0_threshold : float, optional

Threshold used to find b0 volumes.

bvecs_tol : float, optional

Threshold used to check that norm(bvec) = 1 +/- bvecs_tol

npeaks : int, optional

Number of peaks/eigenvectors to save in each voxel. DTI generates 3 eigenvalues and eigenvectors. The principal eigenvector is saved by default.

sigma : float, optional

An estimate of the variance. Chang et al. [18] recommend using 1.5267 * std(background_noise), where background_noise is estimated from some part of the image known to contain no signal (only noise).

save_metrics : variable string, optional

List of metrics to save. Possible values: fa, ga, rgb, md, ad, rd, mode, tensor, evec, eval

nifti_tensor : bool, optional

Whether the tensor is saved in the standard Nifti format or in an alternate format that is used by other software (e.g., FSL): a 4-dimensional volume (shape (i, j, k, 6)) with Dxx, Dxy, Dxz, Dyy, Dyz, Dzz on the last dimension.

extract_pam_values : bool, optional

Whether or not to save pam volumes as single nifti files.

out_dir : string, optional

Output directory.

out_tensor : string, optional

Name of the tensors volume to be saved. Per default, this will be saved following the nifti standard: with the tensor elements as Dxx, Dxy, Dyy, Dxz, Dyz, Dzz on the last (5th) dimension of the volume (shape: (i, j, k, 1, 6)). If nifti_tensor is False, this will be saved in an alternate format that is used by other software (e.g., FSL): a 4-dimensional volume (shape (i, j, k, 6)) with Dxx, Dxy, Dxz, Dyy, Dyz, Dzz on the last dimension.

out_fa : string, optional

Name of the fractional anisotropy volume to be saved.

out_ga : string, optional

Name of the geodesic anisotropy volume to be saved.

out_rgb : string, optional

Name of the color fa volume to be saved.

out_md : string, optional

Name of the mean diffusivity volume to be saved.

out_ad : string, optional

Name of the axial diffusivity volume to be saved.

out_rd : string, optional

Name of the radial diffusivity volume to be saved.

out_mode : string, optional

Name of the mode volume to be saved.

out_evec : string, optional

Name of the eigenvectors volume to be saved.

out_eval : string, optional

Name of the eigenvalues to be saved.

out_pam : string, optional

Name of the peaks volume to be saved.

out_peaks_dir : string, optional

Name of the peaks directions volume to be saved.

out_peaks_values : string, optional

Name of the peaks values volume to be saved.

out_peaks_indices : string, optional

Name of the peaks indices volume to be saved.

out_sphere : string, optional

Sphere vertices name to be saved.

out_qa : string, optional

Name of the Quantitative Anisotropy to be saved.
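
A minimal usage sketch, assuming save_metrics accepts a list of metric names when called from Python; the file names are hypothetical:

from dipy.workflows.reconst import ReconstDtiFlow

# Weighted least-squares fit, saving only the FA and MD maps.
ReconstDtiFlow().run('dwi.nii.gz', 'dwi.bval', 'dwi.bvec', 'mask.nii.gz',
                     fit_method='WLS', save_metrics=['fa', 'md'], out_dir='out')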

References

ReconstDsiFlow#

class dipy.workflows.reconst.ReconstDsiFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return A short name for the workflow used to subdivide.

get_sub_runs()

Return No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files, bvalues_files, ...[, ...])

Diffusion Spectrum Imaging (DSI) reconstruction workflow.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return A short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having subworkflow parameters with the same name.

For example, in a combined workflow with dti reconstruction and csd reconstruction, both might end up with a b0_threshold parameter. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter that is easier to write on the command line.

run(input_files, bvalues_files, bvectors_files, mask_files, qgrid_size=17, r_start=2.1, r_end=6.0, r_step=0.2, filter_width=32, remove_convolution=False, normalize_peaks=False, sphere_name=None, relative_peak_threshold=0.5, min_separation_angle=25, sh_order_max=8, extract_pam_values=False, parallel=False, num_processes=None, out_dir='', out_pam='peaks.pam5', out_shm='shm.nii.gz', out_peaks_dir='peaks_dirs.nii.gz', out_peaks_values='peaks_values.nii.gz', out_peaks_indices='peaks_indices.nii.gz', out_gfa='gfa.nii.gz', out_sphere='sphere.txt', out_b='B.nii.gz', out_qa='qa.nii.gz')[source]#

Diffusion Spectrum Imaging (DSI) reconstruction workflow.

In DSI, the diffusion signal is sampled on a Cartesian grid in q-space. When using remove_convolution=True, the convolution on the DSI propagator that is caused by the truncation of the q-space in the DSI sampling is removed.

Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

bvalues_files : string

Path to the bvalues files. This path may contain wildcards to use multiple bvalues files at once.

bvectors_files : string

Path to the bvectors files. This path may contain wildcards to use multiple bvectors files at once.

mask_files : string

Path to the input masks. This path may contain wildcards to use multiple masks at once.

qgrid_size : int, optional

Has to be an odd number. Sets the size of the q-space grid. For example, if qgrid_size is 17, the shape of the grid will be (17, 17, 17).

r_start : float, optional

The ODF is sampled radially in the PDF. This parameter sets where the sampling should start.

r_end : float, optional

Radial endpoint of ODF sampling

r_step : float, optional

Step size of the ODF sampling from r_start to r_end

filter_width : float, optional

Strength of the Hanning filter

remove_convolution : bool, optional

Whether to remove the convolution on the DSI propagator that is caused by the truncation of the q-space in the DSI sampling.

normalize_peaks : bool, optional

Whether to normalize the peaks

sphere_name : string, optional

Sphere name on which to reconstruct the fODFs.

relative_peak_threshold : float, optional

Only return peaks greater than relative_peak_threshold * m where m is the largest peak.

min_separation_angle : float, optional

The minimum distance between directions. If two peaks are too close only the larger of the two is returned.

sh_order_max : int, optional

Spherical harmonics order (l) used in the fit.

extract_pam_values : bool, optional

Whether or not to save pam volumes as single nifti files.

parallel : bool, optional

Whether to use parallelization in peak-finding during the calibration procedure.

num_processes : int, optional

If parallel is True, the number of subprocesses to use (default multiprocessing.cpu_count()). If < 0 the maximal number of cores minus num_processes + 1 is used (enter -1 to use as many cores as possible). 0 raises an error.

out_dir : string, optional

Output directory.

out_pam : string, optional

Name of the peaks volume to be saved.

out_shm : string, optional

Name of the spherical harmonics volume to be saved.

out_peaks_dir : string, optional

Name of the peaks directions volume to be saved.

out_peaks_values : string, optional

Name of the peaks values volume to be saved.

out_peaks_indices : string, optional

Name of the peaks indices volume to be saved.

out_gfa : string, optional

Name of the generalized FA volume to be saved.

out_sphere : string, optional

Sphere vertices name to be saved.

out_b : string, optional

Name of the B Matrix to be saved.

out_qa : string, optional

Name of the Quantitative Anisotropy to be saved.

ReconstCSDFlow#

class dipy.workflows.reconst.ReconstCSDFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return A short name for the workflow used to subdivide.

get_sub_runs()

Return No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files, bvalues_files, ...[, ...])

Constrained spherical deconvolution.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return A short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having subworkflow parameters with the same name.

For example, in a combined workflow with dti reconstruction and csd reconstruction, both might end up with a b0_threshold parameter. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter that is easier to write on the command line.

run(input_files, bvalues_files, bvectors_files, mask_files, b0_threshold=50.0, bvecs_tol=0.01, roi_center=None, roi_radii=10, fa_thr=0.7, frf=None, sphere_name=None, relative_peak_threshold=0.5, min_separation_angle=25, sh_order_max=8, parallel=False, extract_pam_values=False, num_processes=None, out_dir='', out_pam='peaks.pam5', out_shm='shm.nii.gz', out_peaks_dir='peaks_dirs.nii.gz', out_peaks_values='peaks_values.nii.gz', out_peaks_indices='peaks_indices.nii.gz', out_gfa='gfa.nii.gz', out_sphere='sphere.txt', out_b='B.nii.gz', out_qa='qa.nii.gz')[source]#

Constrained spherical deconvolution.

See [19] for further details about the method.

Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

bvalues_files : string

Path to the bvalues files. This path may contain wildcards to use multiple bvalues files at once.

bvectors_files : string

Path to the bvectors files. This path may contain wildcards to use multiple bvectors files at once.

mask_files : string

Path to the input masks. This path may contain wildcards to use multiple masks at once. (default: No mask used)

b0_threshold : float, optional

Threshold used to find b0 volumes.

bvecs_tol : float, optional

Tolerance used to check that the b-vectors are unit vectors.

roi_center : variable int, optional

Center of ROI in data. If center is None, it is assumed that it is the center of the volume with shape data.shape[:3].

roi_radii : int or array-like, optional

radii of cuboid ROI in voxels.

fa_thr : float, optional

FA threshold for calculating the response function.

frf : variable float, optional

Fiber response function. It can be inputted, for example, as 15 4 4 (from the command line) or [15, 4, 4] (from a Python script); the values are converted to float and multiplied by 10**-4. If None, the fiber response function will be computed automatically. See the sketch after this parameter list.

sphere_name : string, optional

Sphere name on which to reconstruct the fODFs.

relative_peak_threshold : float, optional

Only return peaks greater than relative_peak_threshold * m where m is the largest peak.

min_separation_angle : float, optional

The minimum distance between directions. If two peaks are too close only the larger of the two is returned.

sh_order_max : int, optional

Spherical harmonics order (l) used in the CSD fit.

parallel : bool, optional

Whether to use parallelization in peak-finding during the calibration procedure.

extract_pam_values : bool, optional

Whether or not to save pam volumes as single nifti files.

num_processes : int, optional

If parallel is True, the number of subprocesses to use (default multiprocessing.cpu_count()). If < 0 the maximal number of cores minus num_processes + 1 is used (enter -1 to use as many cores as possible). 0 raises an error.

out_dir : string, optional

Output directory.

out_pam : string, optional

Name of the peaks volume to be saved.

out_shm : string, optional

Name of the spherical harmonics volume to be saved.

out_peaks_dir : string, optional

Name of the peaks directions volume to be saved.

out_peaks_values : string, optional

Name of the peaks values volume to be saved.

out_peaks_indices : string, optional

Name of the peaks indices volume to be saved.

out_gfa : string, optional

Name of the generalized FA volume to be saved.

out_sphere : string, optional

Sphere vertices name to be saved.

out_b : string, optional

Name of the B Matrix to be saved.

out_qa : string, optional

Name of the Quantitative Anisotropy to be saved.
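
A minimal sketch of passing a manual fiber response function, as described for the frf parameter above; the file names are hypothetical:

from dipy.workflows.reconst import ReconstCSDFlow

# frf=[15, 4, 4] is converted to floats and scaled by 10**-4 internally,
# i.e. response eigenvalues (15e-4, 4e-4, 4e-4).
ReconstCSDFlow().run('dwi.nii.gz', 'dwi.bval', 'dwi.bvec', 'mask.nii.gz',
                     frf=[15, 4, 4], sh_order_max=8, out_dir='out')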

References

ReconstQBallBaseFlow#

class dipy.workflows.reconst.ReconstQBallBaseFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return A short name for the workflow used to subdivide.

get_sub_runs()

Return No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files, bvalues_files, ...[, ...])

Constant Solid Angle.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return A short name for the workflow used to subdivide.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having subworkflow parameters with the same name.

For example, in a combined workflow with dti reconstruction and csd reconstruction, both might end up with a b0_threshold parameter. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter that is easier to write on the command line.

run(input_files, bvalues_files, bvectors_files, mask_files, *, method='csa', smooth=0.006, min_signal=1e-05, assume_normed=False, b0_threshold=50.0, bvecs_tol=0.01, sphere_name=None, relative_peak_threshold=0.5, min_separation_angle=25, sh_order_max=8, parallel=False, extract_pam_values=False, num_processes=None, out_dir='', out_pam='peaks.pam5', out_shm='shm.nii.gz', out_peaks_dir='peaks_dirs.nii.gz', out_peaks_values='peaks_values.nii.gz', out_peaks_indices='peaks_indices.nii.gz', out_sphere='sphere.txt', out_gfa='gfa.nii.gz', out_b='B.nii.gz', out_qa='qa.nii.gz')[source]#

Constant Solid Angle.

See [20] for further details about the method.

Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

bvalues_files : string

Path to the bvalues files. This path may contain wildcards to use multiple bvalues files at once.

bvectors_files : string

Path to the bvectors files. This path may contain wildcards to use multiple bvectors files at once.

mask_files : string

Path to the input masks. This path may contain wildcards to use multiple masks at once. (default: No mask used)

method : string, optional

Method to use for the reconstruction. Can be one of the following: ‘csa’ for Constant Solid Angle reconstruction; ‘qball’ for Q-Ball reconstruction; ‘opdt’ for Orientation Probability Density Transform reconstruction.

smooth : float, optional

The regularization parameter of the model.

min_signal : float, optional

During fitting, all signal values less than min_signal are clipped to min_signal. This is done primarily to avoid values less than or equal to zero when taking logs.

assume_normed : bool, optional

If True, clipping and normalization of the data with respect to the mean b0 signal are skipped during model fitting. This is an advanced feature and should be used with care.

b0_threshold : float, optional

Threshold used to find b0 volumes.

bvecs_tol : float, optional

Threshold used so that norm(bvec)=1.

sphere_name : string, optional

Sphere name on which to reconstruct the fODFs.

relative_peak_threshold : float, optional

Only return peaks greater than relative_peak_threshold * m where m is the largest peak.

min_separation_angle : float, optional

The minimum distance between directions. If two peaks are too close only the larger of the two is returned.

sh_order_max : int, optional

Spherical harmonics order (l) used in the fit.

parallel : bool, optional

Whether to use parallelization in peak-finding during the calibration procedure.

extract_pam_values : bool, optional

Whether or not to save pam volumes as single nifti files.

num_processes : int, optional

If parallel is True, the number of subprocesses to use (default multiprocessing.cpu_count()). If < 0 the maximal number of cores minus num_processes + 1 is used (enter -1 to use as many cores as possible). 0 raises an error.

out_dir : string, optional

Output directory.

out_pam : string, optional

Name of the peaks volume to be saved.

out_shm : string, optional

Name of the spherical harmonics volume to be saved.

out_peaks_dir : string, optional

Name of the peaks directions volume to be saved.

out_peaks_values : string, optional

Name of the peaks values volume to be saved.

out_peaks_indices : string, optional

Name of the peaks indices volume to be saved.

out_sphere : string, optional

Sphere vertices name to be saved.

out_gfa : string, optional

Name of the generalized FA volume to be saved.

out_b : string, optional

Name of the B Matrix to be saved.

out_qa : string, optional

Name of the Quantitative Anisotropy to be saved.

References

ReconstDkiFlow#

class dipy.workflows.reconst.ReconstDkiFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return A short name for the workflow used to subdivide.

get_sub_runs()

Return No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files, bvalues_files, ...[, ...])

Workflow for Diffusion Kurtosis reconstruction and for computing DKI metrics.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

get_fitted_tensor

get_fitted_tensor(data, mask, bval, bvec, b0_threshold=50, fit_method='WLS', optional_args=None)[source]#
classmethod get_short_name()[source]#

Return a short name for the workflow, used to subdivide command-line parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might end up with the b0_threshold parameter twice. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(input_files, bvalues_files, bvectors_files, mask_files, fit_method='WLS', b0_threshold=50.0, sigma=None, save_metrics=None, extract_pam_values=False, npeaks=5, out_dir='', out_dt_tensor='dti_tensors.nii.gz', out_fa='fa.nii.gz', out_ga='ga.nii.gz', out_rgb='rgb.nii.gz', out_md='md.nii.gz', out_ad='ad.nii.gz', out_rd='rd.nii.gz', out_mode='mode.nii.gz', out_evec='evecs.nii.gz', out_eval='evals.nii.gz', out_dk_tensor='dki_tensors.nii.gz', out_mk='mk.nii.gz', out_ak='ak.nii.gz', out_rk='rk.nii.gz', out_pam='peaks.pam5', out_peaks_dir='peaks_dirs.nii.gz', out_peaks_values='peaks_values.nii.gz', out_peaks_indices='peaks_indices.nii.gz', out_sphere='sphere.txt')[source]#

Workflow for Diffusion Kurtosis reconstruction and for computing DKI metrics.

Performs a DKI reconstruction [21], [22] on the files by ‘globbing’ input_files and saves the DKI metrics in a directory specified by out_dir.

Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

bvalues_files : string

Path to the bvalues files. This path may contain wildcards to use multiple bvalues files at once.

bvectors_files : string

Path to the bvectors files. This path may contain wildcards to use multiple bvectors files at once.

mask_files : string

Path to the input masks. This path may contain wildcards to use multiple masks at once. (default: No mask used)

fit_method : string, optional

Can be one of the following: ‘OLS’ or ‘ULLS’ for ordinary least squares; ‘WLS’ or ‘UWLLS’ for weighted least squares.

b0_threshold : float, optional

Threshold used to find b0 volumes.

sigma : float, optional

An estimate of the variance. Chang et al.[18] recommend using 1.5267 * std(background_noise), where background_noise is estimated from some part of the image known to contain no signal (only noise).

save_metrics : variable string, optional

List of metrics to save. Possible values: fa, ga, rgb, md, ad, rd, mode, tensor, evec, eval

extract_pam_values : bool, optional

Whether or not to save pam volumes as single nifti files.

npeaks : int, optional

Number of peaks to fit in each voxel.

out_dir : string, optional

Output directory.

out_dt_tensor : string, optional

Name of the DTI tensors volume to be saved.

out_dk_tensor : string, optional

Name of the DKI tensors volume to be saved.

out_fa : string, optional

Name of the fractional anisotropy volume to be saved.

out_ga : string, optional

Name of the geodesic anisotropy volume to be saved.

out_rgb : string, optional

Name of the color FA volume to be saved.

out_md : string, optional

Name of the mean diffusivity volume to be saved.

out_ad : string, optional

Name of the axial diffusivity volume to be saved.

out_rd : string, optional

Name of the radial diffusivity volume to be saved.

out_mode : string, optional

Name of the mode volume to be saved.

out_evec : string, optional

Name of the eigenvectors volume to be saved.

out_eval : string, optional

Name of the eigenvalues to be saved.

out_mk : string, optional

Name of the mean kurtosis volume to be saved.

out_ak : string, optional

Name of the axial kurtosis volume to be saved.

out_rk : string, optional

Name of the radial kurtosis volume to be saved.

out_pam : string, optional

Name of the peaks volume to be saved.

out_peaks_dir : string, optional

Name of the peaks directions volume to be saved.

out_peaks_values : string, optional

Name of the peaks values volume to be saved.

out_peaks_indices : string, optional

Name of the peaks indices volume to be saved.

out_sphere : string, optional

Sphere vertices name to be saved.
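A minimal sketch of driving this workflow directly from Python (the file names are placeholders; any DWI volume with matching bvals/bvecs and a mask will do):

    from dipy.workflows.reconst import ReconstDkiFlow

    dki_flow = ReconstDkiFlow()
    # Fit DKI with weighted least squares and write the metrics to dki_out/.
    dki_flow.run("dwi.nii.gz", "dwi.bval", "dwi.bvec", "mask.nii.gz",
                 fit_method="WLS", save_metrics=["fa", "md"],
                 out_dir="dki_out")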

References

ReconstIvimFlow#

class dipy.workflows.reconst.ReconstIvimFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow, used to subdivide command-line parameters.

get_sub_runs()

Return no sub-runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files, bvalues_files, ...[, ...])

Workflow for Intra-voxel Incoherent Motion reconstruction and for computing IVIM metrics.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

get_fitted_ivim

get_fitted_ivim(data, mask, bval, bvec, *, b0_threshold=50)[source]#
classmethod get_short_name()[source]#

Return a short name for the workflow, used to subdivide command-line parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might end up with the b0_threshold parameter twice. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(input_files, bvalues_files, bvectors_files, mask_files, split_b_D=400, split_b_S0=200, b0_threshold=0, save_metrics=None, out_dir='', out_S0_predicted='S0_predicted.nii.gz', out_perfusion_fraction='perfusion_fraction.nii.gz', out_D_star='D_star.nii.gz', out_D='D.nii.gz')[source]#

Workflow for Intra-voxel Incoherent Motion reconstruction and for computing IVIM metrics.

Performs an IVIM reconstruction [23], [24] on the files by ‘globbing’ input_files and saves the IVIM metrics in a directory specified by out_dir.

Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

bvalues_files : string

Path to the bvalues files. This path may contain wildcards to use multiple bvalues files at once.

bvectors_files : string

Path to the bvectors files. This path may contain wildcards to use multiple bvectors files at once.

mask_files : string

Path to the input masks. This path may contain wildcards to use multiple masks at once. (default: No mask used)

split_b_D : int, optional

Value of b used to split the bvals for estimating D in the two-stage fitting process.

split_b_S0 : int, optional

Value of b used to split the bvals for estimating S0 in the two-stage fitting process.

b0_threshold : int, optional

Threshold value for the b0 bval.

save_metrics : variable string, optional

List of metrics to save. Possible values: S0_predicted, perfusion_fraction, D_star, D

out_dir : string, optional

Output directory.

out_S0_predicted : string, optional

Name of the estimated S0 signal to be saved.

out_perfusion_fraction : string, optional

Name of the estimated volume fractions to be saved.

out_D_star : string, optional

Name of the estimated pseudo-diffusion parameter to be saved.

out_D : string, optional

Name of the estimated diffusion parameter to be saved.
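A minimal usage sketch (placeholder file names), relying on the two-stage split values documented above:

    from dipy.workflows.reconst import ReconstIvimFlow

    ivim_flow = ReconstIvimFlow()
    # bvals above split_b_D estimate D; those below split_b_S0 estimate S0.
    ivim_flow.run("dwi.nii.gz", "dwi.bval", "dwi.bvec", "mask.nii.gz",
                  split_b_D=400, split_b_S0=200, out_dir="ivim_out")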

References

ReconstRUMBAFlow#

class dipy.workflows.reconst.ReconstRUMBAFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow, used to subdivide command-line parameters.

get_sub_runs()

Return no sub-runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files, bvalues_files, ...[, ...])

Reconstruct the fiber local orientations using the Robust and Unbiased Model-BAsed Spherical Deconvolution (RUMBA-SD) model.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow, used to subdivide command-line parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might end up with the b0_threshold parameter twice. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(input_files, bvalues_files, bvectors_files, mask_files, *, b0_threshold=50.0, bvecs_tol=0.01, roi_center=None, roi_radii=10, fa_thr=0.7, extract_pam_values=False, sh_order_max=8, parallel=True, num_processes=None, gm_response=0.0008, csf_response=0.003, n_iter=600, recon_type='smf', n_coils=1, R=1, voxelwise=True, use_tv=False, sphere_name='repulsion724', verbose=False, relative_peak_threshold=0.5, min_separation_angle=25, out_dir='', out_pam='peaks.pam5', out_shm='shm.nii.gz', out_peaks_dir='peaks_dirs.nii.gz', out_peaks_values='peaks_values.nii.gz', out_peaks_indices='peaks_indices.nii.gz', out_gfa='gfa.nii.gz', out_sphere='sphere.txt', out_b='B.nii.gz', out_qa='qa.nii.gz')[source]#

Reconstruct the fiber local orientations using the Robust and Unbiased Model-BAsed Spherical Deconvolution (RUMBA-SD) model.

The fiber response function (FRF) is computed using the single-shell, single-tissue model, and the voxel-wise fitting procedure is used for RUMBA-SD [25].

Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

bvalues_files : string

Path to the bvalues files. This path may contain wildcards to use multiple bvalues files at once.

bvectors_files : string

Path to the bvectors files. This path may contain wildcards to use multiple bvectors files at once.

mask_files : string

Path to the input masks. This path may contain wildcards to use multiple masks at once.

b0_threshold : float, optional

Threshold used to find b0 volumes.

bvecs_tol : float, optional

Tolerance used to check that bvecs are unit vectors.

roi_center : variable int, optional

Center of ROI in data. If center is None, it is assumed that it is the center of the volume with shape data.shape[:3].

roi_radii : variable int, optional

Radii of cuboid ROI in voxels.

fa_thr : float, optional

FA threshold to compute the WM response function.

extract_pam_values : bool, optional

Whether or not to save pam volumes as single nifti files.

sh_order_max : int, optional

Spherical harmonics order (l) used in the RUMBA fit.

parallel : bool, optional

Whether to use parallelization in peak-finding during the calibration procedure.

num_processes : int, optional

If parallel is True, the number of subprocesses to use (default: multiprocessing.cpu_count()). If < 0, the maximal number of cores minus num_processes + 1 is used; enter -1 to use as many cores as possible, and 0 raises an error.

gm_response : float, optional

Mean diffusivity for the GM compartment. If None, the grey matter volume fraction is not computed.

csf_response : float, optional

Mean diffusivity for the CSF compartment. If None, the CSF volume fraction is not computed.

n_iter : int, optional

Number of iterations for fODF estimation. Must be a positive int.

recon_type : str, optional

MRI reconstruction method type: spatial matched filter (smf) or sum-of-squares (sos). SMF reconstruction generates Rician noise, while SoS reconstruction generates Noncentral Chi noise.

n_coils : int, optional

Number of coils in the MRI scanner; only relevant for SoS reconstruction. Must be a positive int. Default: 1

R : int, optional

Acceleration factor of the acquisition. For SIEMENS, R = iPAT factor. For GE, R = ASSET factor. For PHILIPS, R = SENSE factor. Typical values are 1 or 2. Must be a positive integer.

voxelwise : bool, optional

If True, performs a voxelwise fit. If False, performs a global fit on the entire brain at once, which requires a 4D brain volume.

use_tv : bool, optional

If True, applies total variation regularization. This only takes effect in a global fit (voxelwise set to False). TV can only be applied to 4D brain volumes with no singleton dimensions.

sphere_name : str, optional

Sphere name on which to reconstruct the fODFs.

verbose : bool, optional

If True, logs updates on the estimated signal-to-noise ratio after each iteration. This only takes effect in a global fit (voxelwise set to False).

relative_peak_threshold : float, optional

Only return peaks greater than relative_peak_threshold * m where m is the largest peak.

min_separation_angle : float, optional

The minimum distance between directions. If two peaks are too close, only the larger of the two is returned.

out_dir : string, optional

Output directory.

out_pam : string, optional

Name of the peaks volume to be saved.

out_shm : string, optional

Name of the spherical harmonics volume to be saved.

out_peaks_dir : string, optional

Name of the peaks directions volume to be saved.

out_peaks_values : string, optional

Name of the peaks values volume to be saved.

out_peaks_indices : string, optional

Name of the peaks indices volume to be saved.

out_gfa : string, optional

Name of the generalized FA volume to be saved.

out_sphere : string, optional

Sphere vertices name to be saved.

out_b : string, optional

Name of the B matrix to be saved.

out_qa : string, optional

Name of the Quantitative Anisotropy to be saved.
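A minimal usage sketch (placeholder file names; all arguments after mask_files are keyword-only per the signature above):

    from dipy.workflows.reconst import ReconstRUMBAFlow

    rumba_flow = ReconstRUMBAFlow()
    # Voxelwise fit with the default SMF noise model; also save single niftis.
    rumba_flow.run("dwi.nii.gz", "dwi.bval", "dwi.bvec", "mask.nii.gz",
                   n_iter=600, recon_type="smf", extract_pam_values=True,
                   out_dir="rumba_out")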

References

ReconstSDTFlow#

class dipy.workflows.reconst.ReconstSDTFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow, used to subdivide command-line parameters.

get_sub_runs()

Return no sub-runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files, bvalues_files, ...[, ...])

Workflow for Spherical Deconvolution Transform (SDT)

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow, used to subdivide command-line parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might end up with the b0_threshold parameter twice. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(input_files, bvalues_files, bvectors_files, mask_files, *, ratio=None, roi_center=None, roi_radii=10, fa_thr=0.7, sphere_name=None, sh_order_max=8, lambda_=1.0, tau=0.1, b0_threshold=50.0, bvecs_tol=0.01, relative_peak_threshold=0.5, min_separation_angle=25, parallel=False, extract_pam_values=False, num_processes=None, out_dir='', out_pam='peaks.pam5', out_shm='shm.nii.gz', out_peaks_dir='peaks_dirs.nii.gz', out_peaks_values='peaks_values.nii.gz', out_peaks_indices='peaks_indices.nii.gz', out_gfa='gfa.nii.gz', out_sphere='sphere.txt', out_b='B.nii.gz', out_qa='qa.nii.gz')[source]#

Workflow for Spherical Deconvolution Transform (SDT)

See [26] for further details about the method.

Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

bvalues_files : string

Path to the bvalues files. This path may contain wildcards to use multiple bvalues files at once.

bvectors_files : string

Path to the bvectors files. This path may contain wildcards to use multiple bvectors files at once.

mask_files : string

Path to the input masks. This path may contain wildcards to use multiple masks at once. (default: No mask used)

ratio : float, optional

Ratio of the smallest to largest eigenvalue used in the response function estimation. If None, the response function will be estimated automatically.

roi_center : variable int, optional

Center of ROI in data. If center is None, it is assumed that it is the center of the volume with shape data.shape[:3].

roi_radii : variable int, optional

Radii of cuboid ROI in voxels.

fa_thr : float, optional

FA threshold to compute the WM response function.

sphere_name : str, optional

Sphere name on which to reconstruct the fODFs.

sh_order_max : int, optional

Maximum spherical harmonics order (l) used in the SDT fit.

lambda_ : float, optional

Regularization parameter.

tau : float, optional

Diffusion time.

b0_threshold : float, optional

Threshold used to find b0 volumes.

bvecs_tol : float, optional

Tolerance used to check that bvecs are unit vectors.

relative_peak_threshold : float, optional

Only return peaks greater than relative_peak_threshold * m where m is the largest peak.

min_separation_angle : float, optional

The angle tolerance between directions.

parallel : bool, optional

Whether to use parallelization in peak-finding.

extract_pam_values : bool, optional

Whether or not to save pam volumes as single nifti files.

num_processes : int, optional

If parallel is True, the number of subprocesses to use.

out_dir : string, optional

Output directory.

out_pam : string, optional

Name of the peaks volume to be saved.

out_shm : string, optional

Name of the spherical harmonics volume to be saved.

out_peaks_dir : string, optional

Name of the peaks directions volume to be saved.

out_peaks_values : string, optional

Name of the peaks values volume to be saved.

out_peaks_indices : string, optional

Name of the peaks indices volume to be saved.

out_gfa : string, optional

Name of the generalized FA volume to be saved.

out_sphere : string, optional

Sphere vertices name to be saved.

out_b : string, optional

Name of the B matrix to be saved.

out_qa : string, optional

Name of the Quantitative Anisotropy to be saved.
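A minimal usage sketch (placeholder file names):

    from dipy.workflows.reconst import ReconstSDTFlow

    sdt_flow = ReconstSDTFlow()
    # ratio=None lets the response be estimated from the ROI and FA settings.
    sdt_flow.run("dwi.nii.gz", "dwi.bval", "dwi.bvec", "mask.nii.gz",
                 fa_thr=0.7, sh_order_max=8, out_dir="sdt_out")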

References

ReconstSFMFlow#

class dipy.workflows.reconst.ReconstSFMFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow, used to subdivide command-line parameters.

get_sub_runs()

Return no sub-runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files, bvalues_files, ...[, ...])

Workflow for Sparse Fascicle Model (SFM)

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow, used to subdivide command-line parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might end up with the b0_threshold parameter twice. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(input_files, bvalues_files, bvectors_files, mask_files, *, sphere_name=None, response=None, solver='ElasticNet', l1_ratio=0.5, alpha=0.001, seed=42, b0_threshold=50.0, bvecs_tol=0.01, sh_order_max=8, relative_peak_threshold=0.5, min_separation_angle=25, parallel=False, extract_pam_values=False, num_processes=None, out_dir='', out_pam='peaks.pam5', out_shm='shm.nii.gz', out_peaks_dir='peaks_dirs.nii.gz', out_peaks_values='peaks_values.nii.gz', out_peaks_indices='peaks_indices.nii.gz', out_gfa='gfa.nii.gz', out_sphere='sphere.txt', out_b='B.nii.gz', out_qa='qa.nii.gz')[source]#

Workflow for Sparse Fascicle Model (SFM)

See [27] for further details about the method.

Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

bvalues_files : string

Path to the bvalues files. This path may contain wildcards to use multiple bvalues files at once.

bvectors_files : string

Path to the bvectors files. This path may contain wildcards to use multiple bvectors files at once.

mask_files : string

Path to the input masks. This path may contain wildcards to use multiple masks at once.

sphere_name : string, optional

Sphere name on which to reconstruct the fODFs.

response : variable int, optional

Response function to use. If None, the response function will be defined automatically.

solver : str, optional

This will determine the algorithm used to solve the set of linear equations underlying this model. It needs to be one of the following: {‘ElasticNet’, ‘NNLS’}

l1_ratio : float, optional

The ElasticNet mixing parameter, with 0 <= l1_ratio <= 1. For l1_ratio = 0 the penalty is an L2 penalty. For l1_ratio = 1 it is an L1 penalty. For 0 < l1_ratio < 1, the penalty is a combination of L1 and L2.

alpha : float, optional

Sets the balance between least-squares error and L1/L2 regularization in ElasticNet [Zou2005].

seed : int, optional

Seed for the random number generator.

b0_threshold : float, optional

Threshold used to find b0 volumes.

bvecs_tol : float, optional

Tolerance used to check that bvecs are unit vectors.

sh_order_max : int, optional

Maximum spherical harmonics order (l) used in the SFM fit.

relative_peak_threshold : float, optional

Only return peaks greater than relative_peak_threshold * m where m is the largest peak.

min_separation_angle : float, optional

The angle tolerance between directions.

parallel : bool, optional

Whether to use parallelization in peak-finding.

extract_pam_values : bool, optional

Whether or not to save pam volumes as single nifti files.

num_processes : int, optional

If parallel is True, the number of subprocesses to use.

out_dir : string, optional

Output directory.

out_pam : string, optional

Name of the peaks volume to be saved.

out_shm : string, optional

Name of the spherical harmonics volume to be saved.

out_peaks_dir : string, optional

Name of the peaks directions volume to be saved.

out_peaks_values : string, optional

Name of the peaks values volume to be saved.

out_peaks_indices : string, optional

Name of the peaks indices volume to be saved.

out_gfa : string, optional

Name of the generalized FA volume to be saved.

out_sphere : string, optional

Sphere vertices name to be saved.

out_b : string, optional

Name of the B matrix to be saved.

out_qa : string, optional

Name of the Quantitative Anisotropy to be saved.
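A minimal usage sketch (placeholder file names), using the ElasticNet solver with the documented default mixing and regularization values:

    from dipy.workflows.reconst import ReconstSFMFlow

    sfm_flow = ReconstSFMFlow()
    sfm_flow.run("dwi.nii.gz", "dwi.bval", "dwi.bvec", "mask.nii.gz",
                 solver="ElasticNet", l1_ratio=0.5, alpha=0.001,
                 out_dir="sfm_out")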

References

ReconstGQIFlow#

class dipy.workflows.reconst.ReconstGQIFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow, used to subdivide command-line parameters.

get_sub_runs()

Return no sub-runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files, bvalues_files, ...[, ...])

Workflow for Generalized Q-Sampling Imaging (GQI)

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow, used to subdivide command-line parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might end up with the b0_threshold parameter twice. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(input_files, bvalues_files, bvectors_files, mask_files, *, method='gqi2', sampling_length=1.2, normalize_peaks=False, sphere_name=None, b0_threshold=50.0, bvecs_tol=0.01, sh_order_max=8, relative_peak_threshold=0.5, min_separation_angle=25, parallel=False, extract_pam_values=False, num_processes=None, out_dir='', out_pam='peaks.pam5', out_shm='shm.nii.gz', out_peaks_dir='peaks_dirs.nii.gz', out_peaks_values='peaks_values.nii.gz', out_peaks_indices='peaks_indices.nii.gz', out_gfa='gfa.nii.gz', out_sphere='sphere.txt', out_b='B.nii.gz', out_qa='qa.nii.gz')[source]#

Workflow for Generalized Q-Sampling Imaging (GQI)

See [28] for further details about the method.

Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

bvalues_files : string

Path to the bvalues files. This path may contain wildcards to use multiple bvalues files at once.

bvectors_files : string

Path to the bvectors files. This path may contain wildcards to use multiple bvectors files at once.

mask_files : string

Path to the input masks. This path may contain wildcards to use multiple masks at once.

method : str, optional

Method used to compute the ODFs. It can be ‘standard’ or ‘gqi2’.

sampling_length : float, optional

The maximum length of the sampling fibers.

normalize_peaks : bool, optional

If True, the peaks are normalized to 1.

sphere_name : str, optional

Sphere name on which to reconstruct the fODFs.

b0_threshold : float, optional

Threshold used to find b0 volumes.

bvecs_tol : float, optional

Tolerance used to check that bvecs are unit vectors.

sh_order_max : int, optional

Maximum spherical harmonics order (l) used in the GQI fit.

relative_peak_threshold : float, optional

Only return peaks greater than relative_peak_threshold * m where m is the largest peak.

min_separation_angle : float, optional

The angle tolerance between directions.

parallel : bool, optional

Whether to use parallelization in peak-finding.

extract_pam_values : bool, optional

Whether or not to save pam volumes as single nifti files.

num_processes : int, optional

If parallel is True, the number of subprocesses to use.

out_dir : string, optional

Output directory.

out_pam : string, optional

Name of the peaks volume to be saved.

out_shm : string, optional

Name of the spherical harmonics volume to be saved.

out_peaks_dir : string, optional

Name of the peaks directions volume to be saved.

out_peaks_values : string, optional

Name of the peaks values volume to be saved.

out_peaks_indices : string, optional

Name of the peaks indices volume to be saved.

out_gfa : string, optional

Name of the generalized FA volume to be saved.

out_sphere : string, optional

Sphere vertices name to be saved.

out_b : string, optional

Name of the B matrix to be saved.

out_qa : string, optional

Name of the Quantitative Anisotropy to be saved.
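A minimal usage sketch (placeholder file names), using the default ‘gqi2’ method:

    from dipy.workflows.reconst import ReconstGQIFlow

    gqi_flow = ReconstGQIFlow()
    gqi_flow.run("dwi.nii.gz", "dwi.bval", "dwi.bvec", "mask.nii.gz",
                 method="gqi2", sampling_length=1.2, out_dir="gqi_out")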

References

ReconstForecastFlow#

class dipy.workflows.reconst.ReconstForecastFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow, used to subdivide command-line parameters.

get_sub_runs()

Return no sub-runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files, bvalues_files, ...[, ...])

Workflow for Fiber ORientation Estimated using Continuous Axially Symmetric Tensors (FORECAST).

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow, used to subdivide command-line parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might end up with the b0_threshold parameter twice. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(input_files, bvalues_files, bvectors_files, mask_files, *, lambda_lb=0.001, dec_alg='CSD', lambda_csd=1.0, sphere_name=None, b0_threshold=50.0, bvecs_tol=0.01, sh_order_max=8, relative_peak_threshold=0.5, min_separation_angle=25, parallel=False, extract_pam_values=False, num_processes=None, out_dir='', out_pam='peaks.pam5', out_shm='shm.nii.gz', out_peaks_dir='peaks_dirs.nii.gz', out_peaks_values='peaks_values.nii.gz', out_peaks_indices='peaks_indices.nii.gz', out_gfa='gfa.nii.gz', out_sphere='sphere.txt', out_b='B.nii.gz', out_qa='qa.nii.gz')[source]#

Workflow for Fiber ORientation Estimated using Continuous Axially Symmetric Tensors (FORECAST).

FORECAST [29], [30], [31] is a Spherical Deconvolution reconstruction model for multi-shell diffusion data which enables the calculation of a voxel adaptive response function using the Spherical Mean Technique (SMT) [30], [31].

Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

bvalues_files : string

Path to the bvalues files. This path may contain wildcards to use multiple bvalues files at once.

bvectors_files : string

Path to the bvectors files. This path may contain wildcards to use multiple bvectors files at once.

mask_files : string

Path to the input masks. This path may contain wildcards to use multiple masks at once. (default: No mask used)

lambda_lb : float, optional

Regularization parameter for the Laplace-Beltrami operator.

dec_alg : str, optional

Spherical deconvolution algorithm. The possible values are Weighted Least Squares (‘WLS’), Positivity Constraints using CVXPY (‘POS’) and the Constrained Spherical Deconvolution algorithm (‘CSD’).

lambda_csd : float, optional

Regularization parameter for the CSD algorithm.

sphere_name : str, optional

Sphere name on which to reconstruct the fODFs.

b0_threshold : float, optional

Threshold used to find b0 volumes.

bvecs_tol : float, optional

Tolerance used to check that bvecs are unit vectors.

sh_order_max : int, optional

Maximum spherical harmonics order (l) used in the FORECAST fit.

relative_peak_threshold : float, optional

Only return peaks greater than relative_peak_threshold * m where m is the largest peak.

min_separation_angle : float, optional

The angle tolerance between directions.

parallel : bool, optional

Whether to use parallelization in peak-finding.

extract_pam_values : bool, optional

Whether or not to save pam volumes as single nifti files.

num_processes : int, optional

If parallel is True, the number of subprocesses to use.

out_dir : string, optional

Output directory.

out_pam : string, optional

Name of the peaks volume to be saved.

out_shm : string, optional

Name of the spherical harmonics volume to be saved.

out_peaks_dir : string, optional

Name of the peaks directions volume to be saved.

out_peaks_values : string, optional

Name of the peaks values volume to be saved.

out_peaks_indices : string, optional

Name of the peaks indices volume to be saved.

out_gfa : string, optional

Name of the generalized FA volume to be saved.

out_sphere : string, optional

Sphere vertices name to be saved.

out_b : string, optional

Name of the B matrix to be saved.

out_qa : string, optional

Name of the Quantitative Anisotropy to be saved.
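A minimal usage sketch (placeholder file names; FORECAST expects multi-shell data, per the description above):

    from dipy.workflows.reconst import ReconstForecastFlow

    forecast_flow = ReconstForecastFlow()
    # CSD deconvolution; lambda_lb regularizes the Laplace-Beltrami term.
    forecast_flow.run("dwi.nii.gz", "dwi.bval", "dwi.bvec", "mask.nii.gz",
                      dec_alg="CSD", lambda_lb=0.001, out_dir="forecast_out")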

References

MedianOtsuFlow#

class dipy.workflows.segment.MedianOtsuFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow, used to subdivide command-line parameters.

get_sub_runs()

Return no sub-runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files[, save_masked, ...])

Workflow wrapping the median_otsu segmentation method.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow, used to subdivide command-line parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might end up with the b0_threshold parameter twice. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(input_files, save_masked=False, median_radius=2, numpass=5, autocrop=False, vol_idx=None, dilate=None, finalize_mask=False, out_dir='', out_mask='brain_mask.nii.gz', out_masked='dwi_masked.nii.gz')[source]#

Workflow wrapping the median_otsu segmentation method.

Applies median_otsu segmentation on each file found by ‘globbing’ input_files and saves the results in a directory specified by out_dir.

Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

save_masked : bool, optional

Save the masked input volume.

median_radius : int, optional

Radius (in voxels) of the applied median filter.

numpass : int, optional

Number of passes of the median filter.

autocrop : bool, optional

If True, the masked input_volumes will also be cropped using the bounding box defined by the masked data. For example, if diffusion images are of 1x1x1 (mm^3) or higher resolution, auto-cropping could reduce their size in memory and speed up some of the analysis.

vol_idx : str, optional

1D array representing indices of axis=-1 of a 4D input_volume. From the command line use something like ‘1,2,3-5,7’. This input is required for 4D volumes.

dilate : int, optional

Number of iterations for binary dilation.

finalize_mask : bool, optional

Whether to remove potential holes or islands. Useful for solving minor errors.

out_dir : string, optional

Output directory.

out_mask : string, optional

Name of the mask volume to be saved.

out_masked : string, optional

Name of the masked volume to be saved.
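A minimal usage sketch for a 4D DWI volume (placeholder file names; the vol_idx values are an assumption about which volumes are b0s):

    from dipy.workflows.segment import MedianOtsuFlow

    mo_flow = MedianOtsuFlow()
    # vol_idx is required for 4D input; here we assume volumes 0 and 1 are b0s.
    mo_flow.run("dwi.nii.gz", save_masked=True, median_radius=2, numpass=5,
                vol_idx=[0, 1], out_dir="mask_out")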

RecoBundlesFlow#

class dipy.workflows.segment.RecoBundlesFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow, used to subdivide command-line parameters.

get_sub_runs()

Return no sub-runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(streamline_files, model_bundle_files[, ...])

Recognize bundles

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow, used to subdivide command-line parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might end up with the b0_threshold parameter twice. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(streamline_files, model_bundle_files, greater_than=50, less_than=1000000, no_slr=False, clust_thr=15.0, reduction_thr=15.0, reduction_distance='mdf', model_clust_thr=2.5, pruning_thr=8.0, pruning_distance='mdf', slr_metric='symmetric', slr_transform='similarity', slr_matrix='small', refine=False, r_reduction_thr=12.0, r_pruning_thr=6.0, no_r_slr=False, out_dir='', out_recognized_transf='recognized.trk', out_recognized_labels='labels.npy')[source]#

Recognize bundles

See [3] and [32] for further details about the method.

Parameters:
streamline_files : string

The path of streamline files where you want to recognize bundles.

model_bundle_files : string

The path of model bundle files.

greater_than : int, optional

Keep streamlines that have length greater than this value in mm.

less_than : int, optional

Keep streamlines that have length less than this value in mm.

no_slr : bool, optional

Don’t enable local Streamline-based Linear Registration.

clust_thr : float, optional

MDF distance threshold for all streamlines.

reduction_thr : float, optional

Reduce search space by this amount (in mm).

reduction_distance : string, optional

Reduction distance type: can be mdf or mam.

model_clust_thr : float, optional

MDF distance threshold for the model bundles.

pruning_thr : float, optional

Pruning threshold after matching.

pruning_distance : string, optional

Pruning distance type: can be mdf or mam.

slr_metric : string, optional

Options are None, symmetric, asymmetric or diagonal.

slr_transform : string, optional

Transformation allowed: translation, rigid, similarity or scaling.

slr_matrix : string, optional

Options are ‘nano’, ‘tiny’, ‘small’, ‘medium’, ‘large’, ‘huge’.

refine : bool, optional

Enable refinement of the recognized bundles.

r_reduction_thr : float, optional

Reduce search space by this amount (in mm) during refinement.

r_pruning_thr : float, optional

Pruning threshold after matching, used during refinement.

no_r_slr : bool, optional

Don’t enable local Streamline-based Linear Registration during refinement.

out_dir : string, optional

Output directory.

out_recognized_transf : string, optional

Recognized bundle in the space of the model bundle.

out_recognized_labels : string, optional

Indices of the recognized bundle in the original tractogram.
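A minimal usage sketch (placeholder paths; the model bundle glob is an assumption about how the atlas bundles are laid out on disk):

    from dipy.workflows.segment import RecoBundlesFlow

    rb_flow = RecoBundlesFlow()
    # Recognize each model bundle in the whole-brain tractogram, with refinement.
    rb_flow.run("tractogram.trk", "model_bundles/*.trk",
                model_clust_thr=2.5, reduction_thr=15.0, refine=True,
                out_dir="rb_out")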

References

LabelsBundlesFlow#

class dipy.workflows.segment.LabelsBundlesFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow, used to subdivide command-line parameters.

get_sub_runs()

Return no sub-runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(streamline_files, labels_files[, ...])

Extract bundles using existing indices (labels)

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow, used to subdivide command-line parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might end up with the b0_threshold parameter twice. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(streamline_files, labels_files, out_dir='', out_bundle='recognized_orig.trk')[source]#

Extract bundles using existing indices (labels)

See [3] for further details about the method.

Parameters:
streamline_files : string

The path of streamline files where you want to recognize bundles.

labels_files : string

The path of the label files (indices of the streamlines belonging to each bundle).

out_dir : string, optional

Output directory.

out_bundle : string, optional

Recognized bundle in the space of the model bundle.
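A minimal usage sketch; the labels file is assumed to be the labels.npy produced by RecoBundlesFlow above:

    from dipy.workflows.segment import LabelsBundlesFlow

    lb_flow = LabelsBundlesFlow()
    # Pull the recognized streamlines out of the original tractogram by index.
    lb_flow.run("tractogram.trk", "rb_out/labels.npy", out_dir="bundles_orig")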

References

ClassifyTissueFlow#

class dipy.workflows.segment.ClassifyTissueFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow, used to subdivide command-line parameters.

get_sub_runs()

Return no sub-runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files[, bvals_file, method, ...])

Extract tissue from a volume.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow, used to subdivide command-line parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might end up with the b0_threshold parameter twice. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(input_files, bvals_file=None, method=None, wm_threshold=0.5, b0_threshold=50, low_signal_threshold=50, nclass=None, beta=0.1, tolerance=1e-05, max_iter=100, out_dir='', out_tissue='tissue_classified.nii.gz', out_pve='tissue_classified_pve.nii.gz')[source]#

Extract tissue from a volume.

Parameters:
input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

bvals_file : string, optional

Path to the b-values file. Required for the ‘dam’ method.

method : string, optional

Method to use for tissue extraction. Options are:

  • ‘hmrf’: Markov Random Fields modeling approach.

  • ‘dam’: Directional Average Maps, proposed by [33].

The ‘hmrf’ method is recommended for T1w images, while the ‘dam’ method is recommended for multi-shell DWI images (single-shell data are not recommended).

wm_threshold : float, optional

The threshold below which a voxel is considered white matter. For data like HCP, a threshold of 0.5 proves to be a good choice. For data like cfin, higher threshold values like 0.7 or 0.8 are more suitable. Used for the ‘dam’ method.

b0_threshold : float, optional

The intensity threshold for a b=0 image. Used only for the ‘dam’ method.

low_signal_threshold : float, optional

The threshold below which a voxel is considered to have low signal. Used only for the ‘dam’ method.

nclass : int, optional

Number of desired classes. Used only for the ‘hmrf’ method.

beta : float, optional

Smoothing parameter; the higher this number, the smoother the output will be. Used only for the ‘hmrf’ method.

tolerance : float, optional

Percentage of change tolerated below which the ICM loop stops. Default is 1e-05. To disable the tolerance check, set tolerance = 0. Used only for the ‘hmrf’ method.

max_iter : int, optional

Maximum number of iterations. Default is 100. The loop may terminate early if the change in energy sum between iterations falls below the threshold defined by tolerance. However, if tolerance is explicitly set to 0, this early stopping mechanism is disabled, and the algorithm will run for the specified number of iterations unless another stopping criterion is met. Used only for the ‘hmrf’ method.

out_dir : string, optional

Output directory.

out_tissue : string, optional

Name of the tissue volume to be saved.

out_pve : string, optional

Name of the pve volume to be saved.
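A minimal usage sketch for each documented method (placeholder file names; the class counts and thresholds are illustrative):

    from dipy.workflows.segment import ClassifyTissueFlow

    # 'hmrf' on a T1-weighted image, assuming three classes (CSF/GM/WM).
    ClassifyTissueFlow().run("t1.nii.gz", method="hmrf", nclass=3, beta=0.1,
                             out_dir="tissue_out")

    # 'dam' on multi-shell DWI data, which additionally needs the b-values.
    ClassifyTissueFlow().run("dwi.nii.gz", bvals_file="dwi.bval", method="dam",
                             wm_threshold=0.5, out_dir="tissue_out_dam")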

References

SNRinCCFlow#

class dipy.workflows.stats.SNRinCCFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow, used to subdivide command-line parameters.

get_sub_runs()

Return no sub-runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(data_files, bvals_files, bvecs_files, ...)

Compute the signal-to-noise ratio in the corpus callosum.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow, used to subdivide command-line parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might end up with the b0_threshold parameter twice. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(data_files, bvals_files, bvecs_files, mask_file, bbox_threshold=(0.6, 1, 0, 0.1, 0, 0.1), out_dir='', out_file='product.json', out_mask_cc='cc.nii.gz', out_mask_noise='mask_noise.nii.gz')[source]#

Compute the signal-to-noise ratio in the corpus callosum.

Parameters:
data_files : string

Path to the dwi.nii.gz file. This path may contain wildcards to process multiple inputs at once.

bvals_files : string

Path to the bvals file.

bvecs_files : string

Path to the bvecs file.

mask_file : string

Path to a brain mask file.

bbox_threshold : variable float, optional

Threshold for the bounding box; values separated with commas, e.g. 0.6,1,0,0.1,0,0.1.

out_dir : string, optional

Where the resulting file will be saved.

out_file : string, optional

Name of the result file to be saved.

out_mask_cc : string, optional

Name of the CC mask volume to be saved.

out_mask_noise : string, optional

Name of the mask noise volume to be saved.
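A minimal usage sketch (placeholder file names):

    from dipy.workflows.stats import SNRinCCFlow

    snr_flow = SNRinCCFlow()
    # Writes product.json plus the CC and noise masks into snr_out/.
    snr_flow.run("dwi.nii.gz", "dwi.bval", "dwi.bvec", "brain_mask.nii.gz",
                 out_dir="snr_out")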

BundleAnalysisTractometryFlow#

class dipy.workflows.stats.BundleAnalysisTractometryFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow, used to subdivide command-line parameters.

get_sub_runs()

Return no sub-runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(model_bundle_folder, subject_folder, *)

Workflow of bundle analytics.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow, used to subdivide command-line parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might end up with the b0_threshold parameter twice. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(model_bundle_folder, subject_folder, *, no_disks=100, out_dir='')[source]#

Workflow of bundle analytics.

Applies statistical analysis on bundles of subjects and saves the results in a directory specified by out_dir.

See [32] for further details about the method.

Parameters:
model_bundle_folder : string

Path to the input model bundle files. This path may contain wildcards to process multiple inputs at once.

subject_folder : string

Path to the input subject folder. This path may contain wildcards to process multiple inputs at once.

no_disks : integer, optional

Number of disks into which each bundle is divided.

out_dir : string, optional

Output directory.
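A minimal usage sketch (placeholder folders; the expected subject folder layout is the BUAN one described in the reference above):

    from dipy.workflows.stats import BundleAnalysisTractometryFlow

    bat_flow = BundleAnalysisTractometryFlow()
    # Profile each bundle over 100 disks and save the statistics to buan_out/.
    bat_flow.run("model_bundles", "subjects", no_disks=100, out_dir="buan_out")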

References

LinearMixedModelsFlow#

class dipy.workflows.stats.LinearMixedModelsFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_metric_name(path)

Split the path string and return the name of the anatomical measure (e.g. fa), the bundle name (e.g. AF_L), and the bundle name combined with the metric name (e.g. AF_L_fa).

get_short_name()

Return a short name for the workflow, used to subdivide command-line parameters.

get_sub_runs()

Return no sub-runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(h5_files, *[, no_disks, out_dir])

Workflow of Linear Mixed Models.

save_lmm_plot(plot_file, title, bundle_name, ...)

Saves LMM plot with segment/disk number on x-axis and -log10(pvalues) on y-axis in out_dir folder.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

get_metric_name(path)[source]#

Split the path string and return the name of the anatomical measure (e.g. fa), the bundle name (e.g. AF_L), and the bundle name combined with the metric name (e.g. AF_L_fa).

Parameters:
path : string

Path to the input metric files. This path may contain wildcards to process multiple inputs at once.

classmethod get_short_name()[source]#

Return a short name for the workflow, used to subdivide command-line parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might end up with the b0_threshold parameter twice. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(h5_files, *, no_disks=100, out_dir='')[source]#

Workflow of Linear Mixed Models.

Applies Linear Mixed Models on bundles of subjects and saves the results in a directory specified by out_dir.

Parameters:
h5_files : string

Path to the input metric files. This path may contain wildcards to process multiple inputs at once.

no_disks : integer, optional

Number of disks into which each bundle is divided.

out_dir : string, optional

Output directory.

save_lmm_plot(plot_file, title, bundle_name, x, y)[source]#

Saves LMM plot with segment/disk number on x-axis and -log10(pvalues) on y-axis in out_dir folder.

Parameters:
plot_file : string

Path to the plot file. This path may contain wildcards to process multiple inputs at once.

title : string

Title for the plot.

bundle_name : string

Bundle name.

x : list

List containing segment/disk numbers for the x-axis.

y : list

List containing -log10(pvalues) per segment/disk number for the y-axis.

BundleShapeAnalysis#

class dipy.workflows.stats.BundleShapeAnalysis(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow, used to subdivide command-line parameters.

get_sub_runs()

Return no sub-runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(subject_folder, *[, clust_thr, ...])

Workflow of bundle analytics.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow, used to subdivide command-line parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might end up with the b0_threshold parameter twice. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(subject_folder, *, clust_thr=(5, 3, 1.5), threshold=6, out_dir='')[source]#

Workflow of bundle analytics.

Applies bundle shape similarity analysis on bundles of subjects and saves the results in a directory specified by out_dir.

See [32] for further details about the method.

Parameters:
subject_folder : string

Path to the input subject folder. This path may contain wildcards to process multiple inputs at once.

clust_thr : variable float, optional

List of bundle clustering thresholds used in QuickBundlesX.

threshold : float, optional

Bundle shape similarity threshold.

out_dir : string, optional

Output directory.

References

buan_bundle_profiles#

dipy.workflows.stats.buan_bundle_profiles(model_bundle_folder, bundle_folder, orig_bundle_folder, metric_folder, group_id, subject, *, no_disks=100, out_dir='')[source]#

Applies statistical analysis on bundles and saves the results in a directory specified by out_dir.

See [32] for further details about the method.

Parameters:
model_bundle_folder : string

Path to the input model bundle files. This path may contain wildcards to process multiple inputs at once.

bundle_folder : string

Path to the input bundle files in common space. This path may contain wildcards to process multiple inputs at once.

orig_bundle_folder : string

Path to the input bundle files in native space. This path may contain wildcards to process multiple inputs at once.

metric_folder : string

Path to the input DTI metric and/or peak files. These will be used as metrics for the statistical analysis of the bundles.

group_id : integer

Group the subject belongs to: either 0 for control or 1 for patient.

subject : string

Subject ID, e.g. 10001.

no_disks : integer, optional

Number of disks into which each bundle is divided.

out_dir : string, optional

Output directory.
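A minimal usage sketch (placeholder folder names; group_id 0 marks a control subject, per the parameter description above):

    from dipy.workflows.stats import buan_bundle_profiles

    buan_bundle_profiles("model_bundles", "rec_bundles", "org_bundles",
                         "anatomical_measures", 0, "10001",
                         no_disks=100, out_dir="profiles_out")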

References

LocalFiberTrackingPAMFlow#

class dipy.workflows.tracking.LocalFiberTrackingPAMFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow, used to subdivide command-line parameters.

get_sub_runs()

Return no sub-runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(pam_files, stopping_files, seeding_files)

Workflow for Local Fiber Tracking.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow, used to subdivide command-line parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might end up with the b0_threshold parameter twice. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

run(pam_files, stopping_files, seeding_files, use_binary_mask=False, stopping_thr=0.2, seed_density=1, minlen=2, maxlen=500, step_size=0.5, tracking_method='deterministic', pmf_threshold=0.1, max_angle=30.0, sphere_name=None, save_seeds=False, nbr_threads=0, random_seed=1, seed_buffer_fraction=1.0, out_dir='', out_tractogram='tractogram.trk')[source]#

Workflow for Local Fiber Tracking.

This workflow uses a saved peaks and metrics (PAM) file as input.

See [34] and [35] for further details about the method.

Parameters:
pam_files : string

Path to the peaks and metrics files. This path may contain wildcards to use multiple files at once.

stopping_files : string

Path to images (e.g. FA) used as the stopping criterion for tracking.

seeding_files : string

A binary image showing where to seed for tracking.

use_binary_mask : bool, optional

If True, uses a binary stopping criterion. If the provided stopping_files are not binary, stopping_thr will be used to binarize the images.

stopping_thr : float, optional

Threshold applied to the stopping volume’s data to identify where tracking has to stop.

seed_density : int, optional

Number of seeds per dimension inside each voxel. For example, a seed_density of 2 means 8 regularly distributed points in the voxel, and a seed_density of 1 means 1 point at the center of the voxel.

minlen : int, optional

Minimum length (number of points) of the streamlines.

maxlen : int, optional

Maximum length (number of points) of the streamlines.

step_size : float, optional

Step size (in mm) used for tracking.

tracking_method : string, optional

Select the direction getter strategy:

  • “eudx” (uses the peaks saved in the pam_files)

  • “deterministic” or “det” for deterministic tracking

  • “probabilistic” or “prob” for probabilistic tracking

  • “closestpeaks” or “cp” for ClosestPeaks tracking

  • “ptt” for Parallel Transport Tractography

By default, the sh coefficients saved in the pam_files are used.

pmf_threshold : float, optional

Threshold for ODF functions.

max_angle : float, optional

Maximum angle between streamline segments (range [0, 90]).

sphere_name : string, optional

The sphere used for tracking. If None, the sphere saved in the pam_files is used. For faster tracking, use a smaller sphere (e.g. ‘repulsion200’).

save_seeds : bool, optional

If True, save the seeds associated with their streamline in the ‘data_per_streamline’ Tractogram dictionary, using ‘seeds’ as the key.

nbr_threads : int, optional

Number of threads to use for the processing. By default, all available threads will be used.

random_seed : int, optional

Seed for the random number generator; must be >= 0. A value greater than 0 will always produce the same streamline trajectory for a given seed coordinate. A value of 0 may produce various streamline trajectories for a given seed coordinate.

seed_buffer_fraction : float, optional

Fraction of the seed buffer to use. A value of 1.0 will use the entire seed buffer; a value of 0.5 will use half of the seed buffer, then the other half. This is a way to reduce memory usage.

out_dir : string, optional

Output directory.

out_tractogram : string, optional

Name of the tractogram file to be saved.
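A minimal usage sketch (placeholder file names; the PAM file would typically come from one of the reconstruction flows above):

    from dipy.workflows.tracking import LocalFiberTrackingPAMFlow

    track_flow = LocalFiberTrackingPAMFlow()
    # seed_density=2 places 2**3 = 8 seeds per voxel of the seed mask.
    track_flow.run("peaks.pam5", "fa.nii.gz", "seed_mask.nii.gz",
                   stopping_thr=0.2, seed_density=2,
                   tracking_method="probabilistic", out_dir="track_out")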

References

PFTrackingPAMFlow#

class dipy.workflows.tracking.PFTrackingPAMFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow, used to subdivide parameters.

get_sub_runs()

Return no sub runs, since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(pam_files, wm_files, gm_files, ...[, ...])

Workflow for Particle Filtering Tracking.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow, used to subdivide parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with DTI reconstruction and CSD reconstruction might end up with two parameters named b0_threshold. Using short names, dti.b0_threshold and csd.b0_threshold are available instead.

Returns the class name by default, but it is strongly advised to override it with something shorter that is easier to type on the command line.

run(pam_files, wm_files, gm_files, csf_files, seeding_files, step_size=0.2, seed_density=1, pmf_threshold=0.1, max_angle=20.0, sphere_name=None, pft_back=2, pft_front=1, pft_count=15, pft_max_trial=20, save_seeds=False, min_wm_pve_before_stopping=0, nbr_threads=0, random_seed=1, seed_buffer_fraction=1.0, out_dir='', out_tractogram='tractogram.trk')[source]#

Workflow for Particle Filtering Tracking.

This workflow uses a saved peaks and metrics (PAM) file as input.

See [36] for further details about the method.

Parameters:
pam_files : string

Path to the peaks and metrics files. This path may contain wildcards to use multiple masks at once.

wm_files : string

Path to the white matter partial volume estimate for tracking (CMC).

gm_files : string

Path to the grey matter partial volume estimate for tracking (CMC).

csf_files : string

Path to the cerebrospinal fluid partial volume estimate for tracking (CMC).

seeding_files : string

A binary image showing where to seed for tracking.

step_size : float, optional

Step size (in mm) used for tracking.

seed_density : int, optional

Number of seeds per dimension inside a voxel. For example, a seed_density of 2 means 2**3 = 8 regularly distributed points in the voxel, while a seed_density of 1 means a single point at the center of the voxel.

pmf_threshold : float, optional

Threshold for ODF functions.

max_angle : float, optional

Maximum angle between streamline segments (range [0, 90]).

sphere_name : string, optional

The sphere used for tracking. If None, the sphere saved in the pam_files is used. For faster tracking, use a smaller sphere (e.g. ‘repulsion200’).

pft_back : float, optional

Distance in mm to back-track before starting the particle filtering tractography. The total particle filtering tractography distance is equal to back_tracking_dist + front_tracking_dist.

pft_front : float, optional

Distance in mm to run the particle filtering tractography after the back-track distance. The total particle filtering tractography distance is equal to back_tracking_dist + front_tracking_dist.

pft_count : int, optional

Number of particles to use in the particle filter.

pft_max_trial : int, optional

Maximum number of trials to run the particle filtering tractography.

save_seeds : bool, optional

If True, save the seed associated with each streamline in the ‘data_per_streamline’ Tractogram dictionary under the ‘seeds’ key.

min_wm_pve_before_stopping : int, optional

Minimum white matter partial volume estimate (1 - stopping_criterion.include_map - stopping_criterion.exclude_map) to reach before allowing the tractography to stop.

nbr_threads : int, optional

Number of threads to use for the processing. By default, all available threads will be used.

random_seed : int, optional

Seed for the random number generator; must be >= 0. A value greater than 0 will always produce the same streamline trajectory for a given seed coordinate. A value of 0 may produce different streamline trajectories for a given seed coordinate.

seed_buffer_fraction : float, optional

Fraction of the seed buffer to use. A value of 1.0 will use the entire seed buffer. A value of 0.5 will use half of the seed buffer, then the other half; this is a way to reduce memory usage.

out_dir : string, optional

Output directory.

out_tractogram : string, optional

Name of the tractogram file to be saved.

References
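Examples

A minimal usage sketch; the PAM, PVE, and seed file names are placeholders for existing images.

>>> from dipy.workflows.tracking import PFTrackingPAMFlow
>>> flow = PFTrackingPAMFlow(force=True)  # force=True allows overwriting outputs
>>> flow.run("peaks.pam5", "wm_pve.nii.gz", "gm_pve.nii.gz",
...          "csf_pve.nii.gz", "seed_mask.nii.gz",
...          step_size=0.2, out_dir="out")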

handle_vol_idx#

dipy.workflows.utils.handle_vol_idx(vol_idx)[source]#

Handle user input for volume index.

Parameters:
vol_idx : int, str, list, tuple

Volume index or range of volume indices.

Returns:
vol_idx : list

List of volume indices.
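Examples

A sketch of the accepted input forms. The string syntax (dash-separated ranges) is an assumption based on the parameter description, so the commented results are illustrative only.

>>> from dipy.workflows.utils import handle_vol_idx
>>> handle_vol_idx(0)          # a single index, normalized to [0]
>>> handle_vol_idx([0, 2, 4])  # a list or tuple of indices -> [0, 2, 4]
>>> handle_vol_idx("0-3")      # a string range (assumed syntax) -> [0, 1, 2, 3]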

HorizonFlow#

class dipy.workflows.viz.HorizonFlow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: Workflow

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow, used to subdivide parameters.

get_sub_runs()

Return no sub runs, since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(input_files[, cluster, rgb, ...])

Interactive medical visualization - Invert the Horizon!

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

classmethod get_short_name()[source]#

Return a short name for the workflow, used to subdivide parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with DTI reconstruction and CSD reconstruction might end up with two parameters named b0_threshold. Using short names, dti.b0_threshold and csd.b0_threshold are available instead.

Returns the class name by default, but it is strongly advised to override it with something shorter that is easier to type on the command line.

run(input_files, cluster=False, rgb=False, cluster_thr=15.0, random_colors=None, length_gt=0, length_lt=1000, clusters_gt=0, clusters_lt=100000000, native_coords=False, stealth=False, emergency_header='icbm_2009a', bg_color=(0, 0, 0), disable_order_transparency=False, buan=False, buan_thr=0.5, buan_highlight=(1, 0, 0), roi_images=False, roi_colors=(1, 0, 0), out_dir='', out_stealth_png='tmp.png')[source]#

Interactive medical visualization - Invert the Horizon!

See [37] for further details about Horizon.

Interact with any number of .trk, .tck or .dpy tractograms and .nii or .nii.gz anatomy files. Streamlines can be clustered on loading.

Parameters:
input_files : variable string

Filenames.

cluster : bool, optional

Enable QuickBundlesX clustering.

rgb : bool, optional

Enable the color image (RGB only; the alpha channel will be ignored).

cluster_thr : float, optional

Distance threshold (in mm) used for clustering. The default value is 15.0; for small animal brains you may need something smaller, such as 2.0. For this parameter to be active, cluster should be enabled.

random_colors : variable str, optional

Given multiple tractograms and/or ROIs, each tractogram and/or ROI will be shown with a different color. If no value is provided, both the tractograms and the ROIs will get a different random color generated from a distinguishable colormap. If the effect should only be applied to one of the two types, use the options ‘tracts’ and ‘rois’ for the tractograms and the ROIs respectively.

length_gt : float, optional

Clusters with average length greater than length_gt (in mm) will be shown.

length_lt : float, optional

Clusters with average length less than length_lt (in mm) will be shown.

clusters_gt : int, optional

Clusters with size greater than clusters_gt will be shown.

clusters_lt : int, optional

Clusters with size less than clusters_lt will be shown.

native_coords : bool, optional

Show results in native coordinates.

stealth : bool, optional

Do not use interactive mode; just save the figure.

emergency_header : str, optional

If no anatomy reference is provided, an emergency header is used. Current options are ‘icbm_2009a’ and ‘icbm_2009c’.

bg_color : variable float, optional

Define the background color of the scene. Colors can be defined with 1 or 3 values and should be between [0-1].

disable_order_transparency : bool, optional

Use depth peeling to sort transparent objects. If True, also enables anti-aliasing.

buan : bool, optional

Enables the BUAN framework visualization.

buan_thr : float, optional

Highlight segments of the bundle whose p-values are less than this threshold.

buan_highlight : variable float, optional

Define the bundle highlight area color. Colors can be defined with 1 or 3 values and should be between [0-1]. For example, (1, 0, 0) corresponds to red.

roi_images : bool, optional

Display binary images as contours.

roi_colors : variable float, optional

Define the color for the ROI images. Colors can be defined with 1 or 3 values and should be between [0-1]. For example, (1, 0, 0) corresponds to red.

out_dir : str, optional

Output directory.

out_stealth_png : str, optional

Filename of the saved picture.

References
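Examples

A minimal usage sketch; the tractogram and anatomy file names are placeholders, and the inputs are passed as a list since input_files accepts a variable number of filenames.

>>> from dipy.workflows.viz import HorizonFlow
>>> flow = HorizonFlow()
>>> flow.run(["tractogram.trk", "t1.nii.gz"], cluster=True,
...          cluster_thr=15.0, out_dir="out")

Passing stealth=True skips the interactive window and saves out_stealth_png instead.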

Workflow#

class dipy.workflows.workflow.Workflow(*, output_strategy='absolute', mix_names=False, force=False, skip=False)[source]#

Bases: object

Methods

get_io_iterator()

Create an iterator for IO.

get_short_name()

Return a short name for the workflow, used to subdivide parameters.

get_sub_runs()

Return no sub runs, since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

run(*args, **kwargs)

Execute the workflow.

update_flat_outputs(new_flat_outputs, io_it)

Update the flat outputs with new values.

get_io_iterator()[source]#

Create an iterator for IO.

Use a couple of inspection tricks to build an IOIterator from the previous frame (values of local variables and other context) and the run method's docstring.

classmethod get_short_name()[source]#

Return a short name for the workflow, used to subdivide parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with DTI reconstruction and CSD reconstruction might end up with two parameters named b0_threshold. Using short names, dti.b0_threshold and csd.b0_threshold are available instead.

Returns the class name by default, but it is strongly advised to override it with something shorter that is easier to type on the command line.

get_sub_runs()[source]#

Return no sub runs, since this is a simple workflow.

manage_output_overwrite()[source]#

Check if a file will be overwritten upon processing the inputs.

If it is bound to happen, an action is taken depending on self._force_overwrite (or --force via the command line). A log message is output regardless of the outcome, to tell the user that something happened.

run(*args, **kwargs)[source]#

Execute the workflow.

Since this is an abstract class, an exception is raised if this code is reached (i.e., run is not implemented in the child class, or the method is called directly on this class).

update_flat_outputs(new_flat_outputs, io_it)[source]#

Update the flat outputs with new values.

This method is useful when a workflow needs to update the flat_outputs with new values that were generated in the run method.

Parameters:
new_flat_outputslist

List of new values to update the flat_outputs.

io_itIOIterator

The IOIterator object that was returned by get_io_iterator.
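To make the base-class contract concrete, here is a minimal, hypothetical subclass sketch. The flow name, parameters, and file handling are placeholders and not part of the DIPY API; the key points are overriding get_short_name() and driving the run() body with get_io_iterator().

>>> from dipy.workflows.workflow import Workflow
>>> class AppendTextFlow(Workflow):  # hypothetical example flow
...     @classmethod
...     def get_short_name(cls):
...         # Override the default (the class name) with something
...         # short and easy to type on the command line.
...         return "append"
...     def run(self, input_files, out_dir="", out_file="append.txt"):
...         # get_io_iterator() uses frame inspection and the run()
...         # docstring to pair each matched input file with its
...         # output path, honoring out_dir and the force settings.
...         io_it = self.get_io_iterator()
...         for in_file, out_path in io_it:
...             with open(in_file) as fin, open(out_path, "w") as fout:
...                 fout.write(fin.read())

AppendTextFlow().run("in*.txt") would then copy every matched input to its corresponding append.txt output, and run_flow(AppendTextFlow()) would expose the same parameters on the command line.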