io

Functions and classes available at the dipy.io package level:

- Dpy: Advanced storage system for tractography based on HDF5.
- load_pickle: Load object from pickle file fname.
- orientation_from_string: Returns an array representation of an ornt string.
- orientation_to_string: Returns a string representation of a 3d ornt.
- ornt_mapping: Calculates the mapping needed to get from ornt1 to ornt2.
- read_bvals_bvecs: Read b-values and b-vectors from disk.
- read_bvec_file: Read gradient table information from a pair of files with extensions .bvec and .bval.
- reorient_vectors: Change the orientation of gradients or other vectors.
- save_pickle: Save dix to fname as pickle.
Module: io.bvectxt

- orientation_from_string: Returns an array representation of an ornt string.
- orientation_to_string: Returns a string representation of a 3d ornt.
- ornt_mapping: Calculates the mapping needed to get from ornt1 to ornt2.
- read_bvec_file: Read gradient table information from a pair of files with extensions .bvec and .bval.
- reorient_vectors: Change the orientation of gradients or other vectors.
- splitext: Split the extension from a pathname.
Module: io.dpy

A class for handling large tractography datasets. It is built on h5py, which in turn implements key features of the HDF5 (hierarchical data format) API.

- Dpy: Advanced storage system for tractography based on HDF5.
- Streamlines: alias of nibabel.streamlines.array_sequence.ArraySequence.
Module: io.gradients

- InTemporaryDirectory: Create, return, and change directory to a temporary directory.
- read_bvals_bvecs: Read b-values and b-vectors from disk.
- splitext: Split the extension from a pathname.
Module: io.image

- load_nifti: Load a NIfTI image from disk.
- save_qa_metric: Save Quality Assurance metrics.
Module: io.peaks

- PeaksAndMetrics
- Sphere: Points on the unit sphere.
- load_peaks: Load a PeaksAndMetrics HDF5 file (PAM5).
- peaks_to_niftis: Save SH, directions, indices and values of peaks to Nifti.
- reshape_peaks_for_visualization: Reshape peaks for visualization.
- save_peaks: Save all important attributes of object PeaksAndMetrics in a PAM5 file (HDF5).
Module: io.pickles

Load and save pickles.

- load_pickle: Load object from pickle file fname.
- save_pickle: Save dix to fname as pickle.
Module: io.stateful_tractogram

- PerArrayDict: Dictionary for which key access can do slicing on the values.
- PerArraySequenceDict: Dictionary for which key access can do slicing on the values.
- Space: Enum to simplify future change to convention.
- StatefulTractogram: Class for stateful representation of collections of streamlines, designed to be identical no matter the file format (trk, tck, vtk, fib, dpy).
- Streamlines: alias of nibabel.streamlines.array_sequence.ArraySequence.
- Tractogram: Container for streamlines and their data information.
- itemgetter: itemgetter(item, ...) --> itemgetter object.
- product: product(*iterables, repeat=1) --> product object.
- apply_affine: Apply affine matrix aff to points pts.
- bisect: Alias for bisect_right().
- deepcopy: Deep copy operation on arbitrary Python objects.
- get_reference_info: Extract the spatial attributes from a reference.
Module: io.streamline

- Dpy: Advanced storage system for tractography based on HDF5.
- Space: Enum to simplify future change to convention.
- StatefulTractogram: Class for stateful representation of collections of streamlines, designed to be identical no matter the file format (trk, tck, vtk, fib, dpy).
- Tractogram: Container for streamlines and their data information.
- create_tractogram_header: Write a standard trk/tck header from spatial attributes.
- deepcopy: Deep copy operation on arbitrary Python objects.
- detect_format: Returns the StreamlinesFile object guessed from the file-like object.
- is_header_compatible: Will compare the spatial attributes of 2 references.
- load_dpy: Load the stateful tractogram of the dpy format.
- load_fib: Load the stateful tractogram of the fib format.
- load_tck: Load the stateful tractogram of the tck format.
- load_tractogram: Load the stateful tractogram from any format (trk, tck, fib, dpy).
- load_trk: Load the stateful tractogram of the trk format.
- load_vtk: Load the stateful tractogram of the vtk format.
- load_vtk_streamlines: Load streamlines from vtk polydata.
- save_dpy: Save the stateful tractogram in dpy format.
- save_fib: Save the stateful tractogram in fib format.
- save_tck: Save the stateful tractogram in tck format.
- save_tractogram: Save the stateful tractogram in any format (trk, tck, vtk, fib, dpy).
- save_trk: Save the stateful tractogram in trk format.
- save_vtk: Save the stateful tractogram in vtk format.
- save_vtk_streamlines: Save streamlines as vtk polydata to a supported format file.
Module: io.utils

Utility functions for file formats.

- Nifti1Image: Class for single file NIfTI1 format image.
- create_nifti_header: Write a standard nifti header from spatial attributes.
- create_tractogram_header: Write a standard trk/tck header from spatial attributes.
- decfa: Create a nifti-compliant directional-encoded color FA image.
- decfa_to_float: Convert a nifti-compliant directional-encoded color FA image into a nifti image with RGB encoded in floating point resolution.
- detect_format: Returns the StreamlinesFile object guessed from the file-like object.
- get_reference_info: Extract the spatial attributes from a reference.
- is_header_compatible: Will compare the spatial attributes of 2 references.
- make_5d: Reshape the input to have 5 dimensions, adding the extra dimensions just before the last dimension.
- nifti1_symmat: Returns a Nifti1Image with a symmetric matrix intent.
Module: io.vtk

- load_polydata: Load a vtk polydata from a supported format file.
- load_vtk_streamlines: Load streamlines from vtk polydata.
- optional_package: Return package-like thing and module setup for package name.
- save_polydata: Save a vtk polydata to a supported format file.
- save_vtk_streamlines: Save streamlines as vtk polydata to a supported format file.
- transform_streamlines: Apply affine transformation to streamlines.
Dpy

class dipy.io.Dpy(fname, mode='r', compression=0)

Bases: object

Methods

- read_track(): read one track each time
- read_tracks(): read the entire tractography
- read_tracksi(indices): read tracks with specific indices
- write_track(track): write one track each time
- write_tracks(tracks): write many tracks together
- close()
- version()

__init__(fname, mode='r', compression=0)

Advanced storage system for tractography based on HDF5.

Parameters
- fname : str
    full filename
- mode : 'r' read, 'w' write, 'r+' read and write (only if the file already exists)
- compression : int
    0 (no compression) to 9 (maximum compression)

Examples

>>> import os
>>> import numpy as np
>>> from tempfile import mkstemp  # temp file
>>> from dipy.io.dpy import Dpy
>>> def dpy_example():
...     fd, fname = mkstemp()
...     fname += '.dpy'  # add correct extension
...     dpw = Dpy(fname, 'w')
...     A = np.ones((5, 3))
...     B = 2 * A.copy()
...     C = 3 * A.copy()
...     dpw.write_track(A)
...     dpw.write_track(B)
...     dpw.write_track(C)
...     dpw.close()
...     dpr = Dpy(fname, 'r')
...     dpr.read_track()
...     dpr.read_track()
...     dpr.read_tracksi([0, 1, 2, 0, 0, 2])
...     dpr.close()
...     os.remove(fname)  # delete file from disk
>>> dpy_example()

close()

read_track()
    read one track each time

read_tracks()
    read the entire tractography

read_tracksi(indices)
    read tracks with specific indices

version()

write_track(track)
    write one track each time

write_tracks(tracks)
    write many tracks together
load_pickle

dipy.io.load_pickle(fname)

Load object from pickle file fname.

Parameters
- fname : str
    filename to load dict or other python object

Returns
- dix : object
    dictionary or other object

See also

dipy.io.pickles.save_pickle
orientation_from_string

dipy.io.orientation_from_string(string_ornt)

Returns an array representation of an ornt string.

orientation_to_string

dipy.io.orientation_to_string(ornt)

Returns a string representation of a 3d ornt.

ornt_mapping

dipy.io.ornt_mapping(ornt1, ornt2)

Calculates the mapping needed to get from ornt1 to ornt2.
read_bvals_bvecs

dipy.io.read_bvals_bvecs(fbvals, fbvecs)

Read b-values and b-vectors from disk.

Parameters
- fbvals : str
    Full path to file with b-values. None to not read bvals.
- fbvecs : str
    Full path of file with b-vectors. None to not read bvecs.

Returns
- bvals : array, (N,) or None
- bvecs : array, (N, 3) or None

Notes

Files can be either '.bvals'/'.bvecs' or '.txt' or '.npy' (containing arrays stored with the appropriate values).
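A minimal usage sketch (the file names are placeholders); pairing the result with dipy.core.gradients.gradient_table is the usual next step:

    from dipy.io import read_bvals_bvecs
    from dipy.core.gradients import gradient_table

    # 'dwi.bval' and 'dwi.bvec' are placeholder paths to FSL-style text files
    bvals, bvecs = read_bvals_bvecs('dwi.bval', 'dwi.bvec')
    print(bvals.shape, bvecs.shape)          # (N,) and (N, 3)

    # a gradient table is the usual input for reconstruction models
    gtab = gradient_table(bvals, bvecs)
    print(gtab.b0s_mask.sum(), 'b0 volumes')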
read_bvec_file

dipy.io.read_bvec_file(filename, atol=0.001)

Read gradient table information from a pair of files with extensions .bvec and .bval. The bval file should have one row of values representing the b-values of each volume in the dwi data set. The bvec file should have three rows, where the rows are the x, y, and z components of the normalized gradient direction for each of the volumes.

Parameters
- filename :
    The path to either the bvec or bval file.
- atol : float, optional
    The tolerance used to check that all the gradient directions are normalized. Default is 0.001.
reorient_vectors

dipy.io.reorient_vectors(input, current_ornt, new_ornt, axis=0)

Changes the orientation of gradients or other vectors.

Moves vectors, stored along axis, from current_ornt to new_ornt. For example the vector [x, y, z] in "RAS" will be [-x, -y, z] in "LPS".

R: Right, A: Anterior, S: Superior, L: Left, P: Posterior, I: Inferior.

Examples

>>> gtab = np.array([[1, 1, 1], [1, 2, 3]])
>>> reorient_vectors(gtab, 'ras', 'asr', axis=1)
array([[1, 1, 1],
       [2, 3, 1]])
>>> reorient_vectors(gtab, 'ras', 'lps', axis=1)
array([[-1, -1,  1],
       [-1, -2,  3]])
>>> bvec = gtab.T
>>> reorient_vectors(bvec, 'ras', 'lps', axis=0)
array([[-1, -1],
       [-1, -2],
       [ 1,  3]])
>>> reorient_vectors(bvec, 'ras', 'lsp')
array([[-1, -1],
       [ 1,  3],
       [-1, -2]])
save_pickle

dipy.io.save_pickle(fname, dix)

Save dix to fname as pickle.

Parameters
- fname : str
    filename to save object e.g. a dictionary
- dix : object
    dictionary or other object

See also

dipy.io.pickles.load_pickle

Examples

>>> import os
>>> from tempfile import mkstemp
>>> fd, fname = mkstemp()  # make temporary file (opened, attached to fh)
>>> d = {0: {'d': 1}}
>>> save_pickle(fname, d)
>>> d2 = load_pickle(fname)

We remove the temporary file we created for neatness:

>>> os.close(fd)  # the file is still open, we need to close the fh
>>> os.remove(fname)
orientation_from_string

dipy.io.bvectxt.orientation_from_string(string_ornt)

Returns an array representation of an ornt string.

orientation_to_string

dipy.io.bvectxt.orientation_to_string(ornt)

Returns a string representation of a 3d ornt.

ornt_mapping

dipy.io.bvectxt.ornt_mapping(ornt1, ornt2)

Calculates the mapping needed to get from ornt1 to ornt2.

read_bvec_file

dipy.io.bvectxt.read_bvec_file(filename, atol=0.001)

Read gradient table information from a pair of files with extensions .bvec and .bval. The bval file should have one row of values representing the b-values of each volume in the dwi data set. The bvec file should have three rows, where the rows are the x, y, and z components of the normalized gradient direction for each of the volumes.

Parameters
- filename :
    The path to either the bvec or bval file.
- atol : float, optional
    The tolerance used to check that all the gradient directions are normalized. Default is 0.001.

reorient_vectors

dipy.io.bvectxt.reorient_vectors(input, current_ornt, new_ornt, axis=0)

Changes the orientation of gradients or other vectors.

Moves vectors, stored along axis, from current_ornt to new_ornt. For example the vector [x, y, z] in "RAS" will be [-x, -y, z] in "LPS".

R: Right, A: Anterior, S: Superior, L: Left, P: Posterior, I: Inferior.

Examples

>>> gtab = np.array([[1, 1, 1], [1, 2, 3]])
>>> reorient_vectors(gtab, 'ras', 'asr', axis=1)
array([[1, 1, 1],
       [2, 3, 1]])
>>> reorient_vectors(gtab, 'ras', 'lps', axis=1)
array([[-1, -1,  1],
       [-1, -2,  3]])
>>> bvec = gtab.T
>>> reorient_vectors(bvec, 'ras', 'lps', axis=0)
array([[-1, -1],
       [-1, -2],
       [ 1,  3]])
>>> reorient_vectors(bvec, 'ras', 'lsp')
array([[-1, -1],
       [ 1,  3],
       [-1, -2]])

splitext

dipy.io.bvectxt.splitext(p)

Split the extension from a pathname.

Extension is everything from the last dot to the end, ignoring leading dots. Returns "(root, ext)"; ext may be empty.
Dpy

class dipy.io.dpy.Dpy(fname, mode='r', compression=0)

Bases: object

Methods

- read_track(): read one track each time
- read_tracks(): read the entire tractography
- read_tracksi(indices): read tracks with specific indices
- write_track(track): write one track each time
- write_tracks(tracks): write many tracks together
- close()
- version()

__init__(fname, mode='r', compression=0)

Advanced storage system for tractography based on HDF5.

Parameters
- fname : str
    full filename
- mode : 'r' read, 'w' write, 'r+' read and write (only if the file already exists)
- compression : int
    0 (no compression) to 9 (maximum compression)

Examples

>>> import os
>>> import numpy as np
>>> from tempfile import mkstemp  # temp file
>>> from dipy.io.dpy import Dpy
>>> def dpy_example():
...     fd, fname = mkstemp()
...     fname += '.dpy'  # add correct extension
...     dpw = Dpy(fname, 'w')
...     A = np.ones((5, 3))
...     B = 2 * A.copy()
...     C = 3 * A.copy()
...     dpw.write_track(A)
...     dpw.write_track(B)
...     dpw.write_track(C)
...     dpw.close()
...     dpr = Dpy(fname, 'r')
...     dpr.read_track()
...     dpr.read_track()
...     dpr.read_tracksi([0, 1, 2, 0, 0, 2])
...     dpr.close()
...     os.remove(fname)  # delete file from disk
>>> dpy_example()

close()

read_track()
    read one track each time

read_tracks()
    read the entire tractography

read_tracksi(indices)
    read tracks with specific indices

version()

write_track(track)
    write one track each time

write_tracks(tracks)
    write many tracks together
Streamlines

dipy.io.dpy.Streamlines

alias of nibabel.streamlines.array_sequence.ArraySequence
InTemporaryDirectory

class dipy.io.gradients.InTemporaryDirectory(suffix='', prefix='tmp', dir=None)

Bases: nibabel.tmpdirs.TemporaryDirectory

Create, return, and change directory to a temporary directory.

Examples

>>> import os
>>> my_cwd = os.getcwd()
>>> with InTemporaryDirectory() as tmpdir:
...     _ = open('test.txt', 'wt').write('some text')
...     assert os.path.isfile('test.txt')
...     assert os.path.isfile(os.path.join(tmpdir, 'test.txt'))
>>> os.path.exists(tmpdir)
False
>>> os.getcwd() == my_cwd
True

Methods

cleanup

__init__(suffix='', prefix='tmp', dir=None)
    Initialize self. See help(type(self)) for accurate signature.
read_bvals_bvecs

dipy.io.gradients.read_bvals_bvecs(fbvals, fbvecs)

Read b-values and b-vectors from disk.

Parameters
- fbvals : str
    Full path to file with b-values. None to not read bvals.
- fbvecs : str
    Full path of file with b-vectors. None to not read bvecs.

Returns
- bvals : array, (N,) or None
- bvecs : array, (N, 3) or None

Notes

Files can be either '.bvals'/'.bvecs' or '.txt' or '.npy' (containing arrays stored with the appropriate values).
splitext

dipy.io.gradients.splitext(p)

Split the extension from a pathname.

Extension is everything from the last dot to the end, ignoring leading dots. Returns "(root, ext)"; ext may be empty.
load_nifti

dipy.io.image.load_nifti(fname, return_img=False, return_voxsize=False, return_coords=False)
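A minimal usage sketch (the file name is a placeholder); the ordering of the optional return values assumes they are appended in the order the flags appear in the signature:

    from dipy.io.image import load_nifti

    # 'dwi.nii.gz' is a placeholder path
    data, affine = load_nifti('dwi.nii.gz')
    print(data.shape, affine.shape)          # e.g. (X, Y, Z, N) and (4, 4)

    # optional flags append further values to the returned tuple
    data, affine, img, voxsize = load_nifti('dwi.nii.gz',
                                            return_img=True,
                                            return_voxsize=True)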
save_qa_metric

dipy.io.image.save_qa_metric(fname, xopt, fopt)

Save Quality Assurance metrics.

Parameters
- fname : string
    File name to save the metric values.
- xopt : numpy array
    The metric containing the optimal parameters for image registration.
- fopt : int
    The distance between the registered images.
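A minimal sketch; xopt and fopt here are made-up stand-ins for the output of an image-registration optimizer, and the file name is a placeholder:

    import numpy as np
    from dipy.io.image import save_qa_metric

    xopt = np.array([1.5, -0.2, 0.7])   # hypothetical optimal registration parameters
    fopt = 0.23                         # hypothetical final value of the distance metric
    save_qa_metric('qa_metric.txt', xopt, fopt)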
PeaksAndMetrics

class dipy.io.peaks.PeaksAndMetrics

Bases: dipy.reconst.peak_direction_getter.EuDXDirectionGetter

Attributes
- ang_thr
- qa_thr
- total_weight

Methods

- initial_direction: The best starting directions for fiber tracking from point
- get_direction

__init__($self, /, *args, **kwargs)
    Initialize self. See help(type(self)) for accurate signature.
Sphere

class dipy.io.peaks.Sphere(x=None, y=None, z=None, theta=None, phi=None, xyz=None, faces=None, edges=None)

Bases: object

Points on the unit sphere.

The sphere can be constructed using one of three conventions:

Sphere(x, y, z)
Sphere(xyz=xyz)
Sphere(theta=theta, phi=phi)

Parameters
- x, y, z : 1-D array_like
    Vertices as x-y-z coordinates.
- theta, phi : 1-D array_like
    Vertices as spherical coordinates. Theta and phi are the inclination and azimuth angles respectively.
- xyz : (N, 3) ndarray
    Vertices as x-y-z coordinates.
- faces : (N, 3) ndarray
    Indices into vertices that form triangular faces. If unspecified, the faces are computed using a Delaunay triangulation.
- edges : (N, 2) ndarray
    Edges between vertices. If unspecified, the edges are derived from the faces.

Attributes
- x
- y
- z

Methods

- find_closest(xyz): Find the index of the vertex in the Sphere closest to the input vector
- subdivide([n]): Subdivides each face of the sphere into four new faces.
- edges
- faces
- vertices

__init__(x=None, y=None, z=None, theta=None, phi=None, xyz=None, faces=None, edges=None)
    Initialize self. See help(type(self)) for accurate signature.

edges()

faces()

find_closest(xyz)

    Find the index of the vertex in the Sphere closest to the input vector.

    Parameters
    - xyz : array-like, 3 elements
        A unit vector.

    Returns
    - idx : int
        The index into the Sphere.vertices array that gives the closest vertex (in angle).

subdivide(n=1)

    Subdivides each face of the sphere into four new faces.

    New vertices are created at a, b, and c on the edges of each face. Each face [x, y, z] is then divided into the faces [x, a, c], [y, a, b], [z, b, c], and [a, b, c].

    Parameters
    - n : int, optional
        The number of subdivisions to perform.

    Returns
    - new_sphere : Sphere
        The subdivided sphere.

vertices()

property x

property y

property z
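A small sketch with toy vertices (three unit vectors along the coordinate axes); it only assumes the constructor and find_closest documented above:

    import numpy as np
    from dipy.io.peaks import Sphere

    xyz = np.array([[1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0],
                    [0.0, 0.0, 1.0]])
    sphere = Sphere(xyz=xyz)

    # find_closest expects a unit vector
    v = np.array([0.9, 0.1, 0.0])
    v /= np.linalg.norm(v)
    idx = sphere.find_closest(v)
    print(idx, sphere.vertices[idx])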
load_peaks

dipy.io.peaks.load_peaks(fname, verbose=False)

Load a PeaksAndMetrics HDF5 file (PAM5).

Parameters
- fname : string
    Filename of PAM5 file.
- verbose : bool
    Print summary information about the loaded file.

Returns
- pam : PeaksAndMetrics object
peaks_to_niftis

dipy.io.peaks.peaks_to_niftis(pam, fname_shm, fname_dirs, fname_values, fname_indices, fname_gfa, reshape_dirs=False)

Save SH, directions, indices and values of peaks to Nifti.
reshape_peaks_for_visualization

dipy.io.peaks.reshape_peaks_for_visualization(peaks)

Reshape peaks for visualization.

Reshape and convert to float32 a set of peaks for visualisation with mrtrix or the fibernavigator.

Parameters
- peaks : nd array (..., N, 3) or PeaksAndMetrics object
    The peaks to be reshaped and converted to float32.

Returns
- peaks : nd array (..., 3*N)
save_peaks

dipy.io.peaks.save_peaks(fname, pam, affine=None, verbose=False)

Save all important attributes of object PeaksAndMetrics in a PAM5 file (HDF5).

Parameters
- fname : string
    Filename of PAM5 file.
- pam : PeaksAndMetrics
    Object holding peak_dirs, shm_coeffs and other attributes.
- affine : array
    The 4x4 matrix transforming the data from native to world coordinates. PeaksAndMetrics should have that attribute, but if not it can be provided here. Default None.
- verbose : bool
    Print summary information about the saved file.
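A sketch of the save/load round trip. 'peaks.pam5' is a placeholder for a PAM5 file produced earlier (e.g. by a peak-extraction routine), and peak_dirs / peak_values are attributes the loaded object is expected to carry:

    from dipy.io.peaks import load_peaks, save_peaks

    # 'peaks.pam5' is a placeholder for an existing PAM5 file
    pam = load_peaks('peaks.pam5', verbose=True)
    print(pam.peak_dirs.shape, pam.peak_values.shape)

    # re-save under another name, e.g. after editing attributes such as the affine
    save_peaks('peaks_copy.pam5', pam)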
load_pickle

dipy.io.pickles.load_pickle(fname)

Load object from pickle file fname.

Parameters
- fname : str
    filename to load dict or other python object

Returns
- dix : object
    dictionary or other object

See also

dipy.io.pickles.save_pickle

save_pickle

dipy.io.pickles.save_pickle(fname, dix)

Save dix to fname as pickle.

Parameters
- fname : str
    filename to save object e.g. a dictionary
- dix : object
    dictionary or other object

See also

dipy.io.pickles.load_pickle

Examples

>>> import os
>>> from tempfile import mkstemp
>>> fd, fname = mkstemp()  # make temporary file (opened, attached to fh)
>>> d = {0: {'d': 1}}
>>> save_pickle(fname, d)
>>> d2 = load_pickle(fname)

We remove the temporary file we created for neatness:

>>> os.close(fd)  # the file is still open, we need to close the fh
>>> os.remove(fname)
PerArrayDict

class dipy.io.stateful_tractogram.PerArrayDict(n_rows=0, *args, **kwargs)

Bases: nibabel.streamlines.tractogram.SliceableDataDict

Dictionary for which key access can do slicing on the values.

This container behaves like a standard dictionary but extends key access to allow keys to be indices slicing into the contained ndarray values. The elements must also be ndarrays.

In addition, it makes sure the amount of data contained in those ndarrays matches the number of streamlines given at the instantiation of this instance.

Parameters
- n_rows : None or int, optional
    Number of rows per value in each (key, value) pair, or None if not specified.
- *args, **kwargs :
    Positional and keyword arguments, passed straight through the dict constructor.

Methods

- clear()
- extend(other): Appends the elements of another PerArrayDict.
- get(k[, d])
- items()
- keys()
- pop(k[, d]): If key is not found, d is returned if given, otherwise KeyError is raised.
- popitem(): Remove and return some (key, value) pair as a 2-tuple; but raise KeyError if the dictionary is empty.
- setdefault(k[, d])
- update([E, ]**F): If E is present and has a .keys() method, does: for k in E: D[k] = E[k]. If E is present and lacks a .keys() method, does: for (k, v) in E: D[k] = v. In either case, this is followed by: for k, v in F.items(): D[k] = v.
- values()

__init__(n_rows=0, *args, **kwargs)
    Initialize self. See help(type(self)) for accurate signature.

extend(other)

    Appends the elements of another PerArrayDict. That is, for each entry in this dictionary, we append the elements coming from the other dictionary at the corresponding entry.

    Parameters
    - other : PerArrayDict object
        Its data will be appended to the data of this dictionary.

    Returns
    - None

    Notes

    The keys in both dictionaries must be the same.
PerArraySequenceDict

class dipy.io.stateful_tractogram.PerArraySequenceDict(n_rows=0, *args, **kwargs)

Bases: nibabel.streamlines.tractogram.PerArrayDict

Dictionary for which key access can do slicing on the values.

This container behaves like a standard dictionary but extends key access to allow keys to be indices slicing into the contained values. The elements must also be ArraySequence.

In addition, it makes sure the amount of data contained in those array sequences matches the number of elements given at the instantiation of the instance.

Methods

- clear()
- extend(other): Appends the elements of another PerArrayDict.
- get(k[, d])
- items()
- keys()
- pop(k[, d]): If key is not found, d is returned if given, otherwise KeyError is raised.
- popitem(): Remove and return some (key, value) pair as a 2-tuple; but raise KeyError if the dictionary is empty.
- setdefault(k[, d])
- update([E, ]**F): If E is present and has a .keys() method, does: for k in E: D[k] = E[k]. If E is present and lacks a .keys() method, does: for (k, v) in E: D[k] = v. In either case, this is followed by: for k, v in F.items(): D[k] = v.
- values()

__init__(n_rows=0, *args, **kwargs)
    Initialize self. See help(type(self)) for accurate signature.
StatefulTractogram

class dipy.io.stateful_tractogram.StatefulTractogram(streamlines, reference, space, shifted_origin=False, data_per_point=None, data_per_streamline=None)

Bases: object

Class for stateful representation of collections of streamlines. Objects are designed to be identical no matter the file format (trk, tck, vtk, fib, dpy), and facilitate transformation between spaces and data manipulation for each streamline / point.

Attributes
- data_per_point: Getter for data_per_point
- data_per_streamline: Getter for data_per_streamline
- shifted_origin: Getter for shift
- space: Getter for the current space
- space_attribute: Getter for spatial attribute
- streamlines: Partially safe getter for streamlines

Methods

- compute_bounding_box(): Compute the bounding box of the streamlines in their current state
- get_streamlines_copy(): Safe getter for streamlines (for slicing)
- is_bbox_in_vox_valid(): Verify that the bounding box is valid in voxel space
- remove_invalid_streamlines(): Remove streamlines with invalid coordinates from the object
- to_center(): Safe function to shift streamlines so the center of the voxel is the origin
- to_corner(): Safe function to shift streamlines so the corner of the voxel is the origin
- to_rasmm(): Safe function to transform streamlines and update state
- to_vox(): Safe function to transform streamlines and update state
- to_voxmm(): Safe function to transform streamlines and update state

__init__(streamlines, reference, space, shifted_origin=False, data_per_point=None, data_per_streamline=None)

Create a strict, state-aware, robust tractogram.

Parameters
- streamlines : list or ArraySequence
    Streamlines of the tractogram.
- reference : Nifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header, trk.header (dict) or another StatefulTractogram
    Reference that provides the spatial attributes. Typically a nifti-related object from the native diffusion data used for streamlines generation.
- space : string
    Current space in which the streamlines are (vox, voxmm or rasmm). Typically after tracking the space is VOX; after nibabel loading the space is RASMM.
- shifted_origin : bool
    Information on the position of the origin. False is the Trackvis standard, the default (corner of the voxel); True is the NIFTI standard (center of the voxel).
- data_per_point : dict
    Dictionary in which each key has X items, each item having Y_i items, X being the number of streamlines and Y_i the number of points on streamline #i.
- data_per_streamline : dict
    Dictionary in which each key has X items, X being the number of streamlines.

Notes

It is very important to respect the convention: verify that streamlines match the reference and are effectively in the right space.

Any change to the number of streamlines, data_per_point or data_per_streamline requires particular verification.

For manipulations not allowed by this object, use Nibabel directly and be careful.
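A minimal construction sketch; the reference file name is a placeholder, and Space.RASMM is the enum value that also appears in the loader signatures further below:

    import numpy as np
    from dipy.io.stateful_tractogram import StatefulTractogram, Space

    # two toy streamlines expressed in world (RAS+ mm) coordinates
    streamlines = [np.array([[0., 0., 0.], [1., 1., 1.], [2., 2., 2.]]),
                   np.array([[0., 1., 0.], [1., 2., 1.]])]

    # 'anat.nii.gz' is a placeholder reference providing the spatial attributes
    sft = StatefulTractogram(streamlines, 'anat.nii.gz', Space.RASMM)

    print(len(sft.streamlines), sft.space)
    sft.to_vox()                        # safe transform; the internal state is updated
    print(sft.compute_bounding_box())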
compute_bounding_box()

    Compute the bounding box of the streamlines in their current state.

    Returns
    - output : ndarray
        8 corners of the XYZ aligned box, all zeros if no streamlines.

property data_per_point
    Getter for data_per_point

property data_per_streamline
    Getter for data_per_streamline

get_streamlines_copy()
    Safe getter for streamlines (for slicing)

is_bbox_in_vox_valid()

    Verify that the bounding box is valid in voxel space. Will transform the streamlines for OBB; slow for big tractograms.

    Returns
    - output : bool
        Are the streamlines within the volume of the associated reference

remove_invalid_streamlines()

    Remove streamlines with invalid coordinates from the object. Will also remove the corresponding data_per_point and data_per_streamline. Invalid coordinates are any X, Y, Z values above the reference dimensions or below zero.

    Returns
    - output : tuple
        Tuple of two lists: indices_to_remove, indices_to_keep.

property shifted_origin
    Getter for shift

property space
    Getter for the current space

property space_attribute
    Getter for spatial attribute

property streamlines
    Partially safe getter for streamlines

to_center()
    Safe function to shift streamlines so the center of the voxel is the origin.

to_corner()
    Safe function to shift streamlines so the corner of the voxel is the origin.

to_rasmm()
    Safe function to transform streamlines and update state.

to_vox()
    Safe function to transform streamlines and update state.

to_voxmm()
    Safe function to transform streamlines and update state.
Streamlines

dipy.io.stateful_tractogram.Streamlines

alias of nibabel.streamlines.array_sequence.ArraySequence
Tractogram

class dipy.io.stateful_tractogram.Tractogram(streamlines=None, data_per_streamline=None, data_per_point=None, affine_to_rasmm=None)

Bases: object

Container for streamlines and their data information.

Streamlines of a tractogram can be in any coordinate system of your choice as long as you provide the correct affine_to_rasmm matrix at construction time. When applied to streamlines coordinates, that transformation matrix should bring the streamlines back to world space (RAS+ and mm space) [1].

Moreover, when streamlines are mapped back to voxel space [2], a streamline point located at an integer coordinate (i, j, k) is considered to be at the center of the corresponding voxel. This is in contrast with other conventions where it might have referred to a corner.

References

[1] http://nipy.org/nibabel/coordinate_systems.html#naming-reference-spaces
[2] http://nipy.org/nibabel/coordinate_systems.html#voxel-coordinates-are-in-voxel-space
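A small construction sketch with toy data, using only the constructor arguments documented below:

    import numpy as np
    from dipy.io.stateful_tractogram import Tractogram

    streamlines = [np.array([[0., 0., 0.], [1., 0., 0.], [2., 0., 0.]]),
                   np.array([[0., 1., 0.], [0., 2., 0.]])]

    # one scalar per streamline, and one value per point of every streamline
    data_per_streamline = {'mean_fa': np.array([[0.5], [0.6]])}
    data_per_point = {'fa': [np.array([[0.4], [0.5], [0.6]]),
                             np.array([[0.55], [0.65]])]}

    tractogram = Tractogram(streamlines,
                            data_per_streamline=data_per_streamline,
                            data_per_point=data_per_point,
                            affine_to_rasmm=np.eye(4))

    print(len(tractogram.streamlines))
    print(tractogram.data_per_streamline['mean_fa'])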
Attributes
- streamlines : ArraySequence object
    Sequence of T streamlines. Each streamline is an ndarray of shape (N_t, 3) where N_t is the number of points of streamline t.
- data_per_streamline : PerArrayDict object
    Dictionary where the items are (str, 2D array). Each key represents a piece of information i to be kept alongside every streamline, and its associated value is a 2D array of shape (T, P_i) where T is the number of streamlines and P_i is the number of values to store for that particular piece of information i.
- data_per_point : PerArraySequenceDict object
    Dictionary where the items are (str, ArraySequence). Each key represents a piece of information i to be kept alongside every point of every streamline, and its associated value is an iterable of ndarrays of shape (N_t, M_i) where N_t is the number of points for a particular streamline t and M_i is the number of values to store for that particular piece of information i.

Methods

- apply_affine(affine[, lazy]): Applies an affine transformation on the points of each streamline.
- copy(): Returns a copy of this Tractogram object.
- extend(other): Appends the data of another Tractogram.
- to_world([lazy]): Brings the streamlines to world space (i.e. RAS+ and mm).

__init__(streamlines=None, data_per_streamline=None, data_per_point=None, affine_to_rasmm=None)

Parameters
- streamlines : iterable of ndarrays or ArraySequence, optional
    Sequence of T streamlines. Each streamline is an ndarray of shape (N_t, 3) where N_t is the number of points of streamline t.
- data_per_streamline : dict of iterable of ndarrays, optional
    Dictionary where the items are (str, iterable). Each key represents a piece of information i to be kept alongside every streamline, and its associated value is an iterable of ndarrays of shape (P_i,) where P_i is the number of scalar values to store for that particular piece of information i.
- data_per_point : dict of iterable of ndarrays, optional
    Dictionary where the items are (str, iterable). Each key represents a piece of information i to be kept alongside every point of every streamline, and its associated value is an iterable of ndarrays of shape (N_t, M_i) where N_t is the number of points for a particular streamline t and M_i is the number of scalar values to store for that particular piece of information i.
- affine_to_rasmm : ndarray of shape (4, 4) or None, optional
    Transformation matrix that brings the streamlines contained in this tractogram to RAS+ and mm space where coordinate (0, 0, 0) refers to the center of the voxel. By default, the streamlines are in an unknown space, i.e. affine_to_rasmm is None.
property affine_to_rasmm
    Affine bringing streamlines in this tractogram to RAS+mm.

apply_affine(affine, lazy=False)

    Applies an affine transformation on the points of each streamline.

    If lazy is not specified, this is performed in-place.

    Parameters
    - affine : ndarray of shape (4, 4)
        Transformation that will be applied to every streamline.
    - lazy : {False, True}, optional
        If True, streamlines are not transformed in-place and a LazyTractogram object is returned. Otherwise, streamlines are modified in-place.

    Returns
    - tractogram : Tractogram or LazyTractogram object
        Tractogram where the streamlines have been transformed according to the given affine transformation. If the lazy option is true, it returns a LazyTractogram object, otherwise it returns a reference to this Tractogram object with updated streamlines.

copy()
    Returns a copy of this Tractogram object.

property data_per_point

property data_per_streamline

extend(other)

    Appends the data of another Tractogram.

    Data that will be appended includes the streamlines and the content of both dictionaries data_per_streamline and data_per_point.

    Parameters
    - other : Tractogram object
        Its data will be appended to the data of this tractogram.

    Returns
    - None

    Notes

    The entries in both dictionaries self.data_per_streamline and self.data_per_point must match respectively those contained in the other tractogram.

property streamlines

to_world(lazy=False)

    Brings the streamlines to world space (i.e. RAS+ and mm).

    If lazy is not specified, this is performed in-place.

    Parameters
    - lazy : {False, True}, optional
        If True, streamlines are not transformed in-place and a LazyTractogram object is returned. Otherwise, streamlines are modified in-place.

    Returns
    - tractogram : Tractogram or LazyTractogram object
        Tractogram where the streamlines have been sent to world space. If the lazy option is true, it returns a LazyTractogram object, otherwise it returns a reference to this Tractogram object with updated streamlines.
itemgetter

class dipy.io.stateful_tractogram.itemgetter

Bases: object

itemgetter(item, ...) --> itemgetter object

Return a callable object that fetches the given item(s) from its operand. After f = itemgetter(2), the call f(r) returns r[2]. After g = itemgetter(2, 5, 3), the call g(r) returns (r[2], r[5], r[3]).

Methods

- __call__($self, /, *args, **kwargs): Call self as a function.

__init__($self, /, *args, **kwargs)
    Initialize self. See help(type(self)) for accurate signature.

product

class dipy.io.stateful_tractogram.product

Bases: object

product(*iterables, repeat=1) --> product object

Cartesian product of input iterables. Equivalent to nested for-loops.

For example, product(A, B) returns the same as: ((x, y) for x in A for y in B). The leftmost iterators are in the outermost for-loop, so the output tuples cycle in a manner similar to an odometer (with the rightmost element changing on every iteration).

To compute the product of an iterable with itself, specify the number of repetitions with the optional repeat keyword argument. For example, product(A, repeat=4) means the same as product(A, A, A, A).

product('ab', range(3)) --> ('a', 0) ('a', 1) ('a', 2) ('b', 0) ('b', 1) ('b', 2)
product((0, 1), (0, 1), (0, 1)) --> (0, 0, 0) (0, 0, 1) (0, 1, 0) (0, 1, 1) (1, 0, 0) ...

__init__($self, /, *args, **kwargs)
    Initialize self. See help(type(self)) for accurate signature.
apply_affine

dipy.io.stateful_tractogram.apply_affine(aff, pts)

Apply affine matrix aff to points pts.

Returns the result of applying aff to the right of pts. The coordinate dimension of pts should be the last.

For the 3D case, aff will be shape (4, 4) and pts will have final axis length 3 - maybe it will just be N by 3. The return value is the transformed points, in this case:

res = np.dot(aff[:3,:3], pts.T) + aff[:3,3:4]
transformed_pts = res.T

This routine is more general than 3D, in that aff can have any shape (N, N), and pts can have any shape, as long as the last dimension is for the coordinates, and is therefore length N-1.

Parameters
- aff : (N, N) array-like
    Homogeneous affine; for 3D points, will be 4 by 4. Contrary to first appearance, the affine will be applied on the left of pts.
- pts : (..., N-1) array-like
    Points, where the last dimension contains the coordinates of each point. For 3D, the last dimension will be length 3.

Returns
- transformed_pts : (..., N-1) array
    transformed points

Examples

>>> aff = np.array([[0, 2, 0, 10], [3, 0, 0, 11], [0, 0, 4, 12], [0, 0, 0, 1]])
>>> pts = np.array([[1, 2, 3], [2, 3, 4], [4, 5, 6], [6, 7, 8]])
>>> apply_affine(aff, pts)  # doctest: +ELLIPSIS
array([[14, 14, 24],
       [16, 17, 28],
       [20, 23, 36],
       [24, 29, 44]]...)

Just to show that in the simple 3D case, it is equivalent to:

>>> (np.dot(aff[:3, :3], pts.T) + aff[:3, 3:4]).T  # doctest: +ELLIPSIS
array([[14, 14, 24],
       [16, 17, 28],
       [20, 23, 36],
       [24, 29, 44]]...)

But pts can be a more complicated shape:

>>> pts = pts.reshape((2, 2, 3))
>>> apply_affine(aff, pts)  # doctest: +ELLIPSIS
array([[[14, 14, 24],
        [16, 17, 28]],
<BLANKLINE>
       [[20, 23, 36],
        [24, 29, 44]]]...)
deepcopy

dipy.io.stateful_tractogram.deepcopy(x, memo=None, _nil=[])

Deep copy operation on arbitrary Python objects.

See the module's __doc__ string for more info.
get_reference_info

dipy.io.stateful_tractogram.get_reference_info(reference)

Extract the spatial attributes from a reference.

Parameters
- reference : Nifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or trk.header (dict)
    Reference that provides the spatial attributes.

Returns
- output : tuple
    affine : ndarray (4, 4), np.float32, transformation of VOX to RASMM
    dimensions : list (3), int, volume shape for each axis
    voxel_sizes : list (3), float, size of voxel for each axis
    voxel_order : string, typically 'RAS' or 'LPS'
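A usage sketch; the reference file name is a placeholder:

    from dipy.io.stateful_tractogram import get_reference_info

    affine, dimensions, voxel_sizes, voxel_order = get_reference_info('anat.nii.gz')
    print(affine)        # (4, 4) VOX-to-RASMM transformation
    print(dimensions)    # volume shape per axis, e.g. (128, 128, 60)
    print(voxel_sizes)   # voxel size per axis in mm
    print(voxel_order)   # e.g. 'RAS' or 'LPS'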
Dpy

class dipy.io.streamline.Dpy(fname, mode='r', compression=0)

Bases: object

Methods

- read_track(): read one track each time
- read_tracks(): read the entire tractography
- read_tracksi(indices): read tracks with specific indices
- write_track(track): write one track each time
- write_tracks(tracks): write many tracks together
- close()
- version()

__init__(fname, mode='r', compression=0)

Advanced storage system for tractography based on HDF5.

Parameters
- fname : str
    full filename
- mode : 'r' read, 'w' write, 'r+' read and write (only if the file already exists)
- compression : int
    0 (no compression) to 9 (maximum compression)

Examples

>>> import os
>>> import numpy as np
>>> from tempfile import mkstemp  # temp file
>>> from dipy.io.dpy import Dpy
>>> def dpy_example():
...     fd, fname = mkstemp()
...     fname += '.dpy'  # add correct extension
...     dpw = Dpy(fname, 'w')
...     A = np.ones((5, 3))
...     B = 2 * A.copy()
...     C = 3 * A.copy()
...     dpw.write_track(A)
...     dpw.write_track(B)
...     dpw.write_track(C)
...     dpw.close()
...     dpr = Dpy(fname, 'r')
...     dpr.read_track()
...     dpr.read_track()
...     dpr.read_tracksi([0, 1, 2, 0, 0, 2])
...     dpr.close()
...     os.remove(fname)  # delete file from disk
>>> dpy_example()

close()

read_track()
    read one track each time

read_tracks()
    read the entire tractography

read_tracksi(indices)
    read tracks with specific indices

version()

write_track(track)
    write one track each time

write_tracks(tracks)
    write many tracks together
-
StatefulTractogram
¶
-
class
dipy.io.streamline.
StatefulTractogram
(streamlines, reference, space, shifted_origin=False, data_per_point=None, data_per_streamline=None)¶ Bases:
object
Class for stateful representation of collections of streamlines Object designed to be identical no matter the file format (trk, tck, vtk, fib, dpy). Facilitate transformation between space and data manipulation for each streamline / point.
- Attributes
data_per_point
Getter for data_per_point
data_per_streamline
Getter for data_per_streamline
shifted_origin
Getter for shift
space
Getter for the current space
space_attribute
Getter for spatial attribute
streamlines
Partially safe getter for streamlines
Methods
Compute the bounding box of the streamlines in their current state
Safe getter for streamlines (for slicing)
Verify that the bounding box is valid in voxel space Will transform the streamlines for OBB, slow for big tractogram
Remove streamlines with invalid coordinates from the object.
Safe function to shift streamlines so the center of voxel is the origin
Safe function to shift streamlines so the corner of voxel is the origin
to_rasmm
()Safe function to transform streamlines and update state
to_vox
()Safe function to transform streamlines and update state
to_voxmm
()Safe function to transform streamlines and update state
-
__init__
(streamlines, reference, space, shifted_origin=False, data_per_point=None, data_per_streamline=None)¶ Create a strict, state-aware, robust tractogram
- Parameters
- streamlineslist or ArraySequence
Streamlines of the tractogram
- referenceNifti or Trk filename, Nifti1Image or TrkFile,
Nifti1Header, trk.header (dict) or another Stateful Tractogram Reference that provides the spatial attribute. Typically a nifti-related object from the native diffusion used for streamlines generation
- spacestring
Current space in which the streamlines are (vox, voxmm or rasmm) Typically after tracking the space is VOX, after nibabel loading the space is RASMM
- shifted_originbool
Information on the position of the origin, False is Trackvis standard, default (corner of the voxel) True is NIFTI standard (center of the voxel)
- data_per_pointdict
Dictionary in which each key has X items, each items has Y_i items X being the number of streamlines Y_i being the number of points on streamlines #i
- data_per_streamlinedict
Dictionary in which each key has X items X being the number of streamlines
Notes
Very important to respect the convention, verify that streamlines match the reference and are effectively in the right space.
Any change to the number of streamlines, data_per_point or data_per_streamline requires particular verification.
In a case of manipulation not allowed by this object, use Nibabel directly and be careful.
-
compute_bounding_box
()¶ Compute the bounding box of the streamlines in their current state
- Returns
- outputndarray
8 corners of the XYZ aligned box, all zeros if no streamlines
-
property
data_per_point
¶ Getter for data_per_point
-
property
data_per_streamline
¶ Getter for data_per_streamline
-
get_streamlines_copy
()¶ Safe getter for streamlines (for slicing)
-
is_bbox_in_vox_valid
()¶ Verify that the bounding box is valid in voxel space Will transform the streamlines for OBB, slow for big tractogram
- Returns
- outputbool
Are the streamlines within the volume of the associated reference
-
remove_invalid_streamlines()

    Remove streamlines with invalid coordinates from the object. Will also remove the corresponding data_per_point and data_per_streamline. Invalid coordinates are any X, Y, Z values above the reference dimensions or below zero.

    Returns
    - output : tuple
        Tuple of two lists: indices_to_remove, indices_to_keep.
-
property
shifted_origin
¶ Getter for shift
-
property
space
¶ Getter for the current space
-
property
space_attribute
¶ Getter for spatial attribute
-
property
streamlines
¶ Partially safe getter for streamlines
-
to_center
()¶ Safe function to shift streamlines so the center of voxel is the origin
-
to_corner
()¶ Safe function to shift streamlines so the corner of voxel is the origin
-
to_rasmm
()¶ Safe function to transform streamlines and update state
-
to_vox
()¶ Safe function to transform streamlines and update state
-
to_voxmm
()¶ Safe function to transform streamlines and update state
Tractogram
¶
-
class
dipy.io.streamline.
Tractogram
(streamlines=None, data_per_streamline=None, data_per_point=None, affine_to_rasmm=None)¶ Bases:
object
Container for streamlines and their data information.
Streamlines of a tractogram can be in any coordinate system of your choice as long as you provide the correct affine_to_rasmm matrix, at construction time. When applied to streamlines coordinates, that transformation matrix should bring the streamlines back to world space (RAS+ and mm space) [1]_.
Moreover, when streamlines are mapped back to voxel space [2]_, a streamline point located at an integer coordinate (i,j,k) is considered to be at the center of the corresponding voxel. This is in contrast with other conventions where it might have referred to a corner.
References
[1] http://nipy.org/nibabel/coordinate_systems.html#naming-reference-spaces [2] http://nipy.org/nibabel/coordinate_systems.html#voxel-coordinates-are-in-voxel-space
- Attributes
- streamlines
ArraySequence
object Sequence of \(T\) streamlines. Each streamline is an ndarray of shape (\(N_t\), 3) where \(N_t\) is the number of points of streamline \(t\).
- data_per_streamline
PerArrayDict
object Dictionary where the items are (str, 2D array). Each key represents a piece of information \(i\) to be kept alongside every streamline, and its associated value is a 2D array of shape (\(T\), \(P_i\)) where \(T\) is the number of streamlines and \(P_i\) is the number of values to store for that particular piece of information \(i\).
- data_per_point
PerArraySequenceDict
object Dictionary where the items are (str,
ArraySequence
). Each key represents a piece of information \(i\) to be kept alongside every point of every streamline, and its associated value is an iterable of ndarrays of shape (\(N_t\), \(M_i\)) where \(N_t\) is the number of points for a particular streamline \(t\) and \(M_i\) is the number values to store for that particular piece of information \(i\).
- streamlines
Methods
apply_affine
(affine[, lazy])Applies an affine transformation on the points of each streamline.
copy
()Returns a copy of this
Tractogram
object.extend
(other)Appends the data of another
Tractogram
.to_world
([lazy])Brings the streamlines to world space (i.e.
-
__init__
(streamlines=None, data_per_streamline=None, data_per_point=None, affine_to_rasmm=None)¶ - Parameters
- streamlinesiterable of ndarrays or
ArraySequence
, optional Sequence of \(T\) streamlines. Each streamline is an ndarray of shape (\(N_t\), 3) where \(N_t\) is the number of points of streamline \(t\).
- data_per_streamlinedict of iterable of ndarrays, optional
Dictionary where the items are (str, iterable). Each key represents an information \(i\) to be kept alongside every streamline, and its associated value is an iterable of ndarrays of shape (\(P_i\),) where \(P_i\) is the number of scalar values to store for that particular information \(i\).
- data_per_pointdict of iterable of ndarrays, optional
Dictionary where the items are (str, iterable). Each key represents an information \(i\) to be kept alongside every point of every streamline, and its associated value is an iterable of ndarrays of shape (\(N_t\), \(M_i\)) where \(N_t\) is the number of points for a particular streamline \(t\) and \(M_i\) is the number scalar values to store for that particular information \(i\).
- affine_to_rasmmndarray of shape (4, 4) or None, optional
Transformation matrix that brings the streamlines contained in this tractogram to RAS+ and mm space where coordinate (0,0,0) refers to the center of the voxel. By default, the streamlines are in an unknown space, i.e. affine_to_rasmm is None.
- streamlinesiterable of ndarrays or
-
property
affine_to_rasmm
¶ Affine bringing streamlines in this tractogram to RAS+mm.
-
apply_affine
(affine, lazy=False)¶ Applies an affine transformation on the points of each streamline.
If lazy is not specified, this is performed in-place.
- Parameters
- affinendarray of shape (4, 4)
Transformation that will be applied to every streamline.
- lazy{False, True}, optional
If True, streamlines are not transformed in-place and a
LazyTractogram
object is returned. Otherwise, streamlines are modified in-place.
- Returns
- tractogram
Tractogram
orLazyTractogram
object Tractogram where the streamlines have been transformed according to the given affine transformation. If the lazy option is true, it returns a
LazyTractogram
object, otherwise it returns a reference to thisTractogram
object with updated streamlines.
- tractogram
-
copy
()¶ Returns a copy of this
Tractogram
object.
-
property
data_per_point
¶
-
property
data_per_streamline
¶
-
extend
(other)¶ Appends the data of another
Tractogram
.Data that will be appended includes the streamlines and the content of both dictionaries data_per_streamline and data_per_point.
- Parameters
- other
Tractogram
object Its data will be appended to the data of this tractogram.
- other
- Returns
- None
Notes
The entries in both dictionaries self.data_per_streamline and self.data_per_point must match respectively those contained in the other tractogram.
-
property
streamlines
¶
-
to_world
(lazy=False)¶ Brings the streamlines to world space (i.e. RAS+ and mm).
If lazy is not specified, this is performed in-place.
- Parameters
- lazy{False, True}, optional
If True, streamlines are not transformed in-place and a
LazyTractogram
object is returned. Otherwise, streamlines are modified in-place.
- Returns
- tractogram
Tractogram
orLazyTractogram
object Tractogram where the streamlines have been sent to world space. If the lazy option is true, it returns a
LazyTractogram
object, otherwise it returns a reference to thisTractogram
object with updated streamlines.
- tractogram
create_tractogram_header

dipy.io.streamline.create_tractogram_header(tractogram_type, affine, dimensions, voxel_sizes, voxel_order)

Write a standard trk/tck header from spatial attributes.
deepcopy¶
-
dipy.io.streamline.
deepcopy
(x, memo=None, _nil=[])¶ Deep copy operation on arbitrary Python objects.
See the module’s __doc__ string for more info.
detect_format

dipy.io.streamline.detect_format(fileobj)

Returns the StreamlinesFile object guessed from the file-like object.

Parameters
- fileobj : string or file-like object
    If string, a filename; otherwise an open file-like object pointing to a tractogram file (and ready to read from the beginning of the header).

Returns
- tractogram_file : TractogramFile class
    The class type guessed from the content of fileobj.
is_header_compatible

dipy.io.streamline.is_header_compatible(reference_1, reference_2)

Will compare the spatial attributes of 2 references.

Parameters
- reference_1 : Nifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or trk.header (dict)
    Reference that provides the spatial attributes.
- reference_2 : Nifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or trk.header (dict)
    Reference that provides the spatial attributes.

Returns
- output : bool
    Whether all the spatial attributes match.
load_dpy¶
-
dipy.io.streamline.
load_dpy
(filename, reference, to_space=<Space.RASMM: 'rasmm'>, shifted_origin=False, bbox_valid_check=True, trk_header_check=True)¶ Load the stateful tractogram of the dpy format
- Parameters
- filenamestring
Filename with valid extension
- referenceNifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or
trk.header (dict), or ‘same’ if the input is a trk file. Reference that provides the spatial attribute. Typically a nifti-related object from the native diffusion used for streamlines generation
- spacestring
Space in which the streamlines will be transformed after loading (vox, voxmm or rasmm)
- shifted_originbool
Information on the position of the origin, False is Trackvis standard, default (center of the voxel) True is NIFTI standard (corner of the voxel)
- Returns
- outputStatefulTractogram
The tractogram to load (must have been saved properly)
load_fib¶
-
dipy.io.streamline.
load_fib
(filename, reference, to_space=<Space.RASMM: 'rasmm'>, shifted_origin=False, bbox_valid_check=True, trk_header_check=True)¶ Load the stateful tractogram of the fib format
- Parameters
- filenamestring
Filename with valid extension
- referenceNifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or
trk.header (dict), or ‘same’ if the input is a trk file. Reference that provides the spatial attribute. Typically a nifti-related object from the native diffusion used for streamlines generation
- spacestring
Space in which the streamlines will be transformed after loading (vox, voxmm or rasmm)
- shifted_originbool
Information on the position of the origin, False is Trackvis standard, default (center of the voxel) True is NIFTI standard (corner of the voxel)
- Returns
- outputStatefulTractogram
The tractogram to load (must have been saved properly)
load_tck¶
-
dipy.io.streamline.
load_tck
(filename, reference, to_space=<Space.RASMM: 'rasmm'>, shifted_origin=False, bbox_valid_check=True, trk_header_check=True)¶ Load the stateful tractogram of the tck format
- Parameters
- filenamestring
Filename with valid extension
- referenceNifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or
trk.header (dict), or ‘same’ if the input is a trk file. Reference that provides the spatial attribute. Typically a nifti-related object from the native diffusion used for streamlines generation
- spacestring
Space in which the streamlines will be transformed after loading (vox, voxmm or rasmm)
- shifted_originbool
Information on the position of the origin, False is Trackvis standard, default (center of the voxel) True is NIFTI standard (corner of the voxel)
- Returns
- outputStatefulTractogram
The tractogram to load (must have been saved properly)
load_tractogram

dipy.io.streamline.load_tractogram(filename, reference, to_space=<Space.RASMM: 'rasmm'>, shifted_origin=False, bbox_valid_check=True, trk_header_check=True)

Load the stateful tractogram from any format (trk, tck, fib, dpy).

Parameters
- filename : string
    Filename with valid extension.
- reference : Nifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or trk.header (dict), or 'same' if the input is a trk file
    Reference that provides the spatial attributes. Typically a nifti-related object from the native diffusion data used for streamlines generation.
- to_space : string
    Space to which the streamlines will be transformed after loading (vox, voxmm or rasmm).
- shifted_origin : bool
    Information on the position of the origin. False is the Trackvis standard, the default (center of the voxel); True is the NIFTI standard (corner of the voxel).

Returns
- output : StatefulTractogram
    The tractogram to load (must have been saved properly).
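A sketch of typical use together with save_tractogram; the file names are placeholders:

    from dipy.io.streamline import load_tractogram, save_tractogram

    # load a trk file, taking the spatial attributes from a separate reference image
    sft = load_tractogram('bundle.trk', 'anat.nii.gz')

    # trk files carry their own header, so 'same' can be used as the reference
    sft_same = load_tractogram('bundle.trk', 'same')

    # write the stateful tractogram out in another format
    save_tractogram(sft, 'bundle.tck')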
load_trk¶
-
dipy.io.streamline.
load_trk
(filename, reference, to_space=<Space.RASMM: 'rasmm'>, shifted_origin=False, bbox_valid_check=True, trk_header_check=True)¶ Load the stateful tractogram of the trk format
- Parameters
- filenamestring
Filename with valid extension
- referenceNifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or
trk.header (dict), or ‘same’ if the input is a trk file. Reference that provides the spatial attribute. Typically a nifti-related object from the native diffusion used for streamlines generation
- spacestring
Space in which the streamlines will be transformed after loading (vox, voxmm or rasmm)
- shifted_originbool
Information on the position of the origin, False is Trackvis standard, default (center of the voxel) True is NIFTI standard (corner of the voxel)
- Returns
- outputStatefulTractogram
The tractogram to load (must have been saved properly)
load_vtk¶
-
dipy.io.streamline.
load_vtk
(filename, reference, to_space=<Space.RASMM: 'rasmm'>, shifted_origin=False, bbox_valid_check=True, trk_header_check=True)¶ Load the stateful tractogram of the vtk format
- Parameters
- filenamestring
Filename with valid extension
- referenceNifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or
trk.header (dict), or ‘same’ if the input is a trk file. Reference that provides the spatial attribute. Typically a nifti-related object from the native diffusion used for streamlines generation
- spacestring
Space in which the streamlines will be transformed after loading (vox, voxmm or rasmm)
- shifted_originbool
Information on the position of the origin, False is Trackvis standard, default (center of the voxel) True is NIFTI standard (corner of the voxel)
- Returns
- outputStatefulTractogram
The tractogram to load (must have been saved properly)
load_vtk_streamlines

dipy.io.streamline.load_vtk_streamlines(filename, to_lps=True)

Load streamlines from vtk polydata.

Supported load formats are VTK and FIB.

Parameters
- filename : string
    input filename (.vtk or .fib)
- to_lps : bool
    Defaults to True; follows the vtk file convention for streamlines (LPS), as supported by MITK Diffusion and MI-Brain.

Returns
- output : list
    list of 2D arrays
save_dpy¶
-
dipy.io.streamline.
save_dpy
(sft, filename, bbox_valid_check=True)¶ Save the stateful tractogram in dpy format
- Parameters
- sftStatefulTractogram
The stateful tractogram to save
- filenamestring
Filename with valid extension
- Returns
- outputbool
Did the saving work properly
save_fib¶
-
dipy.io.streamline.
save_fib
(sft, filename, bbox_valid_check=True)¶ Save the stateful tractogram in fib format
- Parameters
- sftStatefulTractogram
The stateful tractogram to save
- filenamestring
Filename with valid extension
- Returns
- outputbool
Did the saving work properly
save_tck¶
-
dipy.io.streamline.
save_tck
(sft, filename, bbox_valid_check=True)¶ Save the stateful tractogram in tck format
- Parameters
- sftStatefulTractogram
The stateful tractogram to save
- filenamestring
Filename with valid extension
- Returns
- outputbool
Did the saving work properly
save_tractogram

dipy.io.streamline.save_tractogram(sft, filename, bbox_valid_check=True)

Save the stateful tractogram in any format (trk, tck, vtk, fib, dpy).

Parameters
- sft : StatefulTractogram
    The stateful tractogram to save.
- filename : string
    Filename with valid extension.

Returns
- output : bool
    Whether the saving worked properly.
save_trk¶
-
dipy.io.streamline.
save_trk
(sft, filename, bbox_valid_check=True)¶ Save the stateful tractogram in trk format
- Parameters
- sftStatefulTractogram
The stateful tractogram to save
- filenamestring
Filename with valid extension
- Returns
- outputbool
Did the saving work properly
save_vtk¶
-
dipy.io.streamline.
save_vtk
(sft, filename, bbox_valid_check=True)¶ Save the stateful tractogram in vtk format
- Parameters
- sftStatefulTractogram
The stateful tractogram to save
- filenamestring
Filename with valid extension
- Returns
- outputbool
Did the saving work properly
save_vtk_streamlines

dipy.io.streamline.save_vtk_streamlines(streamlines, filename, to_lps=True, binary=False)

Save streamlines as vtk polydata to a supported format file.

Supported file formats are VTK and FIB.

Parameters
- streamlines : list
    list of 2D arrays or ArraySequence
- filename : string
    output filename (.vtk or .fib)
- to_lps : bool
    Defaults to True; follows the vtk file convention for streamlines (LPS), as supported by MITK Diffusion and MI-Brain.
- binary : bool
    save the file as binary
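A small sketch (this writer relies on a VTK-enabled installation); the streamlines are toy data and the output file name is a placeholder:

    import numpy as np
    from dipy.io.streamline import save_vtk_streamlines

    streamlines = [np.array([[0., 0., 0.], [1., 1., 1.]]),
                   np.array([[0., 1., 0.], [1., 2., 0.], [2., 3., 0.]])]

    # '.vtk' and '.fib' are both accepted; binary=True produces smaller files
    save_vtk_streamlines(streamlines, 'streamlines.vtk', to_lps=True, binary=False)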
Nifti1Image
¶
-
class
dipy.io.utils.
Nifti1Image
(dataobj, affine, header=None, extra=None, file_map=None)¶ Bases:
nibabel.nifti1.Nifti1Pair
Class for single file NIfTI1 format image
- Attributes
- affine
- dataobj
- header
in_memory
True when any array data is in memory cache
- ndim
- shape
slicer
Slicer object that returns cropped and subsampled images
Methods
ImageArrayProxy: alias of nibabel.arrayproxy.ArrayProxy
ImageSlicer: alias of nibabel.spatialimages.SpatialFirstSlicer
as_reoriented(ornt): Apply an orientation change and return a new image
filespec_to_file_map(filespec): Make file_map for this class from filename filespec
filespec_to_files(filespec): filespec_to_files class method is deprecated.
from_file_map(file_map[, mmap, keep_file_open]): class method to create image from mapping in file_map
from_filename(filename[, mmap, keep_file_open]): class method to create image from filename filename
from_files(file_map): from_files class method is deprecated.
from_image(img): Class method to create new instance of own class from img
get_affine(): Get affine from image
get_data([caching]): Return image data from image with any necessary scaling applied
get_fdata([caching, dtype]): Return floating point image data with necessary scaling applied
get_filename(): Fetch the image filename
get_header(): Get header from image
get_qform([coded]): Return 4x4 affine matrix from qform parameters in header
get_sform([coded]): Return 4x4 affine matrix from sform parameters in header
get_shape(): Return shape for image
header_class: alias of Nifti1Header
instance_to_filename(img, filename): Save img in our own format, to name implied by filename
load(filename[, mmap, keep_file_open]): class method to create image from filename filename
make_file_map([mapping]): Class method to make files holder for this image type
orthoview(): Plot the image using OrthoSlicer3D
path_maybe_image(filename[, sniff, sniff_max]): Return True if filename may be image matching this class
set_filename(filename): Sets the files in the object from a given filename
set_qform(affine[, code, strip_shears]): Set qform header values from 4x4 affine
set_sform(affine[, code]): Set sform transform from 4x4 affine
to_file_map([file_map]): Write image to file_map or contained self.file_map
to_filename(filename): Write image to files implied by filename string
to_files([file_map]): to_files method is deprecated.
to_filespec(filename): to_filespec method is deprecated.
uncache(): Delete any cached read of data from proxied data
update_header(): Harmonize header with image data and affine
get_data_dtype
set_data_dtype
- __init__(dataobj, affine, header=None, extra=None, file_map=None)¶ Initialize image
The image is a combination of (array-like, affine matrix, header), with optional metadata in extra, and filename / file-like objects contained in the file_map mapping.
- Parameters
- dataobj : object
Object containing image data. It should be some object that returns an array from np.asanyarray. It should have a shape attribute or property.
- affine : None or (4,4) array-like
homogeneous affine giving the relationship between voxel coordinates and world coordinates. Affine can also be None. In this case, obj.affine also returns None, and the affine as written to disk will depend on the file format.
- header : None or mapping or header instance, optional
metadata for this image format
- extra : None or mapping, optional
metadata to associate with the image that cannot be stored in the metadata of this image type
- file_map : mapping, optional
mapping giving file information for this image format
Notes
If both a header and an affine are specified, and the affine does not match the affine that is in the header, the affine will be used, but the sform_code and qform_code fields in the header will be re-initialised to their default values. This is performed on the basis that, if you are changing the affine, you are likely to be changing the space to which the affine is pointing. The set_sform() and set_qform() methods can be used to update the codes after an image has been created - see those methods, and the manual for more details.
- files_types = (('image', '.nii'),)¶
- header_class¶ alias of Nifti1Header
- update_header()¶ Harmonize header with image data and affine
- valid_exts = ('.nii',)¶
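Example (a minimal sketch of constructing and saving an image; the data shape and the filename example.nii are illustrative):
# Minimal sketch: build a Nifti1Image from an array and an identity affine,
# then write it to disk.
import numpy as np
from dipy.io.utils import Nifti1Image

data = np.zeros((64, 64, 40), dtype=np.float32)
img = Nifti1Image(data, affine=np.eye(4))
print(img.shape)          # (64, 64, 40)
img.to_filename("example.nii")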
create_nifti_header¶
- dipy.io.utils.create_nifti_header(affine, dimensions, voxel_sizes)¶
Write a standard nifti header from spatial attributes
create_tractogram_header¶
- dipy.io.utils.create_tractogram_header(tractogram_type, affine, dimensions, voxel_sizes, voxel_order)¶
Write a standard trk/tck header from spatial attributes
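Example (a minimal sketch for create_nifti_header; the affine, dimensions and voxel sizes are illustrative values, not data from a real acquisition):
# Minimal sketch: build a NIfTI header from illustrative spatial attributes.
import numpy as np
from dipy.io.utils import create_nifti_header

affine = np.eye(4)
dimensions = (64, 64, 40)
voxel_sizes = (2.0, 2.0, 2.0)
header = create_nifti_header(affine, dimensions, voxel_sizes)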
decfa¶
- dipy.io.utils.decfa(img_orig, scale=False)¶
Create a nifti-compliant directional-encoded color FA image.
- Parameters
- img_orig : Nifti1Image class instance
Contains encoding of the DEC FA image with a 4D volume of data, where the elements on the last dimension represent R, G and B components.
- scale : bool
Whether to scale the incoming data from the 0-1 to the 0-255 range expected in the output.
- Returns
- img : Nifti1Image class instance with dtype set to store tuples of uint8 in (R, G, B) order.
Notes
For a description of this format, see:
https://nifti.nimh.nih.gov/nifti-1/documentation/nifti1fields/nifti1fields_pages/datatype.html
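Example (a minimal sketch, assuming a 4D image whose last axis holds R, G, B values in the 0-1 range; the shape and random data are illustrative only):
# Minimal sketch: wrap a float RGB FA volume (values in 0-1) as a
# directional-encoded color FA image.
import numpy as np
from dipy.io.utils import Nifti1Image, decfa

rgb = np.random.rand(64, 64, 40, 3).astype(np.float32)
img = Nifti1Image(rgb, affine=np.eye(4))
dec_img = decfa(img, scale=True)   # uint8 (R, G, B) tuples per voxel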
decfa_to_float¶
- dipy.io.utils.decfa_to_float(img_orig)¶
Convert a nifti-compliant directional-encoded color FA image into a nifti image with RGB encoded in floating point resolution.
- Parameters
- img_orig : Nifti1Image class instance
Contains encoding of the DEC FA image with a 3D volume of data, where each element is a (R, G, B) tuple in uint8.
- Returns
- img : Nifti1Image class instance with float dtype.
Notes
For a description of this format, see:
https://nifti.nimh.nih.gov/nifti-1/documentation/nifti1fields/nifti1fields_pages/datatype.html
detect_format¶
- dipy.io.utils.detect_format(fileobj)¶
Returns the StreamlinesFile object guessed from the file-like object.
- Parameters
- fileobj : string or file-like object
If string, a filename; otherwise an open file-like object pointing to a tractogram file (and ready to read from the beginning of the header)
- Returns
- tractogram_file : TractogramFile class
The class type guessed from the content of fileobj.
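Example (a minimal sketch; the filename bundle.trk is illustrative):
# Minimal sketch: guess the tractogram file class from a filename.
from dipy.io.utils import detect_format

tractogram_class = detect_format("bundle.trk")
print(tractogram_class)   # e.g. the TrkFile class for a .trk path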
get_reference_info¶
- dipy.io.utils.get_reference_info(reference)¶
Extract the spatial attributes (affine, dimensions, voxel sizes, voxel order) from a reference
- Parameters
- reference : Nifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or trk.header (dict)
Reference that provides the spatial attributes.
- Returns
- output : tuple
affine : ndarray (4,4), np.float32, transformation of VOX to RASMM
dimensions : list (3), int, volume shape for each axis
voxel_sizes : list (3), float, size of voxel for each axis
voxel_order : string, typically 'RAS' or 'LPS'
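Example (a minimal sketch; the reference filename t1.nii.gz is illustrative):
# Minimal sketch: read spatial attributes from a reference image.
from dipy.io.utils import get_reference_info

affine, dimensions, voxel_sizes, voxel_order = get_reference_info("t1.nii.gz")
print(dimensions, voxel_sizes, voxel_order)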
is_header_compatible¶
- dipy.io.utils.is_header_compatible(reference_1, reference_2)¶
Will compare the spatial attributes of 2 references
- Parameters
- reference_1 : Nifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or trk.header (dict)
Reference that provides the spatial attributes.
- reference_2 : Nifti or Trk filename, Nifti1Image or TrkFile, Nifti1Header or trk.header (dict)
Reference that provides the spatial attributes.
- Returns
- output : bool
Whether all the spatial attributes match
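Example (a minimal sketch; the filenames bundle.trk and t1.nii.gz are illustrative):
# Minimal sketch: check that a tractogram and an anatomical image share the
# same spatial attributes before mixing them.
from dipy.io.utils import is_header_compatible

if is_header_compatible("bundle.trk", "t1.nii.gz"):
    print("References share the same spatial attributes")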
make5d¶
- dipy.io.utils.make5d(input)¶
Reshapes the input to have 5 dimensions, adding extra dimensions just before the last dimension
nifti1_symmat¶
- dipy.io.utils.nifti1_symmat(image_data, *args, **kwargs)¶
Returns a Nifti1Image with a symmetric matrix intent
- Parameters
- image_data : array-like
should have the lower triangular elements of a symmetric matrix along the last dimension
- all other arguments and keywords are passed to Nifti1Image
- Returns
- image : Nifti1Image
5d, extra dimensions added before the last. Has symmetric matrix intent code
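Example (a minimal sketch storing six lower-triangular tensor elements per voxel; the shape, affine and filename are illustrative):
# Minimal sketch: wrap per-voxel lower-triangular tensor elements in a
# Nifti1Image carrying the symmetric matrix intent code.
import numpy as np
from dipy.io.utils import nifti1_symmat

lower_tri = np.zeros((64, 64, 40, 6), dtype=np.float32)  # 6 elements of a 3x3 symmetric matrix
img = nifti1_symmat(lower_tri, affine=np.eye(4))
img.to_filename("tensors.nii")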
load_polydata¶
- dipy.io.vtk.load_polydata(file_name)¶
Load a vtk polydata from a supported file format.
Supported file formats are OBJ, VTK, FIB, PLY, STL and XML
- Parameters
- file_name : string
- Returns
- output : vtkPolyData
load_vtk_streamlines¶
- dipy.io.vtk.load_vtk_streamlines(filename, to_lps=True)¶
Load streamlines from vtk polydata.
Supported load formats are VTK and FIB.
- Parameters
- filename : string
input filename (.vtk or .fib)
- to_lps : bool
Defaults to True; follows the VTK file convention for streamlines (LPS), which is supported by MITK Diffusion and MI-Brain.
- Returns
- output : list
list of 2D arrays
optional_package¶
- dipy.io.vtk.optional_package(name, trip_msg=None)¶
Return package-like thing and module setup for package name
- Parameters
- name : str
package name
- trip_msg : None or str
message to give when someone tries to use the returned package, but we could not import it, and have returned a TripWire object instead. Default message if None.
- Returns
- pkg_like : module or TripWire instance
If we can import the package, return it. Otherwise return an object raising an error when accessed
- have_pkg : bool
True if import for package was successful, false otherwise
- module_setup : function
callable usually set as setup_module in calling namespace, to allow skipping tests.
Examples
Typical use would be something like this at the top of a module using an optional package:
>>> from dipy.utils.optpkg import optional_package
>>> pkg, have_pkg, setup_module = optional_package('not_a_package')
Of course in this case the package doesn’t exist, and so, in the module:
>>> have_pkg
False
and
>>> pkg.some_function() #doctest: +IGNORE_EXCEPTION_DETAIL
Traceback (most recent call last):
    ...
TripWireError: We need package not_a_package for these functions, but ``import not_a_package`` raised an ImportError
If the module does exist - we get the module
>>> pkg, _, _ = optional_package('os')
>>> hasattr(pkg, 'path')
True
Or a submodule if that’s what we asked for
>>> subpkg, _, _ = optional_package('os.path')
>>> hasattr(subpkg, 'dirname')
True
save_polydata¶
- dipy.io.vtk.save_polydata(polydata, file_name, binary=False, color_array_name=None)¶
Save a vtk polydata to a supported format file.
Supported save formats are VTK, FIB, PLY, STL and XML.
- Parameters
- polydata : vtkPolyData
- file_name : string
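Example (a minimal round-trip sketch; the filenames surface.ply and surface.vtk are illustrative, and VTK must be installed):
# Minimal sketch: read a polydata file and write it back out in another
# supported format.
from dipy.io.vtk import load_polydata, save_polydata

polydata = load_polydata("surface.ply")
save_polydata(polydata, "surface.vtk", binary=True)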
save_vtk_streamlines¶
- dipy.io.vtk.save_vtk_streamlines(streamlines, filename, to_lps=True, binary=False)¶
Save streamlines as vtk polydata to a supported format file.
Supported file formats are VTK and FIB.
- Parameters
- streamlines : list
list of 2D arrays or ArraySequence
- filename : string
output filename (.vtk or .fib)
- to_lps : bool
Defaults to True; follows the VTK file convention for streamlines (LPS), which is supported by MITK Diffusion and MI-Brain.
- binary : bool
save the file as binary
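Example (a minimal sketch; the random streamlines and the filename bundle.fib are illustrative, and VTK must be installed):
# Minimal sketch: write a small list of streamlines (each an N x 3 point
# array) to a .fib file.
import numpy as np
from dipy.io.vtk import save_vtk_streamlines

streamlines = [np.random.rand(20, 3).astype(np.float32) for _ in range(5)]
save_vtk_streamlines(streamlines, "bundle.fib", to_lps=True, binary=True)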
transform_streamlines¶
- dipy.io.vtk.transform_streamlines(streamlines, mat, in_place=False)¶
Apply an affine transformation to streamlines
- Parameters
- streamlines : Streamlines
Streamlines object
- mat : array, (4, 4)
transformation matrix
- in_place : bool
If True, the data are changed in place. Be careful: this modifies the input streamlines.
- Returns
- new_streamlines : Streamlines
Sequence of transformed 2D ndarrays with shape[-1] == 3
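Example (a minimal sketch applying a simple translation; the random data are illustrative, and the Streamlines container is assumed to be importable from dipy.tracking.streamline):
# Minimal sketch: translate streamlines by a fixed offset using a 4x4 affine.
import numpy as np
from dipy.tracking.streamline import Streamlines
from dipy.io.vtk import transform_streamlines

streamlines = Streamlines([np.random.rand(20, 3).astype(np.float32) for _ in range(5)])
mat = np.eye(4)
mat[:3, 3] = [10.0, 0.0, 0.0]   # shift 10 units along the first axis
moved = transform_streamlines(streamlines, mat)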