Example no. 1
import warnings

import matplotlib.pyplot as plt
import nibabel as nib

def show_nifti(image_path_or_image, colormap='gray'):
    try:
        from niwidgets import NiftiWidget
        with warnings.catch_warnings():
            warnings.filterwarnings("ignore", category=FutureWarning)
            widget = NiftiWidget(image_path_or_image)
            widget.nifti_plotter(colormap=colormap)

    except Exception:
        # fall back to a static middle-slice plot if the widget fails
        if isinstance(image_path_or_image, nib.AnalyzeImage):
            nii = image_path_or_image
        else:
            image_path = image_path_or_image
            nii = nib.load(str(image_path))
        k = nii.shape[-1] // 2  # middle slice along the last axis
        plt.imshow(nii.dataobj[..., k], cmap=colormap)
        plt.show()
Example no. 2
    def __init__(self, path_image='path'):
        from niwidgets import NiftiWidget
        import nilearn.plotting as nip
        my_widget = NiftiWidget(path_image)
        my_widget.nifti_plotter(plotting_func=nip.plot_glass_brain)
Example no. 3
import os
from itertools import islice

import matplotlib.pyplot as plt
import nilearn.plotting as nip
import numpy as np
from ipywidgets import fixed
from niwidgets import NiftiWidget

get_ipython().run_line_magic('matplotlib', 'inline')

nhw = os.environ['NHW']

filename = os.path.join(nhw, 'data', 'subjA', 'subjA_t0_pet_mni.nii.gz')
filename_4D = os.path.join(nhw, 'data', 'subjA', 'subjA_4D_pet_mni.nii.gz')

# Here's the default niwidget:

# In[2]:

my_widget = NiftiWidget(filename)
my_widget.nifti_plotter()

# It is possible to specify your own plotting function rather than use the default plotter in niwidgets. All arguments following the `plotting_func` argument are passed on to the plotting function as interactive parameters (unless you wrap the argument in `ipywidgets`' `fixed()` function). `colormap` or `cmap` is always passed as an interactive parameter, with colormap options obtained from matplotlib within niwidgets.

# In[3]:

my_widget.nifti_plotter(plotting_func=nip.plot_glass_brain,
                        threshold=1.,
                        display_mode=fixed("lyrz"))

# ## Extending `niwidgets` functionality to 4D images

# The idea is to write a function to plot a single 3D volume from a 4D volume. This function needs to take as input the index along the 4th dimension to be plotted. We can then use niwidgets with our custom plotting function and specify the index along 4th dimension as an interactive parameter.
#
# Below is a function to plot a given time point on the glass brain from a 4D volume:
Example no. 4
from niwidgets import NiftiWidget

def test_niftiwidget():
    # example_t1: path to a sample T1-weighted image, defined elsewhere
    NiftiWidget(example_t1)
Example no. 5
# -*- coding: utf-8 -*-

from logging import getLogger

import ipyvolume as ipv
from niwidgets import NiftiWidget
import numpy as np

logger = getLogger(__name__)

if __name__ == '__main__':
    ddd = np.random.random((100, 100, 100))
    ipv.quickvolshow(ddd)

    fname = 'path/to/image.nii.gz'  # placeholder: point this at a real NIfTI file
    my_widget = NiftiWidget(fname)
    my_widget.nifti_plotter()
Example no. 6
import os
import subprocess

import nibabel as nib
from niwidgets import NiftiWidget

# the notebook was launched from within the repo directory
gitRepoPath = subprocess.check_output(['git', 'rev-parse', '--show-toplevel'
                                       ]).decode('ascii').strip()

# move to the top of the directory
os.chdir(gitRepoPath)

# establish the path for the new NIfTI file
newAtlasPath = os.path.join(gitRepoPath, 'exampleData',
                            'renumberedAtlas.nii.gz')

# store the modified atlas data in a NIfTI object
# (relabeledAtlas and atlasImg are computed in earlier cells)
renumberedAtlasNifti = nib.Nifti1Image(relabeledAtlas, atlasImg.affine,
                                       atlasImg.header)
# save the object to disk
nib.save(renumberedAtlasNifti, newAtlasPath)

# plot it
atlas_widget = NiftiWidget(newAtlasPath)
atlas_widget.nifti_plotter(colormap='nipy_spectral')

# As you use the above sliders to move through the parcellation, there are several things to take note of.  First, and perhaps most striking, while the sagittal view may be appropriately labeled, it is oriented in a somewhat disorienting fashion (typically we would expect it to be oriented such that the left and right axes of the plot correspond to the anterior-posterior axis of the brain).  The coronal and axial slices are switched, but even switching them back wouldn't be a complete solution, as the mislabeled axial slice appears to be upside down.  These observations combined suggest that the affine matrix is either incorrect or is being interpreted incorrectly.  We'll consider this issue a bit more in the next chapter when we briefly look at affines for NIfTI images.  For now though, we can consider some other features of the visualization.
#
# Another feature of the visualization we can note is the qualitative fashion in which the parcellation actually "parcellates" (divides into pieces) the brain.  The vibrant, "color-by-numbers" image we're seeing above quite literally illustrates the distinct numerical entries in the NIfTI data object.  Just as in the digital image parcellation case, each data structure element can hold a single integer value, and can thus be associated with one, and *only* one, label, and by extension, color.  Multiple integers (and thus assignments) simply *won't* fit in the same data entry in the parcellation data structure.
#
# As a general note that is particular to brain parcellations, the anatomical entities represented in a parcellation will vary depending on what method and/or "ontology" (the list of anatomical entities considered to "exist" for the purposes of a given parcellation) is applied.  For example, the brain regions represented in the parcellation presented above are derived from the ["Destrieux" 2009 atlas](https://dx.doi.org/10.1016%2Fj.neuroimage.2010.06.010).  The parcellated regions in this atlas (and the majority of other brain atlases, for that matter) correspond to brain regions which exhibit structural, functional, and/or microstructural homogeneities relative to their neighboring brain regions.
#
# Now that we have explored three-dimensional parcellations a bit, let's move on to a consideration of affine transforms and how they can be used to align parcellations with raw data.
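# The "one label per voxel" point above can be seen directly in a toy array (purely illustrative data, not the Destrieux atlas):

```python
import numpy as np

# a toy 3x3x3 "parcellation": every voxel holds exactly one integer label
parc = np.zeros((3, 3, 3), dtype=np.int32)
parc[0] = 1  # voxels assigned to region 1
parc[2] = 2  # voxels assigned to region 2; the middle slab stays background (0)
print(np.unique(parc))  # the distinct labels present: [0 1 2]
```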

# In[ ]:
Example no. 7
from niwidgets import NiftiWidget
from niwidgets.exampledata import examplet1

def test_creation():
    test_widget = NiftiWidget(examplet1)
Example no. 8
    def niwidgets_plot(self, **kwargs):
        widget = NiftiWidget(self.path)
        return widget.nifti_plotter(**kwargs)
Example no. 9
import os
import subprocess

# the notebook was launched from within the repo directory
gitRepoPath = subprocess.check_output(['git', 'rev-parse', '--show-toplevel'
                                       ]).decode('ascii').strip()

# move to the top of the directory
os.chdir(gitRepoPath)

import nibabel as nib

# establish the path to the parcellation
atlasPath = os.path.join(gitRepoPath, 'exampleData', 'parc.nii.gz')
# load it as an object
atlasImg = nib.load(atlasPath)

from niwidgets import NiftiWidget

# plot it
atlas_widget = NiftiWidget(atlasImg)
atlas_widget.nifti_plotter(colormap='nipy_spectral')

# In the above table we can get a sense of which labels have the most streamlines connecting one another.  There's a number of things to note and keep in mind about these results.
#
# - all streamlines are assigned
# - this assignment reveals that not all streamlines are biologically valid
#
# Given the assignment we've just performed, we can also selectively visualize sub-selections of streamlines based on which parcellation labels the streamline terminations are closest to.
#
# NOTE: not all labels share streamlines, and even amongst those that do, there may be only a few streamlines that meet the connectivity criterion.

# In[4]:

import matplotlib.pyplot as plt

# plot the distribution of positive voxel values
# (unwrappedData and largeBool are computed in earlier cells)
plt.subplot(1, 2, 2)
plt.hist(unwrappedData[largeBool], bins=400)
plt.xlabel('Voxel Data')
plt.ylabel('Number of Voxels')
plt.title('Distribution of voxel values greater than zero')

# It seems that about a quarter of the voxels contain data that we would consider to be representative of the brain, with the assumption that "empty" voxels correspond to background and/or uninformative voxels.  Depending on whether or not this T1 has had the brain "extracted" (i.e. the brain isolated from the rest of the head, neck, and body, via a masking process), this proportion may also include non-brain tissues.  Indeed, given that we took a look at this very same T1 image in the previous lesson, we know that this is the case and that the brain has been isolated from non-brain tissues.  Furthermore, note how we had to split the histogram in two.  Had we not done this, the number of empty voxels would have overwhelmed the visualization, and we wouldn't have been able to observe the distribution visible in the plot on the right due to the extreme number of values at or near 0.
#
# Now that we have a sense of the numerical variability of the data in this NIfTI, let's get a sense of how these values are laid out spatially.  Keep in mind that, just like a digital image wherein the i,j entry of the data array represents a portion of space that is spatially adjacent to the i,j-1 entry (or i,j+1, or i+1,j, etc.), the i,j,k entry of a NIfTI is spatially adjacent to the i,j,k+1 entry.  We can get a better sense of this by interacting with the NIfTI using a [niwidget](https://nipy.org/niwidgets/).
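# This index-to-space correspondence can be made concrete with a toy voxel-to-world affine (an identity affine here, purely for illustration):

```python
import numpy as np

affine = np.eye(4)  # toy voxel-to-world mapping (1 unit per voxel)
a = affine @ np.array([10, 20, 30, 1])  # world position of voxel (10, 20, 30)
b = affine @ np.array([10, 20, 31, 1])  # world position of its k+1 neighbor
# adjacent indices map to adjacent locations: one voxel-size unit apart
print(np.linalg.norm((a - b)[:3]))  # 1.0
```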

# In[11]:

from niwidgets import NiftiWidget

t1Widget = NiftiWidget(t1Path)

t1Widget.nifti_plotter(colormap='gray')

# Take a moment to shift through the NIfTI image plotted above.  If you want a more standard visualization, feel free to switch the colormap to gray.  **As a challenge, try to shift the x, y, and z sliders to cross at the posterior commissure, and take note of the coordinates of this point**.  Note that the left, posterior, inferior corner is the 0 coordinate.  The coordinate shift that results in the posterior commissure being at (0,0,0) occurs after the qoffset information discussed previously has been applied.  After this transform has been applied, locations in the left hemisphere are characterized by coordinates that have a negative first value (x coordinate).
#
# As an alternative to trying to find the posterior commissure manually, we can also use the information in the header (assuming it's accurate) to compute its location in this image data.

# In[12]:

print('T1 voxel resolution (in mm)')
print(img.header.get_zooms())
print('')

print('T1 voxel affine')
imgAff = img.affine
print(imgAff)
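# As a sketch of the computation described above (using an assumed 2 mm MNI-like affine rather than this image's actual header): mapping the world coordinate (0, 0, 0) mm back to voxel indices only requires the inverse of the affine.

```python
import numpy as np

# assumed 2 mm MNI-like voxel-to-world affine, for illustration only
affine = np.array([[2., 0., 0.,  -90.],
                   [0., 2., 0., -126.],
                   [0., 0., 2.,  -72.],
                   [0., 0., 0.,    1.]])

# world (0, 0, 0) mm in homogeneous coordinates, mapped back to voxel space
voxel = np.linalg.inv(affine) @ np.array([0., 0., 0., 1.])
print(voxel[:3])  # [45. 63. 36.]
```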