neurolearn

A Python toolbox for analyzing neuroimaging data. It is based on Tor Wager's object-oriented MATLAB CanLab Core Tools and relies heavily on nilearn and scikit-learn.

Current Tools

  • data.Brain_Data: class for working with 4D imaging data in Python
  • data.Brain_Data.predict: multivariate prediction
  • data.Brain_Data.similarity: calculate spatial similarity with another image
  • data.Brain_Data.distance: calculate spatial distance among a group of images
  • data.Brain_Data.regress: univariate regression
  • data.Brain_Data.ttest: univariate one-sample t-test
  • analysis.Roc: perform ROC analysis
  • pipelines.Couple_Preproc_Pipeline: preprocessing pipeline for multiband data
  • simulator.Simulator: class for simulating multivariate data
  • mask.create_sphere: create spherical masks
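As a concrete illustration of the kind of mask mask.create_sphere builds, here is a minimal pure-NumPy sketch that marks every voxel within a given radius of a center point. This is illustrative only; the helper name sphere_mask and the use of voxel (rather than millimeter) coordinates are assumptions for the sketch, not the nltools API.

```python
import numpy as np

def sphere_mask(shape, center, radius):
    """Boolean mask of all voxels within `radius` of `center` (voxel units)."""
    # Open grids broadcast to the full volume shape without allocating it densely
    grids = np.ogrid[tuple(slice(0, s) for s in shape)]
    dist_sq = sum((g - c) ** 2 for g, c in zip(grids, center))
    return dist_sq <= radius ** 2

mask = sphere_mask((64, 64, 36), center=(32, 32, 18), radius=5)
```

The mask can then be applied to a 3D data array with boolean indexing, e.g. `data[mask]`, to pull out the voxels inside the sphere.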

Installation

  1. Method 1 (pip)

    pip install git+https://github.com/ljchang/neurolearn
    
  2. Method 2 (from source)

    git clone https://github.com/ljchang/neurolearn
    cd neurolearn
    python setup.py install
    

Documentation

Current documentation can be found at readthedocs. Please see the Jupyter notebook examples for walkthroughs of how to use most of the toolbox.

Here is a Jupyter notebook with a detailed overview of how to use the main Brain_Data class. We also have a notebook covering other analysis methods such as prediction and ROC curves (note that it is now recommended to use the Brain_Data predict method).

Preprocessing

Here is an example preprocessing pipeline for multiband data. It uses nipype along with tools from SPM12 and FSL. Make sure that fsl, matlab, and dcm2nii are on your unix environment path; it may help to create a symbolic link somewhere common such as /usr/local/bin. This pipeline can also be run on a cluster; see the nipype workflow documentation. The nipype working folder is quite large because matlab requires unzipped .nii files; it can be deleted if space is an issue.

  • Uses Chris Rorden's dcm2nii to convert DICOM files to NIfTI
  • Uses Nipy's Trim to remove the first 10 volumes (i.e., disdaqs)
  • Uses FSL's topup to perform distortion correction (default is AP; the order of concatenation needs to be switched if PA is needed)
  • Uses SPM12 to realign to the mean image
  • Uses SPM12 to coregister the functional to the structural image
  • Uses SPM12's new nonlinear normalization routine
  • Uses SPM12 smoothing with a 6mm FWHM default
  • Uses the Artifact Detection Toolbox to detect scanner spikes
  • Uses the Nipype DataSink to write out key files to a new output directory under the subject's name
  • Will create a quick montage to check normalization
  • Will output a plot of the realignment parameters
  • Will output a covariate CSV file with a 24-parameter motion model: the 6 centered motion parameters, their squares, and the 12 derivatives (6 of the motion parameters and 6 of the squared parameters)
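The 24-column covariate file in the last step can be sketched with NumPy as follows. This is a sketch under the assumption that the columns are the 6 centered realignment parameters, their squares, and the temporal derivative of each; the exact column order written by the pipeline may differ.

```python
import numpy as np

# Hypothetical realignment parameters: n_volumes x 6
# (3 translations, 3 rotations); real values come from SPM realignment
rng = np.random.default_rng(0)
motion = rng.normal(scale=0.5, size=(200, 6))

centered = motion - motion.mean(axis=0)                    # 6 centered parameters
squared = centered ** 2                                    # 6 squared parameters
deriv = np.diff(centered, axis=0, prepend=centered[:1])    # 6 temporal derivatives
sq_deriv = np.diff(squared, axis=0, prepend=squared[:1])   # 6 derivatives of squares

covariates = np.hstack([centered, squared, deriv, sq_deriv])
print(covariates.shape)  # (200, 24)
```

Prepending the first row before differencing keeps the derivative columns the same length as the run, with zeros in the first volume.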

Here is an example script.

from nltools.pipelines import Couple_Preproc_Pipeline
import os

base_dir = '/Users/lukechang/Dropbox/Couple_Conflict/Data/Scanner'
spm_path = '/Users/lukechang/Resources/spm12/'
output_dir = '/Users/lukechang/Dropbox/Couple_Conflict/Data/Imaging'

# Get subject ID
subject_list = os.listdir(base_dir)
subject_id = subject_list[1]

# Build the pipeline
wf = Couple_Preproc_Pipeline(base_dir=base_dir, output_dir=output_dir,
                             subject_id=subject_id, spm_path=spm_path)
# wf.run('MultiProc', plugin_args={'n_procs': 8})  # Runs the pipeline in parallel (8 cores)
wf.write_graph(dotfilename=os.path.join(output_dir, 'Workflow_Pipeline.dot'), format='png')
wf.run()

(Image: workflow graph of the pipeline.)
