
Pupillometry deconvolution methods

This repository contains the code and manuscript for a methods paper on deconvolution of pupillometry data from auditory experiments.

McCloy D, Larson E, Lau B, & Lee AKC (2016). Temporal alignment of pupillary response with stimulus events via deconvolution. The Journal of the Acoustical Society of America, 139(3), EL57–EL62. doi: 10.1121/1.4943787

Raw data are cleaned and aggregated with analyze-data.py (for the pupil impulse response experiment) and analyze-voc-data.py (for the vocoded letters experiment). These scripts should produce the summary data files needed to generate the figures (avg_data.npz for Figure 1; voc_data.npz and voc_data_wierda.npz for Figure 3).
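As a quick sanity check that the preprocessing ran, the summary files can be inspected with NumPy. This is only a sketch: it assumes the .npz files sit in the repository root, and it lists the stored array names rather than assuming what they are.

```python
import numpy as np

# Summary files produced by analyze-data.py and analyze-voc-data.py
# (paths assume the repository root as the working directory).
summary_files = ["avg_data.npz", "voc_data.npz", "voc_data_wierda.npz"]

for fname in summary_files:
    with np.load(fname, allow_pickle=True) as archive:
        # The array names inside each archive are not documented here,
        # so just report whatever is present and its shape.
        print(fname)
        for key in archive.files:
            print(f"  {key}: shape {archive[key].shape}")
```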

Once those are in place, the makefile for the article accepts three targets: pre (a prepress version formatted similarly to the final JASA-EL layout), sub (the JASA-EL submittable version, with double spacing, line numbers, list of figures, etc.), and web (a pre-publication manuscript with my preferred formatting for posting on the web).
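For example (a usage sketch; the target names come from above, and the invocation assumes you run make in the directory containing the article's makefile):

```sh
make pre   # prepress version, formatted similarly to the final JASA-EL layout
make sub   # submittable version (double spacing, line numbers, list of figures)
make web   # pre-publication manuscript with web-oriented formatting
```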

NB: Figure generation is not fully automated for the make sub target, because the auto-generated PDF figures must be opened in Adobe Illustrator and saved as EPS files to avoid rasterization of the semi-transparent regions of the plots (strictly necessary only for Figure 3).
