Models of the functional relationship between dynamic sensory stimuli and neural activity form a foundation of research in sensory neuroscience. The advent of modern machine learning methods has introduced the possibility of new and more powerful models of sensory coding. Studies have shown that convolutional neural networks (CNNs) and related models can outperform traditional encoding models, in some cases by a substantial degree. In addition to standard applications describing feed-forward coding by single neurons, these methods can be adapted to multi-channel neural data and to characterization of behavioral state-dependent changes in coding. While potentially powerful, CNNs can be challenging to implement and interpret, especially without expertise in computational methods. We have developed the Neural Encoding Model System (NEMS) as a Python-based toolkit for fitting both traditional and machine learning models to sensory neurophysiology data.
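To make the "traditional encoding model" concrete, the sketch below implements a minimal linear-nonlinear (LN) model of the kind NEMS generalizes: a spectro-temporal receptive field (STRF) convolved with a stimulus spectrogram, followed by a rectifying output nonlinearity. All data here are synthetic toys, and the function names are illustrative, not part of any library API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stimulus "spectrogram": 18 frequency channels x 1000 time bins.
n_chan, n_time, n_lag = 18, 1000, 15
stim = rng.standard_normal((n_chan, n_time))

# A spectro-temporal receptive field: one weight per channel and time lag.
strf = rng.standard_normal((n_chan, n_lag)) * 0.1

def ln_predict(stim, strf):
    """Linear-nonlinear prediction: convolve the stimulus with the STRF,
    then rectify, since firing rates are non-negative."""
    n_chan, n_lag = strf.shape
    n_time = stim.shape[1]
    linear = np.zeros(n_time)
    for lag in range(n_lag):
        # Each lag weights a time-shifted copy of the stimulus.
        linear[lag:] += strf[:, lag] @ stim[:, :n_time - lag]
    return np.maximum(linear, 0.0)

rate = ln_predict(stim, strf)  # predicted firing rate, one value per time bin
```

A CNN encoding model replaces the single linear stage with a stack of learned convolutional layers, but the input/output structure (stimulus in, predicted rate out) is the same.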
NEMS was developed for use in the auditory system, but it can be applied to any system representing information about dynamic extrinsic signals. It employs a modular design that allows elements from traditional encoding models (linear filters, synaptic plasticity, gain control) to be incorporated into artificial neural network models, with broad flexibility in the choice of fitting algorithm. Models can be fit using either scipy- or TensorFlow-based backends. A scripting system allows scaling to large datasets and compute clusters. The system also streamlines direct, quantitative comparison of a large family of models on the same dataset and characterization of the functional equivalence of different model architectures.
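The modular idea above can be sketched as follows: model stages are composable functions, and the whole chain is fit end-to-end with a scipy optimizer, so the same pipeline could swap in a different backend or module set. This is a minimal illustration under stated assumptions, not the NEMS API; the module and function names here are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Toy data: 4-channel stimulus and a target response generated from
# known weights plus noise, so the fit has a recoverable answer.
stim = rng.standard_normal((4, 500))
true_w = np.array([0.5, -0.3, 0.8, 0.1])
resp = np.maximum(true_w @ stim, 0.0) + 0.05 * rng.standard_normal(500)

# Two "modules" in the spirit of a modular encoding toolkit:
def linear_filter(x, w):
    return w @ x

def rectifier(x):
    return np.maximum(x, 0.0)

def model_predict(params, stim):
    # Modules are chained: the filter output feeds the nonlinearity.
    return rectifier(linear_filter(stim, params))

def mse(params, stim, resp):
    return np.mean((model_predict(params, stim) - resp) ** 2)

# Fit the composed model with a scipy-backend-style optimizer.
fit = minimize(mse, x0=np.zeros(4), args=(stim, resp), method="L-BFGS-B")
```

Because every model reduces to a parameter vector and a loss, alternative architectures can be fit to the same dataset and compared directly on prediction error, which is the kind of quantitative model comparison the abstract describes.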
Data types: calcium imaging, single unit, EEG
Sampling frequency: 0.1 Hz to 1 kHz
Time scale: tens of seconds to hours
Modality/area: auditory system; can be adapted to other sensory and motor systems.