High-accuracy Decoding of Complex Visual Scenes from Discrete Neurons Using Neural Networks
Randall J. Ellis and Michael Michaelides
Received Date: 16th August 2018
Neuronal diversity contributes to sensory encoding and decoding, active research areas in neuroscience, neuroprosthetics, and artificial intelligence. Deciphering this contribution necessitates measuring brain physiology in genetically-defined neurons during the presentation of discrete sensory cues. Neural networks are a powerful technique for formulating hierarchical representations of data using layers of nonlinear transformations. Here we leverage a collection of neuronal activity data, derived from ~25,000 genetically-defined neurons of the parcellated mouse visual cortex during the presentation of 118 naturalistic scenes, to demonstrate that neural networks can decode visual scenes from neuronal calcium responses with high (~96%) accuracy. Our findings reveal a neuroanatomical map of visual decoding strength traversing brain regions, cortical layers, neuron types, and time. They also demonstrate the utility of feature selection in assigning contributions of neuronal diversity to visual decoding accuracy, and show that little network architecture complexity is required for high-accuracy decoding in this experimental context.
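To make the decoding setup concrete, the sketch below illustrates the general idea of classifying scene identity from per-neuron responses. It is a minimal, hypothetical example on synthetic data (all sizes, noise levels, and the single-layer softmax classifier are assumptions for illustration, not the study's actual data, preprocessing, or architecture):

```python
import numpy as np

# Hypothetical illustration of scene decoding from neuronal responses.
# Synthetic stand-in data: each "scene" evokes a characteristic mean
# response per neuron; individual trials add Gaussian noise.
rng = np.random.default_rng(0)
n_neurons, n_scenes, trials = 200, 10, 30

templates = rng.normal(0.0, 1.0, size=(n_scenes, n_neurons))
X = np.repeat(templates, trials, axis=0) \
    + rng.normal(0.0, 0.5, size=(n_scenes * trials, n_neurons))
y = np.repeat(np.arange(n_scenes), trials)

# Shuffle, then split into train and test sets.
idx = rng.permutation(len(y))
X, y = X[idx], y[idx]
split = int(0.8 * len(y))
Xtr, ytr, Xte, yte = X[:split], y[:split], X[split:], y[split:]

# Single-layer softmax classifier trained by batch gradient descent.
W = np.zeros((n_neurons, n_scenes))
b = np.zeros(n_scenes)
onehot = np.eye(n_scenes)[ytr]
for _ in range(300):
    logits = Xtr @ W + b
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)
    grad = (p - onehot) / len(ytr)               # cross-entropy gradient
    W -= 0.5 * (Xtr.T @ grad)
    b -= 0.5 * grad.sum(axis=0)

acc = float(np.mean((Xte @ W + b).argmax(axis=1) == yte))
print(f"test accuracy: {acc:.2f}")
```

When responses are this well separated even a linear readout decodes scene identity far above the 1/10 chance level, which echoes the abstract's observation that high-accuracy decoding did not require complex network architectures.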
Read in full at bioRxiv.
This is an abstract of a preprint hosted on an independent third-party site. It has not been peer reviewed but is currently under consideration at Nature Communications.