An encoding model of visual and electrical stimuli in rat lateral geniculate nucleus: A deep learning approach
Published: November 30, 2017
Objective: Develop an encoding model that can predict the activity of rat lateral geniculate nucleus (LGN) neurons in response to visual and electrical stimulation.
Materials and Methods: We recorded the activity of right LGN neurons in anesthetized female albino rats using microelectrode arrays during visual (7 rats) and electrical (3 rats) stimulation. For visual stimulation, a screen divided into 4 x 8 pixels was placed tangent to the visual field of the left eye; each pixel was flashed 100 times. For electrical stimulation, we used a pulse train pattern of 5 biphasic pulses applied 50 times per stimulation channel. On average, 13 and 8.6 neurons per rat were identified for visual and electrical stimulation, respectively. Extracted firing rates and the corresponding stimulation patterns were used to build a deep Convolutional Neural Network (CNN) encoding model using 80% of the data. We then used the model to predict the firing rates in the remaining 20% of the data.
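The abstract does not include the data pipeline; a minimal numpy sketch of how the stimulus maps and the 80/20 split described above might be assembled (all shapes, names, and the simulated rates are illustrative assumptions, not the authors' code):

```python
import numpy as np

# Hypothetical dataset: 100 flashes per pixel on the 4 x 8 screen,
# with one binned firing-rate vector per trial for 13 recorded neurons.
rng = np.random.default_rng(1)
n_trials = 4 * 8 * 100                       # 3200 flash presentations
stimuli = np.zeros((n_trials, 4, 8))         # one-hot pixel maps, CNN input
rates = rng.poisson(5, size=(n_trials, 13))  # simulated neuron responses

pixels = rng.integers(0, 32, size=n_trials)  # which pixel flashed each trial
stimuli[np.arange(n_trials), pixels // 8, pixels % 8] = 1.0

# 80/20 split: fit the encoding model on 80%, predict on the held-out 20%
split = int(0.8 * n_trials)
order = rng.permutation(n_trials)
train_idx, test_idx = order[:split], order[split:]
X_train, X_test = stimuli[train_idx], stimuli[test_idx]
y_train, y_test = rates[train_idx], rates[test_idx]
```

The CNN itself would then map each `X` stimulus map (plus stimulation and firing history, per the Discussion) to the corresponding `y` firing-rate vector.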
Results: For visual encoding, the mean correlation between predicted and actual firing rates across all rats was 0.5 (maximum 0.79) and 0.65 (maximum 0.86) for 10 ms and 50 ms firing rate windows, respectively. For electrical encoding, the mean correlation was 0.22 (maximum 0.42) and 0.59 (maximum 0.69) for 10 ms and 50 ms firing rate windows, respectively.
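The evaluation above bins spikes into fixed windows and correlates predicted with actual rates. A short numpy sketch of that metric under assumed inputs (spike times in milliseconds; the noisy "prediction" is synthetic, only to exercise the metric):

```python
import numpy as np

def bin_firing_rates(spike_times_ms, duration_ms, bin_ms):
    """Count spikes in consecutive windows of bin_ms milliseconds."""
    edges = np.arange(0, duration_ms + bin_ms, bin_ms)
    counts, _ = np.histogram(spike_times_ms, bins=edges)
    return counts

def encoding_correlation(predicted, actual):
    """Pearson correlation between predicted and actual binned rates."""
    return float(np.corrcoef(predicted, actual)[0, 1])

# Hypothetical 1 s recording, binned at the paper's two window sizes
rng = np.random.default_rng(0)
spikes = np.sort(rng.uniform(0, 1000, size=120))  # 120 spikes in 1000 ms
rates_10 = bin_firing_rates(spikes, 1000, 10)     # 100 bins of 10 ms
rates_50 = bin_firing_rates(spikes, 1000, 50)     # 20 bins of 50 ms

pred_50 = rates_50 + rng.normal(0, 1, size=rates_50.shape)
r = encoding_correlation(pred_50, rates_50)
```

Wider windows smooth the rate estimates, which is consistent with the higher correlations reported for 50 ms than for 10 ms bins.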
Discussion: Our results demonstrate the efficacy of deep CNNs in predicting the responses of LGN neurons to visual and electrical stimulation; including the stimulation and firing history enhances the performance of the model.
Acknowledgement: This work was supported by the Science and Technology Development Fund (STDF), grant 5168.