gms | German Medical Science

Artificial Vision 2017

The International Symposium on Visual Prosthetics

01.12. - 02.12.2017, Aachen

An encoding model of visual and electrical stimuli in rat lateral geniculate nucleus: A deep learning approach

Meeting Abstract

  • Seif Eldawlatly - Computer and Systems Engineering Department, Faculty of Engineering, Ain Shams University, Cairo, Egypt
  • E. Mounir - Computer and Systems Engineering Department, Faculty of Engineering, Ain Shams University, Cairo, Egypt
  • B. Abdullah - Computer and Systems Engineering Department, Faculty of Engineering, Ain Shams University, Cairo, Egypt
  • H. M. K. Mahdi - Computer and Systems Engineering Department, Faculty of Engineering, Ain Shams University, Cairo, Egypt

Artificial Vision 2017. Aachen, 01.-02.12.2017. Düsseldorf: German Medical Science GMS Publishing House; 2017. Doc17artvis18

doi: 10.3205/17artvis18, urn:nbn:de:0183-17artvis182

Published: November 30, 2017

© 2017 Eldawlatly et al.
This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 License. See license information at http://creativecommons.org/licenses/by/4.0/.


Text

Objective: To develop an encoding model that predicts the activity of rat lateral geniculate nucleus (LGN) neurons in response to visual and electrical stimulation.

Materials and Methods: We recorded the activity of right LGN neurons in anesthetized female albino rats using microelectrode arrays during visual (7 rats) and electrical (3 rats) stimulation. For visual stimulation, a screen placed tangent to the visual field of the left eye was divided into a 4 × 8 grid of pixels, each of which was flashed 100 times. For electrical stimulation, we used a pulse train pattern of 5 biphasic pulses, applied 50 times per stimulation channel. On average, 13 neurons per rat were identified for visual stimulation and 8.6 for electrical stimulation. The extracted firing rates and the corresponding stimulation patterns were used to build a deep Convolutional Neural Network (CNN) encoding model on 80% of the data; the model was then used to predict the firing rates in the remaining 20%.
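
The abstract does not specify the network architecture. The sketch below is a minimal, hypothetical illustration of such a stimulus-to-firing-rate CNN encoder in Python (Keras), assuming a 4 × 8 stimulus frame as input and one output unit per recorded neuron; the synthetic data, layer sizes, and training settings are illustrative stand-ins, not the authors' actual model.

    import numpy as np
    import tensorflow as tf
    from tensorflow.keras import layers, models

    # Synthetic stand-in data: 1000 trials of a 4 x 8 binary stimulus frame
    # and firing rates for 13 simultaneously recorded neurons (hypothetical
    # shapes chosen to match the setup described in the abstract).
    n_trials, n_neurons = 1000, 13
    X = np.random.randint(0, 2, size=(n_trials, 4, 8, 1)).astype("float32")
    y = np.random.rand(n_trials, n_neurons).astype("float32")

    # 80/20 train/test split, as in the abstract.
    split = int(0.8 * n_trials)
    X_train, X_test = X[:split], X[split:]
    y_train, y_test = y[:split], y[split:]

    # A small convolutional encoder: stimulus frame in, firing rates out.
    model = models.Sequential([
        layers.Input(shape=(4, 8, 1)),
        layers.Conv2D(16, kernel_size=3, padding="same", activation="relu"),
        layers.Conv2D(32, kernel_size=3, padding="same", activation="relu"),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(n_neurons, activation="relu"),  # non-negative rates
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X_train, y_train, epochs=10, batch_size=32, verbose=0)
    y_pred = model.predict(X_test)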

Results: For the visual encoding, the mean correlation between the predicted and actual firing rates across all rats was 0.5 (maximum 0.79) for 10 ms firing rate windows and 0.65 (maximum 0.86) for 50 ms windows. For the electrical encoding, the mean correlation was 0.22 (maximum 0.42) and 0.59 (maximum 0.69) for 10 ms and 50 ms windows, respectively.
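
The abstract reports correlations without naming the statistic; a plausible reading is the Pearson correlation between predicted and actual rates binned at 10 ms or 50 ms. The sketch below illustrates this computation on hypothetical data; binned_rates and the synthetic spike train are assumptions for the example only.

    import numpy as np
    from scipy.stats import pearsonr

    def binned_rates(spike_times, t_stop, bin_ms):
        """Spike counts in fixed windows, converted to rates (spikes/s)."""
        edges = np.arange(0.0, t_stop + 1e-9, bin_ms / 1000.0)
        counts, _ = np.histogram(spike_times, bins=edges)
        return counts / (bin_ms / 1000.0)

    # Hypothetical example: one neuron, 2 s of activity.
    rng = np.random.default_rng(0)
    spikes = np.sort(rng.uniform(0.0, 2.0, size=60))
    actual = binned_rates(spikes, t_stop=2.0, bin_ms=50)
    predicted = actual + rng.normal(0.0, 5.0, size=actual.shape)  # stand-in prediction

    r, _ = pearsonr(predicted, actual)
    print(f"correlation at 50 ms bins: {r:.2f}")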

Discussion: Our results demonstrate the efficacy of deep CNNs in predicting the responses of LGN neurons to visual and electrical stimulation; including the stimulation and firing history further enhances the performance of the model.
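
How the stimulation and firing history enter the model is not detailed in the abstract. One common construction, sketched below as an assumption, stacks the k preceding stimulus frames as extra input channels and appends the k preceding firing-rate vectors as an auxiliary feature; the function name add_history and all shapes are hypothetical.

    import numpy as np

    def add_history(stim, rates, k):
        """Stack the current stimulus frame with the k preceding frames
        (as channels) and collect the k preceding firing-rate vectors as
        an auxiliary input. Purely illustrative of feeding stimulation
        and firing history to the encoder."""
        n = stim.shape[0]
        frames, rate_hist, targets = [], [], []
        for t in range(k, n):
            frames.append(np.stack([stim[t - j] for j in range(k + 1)], axis=-1))
            rate_hist.append(rates[t - k:t].ravel())
            targets.append(rates[t])
        return np.array(frames), np.array(rate_hist), np.array(targets)

    # Hypothetical shapes: 500 time steps, 4 x 8 stimulus, 13 neurons, k = 3.
    stim = np.random.randint(0, 2, size=(500, 4, 8)).astype("float32")
    rates = np.random.rand(500, 13).astype("float32")
    X_frames, X_hist, y = add_history(stim, rates, k=3)
    print(X_frames.shape, X_hist.shape, y.shape)  # (497, 4, 8, 4) (497, 39) (497, 13)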

Acknowledgement: This work was supported by the Science and Technology Development Fund (STDF), grant 5168.