CS Events

PhD Defense

Non-neuronal Computational Principles for Increased Performance in Brain-inspired Networks

 


Friday, July 15, 2022, 10:00am - 12:00pm

 

Speaker: Vladimir Ivanov

Location: Virtual

Committee

Dr. Konstantinos Michmizos (Chair)

Dr. Casimir Kulikowski

Dr. Dimitris Metaxas

Dr. Constantinos Siettos (University of Naples Federico II, Italy)

Event Type: PhD Defense

Abstract: Understanding the brain’s computational principles holds the potential to lead to human-level machine learning and intelligence. Yet, current brain-inspired spiking neural network (SNN) methods assume that only neurons form the basis of computation in the brain, which contradicts increasing biological evidence associating the function of ubiquitous non-neuronal cells, known as astrocytes, with brain cognitive states, memory, and learning. Interestingly, these information-processing functions are closely linked to near-critical neuronal network dynamics, which are believed to underlie the brain’s impressive computational abilities. Inspired by this emerging evidence, in this talk I will present our efforts to understand the computational role of astrocytes in the brain and to translate our findings into brain-inspired neuron-astrocyte spiking networks. First, I will present our biologically plausible computational modeling approach, through which we propose that astrocytes learn to detect and signal deviations of brain activity away from critical dynamics. Further, we suggest that astrocytes achieve this function by approximating the information content of their neuronal inputs. Translating these findings into a well-established brain-inspired machine learning paradigm known as the liquid state machine (LSM), I will next present our proposed neuron-astrocyte liquid state machine (NALSM). Specifically, the NALSM uses a single astrocyte to organize recurrent (liquid) network dynamics at the critical regime, where the LSM’s computational capacity peaks. We demonstrate that the NALSM achieves state-of-the-art LSM accuracy without the data-specific hand-tuning that comparable LSM methods require. Further, we demonstrate that the NALSM achieves performance comparable to fully connected multilayer spiking neural networks trained via backpropagation, with a top accuracy of 97.61% on MNIST, 97.51% on N-MNIST, and 85.84% on Fashion-MNIST. Overall, our dual-pronged approach leads to testable biological hypotheses about brain computation, which are then abstracted away and integrated directly into brain-inspired machine learning methods.
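
For readers unfamiliar with the LSM paradigm, the toy sketch below (plain NumPy) illustrates the general idea: a fixed random recurrent “liquid” driven by input sequences, a slow global gain signal that nudges the liquid’s branching ratio toward 1 as a simple stand-in for an astrocyte-like criticality-organizing role, and a linear readout trained on the liquid states. This is only an illustrative sketch under simplifying assumptions (binary threshold units, a single scalar gain, a branching-ratio proxy for criticality, random toy data); it is not the NALSM implementation presented in the defense.

# Illustrative LSM sketch with an astrocyte-like global gain controller.
# All names, parameters, and the neuron model here are simplifying
# assumptions for exposition, not the NALSM method from the talk.
import numpy as np

rng = np.random.default_rng(0)
N_IN, N_LIQ, N_CLASS, T = 20, 200, 3, 50

# Sparse random input and recurrent (liquid) weights, kept fixed.
W_in = rng.normal(0, 1.0, (N_LIQ, N_IN)) * (rng.random((N_LIQ, N_IN)) < 0.1)
W_rec = rng.normal(0, 1.0, (N_LIQ, N_LIQ)) * (rng.random((N_LIQ, N_LIQ)) < 0.1)

def run_liquid(x_seq, gain):
    """Drive a thresholded (binary-spiking) reservoir; return the
    time-averaged liquid state and the observed branching ratio."""
    v, spikes = np.zeros(N_LIQ), np.zeros(N_LIQ)
    states, parents, children = [], [], []
    for x in x_seq:
        v = 0.9 * v + W_in @ x + gain * (W_rec @ spikes)
        new_spikes = (v > 1.0).astype(float)
        v[new_spikes > 0] = 0.0                 # reset after spiking
        parents.append(spikes.sum())
        children.append(new_spikes.sum())
        spikes = new_spikes
        states.append(spikes.copy())
    p, c = np.array(parents), np.array(children)
    branching = (c[p > 0] / p[p > 0]).mean() if (p > 0).any() else 0.0
    return np.mean(states, axis=0), branching

def astrocyte_tune(x_seqs, gain=1.0, lr=0.05, steps=30):
    """Astrocyte-like controller (assumed form): slowly adjust one global
    recurrent gain so the measured branching ratio approaches 1."""
    for _ in range(steps):
        _, b = run_liquid(x_seqs[rng.integers(len(x_seqs))], gain)
        gain += lr * (1.0 - b)                  # push toward criticality
    return gain

# Toy data: random rate sequences whose magnitude depends on the class.
X = [rng.random((T, N_IN)) * (0.2 + 0.2 * (k % N_CLASS)) for k in range(90)]
y = np.array([k % N_CLASS for k in range(90)])

gain = astrocyte_tune(X)
feats = np.stack([run_liquid(x, gain)[0] for x in X])

# Linear readout trained by least squares on the liquid states.
ones = np.ones((len(X), 1))
W_out, *_ = np.linalg.lstsq(np.hstack([feats, ones]), np.eye(N_CLASS)[y], rcond=None)
pred = np.argmax(np.hstack([feats, ones]) @ W_out, axis=1)
print("training accuracy:", (pred == y).mean())

The branching ratio (average number of newly active units per active unit at the previous step) is used here only as a common proxy for critical dynamics; the astrocyte model described in the abstract instead approximates the information content of its neuronal inputs, which this toy controller does not attempt.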

Organization

Department of Computer Science

School of Arts & Sciences

Rutgers University

Contact: Dr. Konstantinos Michmizos

Webex: https://rutgers.webex.com/meet/vai9