
Qualifying Exam

Towards Less Artificial Intelligence: Recurrent Spiking Neuronal-Astrocytic Networks Operating at the Edge of Chaos

 


Thursday, May 09, 2019, 12:30pm

 

Abstract:

Recurrent networks are the most brain-like learning models and, ironically, the most difficult to train. In my talk, I will propose that there could be fundamental parts and design methods missing from current neuromorphic learning models, which prevent them from performing on par with biology. I will give an overview of our rather heretical efforts to α) decipher the network-level principles of intelligence from a fundamentally new perspective that incorporates glial cells, the abundant yet long-neglected non-neuronal brain cells, and β) develop the next generation of less artificial, intrinsically intelligent algorithms that can be seamlessly integrated into newly developed neuromorphic hardware.

In the first part of my talk, I will present our computational exploration of the fascinating possibility that a single astrocytic cell monitors the computational performance of neuronal networks. Specifically, I will propose a biophysically constrained model of an astrocyte that exhibits homeostatic plasticity and learns to become receptive to chaotic dynamics in its synaptic space, emulated as a modified two-dimensional Ising model. I will further show how an astrocyte, using the frequency of its intracellular calcium activity, can impose criticality on recurrent biological networks and their artificial counterparts, both of which are complex nonlinear dynamical systems that maximize their computational capacity at the edge of chaos, where systems can balance robustness and versatility.
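
For readers unfamiliar with the Ising model mentioned above, the following minimal Python sketch illustrates the kind of lattice dynamics involved. It assumes a standard two-dimensional Ising model with Metropolis updates run near the known critical temperature (T_c ≈ 2.269 in units where J = k_B = 1); the talk's modified Ising model and its coupling to astrocytic calcium activity are not specified in this abstract and are not reproduced here.

    # Minimal sketch: standard 2D Ising model with Metropolis updates,
    # run near criticality. Purely illustrative; the modified model and
    # the astrocyte coupling described in the talk are not included.
    import numpy as np

    def metropolis_sweep(spins, T, rng):
        """One Metropolis sweep over a square lattice of +/-1 spins."""
        n = spins.shape[0]
        for _ in range(n * n):
            i, j = rng.integers(n, size=2)
            # Sum of nearest neighbours with periodic boundaries
            nb = (spins[(i + 1) % n, j] + spins[(i - 1) % n, j] +
                  spins[i, (j + 1) % n] + spins[i, (j - 1) % n])
            dE = 2 * spins[i, j] * nb      # energy change if this spin flips
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                spins[i, j] *= -1
        return spins

    rng = np.random.default_rng(0)
    spins = rng.choice([-1, 1], size=(16, 16))
    T_c = 2.269                            # critical temperature of the 2D Ising model
    for sweep in range(100):
        metropolis_sweep(spins, T_c, rng)
    print("mean magnetization near criticality:", spins.mean())

At the critical temperature the lattice shows fluctuations at all length scales, which is the sense in which the Ising model serves as a convenient stand-in for dynamics at the edge of chaos.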

In the second part of my talk, I will describe how we are translating our computational understanding of the role of non-neuronal cells into recurrent neuronal networks. Specifically, I will describe how non-neuronal plasticity mechanisms can regulate an Izhikevich-type spiking neuronal network (SNN). I will further demonstrate that the distribution of network spike velocities significantly impacts the network's oscillation frequency and synchronization, both of which are heavily linked to information processing and learning in the brain.
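
As background for the Izhikevich-type SNN mentioned above, here is a minimal Python sketch of a single Izhikevich neuron using the standard regular-spiking parameters from Izhikevich (2003) and a constant input current. The network architecture, the non-neuronal (astrocytic) plasticity mechanisms, and the notion of spike velocity used in the talk are not given in this abstract, so they are not modeled here.

    # Minimal sketch: one Izhikevich neuron (regular-spiking parameters),
    # forward-Euler integration with a constant input current.
    a, b, c, d = 0.02, 0.2, -65.0, 8.0     # regular-spiking parameters
    dt = 0.5                                # time step in ms
    v, u = -65.0, b * -65.0                 # membrane potential and recovery variable
    I = 10.0                                # constant input current
    spike_times = []

    for step in range(int(1000 / dt)):      # simulate 1 second
        if v >= 30.0:                       # spike: record and reset
            spike_times.append(step * dt)
            v, u = c, u + d
        # Izhikevich dynamics: dv/dt = 0.04 v^2 + 5 v + 140 - u + I,
        #                      du/dt = a (b v - u)
        v += dt * (0.04 * v * v + 5 * v + 140 - u + I)
        u += dt * a * (b * v - u)

    print(f"fired {len(spike_times)} spikes in 1 s (~{len(spike_times)} Hz)")

In a network, many such neurons would be coupled through synaptic currents, and plasticity rules would adjust those couplings; this sketch only illustrates the single-neuron dynamics that underlie the SNN named in the abstract.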

I will conclude my presentation by describing our efforts to translate our findings onto Intel's Loihi, a neuromorphic research chip that is available at a handful of research sites worldwide, including our lab.

Speaker: Vladimir Ivanov

Location: CBIM 22

Committee

Professor Konstantinos Michmizos (Chair), Professor Casimir Kulikowski, Professor Thu Nguyen, Professor Eric Allender

Event Type: Qualifying Exam

Organization

Department of Computer Science