
Masters Defense: A neuroinspired oculomotor controller for a robotic head prototype


Robotic vision introduces requirements for real-time processing of fast-varying, noisy information in a continuously changing environment. In real-world settings, convenient assumptions, such as static camera systems and deep learning algorithms consuming high volumes of ideally slowly varying data, rarely hold. Leveraging recent neural studies showing that smooth pursuit (the slow movement of the eyes) and saccades (their rapid movements) are handled by the same neuronal circuitry, we designed a neuromorphic oculomotor controller and placed it at the heart of a biomimetic robotic head prototype that we designed and built. The oculomotor controller is unique in that 1) information is encoded and processed via spikes communicated across models of biological neurons, and 2) it mimics the brain's structure and therefore requires no training to operate. The tracking performance of our proposed system is comparable to that of PID controllers, while the robotic eye kinematics are strikingly similar to those reported in human eye studies. Interestingly, employing biologically constrained learning methods for the motor-related neurons resulted in synaptic weights comparable to those reported in the literature. This work contributes to the overarching goal of ComBra Lab, which is to develop neuromimetic "bottom-up" computational models of brain networks, where intelligence, emerging through a combination of network structure and learning, is used to control robots. A demo of the robot can be found here:
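To illustrate the spike-based encoding the abstract refers to, here is a minimal leaky integrate-and-fire (LIF) neuron sketch. This is a generic textbook model, not the thesis implementation; all parameter values (time constant, threshold, input current) are illustrative assumptions.

```python
def simulate_lif(input_current, dt=1e-3, tau=0.02,
                 v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron.

    input_current: sequence of input values, one per time step.
    Returns the list of time-step indices at which the neuron spiked.
    All parameters are illustrative, not taken from the thesis.
    """
    v = v_rest
    spikes = []
    for t, i_in in enumerate(input_current):
        # Leaky integration: the membrane potential decays toward rest
        # and is driven by the input current.
        v += (dt / tau) * (v_rest - v + i_in)
        if v >= v_thresh:      # threshold crossing emits a spike
            spikes.append(t)
            v = v_reset        # reset the membrane potential
    return spikes

# A constant suprathreshold input produces regular spiking, which is
# how an analog quantity (e.g. retinal error) can be rate-encoded.
spike_times = simulate_lif([1.5] * 200)
```

In a controller like the one described, populations of such neurons would encode sensory error and drive the eye motors, with synaptic weights shaped by biologically constrained learning rather than gradient-based training.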

Praveenram Balachandar
Event Date: 
04/10/2017 - 12:15pm
Committee: K. Michmizos (chair), C. Kulikowski, P. Awasthi
Event Type: 
Masters Defense
Computer Science