New computational paradigms inspired by the brain
The lecture covers a brief introduction to neuromorphic engineering, some of the neuromorphic networks that the speaker has developed, and their potential applications, particularly in machine learning.
Talk abstract: The human brain is the most complex and powerful computing device in the known universe. Its ability to learn, recognise complex patterns and make sense of the world is truly remarkable, as is its compact, power-efficient hardware. Since the dawn of scientific inquiry, the brain and the origin of human intelligence have been the subject of much debate and research. The field of Artificial Intelligence (AI) was born with a paper by Warren McCulloch and Walter Pitts in 1943 that described a simplified neuron model. Since this seminal work, the field of AI has branched into sub-disciplines including Neural Networks (NNs), Machine Learning and Computational Neuroscience (CNS). Neuromorphic Engineering (NE) is another offshoot that approaches AI from a different point of view, albeit with similar aims: to emulate the extraordinary performance of the brain. NE was born in 1989 when Caltech professor Carver Mead published “Analog VLSI and Neural Systems”, a breakthrough publication that related the biology and physiology of neurons to the physics of transistors. This book paved the way for the neuromorphic approach, which attempts to gain insight into biological functionality by building hardware systems subject to similar physical constraints (noise, area, power consumption and so on), as well as to build intelligent computing machines. For much of the 26-year history of NE the focus has been on building sensory devices such as silicon retinas and silicon cochleas; in recent years, however, the focus has shifted to learning and classification networks that may one day replace conventional computer architectures. In this presentation, I’ll give a brief introduction to NE. I’ll also discuss some of the neuromorphic networks that we have developed in my lab and their potential applications, particularly in machine learning.
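To give a concrete sense of the “simplified neuron model” from the 1943 McCulloch and Pitts paper mentioned above, here is a minimal illustrative sketch (not material from the talk): a binary threshold unit that fires when the weighted sum of its inputs reaches a threshold. The weights and threshold below are arbitrary example values chosen to realise an AND gate.

    def mcculloch_pitts_neuron(inputs, weights, threshold):
        """Binary threshold unit: output 1 if the weighted input sum
        reaches the threshold, otherwise 0 (McCulloch-Pitts style)."""
        activation = sum(x * w for x, w in zip(inputs, weights))
        return 1 if activation >= threshold else 0

    # Example: a two-input AND gate as a threshold unit.
    # Unit weights and a threshold of 2 mean both inputs must be active.
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", mcculloch_pitts_neuron([a, b], [1, 1], threshold=2))

Networks of such units can compute arbitrary logical functions, which is what made the model a starting point for the later sub-disciplines listed in the abstract.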