Scalable Architectures for Neuromorphic Machine Learning

At the 11th International Summer School on AI and Big Data, Dr. Anand Subramoney (Royal Holloway, University of London, United Kingdom) will give a keynote speech on Scalable Architectures for Neuromorphic Machine Learning.

Neuromorphic computing offers the potential to scale up AI models while remaining as energy-efficient as the human brain. But what building blocks do we need for scalable neuromorphic AI? I will discuss how to design architectures for neuromorphic machine learning from first principles, taking inspiration from biology without being constrained by biological details. I will present recent work from my group on using various forms of sparsity and distributed learning to improve the scalability and efficiency of neuromorphic deep learning models.
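As a loose illustration of the activity sparsity mentioned in the abstract (not the speaker's actual method), a forward pass might skip units whose activity falls below a threshold, so computation scales with the number of active units rather than the layer size. The threshold and layer sizes here are arbitrary assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dense layer weights (8 inputs, 4 outputs) -- sizes chosen arbitrarily.
W = rng.normal(size=(8, 4))

def sparse_forward(x, W, threshold):
    """Propagate only 'active' inputs whose magnitude exceeds a threshold.

    Illustrative only: real neuromorphic models use spiking or event-driven
    dynamics and learned sparsity, not a fixed cutoff like this.
    """
    active = np.abs(x) > threshold   # event mask: which units fire
    # Use only the active rows of W -- the compute saving that
    # activity sparsity targets on neuromorphic hardware.
    return x[active] @ W[active]

x = rng.normal(size=8)
dense = x @ W
sparse = sparse_forward(x, W, threshold=0.0)  # threshold 0 keeps all nonzero units
assert np.allclose(dense, sparse)
```

With a higher threshold, fewer rows of `W` are touched, trading a small approximation error for proportionally less computation.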

Dr. Anand Subramoney

Anand Subramoney is an Assistant Professor in the Department of Computer Science at Royal Holloway, University of London. He is broadly interested in learning and intelligence, both algorithmic and biological. His current research focuses on understanding intelligence through both the engineering lens of neuromorphic computing and the biological lens of neuroscience, drawing on each in his quest to build better and more general artificial intelligence.

Funded by:
Funded by the Federal Ministry of Education and Research (Bundesministerium für Bildung und Forschung).
Funded by the Free State of Saxony (Freistaat Sachsen).