Research

I'm a principal investigator at the Institute of Neuroinformatics and a guest researcher in Angelika Steger's group at the Computer Science Department of ETH Zürich. My research interests lie at the intersection of neuroscience and machine learning. I try to understand the principles that enable humans to learn so much more efficiently than artificial neural networks. My goal is to develop better neuroscience-inspired machine learning algorithms and, in turn, to use the insights gained from designing them to understand learning in the brain.
My research is supported by an SNSF Ambizione Fellowship and an ETH Zurich Research Grant.
Previously, from 2015 to 2018, I was a postdoc with Walter Senn at the University of Bern. Together with Rui P. Costa and Yoshua Bengio, we developed a model of error backpropagation in the cortex.
I received my PhD in computer science from IST (University of Lisbon, 2014), where I studied neural network models of memory with Andreas Wichert. Still at IST, in 2015, I was awarded a short research fellowship to work with Francisco C. Santos. During this period, I studied energy-efficient synaptic plasticity rules with Mark van Rossum.
Current students and collaborators
Simon Schug — PhD student
Nicolas Zucchet — PhD student
Johannes von Oswald — PhD student (co-supervised with Angelika Steger)
Dominic Zhao — External collaborator at Common Sense Machines (previously: BSc student)
Alexandra Proca — Research assistant
Alexander Meulemans — Collaborator at the Computer Science Department of ETH Zürich
Seijin Kobayashi — Collaborator at the Computer Science Department of ETH Zürich
Angelika Steger — Collaborator at the Computer Science Department of ETH Zürich
Maciej Wołczyk — Visiting student
Upcoming and recent presentations
June 2022: I gave a lecture with Alexander, Simon and Nicolas at the MLSSN 2022 summer school in Krakow, Poland, where we discussed bilevel optimization problems involving neural networks. We covered how to solve them with recurrent backpropagation and equilibrium propagation, along with some of our own work on learning and meta-learning without error backpropagation; a short sketch of the generic problem follows this list. The lecture is now on YouTube.
April 2022: I gave an Oxford NeuroTheory Forum seminar presenting our work on biologically plausible meta-learning.
April 2022: Alexander presented ongoing work on our new principle for learning at the NAISys 2022 conference in Cold Spring Harbor, NY.
March 2022: Nicolas presented our work on biologically plausible meta-learning at the Swiss Computational Neuroscience Retreat in Crans-Montana.
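For readers unfamiliar with the setting, here is the generic bilevel problem the lecture revolves around. This is the standard textbook formulation, included only for context; the symbols below are generic placeholders rather than notation from any particular paper of ours. An inner objective $L_{\mathrm{in}}$ is minimized over parameters $\theta$ for fixed meta-parameters $\phi$, and an outer objective $L_{\mathrm{out}}$ is minimized over $\phi$:

$$\min_{\phi} \, L_{\mathrm{out}}(\theta^*(\phi), \phi) \quad \text{subject to} \quad \theta^*(\phi) \in \arg\min_{\theta} L_{\mathrm{in}}(\theta, \phi).$$

At an exact inner solution, $\partial L_{\mathrm{in}}/\partial \theta = 0$, and the implicit function theorem gives the outer gradient

$$\frac{\mathrm{d} L_{\mathrm{out}}}{\mathrm{d} \phi} = \frac{\partial L_{\mathrm{out}}}{\partial \phi} - \frac{\partial L_{\mathrm{out}}}{\partial \theta} \left( \frac{\partial^2 L_{\mathrm{in}}}{\partial \theta^2} \right)^{-1} \frac{\partial^2 L_{\mathrm{in}}}{\partial \theta \, \partial \phi},$$

which methods such as recurrent backpropagation and equilibrium propagation estimate without differentiating through the inner optimization trajectory.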
Recent papers
Nicolas Zucchet, João Sacramento (2022). Beyond backpropagation: implicit gradients for bilevel optimization. Preprint: arXiv:2205.03076 [ paper ]

Nicolas Zucchet*, Simon Schug*, Johannes von Oswald*, Dominic Zhao, João Sacramento (2021). A contrastive rule for meta-learning. Preprint: arXiv:2104.01677 [ paper ] * — equal contributions

Alexander Meulemans*, Matilde T. Farinha*, Maria R. Cervera*, João Sacramento, Benjamin F. Grewe (2022). Minimizing control for credit assignment with strong feedback. ICML 2022 (Spotlight) [ paper ] * — equal contributions

Johannes von Oswald*, Dominic Zhao*, Seijin Kobayashi, Simon Schug, Massimo Caccia, Nicolas Zucchet, João Sacramento (2021). Learning where to learn: Gradient sparsity in meta and continual learning. NeurIPS 2021 [ paper ] * — equal contributions

Alexander Meulemans*, Matilde T. Farinha*, Javier G. Ordóñez, Pau V. Aceituno, João Sacramento, Benjamin F. Grewe (2021). Credit assignment in neural networks through deep feedback control. NeurIPS 2021 (Spotlight) [ paper ] * — equal contributions

Christian Henning*, Maria R. Cervera*, Francesco D'Angelo, Johannes von Oswald, Regina Traber, Benjamin Ehret, Seijin Kobayashi, Benjamin F. Grewe, João Sacramento (2021). Posterior meta-replay for continual learning. NeurIPS 2021 [ paper ] * — equal contributions

Jakob Jordan, João Sacramento, Willem A. M. Wybo, Mihai A. Petrovici*, Walter Senn* (2021). Learning Bayes-optimal dendritic opinion pooling. Preprint: arXiv:2104.13238 [ paper ] * — equal contributions

Johannes von Oswald*, Seijin Kobayashi*, Alexander Meulemans, Christian Henning, Benjamin F. Grewe, João Sacramento (2020). Neural networks with late-phase weights. ICLR 2021 [ paper | code ] * — equal contributions

Dominic Zhao, Seijin Kobayashi, João Sacramento*, Johannes von Oswald* (2020). Meta-learning via hypernetworks. NeurIPS Workshop on Meta-Learning 2020 [ paper ] * — equal contributions

Alexander Meulemans, Francesco S. Carzaniga, Johan A. K. Suykens, João Sacramento, Benjamin F. Grewe (2020). A theoretical framework for target propagation. NeurIPS 2020 (Spotlight) [ paper | code ]

Johannes von Oswald*, Christian Henning*, Benjamin F. Grewe, João Sacramento (2019). Continual learning with hypernetworks. ICLR 2020 (Spotlight) [ paper | talk video | code ] * — equal contributions

Blake Richards*, Timothy P. Lillicrap*, ..., João Sacramento, ..., Denis Therien*, Konrad P. Körding* (2019). A deep learning framework for neuroscience. Nature Neuroscience 22:1761-1770 [ link to journal ] * — equal contributions

Milton Llera, João Sacramento, Rui P. Costa (2019). Computational roles of plastic probabilistic synapses. Current Opinion in Neurobiology 54:90-97 [ link to journal ]

João Sacramento, Rui P. Costa, Yoshua Bengio, Walter Senn (2018). Dendritic cortical microcircuits approximate the backpropagation algorithm. NeurIPS 2018 (Oral) [ paper | talk video ]
If my articles are behind a paywall you can't get through, please send me an e-mail.
A complete list of publications is here.
Teaching
Currently, I'm a guest lecturer for the Learning in Deep Artificial and Biological Neuronal Networks course offered at ETH Zürich.
In the past I served as a teaching assistant at the Department of Computer Science and Engineering of IST, where I taught practical classes on computer programming and basic algorithms:
Object-oriented Programming (Fall 2013)
Foundations of Programming (Spring 2011)
Object-oriented Programming (Fall 2010)
Algorithms and Data Structures (Spring 2010)
Object-oriented Programming (Fall 2009)
Algorithms and Data Structures (Spring 2009)
Information Technology Systems Design (Spring 2009)
Object-oriented Programming (Fall 2008)
Data Centres (Fall 2008)