Research

SNSF-funded research

I'm a junior principal investigator at the Institute of Neuroinformatics, ETH Zürich, hosted by the lab of Benjamin Grewe. My research interests lie at the intersection of neuroscience and machine learning. I try to understand the principles that enable humans to learn so much more efficiently than artificial neural networks. My goal is to develop better neuroscience-inspired machine learning algorithms and, in turn, to use the insights gained from designing them to understand learning in the brain.

My research is supported by an SNSF Ambizione Fellowship.

Previously, from 2015 to 2018, I was a postdoc with Walter Senn at the University of Bern. Together with Yoshua Bengio (Mila, Université de Montréal), we developed a model of error backpropagation in the cortex.

I received my PhD in computer science from IST (University of Lisbon) in 2014, where I studied neural network models of memory with Andreas Wichert. Still at IST, in 2015, I was awarded a short research fellowship to work with Francisco C. Santos. During this period, I also collaborated with Mark van Rossum (University of Edinburgh) on synaptic plasticity rules and single-neuron memory capacity.

Recent papers

Meulemans A*, Farinha MT*, Ordóñez JG, Aceituno PV, Sacramento J, Grewe BF (2021). Credit assignment in neural networks through deep feedback control.
Preprint: arXiv:2106.07887
[ paper ]
* — equal contributions


Zucchet N*, Schug S*, von Oswald J*, Zhao D, Sacramento J (2021). A contrastive rule for meta-learning.
Preprint: arXiv:2104.01677
[ paper ]
* — equal contributions


Henning C*, Cervera MR*, D'Angelo F, von Oswald J, Traber R, Ehret B, Kobayashi S, Sacramento J, Grewe BF (2021). Posterior meta-replay for continual learning.
Preprint: arXiv:2103.01133
[ paper ]
* — equal contributions


von Oswald J*, Kobayashi S*, Sacramento J*, Meulemans A, Henning C, Grewe BF (2020). Neural networks with late-phase weights.
ICLR 2021
[ paper | code ]
* — equal contributions


Zhao D, von Oswald J, Kobayashi S, Sacramento J, Grewe BF (2020). Meta-learning via hypernetworks.
NeurIPS MetaLearn 2020
[ paper | poster ]


Meulemans A, Carzaniga FS, Suykens JAK, Sacramento J, Grewe BF (2020). A theoretical framework for target propagation.
NeurIPS 2020 (Spotlight)
[ paper | code ]


von Oswald J*, Henning C*, Sacramento J*, Grewe BF (2019). Continual learning with hypernetworks.
ICLR 2020 (Spotlight)
[ paper | talk video | code ]
* — equal contributions


Richards B*, Lillicrap TP*, ..., Sacramento J, ..., Therien D*, Körding KP* (2019). A deep learning framework for neuroscience.
Nature Neuroscience 22:1761-1770
[ link to journal ]
* — equal contributions


Llera M, Sacramento J, Costa RP (2019). Computational roles of plastic probabilistic synapses.
Current Opinion in Neurobiology 54:90-97
[ link to journal ]


Sacramento J, Costa RP, Bengio Y, Senn W (2018). Dendritic cortical microcircuits approximate the backpropagation algorithm.
NeurIPS 2018 (Oral)
[ paper | talk video ]

If any of my articles are behind a paywall you can't get through, please send me an e-mail.

A complete list of publications is here.

Teaching

Currently, I'm a guest lecturer for the Learning in Deep Artificial and Biological Neuronal Networks course offered at ETH Zürich.

In the past, I served as a teaching assistant at the Department of Computer Science and Engineering of IST, where I taught practical classes on computer programming and basic algorithms.