Theoretical and computational neuroscience
We build theoretical models, from single cells to large networks, to elucidate how the brain transforms sensation into behaviour.
We are constantly stimulated by what we see, hear, smell, taste, and touch. Our brains take these complex inputs and transform them into actions and thoughts through what we call input/output functions. Every part of the brain, from single neurons to entire brain regions, can be described by such functions. Finding their accurate description has the potential to change lives, both through improved treatments for mental illness and through better algorithms for artificial intelligence.
Our aim is to mathematically describe how the brain translates sensory input into behaviour at various spatial and temporal scales. We focus on how inputs onto a single neuron generate its output activity, and how neuronal networks process complex information.
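As a minimal illustration of what a single-neuron input/output function can look like, here is a sketch of a leaky integrate-and-fire model: input current in, spike times out. All parameter values and function names here are illustrative choices for this sketch, not taken from our published models.

```python
import numpy as np

def lif_response(input_current, dt=1e-4, tau=0.02, v_rest=-0.070,
                 v_thresh=-0.050, v_reset=-0.070, r_m=1e8):
    """Simulate a leaky integrate-and-fire neuron (illustrative parameters).

    input_current: array of input currents (A), one value per time step of dt seconds.
    Returns the membrane-voltage trace (V) and a list of spike times (s).
    """
    v = v_rest
    voltages, spikes = [], []
    for i, current in enumerate(input_current):
        # Leaky integration: dv/dt = (-(v - v_rest) + R*I) / tau
        v += dt * (-(v - v_rest) + r_m * current) / tau
        if v >= v_thresh:           # threshold crossing -> emit a spike
            spikes.append(i * dt)
            v = v_reset             # reset the membrane after the spike
        voltages.append(v)
    return np.array(voltages), spikes

# A constant suprathreshold input drives regular spiking.
I = np.full(5000, 3e-10)            # 0.3 nA for 0.5 s
v_trace, spike_times = lif_response(I)
print(f"{len(spike_times)} spikes in 0.5 s")
```

Even this toy model already shows the core idea: the same transformation can be probed with any input, and richer models of the same form let us ask how synaptic plasticity reshapes it.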
Theoretical tools to understand the brain
Our tools are computers, together with pen and paper: through mathematical analysis and simulations, they allow us to unveil how the brain processes information.
From models to experiments and back
Simulations allow us to compare a large variety of scenarios, many of which remain inaccessible to current experimental technologies. Experimental work is essential for building our models, and in turn our models serve as inspiration for future experiments, in a continuing incremental loop. Together, theory and experiments will help us shine a light on the mysteries of the brain, with countless applications to improve our lives, from health to technology.
A meta-learning approach to (re)discover plasticity rules that carve a desired function into a neural network. Advances in Neural Information Processing Systems, Curran Associates, Inc.
(2020). Complementary inhibitory weight profiles emerge from plasticity and allow flexible switching of receptive fields. Journal of Neuroscience, in press.
(2020). Context-modular memory networks support high-capacity, flexible, and robust associative memories. bioRxiv, 2020.01.08.898528.
(2017). Inhibitory plasticity: Balance, control, and codependence. Annual Review of Neuroscience, 40, 557-579.
(2017). Learning and retrieval behavior in recurrent neural networks with pre-synaptic dependent homeostatic plasticity. Physica A: Theoretical and Statistical Physics, 479, 279-286.
(2015). Diverse synaptic plasticity mechanisms orchestrated to form and retrieve memories in spiking neural networks. Nature Communications, 6, 6922.
(2013). Unsupervised learning in neural networks with short range synapses. AIP Conference Proceedings, 1510 (1), 251-254.
(2013). Spike timing analysis in neural networks with unsupervised synaptic plasticity. AIP Conference Proceedings, 1510 (1), 213-215.
(2013). Strategies to associate memories by unsupervised learning in neural networks. AIP Conference Proceedings, 1510 (1), 255-257.
Associative memory in neuronal networks of spiking neurons: architecture and storage analysis. In: Villa, Alessandro E. P.; Duch, Wlodzislaw; Érdi, Péter; Masulli, Francesco; Palm, Günther (Eds.). (2012). Artificial Neural Networks and Machine Learning - ICANN 2012, Springer Berlin Heidelberg.
(2012). Model architecture for associative memory in a neural network of spiking neurons. Physica A: Theoretical and Statistical Physics, 391 (3), 843-848.
(2010). Synchronization regimes in a map-based model neural network. Physica A: Theoretical and Statistical Physics, 389 (3), 651-658.