
Posts by GerstnerLab


Episode #39 in #TheoreticalNeurosciencePodcast:
On modeling neural population activity with mean-field models – with Tilo Schwalger

theoreticalneuroscience.no/thn39

How can mean‑field models be systematically derived from the underlying microscopic dynamics of individual neurons?
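A toy version of that derivation (my own illustration, not from the episode): in an all-to-all rate network the population mean closes on itself, so N microscopic equations reduce to a single macroscopic ODE that tracks the full simulation.

```python
import numpy as np

# Toy sketch (my own, not the episode's material): compare a microscopic
# all-to-all rate network with its 1-D mean-field reduction.
rng = np.random.default_rng(0)
N, J0, I, dt, steps = 1000, 0.5, 0.3, 0.01, 2000
phi = np.tanh

x = rng.normal(0.0, 1.0, N)  # microscopic state of each unit
m = x.mean()                 # macroscopic mean-field variable

for _ in range(steps):
    # Microscopic dynamics: dx_i/dt = -x_i + (J0/N) * sum_j phi(x_j) + I
    x += dt * (-x + J0 * phi(x).mean() + I)
    # Mean-field dynamics:  dm/dt  = -m  + J0 * phi(m) + I
    m += dt * (-m + J0 * phi(m) + I)

print(x.mean(), m)  # both settle near the same fixed point
```

The leak term contracts the spread across units, so the network mean and the one-dimensional mean-field equation converge to the same fixed point.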

3 weeks ago

Novelty is not just about what's new, but also what feels new given past experience. A new study from Sophia Becker in Wulfram Gerstner's @epfl-brainmind.bsky.social lab posits a model showing how similarity between familiar and novel stimuli shapes exploration and learning - doi.org/10.1016/j.ne...

2 weeks ago

Presenting a poster tomorrow at Cosyne 26:
[3-033] Compositional computation via shared latent dynamics in low-rank RNNs.

With @avm.bsky.social, we explore how RNNs can re-use the same dynamics across different tasks, and what it implies for their connectivity and neural activity.

1 month ago
Linking neural manifolds to circuit structure in recurrent networks Dimensionality reduction methods are widely used in neuroscience to investigate two complementary aspects of neural activity: the distribution of sing…

Excited to share our new paper to be published in Neuron!

With Valentin Schmutz @bio-emergent.bsky.social and Wulfram Gerstner @gerstnerlab.bsky.social, we explore how circuit structure in RNNs shapes network computation and single-neuron responses.
www.sciencedirect.com/science/arti...

1 month ago

🧵Excited to present our latest work at #Neurips25! Together with @avm.bsky.social, we discover 𝐜𝐡𝐚𝐧𝐧𝐞𝐥𝐬 𝐭𝐨 𝐢𝐧𝐟𝐢𝐧𝐢𝐭𝐲: regions in neural networks' loss landscapes where parameters diverge to infinity (in regression settings!)

We find that MLPs in these channels can take derivatives and compute GLUs 🤯
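A minimal numeric sketch of how a diverging pair of units can take a derivative (my reconstruction from the finite-difference intuition, not the paper's code): two hidden units whose pre-activations differ by delta and whose output weights grow as 1/delta implement a finite difference of the activation.

```python
import numpy as np

# Hedged sketch (my reconstruction, not the paper's construction): as
# delta -> 0 the output weights +-1/delta diverge, and the unit pair
# converges to the derivative of the activation function.
def pair_output(x, delta, sigma=np.tanh):
    # two hidden units, pre-activations x + delta and x,
    # output weights +1/delta and -1/delta
    return (sigma(x + delta) - sigma(x)) / delta

x = np.linspace(-2, 2, 9)
for delta in (1.0, 0.1, 0.001):
    err = np.max(np.abs(pair_output(x, delta) - (1 - np.tanh(x) ** 2)))
    print(f"delta={delta:6g}  max error vs tanh'(x): {err:.2e}")
```

Shrinking delta shrinks the error, but only at the price of unboundedly large output weights, which is the "channel to infinity" picture.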

4 months ago

This was a lot of fun! From my side, it started with a technical Q: what's the relation between two-side cavity and path integrals? Turns out it's a fluctuation correction - and amazingly, this also enables the "O(N) rank" theory by @david-g-clark.bsky.social and @omarschall.bsky.social. 🤯

5 months ago

P4 52 “Coding Schemes in Non-Lazy Artificial Neural Networks” by @avm.bsky.social

6 months ago

WEDNESDAY 14:00 – 15:30

P4 25 “Rarely categorical, always high-dimensional: how the neural code changes along the cortical hierarchy” by @shuqiw.bsky.social

P4 35 “Biologically plausible contrastive learning rules with top-down feedback for deep networks” by @zihan-wu.bsky.social

6 months ago

WEDNESDAY 12:30 – 14:00

P3 4 “Toy Models of Identifiability for Neuroscience” by @flavioh.bsky.social

P3 55 “How many neurons is “infinitely many”? A dynamical systems perspective on the mean-field limit of structured recurrent neural networks” by Louis Pezon

6 months ago

P2 65 “Rate-like dynamics of spiking neural networks” by Kasper Smeets

6 months ago

TUESDAY 18:00 – 19:30

P2 2 “Biologically informed cortical models predict optogenetic perturbations” by @bellecguill.bsky.social

P2 12 “High-precision detection of monosynaptic connections from extra-cellular recordings” by @shuqiw.bsky.social

6 months ago

Lab members are at the Bernstein conference @bernsteinneuro.bsky.social with 9 posters! Here’s the list:

TUESDAY 16:30 – 18:00

P1 62 “Measuring and controlling solution degeneracy across task-trained recurrent neural networks” by @flavioh.bsky.social

6 months ago

New in @pnas.org: doi.org/10.1073/pnas...

We study how humans explore a 61-state environment with a stochastic region that mimics a “noisy-TV.”

Results: Participants keep exploring the stochastic part even when it’s unhelpful, and novelty-seeking best explains this behavior.

#cogsci #neuroskyence
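The "noisy-TV" intuition can be sketched in a few lines (my own toy, not the paper's model): if novelty is tied to unseen observations, a stochastic state keeps paying a novelty bonus, while a deterministic state stops after the first visit.

```python
import random

# Toy sketch (mine, not the paper's model): observation-level novelty.
# A deterministic state always shows the same image; a "noisy-TV" state
# shows a fresh random image each visit, so its novelty never wears off.
random.seed(0)
seen = set()

def novelty(obs):
    new = obs not in seen
    seen.add(obs)
    return 1.0 if new else 0.0

det_bonus = sum(novelty(("det", "same_image")) for _ in range(100))
tv_bonus = sum(novelty(("tv", random.getrandbits(32))) for _ in range(100))
print(det_bonus, tv_bonus)  # deterministic: 1.0, noisy TV: ~100.0
```

A purely novelty-seeking agent therefore keeps returning to the stochastic region even when it carries no useful information, matching the behavioral finding.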

6 months ago

🎉 "High-dimensional neuronal activity from low-dimensional latent dynamics: a solvable model" will be presented as an oral at #NeurIPS2025 🎉

Feeling very grateful that reviewers and chairs appreciated concise mathematical explanations, in this age of big models.

www.biorxiv.org/content/10.1...
1/2

7 months ago

Work led by Martin Barry with the supervision of Wulfram Gerstner and Guillaume Bellec @bellecguill.bsky.social

7 months ago

In experiments (models & simulations), we showed how this approach supports stable retention of old tasks while learning new ones (split CIFAR-100, ASC…)

7 months ago

We designed a bio-inspired, context-specific gating of plasticity and neuronal activity, allowing for a drastic reduction in catastrophic forgetting.

We also show that our model is capable of both forward and backward transfer! All of this thanks to shared neuronal activity across tasks.

7 months ago

We designed a Gating/Availability model that detects selective neurons - the neurons most useful for the task - during learning, shunts the activity of the others (Gating), and decreases the learning rate of task-selective neurons (Availability)
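A minimal sketch of that mechanism as I read it (all names and numbers below are mine, not the paper's): per context, keep only the k most task-selective hidden units active, and lower the learning rate of exactly those units so later tasks cannot overwrite them.

```python
import numpy as np

# Hedged sketch (my own, not the paper's implementation) of the two
# ingredients: Gating shunts non-selective units; Availability scales
# down the learning rate of the selected, task-selective units.
rng = np.random.default_rng(0)
n_hidden, k = 8, 3

selectivity = rng.random(n_hidden)        # stand-in for a learned score
selected = np.argsort(selectivity)[-k:]   # top-k units for this context

gate = np.zeros(n_hidden)
gate[selected] = 1.0                      # shunt activity of the others

availability = np.ones(n_hidden)
availability[selected] = 0.1              # protect task-selective synapses

h = gate * np.maximum(0.0, rng.normal(size=n_hidden))  # gated activity
grad = rng.normal(size=n_hidden)
weight_update = 0.01 * availability * grad  # availability-scaled step
```

Non-selected units keep their full learning rate, which is what leaves them free to be recruited by future contexts.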

7 months ago
Context selectivity with dynamic availability enables lifelong continual learning “You never forget how to ride a bike”, – but how is that possible? The brain is able to learn complex skills, stop the practice for years, learn other…

🧠 “You never forget how to ride a bike”, but how is that possible?
Our study proposes a bio-plausible meta-plasticity rule that shapes synapses over time, enabling selective recall based on context

7 months ago

So happy to see this work out! 🥳
Huge thanks to our two amazing reviewers who pushed us to make the paper much stronger. A truly joyful collaboration with @lucasgruaz.bsky.social, @sobeckerneuro.bsky.social, and Johanni Brea! 🥰

Tweeprint on an earlier version: bsky.app/profile/modi... 🧠🧪👩‍🔬

7 months ago

Attending #CCN2025?
Come by our poster in the afternoon (4th floor, Poster 72) to talk about the sense of control, empowerment, and agency. 🧠🤖

We propose a unifying formulation of the sense of control and use it to empirically characterize the human subjective sense of control.

🧑‍🔬🧪🔬

8 months ago
Emergent Rate-Based Dynamics in Duplicate-Free Populations of Spiking Neurons Can spiking neural networks (SNNs) approximate the dynamics of recurrent neural networks? Arguments in classical mean-field theory based on laws of large numbers provide a positive answer when each ne...

Work led by Valentin Schmutz (@bio-emergent.bsky.social), in collaboration with Johanni Brea and Wulfram Gerstner.

8 months ago
From Spikes To Rates (YouTube video by Gerstner Lab)

Is it possible to go from spikes to rates without averaging?

We show how to exactly map recurrent spiking networks into recurrent rate networks, with the same number of neurons. No temporal or spatial averaging needed!

Presented at Gatsby Neural Dynamics Workshop, London.

8 months ago

Excited to present at the PIMBAA workshop at #RLDM2025 tomorrow!
We study curiosity using intrinsically motivated RL agents and developed an algorithm to generate diverse, targeted environments for comparing curiosity drives.

Preprint (accepted but not yet published): osf.io/preprints/ps...

10 months ago
Representational similarity modulates neural and behavioral signatures of novelty Novelty signals in the brain modulate learning and drive exploratory behaviors in humans and animals. While the perceived novelty of a stimulus is known to depend on previous experience, the effect of...

Stoked to be at RLDM! Curious how novelty and exploration are impacted by generalization across similar stimuli? Then don't miss my flash talk in the PIMBAA workshop (tmr at 10:30, E McNabb Theatre) or stop by my poster tmr (#74)! Looking forward to chat 🤩

www.biorxiv.org/content/10.1...

10 months ago

Our new preprint 👀

10 months ago

Interested in high-dim chaotic networks? Ever wondered about the structure of their state space? @jakobstubenrauch.bsky.social has answers - from a separation of fixed points and dynamics onto distinct shells to a shared lower-dim manifold and linear prediction of dynamics.

10 months ago

Episode #22 in #TheoreticalNeurosciencePodcast: On 50 years with the Hopfield network model - with Wulfram Gerstner

theoreticalneuroscience.no/thn22

John Hopfield received the 2024 Physics Nobel prize for his model published in 1982. What is the model all about? @icepfl.bsky.social

1 year ago
Brain models draw closer to real-life neurons Researchers at EPFL have shown how rough, biological spiking neural networks can mimic the behavior of brain models called recurrent neural networks. The findings challenge traditional assumptions and...

A cool EPFL News article was written about our recent neurotheory paper on spikes vs rates!

Super engaging text by science communicator Nik Papageorgiou.
actu.epfl.ch/news/brain-m...

Definitely more accessible than the original physics-style, 4.5-page letter 🤓
journals.aps.org/prl/abstract...

1 year ago
Learning from the unexpected A researcher at EPFL working at the crossroads of neuroscience and computational science has developed an algorithm that can predict how surprise and novelty affect behavior.

Super excited to see my PhD thesis featured by EPFL! 🎓
actu.epfl.ch/news/learnin...

P.S.: There's even a French version of the article! It feels so fancy! 😎 👨‍🎨 🇫🇷
actu.epfl.ch/news/apprend...

1 year ago