Episode #39 in #TheoreticalNeurosciencePodcast:
On modeling neural population activity with mean-field models – with Tilo Schwalger
theoreticalneuroscience.no/thn39
How can mean‑field models be systematically derived from the underlying microscopic dynamics of individual neurons?
Posts by GerstnerLab
Novelty is not just about what's new, but also what feels new given past experience. A new study from Sophia Becker in Wulfram Gerstner's @epfl-brainmind.bsky.social lab proposes a model showing how similarity between familiar and novel stimuli shapes exploration and learning - doi.org/10.1016/j.ne...
Presenting a poster tomorrow at Cosyne 26:
[3-033] Compositional computation via shared latent dynamics in low-rank RNNs.
With @avm.bsky.social, we explore how RNNs can re-use the same dynamics across different tasks, and what it implies for their connectivity and neural activity.
Excited to share our new paper to be published in Neuron!
With Valentin Schmutz @bio-emergent.bsky.social and Wulfram Gerstner @gerstnerlab.bsky.social, we explore how circuit structure in RNNs shapes network computation and single-neuron responses.
www.sciencedirect.com/science/arti...
🧵Excited to present our latest work at #Neurips25! Together with @avm.bsky.social, we discover 𝐜𝐡𝐚𝐧𝐧𝐞𝐥𝐬 𝐭𝐨 𝐢𝐧𝐟𝐢𝐧𝐢𝐭𝐲: regions in neural networks' loss landscapes where parameters diverge to infinity (in regression settings!)
We find that MLPs in these channels can take derivatives and compute GLUs 🤯
This was a lot of fun! From my side, it started with a technical Q: what's the relation between two-site cavity and path integrals? Turns out it's a fluctuation correction - and amazingly, this also enables the "O(N) rank" theory by @david-g-clark.bsky.social and @omarschall.bsky.social. 🤯
P4 52 “Coding Schemes in Non-Lazy Artificial Neural Networks” by @avm.bsky.social
WEDNESDAY 14:00 – 15:30
P4 25 “Rarely categorical, always high-dimensional: how the neural code changes along the cortical hierarchy” by @shuqiw.bsky.social
P4 35 “Biologically plausible contrastive learning rules with top-down feedback for deep networks” by @zihan-wu.bsky.social
WEDNESDAY 12:30 – 14:00
P3 4 “Toy Models of Identifiability for Neuroscience” by @flavioh.bsky.social
P3 55 “How many neurons is “infinitely many”? A dynamical systems perspective on the mean-field limit of structured recurrent neural networks” by Louis Pezon
P2 65 “Rate-like dynamics of spiking neural networks” by Kasper Smeets
TUESDAY 18:00 – 19:30
P2 2 “Biologically informed cortical models predict optogenetic perturbations” by @bellecguill.bsky.social
P2 12 “High-precision detection of monosynaptic connections from extracellular recordings” by @shuqiw.bsky.social
Lab members are at the Bernstein conference @bernsteinneuro.bsky.social with 9 posters! Here’s the list:
TUESDAY 16:30 – 18:00
P1 62 “Measuring and controlling solution degeneracy across task-trained recurrent neural networks” by @flavioh.bsky.social
New in @pnas.org: doi.org/10.1073/pnas...
We study how humans explore a 61-state environment with a stochastic region that mimics a “noisy-TV.”
Results: Participants keep exploring the stochastic part even when it’s unhelpful, and novelty-seeking best explains this behavior.
#cogsci #neuroskyence
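The novelty-seeking account above can be illustrated with a minimal count-based novelty bonus. This is a hypothetical sketch: the inverse-count bonus form and the three-state toy are my assumptions for illustration, not the paper's actual model or its 61-state environment.

```python
from collections import defaultdict

def novelty_bonus(counts, state):
    """Count-based novelty: less-visited states yield a larger bonus."""
    return 1.0 / (1.0 + counts[state])

# Toy loop: a purely novelty-driven chooser keeps cycling through all states,
# because each visit lowers that state's future bonus relative to the others.
counts = defaultdict(int)
candidates = [0, 1, 2]  # hypothetical states, stand-ins for the real task
for _ in range(100):
    state = max(candidates, key=lambda s: novelty_bonus(counts, s))
    counts[state] += 1
```

In a stochastic "noisy-TV" region, a bonus like this never fully decays if observations keep looking new, which is one intuition for why novelty-seeking agents (and, per the paper, humans) keep returning there.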
🎉 "High-dimensional neuronal activity from low-dimensional latent dynamics: a solvable model" will be presented as an oral at #NeurIPS2025 🎉
Feeling very grateful that reviewers and chairs appreciated concise mathematical explanations, in this age of big models.
www.biorxiv.org/content/10.1...
1/2
Work led by Martin Barry with the supervision of Wulfram Gerstner and Guillaume Bellec @bellecguill.bsky.social
In experiments (models & simulations), we showed how this approach supports stable retention of old tasks while learning new ones (split CIFAR-100, ASC…)
We designed a bio-inspired, context-specific gating of plasticity and neuronal activity that allows for a drastic reduction in catastrophic forgetting.
We also show that our model is capable of both forward and backward transfer! All of this thanks to the shared neuronal activity across tasks.
We designed a Gating/Availability model that detects task-selective neurons - the most useful neurons for the task - during learning, shunts the activity of the others (Gating), and decreases the learning rate of the task-selective neurons (Availability).
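The two operations in that post can be sketched in a few lines. Everything below is a hypothetical illustration: the selectivity score, the 0.5 threshold, and the learning-rate values are placeholders I chose, not the paper's actual rule.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 8
activity = rng.random(n)        # neuronal activity for the current task
selectivity = rng.random(n)     # stand-in for a task-selectivity score
selective = selectivity > 0.5   # "most useful" neurons for this task

# Gating: shunt the activity of non-selective neurons.
gated_activity = np.where(selective, activity, 0.0)

# Availability: lower the learning rate of task-selective neurons,
# protecting their weights when later tasks are learned.
lr = np.where(selective, 0.01, 0.1)
```

The design intuition from the thread: gating keeps different tasks from interfering during inference, while availability protects the weights that made a neuron useful, reducing catastrophic forgetting.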
🧠 “You never forget how to ride a bike”, but how is that possible?
Our study proposes a bio-plausible meta-plasticity rule that shapes synapses over time, enabling selective recall based on context
So happy to see this work out! 🥳
Huge thanks to our two amazing reviewers who pushed us to make the paper much stronger. A truly joyful collaboration with @lucasgruaz.bsky.social, @sobeckerneuro.bsky.social, and Johanni Brea! 🥰
Tweeprint on an earlier version: bsky.app/profile/modi... 🧠🧪👩🔬
Attending #CCN2025?
Come by our poster in the afternoon (4th floor, Poster 72) to talk about the sense of control, empowerment, and agency. 🧠🤖
We propose a unifying formulation of the sense of control and use it to empirically characterize the human subjective sense of control.
🧑🔬🧪🔬
Work led by Valentin Schmutz (@bio-emergent.bsky.social), in collaboration with Johanni Brea and Wulfram Gerstner.
Is it possible to go from spikes to rates without averaging?
We show how to exactly map recurrent spiking networks into recurrent rate networks, with the same number of neurons. No temporal or spatial averaging needed!
Presented at Gatsby Neural Dynamics Workshop, London.
Excited to present at the PIMBAA workshop at #RLDM2025 tomorrow!
We study curiosity using intrinsically motivated RL agents and develop an algorithm to generate diverse, targeted environments for comparing curiosity drives.
Preprint (accepted but not yet published): osf.io/preprints/ps...
Stoked to be at RLDM! Curious how novelty and exploration are impacted by generalization across similar stimuli? Then don't miss my flash talk in the PIMBAA workshop (tmr at 10:30, E McNabb Theatre) or stop by my poster tmr (#74)! Looking forward to chat 🤩
www.biorxiv.org/content/10.1...
Our new preprint 👀
Interested in high-dim chaotic networks? Ever wondered about the structure of their state space? @jakobstubenrauch.bsky.social has answers - from a separation of fixed points and dynamics onto distinct shells to a shared lower-dim manifold and linear prediction of dynamics.
Episode #22 in #TheoreticalNeurosciencePodcast: On 50 years with the Hopfield network model - with Wulfram Gerstner
theoreticalneuroscience.no/thn22
John Hopfield received the 2024 Physics Nobel prize for his model published in 1982. What is the model all about? @icepfl.bsky.social
A cool EPFL News article was written about our recent neurotheory paper on spikes vs rates!
Super engaging text by science communicator Nik Papageorgiou.
actu.epfl.ch/news/brain-m...
Definitely more accessible than the original physics-style, 4.5-page letter 🤓
journals.aps.org/prl/abstract...