
Posts by Alex van Meegen

I am looking for a theory/computational POSTDOC position in the EU or on the US east coast. I am interested in how learning and plasticity shape population dynamics & representational geometries, & how these changes are reflected in behavior.

If you are at #COSYNE2026 & interested, hit me up in Whova, not here

1 month ago 13 9 1 0
Poincaré section of a chaotic spiking network. Colors indicate local Lyapunov exponents.

Belated update: I joined UIUC ECE as an Assistant Professor. Our lab works at the intersection of theoretical neuroscience, machine learning, and dynamical systems, with a focus on learning and spiking networks. I had to miss #COSYNE2026 for visa reasons. 1/2

1 month ago 48 6 4 0
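For readers wondering how a figure like the Poincaré section above might be generated, here is a minimal sketch, using a standard chaotic rate network (dx/dt = -x + J tanh(x)) as a stand-in for the spiking model in the image. The section hyperplane (x_1 = 0), the network size, and the gain g are illustrative assumptions, not the author's setup.

```python
import numpy as np

# Sketch: Poincare section of a chaotic rate network, with each crossing
# tagged by a local expansion rate from the instantaneous Jacobian.
rng = np.random.default_rng(0)
N, g, dt = 100, 2.0, 0.01
J = g * rng.standard_normal((N, N)) / np.sqrt(N)   # random coupling, chaotic for g > 1
x = rng.standard_normal(N)

section, local_exps = [], []
for _ in range(100_000):
    x_new = x + dt * (-x + J @ np.tanh(x))
    if x[0] < 0.0 <= x_new[0]:                         # upward crossing of x_1 = 0
        D = -np.eye(N) + J * (1.0 - np.tanh(x) ** 2)   # Jacobian of the flow at x
        local_exps.append(np.linalg.eigvals(D).real.max())
        section.append(x_new[[1, 2]])                  # project section onto (x_2, x_3)
    x = x_new

print(f"{len(section)} section points, local exponents in "
      f"[{min(local_exps):.2f}, {max(local_exps):.2f}]")
```

Plotting the collected section points colored by the local exponents would give a figure of the kind shown above.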

Thanks, Jyotika!

1 month ago 0 0 0 0

Thanks!

1 month ago 1 0 0 0

Thank you, Cengiz!

1 month ago 0 0 0 0

Presenting a poster tomorrow at Cosyne 26:
[3-033] Compositional computation via shared latent dynamics in low-rank RNNs.

With @avm.bsky.social, we explore how RNNs can reuse the same dynamics across different tasks, and what this implies for their connectivity and neural activity.

1 month ago 9 2 0 0
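For context, a minimal sketch of the kind of model behind this poster (a generic rank-one RNN, not the authors' code): with connectivity W = m nᵀ / N, the recurrent drive always points along m, so the whole population is summarized by a single latent variable κ(t). Giving n an overlap with m larger than one is an assumed parameter choice that makes κ bistable.

```python
import numpy as np

# Minimal rank-one RNN: W = outer(m, n) / N. The recurrent input is always
# proportional to m, so one scalar latent kappa(t) = n . tanh(x) / N
# summarizes the collective dynamics.
rng = np.random.default_rng(1)
N, dt = 500, 0.05
m = rng.standard_normal(N)
n = 2.0 * m + 0.5 * rng.standard_normal(N)   # overlap with m > 1 -> bistable latent

x = 0.1 * rng.standard_normal(N)
for _ in range(4000):
    kappa = n @ np.tanh(x) / N               # the 1-D latent
    x += dt * (-x + kappa * m)               # equals -x + W @ tanh(x)

print(f"latent kappa settled at {kappa:.3f} (sign depends on initialization)")
```

Stacking several such rank-one terms gives a low-rank RNN whose latents can, in principle, be shared across tasks, which is the setting the poster analyzes.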

Travelling to COSYNE seems to be the perfect opportunity to announce that I started my own lab at RWTH Aachen University earlier this year, funded by NRW's Ministry of Culture and Science through its Return Program. If you are at COSYNE and want to chat, please reach out!

1 month ago 31 5 5 1

If you're at #cosyne2026 in Lisbon, come check out Juan Carlos Fernández del Castillo's poster 1-007 tonight, based on our paper on efficient coding in olfaction (www.biorxiv.org/content/10.1...)!

1 month ago 17 2 0 0
Linking neural manifolds to circuit structure in recurrent networks Dimensionality reduction methods are widely used in neuroscience to investigate two complementary aspects of neural activity: the distribution of sing…

Excited to share our new paper to be published in Neuron!

With Valentin Schmutz @bio-emergent.bsky.social and Wulfram Gerstner @gerstnerlab.bsky.social, we explore how circuit structure in RNNs shapes network computation and single-neuron responses.
www.sciencedirect.com/science/arti...

1 month ago 37 11 1 1
Linking neural manifolds to circuit structure in recurrent networks Neural population activity can be described either by low-dimensional dynamics on neural manifolds or by single-neuron selectivities. Using a theoretical approach, Pezon et al. relate these two statistical descriptions to circuit structure in recurrent networks. Their results reveal both degeneracies and specific constraints in how circuit structure shapes neural activity.
1 month ago 39 10 0 0
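A toy illustration of the degeneracy the paper describes, under assumed toy settings (a 2-D ring latent and two hand-picked embeddings, not the paper's model): two networks can trace exactly the same low-dimensional manifold while their single-neuron selectivities differ completely.

```python
import numpy as np

# Same 2-D latent ring, two embeddings into N neurons: a dense "mixed" one
# and a sparse "selective" one. The manifold (covariance spectrum) is
# identical; single-neuron response profiles are not.
rng = np.random.default_rng(2)
N, T = 200, 1000
theta = np.linspace(0, 2 * np.pi, T)
z = np.stack([np.cos(theta), np.sin(theta)])        # 2-D latent trajectory

Q = np.linalg.qr(rng.standard_normal((N, 2)))[0]    # dense mixed embedding
S = np.zeros((N, 2))
S[: N // 2, 0] = 1.0                                # first half: latent dim 1 only
S[N // 2 :, 1] = 1.0                                # second half: latent dim 2 only
S /= np.linalg.norm(S, axis=0)                      # sparse selective embedding

for name, E in [("mixed", Q), ("selective", S)]:
    X = E @ z                                       # N x T neural activity
    evals = np.linalg.eigvalsh(np.cov(X))[::-1]
    print(f"{name:9s} top PCA eigenvalues: {np.round(evals[:3], 3)}, "
          f"neuron 0 latent weights: {np.round(E[0], 2)}")
```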
Home | Neuroscience | World Wide Theoretical Neuroscience Seminar WWTNS is a weekly digital seminar on Zoom targeting the theoretical neuroscience community. Its aim is to be a platform to exchange ideas among theoreticians.

Excited to be giving the van Vreeswijk Theoretical Neuroscience Seminar this Wednesday, Jan 14, where I'll talk about "Computation Through Neuronal-Synaptic Dynamics"!
www.wwtns.online

3 months ago 31 5 0 0

Our paper on data-constrained RNNs that generalize to optogenetic perturbations is now citable on eLife:
doi.org/10.7554/eLif...

4 months ago 43 18 1 2
A theory of multi-task computation and task selection Neural activity during the performance of a stereotyped behavioral task is often described as low-dimensional, occupying only a limited region in the space of all firing-rate patterns. This region has...

1/X Excited to present this preprint on multi-tasking, with @david-g-clark.bsky.social and Ashok Litwin-Kumar! Timely too, as "low-D manifold" has been trending again. (If you read through to the end, we escape Flatland and return to the glorious high-D world we deserve.) www.biorxiv.org/content/10.6...

4 months ago 85 20 1 2
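To make the Flatland point concrete, a hedged toy with assumed numbers (N = 400 neurons, K = 10 tasks, 2-D activity per task; not the paper's model): if each task lives on its own random plane, dimensionality is tiny within any one task but grows with the number of tasks.

```python
import numpy as np

# Each of K tasks occupies its own random 2-D plane in N-dimensional activity
# space. Dimensionality (participation ratio) is ~2 within a task but ~2K
# across the pooled activity.
rng = np.random.default_rng(3)
N, K, T = 400, 10, 500

def participation_ratio(X):
    lam = np.linalg.eigvalsh(np.cov(X))
    return lam.sum() ** 2 / (lam ** 2).sum()

tasks = []
for _ in range(K):
    plane = np.linalg.qr(rng.standard_normal((N, 2)))[0]   # random 2-D plane
    tasks.append(plane @ rng.standard_normal((2, T)))      # activity on that plane

print(f"within-task dimensionality: {participation_ratio(tasks[0]):.1f}")
print(f"pooled dimensionality:      {participation_ratio(np.hstack(tasks)):.1f}")
```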

🧵Excited to present our latest work at #NeurIPS25! Together with @avm.bsky.social, we discover 𝐜𝐡𝐚𝐧𝐧𝐞𝐥𝐬 𝐭𝐨 𝐢𝐧𝐟𝐢𝐧𝐢𝐭𝐲: regions in neural network loss landscapes where parameters diverge to infinity (in regression settings!)

We find that MLPs in these channels can take derivatives and compute GLUs 🤯

4 months ago 14 6 2 0
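A numerical sketch of the flavor of this result, not the paper's construction: a pair of tanh units whose output weights diverge as ±c while their inputs are offset by 1/c implements a finite-difference quotient, so in the c → ∞ limit the pair computes the derivative of its own activation.

```python
import numpy as np

# Two units with output weights +c and -c and input offsets differing by 1/c:
# c * (tanh(x + 1/c) - tanh(x)) -> d/dx tanh(x) = sech^2(x) as c -> infinity.
x = np.linspace(-2, 2, 5)
target = 1.0 - np.tanh(x) ** 2                     # the derivative of tanh

for c in [1, 10, 100, 1000]:
    pair = c * (np.tanh(x + 1.0 / c) - np.tanh(x))
    print(f"c={c:5d}  max error vs tanh'(x): {np.abs(pair - target).max():.2e}")
```

The error shrinks like 1/c, so the derivative-computing solution sits at the end of a channel where the parameters run off to infinity.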

Finally got the job ad. Looking for 2 PhD students to start next spring:

www.gao-unit.com/join-us/

If comp neuro, ML, and AI4Neuro are your thing, or you just nerd out over brain recordings, apply!

I'm at NeurIPS. DM me here / on the conference app, or email me if you want to meet 🏖️🌮

4 months ago 81 51 1 5
Different learning algorithms achieve shared optimal outcomes in humans, rats, and mice Animals must exploit environmental regularities to make adaptive decisions, yet the learning algorithms that enable this flexibility remain unclear. A central question across neuroscience, cognitive science, and machine learning is whether learning relies on generative or discriminative strategies. Generative learners build internal models of the sensory world itself, capturing its statistical structure; discriminative learners map stimuli directly onto choices, ignoring input statistics. These strategies rely on fundamentally different internal representations and entail distinct computational trade-offs: generative learning supports flexible generalisation and transfer, whereas discriminative learning is efficient but task-specific. We compared humans, rats, and mice performing the same auditory categorisation task, where category boundaries and rewards were fixed but sensory statistics varied. All species adapted their behaviour near-optimally, consistent with a normative observer constrained by sensory and decision noise. Yet their underlying algorithms diverged: humans predominantly relied on generative representations, mice on discriminative boundary-tracking, and rats spanned both regimes. Crucially, end-point performance concealed these differences; only learning trajectories and trial-to-trial updates revealed the divergence. These results show that similar near-optimal behaviour can mask fundamentally different internal representations, establishing a comparative framework for uncovering the hidden strategies that support statistical learning.

paper🚨
When we learn a category, do we learn the structure of the world, or just where to draw the line? In a cross-species study, we show that humans, rats & mice adapt optimally to changing sensory statistics, yet rely on fundamentally different learning algorithms.
www.biorxiv.org/content/10.1...

5 months ago 81 19 1 1
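A heavily simplified, hypothetical version of the contrast (1-D stimuli, two Gaussian categories; none of this is the study's actual task or fitting procedure): a generative learner tracks the class means and derives its boundary from them, while a discriminative learner nudges the boundary itself after errors. Both end near the optimal boundary; what they represent along the way differs.

```python
import numpy as np

# Toy generative vs discriminative category learning on a 1-D stimulus.
rng = np.random.default_rng(4)
mu, sigma = np.array([-1.0, 1.0]), 1.0    # true category means and noise

gen_mu = np.zeros(2)                      # generative: running class means
boundary, lr = 0.5, 0.05                  # discriminative: the boundary itself

for _ in range(2000):
    c = rng.integers(2)                           # true category
    s = mu[c] + sigma * rng.standard_normal()     # noisy stimulus
    gen_mu[c] += 0.02 * (s - gen_mu[c])           # generative: model the world
    pred = 1 if s > boundary else 0
    if pred != c:                                 # discriminative: error-driven
        boundary += lr * (s - boundary)           # nudge boundary toward stimulus

print(f"generative boundary (midpoint of model): {gen_mu.mean():.2f}")
print(f"discriminative boundary (tracked directly): {boundary:.2f}")
```

Both numbers land near the optimal boundary at 0, which is the study's point: end-point behavior alone cannot distinguish the two algorithms.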
Prediction of neural activity in connectome-constrained recurrent networks - Nature Neuroscience The authors show that connectome datasets alone are generally not sufficient to predict neural activity. However, pairing connectivity information with neural recordings can produce accurate predictio...

Connectome datasets alone are generally not sufficient to predict neural activity. However, pairing connectivity information with neural recordings can produce accurate predictions of activity in unrecorded neurons

www.nature.com/articles/s41...

5 months ago 45 11 0 1

This was a lot of fun! From my side, it started with a technical Q: what's the relation between the two-sided cavity method and path integrals? Turns out it's a fluctuation correction, and amazingly, this also enables the "O(N) rank" theory by @david-g-clark.bsky.social and @omarschall.bsky.social. 🤯

5 months ago 13 3 0 0

First paper from the lab!
We propose a model that separates the estimation of odor concentration and odor presence, and map it onto olfactory bulb circuits.
Led by @chenjiang01.bsky.social and @mattyizhenghe.bsky.social, in joint work with @jzv.bsky.social, @neurovenki.bsky.social, and @cpehlevan.bsky.social

5 months ago 36 13 2 1

Applying to do a postdoc or PhD in theoretical ML or neuroscience this year? Consider joining my group (starting next Fall) at UT Austin!
POD Postdoc: oden.utexas.edu/programs-and... CSEM PhD: oden.utexas.edu/academics/pr...

5 months ago 33 11 1 0

A study led by Cina Aghamohammadi is now out in ‪@natcomms.nature.com‬! We developed a mathematical framework for partitioning spiking variability, which revealed that spiking irregularity is nearly invariant for each neuron and decreases along the cortical hierarchy.
www.nature.com/articles/s41...

6 months ago 71 24 1 0
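For readers new to the statistic involved, a minimal sketch of the standard spiking-irregularity measure, the coefficient of variation (CV) of interspike intervals, on synthetic spike trains; the paper's partitioning framework goes well beyond this, so treat it only as the starting point.

```python
import numpy as np

# CV of interspike intervals (ISIs): ~1 for Poisson (irregular) spiking,
# ~0 for clock-like (regular) spiking.
rng = np.random.default_rng(5)

poisson_isis = rng.exponential(scale=0.05, size=10_000)    # irregular train
regular_isis = 0.05 + 0.002 * rng.standard_normal(10_000)  # near-periodic train

for name, isis in [("Poisson", poisson_isis), ("regular", regular_isis)]:
    cv = isis.std() / isis.mean()
    print(f"{name:8s} CV = {cv:.2f}")
```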

🎉 "High-dimensional neuronal activity from low-dimensional latent dynamics: a solvable model" will be presented as an oral at #NeurIPS2025 🎉

Feeling very grateful that reviewers and chairs appreciated concise mathematical explanations, in this age of big models.

www.biorxiv.org/content/10.1...
1/2

7 months ago 110 23 4 4
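A toy version of the title phenomenon, with assumed ingredients (a sinusoidal 1-D latent, random per-neuron gains and offsets; not the paper's solvable model): pushing a low-dimensional latent through heterogeneous single-neuron nonlinearities yields activity whose linear dimensionality is well above the latent dimension.

```python
import numpy as np

# A 1-D latent z(t) drives N neurons through heterogeneous tanh nonlinearities.
# The resulting activity has participation ratio well above 1.
rng = np.random.default_rng(6)
N, T = 300, 2000
z = np.sin(np.linspace(0, 20 * np.pi, T))                      # 1-D latent
gains = rng.uniform(0.5, 4.0, size=N)
offsets = rng.uniform(-1.0, 1.0, size=N)
X = np.tanh(gains[:, None] * (z[None, :] - offsets[:, None]))  # N x T activity

lam = np.linalg.eigvalsh(np.cov(X))[::-1]
pr = lam.sum() ** 2 / (lam ** 2).sum()
print(f"latent dimension: 1, participation ratio of activity: {pr:.1f}")
```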

Lab members are at the Bernstein conference @bernsteinneuro.bsky.social with 9 posters! Here’s the list:

TUESDAY 16:30 – 18:00

P1 62 “Measuring and controlling solution degeneracy across task-trained recurrent neural networks” by @flavioh.bsky.social

6 months ago 9 3 1 0

Awesome. Congratulations!!

6 months ago 1 0 0 0

Check out our new preprint, where we analyzed the dynamics of over ten thousand neurons across 223 brain areas and found a surprising universal principle describing the organization of intrinsic timescales across the entire mouse brain, including subcortical structures!
#neuroskyence

7 months ago 36 5 0 0
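As background on the quantity being organized, a minimal sketch of the usual intrinsic-timescale estimator (fit an exponential decay to the activity autocorrelation), run on a synthetic Ornstein-Uhlenbeck trace with a known timescale; all parameters are illustrative.

```python
import numpy as np

# Estimate an intrinsic timescale by fitting an exponential to the
# autocorrelation of an Ornstein-Uhlenbeck surrogate for ongoing activity.
rng = np.random.default_rng(7)
tau_true, dt, T = 50.0, 1.0, 200_000

x = np.zeros(T)
for t in range(1, T):
    x[t] = x[t - 1] * (1 - dt / tau_true) + np.sqrt(dt) * rng.standard_normal()

lags = np.arange(1, 100)
ac = np.array([np.corrcoef(x[:-k], x[k:])[0, 1] for k in lags])
slope = np.polyfit(lags, np.log(np.clip(ac, 1e-12, None)), 1)[0]  # log-linear fit
print(f"true tau = {tau_true}, estimated tau = {-1.0 / slope:.1f}")
```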
Frontiers | Summary statistics of learning link changing neural representations to behavior How can we make sense of large-scale recordings of neural activity across learning? Theories of neural network learning with their origins in statistical phy...

Since I'm back on Bluesky: with @frostedblakess.bsky.social and @cpehlevan.bsky.social, we wrote a brief perspective on how ideas about summary statistics from the statistical physics of learning could help inform neural data analysis... (1/2)

7 months ago 34 10 1 0
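A hedged toy of what "summary statistics of learning" can mean, in an assumed teacher-student setup (not the perspective's own example): although the student weights are 500-dimensional, a single macroscopic order parameter, the student-teacher overlap ρ, tracks the whole learning curve.

```python
import numpy as np

# Teacher-student learning: online LMS updates on 500-dimensional weights,
# summarized by the scalar overlap rho between student and teacher.
rng = np.random.default_rng(8)
N = 500
teacher = rng.standard_normal(N) / np.sqrt(N)
student = np.zeros(N)
lr = 1.0 / N                                  # stable step size for LMS

for step in range(5001):
    x = rng.standard_normal(N)
    student += lr * (teacher @ x - student @ x) * x   # error-driven update
    if step % 1000 == 0:
        denom = np.linalg.norm(teacher) * np.linalg.norm(student) + 1e-12
        print(f"step {step:4d}  overlap rho = {teacher @ student / denom:.3f}")
```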
Convergent motifs of early olfactory processing are recapitulated by layer-wise efficient coding The architecture of early olfactory processing is a striking example of convergent evolution. Typically, a panel of broadly tuned receptors is selectively expressed in sensory neurons (each neuron exp...

Excited to share new computational work, led by @jzv.bsky.social, driven by Juan Carlos Fernandez del Castillo, with a contribution from Farhad Pashakanloo. We recover 3 core motifs in the olfactory system of evolutionarily distant animals using a biophysically-grounded model + efficient coding ideas!

7 months ago 24 12 0 1
Associative synaptic plasticity creates dynamic persistent activity In biological neural circuits, the dynamics of neurons and synapses are tightly coupled. We study the consequences of this coupling and show that it enables a novel form of working memory. In recurren...

(1/26) Excited to share a new preprint led by grad student Albert Wakhloo, with me and Larry Abbott: "Associative synaptic plasticity creates dynamic persistent activity."
www.biorxiv.org/content/10.1...

7 months ago 38 9 1 0
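A two-variable caricature of the mechanism, under assumed parameters (the preprint's persistent activity is genuinely dynamic, whereas this toy settles into a fixed point, so it conveys only the flavor): a rate neuron coupled to a Hebbian synapse forms a bistable loop, and a brief input leaves both in a self-sustained high state.

```python
import numpy as np

# Rate neuron r coupled to a Hebbian synapse w. A transient input pulse
# switches the joint neuron-synapse system into a persistent active state.
dt, tau_r, tau_w, eta = 0.1, 1.0, 20.0, 3.0
r, w = 0.0, 0.0
for t in range(4000):
    stim = 1.5 if 100 <= t < 400 else 0.0          # brief input pulse
    r += dt / tau_r * (-r + np.tanh(w * r + stim)) # neuronal dynamics
    w += dt / tau_w * (-w + eta * r ** 2)          # associative plasticity
print(f"long after stimulus offset: r = {r:.2f}, w = {w:.2f}")
```

Without the pulse, (r, w) = (0, 0) is stable; with it, the activity outlives the stimulus because neuron and synapse sustain each other.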
Theoretical neuroscience has room to grow Nature Reviews Neuroscience - The goal of theoretical neuroscience is to uncover principles of neural computation through careful design and interpretation of mathematical models. Here, I examine...

I wrote a Comment on neurotheory, and now you can read it!

Some thoughts on where neurotheory has and has not taken root within the neuroscience community, how it has shaped those subfields, and where we theorists might look next for fresh adventures.

www.nature.com/articles/s41...

8 months ago 151 52 8 3
Preview
Connectivity structure and dynamics of nonlinear recurrent neural networks Studies of the dynamics of nonlinear recurrent neural networks often assume independent and identically distributed couplings, but large-scale connectomics data indicate that biological neural circuit...

Wanted to share a new version (much cleaner!) of a preprint on how connectivity structure shapes collective dynamics in nonlinear RNNs. Neural circuits have highly non-iid connectivity (e.g., rapidly decaying singular values, structured singular-vector overlaps), unlike classical random RNN models.

8 months ago 40 9 1 0
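To illustrate the contrast drawn here, a small sketch with assumed spectra (an iid Gaussian coupling matrix versus a structured one whose singular values decay as a power law, as connectomics datasets suggest):

```python
import numpy as np

# iid Gaussian couplings have a flat bulk of singular values; a
# connectomics-inspired matrix has a rapidly decaying singular spectrum.
rng = np.random.default_rng(9)
N = 300

J_iid = rng.standard_normal((N, N)) / np.sqrt(N)
U = np.linalg.qr(rng.standard_normal((N, N)))[0]   # random singular vectors
V = np.linalg.qr(rng.standard_normal((N, N)))[0]
s = 2.0 / np.arange(1, N + 1)                      # power-law singular values
J_struct = U @ np.diag(s) @ V.T

for name, J in [("iid", J_iid), ("structured", J_struct)]:
    sv = np.linalg.svd(J, compute_uv=False)
    print(f"{name:10s} top-5 singular values: {np.round(sv[:5], 2)}")
```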