Presenting a poster tomorrow at Cosyne 26:
[3-033] Compositional computation via shared latent dynamics in low-rank RNNs.
With @avm.bsky.social, we explore how RNNs can reuse the same dynamics across different tasks, and what this implies for their connectivity and neural activity.
Posts by Louis Pezon
Travelling to COSYNE seems to be the perfect opportunity to announce that I started my own lab at RWTH Aachen University earlier this year, funded by NRW's Ministry of Culture and Science through its Return Program. If you are at COSYNE and want to chat, please reach out!
[About the theory] To tackle this problem, we developed a unifying framework for the theory of neural fields and low-rank RNNs. We show that low-rank nets rely on an implicit spatial structure, and conversely, that neural fields obey latent dynamics akin to low-rank nets.
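To make the latent-dynamics picture concrete, here is a minimal numerical sketch (my own toy example with illustrative parameters, not the model or code from the paper): in a rank-1 RNN with connectivity J = m nᵀ / N, the N-dimensional activity collapses onto the direction m, and a single latent variable κ captures the dynamics.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500                               # number of neurons (illustrative)
m = rng.standard_normal(N)            # right connectivity vector
n = 2.4 * m + rng.standard_normal(N)  # left vector; overlap n·m/N ≈ 2.4 > 1
J = np.outer(m, n) / N                # rank-1 connectivity matrix

dt, steps = 0.1, 2000
x = rng.standard_normal(N)            # random initial state, off the m-direction
for _ in range(steps):
    x = x + dt * (-x + J @ np.tanh(x))  # Euler step of x' = -x + J tanh(x)

# After transients, activity is one-dimensional: x ≈ kappa * m
kappa = (x @ m) / (m @ m)             # latent variable: projection onto m
residual = np.linalg.norm(x - kappa * m) / np.linalg.norm(x)
print(f"kappa = {kappa:.3f}, off-manifold residual = {residual:.2e}")
```

Because the overlap n·m/N exceeds 1, the zero state is unstable and the latent variable settles at a nonzero fixed point, while the component of x orthogonal to m decays away.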
3. We find that circuit structure
– can impose symmetries on the network's low-dim. dynamics
– constrains the topology of the set of all single-neuron responses.
From the second point, we pinpoint topological features of neural activity that are relevant when comparing data with models.
2. We find that “function does not determine form”: the same computation (i.e., the same low-dim. dynamics) can be implemented by networks with different circuit structures. Yet, circuit structure imposes subtle constraints on neural activity (see next).
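A toy linear example of "function does not determine form" (my own construction, not taken from the paper): two rank-1 networks whose connectivity vectors look completely different, but which share the single quantity the latent dynamics depend on (the overlap n·m/N) and therefore produce identical latent trajectories.

```python
import numpy as np

rng = np.random.default_rng(1)
N, dt, steps = 400, 0.05, 200

def latent_trajectory(m, n, kappa0=1.0):
    """Euler-simulate the linear rank-1 RNN x' = -x + (m n^T / N) x,
    starting at x = kappa0 * m, and return the latent variable over time."""
    x = kappa0 * m
    traj = []
    for _ in range(steps):
        x = x + dt * (-x + m * (n @ x) / N)
        traj.append((x @ m) / (m @ m))   # latent: projection onto m
    return np.array(traj)

overlap = 0.5  # n·m / N -- the only parameter the latent dynamics see

# Network A: Gaussian connectivity vectors
mA = rng.standard_normal(N)
nA = (overlap * N / (mA @ mA)) * mA

# Network B: binary +/-1 vectors -- a very different circuit structure
mB = rng.choice([-1.0, 1.0], size=N)
nB = overlap * mB                     # mB·mB = N, so nB·mB / N = overlap

trajA = latent_trajectory(mA, nA)
trajB = latent_trajectory(mB, nB)
print("max difference between latent trajectories:",
      np.max(np.abs(trajA - trajB)))
```

Despite the different vectors, both networks realize the same one-dimensional dynamics κ' = (overlap − 1) κ, so their latent trajectories agree to numerical precision.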
1. Neural activity is often analyzed in two ways:
– low-dimensional dynamics in neural manifolds reflect computations
– single-neuron response properties (e.g., tuning, selectivity) can describe how the neural population is organized.
But it's not clear how this relates to network connectivity.
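Here is a small synthetic illustration of those two views (my own toy data, not from the paper): a population that linearly mixes a shared 2-D latent rotation. PCA on the population recovers the low-dimensional dynamics, while the mixing weights give each neuron a "preferred phase", i.e. a single-neuron tuning property.

```python
import numpy as np

rng = np.random.default_rng(2)
N, T = 100, 500
t = np.linspace(0, 4 * np.pi, T)

# Toy population: each neuron linearly reads out a shared 2-D latent rotation
latents = np.stack([np.cos(t), np.sin(t)])   # (2, T) low-dim dynamics
weights = rng.standard_normal((N, 2))        # per-neuron mixing weights
rates = weights @ latents                    # (N, T) population activity

# View 1: low-dim dynamics -- the neural manifold is ~2-dimensional
centered = rates - rates.mean(axis=1, keepdims=True)
eigvals = np.linalg.eigvalsh(centered @ centered.T)[::-1]
var_top2 = eigvals[:2].sum() / eigvals.sum()

# View 2: single-neuron responses -- each cell has a preferred phase
pref_phase = np.arctan2(weights[:, 1], weights[:, 0])

print(f"variance captured by top 2 PCs: {var_top2:.3f}")
```

In this toy case the two descriptions are two faces of the same data: the latent trajectory lives in the top two PCs, and each neuron's tuning (its preferred phase) is fixed by its weights onto those latents.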
Excited to share our new paper to be published in Neuron!
With Valentin Schmutz @bio-emergent.bsky.social and Wulfram Gerstner @gerstnerlab.bsky.social, we explore how circuit structure in RNNs shapes network computation and single-neuron responses.
www.sciencedirect.com/science/arti...