
Posts by Avery HW Ryoo

The curriculum effect in visual learning: the role of readout dimensionality
Generalization of visual perceptual learning (VPL) to unseen conditions varies across tasks. Previous work suggests that training curriculum may be integral to generalization, yet a theoretical explan...

🚨 New preprint alert!

🧠🤖
We propose a theory of how learning curriculum affects generalization through neural population dimensionality. Learning curriculum is a determining factor of neural dimensionality: where you start from determines where you end up.
🧠📈

A 🧵:

tinyurl.com/yr8tawj3

6 months ago

Excited to share that POSSM has been accepted to #NeurIPS2025! See you in San Diego 🏖️

7 months ago
Neural Interfaces
Neural Interfaces is a comprehensive book on the foundations, major breakthroughs, and most promising future developments of neural interfaces. The bo...

I'm very excited to announce our new book Neural Interfaces, published by Elsevier. The book is a comprehensive resource for everyone interested in neural interfaces and brain-computer interfaces (BCIs).

shop.elsevier.com/books/neural...

8 months ago

🐐

9 months ago

Step 1: Understand how scaling improves LLMs.
Step 2: Directly target underlying mechanism.
Step 3: Improve LLMs independent of scale. Profit.

In our ACL 2025 paper we look at Step 1 in terms of training dynamics.

Project: mirandrom.github.io/zsl
Paper: arxiv.org/pdf/2506.05447

9 months ago

(1/n) 🚨 Train a model that solves DFT for any geometry with almost no training data
Introducing Self-Refining Training for Amortized DFT: a variational method that predicts ground-state solutions across geometries and generates its own training data!
📜 arxiv.org/abs/2506.01225
💻 github.com/majhas/self-...
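The self-refining idea can be sketched in a few lines: the model proposes a solution for each sampled geometry, the proposal is nudged downhill on a variational objective, and the refined proposal becomes the model's own training target. Everything below (the one-dimensional "geometry", the toy objective, and `AmortizedSolver`) is a hypothetical stand-in for illustration, not the paper's actual method or API.

```python
import random

def energy_grad(geometry, guess):
    # Toy variational objective E = (guess - geometry**2)**2; its minimum
    # stands in for the ground-state solution at this geometry.
    return 2.0 * (guess - geometry ** 2)

class AmortizedSolver:
    """Trivial amortized model: guess = w * geometry**2 (true answer at w = 1)."""
    def __init__(self):
        self.w = 0.0

    def predict(self, g):
        return self.w * g ** 2

    def train_step(self, g, target, lr=0.5):
        # Least-squares step pulling w toward target / g**2.
        grad = 2.0 * (self.predict(g) - target) * g ** 2
        self.w -= lr * grad / (1e-8 + g ** 4)

def self_refining_train(model, steps=300, seed=0):
    rng = random.Random(seed)
    for _ in range(steps):
        g = rng.uniform(0.5, 2.0)                        # sample a geometry
        guess = model.predict(g)                         # model proposes a solution
        refined = guess - 0.25 * energy_grad(g, guess)   # refine via the objective
        model.train_step(g, refined)                     # train on self-generated data
    return model

solver = self_refining_train(AmortizedSolver())
```

With no labelled data at all, the solver converges to the correct amortized map (here, `w ≈ 1`) purely by training on its own refined proposals.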

10 months ago
Manitokan are images set up where one can bring a gift or receive a gift. 1930s Rocky Boy Reservation, Montana, Montana State University photograph. Colourized with AI

Preprint Alert 🚀

Multi-agent reinforcement learning (MARL) often assumes that agents know when other agents cooperate with them. But for humans, this isn't always the case. For example, Plains Indigenous groups used to leave resources for others to use at effigies called Manitokan.
1/8

10 months ago
Generalizable, real-time neural decoding with hybrid state-space models
Real-time decoding of neural activity is central to neuroscience and neurotechnology applications, from closed-loop experiments to brain-computer interfaces, where models are subject to strict latency...

Stay tuned for the project page and code, coming soon!

Link: arxiv.org/abs/2506.05320

A big thank you to my co-authors: @nandahkrishna.bsky.social*, @ximengmao.bsky.social*, @mehdiazabou.bsky.social, Eva Dyer, @mattperich.bsky.social, and @glajoie.bsky.social!

🧵 7/7

10 months ago

Finally, we show POSSM's performance on speech decoding, a long-context task that can quickly grow expensive for Transformers. In the unidirectional setting, POSSM beats the GRU baseline, achieving a phoneme error rate (PER) of 27.3 while being more robust to variation in preprocessing.

🧵 6/7

10 months ago

Cross-species transfer! 🐵➡️🧑

Excitingly, we find that POSSM pretrained solely on monkey reaching data achieves SOTA performance when decoding imagined handwriting in human subjects! This shows the potential of leveraging NHP data to bootstrap human BCI decoding in low-data clinical settings.

🧵 5/7

10 months ago

By pretraining on 140 monkey reaching sessions, POSSM effectively transfers to new subjects and tasks, matching or outperforming several baselines (e.g., GRU, POYO, Mamba) across sessions.

✅ High R² across the board
✅ 9× faster inference than Transformers
✅ <5 ms latency per prediction

🧵 4/7

10 months ago

POSSM combines the real-time inference of an RNN with the tokenization, pretraining, and finetuning abilities of a Transformer!

Using POYO-style tokenization, we encode spikes in 50ms windows and stream them to a recurrent model (e.g., Mamba, GRU) for fast, frequent predictions over time.
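As a rough illustration of this streaming pattern, a minimal sketch with random weights (not the actual POSSM implementation, and a crude pooling stand-in for POYO's per-spike tokenizer) can pool unit embeddings for the spikes in each 50 ms window into a token and update a GRU state once per window, so a fresh prediction is available every 50 ms:

```python
import numpy as np

rng = np.random.default_rng(0)
N_UNITS, D_MODEL = 8, 16

# Hypothetical "learned" parameters -- random here, for illustration only.
W_embed = rng.normal(scale=0.1, size=(N_UNITS, D_MODEL))  # per-unit embeddings
W_z = rng.normal(scale=0.1, size=(2 * D_MODEL, D_MODEL))  # GRU update gate
W_r = rng.normal(scale=0.1, size=(2 * D_MODEL, D_MODEL))  # GRU reset gate
W_h = rng.normal(scale=0.1, size=(2 * D_MODEL, D_MODEL))  # GRU candidate
W_out = rng.normal(scale=0.1, size=(D_MODEL, 2))          # hidden -> 2-D velocity

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(h, x):
    """One standard GRU update: constant work per incoming token."""
    hx = np.concatenate([h, x])
    z = sigmoid(hx @ W_z)
    r = sigmoid(hx @ W_r)
    h_tilde = np.tanh(np.concatenate([r * h, x]) @ W_h)
    return (1.0 - z) * h + z * h_tilde

def decode_stream(spike_times, unit_ids, window_ms=50.0, t_end_ms=500.0):
    """Pool unit embeddings per 50 ms window, stream tokens through the GRU."""
    h = np.zeros(D_MODEL)
    preds = []
    edges = np.arange(0.0, t_end_ms + window_ms, window_ms)
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_win = (spike_times >= lo) & (spike_times < hi)
        token = W_embed[unit_ids[in_win]].sum(axis=0)  # crude spike pooling
        h = gru_step(h, token)       # O(1) state update per window
        preds.append(h @ W_out)      # a fresh prediction every 50 ms
    return np.array(preds)

spike_times = rng.uniform(0.0, 500.0, size=200)
unit_ids = rng.integers(0, N_UNITS, size=200)
velocities = decode_stream(spike_times, unit_ids)  # shape (10, 2)
```

Because the recurrent state carries the context, inference cost per prediction stays constant no matter how long the session runs, which is what makes the latency claims above plausible.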

🧵 3/7

10 months ago

The problem with existing decoders?

😔 RNNs offer efficient, causal inference, but rely on rigid, binned input formats, limiting generalization to new neurons or sessions.

😔 Transformers enable generalization via tokenization, but incur high computational costs due to the attention mechanism.
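The asymmetry can be seen with back-of-envelope operation counts (the constants below are illustrative, not measurements of any real model): generating one new prediction with causal attention touches every previous token, while a recurrent update does constant work regardless of context length.

```python
# Rough multiply-add counts per new prediction. Constants are illustrative
# only; real costs depend on heads, layers, and kernel implementations.

def attention_ops_per_step(context_len, d_model):
    # The new token attends to every previous token: ~2 * T * d ops,
    # so O(T^2) cumulative over a sequence of length T.
    return 2 * context_len * d_model

def rnn_ops_per_step(d_model):
    # A GRU-style update is a handful of matrix-vector products: ~6 * d^2 ops,
    # independent of how much context has been consumed.
    return 6 * d_model ** 2

d = 256
cost_short = attention_ops_per_step(10, d) / rnn_ops_per_step(d)
cost_long = attention_ops_per_step(10_000, d) / rnn_ops_per_step(d)
# Attention's per-step cost grows linearly with context length, while the
# recurrent cost stays flat -- at long contexts the ratio flips decisively.
```

At short contexts attention is cheap, but its per-step cost scales with context length, which is exactly why long streaming sessions favor a recurrent state.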

🧵 2/7

10 months ago

New preprint! 🧠🤖

How do we build neural decoders that are:
⚡️ fast enough for real-time use
🎯 accurate across diverse tasks
🌍 generalizable to new sessions, subjects, and even species?

We present POSSM, a hybrid SSM architecture that optimizes for all three of these axes!

🧵 1/7

10 months ago

I am joining @ualberta.bsky.social as a faculty member and @amiithinks.bsky.social!

My research group is recruiting MSc and PhD students at the University of Alberta in Canada. Research topics include generative modeling, representation learning, interpretability, inverse problems, and neuroAI.

10 months ago
POYO+: Multi-session, multi-task neural decoding from distinct cell types and brain regions

Scaling models across multiple animals was a major step toward building neuro-foundation models; the next frontier is enabling multi-task decoding to expand the scope of training data we can leverage.

Excited to share our #ICLR2025 Spotlight paper introducing POYO+ 🧠

poyo-plus.github.io

🧡

11 months ago

Interested in foundation models for #neuroscience? Want to contribute to the development of the next generation of multi-modal models? Come join us at IVADO in Montreal!

We're hiring a full-time machine learning specialist for this work.

Please share widely!

#NeuroAI 🧠📈 🧪

1 year ago

πŸ“½οΈRecordings from our
@cosynemeeting.bsky.social
#COSYNE2025 workshop on β€œAgent-Based Models in Neuroscience: Complex Planning, Embodiment, and Beyond" are now online: neuro-agent-models.github.io
πŸ§ πŸ€–

1 year ago

Talk recordings from our COSYNE Workshop on Neuro-foundation Models 🌐🧠 are now up on the workshop website!

neurofm-workshop.github.io

1 year ago

Very late, but had a 🔥 time at my first Cosyne presenting my work with @nandahkrishna.bsky.social, Ximeng Mao, @mattperich.bsky.social, and @glajoie.bsky.social on real-time neural decoding with hybrid SSMs. Keep an eye out for a preprint (hopefully) soon 👀

#Cosyne2025 @cosynemeeting.bsky.social

1 year ago

Excited to be at #Cosyne2025 for the first time! I'll be presenting my poster [2-104] during the Friday session. E-poster here: www.world-wide.org/cosyne-25/se...

1 year ago

We'll be presenting two projects at #Cosyne2025, representing two main research directions in our lab:

🧠🤖 🧠📈

1/3

1 year ago

@oliviercodol.bsky.social my opportunity to lose to scientists in a different field

1 year ago

Just a couple of days until Cosyne: stop by [3-083] this Saturday and say hi! @nandahkrishna.bsky.social

1 year ago

This will be a more difficult Cosyne than normal, due to both the travel restrictions for people coming from the US and the strike that may be happening at the hotel in Montreal.

But, we can still make this an awesome meeting as usual, y'all. Let's pull together and make it happen!

🧠📈
#Cosyne2025

1 year ago

Hi! Currently there are no plans to livestream, but we may *potentially* post recordings in the future (contingent on speaker permission)

1 year ago

Join us at #COSYNE2025 to explore recent advancements in large-scale training and analysis of brain data! 🧠🟦

We also made a starter pack with (most of) our speakers: go.bsky.app/Ss6RaEF

1 year ago
COSYNE 2025 Workshop - Building a foundation model for the brain
Join us to explore neuro-foundation models. March 31–April 1, 2025 in Mont-Tremblant, Canada.

We have a great lineup of speakers and panelists; you can check out our schedule here: neurofm-workshop.github.io. Co-organized with: @mehdiazabou.bsky.social, @nandahkrishna.bsky.social, @colehurwitz.bsky.social, Eva Dyer, and @tyrellturing.bsky.social. We hope to see you there!

1 year ago

How can large-scale models + datasets revolutionize neuroscience 🧠🤖🌐? We are excited to announce our workshop: “Building a foundation model for the brain: datasets, theory, and models” at @cosynemeeting.bsky.social #COSYNE2025. Join us in Mont-Tremblant, Canada from March 31 – April 1!

1 year ago

forms.gle/1DPPVe8KLRWD...

Here's a Google Form for ease!

1 year ago