Our Brain&AI team is releasing NeuralSet! 🚀🚀🚀
A fast, simple, and scalable package for Neuro-AI:
📦 `pip install neuralset`
💻 Code: lnkd.in/eamwxBUY
📄 Paper: lnkd.in/epbreyDy
As a PhD student working extensively with neuro recordings and AI models, I find this library to be a game-changer.
New package release from our amazing team 🎉🎉🎉
Using NeuralSet, you can go from a directory of downloaded data files to analysis-ready / AI-ready tensors with only a couple of lines of code
Time to power up 💪
S Panchavati, A Ratouchniak, M Careil & Jérémy Rapin.
🙏 Many thanks to everyone involved and happy coding :)
#Python #NeuroAI #OpenSource
Y Benchetrit @sdascoli.bsky.social, S Dahan,
H Banville @teonbrooks.com K Begany, S Khanna @pierreorhan.bsky.social, A Thual @honualx.bsky.social @corentinbel.bsky.social, J Bonnaire, C Caucheteux, T Desbordes, P Diego-Simón, J Millet, ...
🙏 This package is made possible thanks to the neuroscience software stack: MNE-Python, Nilearn, HuggingFace, OpenNeuro, and many more
Made possible thanks to J Raugel @jarodlevy.bsky.social, L Evanson @lucyzmf.bsky.social, @juliengadonneix.bsky.social, A Santos, S Houhamdi ...
…and more:
🎨 Decoding images from fMRI — DynaDiff: bsky.app/profile/jean...
👶 Modeling the development of language in iEEG: bsky.app/profile/jean...
⌨️ Decoding words from MEG: bsky.app/profile/jean...
NeuralSet already powers our Neuro-AI studies:
🧠🎬 Modeling fMRI responses to audio & video — Tribe v2: bsky.app/profile/sdas...
🗣️ Decomposing language processing with Utah arrays (@juliengadonneix.bsky.social): bsky.app/profile/juli...
👁️ Comparing fMRI and DINOv3: bsky.app/profile/jean...
NeuralSet fills a specific gap:
the hierarchical preprocessing and data loading of all types of brain recordings and stimulus modalities.
Caching and cluster dispatch come for free.
Switch `infra={...}` and the same Study + Extractors run locally, reuse an on-disk cache, or fan out on SLURM, with no other code change.
Prototype on a laptop, scale to 100 subjects with one kwarg.
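For a feel of that dispatch pattern, here is a generic, simplified sketch (not NeuralSet's actual API; see the linked docs for that): the same per-subject function runs locally, through an on-disk cache, or on SLURM, depending only on an `infra` dict.
```python
# Generic illustration of the "one infra kwarg" idea (not NeuralSet itself):
# the same per-subject function runs locally, with an on-disk cache, or on SLURM,
# depending only on the `infra` dict passed in.
import hashlib
import pickle
from pathlib import Path


def run(fn, subjects, infra):
    backend = infra.get("backend", "local")
    if backend == "local":
        return [fn(s) for s in subjects]
    if backend == "cache":
        cache_dir = Path(infra["cache_dir"])
        cache_dir.mkdir(parents=True, exist_ok=True)
        results = []
        for s in subjects:
            key = hashlib.sha1(repr((fn.__name__, s)).encode()).hexdigest()
            path = cache_dir / f"{key}.pkl"
            if path.exists():
                results.append(pickle.loads(path.read_bytes()))
            else:
                out = fn(s)
                path.write_bytes(pickle.dumps(out))
                results.append(out)
        return results
    if backend == "slurm":
        import submitit  # real package commonly used for SLURM dispatch
        executor = submitit.AutoExecutor(folder=infra.get("log_dir", "slurm_logs"))
        executor.update_parameters(timeout_min=60,
                                   slurm_partition=infra.get("partition", "gpu"))
        jobs = [executor.submit(fn, s) for s in subjects]
        return [job.result() for job in jobs]
    raise ValueError(f"unknown backend: {backend}")


# Prototype on a laptop...
features = run(lambda s: f"features for {s}", ["sub-01", "sub-02"],
               infra={"backend": "local"})
# ...then scale out by changing only the infra dict:
# features = run(extract_fn, subjects, infra={"backend": "slurm", "partition": "gpu"})
```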
How this works:
NeuralSet provides a modular pipeline:
Study → Events → Transforms → Extractors → Segmenter → Batch.
In fewer than 10 lines of code, you get a scalable PyTorch `DataLoader` of preprocessed neural recordings and stimulus embeddings.
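A hypothetical sketch of what those ~10 lines could look like (class names mirror the diagram above, but every signature here is an assumption rather than the documented API; see the docs link below for the real thing):
```python
# Hypothetical sketch only: names follow the pipeline above, signatures are assumptions.
from neuralset import Study, Extractor, Segmenter  # hypothetical imports
from torch.utils.data import DataLoader

study = Study("path/to/downloaded_data")             # recordings + stimuli + events
extractors = [Extractor("meg"), Extractor("audio")]  # neural signals and stimulus embeddings
dataset = study.build(extractors=extractors,
                      segmenter=Segmenter(window=2.0),  # aligned windows around events
                      infra={"backend": "local"})       # or cache / SLURM, as above
loader = DataLoader(dataset, batch_size=32)          # standard PyTorch training from here
```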
We're happy to release NeuralSet: a simple, fast, scalable package for Neuro-AI
Supports:
🧠 fMRI, EEG, MEG, iEEG, spikes… preprocessing
💬 text 🔊 audio ▶️ video 🏞️ image… embeddings
📦 pip install neuralset
🔍 facebookresearch.github.io/neuroai/neur...
📄 kingjr.github.io/files/neural...
🧵 Details👇
Yes, the project is deliberately shooting for an ambitious target. The committee will evaluate based on scientific impact, not strictly the numbers. HTH
I'm proud to be part of the Scientific Committee for the new $5M Digital Brain Project, to accelerate development of open source models of the human brain. Apply by May 15th for funding at digitalbrainproject.org
🙏 Thanks to
Scientific Committee:
@arthurmensch.bsky.social, AM Karmarrec, JA Sahel, L Melloni, @lune-bellec.bsky.social, @russpoldrack.org
Exec Committee: A Yavchitz, E Cascardi, G. Le Hénanff, J Boyle, JR King, P Bourdillon
Hôpital de la Fondation Rothschild, Meta, Université de Montréal
🧠 the Digital Brain Project is now live:
$5M total · up to $500k per selected team
Let's open-source the modeling of human brain activity!
➡️Apply on: digitalbrainproject.org
We're excited to share our new study on decoding brain activity in participants with post-stroke aphasia! We think this is an important step towards cognitive brain-computer interfaces for patients with language disorders
www.biorxiv.org/content/10.6...
1/8
🧪🧠📈 some very exciting work from my colleague is now out! Come check it out! 🤩🤩🤩
Great work
@juliengadonneix.bsky.social, @lucyzmf.bsky.social, @jeanremiking.bsky.social, and team!
🥳🥳🥳
🥳 Very pleased with our latest study on the coordination of language representations in small cortical patches of the human brain 🧠:
(I’m secretly hoping this will be asked by the reviewers so that I can convince @sdascoli.bsky.social we should do it :))
no :’( we struggled with basic preprocessing and decided to drop it, to give it a proper chance in our next version
Our latest work, on modeling human brain responses to sight, sound and language at scale:
Great opportunity to work in a terrific lab:
Novel study: Emergence of Phonemic, Syntactic, and Semantic Representations in Neural Networks.
Joint work with P. Diego; supervision: Y. Lakretz, E. Chemla, Y. Boubenec, @jeanremiking.bsky.social
We explore the emergence of 3 linguistic structures in Neural Networks.
arXiv: arxiv.org/abs/2601.18617
📣 Our latest brain-to-text decoding results from our Brain team are out:
"Towards decoding individual words from non-invasive brain recordings"
📄 www.nature.com/articles/s41...
👥 Led by Stéphane d'Ascoli, with Corentin Bel, Jérémy Rapin, Hubert Banville, Yohann Benchetrit and Christophe Pallier
Thanks to Meta and ENS for their support. All open-source models are on HuggingFace. Thanks also to K Armeni and JM Schoffelen for the great MEG dataset (nature.com/articles/s41...), thanks to the community, and see you at
@NeuripsConf!
This study strengthens a surprising phenomenon:
Even though brains and LLMs differ in many ways (architecture, modality, and learning goals), they converge on a similar sequence of computations.
Understanding why this convergence emerges may help bridge biological and artificial intelligence.
Two more facts:
1. This temporal alignment is not explained by word predictability.
2. Bidirectional models (which, unlike the brain, are not inferential) do not converge toward similar sequential dynamics.
However, this temporal alignment does not emerge automatically.
For the same model and configuration, it seems to depend on:
• Model size (larger → more brain-like)
• Context length (longer context → more brain-like)
Untrained models show almost no alignment.
LLMs build representations in the same order as the brain:
➡️ First layers ↔ early brain responses
➡️ Deep layers ↔ later brain responses
This temporal alignment is robust (r=0.99, p<1e-6) and holds across architectures (transformers & recurrent), sizes, and training regimes.
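To make that alignment measure concrete, here is a toy recipe (the latencies are made up; only the computation is the point): correlate each layer's relative depth with the time at which that layer best predicts the brain response.
```python
# Toy illustration of the layer-depth vs. brain-latency alignment described above.
# The peak latencies below are fake, for illustration only.
import numpy as np
from scipy.stats import pearsonr

n_layers = 12
layer_depth = np.arange(n_layers) / (n_layers - 1)   # 0 = first layer, 1 = last layer
rng = np.random.default_rng(0)
peak_latency = np.linspace(0.1, 0.6, n_layers) + rng.normal(0, 0.01, n_layers)
# seconds at which each layer's encoding score peaks (hypothetical values)

r, p = pearsonr(layer_depth, peak_latency)
print(f"layer depth vs. peak latency: r={r:.2f}, p={p:.1e}")
```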