Interested in how animals learn a sensory decision-making task from scratch? Come check out Helena Liu's (@helenaliu24.bsky.social) poster in the Saturday poster session at #cosyne2026 !
Posts by Jonathan Pillow
Proud to be a collaborator on this paper on compact deep neural network models of V4, with Ben Cowley (@benjocowley.bsky.social), Pati Stan, & Matt Smith, now finally out in print (by which I mean online).
Excited to be co-organising a #cosyne2026 workshop with Alison Comrie on 'algorithms for learning from scratch'! With a great line-up of speakers, we'll be tackling the question of what processes enable naive biological & artificial agents to adapt to new situations. Info here: tinyurl.com/4u8enf7k
New paper with @deanpospisil.bsky.social , in which we introduce a new estimator for the "signal eigenspectrum" (i.e., the eigenvalues of the noiseless population responses). We re-analyze data from Stringer et al 2019 and show eigenvalues of mouse V1 are well explained by a broken power law.
Thanks, Dean!
We'd be grateful for any comments about points we overlooked, additional citations, as well as any corrections, clarifications, or suggestions for improvement! 🙏
We introduce metrics for quantifying the degree of alignment between the communication subspace and the dominant modes of input and output population activity. (e.g., Are the dominant modes of the input population the same ones driving communication?)
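To make the alignment idea concrete, here's a minimal numerical sketch of one such metric (the function and variable names are my own illustration, not from the paper): the fraction of a population's variance captured by projecting onto a candidate communication subspace.

```python
import numpy as np

def subspace_alignment(activity, subspace):
    """Fraction of total variance of `activity` (samples x neurons)
    captured by projecting onto the columns of `subspace` (neurons x k).

    Returns a value in [0, 1]: 1 means the subspace captures all of the
    population's variance; 0 means it captures none.
    """
    Q, _ = np.linalg.qr(subspace)        # orthonormalize the subspace basis
    C = np.cov(activity, rowvar=False)   # neuron-by-neuron covariance
    return np.trace(Q.T @ C @ Q) / np.trace(C)
```

If the dominant modes of the input population lie inside the communication subspace, this ratio will be near 1; if communication is carried by low-variance modes, it will be small.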
We also derive some useful (known) extensions, such as adding a ridge penalty ("ridge RRR") and non-spherical noise (accounting for correlated response noise), both of which preserve a closed-form solution.
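For concreteness, here's a minimal sketch of one standard formulation of ridge RRR (fit a ridge regression, then project the coefficients onto the top principal directions of the fitted values); function and variable names are illustrative, not taken from the paper:

```python
import numpy as np

def ridge_rrr(X, Y, rank, lam=0.0):
    """Reduced rank regression with an optional ridge penalty.

    X: (samples x input dims), Y: (samples x output dims).
    Returns a coefficient matrix of rank <= `rank`; lam=0 gives classic RRR.
    """
    # Closed-form ridge solution: (X'X + lam I)^{-1} X'Y
    B_ridge = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ Y)
    # Project onto the top-`rank` right singular vectors of the fitted values
    Yhat = X @ B_ridge
    _, _, Vt = np.linalg.svd(Yhat, full_matrices=False)
    V = Vt[:rank].T
    return B_ridge @ V @ V.T
```

Setting `rank` to the full output dimension recovers plain ridge (or OLS when `lam=0`), which is a handy sanity check.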
Part of our motivation was our own difficulty understanding RRR and its mathematical origins (e.g., Why is this an eigenvector problem?). We thought others might benefit from a simple derivation and some figures and comparisons to build intuition.
Bichan Wu (@bichanw.bsky.social) & I wrote a tutorial paper on Reduced Rank Regression (RRR) — the statistical method underlying "communication subspaces" from Semedo et al 2019 — aimed at neuroscientists.
arxiv.org/abs/2512.12467
Good find, Spencer! 🙌
Great opportunity to learn to use fancy neural data analysis tools developed at the @flatironinstitute.org. Sign up for this workshop at SFN 2025!
This is excellent! 😂
Now out in @natcomms.nature.com: Mice and monkeys spontaneously shift through comparable cognitive states - and it's written all over their faces! (1/7)
www.nature.com/articles/s41...
In fact, Robbins 1956 ("An empirical Bayes approach to statistics") doesn't even consider Gaussian likelihoods. Only Poisson, geometric, binomial. So I'm puzzled about why this is the go-to citation. Are there multiple versions of this paper floating around??
Diffusion modeling folks: many sources (including Wikipedia) cite Robbins 1956 as the original source for Tweedie's formula (E[z | x] = x + σ² ∇ log p(x)). But as far as I can tell, the formula appears nowhere in that paper. Did I miss it? Can anyone explain what's going on here?
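A quick numerical check of the formula in the Gaussian case (the setup and names here are my own illustration): with z ~ N(0, A) and x = z + N(0, σ²), the marginal is p(x) = N(0, A + σ²), so Tweedie's formula gives x + σ²·(−x/(A + σ²)) = x·A/(A + σ²), which is exactly the posterior mean.

```python
import numpy as np

# Hypothetical setup: z ~ N(0, A), observed x = z + N(0, sig2)
rng = np.random.default_rng(0)
A, sig2 = 2.0, 0.5
z = rng.normal(0.0, np.sqrt(A), size=100_000)
x = z + rng.normal(0.0, np.sqrt(sig2), size=z.size)

# Marginal is p(x) = N(0, A + sig2), so grad log p(x) = -x / (A + sig2).
# Tweedie: E[z | x] = x + sig2 * grad log p(x) = x * A / (A + sig2)
z_hat = x + sig2 * (-x / (A + sig2))

mse_raw = np.mean((x - z) ** 2)          # ≈ sig2 = 0.5
mse_tweedie = np.mean((z_hat - z) ** 2)  # ≈ posterior variance A*sig2/(A+sig2) = 0.4
```

The denoised estimate's error matches the posterior variance, as it should for the MMSE estimator.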
"to run a contest that you end up not funding as a private organization at this time is… both extremely wasteful of these people’s time, and just devastating in terms of morale."
I spoke with @aniloza.bsky.social at STAT today about HHMI's decision not to fund this round of Hanna Gray applicants.
Mind-boggling that HHMI would pull this right as other sources of funding for early stage investigators are drying up! 😢
Haha, ok thanks! 😂
Wow, great — thanks! I didn't know about CB education articles, but that could indeed be a good fit. It is indeed seeking to give technical — albeit (I hope) accessible — details of the derivations!
Ok, awesome — thanks for this suggestion! I didn't know about this journal before...
Cool — thanks for the suggestion! Although the web page says it's for "New Methods", which this paper is not. (Rather, it's trying to give a clear, accessible description of an old method.) Do you think that could fly?
Congrats, Takaki — this is super-cool!!
But there are no new results, per se. Any thoughts or suggestions for where to publish would be most appreciated!
The RRR estimator dates back to Izenman 1975, but we have found the original stats literature a bit hard to digest. So our paper aims to build intuition and give a simple derivation of RRR, along with several extensions (e.g., L2 regularization, non-isotropic noise).
By way of background: RRR is the method used for estimating a "communication subspace" between brain regions, introduced in Semedo et al 2019, and now growing in popularity for the analysis of multi-region datasets.
We are writing a tutorial paper on reduced rank regression (RRR), aimed primarily at neuroscientists.
Q: Does anyone have suggestions for where to publish such a paper?
"We are passionate about science and its benefits to society, but we fear more what will happen if we do not help defend everyone’s fundamental rights." Op-ed I co-wrote w/ @samwang.bsky.social @jpillowtime.bsky.social @ilanawitten.bsky.social & David Tank www.dailyprincetonian.com/article/2025...
Pynapple is a Python-based neural analysis package designed to streamline your research. It integrates seamlessly with your projects, offering tools for processing, analyzing, and visualizing neural data. Follow us for updates, tutorials, and community insights!
#Pynapple #NeuralAnalysis #OpenSource
"Replacing animals with human-centered tools will provide better insight into human biology, speeding up the development of much-needed treatments for diseases like cancer and Alzheimer’s disease." 🤦‍♂️
The fact that the author refers to animal research as "animal testing" also reveals a lot.