Deeply honored that our paper was recognized with the #SANS2026 Award!
SANS was an important part of this paper's journey! @jadynpark.bsky.social first presented this work at SANS2023 and again at SANS2025, and we benefited greatly from the feedback and discussion!
Posts by Jin Ke
Out now in @nathumbehav.nature.com! We applied graph theoretic analyses to fMRI data of participants watching movies/listening to stories. Integration across large-scale functional networks mediates arousal-dependent enhancement of narrative memories. Open access link: rdcu.be/eKKAw
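For readers curious what "integration across networks" means operationally: one standard graph-theoretic integration measure is the participation coefficient, which quantifies how evenly a region's connection strength spreads across network modules. A minimal sketch (not necessarily the paper's exact pipeline; the function and variable names are illustrative):

```python
import numpy as np

def participation_coefficient(W, labels):
    """Participation coefficient of each node in a weighted graph W,
    given a module label per node: 1 - sum over modules of
    (within-module strength / total strength)^2.
    Values near 1 = connections spread across networks (integration);
    values near 0 = connections confined to one network."""
    W = np.asarray(W, dtype=float)
    labels = np.asarray(labels)
    strength = W.sum(axis=1)              # total connection strength per node
    pc = np.ones_like(strength)
    for m in np.unique(labels):
        k_m = W[:, labels == m].sum(axis=1)  # strength into module m
        pc -= (k_m / strength) ** 2
    return pc
```

Applied to window-wise functional connectivity matrices, this yields a time-resolved integration measure of the kind such analyses relate to arousal and memory.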
How does the brain🧠 make causal inferences and use memories to understand narratives🎬?
We built an RNN🤖 with key-value episodic memory that learns causal relationships between events and retrieves memories like humans do!
Preprint www.biorxiv.org/content/10.1...
w/ @qlu.bsky.social, Tan Nguyen &👇
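A toy illustration of the key-value episodic memory idea (the paper's model is an RNN trained end-to-end; the class and variable names here are purely illustrative): events are written as (key, value) vector pairs, and a query recalls an attention-weighted blend of stored values.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

class KeyValueMemory:
    """Minimal key-value episodic memory: store events as
    (key, value) vector pairs; retrieve with soft attention over keys."""
    def __init__(self):
        self.keys = []
        self.values = []

    def write(self, key, value):
        self.keys.append(np.asarray(key, dtype=float))
        self.values.append(np.asarray(value, dtype=float))

    def read(self, query):
        K = np.stack(self.keys)        # (n_events, dim)
        V = np.stack(self.values)      # (n_events, dim)
        weights = softmax(K @ np.asarray(query, dtype=float))
        return weights @ V             # attention-weighted recall
```

A query resembling a stored key retrieves mostly that event's value, which is the kind of content-based retrieval that lets a model reuse past events when making causal inferences.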
I'm also deeply grateful to those who offered insightful feedback along the way: the BLRB community at UChicago, joint CogNeuro meeting at Yale, @esfinn.bsky.social 's lab and many others. I’d like to thank Emma Megla and Wilma Bainbridge for sharing the aphantasia data! (10/10)
HUGE thank you to everyone whose incredible efforts over the past three years made this project possible! @tchamberlain.bsky.social @hayoungsong.bsky.social @annacorriveau.bsky.social @zz112.bsky.social, Taysha Martinez, Laura Sams, Marvin Chun, @ycleong.bsky.social @monicarosenb.bsky.social (9/10)
Together, we found that ongoing thoughts at rest are reflected in brain dynamics, and that these network patterns predict everyday cognition and experiences.
Our work underscores the crucial role of subjective in-scanner experiences in understanding functional brain organization and behavior. (8/10)
Neuromarkers of these thoughts further generalized to HCP data (N=908), where decoded thought patterns predicted positive vs. negative trait-level individual-difference measures. This suggests that links between rsFC and behavior may in part reflect differences in ongoing thoughts. (7/10)
Moreover, the model predicting whether people are thinking in the form of images distinguished an aphantasic individual—who lacks visual imagery—from their otherwise identical twin. Data from academic.oup.com/cercor/artic.... (6/10)
Thought models generalized beyond self-report, predicting non-introspective markers such as pupil size, linguistic sentiment of speech, and the strength of a sustained attention network (Rosenberg et al., 2016, 2020). (5/10)
How are these thoughts related to resting-state functional connectivity (rsFC) patterns? We found that similarity in ongoing thoughts tracks similarity in rsFC patterns within and across individuals, and that both thought ratings and topics could be reliably decoded from rsFC. (4/10)
We observed remarkable idiosyncrasy in ongoing thoughts between individuals and over time, both in self-reported ratings and in the content and topics of thoughts. (3/10)
In our “annotated rest” task, 60 individuals rested, then verbally described and rated their ongoing thoughts after each 30-sec rest period. (2/10)
New preprint! 🧠
Our mind wanders at rest. By periodically probing ongoing thoughts during resting-state fMRI, we show these thoughts are reflected in brain network dynamics and contribute to pervasive links between functional brain architecture and everyday behavior. (1/10)
doi.org/10.1101/2025...
Preprint⭐
Our attention changes over time and differs across contexts—which is reflected in the brain🧠. Fitting a dynamical systems model to fMRI data, we find that the geometry of neural dynamics along the attractor landscape reflects such changes in attention!
www.biorxiv.org/content/10.1...
I’m thrilled to announce that I will start as a presidential assistant professor in Neuroscience at the City U of Hong Kong in Jan 2026!
I have RA, PhD, and postdoc positions available! Come work with me on neural network models + experiments on human memory!
RT appreciated!
(1/5)
Feeling fortunate that #SANS2025 was in Chicago, and so much of the lab was able to be part of the meeting! It's crazy how much the lab has grown over the past 3y10m, and I'm so proud of the work we are doing together! Happy that we could host the lab, alums (and surprise guests)!
#CASNL@SANS
To learn more about this dataset and the neural dynamics of narrative insight, check out our recent work (preprint below) led by the amazing @hayoungsong.bsky.social and chat with her on Saturday 1:50 PM - 3:00 PM! Poster ID: P3-B-30.
www.biorxiv.org/content/10.1...
Curious how the human brain updates social impressions in a naturalistic setting? We scanned participants watching This Is Us, and found that sudden neural pattern shifts at insight moments of comprehension reflect impression updating. Come and chat Friday 4:15–5:15pm at #SANS2025, Poster P2-G-69.
New preprint! Excited to share our latest work “Accelerated learning of a noninvasive human brain-computer interface via manifold geometry” ft. outstanding former undergraduate Chandra Fincke, @glajoie.bsky.social, @krishnaswamylab.bsky.social, and @wutsaiyale.bsky.social's Nick Turk-Browne 1/8
Many thanks to Janice Chen, @lukejchang.bsky.social and @asieh.bsky.social for open sourcing the movie datasets! Also wanted to give a shoutout to UChicago MRIRC for helping us collect the North by Northwest data. (9/9)
We have made our model and analysis scripts publicly available so that other researchers can decode moment-to-moment emotional arousal in novel datasets, providing a new tool for probing affective experience with fMRI. (8/9)
github.com/jinke828/Aff...
In conclusion, our findings reveal a generalizable representation of emotional arousal embedded in patterns of dynamic functional connectivity, suggesting a common underlying neural signature of emotional arousal across individuals and situational contexts. (7/9)
In contrast, using the same computational modeling approach, we were unable to find a generalizable neural representation of valence in functional connectivity. Null results are inherently difficult to interpret, but we discuss several possible explanations in our paper. (6/9)
The network generalized to two novel movies, where model-predicted arousal time courses tracked the plot of each movie, offering a tool for researchers who want continuous measures of arousal without collecting additional human ratings. (5/9)
This generalizable arousal network is encoded in interactions among multiple large-scale functional networks, including the default mode, dorsal attention, ventral attention, and frontoparietal networks. (4/9)
We observed robust out-of-sample generalizability of the arousal models across movie datasets that were distinct in low-level features, characters, narratives, and genre, suggesting a situation-general neural representation of arousal. (3/9)
One possibility is that there are generalizable neural patterns associated with valence or arousal across contexts and individuals. We used open movie-watching fMRI datasets and built predictive models of moment-to-moment valence and arousal from functional correlations in brain activity. (2/9)
From the excitement of reuniting with a long-lost friend to the anger when being treated unfairly, our daily lives are colored by diverse affective experiences. Despite the remarkable diversity, is there an intrinsic similarity in how the brain represents these experiences? (1/9)
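For those curious about the mechanics, the pipeline described in this thread (time-varying functional connectivity features predicting continuous arousal) can be sketched roughly as follows. Everything here is illustrative: random data stands in for fMRI time series and ratings, and plain least squares stands in for the paper's actual predictive modeling.

```python
import numpy as np

rng = np.random.default_rng(0)
n_tr, n_roi, win = 300, 10, 30
ts = rng.standard_normal((n_tr, n_roi))        # stand-in ROI time series
arousal = rng.standard_normal(n_tr - win + 1)  # stand-in per-window arousal ratings

def sliding_fc(ts, win):
    """Dynamic functional connectivity: one correlation matrix per
    sliding window, flattened to its upper triangle as features."""
    n_tr, n_roi = ts.shape
    iu = np.triu_indices(n_roi, k=1)
    return np.stack([np.corrcoef(ts[t:t + win].T)[iu]
                     for t in range(n_tr - win + 1)])

X = sliding_fc(ts, win)                          # (n_windows, n_edges)
w, *_ = np.linalg.lstsq(X, arousal, rcond=None)  # linear model: edges -> arousal
pred = X @ w                                     # predicted arousal time course
```

In the actual study, models trained on one movie dataset were tested out-of-sample on others, which is what supports the generalizability claim.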
Now out in PLOS Computational Biology! We identified a generalizable neural signature of emotional arousal across contexts and individuals during movie watching.
work with the best team: @hayoungsong.bsky.social @Zihan Bai @monicarosenb.bsky.social @ycleong.bsky.social
dx.plos.org/10.1371/jour...
Remember what your partner said during a heated argument? Or the rush of getting your first job offer? Why do these emotionally arousing moments stick? Across 3 studies, and 3 arousal measures, we found that emotional arousal enhances memory encoding by promoting functional integration in the 🧠 1/🧵