5) a computational model, beautifully done by Guanchun Li, that implemented BTSP learning rules and a synaptic weight consolidation mechanism was able to generalize and made mechanistic predictions.
check it out!
Posts by Fish Qian
(continued) task-related and context-related information were encoded in separate subspaces (see Wenbo Tang's preprint). Thus, this is active generalization, not passive transfer of reward-distance coding. Moreover, this generalization process allows faster adaptation to reward switching in the future.
3) weak residual activity, influenced by past learning ("memory traces") and presumably driven by CA3, biased the BTSP events to guide new learning for generalization.
4) place cells changed reference frames from spatial cues to reward goal when switching contexts from familiar to novel belts.
How does prior experience bias new learning for generalization? Our new preprint is out:
1) behavioral and hippocampal generalization co-emerged in a novel context.
2) BTSP was involved in forming cross-context stable representations, as well as aligning novel neural dynamics to familiar subspace.
It's officially published!! In my main postdoc work with @markplitt.bsky.social and @lgiocomo.bsky.social, we found that the hippocampus simultaneously encodes an animal's spatial position and its experience relative to reward in parallel population codes. 🧵
www.nature.com/articles/s41...
Yes super helpful. I stick to the context-content-conclusion rule for every main figure paragraph.
You can use ColorBrewer to start with. I'd be suspicious of any tool that claims to take your figure and output a color-blind-friendly version without distorting anything.
OK, if we are moving to Bluesky, I am rescuing my favourite-ever Twitter thread (Jan 2019).
Now renamed:
Bluesky-sized history of neuroscience (biased by my interests)