The transformer has no memory. Some thoughts on LLMs, memory, RAG, and how to use Claude. bcomone.atlassian.net/wiki/spaces/...
Posts by Giulio Ruffini
All algorithmic agents welcome here!
5/5 Whether you are a single cell, a human brain, or an AI, to maintain homeostasis, you must run a generative model of your environment. You can't just react; you must predict! 🌍🤖
Check out the full Open Access paper here: 🔗 www.mdpi.com/1099-4300/28...
4/5 Why does this matter? It provides a rigorous, distribution-free mathematical backbone for neuroscience frameworks like the Free-Energy Principle and Active Inference.
3/5 The paper looks at regulation as data compression. It shows that if a regulator successfully keeps a system (embedded in the world) stable (reducing the algorithmic complexity of its output), it must share "mutual algorithmic information" with the world with high probability.
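For readers new to AIT, a rough sketch of the quantity involved (these are the standard definitions, not a result copied from the paper): the algorithmic mutual information between the world $w$ and the regulator $r$ is

```latex
I(w : r) \;=\; K(w) + K(r) - K(w, r),
```

where $K(\cdot)$ is (prefix) Kolmogorov complexity and the identity holds up to additive $O(1)$ terms. Informally, the theorem says: if the regulator's action makes the joint output highly compressible, then $I(w : r)$ must be large with high probability, i.e., the regulator encodes a model of the world.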
2/5 In a new paper, "The Algorithmic Regulator" (published in Entropy), I tackle this using Algorithmic Information Theory and Kolmogorov Complexity.
It takes the classic Good Regulator Theorem and the Internal Model Principle and complements and extends them for non-linear, deterministic systems.
Do you need to understand the world to survive in it?
A classic 1970 cybernetics theorem says yes: "Every good regulator of a system must be a model of that system." But proving this mathematically for complex, unpredictable real-world scenarios has always been notoriously difficult. 1/5
Annotated version: oup.silverchair-cdn.com/oup/backfile...
This is the paper where, after almost 20 years, I started taking my AIT obsessions a bit more seriously. I am happy I did... maybe you have some too... => Don't waste time! : )
academic.oup.com/nc/article/2... PS: the Supplementary Data part is more fun!
Philip, agree on 1... but I think the road for studying consciousness is to assume primordial "experience" and instead focus on "structured experience" - and this connects directly with mathematics. See my talk in the "Platonic Space Symposium" thoughtforms.life/wp-content/u...
Thanks, Johannes!
An Introduction to Galois Theory (with connections to AIT): zenodo.org/records/1845... ...This note aims to demystify Galois Theory by connecting its foundational definitions to a broader principle of computational and compositional tractability.
From The Sorcerer's Apprentice to Crystal Nights: Security Implications from Moltbot/Moltbook to Greg Egan's Crystal Nights! zenodo.org/records/1844...
Could life have begun with simpler molecules than we once thought? A new paper in @science.org by @edogia.bsky.social shows that a tiny RNA catalyst can self-replicate, suggesting that life may have emerged more easily than expected. Getting closer. www.biorxiv.org/content/10.1...
Thanks for spreading the word, Ricard!
What if we had a Rosetta Stone for brain oscillations—one framework to translate between models and scales?
This paper, led by F Castaldo and @ruffini.bsky.social, builds a simple, systematic ladder of neural mass models showing how diverse formalisms connect arxiv.org/pdf/2512.10982
My annotated slides for my talk in the wonderful thoughtforms.life/symposium-on... organized by @drmichaellevin are here: giulioruffini.github.io/assets/slide...
Thanks! That was straight from the paper! Uploaded it, clicked on slides…. That’s it!
Paper: mdpi.com/1099-4300/27...
Related: arxiv.org/abs/2510.10586
More on Algorithmic agents: giulioruffini.github.io
Impressed: #notebooklm created a nice presentation of the Algorithmic Agent and Symmetry paper! github.com/giulioruffin...
15/ ...plus: linearization of Wilson–Cowan, and connecting Stuart–Landau (SL) with Wilson–Cowan.
14/ Other goodies: discussion using L-operators (for synapses) and transfer functionals.
13/ Definition: A dataset is said to represent an oscillation when it can be most succinctly Lie-generated from a representation of U(1) (plus noise). #ait #kolmogorov
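As a concrete illustration (my own sketch, not taken from the paper): a pure oscillation is exactly what you get by exponentiating the single generator of u(1) ≅ so(2). Repeatedly applying the first-order group step (I + G·dt) to a point traces out cosine and sine:

```python
import math

def lie_generate(omega=1.0, t=1.0, n=10000):
    """Approximate exp(t*G) acting on the point (1, 0), where
    G = [[0, -omega], [omega, 0]] is the generator of so(2) ~ u(1).
    Repeated first-order steps (I + G*dt) Lie-generate the oscillation."""
    dt = t / n
    x, y = 1.0, 0.0  # starting point on the unit circle
    for _ in range(n):
        # one small group step: (x, y) <- (I + G*dt)(x, y)
        x, y = x - omega * dt * y, y + omega * dt * x
    return x, y

x, y = lie_generate()
print(x, y)  # close to (cos(1), sin(1))
```

The point of the definition is that this generator-plus-initial-condition description is a very short program for the dataset.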
12/ Bonus material: plenty of good stuff in the Appendix for aficionados, including links with Groups, Topology, and Algorithmic Information Theory (What is an oscillation?) @ERC_Research @neurotwin @Neuroelectrics
11/ If you use neural mass models (or teach them), I’d love feedback: what translation step is hardest in your workflow?
PDF: arxiv.org/pdf/2512.10982
#computationalneuroscience #neuralmass #EEG #MEG
10/ Practical cheat-sheet:
• Phase locking/entrainment → phase models
• Spectra/covariances → damped linear resonators
• Limit cycles near Hopf → Stuart–Landau
• Firing-rate E–I loops → Wilson–Cowan
• PSP/synaptic kinetics → NMM1
• First-principles spiking link → NMM2
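To make one rung of the cheat-sheet concrete, here is a minimal sketch of the Stuart–Landau normal form near a supercritical Hopf bifurcation (parameter values are my illustrative choices, not the paper's): the amplitude relaxes to sqrt(a) while the phase rotates at omega.

```python
# Stuart-Landau oscillator: dz/dt = (a + i*omega)*z - |z|^2 * z
# Illustrative parameters (a=1, omega=2); forward-Euler integration.
def stuart_landau(a=1.0, omega=2.0, z0=0.1 + 0.0j, dt=1e-3, steps=10_000):
    z = z0
    for _ in range(steps):
        z += dt * ((a + 1j * omega) * z - abs(z) ** 2 * z)
    return z

z = stuart_landau()
print(abs(z))  # amplitude settles on the limit cycle: |z| ~ sqrt(a) = 1
```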
9/ NMM2 (next-generation masses): Exact mean-field reductions of QIF networks yield dynamic (r,v) equations—a *dynamic* transfer function replacing static sigmoids, linking spikes ↔ masses.
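For the curious, the (r, v) equations referred to here are the Montbrió–Pazó–Roxin exact mean-field reduction of a QIF network; a minimal Euler sketch with my own illustrative parameters (time constant set to 1):

```python
import math

def mpr_step(r, v, dt, delta=1.0, eta=-5.0, J=15.0):
    """One Euler step of the exact QIF mean-field (r = firing rate, v = mean voltage):
        dr/dt = delta/pi + 2*r*v
        dv/dt = v**2 + eta + J*r - (pi*r)**2
    The evolving (r, v) pair acts as a *dynamic* transfer function,
    replacing the static sigmoid of classical neural masses."""
    dr = delta / math.pi + 2.0 * r * v
    dv = v * v + eta + J * r - (math.pi * r) ** 2
    return r + dt * dr, v + dt * dv

r, v = 0.0, -2.0
for _ in range(100):
    r, v = mpr_step(r, v, dt=1e-3)
print(r, v)  # r remains positive, as the exact dynamics require
```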
8/ NMM1 (second-order synapses):
PING-like motifs, Jansen–Rit, Wendling, and laminar neural masses become variants of one formalism—
highlighting which parameters control resonance, PSPs, and phase shifts.
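A quick numerical illustration of such a second-order synapse (the gain A and rate a below are the textbook Jansen–Rit excitatory values, used here as an assumption): the filter y'' = A*a*u(t) - 2*a*y' - a²*y has impulse response A*a*t*exp(-a*t), which peaks at t = 1/a with value A/e — these are the parameters that set resonance and PSP shape.

```python
import math

# Second-order synaptic filter (alpha kernel), Jansen-Rit-style:
#   y'' = A*a*u(t) - 2*a*y' - a**2 * y   ->   h(t) = A*a*t*exp(-a*t)
A, a = 3.25, 100.0                 # gain (mV) and rate (1/s), textbook values
dt, steps = 1e-5, 2000             # simulate 20 ms with forward Euler

y, yd = 0.0, A * a                 # unit impulse input: y'(0) jumps to A*a
peak_t, peak_y = 0.0, 0.0
for k in range(steps):
    if y > peak_y:                 # track the PSP peak
        peak_y, peak_t = y, k * dt
    y, yd = y + dt * yd, yd + dt * (-2.0 * a * yd - a * a * y)

print(peak_t, peak_y)  # near t = 1/a = 10 ms, peak = A/e ~ 1.196 mV
```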