Do you think this could be more efficient or have other benefits over Ziggurat?
Posts by Sam Duffield
There's a load of fun examples in the cuthbert docs and we're always looking to add more! 🐛
state-space-models.github.io/cuthbert/exa...
In this highly nonlinear example it's more accurate and faster than both extended and particle filters.
state-space-models.github.io/cuthbert/exa...
The ensemble Kalman filter is now in cuthbert 🔥
The EnKF is one of those algorithms that "just works" - oftentimes in settings it has no right to
github.com/state-space-...
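For anyone curious what's under the hood, here's a minimal sketch of a stochastic ("perturbed observation") EnKF analysis step in JAX. Illustrative only: the function name, shapes and linear observation operator are my assumptions, not cuthbert's actual API.

import jax
import jax.numpy as jnp

def enkf_update(key, ensemble, y, H, R):
    # ensemble: (n, d) state particles; y: (m,) observation
    # H: (m, d) linear observation operator; R: (m, m) observation noise covariance
    n = ensemble.shape[0]
    anomalies = ensemble - ensemble.mean(axis=0)
    P = anomalies.T @ anomalies / (n - 1)        # sample covariance of the ensemble
    S = H @ P @ H.T + R                          # innovation covariance
    K = jnp.linalg.solve(S, H @ P).T             # Kalman gain  P Hᵀ S⁻¹
    # perturbed observations: each member gets its own noisy copy of y
    y_pert = y + jax.random.multivariate_normal(key, jnp.zeros_like(y), R, (n,))
    return ensemble + (y_pert - ensemble @ H.T) @ K.T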
It was my birthday this week, it really had to be done
“Parallelizing MCMC Across the Sequence Length”: This one is really cool.
statmodeling.stat.columbia.edu/2026/02/03/p...
Come check it out if you're interested in time series, Monte Carlo, sequential problems.
We've got a suite of fun examples, lots more to add - contributions welcomed!
Super fun work with @adriencorenflos.bsky.social and Sahel Iqbal 🙌
New open source: cuthbert 🐛
State space models with all the hotness: (temporally) parallelisable, JAX, Kalman, SMC
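To give a flavour of the "(temporally) parallelisable" part: many state space recursions compose associatively, so they can be run with jax.lax.associative_scan in O(log T) parallel depth rather than a sequential O(T) loop. A toy sketch of the idea (my own, not cuthbert code) for the mean recursion x_t = A_t x_{t-1} + b_t:

import jax
import jax.numpy as jnp

def combine(elem_i, elem_j):
    # Composing affine maps: applying (A_i, b_i) then (A_j, b_j) gives (A_j A_i, A_j b_i + b_j)
    A_i, b_i = elem_i
    A_j, b_j = elem_j
    return A_j @ A_i, jnp.einsum('...ij,...j->...i', A_j, b_i) + b_j

T, d = 1000, 4
A = jnp.broadcast_to(0.9 * jnp.eye(d), (T, d, d))          # (T, d, d) transition matrices
b = 0.1 * jnp.ones((T, d))                                  # (T, d) offsets
A_cum, b_cum = jax.lax.associative_scan(combine, (A, b))    # parallel prefix over time
x0 = jnp.ones(d)
means = jnp.einsum('tij,j->ti', A_cum, x0) + b_cum          # all x_1..x_T at once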
See the paper for full details. The proofs draw on ideas from vector calculus and Fourier analysis, which were really fun to work through
arxiv.org/abs/2601.07834
Here is the decomposition:
I show that the scalar ϕ is unique(!) but you can choose Q and D freely.
In diffusion models the ϕ term gives the “probability flow ODE”, but there are actually many ODEs whose marginals match p(x,t), depending on your choice of Q
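For background (my summary of standard facts, not the paper's new decomposition): an SDE dx = b dt + σ dW with D = ½σσᵀ has marginals p(x,t) exactly when the Fokker–Planck equation holds, and the probability flow ODE is the zero-noise member of that family:

\partial_t p(x,t) = -\nabla \cdot \big( b(x,t)\, p(x,t) \big) + \sum_{i,j} \partial_i \partial_j \big( D_{ij}(x,t)\, p(x,t) \big)

\frac{\mathrm{d}x}{\mathrm{d}t} = b(x,t) - D(x,t)\, \nabla \log p(x,t) - \nabla \cdot D(x,t)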
This work combines, unifies and generalises two of my favourite papers
- Ma et al - Complete recipe for autonomous SDEs arxiv.org/abs/1506.04696
- Karras et al - Elucidating the Design Space of Diffusion arxiv.org/abs/2206.00364
New preprint! A Complete Decomposition of Stochastic Differential Equations
I characterise *all possible SDEs* that satisfy given time-dependent marginals p(x,t)
Not like you to not give the sauce, this looks interesting!
Usual MCMC algorithms are typically only guaranteed to work well when used to sample from target distributions for which
i) mass is reasonably well-concentrated in the centre of the state space, and
ii) the log-density is smooth and of moderate growth.
Outside of this setting, things can go poorly.
Read more at arxiv.org/abs/2508.20883
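As a concrete illustration (mine, not from the paper): a random-walk Metropolis step in JAX, plus a heavy-tailed target where condition (i) fails and such kernels can mix extremely slowly.

import jax
import jax.numpy as jnp

def rwm_step(key, x, log_density, step_size=0.5):
    # One random-walk Metropolis step with a Gaussian proposal
    key_prop, key_acc = jax.random.split(key)
    proposal = x + step_size * jax.random.normal(key_prop, x.shape)
    log_accept = log_density(proposal) - log_density(x)
    accept = jnp.log(jax.random.uniform(key_acc)) < log_accept
    return jnp.where(accept, proposal, x)

# A Cauchy-like target: mass is not well-concentrated, so condition (i) is violated
log_cauchy = lambda x: -jnp.sum(jnp.log1p(x**2))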
Including scaling LRW up to image generation with Stable Diffusion 3.5 🐱
As described in the paper, LRW provides multiple benefits, but the key motivation for us @normalcomputing.com was the co-design with novel stochastic computing hardware, which we believe can drastically accelerate general-purpose SDE sampling.
New paper on arXiv! And I think it's a good'un 😄
Meet the new Lattice Random Walk (LRW) discretisation for SDEs. It’s radically different from traditional methods like Euler-Maruyama (EM) in that each iteration can only move in discrete steps {-δₓ, 0, δₓ}.
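To make the idea concrete, here is one natural way to build such a three-point step in JAX: choose the probabilities of ±δₓ so the first two moments of the increment match the Euler-Maruyama increment. This is my own moment-matching construction for illustration, not necessarily the exact scheme in the paper.

import jax
import jax.numpy as jnp

def lrw_step(key, x, drift, diffusion, dt, dx):
    # Match the EM increment: mean mu = f(x) dt, variance g(x)^2 dt
    mu = drift(x) * dt
    m2 = diffusion(x) ** 2 * dt + mu**2          # second moment of the EM increment
    p_plus = (m2 + dx * mu) / (2 * dx**2)
    p_minus = (m2 - dx * mu) / (2 * dx**2)
    # remaining mass stays put; requires dx large enough that p_plus + p_minus <= 1
    u = jax.random.uniform(key, x.shape)
    step = jnp.where(u < p_plus, dx, jnp.where(u < p_plus + p_minus, -dx, 0.0))
    return x + step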
In slides from a recent talk - the { virtuous / vicious } cycle of filtering, smoothing, and parameter estimation in state space models.
Oh you king this is great thanks! I was at Lau Pa Sat the other day but went for shrimp noodles (which were great) because the satay queue was too long
Didn’t listen, good decision
Me: Hey so where’s good to eat round here?
Singapore taxi driver: Malaysia
However! We’re working on a much broader generalisation of abile, which we’ll hopefully be able to share soon 🤞🔜
Adjacent!
posteriors takes the natural gradient descent viewpoint on EKF arxiv.org/abs/1703.00209
Which is nice for online deep learning, but not necessarily for bespoke state-space model inference
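My gloss of that viewpoint (see the linked paper for the precise statement): the online natural-gradient step on streaming data (x_t, y_t),

\theta_{t+1} = \theta_t + F_t^{-1}\, \nabla_\theta \log p(y_t \mid x_t, \theta_t),
\qquad F_t \approx \mathbb{E}\big[ \nabla_\theta \log p \,\nabla_\theta \log p^\top \big],

can be identified with an extended Kalman filter acting on the parameters, with the running Fisher information playing the role of the posterior precision.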
We've also updated the paper and made some cool updates to the library 😎
Paper: arxiv.org/abs/2406.00104
Repo: github.com/normal-compu...
📃 Poster #419
🗓️ Sat 26th, 10:00–12:30
📍 #ICLR2025, Singapore
Swing by if you’re into probml, thermodynamic computing or just wanna say hi
posteriors 𝞡 published at ICLR!
I’ll be in Singapore next week, let’s chat all things scalable Bayesian learning! 🇸🇬👋
A new instalment of office decor:
F