almost-but-not-quite a GPU
Posts by Sam Power
very clever
can't help but read it as a bit mocking
bsky.app/profile/beta...
Learned that per Nils Lid Hjort, this is known as the "Eccentric Part of the Noncentral Chi Square".
mass spreading out over time
I hope you're proud of that 4th line!
related to bsky.app/profile/spmo...
little bit of Lévy
Oh that's kind of fun!
(Seems like the only reasonable answer should be "donut", but I do wonder whether there's another name in use for the filled-in surface)
Sphere is to ball as torus is to ...?
Thank you!
So in a world where the standard primitive for RNG is exponentials rather than uniforms, it's potentially interesting enough, but as-is, it's more of a cute optimisation sub-problem.
I think it's pretty low-overhead in terms of lines of code, which is nice. It's arguably limited in that you need the stream of exponentials a priori, which is easy enough to get from a stream of uniforms by taking logarithms, but harder to do "purely algebraically".
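The logarithm trick mentioned above is just the inverse-CDF transform; a minimal sketch (my own illustration, not code from the note) of turning a stream of uniforms into the stream of exponentials:

```python
import math
import random

def exponential_stream(rng: random.Random):
    """Yield Exp(1) draws by taking logarithms of uniforms.

    If U ~ Uniform(0, 1), then -log(U) ~ Exp(1): this is the
    inverse-CDF transform, since P(-log(U) <= x) = P(U >= exp(-x))
    = 1 - exp(-x).
    """
    while True:
        # use 1 - U so the argument of log lies in (0, 1]
        yield -math.log(1.0 - rng.random())

rng = random.Random(0)
stream = exponential_stream(rng)
draws = [next(stream) for _ in range(100_000)]
# sample mean should sit close to 1, the mean of Exp(1)
print(sum(draws) / len(draws))
```

This is easy numerically, but each draw costs a transcendental function call, which is exactly why doing the exponential-to-Gaussian step "purely algebraically" is the interesting part.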
As with lots of this stuff, one doesn't have to believe any model to derive these things or find them useful (the formulas are 'just true'). But the filtering perspective is at least good at formalising the goals of the problem concretely, and it opens things up to lots of simple generalisations.
Continuing to clean up lots of old miscellaneous notes on the laptop. I think this is a cute one: deriving a stable approach to computing means and variances in the streaming setting by a connection to filtering.
'Online Mean and Variance Computations by Kalman Filtering'
github.com/sampower88/P...
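For context, the standard stable streaming recursion here is Welford's algorithm, where the update weight 1/n plays the role of a Kalman gain. A sketch of that recursion (the note's derivation may of course present it differently):

```python
class RunningMoments:
    """Numerically stable streaming mean/variance (Welford's recursion)."""

    def __init__(self) -> None:
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations from the mean

    def update(self, x: float) -> None:
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n  # the gain 1/n is the Kalman-gain analogue
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self) -> float:
        """Unbiased sample variance of the data seen so far."""
        return self.m2 / (self.n - 1) if self.n > 1 else float("nan")

rm = RunningMoments()
for x in [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]:
    rm.update(x)
print(rm.mean, rm.variance)  # 5.0 and 32/7 ≈ 4.571
```

The point of the two-pass-free form is that it avoids the catastrophic cancellation you get from the naive E[X²] − E[X]² formula.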
More miscellaneous rejection sampling stuff, based on the 'almost normal' distribution, as introduced in a pair of papers on exact sampling with the pretty-cute title of "Quantile Mechanics". Conclusion: it is indeed almost normal.
'The Almost-Normal Distribution'
github.com/sampower88/P...
Associated bits and pieces:
Another little note, this time on the task of transporting standard exponential random variables into (positive) standard Gaussians, in an 'algebraically simple' way. As hinted in an earlier post, it's quite possible to do efficiently!
"Exponential-to-Gaussian Sampling"
github.com/sampower88/P...
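As a baseline for comparison: the non-algebraic route to this transport just composes two inverse CDFs, going exponential → uniform → positive Gaussian. A hedged sketch of that baseline (my illustration, using the stdlib `statistics.NormalDist`; it is emphatically not the 'algebraically simple' approach of the note):

```python
import math
import random
from statistics import NormalDist

_STD_NORMAL = NormalDist()

def exp_to_half_gaussian(e: float) -> float:
    """Map an Exp(1) draw to a positive standard Gaussian draw.

    exp(-e) is Uniform(0, 1) when e ~ Exp(1); the half-normal CDF is
    2*Phi(x) - 1 for x >= 0, so inverting it completes the transport.
    """
    u = math.exp(-e)  # Uniform(0, 1)
    return _STD_NORMAL.inv_cdf((1.0 + u) / 2.0)

rng = random.Random(1)
xs = [exp_to_half_gaussian(-math.log(1.0 - rng.random()))
      for _ in range(50_000)]
# theoretical half-normal mean is sqrt(2/pi) ≈ 0.798
print(sum(xs) / len(xs))
```

The catch with this route is the inverse Gaussian CDF itself, which is only available via numerical approximation, hence the interest in transports built from simpler algebraic operations.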
Going into it, I was wondering whether it was "just a good, brute-force implementation", but I'm now more confident that it's genuinely interesting, and that one can say some neat things about it. (Though whether there are new things to say about it in 2026, I can't be certain.)
An afternoon's work: sitting with the Ziggurat method (the go-to method for simulating 'standard' random variables "at scale") and making sense of things:
"Rejection Sampling with the Ziggurat Method"
github.com/sampower88/P...
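The Ziggurat itself needs precomputed layer tables, so here is instead the classic exponential-envelope rejection sampler for the half-normal, the simpler scheme that the Ziggurat refines (and which resembles its tail step). This is a textbook construction, not code from the note:

```python
import math
import random

def half_normal(rng: random.Random) -> float:
    """Sample |Z| with Z ~ N(0, 1) by rejection from an Exp(1) envelope.

    Target:   f(x) = sqrt(2/pi) * exp(-x^2 / 2), x >= 0
    Proposal: g(x) = exp(-x)
    Envelope: M = sup f/g = sqrt(2e/pi), attained at x = 1, so the
    acceptance probability f(x) / (M g(x)) simplifies to
    exp(-(x - 1)^2 / 2).
    """
    while True:
        x = -math.log(1.0 - rng.random())  # Exp(1) proposal
        if rng.random() <= math.exp(-0.5 * (x - 1.0) ** 2):
            return x

rng = random.Random(2)
xs = [half_normal(rng) for _ in range(50_000)]
# theoretical half-normal mean is sqrt(2/pi) ≈ 0.798
print(sum(xs) / len(xs))
```

The expected number of proposals per accepted sample is M = sqrt(2e/π) ≈ 1.315; the Ziggurat's layered envelope drives that overhead much closer to 1, at the cost of the table setup.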
Very nice, Matti. I remember thinking about the Semi-Separable HMC idea a lot during my PhD (with some idea to extend it in the presence of other graphical model-type structure) and always felt it was a promising approach to the RM setup.
Moreover, the deficiencies of Chebyshev in this context have almost nothing to do with finite variance being a weak assumption, and everything to do with under-using the structure of independence.
One consequence of digesting the Catoni estimator is that if you're serious about forming good confidence sets for means on the basis of independent random variables of finite variance, then Chebyshev's inequality is unacceptably bad (if tractable).
bsky.app/profile/spmo...
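To quantify "unacceptably bad": for the sample mean of n iid variables with variance σ², Chebyshev gives a confidence half-width of σ/√(nδ), while Gaussian-type tails (the benchmark that Catoni-style estimators match up to constants) give roughly σ√(2 log(2/δ)/n). A quick hedged comparison (my illustration, not the Catoni construction itself):

```python
import math

def chebyshev_halfwidth(sigma: float, n: int, delta: float) -> float:
    # P(|mean - mu| >= t) <= sigma^2 / (n t^2)  =>  t = sigma / sqrt(n delta)
    return sigma / math.sqrt(n * delta)

def subgaussian_halfwidth(sigma: float, n: int, delta: float) -> float:
    # Gaussian-tail width: t = sigma * sqrt(2 log(2/delta) / n)
    return sigma * math.sqrt(2.0 * math.log(2.0 / delta) / n)

sigma, n, delta = 1.0, 1_000, 0.01
print(chebyshev_halfwidth(sigma, n, delta))    # ≈ 0.316
print(subgaussian_halfwidth(sigma, n, delta))  # ≈ 0.103
```

The gap is the factor 1/√δ versus √(2 log(2/δ)): polynomial versus logarithmic in the failure probability, so it widens without bound as δ shrinks.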
A production of @adriencorenflos.bsky.social !
Thank you!