Max Hird, Samuel Livingstone: High-dimensional Adaptive MCMC with Reduced Computational Complexity https://arxiv.org/abs/2604.09286 https://arxiv.org/pdf/2604.09286 https://arxiv.org/html/2604.09286
Posts by Rui-Yang Zhang
Clearly a must-read for anyone even remotely interested in numerical integration (be it stochastic or deterministic).
Toni has been (hyper)active in studying these methods both theoretically and practically.
arxiv.org/abs/2602.16218
Congrats!
The UCL IMSS Annual Lecture will take place on 27 April with a keynote from @lestermackey.bsky.social.
The theme is 'Computational Statistics and Machine Learning' and we'll have talks from Alessandro Barp, Paula Cordero Encinar & Po-Ling Loh.
imss2026.github.io
@statisticsucl.bsky.social
Very excited to announce the ProbAI Theory of Scaling Laws Workshop (warwick.ac.uk/fac/sci/stat...) at @warwickstats.bsky.social, 22-24 June! (1/4)
Accurate and thorough representation of prior and related work is one of the cornerstones of good research.
It is shocking to me that so many published NeurIPS papers, even from top institutions, have fabricated references.
I recommend reading the original report: gptzero.me/news/neurips/
Mathematical Colloquium (at King's College London): A duality in the foundations of probability and statistics through history by Vladimir Vovk
www.kcl.ac.uk/events/mathe...
How do large language models interpret words relating to probability like “unlikely,” “probably,” or “almost certain”?
The results below show what happens when we compare judgements from different models against a benchmark dataset of human judgements (data from: github.com/zonination/p...).
Usual MCMC algorithms are typically guaranteed to work well when used to sample from target distributions for which
i) mass is reasonably well-concentrated in the centre of the state space, and
ii) the log-density is smooth and of moderate growth.
Outside of this setting, things can go poorly.
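To make conditions (i)-(ii) concrete, here is a generic textbook random-walk Metropolis sampler (a minimal sketch of the "usual" kind of MCMC algorithm the post refers to, not code from the thread itself). On a standard normal target, which satisfies both conditions, it mixes well:

```python
import numpy as np

def rwm(logpi, x0, n, step, rng):
    """Random-walk Metropolis with Gaussian proposals (textbook version)."""
    x, lp = x0, logpi(x0)
    out = np.empty(n)
    for i in range(n):
        y = x + step * rng.standard_normal()   # propose a local move
        lq = logpi(y)
        if np.log(rng.random()) < lq - lp:     # Metropolis accept/reject
            x, lp = y, lq
        out[i] = x
    return out

rng = np.random.default_rng(1)
# A standard normal target satisfies (i) and (ii), so RWM behaves well.
s = rwm(lambda z: -0.5 * z * z, 0.0, 50_000, 2.4, rng)
print(s.mean(), s.std())  # close to 0 and 1
```

On a heavy-tailed or highly irregular log-density, the same chain can stall or wander, which is exactly the failure mode the post alludes to.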
The recording of my talk on 'Multilevel neural simulation-based inference' at the 'One World Approximate Bayesian Inference' seminar series is now available on YouTube.
Link: www.youtube.com/watch?v=hBWd...
Preferential sampling refers to scenarios where observation locations are confounded with the field of interest that those same observations are used to infer. This recent arXiv preprint (arxiv.org/abs/2511.03158) looked at how harmful ignoring preferential sampling can be - not very, according to the paper.
I’ll be giving a talk on a recently accepted NeurIPS paper at the next OWABI seminar on Thursday. The talk will cover simulation-based inference and how you can enhance accuracy when you have cheap approximate simulators at hand.
"The Principles of Diffusion Models" by Chieh-Hsin Lai, Yang Song, Dongjun Kim, Yuki Mitsufuji, Stefano Ermon. arxiv.org/abs/2510.21890
It might not be the easiest intro to diffusion models, but this monograph is an amazing deep dive into the math behind them and all the nuances
Let me advertise a bit our Online Monte Carlo seminar:
This coming Tuesday, we have Giorgos Vasdekis speaking on some very interesting recent work.
Moreover, we have confirmed our speaker line-up through to December - very exciting!
See sites.google.com/view/monte-c... for further details.
The first talk of the season will be this coming Tuesday (23 September), given by Alexandre Bouchard-Côté from UBC. Alex is a great speaker, so do join if you have the chance!
See sites.google.com/view/monte-c... for details, links, and so on.
Returning soon - stay tuned!
sites.google.com/view/monte-c...
Join us online for a discussion on
“Statistical exploration of the Manifold Hypothesis” and an opportunity to explore the intersection of geometry, statistics and machine learning.
📅 Wed 08 Oct | 🕓 4–6pm UK
🔗 Register + download the paper: rss.org.uk/training-eve...
“Everyone knows” what an autoencoder is… but there's an important complementary picture missing from most introductory material.
In short: we emphasize how autoencoders are implemented—but not always what they represent (and some of the implications of that representation).🧵
Gearing up for this workshop next week, with the finalised schedule attached!
For those who are unable to attend in person but would like to watch, the talks will be streamed live on MS Teams. Please do get in touch with me if you'd like to stay informed about the stream.
An announcement, which might be of some interest:
In the period 2022-2024, I and a number of other postdocs on the "CoSInES" and "Bayes4Health" EPSRC grants organised a series of internal tutorial workshops on topics relevant to researchers in computational statistics.
Very cool!
New paper on arXiv! And I think it's a good'un 😄
Meet the new Lattice Random Walk (LRW) discretisation for SDEs. It’s radically different from traditional methods like Euler-Maruyama (EM): each iteration can only move by a discrete step from {-δₓ, 0, +δₓ}.
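To illustrate the idea, here is a simple moment-matching lattice-random-walk step (my own hedged sketch of the general construction, not necessarily the scheme in the paper): the chain moves by -δₓ, 0, or +δₓ with probabilities chosen so that the increment's first two moments match the SDE's Euler increments.

```python
import numpy as np

def lrw_step(x, drift, diffusion, dt, dx, rng):
    """One lattice-random-walk step: move by -dx, 0, or +dx, with
    probabilities matching the mean and second moment of the
    Euler-Maruyama increment (a generic moment-matching construction)."""
    m1 = drift(x) * dt                        # target mean of increment
    m2 = diffusion(x) ** 2 * dt + m1 ** 2     # target second moment
    p_plus = (m2 + m1 * dx) / (2 * dx ** 2)
    p_minus = (m2 - m1 * dx) / (2 * dx ** 2)
    u = rng.random(x.shape)
    # +1 with prob p_plus, -1 with prob p_minus, else stay put
    return x + dx * ((u < p_plus).astype(float) - (u > 1 - p_minus))

# Ornstein-Uhlenbeck test case: dX = -X dt + dW, X0 = 1
rng = np.random.default_rng(0)
dt, dx, n_steps, n_paths = 0.01, 0.2, 100, 10_000
x = np.ones(n_paths)
for _ in range(n_steps):
    x = lrw_step(x, lambda z: -z, lambda z: 1.0, dt, dx, rng)
print(x.mean())  # ≈ exp(-1) ≈ 0.368
```

Every state visited lies on the lattice {1 + k·δₓ : k ∈ ℤ}, which is the property that distinguishes this family of schemes from EM's continuous increments.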
Just finished delivering a course on 'Robust and scalable simulation-based inference (SBI)' at Greek Stochastics. This covered an introduction to SBI, open challenges, and some recent contributions from my own group.
The slides are now available here: fxbriol.github.io/pdfs/slides-....
📣 Please share: We invite submissions to the 29th International Conference on Artificial Intelligence and Statistics (#AISTATS 2026) and welcome paper submissions at the intersection of AI, machine learning, statistics, and related areas. [1/3]
Liwen Xue, Axel Finke, Adam M. Johansen: Online Rolling Controlled Sequential Monte Carlo https://arxiv.org/abs/2508.00696 https://arxiv.org/pdf/2508.00696 https://arxiv.org/html/2508.00696
Really enjoyed listening to this interview with Mike Giles. Only knew him from his multilevel Monte Carlo work, and it was quite a nice surprise to learn about his contributions to CFD and experiences with industrial collaborations!
we're out here simulating, visualising, thriving
Congrats!!!