
Posts by Rui-Yang Zhang

Max Hird, Samuel Livingstone: High-dimensional Adaptive MCMC with Reduced Computational Complexity https://arxiv.org/abs/2604.09286

1 week ago
Bayesian Quadrature: Gaussian Processes for Integration Bayesian quadrature is a probabilistic, model-based approach to numerical integration, the estimation of intractable integrals, or expectations. Although Bayesian quadrature was popularised already in...
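To make the idea concrete, here is a minimal one-dimensional sketch (my own illustration, not taken from the linked paper): a GP prior with an RBF kernel against the uniform measure on [0, 1], where the posterior mean of the integral is the weighted sum zᵀK⁻¹f(X), with kernel-mean weights z available in closed form via the error function. All function names and parameter values below are arbitrary choices.

```python
import numpy as np
from scipy.special import erf

def bq_mean(f, n=15, ell=0.3, jitter=1e-8):
    """Bayesian quadrature posterior mean for the integral of f over [0, 1],
    under a GP prior with RBF kernel k(x, y) = exp(-(x - y)^2 / (2 ell^2)).
    A minimal 1-D illustration; parameter values are arbitrary."""
    x = np.linspace(0.0, 1.0, n)                            # design points
    K = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * ell ** 2))
    # Kernel mean z_i = integral of k(x, x_i) over [0, 1], closed form via erf.
    c = ell * np.sqrt(np.pi / 2)
    z = c * (erf((1 - x) / (ell * np.sqrt(2))) + erf(x / (ell * np.sqrt(2))))
    w = np.linalg.solve(K + jitter * np.eye(n), z)          # BQ weights K^{-1} z
    return w @ f(x)

est = bq_mean(lambda x: x ** 2)   # true value of the integral is 1/3
```

The same linear algebra also yields a posterior variance over the integral, which is what makes the approach probabilistic rather than just another quadrature rule.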

Clearly a must read for anyone even remotely interested in numerical integration (be it stochastic or deterministic).

Toni has been (hyper)active in studying these methods both theoretically and practically.

arxiv.org/abs/2602.16218

2 months ago

Congrats!

2 months ago
IMSS Lecture 2026

The UCL IMSS Annual Lecture will take place on the 27th of April with a keynote from @lestermackey.bsky.social.

The theme is 'Computational Statistics and Machine Learning' and we'll have talks from Alessandro Barp, Paula Cordero Encinar & Po-Ling Loh.

imss2026.github.io

@statisticsucl.bsky.social

2 months ago

Very excited to announce the ProbAI Theory of Scaling Laws Workshop (warwick.ac.uk/fac/sci/stat...) at @warwickstats.bsky.social, 22-24 June! (1/4)

2 months ago

Accurate and thorough representation of prior and related work is one of the cornerstones of good research.

It is shocking to me that so many published NeurIPS papers, even from top institutions, have fabricated references.

I recommend reading the original report: gptzero.me/news/neurips/

3 months ago

Mathematical Colloquium (at King's College London): A duality in the foundations of probability and statistics through history by Vladimir Vovk

www.kcl.ac.uk/events/mathe...

3 months ago
Post image

How do large language models interpret words relating to probability like “unlikely,” “probably,” or “almost certain”?

The below shows what happens when we compare judgements from different models to a benchmark dataset of human judgements (data from: github.com/zonination/p...).

4 months ago

Standard MCMC algorithms are typically guaranteed to work well when used to sample from target distributions for which

i) mass is reasonably well-concentrated in the centre of the state space, and
ii) the log-density is smooth and of moderate growth.

Outside of this setting, things can go poorly.
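For illustration (my own, not from the original post), both conditions hold for a standard Gaussian target, on which even the simplest random-walk Metropolis sampler behaves well; the step size and iteration count below are arbitrary.

```python
import numpy as np

def rwm(logpi, x0, n_iter=50_000, step=1.0, seed=0):
    """Random-walk Metropolis: a baseline MCMC sampler that is well-behaved
    when the target has well-concentrated mass and a smooth log-density."""
    rng = np.random.default_rng(seed)
    x, lp = x0, logpi(x0)
    out = np.empty(n_iter)
    for i in range(n_iter):
        y = x + step * rng.normal()              # symmetric Gaussian proposal
        ly = logpi(y)
        if np.log(rng.uniform()) < ly - lp:      # Metropolis accept/reject
            x, lp = y, ly
        out[i] = x
    return out

# A target satisfying both conditions: the standard Gaussian.
samples = rwm(lambda x: -0.5 * x ** 2, x0=0.0)
```

On heavy-tailed or rapidly growing log-densities the same sampler can mix arbitrarily slowly, which is exactly the failure mode alluded to above.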

4 months ago

The recording of my talk on 'Multilevel neural simulation-based inference' at the 'One World Approximate Bayesian Inference' seminar series is now available on YouTube.

Link: www.youtube.com/watch?v=hBWd...

5 months ago

Preferential sampling refers to scenarios where the observation locations are confounded with the very field of interest that those same observations are used to infer. This recent arXiv preprint (arxiv.org/abs/2511.03158) looked at how harmful ignoring preferential sampling would be: not much, according to the paper.
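A toy sketch of the mechanism (my own illustration, not from the preprint): sites are drawn with probability increasing in the field value, so site-averaged summaries are biased upward. The field model and sampling intensity below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
# A toy 1-D latent field on a grid of 200 cells (a rough random walk).
field = 0.1 * np.cumsum(rng.normal(size=200))
# Preferential sampling: observation sites are drawn with probability
# increasing in the field value, i.e. we observe more where it is high.
p = np.exp(field)
p /= p.sum()
sites = rng.choice(200, size=50, replace=False, p=p)
obs = field[sites] + 0.05 * rng.normal(size=50)
# The site distribution up-weights high-field regions, so the expected
# site average (p * field).sum() exceeds the spatial mean field.mean();
# that gap is the bias a naive analysis would inherit.
```

Whether this bias matters for downstream inference is precisely the question the preprint investigates.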

5 months ago

I’ll be giving a talk on a recently accepted NeurIPS paper at the next OWABI seminar on Thursday. The talk will cover simulation-based inference and how you can enhance accuracy when you have cheap approximate simulators at hand.

5 months ago
The Principles of Diffusion Models This monograph presents the core principles that have guided the development of diffusion models, tracing their origins and showing how diverse formulations arise from shared mathematical ideas. Diffu...

"The Principles of Diffusion Models" by Chieh-Hsin Lai, Yang Song, Dongjun Kim, Yuki Mitsufuji, Stefano Ermon. arxiv.org/abs/2510.21890
It might not be the easiest introduction to diffusion models, but this monograph is an amazing deep dive into the math behind them and all the nuances.

5 months ago

Let me advertise a bit our Online Monte Carlo seminar:

This coming Tuesday, we have Giorgos Vasdekis speaking on some very interesting recent work.

Moreover, we have confirmed our speaker line-up through until December - very exciting!

See sites.google.com/view/monte-c... for further details.

5 months ago

The first talk of the season will be this coming Tuesday (23 September), given by Alexandre Bouchard-Côté from UBC. Alex is a great speaker, so do join if you have the chance!

See sites.google.com/view/monte-c... for details, links, and so on.

7 months ago

Returning soon - stay tuned!

sites.google.com/view/monte-c...

7 months ago

Join us online for a discussion on
“Statistical exploration of the Manifold Hypothesis” and an opportunity to explore the intersection of geometry, statistics and machine learning.

📅 Wed 08 Oct | 🕓 4–6pm UK
🔗 Register + download the paper: rss.org.uk/training-eve...

7 months ago

“Everyone knows” what an autoencoder is… but there's an important complementary picture missing from most introductory material.

In short: we emphasize how autoencoders are implemented—but not always what they represent (and some of the implications of that representation).🧵
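One classical instance of the representational view (my own illustration, not necessarily the thread's): a linear autoencoder trained with squared loss represents nothing more than the principal subspace of the data, so its optimum can be written down directly via the SVD rather than trained. A minimal sketch, with arbitrary synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Data with an (almost) 2-D linear structure embedded in 5 dimensions.
X = rng.normal(size=(500, 2)) @ rng.normal(size=(2, 5))
X += 0.01 * rng.normal(size=(500, 5))    # small isotropic noise
X -= X.mean(axis=0)                      # centre the data, as PCA assumes

# The optimal rank-k linear autoencoder under squared loss represents
# exactly the top-k principal subspace, so the SVD gives its solution.
k = 2
_, _, Vt = np.linalg.svd(X, full_matrices=False)
encode = lambda a: a @ Vt[:k].T          # code = coordinates in the subspace
decode = lambda z: z @ Vt[:k]            # reconstruction = projection back
X_hat = decode(encode(X))                # near-perfect: data is near rank 2
```

The point of the example is that what the model *represents* (a subspace) is clear before any talk of architectures or optimisers.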

7 months ago

Gearing up for this workshop next week, with the finalised schedule attached!

For those unable to attend in person but interested in watching, the talks will be streamed live on MS Teams. Please do get in touch with me if you'd like to stay informed about the stream.

7 months ago

An announcement, which might be of some interest:

In the period 2022-2024, a number of other postdocs and I, working on the "CoSInES" and "Bayes4Health" EPSRC grants, organised a series of internal tutorial workshops on topics relevant to researchers in computational statistics.

7 months ago

Very cool!

7 months ago

New paper on arXiv! And I think it's a good'un 😄

Meet the new Lattice Random Walk (LRW) discretisation for SDEs. It’s radically different from traditional methods like Euler-Maruyama (EM) in that each iteration can only move in discrete steps {-δₓ, 0, δₓ}.
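The post doesn't spell out the transition probabilities, but a natural moment-matching construction (my assumption; the paper's exact scheme may differ) picks the three probabilities so that each step has mean μ(x)Δt and variance approximately σ(x)²Δt:

```python
import numpy as np

def lrw_step(x, mu, sigma, dt, dx, rng):
    """One lattice random walk step: move by -dx, 0, or +dx with
    probabilities matching the drift and diffusion of the SDE
    dX = mu(X) dt + sigma(X) dW to first order (a moment-matching
    sketch; the paper's exact construction may differ)."""
    a = sigma(x) ** 2 * dt / dx ** 2        # diffusion contribution
    b = mu(x) * dt / dx                     # drift contribution
    p_up, p_down = (a + b) / 2, (a - b) / 2
    assert min(p_up, p_down) >= 0 and p_up + p_down <= 1, "shrink dt or grow dx"
    u = rng.uniform()
    if u < p_up:
        return x + dx
    if u < p_up + p_down:
        return x - dx
    return x

# Sanity check on an OU process dX = -X dt + dW (stationary variance 1/2).
rng = np.random.default_rng(1)
dt = 0.01
dx = (2 * dt) ** 0.5
path = np.empty(200_000)
x = 0.0
for i in range(path.size):
    x = lrw_step(x, mu=lambda s: -s, sigma=lambda s: 1.0, dt=dt, dx=dx, rng=rng)
    path[i] = x
```

Unlike EM, each increment is one of three lattice values rather than a Gaussian draw, so the whole trajectory lives on the grid x₀ + δₓℤ.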

7 months ago

Just finished delivering a course on 'Robust and scalable simulation-based inference (SBI)' at Greek Stochastics. This covered an introduction to SBI, open challenges, and some recent contributions from my own group.

The slides are now available here: fxbriol.github.io/pdfs/slides-....

7 months ago

📣 Please share: We invite submissions to the 29th International Conference on Artificial Intelligence and Statistics (#AISTATS 2026) and welcome paper submissions at the intersection of AI, machine learning, statistics, and related areas. [1/3]

8 months ago

Liwen Xue, Axel Finke, Adam M. Johansen: Online Rolling Controlled Sequential Monte Carlo https://arxiv.org/abs/2508.00696

8 months ago

Really enjoyed listening to this interview with Mike Giles. Only knew him from his multilevel Monte Carlo work, and it was quite a nice surprise to learn about his contributions to CFD and experiences with industrial collaborations!

8 months ago
Video

we're out here simulating, visualising, thriving

9 months ago

Congrats!!!

9 months ago