Posts by Approximate Bayes Seminar

The next OWABI seminar www.warwick.ac.uk/owabi will be given by Oksana Chkrebtii (Ohio State University), who will talk about "Likelihood-free Posterior Density Learning for Uncertainty Quantification in Inference Problems" on Thursday the 30th April at 1pm UK time (note the different time!).

Abstract: Generative models and those with computationally intractable likelihoods are widely used to describe complex systems in the natural sciences, social sciences, and engineering. Fitting these models to data requires likelihood-free inference methods that explore the parameter space without explicit likelihood evaluations, relying instead on sequential simulation, which comes at the cost of computational efficiency and extensive tuning. We develop an alternative framework called kernel-adaptive synthetic posterior estimation (KASPE) that uses deep learning to directly reconstruct the mapping between the observed data and a finite-dimensional parametric representation of the posterior distribution, trained on a large number of simulated datasets. We provide theoretical justification for KASPE and a formal connection to the likelihood-based approach of expectation propagation. Simulation experiments demonstrate KASPE's flexibility and performance relative to existing likelihood-free methods, including approximate Bayesian computation, in challenging inferential settings involving posteriors with heavy tails, multiple local modes, and inference over the parameters of a nonlinear dynamical system.

arxiv.org/pdf/2508.0...

MS Teams link: teams.microsoft.com/...
Meeting ID: 374 263 351 331 567
Passcode: B8Em6Cg9
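The core idea of the abstract above — learning an amortized map from data to the parameters of a posterior approximation, using only simulations — can be sketched in a toy conjugate setting. This is not KASPE itself (the model, the linear "network", and the Gaussian posterior family below are all illustrative assumptions), but it shows why regression on simulated (theta, x) pairs recovers the posterior map without any likelihood evaluations.

```python
import numpy as np

# Illustrative stand-in for the amortized-posterior idea: learn a map from
# data x to the parameters (mean, variance) of a Gaussian posterior
# approximation, using only simulated (theta, x) pairs.
# Hypothetical toy model: theta ~ N(0, 1), x | theta ~ N(theta, sigma2).
rng = np.random.default_rng(0)
sigma2 = 1.0
n_sim = 100_000

theta = rng.normal(0.0, 1.0, n_sim)                  # draws from the prior
x = theta + rng.normal(0.0, np.sqrt(sigma2), n_sim)  # simulated datasets

# Fit the posterior-mean map E[theta | x] by least squares (a linear stand-in
# for the deep network); the residual variance estimates the posterior variance.
A = np.column_stack([x, np.ones(n_sim)])
coef, *_ = np.linalg.lstsq(A, theta, rcond=None)
slope, intercept = coef
resid_var = np.var(theta - A @ coef)

# Conjugate analysis gives E[theta | x] = x / (1 + sigma2) and
# Var[theta | x] = sigma2 / (1 + sigma2), so both should be close to 0.5 here.
print(slope, resid_var)
```

Because the toy model is conjugate, the learned map can be checked against the exact posterior; in a realistic KASPE-style application the regression would be a neural network and the posterior family richer.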
OWABI⁷, 25 March 2026: Robust Simulation Based Inference (11am UK time)
Speaker: Larry Wasserman (Carnegie Mellon University)
Title: Robust Simulation Based Inference
warwick.ac.uk/fac/sc...

The next OWABI seminar www.warwick.ac.uk/owabi will be given by Larry Wasserman (Carnegie Mellon University), who will talk about "Robust Simulation Based Inference" on Wednesday the 25th March at 2pm UK time (note the different day and time!) on Teams teams.microsoft.com/...

Abstract: Simulation-Based Inference (SBI) is an approach to statistical inference where simulations from an assumed model are used to construct estimators and confidence sets. SBI is often used when the likelihood is intractable and to construct confidence sets that do not rely on asymptotic methods or regularity conditions. Traditional SBI methods assume that the model is correct, but, as always, this can lead to invalid inference when the model is misspecified. This paper introduces robust methods that allow for valid frequentist inference in the presence of model misspecification. We propose a framework where the target of inference is a projection parameter that minimizes a discrepancy between the true distribution and the assumed model. The method guarantees valid inference, even when the model is incorrectly specified and even if the standard regularity conditions fail. Alternatively, we introduce model expansion through exponential tilting as another way to account for model misspecification. We also develop an SBI-based goodness-of-fit test to detect model misspecification. Finally, we propose two ideas that are useful in the SBI framework beyond robust inference: an SBI-based method to obtain closed-form approximations of intractable models and an active learning approach to more efficiently sample the parameter space.
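The projection parameter in the abstract above can be made concrete with a small simulation. The sketch below is not the speaker's method: the discrepancy (squared mismatch of the first two moments, estimated by simulation) and the deliberately misspecified model are illustrative choices, but they show how the target of inference becomes the best-fitting parameter within a wrong model class.

```python
import numpy as np

# Sketch of a projection parameter: the theta minimizing a discrepancy
# between simulated and observed data. Assumed (wrong) model: N(theta, 1);
# true data from N(2, 4), so no N(theta, 1) can match the observed variance.
rng = np.random.default_rng(1)
x_obs = rng.normal(2.0, 2.0, 50_000)
m1, m2 = x_obs.mean(), (x_obs**2).mean()

def discrepancy(theta, n_sim=50_000):
    """Squared mismatch of the first two moments, estimated by simulation."""
    x_sim = rng.normal(theta, 1.0, n_sim)
    return (x_sim.mean() - m1) ** 2 + ((x_sim**2).mean() - m2) ** 2

# Grid search over theta; every evaluation uses only model simulations.
grid = np.linspace(0.0, 4.0, 401)
d = np.array([discrepancy(t) for t in grid])
theta_star = grid[d.argmin()]
print(theta_star)  # lies between the true mean 2 and sqrt(m2 - 1) ~ 2.65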
The next OWABI seminar www.warwick.ac.uk/owabi will be given by Shreya Sinha Roy (Warwick), who will talk about "Prequential posteriors" on Thursday the 26th February at 11am UK time.

Abstract: Data assimilation is a fundamental task in updating forecasting models upon observing new data, with applications ranging from weather prediction to online reinforcement learning. Deep generative forecasting models (DGFMs) have shown excellent performance in these areas, but assimilating data into such models is challenging due to their intractable likelihood functions. This limitation restricts the use of standard Bayesian data assimilation methodologies for DGFMs. To overcome this, we introduce prequential posteriors, based upon a predictive-sequential (prequential) loss function, an approach naturally suited to the temporally dependent data that are the focus of forecasting tasks. Since the true data-generating process often lies outside the assumed model class, we adopt an alternative notion of consistency and prove that, under mild conditions, both the prequential loss minimizer and the prequential posterior concentrate around parameters with optimal predictive performance. For scalable inference, we employ easily parallelizable waste-free sequential Monte Carlo (SMC) samplers with preconditioned gradient-based kernels, enabling efficient exploration of high-dimensional parameter spaces such as those in DGFMs. We validate our method on both a synthetic multi-dimensional time series and a real-world meteorological dataset, highlighting its practical utility for data assimilation in complex dynamical systems.

arxiv.org/abs/2511.1...

MS Teams link: teams.microsoft.com/...
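A prequential loss, as described in the abstract above, scores a parameter by its accumulated one-step-ahead prediction error, respecting the time ordering of the data. The minimal sketch below (an assumed AR(1) toy model, not the speaker's DGFM setting) illustrates how the prequential loss minimizer targets the parameter with the best predictive performance.

```python
import numpy as np

# Sketch of a prequential (predictive-sequential) loss: accumulate
# one-step-ahead prediction errors, each forecast using only past data.
# Hypothetical model: AR(1) forecasts x_hat_t = phi * x_{t-1}.
rng = np.random.default_rng(2)
T, phi_true = 5_000, 0.8
x = np.zeros(T)
for t in range(1, T):
    x[t] = phi_true * x[t - 1] + rng.normal()

def prequential_loss(phi):
    """Sum of squared one-step-ahead errors over the whole series."""
    preds = phi * x[:-1]          # forecast of x[t] from x[t-1]
    return np.sum((x[1:] - preds) ** 2)

# The prequential loss minimizer concentrates near the best predictive value.
grid = np.linspace(0.0, 1.0, 201)
phi_hat = grid[np.argmin([prequential_loss(p) for p in grid])]
print(phi_hat)  # close to 0.8 here
```

Here the model is well specified, so the minimizer recovers phi_true; the point of the framework is that under misspecification the same loss still singles out the predictively optimal parameter.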
the observation of interest, thereby reducing the simulation cost. We also introduce several alternative sequential approaches, and discuss their relative merits. We then validate our method, as well as its amortised, non-sequential variant, on several numerical examples, demonstrating comparable or superior performance to existing state-of-the-art methods such as Sequential Neural Posterior Estimation (SNPE).

Ref: Sharrock, Simons, Liu, Beaumont, Sequential Neural Score Estimation: Likelihood-Free Inference with Conditional Score Based Diffusion Models. PMLR. openreview.net/pdf?i...
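For context on the simulation cost that sequential, observation-focused schemes reduce: the simplest likelihood-free baseline is rejection ABC, where every simulation is drawn blindly from the prior and most are discarded. The sketch below (a toy conjugate model of my choosing, not SNSE itself) makes that waste visible.

```python
import numpy as np

# Minimal rejection-ABC sketch on a toy conjugate model: simulate from the
# prior, keep only draws whose simulated data lands near the observation.
rng = np.random.default_rng(3)
x_obs, n_sim, tol = 1.0, 200_000, 0.1

theta = rng.normal(0.0, 1.0, n_sim)          # prior theta ~ N(0, 1)
x_sim = theta + rng.normal(0.0, 1.0, n_sim)  # model x | theta ~ N(theta, 1)
accepted = theta[np.abs(x_sim - x_obs) < tol]

# True posterior is N(0.5, 0.5), but only a few percent of the simulations
# are kept -- the inefficiency that sequential methods aim to avoid.
print(accepted.mean(), accepted.var(), accepted.size / n_sim)
```

Sequential schemes instead steer later rounds of simulation toward parameter regions consistent with the observation, so far fewer simulated datasets are thrown away.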