On Feb 9, Jonas Arruda and @alex-andorra.bsky.social will give a live demo on diffusion models for SBI using BayesFlow. Don't miss out!
www.linkedin.com/feed/update/...
Posts by BayesFlow
All diffusion models, hyperparameter settings, and samplers are available in BayesFlow 2!
Diffusion models & flow matching are reshaping simulation-based inference.
Thus, we wrote the first tutorial review on diffusion-based SBI. For an overview or a deep dive, check it out and let us know what you think:
arXiv: arxiv.org/abs/2512.20685
Web: bayesflow-org.github.io/diffusion-ex...
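To make the diffusion idea concrete, here is a minimal numpy sketch of the variance-preserving forward noising process that diffusion-based SBI methods learn to reverse. This is a toy illustration, not BayesFlow code; the schedule and shapes are illustrative assumptions.

```python
import numpy as np

# Toy variance-preserving forward diffusion: gradually noise a batch of
# "parameter" samples theta_0 over T steps. Diffusion-based SBI trains a
# network to reverse this process, conditioned on simulated data.
rng = np.random.default_rng(0)

T = 100
betas = np.linspace(1e-4, 0.2, T)        # linear noise schedule (assumption)
alpha_bars = np.cumprod(1.0 - betas)     # cumulative signal retention

theta_0 = rng.normal(2.0, 0.5, size=(10_000, 1))  # samples from a toy prior

def noise_to_step(theta_0, t, rng):
    """Sample theta_t | theta_0 in closed form for a VP diffusion."""
    a = np.sqrt(alpha_bars[t])
    s = np.sqrt(1.0 - alpha_bars[t])
    return a * theta_0 + s * rng.normal(size=theta_0.shape)

theta_T = noise_to_step(theta_0, T - 1, rng)
# By the final step the marginal is close to a standard normal,
# which is what makes sampling-by-reversal possible.
print(theta_T.mean(), theta_T.std())
```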
Simulations are no longer just "nice to have." They're reshaping how we do statistics.
Care to learn more? Check out our paper arxiv.org/abs/2503.24011, accepted for publication in the upcoming theme issue of Philosophical Transactions A.
BayesFlow released version 2.0.4, presented numerous findings at the MathPsych/ICCM 2025 conference at Ohio State University, and grew its contributor list to 25 active members! Congratulations to the BayesFlow team on these accomplishments!
🧠 Check out the classic examples from Bayesian Cognitive Modeling: A Practical Course (Lee & Wagenmakers, 2013), translated into step-by-step tutorials with BayesFlow!
Interactive version: kucharssim.github.io/bayesflow-co...
PDF: osf.io/preprints/ps...
I'm vengeance.
Finite mixture models are useful when data comes from multiple latent processes.
BayesFlow allows:
• Approximating the joint posterior of model parameters and mixture indicators
• Inference for independent and dependent mixtures
• Amortization for fast and accurate estimation
📄 Preprint
💻 Code
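The latent-indicator idea above can be made concrete with a tiny two-component Gaussian mixture in plain numpy. This is a conceptual sketch, not the BayesFlow API: the mixture parameters are fixed here, whereas BayesFlow would infer them jointly with the indicators.

```python
import numpy as np

# Two-component Gaussian mixture with latent indicators z: each
# observation is drawn from component 1 with probability w, else
# component 2. The posterior over z given x is available in closed form.
rng = np.random.default_rng(42)

w = 0.3                                   # mixture weight of component 1
mus, sigmas = np.array([0.0, 3.0]), np.array([1.0, 1.0])

n = 50_000
z = rng.random(n) < w                     # latent indicator per observation
x = rng.normal(np.where(z, mus[0], mus[1]),
               np.where(z, sigmas[0], sigmas[1]))

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Exact posterior probability that an observation came from component 1:
p1 = w * normal_pdf(x, mus[0], sigmas[0])
p2 = (1 - w) * normal_pdf(x, mus[1], sigmas[1])
post_z = p1 / (p1 + p2)

# Calibration check: averaged over data, the posterior indicator
# probabilities recover the prior mixture weight.
print(post_z.mean())
```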
BayesFlow is a library for amortized Bayesian inference with neural networks.
✅ Multi-backend via Keras 3: use PyTorch, TensorFlow, or JAX
✅ Modern nets: flow matching, diffusion, consistency models, normalizing flows, transformers
✅ Built-in diagnostics and plotting
🔗 github.com/bayesflow-or...
A study with 5M+ data points explores the link between cognitive parameters and socioeconomic outcomes: The stability of processing speed was the strongest predictor.
BayesFlow facilitated efficient inference for complex decision-making models, scaling Bayesian workflows to big data.
📄 Paper
Join us this Thursday for a talk on efficient mixture and multilevel models with neural networks by @paulbuerkner.com at the new @approxbayesseminar.bsky.social!
1️⃣ An agent-based model simulates a dynamic population of professional speed climbers.
2️⃣ BayesFlow handles amortized parameter estimation in the SBI setting.
📣 Shoutout to @masonyoungblood.bsky.social & @sampassmore.bsky.social
📄 Preprint: osf.io/preprints/ps...
💻 Code: github.com/masonyoungbl...
Neural superstatistics is a framework for probabilistic models with time-varying parameters:
✅ Joint estimation of stationary and time-varying parameters
✅ Amortized parameter inference and model comparison
✅ Multi-horizon predictions and leave-future-out CV
📄 Paper 1
📄 Paper 2
💻 BayesFlow Code
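A minimal numpy sketch of the kind of generative process superstatistics targets: a low-level observation model whose parameter drifts over time as a random walk, with a stationary transition scale on top. This illustrates the model class only; the neural estimator itself is not shown, and the smoothing step is a crude stand-in, not the paper's method.

```python
import numpy as np

# Low-level model: x_t ~ Normal(mu_t, 0.1)
# High-level model: mu_t follows a Gaussian random walk with scale tau.
# tau is a stationary parameter; mu_t is the time-varying one.
rng = np.random.default_rng(1)

T = 500
tau = 0.05
mu = np.cumsum(tau * rng.normal(size=T))   # latent time-varying parameter
x = rng.normal(mu, 0.1)                    # observations

# A crude moving-average recovery of the latent trajectory:
window = 20
kernel = np.ones(window) / window
mu_hat = np.convolve(x, kernel, mode="same")

# The smoothed estimate tracks the slow latent drift closely.
print(np.corrcoef(mu_hat[window:-window], mu[window:-window])[0, 1])
```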
The software implementation elegantly uses BayesFlow's modular data pipeline:
- Observables are embedded by a summary network.
- Context information (e.g., prior and likelihood type) bypasses the summary net and enters the normalizing flow as direct conditions.
💻 Code: github.com/bayesflow-or...
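At the shape level, the two pathways described above look roughly like this. The function and dimension names are illustrative assumptions, not BayesFlow's internal API: a fixed summary function stands in for the learned summary network.

```python
import numpy as np

# Observables -> summary network -> embedding; context skips the summary
# network and is concatenated as a direct condition for the flow.
rng = np.random.default_rng(0)

def summary_net(x):
    """Stand-in for a learned summary network: mean and std per dataset."""
    return np.stack([x.mean(axis=-1), x.std(axis=-1)], axis=-1)

batch, n_obs = 8, 100
x = rng.normal(size=(batch, n_obs))            # raw observables
context = rng.integers(0, 2, size=(batch, 1))  # e.g. a prior/likelihood flag

embedding = summary_net(x)                                   # (batch, 2)
conditions = np.concatenate([embedding, context], axis=-1)   # (batch, 3)
print(conditions.shape)
```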
The paper was led by @elseml.bsky.social, with multiple high-impact applications:
🦠 Disease outbreak modeling
🌍 Global warming thresholds
🧠 Human decision-making
✨ Sensitivity-aware amortized inference substantially broadens the amortization scope. Another step towards a Bayesian foundation model!
Any single analysis hides an iceberg of uncertainty.
Sensitivity-aware amortized inference explores the iceberg:
✅ Test alternative priors, likelihoods, and data perturbations
✅ Deep ensembles flag misspecification issues
✅ No model refits required during inference
🔗 openreview.net/forum?id=Kxt...
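The deep-ensemble point can be illustrated with a toy example: fit an ensemble of simple models on bootstrap resamples and use their disagreement as an out-of-distribution / misspecification flag. Polynomial fits stand in for the neural approximators here; this is a conceptual numpy sketch, not the paper's method.

```python
import numpy as np

# Train data lives on [-1, 1]; an ensemble of cubic fits agrees there
# but disagrees strongly when queried far outside the training support.
rng = np.random.default_rng(3)

x = rng.uniform(-1, 1, size=200)
y = np.sin(2 * x) + 0.1 * rng.normal(size=200)

ensemble = []
for _ in range(20):
    idx = rng.integers(0, len(x), len(x))     # bootstrap resample
    coeffs = np.polyfit(x[idx], y[idx], deg=3)
    ensemble.append(coeffs)

def ensemble_std(x_query):
    """Disagreement (std across ensemble members) at a query point."""
    preds = np.array([np.polyval(c, x_query) for c in ensemble])
    return preds.std(axis=0)

in_dist, out_dist = ensemble_std(0.0), ensemble_std(3.0)
print(in_dist, out_dist)   # disagreement grows off the training support
```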
Hi, thanks for reaching out!
In the context of amortized inference, it's been shown that many of the algorithms we use are susceptible to adversarial attacks, and this can be mitigated by regularizing with respect to the Fisher information.
📄 Paper by @mackelab.bsky.social:
arxiv.org/abs/2305.14984
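For intuition: the Fisher information measures how sharply a model's output distribution reacts to parameter changes, so penalizing it damps the network's sensitivity to small perturbations. Below is a Monte-Carlo check of the classic identity I(mu) = E[(d/dmu log p(x|mu))^2] = 1/sigma^2 for a Gaussian mean — a conceptual numpy sketch, not the paper's regularizer.

```python
import numpy as np

# Monte-Carlo estimate of the Fisher information of a Gaussian mean:
# the score is d/dmu log N(x | mu, sigma^2) = (x - mu) / sigma^2, and
# the expected squared score equals 1 / sigma^2.
rng = np.random.default_rng(7)

mu, sigma = 1.5, 2.0
x = rng.normal(mu, sigma, size=200_000)

score = (x - mu) / sigma**2
fisher_mc = np.mean(score**2)

print(fisher_mc, 1 / sigma**2)   # both close to 0.25
```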
To celebrate the new beginnings on Bluesky, let's reminisce about one of our highlights from the old days:
The unexpected shout-out by @fchollet.bsky.social that made everyone go crazy on the BayesFlow Slack server and led to a 15% increase in GitHub stars.