This release brought together 20 contributors, 9 of them first-time. A big thank you to everyone who contributed code, reviews, issues, and feedback. 🙌
Try it: uv pip install sbi==0.26.1
Full changelog: github.com/sbi-dev/sbi/...
New how-to guides: using Pyro models with sbi, hyperparameter tuning with @optuna.bsky.social, embedding time-series data, and a guide on the different abstraction levels available in sbi.
Under the hood, we refactored the training loop into modular components with typed config dataclasses. Net builder kwargs are now proper config objects instead of nested dicts: much easier to inspect and extend. And we made several security-related improvements (you never know 🤞).
Neural nets: we ported the Mixture Density Network in-house (no more pyknos dependency) and improved its numerical stability. We now have EDM-style noise schedules for score-based models. And the transformer embedding handles 1D time-series data with automatic input projection.
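For context on the EDM-style schedules mentioned above: the recipe from "Elucidating the Design Space of Diffusion-Based Generative Models" (Karras et al., 2022) spaces noise levels by power-law interpolation in sigma^(1/rho). A minimal plain-Python sketch of that recipe (the function name `edm_sigmas` is hypothetical and the defaults follow the EDM paper; this is an illustration, not sbi's actual implementation):

```python
# Hypothetical helper sketching the EDM noise schedule
# (sigma_min, sigma_max, rho defaults follow Karras et al., 2022).
def edm_sigmas(n_steps, sigma_min=0.002, sigma_max=80.0, rho=7.0):
    """Noise levels spaced by power-law interpolation in sigma^(1/rho)."""
    inv = 1.0 / rho
    lo, hi = sigma_min ** inv, sigma_max ** inv
    return [
        (hi + i / (n_steps - 1) * (lo - hi)) ** rho
        for i in range(n_steps)
    ]

sigmas = edm_sigmas(10)
# starts at sigma_max and decays quickly toward sigma_min,
# concentrating sampling steps at low noise levels
```

The large `rho` makes the schedule spend most of its steps at small sigmas, which is where sample quality is decided.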
New methods:
- Amortized VI posteriors: learn likelihood once, get posteriors for new obs instantly, no MCMC
- Diffusion model guidance: change your prior or likelihood without retraining
- NPE for IID data without embedding nets: via MCMC, VI & importance sampling.
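To give a flavor of the importance-sampling route from the list above: draw candidates from the prior and weight each by the joint likelihood of all iid observations. A toy conjugate-Gaussian sketch in plain Python (illustrative only, with an analytic likelihood standing in for a learned one; this is not sbi's API):

```python
import math, random

random.seed(1)

# Toy conjugate setup: prior theta ~ N(0, 1), x | theta ~ N(theta, 1),
# three iid observations, so the exact posterior mean is sum(obs)/(n+1).
obs = [1.8, 2.2, 2.0]

def log_lik(theta):
    # joint log-likelihood of all iid observations (up to a constant)
    return sum(-0.5 * (x - theta) ** 2 for x in obs)

# Self-normalized importance sampling with the prior as proposal
thetas = [random.gauss(0.0, 1.0) for _ in range(20000)]
logw = [log_lik(t) for t in thetas]
m = max(logw)                       # subtract max for numerical stability
w = [math.exp(lw - m) for lw in logw]
post_mean = sum(wi * ti for wi, ti in zip(w, thetas)) / sum(w)

exact = sum(obs) / (len(obs) + 1)   # conjugate ground truth: 1.5
print(post_mean, exact)
```

In the sbi setting, the analytic `log_lik` would be replaced by a learned density evaluated once per observation.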
sbi v0.26.1 is out 🎉. We initially planned this release for January, but then the Grenoble Hackathon and GSoC applications happened. Now we have three new methods, better neural nets, cleaner internals, better docs, and 9 new contributors 🤗.
Highlights below 🧵
GSoC really is a win-win: students gain OSS experience with dedicated mentorship, while open source projects get substantial improvements. Thanks to Massimiliano and Abel, to Manuel for mentoring and @janboelts.bsky.social for mentoring and coordination, and of course GSoC and @numfocus.bsky.social 🙏
Abel @abelabate.bsky.social improved our codebase with software engineering best practices: strong typing via dataclasses, clearly defined interfaces using protocols, and systematic refactoring. Multiple PRs transformed our internal architecture for better maintainability and developer experience.
This was a massive undertaking: masked transformers, score-based diffusion variants, comprehensive tests & tutorials - all integrated into sbi's existing API. Kudos to Massimiliano for navigating this complexity!
@nmaax.bsky.social implemented the Simformer, which combines transformers + diffusion models to learn arbitrary conditioning between parameters and data, enabling "all-in-one" SBI: posterior/likelihood estimation, predictive sampling, and even missing-data imputation 🚀
🎉 sbi participated in GSoC 2025 through @numfocus.bsky.social, and it was a great success: our two students contributed major new features and substantial internal improvements. 🧵👇
Materials from my EuroSciPy talk "Pyro meets SBI" are now available: github.com/janfb/pyro-meets-sbi
I show how we can use @sbi-devs.bsky.social-trained neural likelihoods in Pyro 🔥
Check it out if you need hierarchical Bayesian inference but your simulator / model has no tractable likelihood.
This release brought together 14 first-time contributors with our core team.
A big shout-out to the community and everyone contributing to this release 🎉🙌
📦 (uv) pip install sbi --upgrade
💻 Join us: github.com/sbi-dev/sbi
Full Changelog: github.com/sbi-dev/sbi/...
Two more highlights: your sbi-trained NLE can now be wrapped into a Pyro model object for flexible hierarchical inference. And based on your feedback, we added .to(device) for priors and posteriors: switching between CPU and GPU is now even easier!
6/7
Here's where it gets wild: we unified flow matching (ODEs) and score-based models (SDEs). Train with one, sample with the other. E.g., train with the flexibility and stability of flow matching, then handle iid data with score-based posterior sampling. 🤯
5/7
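The ODE/SDE connection above can be seen in a toy picture: for a simple Gaussian example, the same score function drives both the deterministic probability-flow ODE and the reverse-time SDE, and either sampler recovers the target. A plain-Python sketch under the VP-SDE with an exact analytic score (illustrative only, not sbi code):

```python
import math, random

random.seed(0)

# Toy 1-D target p0 = N(2, 0.25) under the forward VP-SDE
# dx = -0.5*beta*x dt + sqrt(beta) dW, with constant beta.
MU0, VAR0, BETA = 2.0, 0.25, 2.0

def marginal(t):
    """Closed-form mean and variance of x_t under the forward SDE."""
    decay = math.exp(-BETA * t)
    return MU0 * math.sqrt(decay), VAR0 * decay + (1.0 - decay)

def score(x, t):
    """Exact score of the Gaussian marginal at time t."""
    m, v = marginal(t)
    return -(x - m) / v

def sample(n, steps=500, stochastic=False):
    """Euler integration from t=1 back to t=0.
    stochastic=False -> probability-flow ODE (deterministic)
    stochastic=True  -> reverse-time SDE
    Both samplers use the *same* score function."""
    dt = 1.0 / steps
    m1, v1 = marginal(1.0)
    out = []
    for _ in range(n):
        x = random.gauss(m1, math.sqrt(v1))   # start at the t=1 marginal
        for k in range(steps):
            t = 1.0 - k * dt
            coef = BETA if stochastic else 0.5 * BETA  # g^2 vs g^2 / 2
            drift = -0.5 * BETA * x - coef * score(x, t)
            x -= drift * dt
            if stochastic:
                x += math.sqrt(BETA * dt) * random.gauss(0.0, 1.0)
        out.append(x)
    return out

ode = sample(400)                      # both sample means land near MU0 = 2
sde = sample(400, stochastic=True)
```

The only difference between the two samplers is the drift coefficient on the score plus the noise injection, which is exactly why a model trained in one view can be sampled in the other.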
We completely rebuilt our documentation! Switched to Sphinx for a cleaner, more modular structure. No more wading through lengthy tutorials: now you get short, targeted how-to guides for exactly what you need, plus streamlined tutorials for getting started.
📚 sbi.readthedocs.io/en/latest/
4/7
New inference methods: MNPE now handles mixed discrete and continuous parameters for posterior estimation (like MNLE but for posteriors).
And for our nostalgic users: we finally added SNPE-B, that classic sequential variant you've been asking about since 2020.
3/7
After the creative burst of the hackathon in March, we spent months cleaning up, testing, and polishing. Rebasing ten exciting feature branches into main takes time, but the result is worth it.
2/7
From hackathon to release: sbi v0.25 is here! 🎉
What happens when dozens of SBI researchers and practitioners collaborate for a week? New inference methods, new documentation, lots of new embedding networks, a bridge to Pyro, and a bridge between flow matching and score-based methods 🤯
1/7 ๐งต
More great news from the SBI community! 🎉
Two projects have been accepted for Google Summer of Code under the NumFOCUS umbrella, bringing new methods and general improvements to sbi. Big thanks to @numfocus.bsky.social, GSoC and our future contributors!
[Image: a wide shot of approximately 30 people standing in a line, posing for a group photograph outdoors, with a clear blue sky, trees, and a distant cityscape or hills in the background.]
Great news! Our March SBI hackathon in Tübingen was a huge success, with 40+ participants (30 onsite!). Expect significant updates soon: awesome new features & a revamped documentation you'll love! Huge thanks to our amazing SBI community! Release details coming soon. 🔥 🎉
🎉 Exciting news! We are launching sbi office hours!
Join the sbi developers Thursdays, 9:45-10:15 am CET, via Zoom (link in the sbi Discord's "office hours" channel).
Get guidance on contributing, explore sbi for your research, or troubleshoot issues. Come chat with us! 🤗
github.com/sbi-dev/sbi/...
sbi 0.24.0 is out! 🎉 This comes with important new features:
- 🎯 Score-based i.i.d. sampling
- 🔀 Simultaneous estimation of multiple discrete and continuous parameters or data
- 📊 mini-sbibm for quick benchmarking
Just in time for our 1-week SBI hackathon starting tomorrow---stay tuned for more!
🙏 Please help us improve the SBI toolbox! 🙏
In preparation for the upcoming SBI Hackathon, we're running a user study to learn what you like, what we can improve, and how we can grow.
📝 Please share your thoughts here: forms.gle/foHK7myV2oaK...
Your input will make a big difference. Thank you! 🙏
What to expect:
- Coding sessions to enhance the sbi toolbox
- Research talks & lightning talks
- Networking & idea exchange
📍 In-person attendance is encouraged, but a remote option is available.
It's free to attend, but seats are limited. Beginners are welcome! 🤗
Let's push SBI forward, together! 🚀
🎉 Join the 4th SBI Hackathon! 🎉
The last SBI hackathon was a fantastic milestone in forming a collaborative open-source community around SBI. Be part of it this year as we build on that momentum!
📅 March 17-21, 2025
📍 Tübingen, Germany, or remote
👉 Details: github.com/sbi-dev/sbi/...
More info: 🧵👇
🙌 Huge thanks to our contributors for this release, including 5 first-time contributors! 🎉
Special shoutout to:
emmanuel-ferdman, CompiledAtBirth, tvwenger, matthewfeickert, and manuel-morales-a 👏
Let us know what you think of the new version!
✨ Highlights in v0.23.3:
- sbi is now available via conda-forge 🛠️
- we now support MCMC sampling with multiple i.i.d. conditions 🎯 (this is for you, decision-making researchers)
💡 Plus, improved docs here and there, clarified SNPE-A behavior, and a couple of bug fixes.
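Conceptually, conditioning MCMC on multiple i.i.d. observations just means the likelihood is evaluated once per observation inside the MCMC target. A toy random-walk Metropolis sketch with an analytic Gaussian likelihood standing in for a learned one (plain Python, illustrative only, not sbi's API):

```python
import math, random

random.seed(2)

# Toy analogue: one posterior conditioned on several iid observations.
# Model: theta ~ N(0, 1), x_i | theta ~ N(theta, 1), five iid observations.
obs = [0.9, 1.1, 1.3, 0.8, 1.4]

def log_post(theta):
    # log prior plus one log-likelihood term per iid observation
    return -0.5 * theta ** 2 + sum(-0.5 * (x - theta) ** 2 for x in obs)

# Random-walk Metropolis
theta, chain = 0.0, []
for _ in range(30000):
    prop = theta + random.gauss(0.0, 0.5)
    delta = log_post(prop) - log_post(theta)
    if delta >= 0 or random.random() < math.exp(delta):
        theta = prop
    chain.append(theta)

post_mean = sum(chain[5000:]) / len(chain[5000:])
exact = sum(obs) / (len(obs) + 1)   # conjugate ground truth
print(post_mean, exact)
```

With a learned likelihood, `log_post` would sum the network's log-density over the observed conditions instead of the analytic Gaussian terms.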
🎉 Just in time for the end of the year, we've released a new version of sbi!
📦 v0.23.3 comes packed with exciting features, bug fixes, and docs updates to make sbi smoother and more robust. Check it out! 👀
📋 Full changelog: github.com/sbi-dev/sbi/...