Posts by Thomas Pinder
If you maintain downstream code on top of GPJax, this is the RC to stress-test. Bug reports are welcome on GitHub; tag them with the 0.14-migration label so we can triage before cutting the stable release. In the absence of severe issues, we'll remove the release-candidate tag on April 20th.
If you only use the high-level API (Prior, Posterior, fit), upgrading is a small set of call-site changes. The migration guide walks through the three that matter: reading parameters with .unwrap(), freezing them with paramax.non_trainable, and dropping the old params_bijection/trainable kwargs from fit.
The main change is that we have swapped the NNX backend for @patrickkidger.bsky.social's Equinox. Additionally, we've integrated paramax for parameter constraints and Lineax for a feature-rich linear algebra layer, and removed our custom bijectors in favour of NumPyro constraints.
GPJax 0.14.0rc1 is out!
Together with @theorashid.bsky.social, we've removed 2,200 lines of code whilst deepening our connection to the scientific JAX ecosystem, and streamlined our NumPyro integration to support bigger Bayesian models.
"Questionable practices in machine learning"
arxiv.org/pdf/2407.12220
Perhaps. Hatch is backed by the PyPA, though, and I believe the two tools can (and should) coexist.
Would Hatch be a solution here? Using uv with Hatch is simple, and you can group dependencies per environment. With 'skip-install = true' you can achieve the above.
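Roughly what I mean, as a minimal pyproject.toml sketch (the env name "docs" and its dependencies are placeholder examples, not anyone's actual config):

```toml
# pyproject.toml (fragment) -- hypothetical example environment
[tool.hatch.envs.docs]
installer = "uv"        # have Hatch delegate installs to uv
skip-install = true     # don't install the project itself into this env
dependencies = [        # a dependency group scoped to this env
  "mkdocs",
  "mkdocs-material",
]

[tool.hatch.envs.docs.scripts]
build = "mkdocs build"  # run with: hatch run docs:build
```

Each `[tool.hatch.envs.<name>]` table defines an isolated environment with its own dependency group, so tooling deps stay out of your runtime requirements.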
We migrated to Hatch a while back in GPJax and I’ve been pleased. github.com/JaxGaussianP...
Any thoughts or feedback are welcome!
I got a Probabilistic Programming starter pack going. Hit me up if you're involved with #probprog R&D and want in!
go.bsky.app/JfvubEf
Hey! I write GPJax and some ProbProg JAX software. I’d love to be part of the list.
Hey! I’d love to be added please
Thanks for creating! Would love to be added