
Posts by Adrian Seyboldt

Nutpie

🥧 nutpie has a website now! pymc-devs.github.io/nutpie/
If you're doing Bayesian inference with PyMC or Stan, this might be worth checking out. Nutpie can sample PyMC and Stan models, and is typically twice as fast.
#BayesianStats #PyMC #Stan

10 months ago 5 0 0 1
PyTensor Chat — egglog Python documentation

And we also experimented with egg a bit: egglog-python.readthedocs.io/latest/expla...

10 months ago 1 0 1 0

Do you have a write-up somewhere of how that works on an example? I can't think of a reason we couldn't do the same thing in pymc with pytensor. After all, we also have the model graph in a data structure and can analyze and modify it.

10 months ago 0 0 2 0

Had a lot of fun on the podcast! Hope it is as much fun listening to it as it was recording :-)

10 months ago 7 0 1 0

Cool stuff, will have to do some reading :-) If you want to add a sampler to this, would be fun to combine it with nuts-rs.

10 months ago 1 0 1 0

pytensor (and with it pymc) will apply many of those helpers automatically via rewrites where appropriate, even if you write naive code. It doesn't always catch everything, so it is still good to know about them, but it can help beginners a lot.
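One classic example of this kind of rewrite is replacing a naive `log(1 + x)` with the numerically stable `log1p`. A minimal numpy sketch of why that matters (pytensor itself isn't needed to see the effect):

```python
import numpy as np

# For very small x, the sum 1 + x rounds to exactly 1.0 in float64,
# so the naive expression loses all the information in x:
x = 1e-18
naive = np.log(1 + x)   # 1 + 1e-18 == 1.0, so this evaluates to 0.0
stable = np.log1p(x)    # computes log(1 + x) accurately: ~1e-18
```

A rewrite system can swap the first form for the second behind your back, which is exactly the kind of fix a beginner wouldn't know to make by hand.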

10 months ago 4 0 0 0

I found that using zero-sum constrained regression values and then taking the softmax to map them to the simplex is usually very nice to work with.
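A minimal numpy sketch of that zero-sum-then-softmax mapping (the values and the number of groups are made up for illustration):

```python
import numpy as np

def softmax(z):
    # numerically stable softmax: shift by the max before exponentiating
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# hypothetical unconstrained regression values for 4 groups
raw = np.array([0.5, -1.0, 2.0, 0.3])
z = raw - raw.mean()   # zero-sum constrained values
p = softmax(z)         # point on the simplex: all positive, sums to 1
```

The zero-sum constraint removes the additive non-identifiability of the softmax (adding a constant to all inputs leaves the output unchanged), which is why the combination tends to sample well.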

1 year ago 1 0 0 0

My first instinct about how to model this isn't to use a MvNormal, but maybe to have one scalar variable for the total volume, and then do a regression on the simplex that tells you what ratio of the total volume is in which region?
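A tiny numpy sketch of that decomposition (illustrative numbers only, not a fitted model): draw a positive scalar for the total volume and a point on the simplex for the regional shares, then multiply:

```python
import numpy as np

rng = np.random.default_rng(0)

total = np.exp(rng.normal(10.0, 0.5))   # one positive scalar: total volume
shares = rng.dirichlet(np.ones(5))      # point on the simplex: fraction per region
regional = total * shares               # per-region volumes, summing to total
```

Splitting the problem this way lets you model the overall scale and the regional composition with separate, independently interpretable priors.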

1 year ago 2 0 1 0

I don't get it, what's so strange about that quoted sentence? A bit pretentious? But if you turn all nouns *and verbs* into "something", how would any sentence survive?

1 year ago 0 0 0 0

You could also do `use std::ops::Neg; num.ln().neg().ln().neg()`, but I'm not sure I'd really like to read it that way unless it's in a longer postfix chain anyway...
I sometimes just write `f64::ln(num)` though. A bit verbose with the type all the time, but I don't think it's too bad.

1 year ago 2 0 0 0

Funny, I would not want to go from arviz/xarray (with properly chosen dims and coords) to a dataframe. The only time I do that is if I want to make a plot with seaborn, but that's simply a `values.to_dataframe()` call away...
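For reference, a minimal sketch of what that conversion looks like with a plain xarray `DataArray` (the dims and values are made up, standing in for posterior draws):

```python
import numpy as np
import xarray as xr

# made-up "posterior" draws: 2 chains x 3 draws of a parameter "mu"
da = xr.DataArray(
    np.arange(6.0).reshape(2, 3),
    dims=("chain", "draw"),
    coords={"chain": [0, 1], "draw": [0, 1, 2]},
    name="mu",
)

df = da.to_dataframe()   # long-format DataFrame indexed by (chain, draw)
```

With well-chosen dims and coords, the resulting frame already has the tidy long format that plotting libraries like seaborn expect.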

1 year ago 2 0 1 0

I'm here too :-)

1 year ago 3 0 0 0

I'd also love to be part of the list :-)

1 year ago 1 0 1 0
CUDA semantics — PyTorch 2.5 documentation A guide to torch.cuda, a PyTorch module to run CUDA operations

You can do this easily in pytorch: pytorch.org/docs/stable/...
Also seems to work with onnx (github.com/pymc-devs/nu...)
But for some reason I can't find any references in the jax docs. I'm really confused by this by the way, and maybe I just misunderstand something...

1 year ago 1 0 0 0

I don't think you would have to write a kernel. The main problem with NUTS on the GPU seems to be that the GPU waits while we check the turning criterion. But we could easily keep the GPU busy during that time with a different chain, and CUDA streams are a mechanism for exactly this.

1 year ago 2 0 1 0

Really cool :-)
One thing that has always bugged me in jax is that I can't find a way to use multiple CUDA streams. I think at least part of the NUTS overhead goes away if different chains run in different streams, so the GPU doesn't have to sit idle when a different chain could run.

1 year ago 1 0 1 0