In other work, we investigate meta-learning as a way to implement these ideas. The advantage is that a generative model can directly learn the conditional distribution of interest, without the bottleneck of approximate inference!
For more on that, see bsky.app/profile/anis... 3/3
Posts by Mark van der Wilk
This does lead to the question: what models should we use, and how should we do inference?
We use a VAE with Gaussian Process mappings (GPLVM), but the idea applies equally to Bayesian NNs, if inference can be made to work! 2/3
More in our investigation of Bayesian model selection for causal discovery in multivariate graphs.
As previously, the message is: Causal discovery requires assumptions, and Bayes enables soft, realistic assumptions. Good Bayesian inference then leads to good performance. 1/3
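To make the "Bayes enables soft assumptions" point concrete, here is a minimal illustrative sketch (not the paper's method): Bayesian model selection between the two causal directions of a bivariate nonlinear additive-noise model, scored by GP marginal likelihoods. All names and hyperparameter values here are assumptions for the demo; in practice hyperparameters would be inferred or marginalised rather than fixed.

```python
# Illustrative sketch: pick a causal direction by comparing marginal
# likelihoods of X -> Y vs Y -> X, each modelled as a unit-Gaussian cause
# plus a GP mapping with additive noise. Hyperparameters are fixed for
# simplicity (an assumption of this demo, not the paper's setup).
import numpy as np

def gp_log_marginal(x, y, lengthscale=1.0, signal=1.0, noise=0.1):
    """Log marginal likelihood of y | x under a GP prior with RBF kernel."""
    d = x[:, None] - x[None, :]
    K = signal**2 * np.exp(-0.5 * (d / lengthscale)**2)
    K += noise**2 * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * len(x) * np.log(2 * np.pi))

def gauss_log_marginal(x):
    """Log density of the (standardised) cause under a unit Gaussian."""
    return np.sum(-0.5 * x**2 - 0.5 * np.log(2 * np.pi))

# Synthetic data where X genuinely causes Y through a nonlinear map.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = np.tanh(2 * x) + 0.1 * rng.normal(size=100)
x, y = (x - x.mean()) / x.std(), (y - y.mean()) / y.std()

score_xy = gauss_log_marginal(x) + gp_log_marginal(x, y)  # model X -> Y
score_yx = gauss_log_marginal(y) + gp_log_marginal(y, x)  # model Y -> X
print("X -> Y" if score_xy > score_yx else "Y -> X")
```

The soft assumption doing the work is the GP prior over mappings: the anti-causal direction has to explain the noisier inverse relationship, so its marginal likelihood is penalised without any hard constraint being imposed.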
Today at NeurIPS, we'll be presenting our Noether's Razor paper! ✨
Today, Fri, Dec 13
⏰ 11 a.m. – 2 p.m. PST
East Exhibit Hall A-C, #4710 (ALL the way in the back, I believe!)
w/ @mvdw.bsky.social @pimdh.bsky.social
Come say hi!