#BitVI

Cool work done by @sladeka.bsky.social together with @arnosolin.bsky.social!

Unfortunately, @sladeka.bsky.social has fallen ill and cannot attend @auai.org in person. Please join the virtual poster session at: www.auai.org/uai2025/gath... if you are interested in hearing more about #BitVI.

Entropy of BitVI for varying complexity of the target distribution.

Bayesian neural networks with varying levels of numerical precision.

Moreover, #BitVI can help identify the numerical precision required to represent a target density, a crucial task in the quantisation of neural networks and in low-precision regimes.
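
One hypothetical way to illustrate that idea (this is not the paper's procedure, just a toy heuristic): given samples of the learned bitstrings, look at the marginal entropy of each bit position; positions whose marginals are essentially uniform carry little information about the target and hint at how many bits of precision are actually needed. The helper `bit_entropies` and the synthetic data below are purely illustrative.

```python
import numpy as np

def bit_entropies(bit_samples):
    """Marginal entropy (in bits) of each bit position, from 0/1 samples."""
    p = np.clip(np.mean(bit_samples, axis=0), 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

# Hypothetical example: 8-bit strings where the last two bits are pure noise.
rng = np.random.default_rng(0)
samples = np.concatenate(
    [np.tile([0, 1, 0, 1, 1, 0], (1000, 1)), rng.integers(0, 2, size=(1000, 2))],
    axis=1,
)
print(bit_entropies(samples).round(2))  # deterministic bits ~0.0, noisy bits ~1.0
```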

Illustration of fixed-point numbers.

Binary decision tree induced by BitVI for fixed-point representations.

Resulting circuit model used in BitVI with fixed-point representations.

In #BitVI, we exploit the fixed-point representation of numbers and a tractable variational approximation based on #circuits, thus enabling efficient ELBO computation and control of the numerical precision. Additionally, BitVI is easily extendable to other number systems, such as floating-point.
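
As a rough sketch of what a tree-structured approximation over bitstrings could look like (purely illustrative, not the paper's exact circuit parameterisation): assume one Bernoulli probability per node of a binary decision tree over fixed-point bit prefixes, sample a bit per level, and estimate the ELBO by Monte Carlo. The names `sample_bits`, `log_q`, `decode`, `elbo_estimate` and the 1+3+4-bit layout are all assumptions made for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n_bits = 8  # sign + 3 integer + 4 fractional bits (illustrative layout)

# Hypothetical parameterisation: one Bernoulli logit per node of the
# binary decision tree over bit prefixes (2**n_bits - 1 internal nodes).
logits = rng.normal(size=2**n_bits - 1)

def sample_bits(logits, rng):
    """Walk the tree root-to-leaf, sampling one bit per level."""
    bits, node = [], 0
    for _ in range(n_bits):
        p = 1 / (1 + np.exp(-logits[node]))
        b = int(rng.random() < p)
        bits.append(b)
        node = 2 * node + 1 + b  # left child if b=0, right child if b=1
    return bits

def log_q(bits, logits):
    """Log-probability of a bitstring under the tree-structured q."""
    logp, node = 0.0, 0
    for b in bits:
        p = 1 / (1 + np.exp(-logits[node]))
        logp += np.log(p if b else 1 - p)
        node = 2 * node + 1 + b
    return logp

def decode(bits):
    """Map the bitstring to a real value (sign bit, then magnitude bits)."""
    mag = sum(b << i for i, b in zip(reversed(range(n_bits - 1)), bits[1:]))
    return (-1) ** bits[0] * mag / 2**4

def elbo_estimate(log_target, logits, n_samples=1000, rng=rng):
    """Monte Carlo ELBO: E_q[log p(x)] - E_q[log q(x)]."""
    total = 0.0
    for _ in range(n_samples):
        bits = sample_bits(logits, rng)
        total += log_target(decode(bits)) - log_q(bits, logits)
    return total / n_samples

# Example: unnormalised log-density of a unit Gaussian target.
print(elbo_estimate(lambda x: -0.5 * x**2, logits))
```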

BitVI on 1D Gaussian mixture models.

Remember that computers use bitstrings to represent numbers? We exploit this in our recent @auai.org paper and introduce #BitVI.

#BitVI directly learns an approximation in the space of bitstring representations, thus capturing complex distributions under varying numerical precision regimes.
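
As a reminder of what that means in practice (a toy illustration, not code from the paper): a signed fixed-point number with a hypothetical 1 sign + 3 integer + 4 fractional bit layout is just a bitstring. The helpers `to_fixed_point` and `from_fixed_point` below are assumptions made for this example.

```python
# Illustrative sketch of a fixed-point representation (hypothetical layout).
def to_fixed_point(x, int_bits=3, frac_bits=4):
    """Encode x as a sign bit followed by magnitude bits."""
    sign = 0 if x >= 0 else 1
    scaled = round(abs(x) * 2**frac_bits)
    scaled = min(scaled, 2**(int_bits + frac_bits) - 1)  # clamp to range
    bits = [(scaled >> i) & 1 for i in reversed(range(int_bits + frac_bits))]
    return [sign] + bits

def from_fixed_point(bits, int_bits=3, frac_bits=4):
    """Decode the bitstring back into a real number."""
    sign, mag = bits[0], bits[1:]
    value = sum(b << i for i, b in zip(reversed(range(int_bits + frac_bits)), mag))
    return (-1) ** sign * value / 2**frac_bits

print(to_fixed_point(2.5))                    # [0, 0, 1, 0, 1, 0, 0, 0]
print(from_fixed_point(to_fixed_point(2.5)))  # 2.5
```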
