
Posts by Ruby Sedgwick

Trying to train RNNs in a biologically plausible (local) way? Well, try our new method using predictive alignment. Paper just out in Nature Communications. Toshitake Asabuki deserves all the credit!
www.nature.com/articles/s41...

8 months ago

Excited to be presenting this work at ICML!

Our Bayesian method allows for causal discovery using more flexible assumptions that better reflect real-world data.

Come chat with us: Tues 4:30pm, East Exhibition Hall A-B, E-1912
Paper: arxiv.org/abs/2411.10154

9 months ago
Weighted-Sum of Gaussian Process Latent Variable Models: This work develops a Bayesian non-parametric approach to signal separation where the signals may vary according to latent variables. Our key contribution is to augment Gaussian Process Latent Variable...

(8/8) Check out the paper here: arxiv.org/abs/2402.09122
It was a pleasure working with James Odgers, Chrysoula Kappatou, Ruth Misener and Sarah Filippi on this project! If you are interested in knowing more, we’d love to hear from you.

11 months ago

(7/8) We demonstrate this approach on a synthetic test case, a spectroscopy dataset and oil flow data. Compared to baselines like inverse linear model of coregionalisation, classical least squares and partial least squares, WS-GPLVM achieves competitive or better performance.

11 months ago

(6/8) This means we get not only predictions of the component weights, but also a measure of uncertainty in these values. The Bayesian component weights and latent variables make the calculation of the evidence lower bound more challenging, and we show how this can be done.

11 months ago
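For readers less familiar with variational inference, an evidence lower bound of the kind mentioned in (6/8) generically takes this shape under a mean-field factorisation (the symbols W for component weights and H for latent variables are my own labels, and this is the standard ELBO template rather than the paper's exact derivation):

```latex
\log p(Y) \;\ge\;
\mathbb{E}_{q(W)\,q(H)}\!\big[\log p(Y \mid W, H)\big]
\;-\; \mathrm{KL}\big(q(W)\,\|\,p(W)\big)
\;-\; \mathrm{KL}\big(q(H)\,\|\,p(H)\big)
```

Having both W and H random (rather than point-estimated, as in a standard GPLVM) is what makes the expectation term harder to evaluate.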

(5/8) At the core of this approach is the idea that each pure signal depends on a latent variable, and these signals combine linearly. We also treat the component weights in a Bayesian way, allowing for the inclusion of useful priors, such as summing-to-one.

11 months ago
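The generative story in (5/8) can be sketched as a toy simulation. This is a minimal sketch under my own assumptions (an RBF kernel, scalar latent variables, one independent GP draw per component and channel, and a Dirichlet prior to encode the sum-to-one weights), not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf_kernel(x, lengthscale=0.5, variance=1.0):
    # Squared-exponential kernel over the latent locations.
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

n_samples, n_channels, n_components = 5, 100, 2

# One scalar latent variable per sample.
H = rng.normal(size=n_samples)
K = rbf_kernel(H) + 1e-6 * np.eye(n_samples)   # jitter for stability
L = np.linalg.cholesky(K)

# Each pure signal is a GP function of the latent variable:
# one independent GP draw per (component, channel) pair.
z = rng.normal(size=(n_components, n_channels, n_samples))
pure = np.einsum("ij,cdj->cdi", L, z)          # pure[c, d, n]

# Sum-to-one component weights via a Dirichlet prior.
weights = rng.dirichlet(np.ones(n_components), size=n_samples)

# Observed mixtures: per-sample weighted sum of pure signals, plus noise.
Y = np.einsum("nc,cdn->nd", weights, pure)
Y += 0.01 * rng.normal(size=(n_samples, n_channels))
```

Because the latent variable H enters through the kernel, samples with similar latents get similar pure signals, which is exactly the kind of smooth between-sample variation the thread describes.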

(4/8) This variability makes the separation task much harder. Most existing methods assume fixed pure signals. We introduce WS-GPLVM - a Bayesian nonparametric model that relaxes those assumptions.

11 months ago

(3/8) Take spectroscopy, for example: following the Beer-Lambert law, each observed spectrum is a linear combination of the pure-component spectra, but these pure-component spectra vary depending on experimental conditions.

11 months ago
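The linear-mixing assumption in (3/8) is easy to illustrate. In this toy example (the Gaussian "absorption peaks" and the mixture fractions are invented for illustration, not taken from the paper), each observed spectrum is just a weighted sum of the pure-component spectra:

```python
import numpy as np

wavelengths = np.linspace(0, 1, 200)

# Two hypothetical pure-component spectra: Gaussian absorption peaks.
pure = np.stack([
    np.exp(-((wavelengths - 0.3) ** 2) / 0.005),
    np.exp(-((wavelengths - 0.7) ** 2) / 0.005),
])                                  # shape (2 components, 200 channels)

# Mixture fractions for two samples; each row sums to one.
weights = np.array([[0.25, 0.75],
                    [0.60, 0.40]])

# Beer-Lambert-style mixing: observed = weights @ pure.
observed = weights @ pure           # shape (2 samples, 200 channels)
```

Signal separation is the inverse problem: recover `weights` (and here, the varying `pure` spectra) from `observed` alone.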

(2/8) In many real-world datasets, each observation is a mixture of underlying signals, with no observations of the pure signals. Think: chemical spectra, audio sources, hyperspectral images.
But what happens when the pure signals vary between samples due to some unobserved variables?

11 months ago
Illustrative example of WS-GPLVM. This image shows how the model can retrieve the component mixtures, latent variables and pure spectra from a data set where some spectra have known weight fractions and some don't.

New paper at #AISTATS2025: Weighted-Sum of Gaussian Process Latent Variable Models (WS-GPLVM)
We tackle a core challenge in signal separation where pure components vary nonlinearly across samples using latent variable Gaussian processes. (1/8)

11 months ago