
Posts by Viacheslav Borovitskiy (Hiring PhD Students)


Bonus: a concrete success story. In the traffic prediction benchmark below, a Geometric GP (powered by GeometricKernels) significantly outperforms GNN Ensembles and Bayesian GNNs in both prediction (RMSE) and uncertainty (NLL) quality. Reproduce it with github.com/vabor112/pem...

1 month ago 1 0 0 0

💻 GitHub: github.com/geometric-ke...
📄 JMLR Paper: www.jmlr.org/papers/v26/2...

#MachineLearning #GeometricDeepLearning #GaussianProcesses #Kernels #Graphs #Manifolds #JMLR #OpenSource

1 month ago 3 0 1 0

Huge thanks to my co-authors Peter Mostowsky, @dvinnie.bsky.social, Iskander Azangulov, Noémie Jaquier, @mjhutchinson141.bsky.social, Aditya Ravuri, Leonel Rozo, @avt.im, all contributors and all users!

1 month ago 0 0 1 0

โœ… Spaces: Graphs, Meshes, Hyperspheres, Tori, Hyperbolic spaces, SPD matrices, & Lie Groups (SO(n), SU(n)).
โœ… Infrastructure: Run seamlessly on PyTorch, JAX, TensorFlow, or NumPy.
โœ… Integrations: Plug-and-play wrappers for GPyTorch & GPJax.

1 month ago 0 0 1 0

What's in the library? GeometricKernels gives you the principled Heat (Diffusion) and Matérn kernels needed to build these models out of the box.
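For intuition, both kernels are spectral filters of the graph Laplacian, e.g. (2ν/κ² + λ)^(−ν) for Matérn and exp(−tλ) for heat, in one common parametrization. A from-scratch numpy sketch of this construction on a small graph (not the library's actual API):

```python
import numpy as np

# Graph Laplacian of a 4-cycle (nodes 0-1-2-3-0).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

# Spectral decomposition L = U diag(lam) U^T.
lam, U = np.linalg.eigh(L)

def matern_kernel(nu, kappa):
    # Matern spectral filter: (2*nu/kappa**2 + lam)**(-nu).
    spec = (2.0 * nu / kappa**2 + lam) ** (-nu)
    K = U @ np.diag(spec) @ U.T
    return K / K[0, 0]  # scale so the (constant) diagonal is 1

def heat_kernel(t):
    # Heat (diffusion) spectral filter: exp(-t * lam).
    spec = np.exp(-t * lam)
    K = U @ np.diag(spec) @ U.T
    return K / K[0, 0]

K = matern_kernel(nu=1.5, kappa=1.0)
H = heat_kernel(t=0.5)
```

Both kernels are positive semi-definite by construction, since they apply a positive filter to the Laplacian's eigenvalues.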

1 month ago 0 0 1 0

Why care about kernels in 2026? To build probabilistic models on geometric domains! Kernels drive Gaussian processes. While these famously struggle in high dimensions (images, text), many geometric domains are intrinsically low-dimensional (road networks, 3D surfaces), and that is where such models shine.
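On a finite geometric domain (say, the nodes of a road network), a GP is just a multivariate Gaussian indexed by the points, and prediction with calibrated uncertainty is a couple of linear solves. A minimal, generic numpy sketch, where the squared-exponential kernel matrix is only a Euclidean stand-in for the matrix a geometric kernel would produce:

```python
import numpy as np

def gp_posterior(K, train_idx, y, noise=1e-2):
    # Posterior of a GP on a finite index set, given the full kernel
    # matrix K, observed indices train_idx and observed values y.
    Ktt = K[np.ix_(train_idx, train_idx)] + noise * np.eye(len(train_idx))
    Kxt = K[:, train_idx]
    mean = Kxt @ np.linalg.solve(Ktt, y)
    cov = K - Kxt @ np.linalg.solve(Ktt, Kxt.T)
    return mean, np.diag(cov)

# Five points on a line with a squared-exponential kernel (stand-in
# for a geometric kernel matrix on graph nodes or mesh vertices).
x = np.arange(5.0)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)
mean, var = gp_posterior(K, train_idx=[0, 4], y=np.array([0.0, 1.0]))
# Posterior variance grows away from the two observed points.
```

The same `gp_posterior` works unchanged with any positive semi-definite kernel matrix, which is exactly why principled kernels on geometric domains are the missing ingredient.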

1 month ago 0 0 1 0

Happy to share a major milestone: after years of development, we are officially launching Version 1.0 of the GeometricKernels library!

To top it off, our accompanying paper has just been published in JMLR (MLOSS)! 🎉

github.com/geometric-ke...

1 month ago 48 12 1 0

Note: I am also recruiting through the @ellis.eu PhD program.

5 months ago 1 0 0 0
Viacheslav Borovitskiy personal page

Details: vab.im/vacancies/.

5 months ago 1 0 1 0

I am hiring a fully-funded #PhD in #ML to work at the University of Edinburgh on geometric learning and uncertainty quantification.

Application deadline: 31 Dec '25. Starts May/Sep '26.
Details in the reply.

Pls RT and share with anyone interested!

5 months ago 16 7 1 3

Presenting today at #ICLR2025!
Poster session 1, 10:00-12:30, #427.
Oral session 2F, 16:18-16:30.

iclr.cc/virtual/2025...

11 months ago 8 2 0 0

Amazingly, Kacper did the bulk of the work for this ICLR oral as his undergrad thesis 💪

1 year ago 4 0 0 0
Schematic illustration of a scalar-valued residual deep GP with L hidden layers. The last layer is a scalar-valued GP on the manifold. If it is not present, the model is manifold-valued. If it is replaced with a Gaussian vector field (GVF), the model is a vector field on the manifold.


Excited to share our ICLR 2025 oral "Residual Deep Gaussian Processes on Manifolds"!

With @vabor112.bsky.social & @arkrause.bsky.social, we introduce manifold-to-manifold GPs that can be composed together, generalising deep GPs to manifolds. Applications include wind prediction & Bayes opt! 1/n

1 year ago 37 9 1 2

🚀 PhD Opportunity at Imperial

Looking for my first PhD student

🔬 Project: Impact of environmental, viral, behavioural & psycho-social exposures on health

Joint with LSHTM & UKHSA

See links below.

📅 Deadline: 7 March 2024

📩 Questions? DM me!

#PhD #HealthEquity #ImperialCollege

1 year ago 12 12 1 0
Preview
Gaussian Processes on Cellular Complexes In recent years, there has been considerable interest in developing machine learning models on graphs to account for topological inductive biases. In particular, recent attention has been given to Gau...

If you are interested in Matérn kernels on more general complexes (cellular/higher dimension), check out arxiv.org/abs/2311.01198 by @miniapeur.bsky.social et al.

1 year ago 1 0 0 0
Preview
Hodge-Compositional Edge Gaussian Processes We propose principled Gaussian processes (GPs) for modeling functions defined over the edge set of a simplicial 2-complex, a structure similar to a graph in which edges may form triangular faces. This...

This makes them perform well on tasks ranging from modeling arbitrage-free markets to pipe networks or ocean currents (arxiv.org/abs/2310.19450).

1 year ago 0 0 0 0

Crucially, the resulting kernels support automatic relevance determination of the relative importance of the three Hodge decomposition parts.

1 year ago 0 0 2 0

On top of these, you can define a Hodge-compositional Matérn kernel: a linear combination of the pure div, pure curl, and harmonic Matérn kernels, each with its own set of hyperparameters.
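Under stated assumptions (the standard spectral form of the graph Matérn kernel, and illustrative toy incidence matrices rather than the library's API), the construction can be sketched as: build one Matérn kernel per Hodge subspace from the down- and up-parts of the Hodge Laplacian, then sum them.

```python
import numpy as np

# Toy simplicial 2-complex: filled triangle 0-1-2 plus an unfilled
# cycle 1-2-3. Oriented edges e0=(0,1), e1=(1,2), e2=(0,2),
# e3=(2,3), e4=(1,3); one face (0,1,2).
B1 = np.array([[-1,  0, -1,  0,  0],   # node-to-edge incidence
               [ 1, -1,  0,  0, -1],
               [ 0,  1,  1, -1,  0],
               [ 0,  0,  0,  1,  1]], dtype=float)
B2 = np.array([[1], [1], [-1], [0], [0]], dtype=float)  # edge-to-face

def matern_filter(lam, nu, kappa):
    # Matern spectral filter evaluated at a Laplacian eigenvalue.
    return (2.0 * nu / kappa**2 + lam) ** (-nu)

def part_kernel(Lap, sigma2, nu, kappa):
    # Matern kernel supported on im(Lap), plus the projector onto it.
    lam, U = np.linalg.eigh(Lap)
    K, P = np.zeros_like(Lap), np.zeros_like(Lap)
    for l, u in zip(lam, U.T):
        if l > 1e-10:
            K += sigma2 * matern_filter(l, nu, kappa) * np.outer(u, u)
            P += np.outer(u, u)
    return K, P

def hodge_compositional_kernel(B1, B2, grad_p, curl_p, harm_p):
    # One Matern kernel per Hodge subspace, each with its own
    # (sigma2, nu, kappa) hyperparameters, summed together.
    K_grad, P_grad = part_kernel(B1.T @ B1, *grad_p)  # gradient part
    K_curl, P_curl = part_kernel(B2 @ B2.T, *curl_p)  # curl part
    P_harm = np.eye(B1.shape[1]) - P_grad - P_curl    # harmonic projector
    K_harm = harm_p[0] * matern_filter(0.0, *harm_p[1:]) * P_harm
    return K_grad + K_curl + K_harm

K = hodge_compositional_kernel(B1, B2,
                               grad_p=(1.0, 1.5, 1.0),
                               curl_p=(2.0, 0.5, 2.0),
                               harm_p=(0.5, 1.5, 1.0))
```

Because the gradient, curl, and harmonic subspaces are mutually orthogonal, the sum of the three per-subspace kernels is positive semi-definite, and fitting the three variance hyperparameters separately is what gives the automatic relevance determination mentioned in the thread.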

1 year ago 0 0 1 0

This leads to the Hodge decomposition, which splits any edge flow into three parts: pure divergence (curl-free), pure curl (div-free), and harmonic (both curl- and div-free). This allows three different Matérn kernels, one for each part.
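Concretely, with node-to-edge incidence B1 and edge-to-face incidence B2 (which satisfy B1 B2 = 0), the three parts are the orthogonal projections of the flow onto im(B1^T), im(B2), and whatever remains. A from-scratch numpy sketch on a hypothetical toy complex (the incidence matrices are illustrative, not taken from the library):

```python
import numpy as np

# Toy simplicial 2-complex: filled triangle 0-1-2 plus an unfilled
# cycle 1-2-3. Oriented edges e0=(0,1), e1=(1,2), e2=(0,2),
# e3=(2,3), e4=(1,3); one triangular face (0,1,2).
B1 = np.array([[-1,  0, -1,  0,  0],   # node-to-edge incidence
               [ 1, -1,  0,  0, -1],
               [ 0,  1,  1, -1,  0],
               [ 0,  0,  0,  1,  1]], dtype=float)
B2 = np.array([[1], [1], [-1], [0], [0]], dtype=float)  # edge-to-face

def hodge_decompose(f, B1, B2):
    # Split an edge flow f into gradient (curl-free), curl (div-free)
    # and harmonic parts; they are orthogonal because B1 @ B2 = 0.
    phi, *_ = np.linalg.lstsq(B1.T, f, rcond=None)  # node potential
    grad = B1.T @ phi                  # projection onto im(B1^T)
    c, *_ = np.linalg.lstsq(B2, f, rcond=None)      # face circulation
    curl = B2 @ c                      # projection onto im(B2)
    harm = f - grad - curl             # what remains
    return grad, curl, harm

f = np.array([1.0, -2.0, 0.5, 3.0, 1.0])
grad, curl, harm = hodge_decompose(f, B1, B2)
```

The unfilled 1-2-3 cycle is what gives this complex a nonzero harmonic component: the harmonic part has zero divergence at every node (B1 harm = 0) and zero circulation around the filled face (B2^T harm = 0).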

1 year ago 0 0 1 0

However, a simplicial 2-complex offers much more. Its structure allows characterizing key properties of edge flows using the discrete concepts of divergence (div) and curl, measuring how edge flows diverge at nodes and circulate along faces.

1 year ago 0 0 1 0
Preview
Matérn Gaussian Processes on Graphs Gaussian processes are a versatile framework for learning unknown functions in a manner that permits one to utilize prior information about their properties. Although many different Gaussian process m...

Any simplicial 2-complex comes with a Hodge Laplacian matrix, which can be used to define Matérn kernels on its edge set exactly as in the Matérn GPs on Graphs paper (arxiv.org/abs/2010.15538).

1 year ago 0 0 1 0

However, if you want to keep it simple and think about graphs rather than simplicial 2-complexes, the interface allows it: the library can define a reasonable set of triangles for any graph you provide.
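One natural default for such a rule, assuming the library's exact choice is unspecified here, is to take every 3-clique of the graph as a triangular face. A hypothetical sketch:

```python
import numpy as np

def find_triangles(adj):
    # List all triangles (3-cliques) of an undirected graph given as
    # a boolean adjacency matrix, as sorted node triples.
    n = adj.shape[0]
    triangles = []
    for i in range(n):
        for j in range(i + 1, n):
            if not adj[i, j]:
                continue
            for k in range(j + 1, n):
                if adj[i, k] and adj[j, k]:
                    triangles.append((i, j, k))
    return triangles

# Paw graph: triangle 0-1-2 plus a pendant edge 2-3.
A = np.zeros((4, 4), dtype=bool)
for u, v in [(0, 1), (1, 2), (0, 2), (2, 3)]:
    A[u, v] = A[v, u] = True

triangles = find_triangles(A)  # -> [(0, 1, 2)]
```

This cubic-time enumeration is fine for small graphs; for large ones a neighbor-intersection approach would be the usual optimization.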

1 year ago 0 0 1 0

First, a bit of theory. Mathematically, the kernels are defined on the edge set of a simplicial 2-complex, i.e. a graph together with a set of triangular faces formed by some of its edges.

1 year ago 0 0 1 0

Good news! GeometricKernels now supports the new Hodge-compositional kernels for flow-type data on graphs (thanks, Maosheng Yang).

Example notebook: geometric-kernels.github.io/GeometricKer....
For the theory behind it, see arxiv.org/abs/2310.19450.

Some details below.

1 year ago 7 1 1 0
Preview
Uncertainty in multivariate, non-Euclidean, and functional spaces: theory and practice - Isaac Newton Institute Contemporary sciences abound in various complex data types (beyond the classical vector description) including graphs, rankings, manifolds, time series, sets,...

Link: newton.ac.uk/event/rclw01/

1 year ago 0 0 0 0

Check out the workshop "Uncertainty in multivariate, non-Euclidean, and functional spaces: theory and practice"

Where: Isaac Newton Institute, Cambridge
When: 6-9 May 2025

Application deadline: 12 Jan 2025. Link in reply
(contributed talks/posters are welcome)

I will be there!

1 year ago 10 3 1 1

The NeurIPS Workshop on Bayesian Decision-making and Uncertainty has started - our first talk is by @mvdw.bsky.social!

Join us at East Meeting Room 8, 15, or online!

1 year ago 41 6 1 0

We are organising the First International Conference on Probabilistic Numerics (ProbNum 2025) at EURECOM in southern France in Sep 2025. Topics: AI, ML, Stat, Sim, and Numerics. Reposts very much appreciated!

probnum25.github.io

1 year ago 46 24 3 6