
Posts by lebellig


While it is intuitively clear that straighter trajectories should reduce discretization error when integrating an ODE (for instance, in flow matching), I could not find a precise bound. I therefore rewrote the proof of the Cauchy-Lipschitz theorem to make this bound explicit. github.com/gpeyre/Discr...
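A hedged numerical sketch of the claim (my own toy example, not from the repo): with forward Euler, a trajectory with constant velocity is integrated exactly, while the error on a curved trajectory x'(t) = cos(ωt) grows with the curvature parameter ω.

```python
import math

def euler(v, x0, n_steps, T=1.0):
    # Forward Euler for dx/dt = v(x, t) on [0, T].
    x, dt = x0, T / n_steps
    for k in range(n_steps):
        x = x + dt * v(x, k * dt)
    return x

def error(omega, n_steps=100):
    # Trajectory with x'(t) = cos(omega * t); exact endpoint x(1) = sin(omega)/omega.
    approx = euler(lambda x, t: math.cos(omega * t), 0.0, n_steps)
    return abs(approx - math.sin(omega) / omega)

# A nearly straight trajectory (small omega) incurs far less
# discretization error than a strongly curved one (large omega).
print(error(0.1), error(10.0))
```

With 100 steps, the error at ω = 10 is several orders of magnitude larger than at ω = 0.1, consistent with the curvature-dependent bound the note derives.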

4 days ago

I wrote a short mathematical companion tutorial to my notebook on discrete diffusion models. It gives an informal derivation of the connection between maximum likelihood estimation of the backward transition kernel and denoising score matching. github.com/gpeyre/Discr...
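For readers who want the flavor of the argument, here is a hedged sketch of the continuous-state analogue (standard Gaussian diffusion, my notation — the tutorial treats the discrete case):

```latex
% Maximizing the likelihood of the backward kernel p_\theta(x_{t-1}\mid x_t)
% bounds the data log-likelihood via the ELBO, whose per-step terms are
\mathcal{L}_t = \mathbb{E}_q\,
  \mathrm{KL}\!\left( q(x_{t-1}\mid x_t, x_0)\,\|\,p_\theta(x_{t-1}\mid x_t) \right).
% For Gaussian kernels, this KL reduces (up to constants and weights) to a
% denoising score matching objective:
\mathcal{L}_t \propto \mathbb{E}_{x_0,\,x_t}
  \left\| s_\theta(x_t, t) - \nabla_{x_t}\log q(x_t \mid x_0) \right\|^2 .
```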

1 week ago

What if AI could invent enzymes that nature hasn’t seen? 👩‍🔬🧑‍🔬

Introducing 🪩 DISCO: Diffusion for Sequence-structure CO-design

📝 Blog: disco-design.github.io
📄 Paper: arxiv.org/abs/2604.05181
💻 Code: github.com/DISCO-design...

1 week ago
DISCO — Teaching AI to Invent Enzymes Nature Never Imagined DISCO is a multimodal generative model that co-designs protein sequence and 3D structure to create entirely new enzymes for reactions never seen in biology.

Two research blog posts that look really interesting 👀

"Teaching AI to Invent Enzymes Nature Never Imagined", DISCO: Diffusion for Sequence-structure CO-design disco-design.github.io

"How to Generate Text in One Step", Flow Map Language Models one-step-lm.github.io/blog/

1 week ago
PoM: A Linear-Time Replacement for Attention with the Polynomial Mixer This paper introduces the Polynomial Mixer (PoM), a novel token mixing mechanism with linear complexity that serves as a drop-in replacement for self-attention. PoM aggregates input tokens into a comp...

🚨 arxiv.org/abs/2604.06129

PoM: A Linear-Time Replacement for Attention with the Polynomial Mixer

This paper is the result of a lab-wide hackathon on an idea I'd had for some time. It's probably the paper with the most co-authors I've ever had.

It's in CVPR Findings 2026.

Thread 🧵👇

1 week ago
Muon Dynamics as a Spectral Wasserstein Flow Gradient normalization is central in deep-learning optimization because it stabilizes training and reduces sensitivity to scale. For deep architectures, parameters are naturally grouped into matrices ...

For those interested in normalized gradient methods and optimal transport: I introduce a new class of "spectral" Wasserstein distances for which spectrally normalized gradient descent (Muon, but without momentum and with a small step size) is a spectral-W gradient flow: arxiv.org/abs/2604.04891
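A minimal sketch of the momentum-free update discussed above (my own NumPy illustration, not the paper's code): replace the gradient by its nearest orthogonal factor U Vᵀ from the SVD, so every singular value of the update direction equals one.

```python
import numpy as np

def spectral_gd_step(W, grad, lr=0.05):
    # Spectrally normalized (Muon-style, momentum-free) step:
    # orthogonalize the gradient via its SVD, G = U S V^T -> U V^T,
    # then descend along that direction.
    U, _, Vt = np.linalg.svd(grad, full_matrices=False)
    return W - lr * (U @ Vt)

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))
G = rng.standard_normal((4, 3))
W_new = spectral_gd_step(W, G)

# The update direction has all singular values equal to 1.
print(np.linalg.svd((W - W_new) / 0.05, compute_uv=False))
```

The normalization makes the step size scale-free in the spectral norm, which is the property the paper connects to "spectral" Wasserstein gradient flows.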

1 week ago

We're excited to provide more information about our upcoming annual workshop - the 2026 Nordic Workshop on AI for Climate (climateainordics.com/events/2026-...) - to be held on June 26th, 2026 at the University of Copenhagen!

Registration link coming very soon!

3 weeks ago
Jacobi Fields in Machine Learning — Olga Zaghen An intuitive introduction to Jacobi fields and their applications in machine learning on Riemannian manifolds.

🔮 Working on ML on curved manifolds? Don't miss out on Jacobi Fields! 🔮

I wrote a quick, highly visual and hopefully accessible introduction to the topic: "Jacobi Fields in Machine Learning" 🤠 Check it out here: olgatticus.github.io/blog/jacobi-...!

4 weeks ago

Today NeurIPS is announcing our official satellite event in Paris.

After responding to the call from ELLIS following the success of EurIPS in December, we are pleased to reach a new milestone by joining forces with the NeurIPS organizing committee for the 2026 edition.

4 weeks ago

Self-Supervised Flow Matching for Scalable Multi-Modal Synthesis by Hila Chefer et al. (arxiv.org/abs/2603.06507).
New SSL loss for flow matching that encourages meaningful representation learning without relying on an external visual encoder for alignment. Improves generation on many modalities.

1 month ago

📢 We’re launching Proteina-Complexa — and after the Jensen keynote mention, we definitely had to post this thread now ;)
Atomistic binder design with generative pretraining + test-time compute, plus large-scale wet-lab validation.
Project page: research.nvidia.com/labs/genair/...
🧵 1/n

1 month ago
Sander Dieleman - Diffusion models for image and video generation | ML in PL 2025 (YouTube video by ML in PL)

In October, I gave a talk at ML in PL in Warsaw: a whirlwind tour of what goes into training image and video generation models at scale.

📺 video: www.youtube.com/watch?v=qFIT...
🖼️ slides: docs.google.com/presentation...

1 month ago
RESEARCH ENGINEER: DEEP GENERATIVE MODELS FOR SATELLITE IMAGERY - 12-MONTH FIXED-TERM CONTRACT Job family: Research and Teaching Contract type: fixed-term, tied to a funding agreement Category: A Location: Champs-sur-Marne Ref. 20260313-1590

📢 I'm hiring: an engineer or a postdoc (12 months)

➡️ www.ign.fr/nous-rejoind...

Come train large generative models for the common good:
🗺️ open data (aerial/satellite imagery)
🏞️ applications to climate-change monitoring and natural-disaster management

#lastig #ign

1 month ago

The Spacetime of diffusion models: an information geometry perspective by Rafał Karczewski et al. (arxiv.org/abs/2505.17517)
blog: rafalkarczewski.github.io/blog/2026/di...

Geodesics in the (x_t, t) spacetime of diffusion models -> a new distance between clean data points + transition path sampling!

1 month ago

I have added a new tutorial on discrete diffusion models:
github.com/gpeyre/ot4ml

1 month ago

I had a draft about how wild the pace of new generative models was… written two months ago. It’s already outdated. Somehow, things are moving even faster now... (and yes I’m back to posting about generative models)

2 months ago
The unification of representation learning and generative modelling A deep dive into the convergence of discriminative and generative AI, covering 4 phases of evolution from REPA to RAE and beyond.

Too many REPA / RAE / representation alignment papers lately?
I was lost too, so I wrote a blog post that organizes the space into phases and zooms in on what actually matters for general/molecular ML.
Curious what folks think - link below!

🔗 Blog: kdidi.netlify.app/blog/ml/2025...

2 months ago

My first impression is that it will look like GANs for inverse problems, but maybe there is something to be done with the training drift term.

2 months ago
Generative Modeling via Drifting Generative modeling can be formulated as learning a mapping f such that its pushforward distribution matches the data distribution. The pushforward behavior can be carried out iteratively at inference...

imo the original paper (arxiv.org/abs/2602.04770) is well written, but there are already some implementations/blog posts about it (github.com/Algomancer/M...)

2 months ago

🔳 Discrete drifting models
🔳 Riemannian drifting models
🔳 Optimal Transport drifting models
🔳 Image2image drifting models
🔳 Time-dependent drifting models (tricky one)
🔳 Adversarial drifting models
🔳 Wasserstein drifting models
🔳 Variational drifting models
🔳 Functional drifting models

2 months ago

Very cool PhD project on generative models for dense detection of rare events in Earth Observation 🌍🌱

Nicolas has been my supervisor for the last 3 years; I highly recommend doing a PhD with him!

2 months ago
26-252 Dense Detection of Rare Events in Remote Sensing Using Generative Models Job offer 26-252 Dense Detection of Rare Events in Remote Sensing Using Generative Models at CNES, 75003 Paris!

📢 Fully funded PhD - 🌍 Dense Detection of Rare Events in Remote Sensing using Generative Models

Leverage generative models, unsupervised segmentation and explainability techniques to map disasters

w/ @javi-castillo.bsky.social and Flora Weissgerber

Apply ⤵️
recrutement.cnes.fr/fr/annonce/4...

2 months ago

Is it a VS Code plugin?

2 months ago

Meta Flow Maps enable scalable reward alignment, Peter Potaptchik et al. (arxiv.org/abs/2601.14430)

This article introduces Meta Flow Maps: a stochastic generalization of consistency models (one-step generation) that allows efficient reward steering at inference time or during fine-tuning.

2 months ago

I'm excited to open the new year by sharing a new perspective paper.

I give an informal outline of molecular dynamics (MD) and how it can interact with generative AI. Then I discuss how far the field has come since seminal contributions such as Boltzmann Generators, and what is still missing.

3 months ago

Should we ban Brian Eno from Bandcamp?

3 months ago
A Cambridge PhD thesis in three research questions Geometric Deep Learning for Molecular Modelling and Design: A personal scientific journey

New blog 💙: I reflect on why I worked on what I worked on...

I think a PhD is a very special time. You get to challenge yourself, push your boundaries, and grow. My thoughts go against the current AI/academia narrative online, so I hope you find it interesting.

chaitjo.substack.com/p/phd-thesis...

3 months ago

Yes, estimating the distance between distributions from a single sample sounds ill-posed. I wonder if flow-based artefacts are sufficiently similar across models with the same FID, which would allow us to learn a score-predictive model. I may try later!

3 months ago

Agree! I wonder if some generation artefacts are signatures that allow predicting the FID score (assuming they are present in almost all images generated by a given model).

3 months ago

You may add the real test (or training 👀) dataset if you are into leaderboard chasing.

3 months ago