Posts by Luca Scimeca

We explore how to train conditional generative models to sample molecular conformations from their Boltzmann distribution — using only a reward signal.
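The core idea — training a sampler so that its terminal distribution is proportional to a Boltzmann reward R(φ) = exp(−E(φ)/kT), using only that reward and no samples from the target — can be illustrated with a heavily simplified sketch: a one-step GFlowNet over a single discretized torsion angle with a made-up three-well energy, trained on the trajectory-balance loss. All names and the energy function here are illustrative, not the paper's code.

```python
import numpy as np

# Toy torsional energy over a discretized dihedral angle: three symmetric
# wells. Boltzmann reward R(phi) = exp(-E(phi) / kT). (Hypothetical
# stand-in for a real molecular force field.)
n_bins = 36
phi = np.linspace(0.0, 2.0 * np.pi, n_bins, endpoint=False)
energy = 1.0 - np.cos(3.0 * phi)
kT = 0.5
log_R = -energy / kT

# One-step GFlowNet: a softmax policy over terminal bins, trained by
# gradient descent on the trajectory-balance (TB) loss
#   L = mean_i (log Z + log P(i) - log R(i))^2,
# averaged over all bins (full-batch, to keep this toy deterministic).
theta = np.zeros(n_bins)   # policy logits
log_Z = 0.0                # learned log partition function
lr = 0.5
for _ in range(2000):
    p = np.exp(theta - theta.max()); p /= p.sum()
    delta = log_Z + np.log(p) - log_R
    S = delta.sum()
    theta -= lr * (2.0 / n_bins) * (delta - p * S)   # dL/dtheta
    log_Z -= lr * (2.0 / n_bins) * S                 # dL/dlogZ

p = np.exp(theta - theta.max()); p /= p.sum()
target = np.exp(log_R - log_R.max()); target /= target.sum()
# At convergence the policy matches the Boltzmann weights and
# log_Z approximates log sum_i R(i).
print(np.abs(p - target).max())
```

Only the reward enters the loss: no Boltzmann samples are ever drawn, which is exactly what makes this training signal usable when the target distribution is intractable.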

9 months ago

📌 GenBio Workshop

Torsional-GFN: A Conditional Conformation Generator for Small Molecules

👥 Authors

Lena Néhale Ezzine*, Alexandra Volokhova*, Piotr Gaiński, Luca Scimeca, Emmanuel Bengio, Prudencio Tossou, Yoshua Bengio, and Alex Hernández-García

(* equal contribution)

9 months ago

Read the paper here:

arxiv.org/pdf/2502.06999

9 months ago

• Works out-of-the-box with large priors like StyleGAN3, NVAE, Stable Diffusion 3, and FoldFlow 2.
• Unifies constrained generation, RL-with-human-feedback, and protein design in a single framework.
• Outperforms both amortized data-space samplers and traditional MCMC across tasks.

9 months ago

• We show how to turn any pretrained generator (GAN, VAE, flow) into a conditional sampler by training a diffusion model directly in noise space.
• The diffusion sampler is trained with RL.
• Noise-space posteriors are smoother, giving faster, more stable inference.
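For intuition, here is a heavily simplified sketch of posterior inference in the noise space of a pretrained generator: the generator is reduced to a fixed linear map, and plain unadjusted Langevin dynamics stands in for the learned diffusion sampler. All names, dimensions, and the linear-Gaussian setup are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "pretrained generator": a fixed linear map from noise z to data x.
d_z, d_x = 4, 6
A = rng.standard_normal((d_x, d_z))
sigma = 0.5                   # observation noise std
y = rng.standard_normal(d_x)  # observation to condition on

def grad_log_posterior(z):
    # log p(z | y) = log N(z; 0, I) + log N(y; A z, sigma^2 I) + const
    return -z + A.T @ (y - A @ z) / sigma**2

# Unadjusted Langevin dynamics in noise space (a crude stand-in for the
# learned diffusion sampler); samples z ~ p(z | y), then x = A z decodes.
eta, n_steps = 1e-3, 30000
z = rng.standard_normal(d_z)
samples = []
for t in range(n_steps):
    z = z + eta * grad_log_posterior(z) \
          + np.sqrt(2.0 * eta) * rng.standard_normal(d_z)
    if t > n_steps // 2:      # discard burn-in
        samples.append(z)
samples = np.array(samples)

# Analytic posterior mean for this linear-Gaussian toy, as a sanity check.
Sigma = np.linalg.inv(np.eye(d_z) + A.T @ A / sigma**2)
mu = Sigma @ A.T @ y / sigma**2
print(np.abs(samples.mean(axis=0) - mu).max())  # small Monte Carlo error
```

The point of the noise-space view: the posterior over z stays close to the smooth standard-normal prior, so a sampler (here Langevin; in the paper, a trained diffusion model) mixes more easily there than in the often multimodal data space.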

9 months ago

👥 Where you’ll find our work:

📌 Main Track

Outsourced Diffusion Sampling: Efficient Posterior Inference in Latent Spaces of Generative Models

👥 Authors
Siddarth Venkatraman, Mohsin Hasan, Minsu Kim, Luca Scimeca, Marcin Sendera, Yoshua Bengio, Glen Berseth, Nikolay Malkin

9 months ago

I’m attending ICML in Vancouver this week!

It’s already been great to connect, chat, and hear about the amazing work happening across the community.

If you’re attending and would like to meet up, feel free to reach out!

(More details below)

#ICML2025 #MachineLearning #AI #DiffusionModels #GenAI

9 months ago

🔹 Outsourced Diffusion Sampling: Efficient Posterior Inference in Latent Spaces of Generative Models.
📝 Authors: Siddarth Venkatraman, Mohsin Hasan, Minsu Kim, Luca Scimeca, …, Yoshua Bengio, Nikolay Malkin
paper: arxiv.org/pdf/2502.06999
📍 To be presented at FPI-ICLR2025 & ICLR 2025 DeLTa Workshops

11 months ago

🔹 Solving Bayesian Inverse Problems with Diffusion Priors and Off-Policy RL.
📝 Authors: Luca Scimeca, Siddarth Venkatraman, Moksh Jain, Minsu Kim, Marcin Sendera, Mohsin Hasan, …, Yoshua Bengio, Glen Berseth, Nikolay Malkin
📍 To be presented at ICLR 2025 DeLTa Workshop

11 months ago

🔹 Mitigating Shortcut Learning with Diffusion Counterfactuals and Diverse Ensembles.
📝 Authors: Luca Scimeca, Alexander Rubinstein, Damien Teney, Seong Joon Oh, Yoshua Bengio
paper: arxiv.org/pdf/2311.16176
📍 To be presented at SCSL @ ICLR 2025 Workshop

11 months ago

🔹 Shaping Inductive Bias in Diffusion Models through Frequency-Based Noise Control.
📝 Authors: Thomas Jiralerspong, Berton Earnshaw, Jason Hartford, Yoshua Bengio, Luca Scimeca
paper: arxiv.org/pdf/2502.10236
📍 To be presented at FPI-ICLR2025 & ICLR 2025 DeLTa Workshops

11 months ago

Thrilled to share that we will be presenting 4 papers across 3 workshops at #ICLR2025 in Singapore this week!

If you're attending, let’s connect! Feel free to DM me for more details about the work or potential collaborations.
See you at the venue! 🇸🇬

(More info to follow)

@mila-quebec.bsky.social

11 months ago

Thanks to Alex for his great efforts and work ethic, and to @damienteney.bsky.social and @lucascimeca.bsky.social for their continued help with this paper. We’ll humbly address the criticisms to improve it further for future opportunities.

1 year ago

Come check out our NeurIPS poster today! We will be at West Ballroom #7101 from 4:30 pm to 7:30 pm.

Website: github.com/gfnorg/diffu...

1 year ago

If you're attending, come check out our posters or feel free to reach out to connect during the conference!

Looking forward to insightful conversations and connecting with everyone. See you all at NeurIPS!

#NeurIPS2024 #NIPS24 #MachineLearning #DiffusionModels #Research #AI

1 year ago

Amortizing Intractable Inference in Diffusion Models for Bayesian Inverse Problems. Venkatraman, S., Jain, M., Scimeca, L., Kim, M., Sendera, M., …, Bengio, Y., and Malkin, N.

1 year ago

On Diffusion Models for Amortized Inference: Benchmarking and Improving Stochastic Control and Sampling. Sendera, M., Kim, M., Mittal, S., Lemos, P., Scimeca, L., Rector-Brooks, J., Adam, A., Bengio, Y., and Malkin, N.
arxiv.org/abs/2402.05098

1 year ago



Amortizing Intractable Inference in Diffusion Models for Vision, Language, and Control. Venkatraman, S., Jain, M., Scimeca, L., Kim, M., Sendera, M., …, Bengio, Y., and Malkin, N.
arxiv.org/abs/2405.20971

1 year ago

Excited to share that we will be presenting three papers at #NeurIPS2024 this week in Vancouver, pushing forward our work on Diffusion Models!

1 year ago

Hi, can I be added to the pack? :)

1 year ago