🤹 New blog post!
I write about our recent work using hierarchical trees to enable sparse attention over irregular data (point clouds, meshes): the Erwin Transformer, accepted to ICML 2025
blog: maxxxzdn.github.io/blog/erwin/
paper: arxiv.org/abs/2502.17019
Compressed version in the thread below:
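The core trick, in a toy sketch (all names, shapes, and choices here are my own illustration, not the paper's actual code or API): build a balanced tree over the point cloud by recursive median splits, then run standard attention only within fixed-size leaf groups, bringing the cost from quadratic down to roughly linear in the number of points.

```python
import numpy as np

def build_groups(points, group_size=8):
    """Recursively median-split along the longest axis until every
    group has at most `group_size` points (a crude balanced tree)."""
    def split(ids):
        if len(ids) <= group_size:
            return [ids]
        spans = points[ids].max(0) - points[ids].min(0)
        axis = int(np.argmax(spans))                  # longest extent
        order = ids[np.argsort(points[ids, axis])]    # sort along it
        mid = len(order) // 2
        return split(order[:mid]) + split(order[mid:])
    return split(np.arange(len(points)))

def local_attention(x, groups):
    """Softmax attention restricted to each group: O(sum |g|^2)
    instead of O(N^2). Identity projections, i.e. untrained."""
    out = np.zeros_like(x)
    for g in groups:
        q = k = v = x[g]
        scores = q @ k.T / np.sqrt(x.shape[1])
        w = np.exp(scores - scores.max(1, keepdims=True))
        w /= w.sum(1, keepdims=True)                  # row-wise softmax
        out[g] = w @ v
    return out

rng = np.random.default_rng(0)
pts = rng.normal(size=(64, 3))     # a toy point cloud
feats = rng.normal(size=(64, 8))   # per-point features
groups = build_groups(pts, group_size=8)
y = local_attention(feats, groups)
```

The actual model is richer (ball trees, coarsening across tree levels for long-range interaction), but this is the sparsity pattern in miniature.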
Posts by AMLab
Don't miss this exciting opportunity! 🔥
🚨 Application deadline: 15 June
One more week to apply to this exciting position... and another position on #CausalRepresentationLearning and #ReinforcementLearning for learning provably correct #concepts from raw data opening up soon!
New PhD position at the University of Amsterdam in @amlab.bsky.social on learning concepts with theoretical guarantees using #causality and #RL with me, Frans Oliehoek (TU Delft) and Herke van Hoof 🔥
Deadline: 15 June
werkenbij.uva.nl/en/vacancies...
Controlled Generation with Equivariant Variational Flow Matching
By @eijkelboomfloor.bsky.social, @zmheiko.bsky.social,
@sharvaree.bsky.social, @erikjbekkers.bsky.social,
@wellingmax.bsky.social, @canaesseth.bsky.social*, @jwvdm.bsky.social*
As above - paper TBA soon
🧵 8/8
Exponential Family Variational Flow Matching for Tabular Data Generation
By Andrés Guzmán-Cordero*, @eijkelboomfloor.bsky.social*,
@jwvdm.bsky.social
This paper will be shared soon - keep your eyes open! 🤩
🧵 7/8
⚠️ The Perils of Optimizing Learned Reward Functions: Low Training Error Does Not Guarantee Low Regret
By Lukas Fluri*, @leon-lang.bsky.social*, Alessandro Abate, Patrick Forré, David Krueger, Joar Skalse
arxiv.org/abs/2406.15753
🧵 6/8
SDE Matching: Scalable and Simulation-Free Training of Latent Stochastic Differential Equations
By @gbarto.bsky.social, Dmitry Vetrov, @canaesseth.bsky.social
arxiv.org/abs/2502.02472
🧵 5/8
On the Importance of Embedding Norms in Self-Supervised Learning
By Andrew Draganov, @sharvaree.bsky.social, Sebastian Damrich, Jan Niklas Böhm, Lucas Maes, Dmitry Kobak, @erikjbekkers.bsky.social
arxiv.org/abs/2502.09252
🧵 4/8
A machine learning approach to duality in statistical physics
By Prateek Gupta, Andrea Ferrari, @nabiliqbal.bsky.social
arxiv.org/abs/2411.04838
🧵 3/8
Erwin: A Tree-based Hierarchical Transformer for Large-scale Physical Systems
By @maxxxzdn.bsky.social , @jwvdm.bsky.social , @wellingmax.bsky.social
arxiv.org/abs/2502.17019
🧵 2/8
Exciting news: AMLab is happy to have 7 papers accepted at #ICML2025!
See the thread below for the full list, and meet us in Vancouver to discuss them further! 🇨🇦
🧵 1/8
🔥 3 Best Paper Awards for AMLab members!
1. Towards Variational Flow Matching on General Geometries by @olgatticus.bsky.social et al.
2. Generative Uncertainty in Diffusion Models by @metodjazbec.bsky.social et al.
3. SDE Matching by @gbarto.bsky.social et al.
Congrats to everyone! 🔥
Test of Time Winner
Adam: A Method for Stochastic Optimization
Diederik P. Kingma, Jimmy Ba
Adam revolutionized neural network training, enabling significantly faster convergence and more stable training across a wide variety of architectures and tasks.
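For readers who haven't seen it spelled out: the update being celebrated keeps exponential moving averages of the gradient and its square, corrects their initialization bias, and scales the step per parameter. A minimal single-parameter sketch with the paper's default hyperparameters (the helper name is mine):

```python
import math

def adam_step(param, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a scalar parameter (Kingma & Ba, 2015)."""
    m = beta1 * m + (1 - beta1) * grad       # first-moment EMA
    v = beta2 * v + (1 - beta2) * grad ** 2  # second-moment EMA
    m_hat = m / (1 - beta1 ** t)             # bias correction
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (math.sqrt(v_hat) + eps)
    return param, m, v

# Toy usage: minimize f(x) = x^2 from x = 5.0
x, m, v = 5.0, 0.0, 0.0
for t in range(1, 2001):
    x, m, v = adam_step(x, 2 * x, m, v, t, lr=0.05)  # grad of x^2 is 2x
```

At t = 1 the bias correction makes the very first step close to the raw learning rate, which is part of why Adam needs so little tuning.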
🤹 Excited to share Erwin: A Tree-based Hierarchical Transformer for Large-scale Physical Systems
joint work with @wellingmax.bsky.social and @jwvdm.bsky.social
preprint: arxiv.org/abs/2502.17019
code: github.com/maxxxzdn/erwin
A few weeks ago, I presented SNAP at the wonderful #Bellairs Workshop on Causality in Barbados 🐢
This Friday 🫰 meets 🤌, as I will get to present SNAP again at the kick-off of the newest season of @causalclub.di.unipi.it! Check out this, and their other amazing upcoming talks, at causalclub.di.unipi.it
Variational Flow Matching goes Riemannian!
In this preliminary work, we derive a variational objective for probability flows on manifolds with closed-form geodesics, and discuss some interesting results.
Dream team: Floor, Alison & Erik (their @s below) 🔥
arxiv.org/abs/2502.12981
🧵 1/5
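For context (my notation, not the paper's): on a manifold with closed-form geodesics, flow-matching-style objectives regress a vector field onto the time derivative of the geodesic interpolant between a source point x_0 and a data point x_1, e.g.

```latex
x_t = \exp_{x_0}\!\big(t \, \log_{x_0}(x_1)\big),
\qquad
\mathcal{L}(\theta) = \mathbb{E}_{t,\,x_0,\,x_1}
\Big[ \big\| v_\theta(x_t, t) - \tfrac{\mathrm{d}}{\mathrm{d}t} x_t \big\|_{g(x_t)}^{2} \Big]
```

where exp and log are the Riemannian exponential and logarithm maps and g is the metric; the paper's contribution is to replace this regression with a variational objective in the same setting.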
Congratulations to "my" first PhD student, Dr. @rmassidda.it, who today defended his thesis on "Methodological Advancements for Causal Abstraction Learning" at the University of Pisa *with honours*!
Riccardo is an amazing @ellis.eu PhD student, co-supervised with Davide Bacciu
@ellisamsterdam.bsky.social
Facts
The Christmas spirit has arrived at AMLab! 🎄✨
Yesterday we kicked off the holidays with a festive group dinner and a fun Secret Santa exchange.
Wishing everyone a restful and joyful winter break and a happy new year! ❄️
looks can be deceiving! ;)
Professor Imbens also had a mentoring session with our PhD students actively working on causality, discussing their ideas and the potential impact of their applications! 👨‍🔬👩‍🔬
@matyasch.bsky.social @roelhulsman.bsky.social @rmassidda.it @danruxu.bsky.social 🔥
Yesterday, we had the honor of hosting the 2021 Nobel Laureate in Economics, Guido Imbens, at our Lab! 🤩
We had the chance to attend his inspiring talk on Experimental Design in Marketplaces 👩‍💻
A big thank you to @smaglia.bsky.social for making this event possible 🔥
🧑‍🏫 Neural Flow Diffusion Models at #NeurIPS2024 tomorrow! Discover how to build learnable noising processes for straight-line generative trajectories, end-to-end and without simulations! 🤯
West Ballroom A-D #6809
⏰ Fri 13 Dec, 4:30 pm – 7:30 pm
https://neurips.cc/virtual/2024/poster/94656
Here are the dapper @metodjazbec.bsky.social and Alexander Timans, presenting their work on early-exit with risk control.
Come see @eijkelboomfloor.bsky.social and @gbarto.bsky.social present their work on variational flow matching now in West Ballroom A-D #7103!
François Cornet and @gbarto are presenting their work on Equivariant Neural Diffusion now at East Exhibition #2403!
Come talk to @zmheiko.bsky.social about his work on sample-efficient black box variational inference! (East Exhibition Hall #4105)