Posts by Marta Skreta
So grateful for this incredible collaboration with @jarridrb.bsky.social, @tlambert99.bsky.social, @daro9000.bsky.social, Yueming Long, Zi-Qi Li, Xi Zhang, @mirunacretu.bsky.social, @francescazfl.bsky.social, Tanvi Ganapathy, Emily Jin, @joeybose.bsky.social, Jason Yang, ..., @k-neklyudov.bsky.social, @yoshuabengio.bsky.social, @alextong.bsky.social, @francesarnold.bsky.social, and Cheng-Hao Liu at @caltech.edu and @mila-quebec.bsky.social 🤗
Not only are the enzymes functional, they're also evolvable starting points. One round of mutagenesis on dCT-H11 yielded a 4x activity increase for spirocyclopropanation and even inverted stereoselectivity. The chemistry nature never explored is now within reach! 🌍
Take the top design, dCT-H11. Its closest structural match (PDB 3CRJ) is a non-enzymatic transcription factor from a Dead Sea extremophile. DISCO completely repurposed this fold for carbene chemistry with a novel active-site geometry and very low sequence identity. 🦠🔬
Perhaps the most interesting property? These active sites don't exist in nature. When searched against 200M+ structures in the AlphaFold Database, the majority of DISCO's generated binding motifs have no close natural homologs.
The success continued with B–H insertion. A single DISCO design achieved 5,170 TTN, outperforming three rounds of laboratory directed evolution by over 2x. 🤯
It mastered selective C(sp³)–H insertion, one of the most challenging transformations in organic chemistry. A single computational step reached 2,360 TTN, exceeding the endpoint of a previous 14-round directed evolution campaign. 🎯
The ultimate test is the wet lab. 🧪 DISCO was challenged to design enzymes for carbene-transfer reactions—chemistry alien to the natural world.
Because DISCO generates sequence & structure together, it unlocks multimodal inference-time steering. 🧭 Deriving multimodal Feynman-Kac Correctors, DISCO steers generation on the fly—like forcing the creation of dense disulfide bonds (FKC-MM) or binding a target while avoiding a decoy (FKC-SG). 🎯🚫
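The Feynman–Kac-style steering mentioned here can be illustrated with a generic particle-reweighting toy. This is not DISCO's actual multimodal corrector: `base_score`, `reward`, `beta`, and the Langevin loop below are all illustrative stand-ins for the core idea of periodically reweighting and resampling particles toward a steering potential during generation.

```python
import numpy as np

def base_score(x):
    return -x  # score of the base model: here a standard Gaussian (illustrative)

def reward(x):
    return x  # steering signal: prefer larger x (illustrative)

def steered_sample(n=4000, steps=200, eps=0.05, beta=0.3,
                   resample_every=20, seed=0):
    """Langevin sampling from the base model, with periodic
    Feynman-Kac-style reweighting/resampling toward high reward."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n)
    for t in range(1, steps + 1):
        # one Euler-Maruyama Langevin step under the base score
        x = x + eps * base_score(x) + np.sqrt(2 * eps) * rng.normal(size=n)
        if t % resample_every == 0:
            # exponential tilting by the reward, then importance resampling
            w = np.exp(beta * reward(x))
            w /= w.sum()
            x = x[rng.choice(n, size=n, p=w)]
    return x

samples = steered_sample()
```

With `beta=0` the resampling is uniform and the particles stay near the base distribution; with `beta>0` the population is progressively tilted toward high-reward regions without ever retraining the base model.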
How does it work? DISCO aligns sequence & structure bidirectionally via cross-modal recycling, self-correction, and noisy guidance. It also introduces an entropy-adaptive sequence temperature to properly balance information across modalities during generation! ⚖️
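One plausible reading of an entropy-adaptive sequence temperature can be sketched as follows. The actual schedule used by DISCO is not reproduced here; the linear interpolation by normalized entropy, and the names `adaptive_temperature`, `t_min`, `t_max`, are assumptions made for illustration only.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    z = logits / temperature
    z = z - z.max()          # numerical stability
    p = np.exp(z)
    return p / p.sum()

def normalized_entropy(p):
    """Shannon entropy scaled to [0, 1] by its maximum, log(K)."""
    return float(-np.sum(p * np.log(p + 1e-12)) / np.log(len(p)))

def adaptive_temperature(logits, t_min=0.5, t_max=1.5):
    """Hypothetical rule: interpolate the decoding temperature by the
    normalized entropy of the model's own prediction, so confident
    positions are decoded sharply and uncertain ones stay exploratory."""
    h = normalized_entropy(softmax(logits))
    return t_min + (t_max - t_min) * h

uniform = np.zeros(20)                   # maximally uncertain logits
peaked = np.array([10.0] + [0.0] * 19)   # confident logits
```

Under this toy rule, a uniform prediction is decoded at `t_max` while a near-one-hot prediction is decoded close to `t_min`.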
Introducing the 💃🕺Studio-179 benchmark 🕺 💃: DISCO outperforms baselines on 178/179 targets, along with sequence-specific DNA and RNA binders! 📈 It also shines in unconditional design, generating highly diverse, novel, and co-designable proteins.
Evolution is an amazing chemist, but the reactions it has explored represent a remarkably narrow slice of what is possible. Existing AI models require predefined theozymes & generate backbones before sequences. DISCO generates both sequence & structure simultaneously without the need for theozymes.
14 rounds of directed evolution and over a year of wet lab work. That's what it took to engineer an enzyme for selective C(sp³)–H insertion, one of the most challenging transformations in organic chemistry. DISCO surpasses this with a single plate.
What if AI could invent enzymes that nature hasn’t seen? 👩‍🔬🧑‍🔬
Introducing 🪩 DISCO: Diffusion for Sequence-structure CO-design
📝 Blog: disco-design.github.io
📄 Paper: arxiv.org/abs/2604.05181
💻 Code: github.com/DISCO-design...
📄📄📄 The AI4Mat-NeurIPS-2025 workshop is now open for submissions until August 22, 2025 (AOE)!
Consider submitting full-length papers or shorter-length findings. We also have a special track for papers on benchmarking AI for materials design.
sites.google.com/view/ai4mat/...
Singapore EXPO sign outside of the convention center where ICLR was held.
Co-organizer photo with Santiago, myself, and Marta at the end of the workshop.
Thanks to all the speakers and participants for the engaging discussions today, and for making the AI4Mat@ICLR 2025 workshop a great success! Thanks also to Santiago, @martaowesyou.bsky.social, and the rest of the co-organizing team for their effort in putting this together. Great to be a part of it!
🚀 Looking for reaction conditions that work well for multiple substrates? CurryBO can help🍛
Now out on arXiv: arxiv.org/abs/2502.18966
A short explanation thread 👇
We at @digital-discovery.bsky.social are very happy to announce a new paper type called "Commit". Inspired by version control systems such as git, the idea is that if you have an update on a short and pointed publication, you can send it as a commit. We envision commits to be co-cited with the original publication.
Excited to have #selfdrivinglaboratories listed as one of the seven technologies to watch in 2025 by @nature.com. Thanks to the #matterlab, the @accelerationc.bsky.social, and of course the global community working on SDLs! @uoft.bsky.social @vectorinst.bsky.social
www.nature.com/articles/d41...
"The Superposition of Diffusion Models using the Itô Density Estimator" (@martaowesyou.bsky.social, @lazaratan.bsky.social et al.)
It's nice to see an easy-to-compute log-likelihood estimator for SDE sampling of diffusion models (not just ODE)
📄 arxiv.org/abs/2412.17762
🐍 github.com/necludov/sup...
New paper just dropped! How do you combine pre-trained diffusion models without having to train a new one 🤓?
Turns out you can use our all-new Itô density estimator 🔥 to compute densities under a diffusion model efficiently 🚀!
Work with an absolute dream of a team: @lazaratan.bsky.social @joeybose.bsky.social @alextong.bsky.social and @k-neklyudov.bsky.social 🤗🚀⚡️
📄Paper: arxiv.org/abs/2412.17762
💻Code: github.com/necludov/sup...
🤗HuggingFace: huggingface.co/superdiff
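The density-weighted superposition idea can be seen in a 1D toy where log-densities are analytic; in the real method, the Itô estimator supplies these quantities for pretrained diffusion models, and nothing below is the paper's implementation. Combining the two models' scores with weights proportional to each model's density gives exactly the score of their mixture, so Langevin dynamics under the combined score samples from the "OR" of the two models:

```python
import numpy as np

# Two toy "pretrained models": unit Gaussians centered at -4 and +4.
MU = np.array([-4.0, 4.0])

def log_density(x, mu):
    return -0.5 * (x - mu) ** 2 - 0.5 * np.log(2 * np.pi)

def score(x, mu):
    return mu - x  # d/dx log N(x; mu, 1)

def superposed_score(x):
    """Density-weighted combination of scores: with weights w_i ∝ p_i(x),
    this equals the score of the equal mixture (p1 + p2) / 2."""
    logp = np.stack([log_density(x, m) for m in MU])
    w = np.exp(logp - logp.max(axis=0))  # stabilized density weights
    w = w / w.sum(axis=0)
    s = np.stack([score(x, m) for m in MU])
    return (w * s).sum(axis=0)

def langevin_sample(n=2000, steps=500, eps=0.05, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, size=n)
    for _ in range(steps):
        x = x + eps * superposed_score(x) + np.sqrt(2 * eps) * rng.normal(size=n)
    return x

samples = langevin_sample()
```

Neither model alone puts mass on both modes, but the superposed sampler covers both, which is the same qualitative behavior as superimposing two pretrained diffusion models at inference time.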
Super super excited to share our work SuperDiff 🦹‍♀️ for superimposing pretrained diffusion models at inference time 💪
Check out the 🧵 to see how we superimposed proteins as well as images, all thanks to a fast new density estimator. Curious to see what 🍩 & 🗺️ would produce?
exciting new workshop announcement!! join us in Singapore for Frontiers in Probabilistic Inference: Learning Meets Sampling 🌏⚡️😃 details below 👇 #ICLR2025
3/3 🧵Work with my fellow Matter Lab quantum tunnelers: Philipp Schleich, Lasse B. Kristensen, @rovargash.bsky.social, @aspuru.bsky.social
📜 https://buff.ly/4iu8w6M
💻 https://buff.ly/3D9HaD2
See you at NeurIPS!
2/3 🧵Here, we enter the quantumverse with deep equilibrium networks, where we demonstrate that a single layer of a quantum circuit can perform as well as, or better than, multiple independent layers when trained like a DEQ.
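The classical deep-equilibrium idea behind this post — one shared layer iterated to a fixed point, instead of a stack of independently parameterized layers — can be sketched as follows. The quantum version in the paper replaces this toy tanh layer with a quantum circuit; the layer form and shapes here are illustrative.

```python
import numpy as np

def layer(z, x, W, U, b):
    """One shared layer f(z, x) = tanh(W z + U x + b)."""
    return np.tanh(W @ z + U @ x + b)

def deq_forward(x, W, U, b, tol=1e-6, max_iter=500):
    """Deep-equilibrium forward pass: iterate the single shared layer to
    its fixed point z* = f(z*, x) rather than stacking distinct layers."""
    z = np.zeros(W.shape[0])
    for _ in range(max_iter):
        z_next = layer(z, x, W, U, b)
        if np.linalg.norm(z_next - z) < tol:
            return z_next
        z = z_next
    return z

rng = np.random.default_rng(0)
d = 8
# Small spectral norm keeps the layer a contraction, so iteration converges.
W = 0.3 * rng.standard_normal((d, d)) / np.sqrt(d)
U = rng.standard_normal((d, d)) / np.sqrt(d)
b = np.zeros(d)
x = rng.standard_normal(d)
z_star = deq_forward(x, W, U, b)
```

The payoff, mirrored in the quantum setting, is that one set of layer parameters yields the effect of an "infinitely deep" network, which matters when hardware coherence time limits how many parameterized layers you can physically stack.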
1/3 🧵Quantum conundrum: we want expressive circuits, but current hardware only allows short coherence times, and so more parameters = more problems. Check out our #NeurIPS2024 paper “Quantum Deep Equilibrium Models”.