[6/6] My amazing co-author Weronika Kłos will present MapPFN at the Generative AI in Genomics Workshop (Gen²) at #ICLR2026 in Rio. 🇧🇷
📅 Mon, Apr 27, 2026, 1:10–1:55 PM (UTC-3)
📍 Room 211, Riocentro Convention and Event Center
[5/6]
🌐 Project Page: marvinsxtr.github.io/MapPFN/
📄 Paper: arxiv.org/pdf/2601.21092
💻 Code: github.com/marvinsxtr/M...
Huge thanks to Weronika Kłos and Gabriel Dernbach for this collaboration, and to @tuberlin.bsky.social, @bifold.berlin, Aignostics, and Charité for their support!
[4/6] A single pre-trained MapPFN adapts to new datasets and arbitrary gene sets. Zero-shot, it recovers differentially expressed genes on par with baselines trained on real single-cell data. Fine-tuned, it consistently outperforms baselines across downstream datasets.
[3/6] MapPFN meta-learns to map pre- to post-perturbation distributions from a synthetic biological prior of in silico gene knockouts, decoupling it from limited experimental data. At inference, it adapts to unseen biological contexts via in-context learning.
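To make the in-context idea above concrete, here is a toy sketch of the *interface* only: condition on observed (pre, post) example pairs from a new dataset and map an unseen pre-perturbation profile without gradient updates. A real PFN is a transformer meta-trained on a synthetic prior; the function name, the similarity-weighting mechanism, and all shapes below are illustrative assumptions, not MapPFN's actual architecture.

```python
import numpy as np

def in_context_predict(context_pre, context_post, query_pre):
    """Toy stand-in for PFN-style in-context perturbation prediction.

    Conditions on example (pre, post) pairs from a new biological
    context and predicts the post-perturbation profile for a query
    cell, with no retraining. The attention-like weighting here is
    only a placeholder for a meta-trained transformer.
    """
    # Similarity of the query to each context pre-perturbation profile.
    sims = context_pre @ query_pre
    # Softmax weights over context examples (numerically stabilized).
    weights = np.exp(sims - sims.max())
    weights /= weights.sum()
    # Prediction: similarity-weighted blend of observed outcomes.
    return weights @ context_post

rng = np.random.default_rng(0)
context_pre = rng.normal(size=(8, 4))   # 8 example cells, 4 genes
context_post = context_pre + 1.0        # synthetic perturbation shift
query_pre = rng.normal(size=4)
pred = in_context_predict(context_pre, context_post, query_pre)
```

Swapping in a different context set changes the predictions immediately, which is the point: new interventional evidence is consumed at inference time rather than via retraining.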
[2/6] Single-cell perturbation datasets cover only a tiny slice of possible interventions and cell states. Existing methods cannot leverage new interventional evidence at inference time, forcing them to retrain for every new dataset.
[1/6] How can we build virtual cell foundation models that adapt to unseen biological contexts?
Meet MapPFN, the first prior-data fitted network (PFN) for perturbation prediction. Meta-learned from a synthetic biological prior, it adapts at inference via in-context learning. 🧵