
Posts by Yanay Rosen

Transcript-specific enrichment enables profiling of rare cell states via single-cell RNA sequencing - Nature Genetics Programmable Enrichment via RNA FlowFISH by sequencing (PERFF-seq) isolates rare cells based on RNA marker transcripts for single-cell RNA sequencing profiling of complex tissues, with applicability t...

Out today in @naturegenet.bsky.social -- PERFF-seq! With @tsionabay.bsky.social, @ronanchaligne.bsky.social, Bob Stickels, Meril Takizawa, and Ansu Satpathy, we describe this new assay for studying rare populations with programmable nucleic acid cytometry. 1/n
www.nature.com/articles/s41...

How to build the virtual cell with artificial intelligence: Priorities and opportunities Advances in AI and omics enable the creation of AI virtual cells (AIVCs)—multi-scale, multimodal neural network models that simulate molecules, cells, and tissues across diverse states. This vision outlines their design and collaborative development, promising to transform biological research through high-fidelity simulations, accelerating discoveries, and fostering interdisciplinary open science collaborations.

Very excited to see our perspective on Building the AI Virtual Cell published today in Cell! 🏗️🔮⭐️ www.cell.com/cell/fulltex...

With @bunnech.bsky.social @yusufroohani.bsky.social, Jure Leskovec, Emma Lundberg, Stephen Quake, Aviv Regev and Theofanis Karaletsos


🧬 Thrilled to share Knowledge Graph GWAS (KGWAS), the largest AI model integrating >10 million multi-modal, multi-scale functional genomics data points to improve GWAS power by 100% while discovering novel disease-critical variants, genes, cells, and networks!

1/15🧵

Does your model understand genes? A benchmark of gene properties for biological and text models The application of deep learning methods, particularly foundation models, in biological research has surged in recent years. These models can be text-based or trained on underlying biological data, es...

arxiv.org/abs/2412.04075

overview of results for PLAID!

1/🧬 Excited to share PLAID, our new approach for co-generating sequence and all-atom protein structures by sampling from the latent space of ESMFold. This requires only sequences during training, which unlocks more data and annotations:

bit.ly/plaid-proteins
🧵

Model scale vs. performance curves for ESM C models, with comparisons to ESM2 and other protein LMs. ESM C outperforms the existing state of the art at the same model parameter scale.

Introducing ESM Cambrian, a new family of protein language models, focused on creating representations of the underlying biology of proteins.
