We’re excited to announce our expanded partnership with Boehringer Ingelheim. Together, we are building the future of AI‑driven antibody discovery and optimization. www.openprotein.ai/strategic-partnership-with-boehringer-ingelheim
New on OpenProtein.AI:
→ Improved protein design GUI & refolding metrics for candidate filtering
→ New structure design models (RFdiffusion, BoltzGen)
Nanobody and miniprotein design walkthroughs now live! Links in thread.
Boltz-1 & Boltz-2 now live via GUI & APIs! Predict protein and protein–RNA/DNA/ligand complex structures with confidence scores & binding-affinity estimates for virtual screening. Compare fine-tuned models on the new overview page to find your best performer fast.
www.openprotein.ai/early-access...
Product update: Indel Analysis lets you score insertions/deletions across your sequence using PoET-2. You can now also compare multiple 3D structures in Mol* to evaluate design alternatives.
Sign up now: www.openprotein.ai/early-access...
Why does no one in AI protein engineering work on indels?
We’re solving this at OpenProtein.AI. Check out our upcoming indel design tool! 🤩 1/4
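Concretely, "working on indels" starts with enumerating the candidate variants to score. A minimal sketch (plain Python, not the OpenProtein.AI API) of the single-residue insertion and deletion variants an indel scoring tool would evaluate:

```python
# Enumerate single-residue indel variants of a protein sequence.
# Illustration only; a tool like Indel Analysis would score each
# variant with a model such as PoET-2.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def single_deletions(seq):
    """All sequences with exactly one residue removed."""
    return [seq[:i] + seq[i + 1:] for i in range(len(seq))]

def single_insertions(seq):
    """All sequences with one residue inserted at any position."""
    return [
        seq[:i] + aa + seq[i:]
        for i in range(len(seq) + 1)
        for aa in AMINO_ACIDS
    ]

seq = "MKVLA"
dels = single_deletions(seq)
ins = single_insertions(seq)
print(len(dels), len(ins))  # 5 deletions, 120 insertions for a 5-residue sequence
```

Even for a short sequence the variant space grows quickly, which is why model-based scoring (rather than exhaustive experiments) matters here.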
Product update: PoET-2 now supports structure inputs for enhanced prediction and design via Python APIs. Check out our new inverse folding tutorial to see it in action.
🔗 docs.openprotein.ai/walkthroughs...
Sign up for OpenProtein.AI: www.openprotein.ai/early-access...
This is just the beginning of what's possible with AI that truly understands the molecular machinery of life. Join us in transforming protein engineering: www.openprotein.ai/early-access...
Ready to try it yourself? PoET-2 is available now on OpenProtein.AI:
- Free academic access
- Python client & APIs
- Web interface
Want to see the technical details? Read our white paper: www.openprotein.ai/a-multimodal-foundation-model-for-controllable-protein-generation-and-representation-learning
The implications are enormous for:
- Drug discovery
- Enzyme engineering
- Protein therapeutics
- And much more
This means PoET-2 doesn't just memorize - it learns fundamental principles of how proteins work, enabling accurate zero-shot variant effect prediction and highly data-efficient property learning.
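Zero-shot variant effect prediction is commonly framed as a log-likelihood ratio: score a variant by how much more (or less) probable the model finds it than the wild type. A toy sketch of that framing - the `PREFS` table below is a stand-in for a real model's likelihoods, not anything PoET-2 actually computes:

```python
import math

# Toy stand-in for a protein language model's per-sequence log-likelihood.
# A real model would return log P(sequence | context); here we fake it
# with a position-specific preference table, for illustration only.
PREFS = {0: {"M": 0.9}, 1: {"K": 0.8, "R": 0.6}, 2: {"V": 0.7}}

def log_likelihood(seq):
    total = 0.0
    for i, aa in enumerate(seq):
        total += math.log(PREFS.get(i, {}).get(aa, 0.05))
    return total

def variant_score(wildtype, variant):
    """Zero-shot effect score: log P(variant) - log P(wildtype).
    Positive means the model prefers the variant."""
    return log_likelihood(variant) - log_likelihood(wildtype)

print(variant_score("MKV", "MRV"))  # K2R: log(0.6) - log(0.8), slightly negative
```

The appeal of the zero-shot setting is that no assay data is needed: the ratio alone ranks variants.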
How does it work? PoET-2's tiered attention mechanism processes large protein families with order equivariance and long context lengths, letting it learn from evolutionary examples at inference time.
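The key property of order equivariance: the model's output shouldn't depend on the order in which homologs appear in the prompt. A toy demonstration of the idea, using mean pooling over per-sequence encodings as a simple stand-in for PoET-2's tiered attention:

```python
import random

# If the family-level representation is built by a symmetric operation
# over homolog encodings (mean pooling here, standing in for a tiered
# attention mechanism), shuffling the homologs changes nothing.
def encode(seq):
    # Toy per-sequence encoding: amino-acid composition counts.
    return [seq.count(aa) for aa in "ACDEFGHIKLMNPQRSTVWY"]

def family_representation(homologs):
    vecs = [encode(s) for s in homologs]
    n = len(vecs)
    return [sum(col) / n for col in zip(*vecs)]

family = ["MKVLA", "MKILA", "MRVLA", "MKVLG"]
shuffled = family[:]
random.shuffle(shuffled)
assert family_representation(family) == family_representation(shuffled)
print("same representation regardless of homolog order")
```

A plain left-to-right language model over concatenated homologs would not have this property; building it in is what lets the model treat the prompt as an evolutionary *set* rather than an arbitrary ordering.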
In real-world testing, PoET-2 can:
- Design proteins with multiple simultaneous constraints
- Learn from just dozens of examples
- Make accurate predictions for challenging proteins
- Run fast inference on standard hardware
PoET-2 introduces a powerful prompt grammar for controlled protein generation - enabling everything from inverse folding to motif scaffolding in a single model.
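One way to picture a prompt grammar: a prompt is a list of typed elements, and different tasks are just different mixes of elements. The sketch below is our own illustration - the class names and fields are hypothetical, not PoET-2's actual API:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical prompt-grammar sketch: inverse folding, motif scaffolding,
# and homolog conditioning become different compositions of typed elements.

@dataclass
class PromptElement:
    kind: str                       # "sequence" | "structure" | "motif"
    payload: str
    position: Optional[int] = None  # anchor position for a fixed motif

def describe(prompt):
    """Render a prompt as a readable summary string."""
    return " + ".join(
        f"{e.kind}({e.payload[:8]}{'...' if len(e.payload) > 8 else ''})"
        for e in prompt
    )

# Inverse folding: condition on a structure only, generate a sequence.
inverse_folding = [PromptElement("structure", "backbone_coords.pdb")]

# Motif scaffolding: fix a functional motif, build a scaffold around it.
scaffolding = [
    PromptElement("motif", "HEXXH", position=41),
    PromptElement("structure", "target_fold.pdb"),
]

print(describe(inverse_folding))
print(describe(scaffolding))
```

The point of expressing tasks this way is that one model can serve them all: swapping the element mix changes the task without retraining.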
The results are remarkable:
- 500x more compute-efficient than contemporary models
- 30x less experimental data needed for protein optimization
- Improved structure understanding
- Handles insertions and deletions naturally
Key breakthrough: PoET-2's multimodal architecture learns to reason about sequences, structures, and evolutionary relationships simultaneously through in-context learning.
Most protein language models rely on massive scale - up to 100B parameters - to memorize sequences from nature. PoET-2 takes a fundamentally different approach, learning the grammar of protein evolution.
🧬 Announcing PoET-2: A breakthrough protein language model that achieves trillion-parameter performance with just 182M parameters, transforming our ability to understand proteins.