ModernBERT Pretrained for Patent Text Outperforms Baselines
Researchers released three ModernBERT models (base‑PT, base‑VX, and Mosaic‑BERT‑large) pretrained on 60 million patent records, achieving up to three times faster inference than PatentBERT. Read more: getnews.me/modernbert-pretrained-fo... #modernbert #patentnlp