So you want to skip our thinning proofs—but you’d still like our out-of-the-box attention speedups? I’ll be presenting the Thinformer at two ICML workshop posters tomorrow!
Catch me at ES-FoMo (1:00-2:30, East Hall A) and at LCFM (10:45-11:30 & 3:30-4:30, West 202-204).
Posts by Annabelle Michael Carrell
If you’re not at ICML, you can still read our work. Our new theoretically principled algorithms beat recent baselines across multiple tasks—including Transformer approximation!
Your data is low-rank, so stop wasting compute! In our new paper on low-rank thinning, we share one weird trick to speed up Transformer inference, SGD training, and hypothesis testing at scale. Come by ICML poster W-1012 Tuesday at 4:30!
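To give a flavor of the idea (this is an illustration only, not the paper's thinning algorithm): thinning replaces a large set of points with a much smaller subset whose averages approximate those of the full set, so attention can be computed over far fewer key/value pairs. A minimal sketch using plain uniform subsampling as a stand-in for the subset selection, with made-up sizes `n`, `d`, and `m`:

```python
import numpy as np

def attention(q, K, V):
    # Standard softmax attention for a single query q over keys K and values V.
    scores = K @ q / np.sqrt(q.shape[0])
    w = np.exp(scores - scores.max())  # subtract max for numerical stability
    w /= w.sum()
    return w @ V

def thinned_attention(q, K, V, m, rng):
    # Illustration only: approximate full attention by attending over a
    # random subset of m of the n key/value pairs. The paper's thinning
    # procedure chooses this subset far more carefully, with guarantees.
    idx = rng.choice(K.shape[0], size=m, replace=False)
    return attention(q, K[idx], V[idx])

rng = np.random.default_rng(0)
n, d, m = 4096, 64, 256  # hypothetical sequence length, head dim, subset size
K = rng.standard_normal((n, d))
V = rng.standard_normal((n, d))
q = rng.standard_normal(d)

exact = attention(q, K, V)
approx = thinned_attention(q, K, V, m, rng)
err = np.linalg.norm(exact - approx) / np.linalg.norm(exact)
print(f"relative error attending to {m}/{n} keys: {err:.3f}")
```

Attending to `m` instead of `n` keys cuts the per-query cost from O(n·d) to O(m·d); the quality of the approximation then hinges entirely on how the subset is chosen, which is what the low-rank thinning analysis addresses.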