
#textgrad

Reducing GPT-4 API Cost by Prompt Compression: To reduce the size of a prompt, you can use compression techniques. One way to do this is by using GPT's ability to compress and decompress tokens. A recent tweet from @VictorTaelin suggests that GPT…

Old, but it's still amazing how many tokens you can reduce with this technique. medium.com/@parasmadan....
It's worth keeping these strategies in mind even as new concepts appear every day. I like the combination with #TextGrad for some crazy performance increases #AI #LLM #performance
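The trick described above boils down to two prompt templates: one asking the model to compress text into a terse representation it can later expand, and one asking it to expand that representation back. A minimal sketch of the pattern, where `call_gpt` is a hypothetical stub standing in for a real GPT API call (the stub just echoes the payload so the round trip is demonstrable):

```python
# Sketch of prompt compression/decompression via the model itself.
# COMPRESS_TEMPLATE and DECOMPRESS_TEMPLATE are illustrative, not the
# exact wording from the linked article.

COMPRESS_TEMPLATE = (
    "Compress the following text into the shortest string of tokens that "
    "you (the same model) can later reconstruct losslessly:\n\n{text}"
)
DECOMPRESS_TEMPLATE = (
    "Reconstruct the original text from this compressed "
    "representation:\n\n{text}"
)

def call_gpt(prompt: str) -> str:
    """Hypothetical stub standing in for a real GPT API call.

    For the demo it returns the payload after the instructions unchanged,
    so compression and decompression are identity operations.
    """
    return prompt.split("\n\n", 1)[1]

original = "Summarize the quarterly report, focusing on revenue and churn."
compressed = call_gpt(COMPRESS_TEMPLATE.format(text=original))
restored = call_gpt(DECOMPRESS_TEMPLATE.format(text=compressed))
print(restored == original)  # True in this stub
```

With a real model, the compressed string is shorter than the original, and the savings come from paying for fewer input tokens on every subsequent call that reuses it. Note that model-based decompression is lossy in practice, so this suits instructions more than verbatim data.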


Here's the non-paywall version of our #TextGrad paper rdcu.be/efRp4! 📜

Tiny satellite sets new record for secure quantum communication | Hear the biggest stories from the world of science | 19 March 2025

I had a lot of fun discussing #textgrad on the @nature.com podcast with @climateadam.bsky.social! It starts at around 12 minutes here www.nature.com/articles/d41...

Optimizing generative AI by backpropagating language model feedback - Nature: Generative artificial intelligence (AI) systems can be optimized using TextGrad, a framework that performs optimization by backpropagating large-language-model-generated feedback; TextGrad enable...

💡The key idea of #textgrad is to optimize by backpropagating textual gradients produced by an #LLM

Paper: www.nature.com/articles/s41...
Code: github.com/zou-group/te...

Amazing job by Mert Yuksekgonul leading this project w/ Fede Bianchi, Joseph Boen, Sheng Liu, Pan Lu, Carlos Guestrin, Zhi Huang
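The loop behind that key idea can be sketched in a few lines: treat a piece of text as a trainable variable, ask an LLM to critique it with respect to an objective (the "backward pass" producing a textual gradient), then ask the LLM to revise the variable using that critique (the "optimizer step"). The `llm` function below is a hypothetical stub standing in for a real model call; the actual framework lives at github.com/zou-group/textgrad and exposes this loop through `Variable`, `TextLoss`, and `TGD` abstractions:

```python
# Toy sketch of textual gradient descent. llm() is a stub with canned
# responses; a real system would call a language model for both steps.

def llm(prompt: str) -> str:
    """Hypothetical stub standing in for an LLM call."""
    if prompt.startswith("Critique"):
        # Backward pass: a natural-language "gradient" for the variable.
        return "Be more specific and state the target audience."
    # Optimizer step: a revised variable incorporating the feedback.
    return "Explain gradient descent simply, for a high-school audience."

def textual_gradient_step(variable: str, objective: str) -> str:
    # "Backward pass": critique the current text w.r.t. the objective.
    gradient = llm(f"Critique this text for the goal '{objective}': {variable}")
    # "Optimizer step": rewrite the text following the critique.
    return llm(f"Revise '{variable}' using this feedback: {gradient}")

prompt = "Explain gradient descent."
improved = textual_gradient_step(prompt, "a clear explanation for beginners")
print(improved)
```

Iterating this step is the textual analogue of running gradient descent: each pass nudges the text toward the objective, with the critique playing the role of the gradient.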


⚡️Really thrilled that #textgrad is published in @nature.com today!⚡️

We present a general method for genAI to self-improve via our new *calculus of text*.

We show how this optimizes agents🤖, molecules🧬, code🖥️, treatments💊, non-differentiable systems🤯 + more!
