
Posts by Raphael Schumann

Same boat as your AC

1 year ago 2 0 1 0

Could you add me please?

1 year ago 5 0 0 0

CBOW vs. Skip-gram

1 year ago 6 0 0 0

Great work! Are you going to release the models?

1 year ago 6 0 0 0

A starter pack for #NLP #NLProc researchers! 🎉

go.bsky.app/SngwGeS

1 year ago 251 99 45 13

#EMNLP has a nice set of tokenization/subword modeling papers this year.

It's a good mix of tokenization algorithms, tokenization evaluation, tokenization-free methods, and subword embedding probing. Lmk if I missed some!

Here is a list with links + presentation time (in chronological order).

1 year ago 47 16 5 2

First time ML/NLP Bluesky feels alive.

1 year ago 3 0 0 0

This helped a lot!

1 year ago 1 0 0 0

I even make sure to delete paths with my username from code in supplementary material

2 years ago 1 0 0 0
State of the art - ACL Wiki

TIL that the ACL Wiki has/had a state-of-the-art overview:

aclweb.org/aclwiki/Stat...

2 years ago 1 0 0 0

It also works with Flash Attention 2, although I don't see additional speedups. I don't think FA is optimized for generation.

2 years ago 0 0 0 0
Preview: Using padding and prefill during inference in huggingface transformers (run_padding_prefill.py)

Conceptually it is clear that this works, but I wasn't aware that huggingface passes this through correctly.
Github Gist to reproduce:
gist.github.com/raphael-sch/...

2 years ago 0 0 1 0

You have to place the padding tokens in between the prefill and input tokens (example with 3 prefilled tokens):
input_ids: [0, 0, X, X, X, X]
position_ids: [0, 0, 3, 4, 5, 6]
attn_mask: [1, 1, 1, 0, 0, 1, 1, 1, 1]
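The layout above can be sketched as a small helper (a minimal sketch, not code from the gist; `build_padded_inputs` is a hypothetical name, and the prefill tokens are assumed to already sit in the KV cache so the mask spans prefill + padding + input):

```python
def build_padded_inputs(n_prefill, n_pad, input_tokens, pad_id=0):
    """Build input_ids, position_ids and attention_mask for a batch row
    whose first n_prefill tokens are already cached (prefilled).

    Padding goes between the cached prefill and the new input tokens,
    so input_ids covers only padding + input, while the attention mask
    covers the full cache: prefill + padding + input.
    """
    # New tokens fed to the model: padding first, then the real input.
    input_ids = [pad_id] * n_pad + input_tokens
    # Padding gets position 0; real tokens continue after the prefill.
    position_ids = [0] * n_pad + [n_prefill + i for i in range(len(input_tokens))]
    # Attend to prefill and input, mask out the padding in between.
    attention_mask = [1] * n_prefill + [0] * n_pad + [1] * len(input_tokens)
    return input_ids, position_ids, attention_mask


# Reproduces the example: 3 prefilled tokens, 2 padding tokens, 4 inputs.
ids, pos, mask = build_padded_inputs(3, 2, ["X", "X", "X", "X"])
print(ids)   # [0, 0, 'X', 'X', 'X', 'X']
print(pos)   # [0, 0, 3, 4, 5, 6]
print(mask)  # [1, 1, 1, 0, 0, 1, 1, 1, 1]
```

In a real call these lists would become tensors passed to `model(...)` alongside the prefilled `past_key_values`.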

2 years ago 0 0 1 0

Turns out that with the right attention_mask and position_ids you can prefill tokens AND pad batches in huggingface transformers. This speeds up inference, especially if each instance has the same system prompt prepended. Code below ↓

2 years ago 4 0 1 1