
Posts by Lucas Beyer (bl16)

Index the codebase first. Then ask o1 in "codebase chat" the same questions you would like to ask the author/owner of the codebase.

Mostly useful when digging into unknown/new codebases and trying to understand them. Or asking about possible bugs :)

1 year ago 4 0 0 0

Yes, I noticed that over Christmas break, and ever since, it's just... a lot more boring here.
My first reason to open social media is to be entertained. The second is to entertain. Maybe a distant third is to learn something new.
Not much of any of these here. I know, I know, be the change and all that.

1 year ago 0 0 1 0
Post image

I noticed that I'm not using bsky much anymore. Not sure why, vibes.

Anyways, someone noticing that DeepSeek refuses to answer *anything* about Xi Jinping, even the question of whether he exists at all, prompted me to write a short snippet on safety fine-tuning: lb.eyer.be/s/safety-sft...

1 year ago 88 8 6 0
Post image

First candidate for banger of the year appeared, only 2 days in:

1 year ago 27 0 0 1
Post image Post image

OpenAI skips o2, previews o3 scores, and they're truly crazy. Huge progress on the few benchmarks we think are truly hard today, including ARC-AGI.
RIP to people who say any of "progress is done," "scale is done," or "LLMs can't reason".
2024 was awesome. I love my job.

1 year ago 113 14 11 5

It reeeeally depends on what loss1 and loss2 are, both regarding what's standard and what's wasted.
I honestly think you are confused: the three code snippets in your three different posts mean three different things. I don't mean it in a negative way, but clearing it up would take more time than I want :/

1 year ago 1 0 2 0

Yeah, it seems either he made a mistake in the OP, or the subject of the discussion has drifted :)

1 year ago 0 0 0 0

Until we get good enough AI-supported search, no, you can't realistically expect them to find anything and everything from the past 30 years, when the vocabulary and everything else keeps changing.

1 year ago 10 0 1 0

Well, no, two things:

1. In the OP, indeed *both* formulations waste compute, so yeah :)

2. In your 2nd post, you are not doing the same thing as in your OP! There, you are doing good old micro-batching, for which the second way is indeed the standard way (see the sketch below).

So what you say keeps changing O.o

1 year ago 0 0 1 0
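
For readers skimming the thread: a minimal JAX sketch of micro-batching (gradient accumulation), the pattern referenced above. Every name here (loss_fn, the tiny tanh model, accumulated_grads) is a hypothetical stand-in for illustration, not the code from the posts being discussed.

```python
# Hedged sketch: gradient accumulation over micro-batches in JAX.
import jax
import jax.numpy as jnp

def loss_fn(params, x, y):
    pred = jnp.tanh(x @ params["w"])  # stand-in model
    return jnp.mean((pred - y) ** 2)

def accumulated_grads(params, xs, ys):
    # xs, ys have shape [num_micro_batches, micro_batch, ...]. We sum
    # gradients over micro-batches and then average; for equal-sized
    # micro-batches this matches the gradient of one big batch without
    # ever materializing that big batch.
    grad_fn = jax.grad(loss_fn)
    init = jax.tree.map(jnp.zeros_like, params)

    def step(acc, xy):
        x, y = xy
        return jax.tree.map(jnp.add, acc, grad_fn(params, x, y)), None

    total, _ = jax.lax.scan(step, init, (xs, ys))
    return jax.tree.map(lambda g: g / xs.shape[0], total)
```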

I would enjoy that meeting ;)

1 year ago 3 0 0 0

Yeah, it's silly to expect the new generation to know everything the old generation did; expecting that shows a complete lack of empathy.

1 year ago 7 0 1 0

If the two graphs are completely disjoint, then there is no point in this. If they have some commonality (like the model), then this does the common part twice (sketch below).

1 year ago 1 0 3 0
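
To make the "common part twice" point concrete, here is a hedged JAX sketch; model, loss1, and loss2 are hypothetical stand-ins, not the snippets from the thread. The first version redoes the shared forward pass for each loss; the second computes it once.

```python
# Hedged sketch: combining two losses that share a forward pass.
import jax
import jax.numpy as jnp

def model(params, x):
    return jnp.tanh(x @ params["w"])  # the shared ("common") computation

def loss1(out, y):
    return jnp.mean((out - y) ** 2)

def loss2(out, y):
    return jnp.mean(jnp.abs(out - y))

# Wasteful: two separate grad calls each redo the shared forward pass.
def grads_twice(params, x, y):
    g1 = jax.grad(lambda p: loss1(model(p, x), y))(params)
    g2 = jax.grad(lambda p: loss2(model(p, x), y))(params)
    return jax.tree.map(jnp.add, g1, g2)

# Better: one forward, one backward through the summed loss; the result
# is identical because grad(loss1 + loss2) = grad(loss1) + grad(loss2).
def grads_once(params, x, y):
    def total(p):
        out = model(p, x)  # common part computed once
        return loss1(out, y) + loss2(out, y)
    return jax.grad(total)(params)
```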

I'm somewhat confident both of these are sins lol, the second one wastes a ton of compute!

1 year ago 6 0 2 0
Post image

Not quite, because this one is not stacked, so I give it a better chance to scale:

1 year ago 1 0 0 0

A post by @cloneofsimo on Twitter made me write up some lore about residuals, ResNets, and Transformers. And I couldn't resist sliding in the usual cautionary tale about small/mid-scale != large-scale.

Blogpost: lb.eyer.be/s/residuals....

1 year ago 81 9 3 2
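
For context without clicking through: the core object the blogpost discusses, a residual (skip) connection, fits in a couple of lines. This is a generic textbook version, not an excerpt from the post; names are illustrative only.

```python
# Generic residual (skip) connection as used in ResNets and Transformer
# blocks: the block outputs its input plus a learned update, so the
# identity path is always available.
import jax.numpy as jnp

def residual_block(params, x, f):
    # f is any learned transformation (MLP, attention, conv, ...).
    return x + f(params, x)
```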

That’s what the globe was for!

1 year ago 2 0 1 0

One of the "physics of LLMs" papers studied that and found you need a certain amount of repetitions of a factoid before it's memorized. Repetition can be either multiple epochs or just the same fact appearing in another document. The number of needed repeats is also related to model size.

1 year ago 11 2 2 0

OK OK I’ll admit it, I’m feeding off your fomo! There can’t be enough fomo! Mmmm fomo!

1 year ago 2 0 0 0

Yeah, they compress videos to shit here, and are considering making good-quality video a paid feature.

1 year ago 3 0 1 0

No talk, but two posters (LocCa and NoFilter); I'm just a middle author but will try to be there. That being said, my main occupation here will be meeting many of my new colleagues.

1 year ago 1 0 0 0

Good morning Vancouver!

Things are different here: this guy is alone, chonky, and not scared at all; I was more scared of him towards the end lol.

Also look at that … industrialization

1 year ago 31 0 3 0

lol exactly. I said CDG whenever possible.

That being said, our flight is operated by Air Canada, including a layover in Canada, which is a lot worse.

1 year ago 5 0 2 0

Oops! Thanks

1 year ago 0 0 0 0
Video

Good morning! On my way to NeurIPS, slightly sad to leave this beautiful place and my family for the week, but also excited to meet many new and old friends at NeurIPS!

1 year ago 66 0 3 1

This afternoon flight? I’m taking that too

1 year ago 2 0 2 0

Aye, finally I can without it being weird lol

1 year ago 4 0 0 0
Post image
1 year ago 3 0 1 0
[M2L 2024] Transformers - Lucas Beyer · YouTube video by Mediterranean Machine Learning (M2L) summer school

One of the best tutorials for understanding Transformers!

📽️ Watch here: www.youtube.com/watch?v=bMXq...

Big thanks to @giffmana.ai for this excellent content! 🙌

1 year ago 55 8 0 0

Yep, and related to this: bsky.app/profile/giff...

1 year ago 2 0 0 0

I'm confused, why not nbviewer.com, which has existed and been working for a decade?

1 year ago 4 0 1 0