Posts by Neal Lathia
I'm just guessing: the default way of scaling a company is more people. It would be experimental to scale a company with AI coding alone, so it's likely only being tried in smaller startups (I can think of one!)
Maybe the difference is between solved (technically possible) and solved (established methodology)
In the last week I’ve heard of the “chat with AI” text box in products as both good (flexible to the long tail of user needs) and bad (you haven’t thought about user needs)
Biggest product crisis since Google’s text box on a white page
Example: a company that is gloating about the % of their PRs that are auto approved by AI just emailed us to say they decided not to fix a bug we reported. Slow clap 👏
The ideas we had moved on to (impact, time to value, safety & scalability, speed of iteration, doing more with less) are enduring and even more valuable now. The main difference is how far you can drive your team along the AI adoption curve to maximise this even more.
I thought that in Engineering we had collectively moved beyond the "lines of code," the "number of PRs," and "tickets completed" metrics, but all of them are back in the spotlight now that AI writes the code. They are still just as meaningless.
Richard Sutton talks about this
Everyone always talking about how awesome AI is. But have you ever tried keyhole surgery? 🤯
Hello there
youtube.com/shorts/89h5U...
valuable & helps me sleep easier
An unexpected valuable thing in b2b: people actually saying no when they aren't interested. Have started doing that as much as I can
Intentional or not, Principle Engineer (instead of Principal Engineer) is a brilliant job title
Even the models don’t like today’s state of flux
I have yet to be sent a link to a Bluesky post, but still regularly get sent links from almost all other social networks. Enjoy
Very “trust me bro” vibes whenever I see it
"""I had Claude Code implement a very simplified version of Git in 13 languages. Ruby, Python, and JavaScript were the fastest, cheapest, and most stable. Statically typed languages were 1.4–2.6× slower and more expensive."""
dev.to/mame/which-p...
Starting to think that the quiet foundation of tools that "increase productivity" is that they are addictive. Just like the way social networks grew with feeds.
These tools are creating productivity by hitting us with some dopamine
Agents using code repos and ticketing systems is cool. Who is working on agent calendars so that I can schedule & see work that’s booked in?
We're currently comparing Claude & Codex. The most fascinating feedback I've heard about one of them is "for some reason I just hate talking to it" and I think that captures so much about what is difficult about building AI products these days
Someone I was chatting to at a conference said "AI should be the sat nav, not the car" to talk about AI adoption and so I asked him what he thinks about Waymo
Rapidly rebranding all my search benchmarks as eval awareness benchmarks
What I learned from being a CTO: there is always someone waiting for me to do something that I haven't done yet
Just over 50% of Gradient Labs is builders now. We wrote up a bit about our ways of working as we navigate this super competitive & ever-changing space
blog.gradient-labs.ai/p/creating-a...
To date, HCI researchers have had no support for signaling their paper's relevance to AI, esp. when that connection is tenuous at best. We introduce a systematic framework to ensure LLMs are mentioned at every stage of paper reporting—from framing, to evaluation, to implications.
Progress in consumer software has been about adopting the next wave and pushing forward, rather than _being_ the next wave.
Every company that was mobile-first a decade ago leapfrogged everyone else and invented new experiences (but they did not need to build mobile phones themselves)
The second hardest is “what would help my users achieve their goal [faster, better, more safely]?”
Product features are useful because they don’t need you to know to ask for them; they empower you to think beyond your base need
The “vibe” influx is being presented in absolute terms (build with AI or buy), but almost all technology impacts *what* and *how* you build, not whether you build at all
That one means: just because you can, doesn’t mean you should
The hardest problem in product has never been "what should we build" - it remains "what is the highest leverage thing we should build next" (and, critically, "what should we absolutely not build")