That’s exactly where machines win - so that’s what companies will optimize for. Not because they’re evil, just because that’s what they do.
pay one person ⬅️ OR ➡️ run something 24/7 that keeps improving
You already know the direction. If not… just pretend.
Have a great weekend!
It just… runs. Same speed at 2am, 7pm, Sunday, whatever.
No "bad day", no context switching fatigue, no emotional drain from 10 tiny interruptions.
That changes the game more than you think because most work is about consistency.
1️⃣ show up
2️⃣ do the task
3️⃣ don’t drop quality
You know how it works - you wake up fresh. Maybe 2-4 hours of real focus if you’re lucky. After that it’s downhill. Meetings. Messages. Random stuff pulling you apart.
By afternoon, even simple decisions feel heavy.
Humans are bursty.
AUTOMATA ISN'T
It can connect small issues into big ones - the kind of thinking that usually takes experience to develop and it does so damn naturally.
If it can handle the hardest tasks, then what about the remaining parts: boring emails, reports, planning - all the stuff we hate but do daily?
[Image: chalkboard contrasting human productivity with automata - mental fatigue vs. consistent efficiency]
Everybody is talking about the new Mythos model, cybersecurity, etc. - but they're missing the actual story.
What’s interesting is that it wasn’t even built for this… and it still beats ‼️ people who spent years in the field.
I think this is the main part here.
I hope you've found this thread helpful.
Follow me @desunit for more.
Like/Repost the quote below if you can:
Paper: arxiv.org/pdf/2603.20617
But what's the solution!?
Let the free market cure itself, though it will be painful. Historically, this rebalancing usually saved the system - but it happens much slower than layoffs do.
Thoughts?
Their solution is a tax on automation (Pigouvian automation tax) but I doubt it will work.
@nntaleb in Antifragile also doesn’t believe in regulation, because the winners abuse the system every time:
> arbitrage regulation
> move jurisdictions
> reshape rules in their favor
etc.
The problem is that over-automation is inevitable because competition pushes firms into it.
The extreme case is high productivity and zero demand.
The paper also highlights that common solutions such as wage adjustments, universal basic income, capital taxes, etc., won’t work here.
I remember @RayDalio in "How the Economic Machine Works" explained that really well. Transactions are the driving force of the economy. If people lose their jobs, they spend less; if they spend less, demand for all firms goes down, and the whole economy slows down.
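Dalio's feedback loop (layoffs → less spending → less demand → more layoffs) can be sketched as a toy simulation. All numbers and names here are invented for illustration only - this is not from the paper:

```python
# Toy model of the demand feedback loop: laid-off workers spend less,
# firms see lower demand, firms cut more staff, and demand falls again.
# "propensity" is the share of income that gets spent (a made-up 0.9 here).

def demand_spiral(workers=100, wage=1.0, layoff_rate=0.10, propensity=0.9, rounds=4):
    demand_history = []
    for _ in range(rounds):
        income = workers * wage
        spending = income * propensity   # workers' spending == firms' demand
        demand_history.append(round(spending, 1))
        # firms respond to the slowdown by cutting a fixed share of staff
        workers = int(workers * (1 - layoff_rate))
    return demand_history

print(demand_spiral())  # demand shrinks every round
```

Each round of layoffs shrinks the next round's demand, which is exactly the "whole economy slows down" dynamic the thread describes.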
[Image: cartoon of global tax strategies - a businessman, a robot, and remote workers near a beach, illustrating profit shifting]
AI-driven layoffs undermine the entire economy by reducing consumer demand.
I've expressed these concerns a couple of times already, and now we see a paper with the results.
Execs buy the narrative; workers deal with the reality - and often don't know how to apply AI to their actual tasks.
The most interesting part:
- 61% of executives trust AI for critical decisions
- only 9% of workers do
Workers lose 51 days/year to tech friction (equal to the time AI supposedly saves) 🤷‍♂️
They failed at defining what humans + AI should do together.
Food for thought from my previous article: I was quite surprised to read the State of Digital Adoption 2026 report from WalkMe, which states that 80% of employees are basically not using AI.
- 54% bypass it
- 33% ignore it completely
Companies are spending 54M on AI, but most of it sits unused.
Guys, what's going on?! Is it AI disruption or ... ?
Feels like we’re not just using new tools.
We’re becoming… slightly different people.
> stop clinging to your old identity
> don’t turn your brain off - just stop defending it
> get out of your own way
The future is less predictable now. What matters shifts:
> less ego
> more adaptability
> more flow
Not "be smarter than AI" but learn how to move with it
You know that disorienting feeling when your sense of self breaks a bit after a big realization? That’s exactly how this feels.
What helps? Definitely not fighting it - adjust to it. I catch myself repeating this to other developers all the time.
> spend more time with AI
After spending real time building with AI, you start noticing something uncomfortable: it’s not just helping you think - it’s outthinking you.
What if the thing that made me valuable… isn’t special anymore?
The author compares this to Zen Buddhism - specifically "Zen sickness".
AI just quietly took your seat. The one where you felt like the thinker.
I found a great article that talks about AI Zen.
Before the Anthropic Opus 4.6 release, AI felt like a better Google. Now it feels like… a better version of you?!
faster.
clearer.
more complete.
No splitting decisions between two people with different risk tolerances. No "we need to align on this" when you already know what to do.
Co-founders were a workaround for limits that no longer exist.
AI can be one for you - it takes 0% equity, doesn't argue about the roadmap, and is available at 2am without a calendar invite.
No co-founder means no six-month disagreement about whether to pivot.
> friendship turning into a liability
> firing your co-founder/being fired by them
One founder I know put it plainly: "I realized I was giving away 50% of my equity to someone I couldn't fire, just to argue with them."
The advice to get a co-founder was about what made investors comfortable approving a term sheet ... and that's shifting.
Co-founders also come with the things nobody puts in the pitch deck:
> equity disputes that surface 18 months in
> one person scaling, one person not