Nobody's recently asked, but in case anybody's interested I also will agree to buy $100 billion of your future services if you give me $5 billion upfront
Posts by Jeff Horwitz
The size of Meta's future layoff waves will depend on the speed of improvements in AI tooling and capacity.
It sets up a wild workplace dynamic.
www.reuters.com/world/meta-t...
i love when content is on platforms
ugggh
On X, community notes being used to flag X for running allegedly illegal undisclosed ads for Nvidia.
No, it would still be dying. Separate axes.
Chinese government decides to regulate AI from the personal-likeness rights, child safety, and addiction angles.
www.reuters.com/world/china/...
Okay so here’s my actual Bluesky-is-dying hypothesis:
The entire web is dying. Users aren't going from Bluesky to another site (x/insta/threads/tiktok). Users are going to chatbots.
I know traffic to news sites has cratered (like 90%). My hunch is traffic to all the social platforms is down too.
Smoke inhalation is the only version actually known to be potentially lethal
He wasn’t totally right. But he wasn’t totally wrong either
Did you lick the cat? Or just snort it?
I’m a reporter and I’m regularly unclear if I enjoy writing or reporting.
Just wanted to add my two cents to this discussion
The compromise @caseynewton.bsky.social is proposing would be a big deal in the tech policy world. The feud between Section 230 absolutists and those seeking to rein in social media companies is a big reason why regulation always withers on the vine.
I hugely respect your commitment to this bit.
I'd also like to suggest adding "Fully sentient, AGI toilet" to the list.
A court document referencing rhino ketamine
Elon Musk's lawyers have asked the court to suppress talk of "inflammatory and highly irrelevant topics" like Donald Trump and Rhino Ketamine
storage.courtlistener.com/recap/gov.us...
Interesting read. Thanks.
That the current fight involves child safety -- a thing that dredges up past battles over video game violence and CD warning labels of "graphic language" -- just raises the temperature, I think.
The weird current state of play is that legal outcomes that news outlets and the general public seem to view as consequential blows against "Big Tech" are going unremarked by the brand-name critics of "Big Tech."
There's just silence - and maybe even a sense that the litigation is harmful.
Someone should probably write something about the internecine fault lines around 230 and child well being -- and the split between advocacy groups that want regulation versus the tech-libertarian groups that fear government power more than platform power.
The KOSA fight aggravated those tensions.
In practice, exposure is a double-edged sword. Research restrictions get tighter after leaks, but press coverage and lawsuits drive action. On child safety, Meta's lawyers have been internally warning for two years that pain was coming from public exposure.
Those warnings changed product.
I'm VERY aware that leaks have consequences, but it's interesting to see a CDT staffer saying that public scrutiny of internal company documents ultimately harms social media well-being work.
Sorta feels like many brand-name tech critic orgs are either quiet or resentful of the meta verdicts.
For anyone with doubt about this, just go search the word "holdout" in the documents in the FB Archive at Harvard.
fbarchive.org
I spent seven years researching this. Platform regulation discussions would be better if they began with the premise that design choices do have measurable effects.
There's an unfortunate tendency to combine speech concerns about negligent platform design lawsuits with a claim that platform design can't be blamed for bad user outcomes.
Platforms run A/B tests showing the effects of discrete features and then weigh safety vs growth tradeoffs. That's settled.
it's not working!!!!
Also, beyond the money in the New Mexico case, there will be a bench trial in May to adjudicate NM's claim that Meta needs to change its product to comply with state law.
I should clarify that there were two separate findings of liability, each involving 37,500 violations. So that's ~$375 million in a state that contains 0.62% of the US population.
www.reuters.com/sustainabili...
For anyone losing track: the New Mexico Case was running parallel to the LA bellwether social media teen mental health case.
Jury finding for the state across the board is, however, a big moment for the crowd arguing that product liability offers a way around Section 230.
New Mexico took a different tack from the other states going after broader youth well-being stuff. (It is not part of the massive multi-state case).
There is overlap, though, and I don't think there's anything stopping similar cases from being brought by AGs.
Verdict is in for the New Mexico child safety case against Meta. Goes against Meta on every question.
Jury finds Meta liable for ~37,000 violations. Maximum liability of $5,000 per violation.