That said, the AI industry resembling a multitrillion-dollar broken-up polycule can't be helping. One influential anonymous X account run by an OpenAI employee — speaking of interesting communications strategies! — worries that it might have some downsides. As Roon tweeted: "the ai labs, in competing with each other, are burning huge amounts of the commons on public trust in ai to win minor points against the others. their lobbyists, pr machines, lawsuits. it's the very opposite of what marxist class struggle analysis would tell you." From inside the industry, or even if you spend enough time steaming in the AI hothouses of X or LinkedIn, this map of intra-AI rivalries and vendettas is legible and, for some of these guys, ideologically coherent, rooted in old and substantive disagreements about how to build intelligent machines. From the outside, though, old, festering disagreements about alignment, AI safety, and novel corporate governance structures tend to lose a lot of texture, and the situation can be read, accurately if not necessarily sufficiently, as something simpler and more familiar: another new industry in the throes of massive expansion, its investors desperate for upside and its principal actors engaged in a ruthless land grab and fight for dominance that feels, to them, like a matter of life or death. That fight is all in pursuit of an outcome that they've explained is (1) probably inevitable and (2) might be pretty bad, and which therefore sounds awfully predatory.
It can be deflating to reimagine the AI boom as a more pedestrian business story with particularly colorful executives expressing contempt for their rivals and making things personal on the way to, say, packaged-beverage dominance. But the maximally dysfunctional dynamics of the pre-takeoff AI industry can also be read as an early, bad sign of how things might play out for everyone else: like they always do, but maybe worse. Here is a visible, prepared, and substantively aligned "small group of elites," including a few of the richest people in the entire world, suggesting that it's time to collectively "rethink the social contract" and warning that we're about to be "tested as a species," as they're in the process of succumbing completely to a crude, winner-take-all market logic, utterly failing to coordinate among themselves, fighting regulation with lobbyists, getting pissed as hell in public, and opening up a bunch of fronts in a total industrial war for scarce resources — power, compute, water — with immediate and unmitigated externalities. (Granted, comprehensive high-level coordination among them might look like something else people don't particularly love: a cabal.) Individually, to receptive audiences, they can explain how all this happened and rationalize their own roles. To much of the rest of the world, though, they just look like a group of people who worried about building the thing and then couldn't figure out how not to, who cautioned against getting trapped in an arms race and then started one anyway. They see people warning about the speed of change as they step over one another to make it accelerate. They see people urging humility and accusing one another of having God complexes while engaging in a naked struggle for power.
I wrote about the mutual contempt that motivates our polycular AI elite: nymag.com/intelligence...