
Posts by Jeffrey Shallit 🇺🇦

If Carlson was so completely wrong about Trump, what else could he be completely wrong about? A normal human being would ask themselves that.

17 hours ago 0 0 0 0

If you wanted to appoint someone to destroy the US military and its reputation, as Putin desires, then a drunk fundamentalist nutcase like Hegseth would be near the top of the list.

17 hours ago 4 0 0 0
Opinion | When Your Child Dies of Measles

To RFK Jr. and all the anti-vaxxers: this is your handiwork. www.nytimes.com/2026/04/21/o...

22 hours ago 204 85 8 9

As a matter of principle, you can either have a dinner that celebrates the First Amendment or you can have a dinner that features Donald Trump. You can’t have both because the president holds the freedom of the press in contempt. www.mediamatters.org/donald-trump...

21 hours ago 3405 873 65 34

You forgot religious nut, grifter, Nazi, and sycophant.

21 hours ago 0 0 0 0

For the moment it is useful; you can tell almost instantly whether it is written by an LLM and whether the human author cares enough to bother to check the references or not.

But they are already getting better and soon it will be much more difficult to tell.

1 day ago 3 0 1 0

Well, I can see that you sure know what you are talking about, and you display the confidence of one who certainly knows.

1 day ago 0 0 0 0

It is, in fact, a big time sink now. There was an author who submitted a paper w/ hallucinated references, which I pointed out, and he replied by asking *the same LLM that originated the fake reference* whether it existed and it said yes, which he then triumphantly showed me to "prove" I was wrong.

1 day ago 4 0 2 0

Not poli sci, but for a mathematics journal I edit, submissions have gone up by more than 50%. And nearly all of the LLM-generated submissions are junk that get immediate rejection.

1 day ago 4 0 1 0

Man, the whole "anti-woke" moral panic just looks worse & worse in the rearview mirror.

1 day ago 333 37 11 2

It's not debunked, but celebrated. This is my area. And even if somehow everyone has missed something, and it is debunked, there are dozens of other examples, including in my own work.

1 day ago 0 0 0 0

I was replying specifically to the claim "Actually, what keeps happening is that there is some splashy announcement of a niche use found (supposedly predicting breast cancer is an example), with lots of media coverage, whilst the debunking is limited to niche circles."

Try to keep up.

1 day ago 0 0 0 0
Erdős Problem #1196 - Discussion thread

Some of the world's greatest mathematicians are using LLM's to help solve formerly unsolved mathematics problems. You can pretend that this isn't happening, but it is nevertheless.

Just one example of many: www.erdosproblems.com/forum/thread...

1 day ago 2 0 3 0

The future of biomedical research is to make sure a large percentage of the total research budget around the world is dedicated to showing common treatment x doesn't cause autism, again and again and again... because of liars, grifters and charlatans.

1 day ago 11 7 1 0

The whole "wind turbines cause all kinds of ills" crowd is very, very weird.

1 day ago 14 2 1 0

You have literally nothing of value to say. Muted.

1 day ago 0 0 0 0

I am in Barcelona today to help unite progressive parties from around the world in a defense of democracy and a war against corruption. A first of its kind event. Necessary to beat back the forces of fascism.

2 days ago 10498 2277 337 153

Bigots and racists all the way down

2 days ago 4 0 0 0

So, you didn't withdraw your bogus argument, and you're too scared to read someone who disagrees with Prof. Bender.

That sounds like someone whose mind is already made up and is resistant to hearing a dissenting view.

2 days ago 0 0 1 0

It is accurate in exactly the same sense that calling a library "a repository for dead trees" is accurate. Namely, it deliberately omits what is important and useful about both tools.

In other words, it is propaganda designed more to amuse and denigrate than to enlighten.

2 days ago 0 0 0 0
Yet Another Bad Analysis of AI — In the past I've commented on bad discussions of thinking, intelligence, brains, and computers, such as those by Gary N. Smith, Doug Hofst...

I addressed this misconception here:
recursed.blogspot.com/2025/07/yet-...

Furthermore I know something that you probably don't: LLM's like Gemini and ChatGPT are deliberately trained to deny that they have beliefs and to say they are not intelligent in the human sense.

2 days ago 2 0 1 0

Your attempt at deriving a self-contradiction fails utterly because you've assumed that a source of information that is not always reliable is useless for solving problems, which is clearly false. Almost every source of information is unreliable. Even dictionaries and encyclopedias have errors. 1/2

2 days ago 0 0 1 0

You believe that calling an LLM a "synthetic text-extruding machine" is accurate, yet you ask the opinion of one about the accuracy of this description?

You couldn't make this silliness up if you tried.

2 days ago 1 0 2 0

I think you missed the argument. The argument is not "their motivations are suspect, so they are wrong". The argument is "they are wrong, and by the way their motivations are suspect."

2 days ago 0 0 1 0

"You're funny" is not an insult. "You're silly" would be an extremely gentle insult. I didn't use it before, but it's appropriate now.

2 days ago 0 0 1 0

That's like saying "it doesn't take any skill to interact with grad students in math", because you can just say to them "can't you see that your proof doesn't work?" or "this is obviously a half-formed idea, why don't you go back to your room and think more about it?".

2 days ago 1 0 0 0

The only reason why people resort to pejorative and inaccurate names like "stochastic parrots" and "synthetic text-extruding machines" is that (a) they think they're clever and (b) they're worried their arguments can't stand on their own.

2 days ago 0 0 1 0

You're funny! My *whole point* was that calling them "synthetic text-extruding machines" instead of what everyone else calls them (namely, LLM's) is silly and weakens the argument Prof. Bender was making.

2 days ago 0 0 2 0

James Joyce supposedly said to a friend that in a day's work he had written only 7 words. And he was still not sure what order they should go in.

2 days ago 11 2 1 0

The next administration must pass legislation that requires Trump to pay back double any settlement amount his corruption receives.

Any nominee who promises that has my vote.

3 days ago 3 1 1 0