Doe had broken up with the man in 2024, and he used ChatGPT to process the split, according to emails and communications cited in the lawsuit. Rather than push back on his one-sided account, the chatbot repeatedly cast him as rational and wronged, and her as manipulative and unstable. He then took these AI-generated conclusions off the screen and into the real world, using them to stalk and harass her. Among other things, he distributed AI-generated, clinical-looking psychological reports to her family, friends, and employer.
For months, her then-fiancé and partner of several years had been using OpenAI’s ChatGPT to fixate on her and their relationship. In mid-2024, she explained, they’d hit a rough patch as a couple; in response, he turned to the chatbot, which he’d previously used for general business tasks, for “therapy.”
Before she knew it, she recalled, he was spending hours each day talking with the bot, funneling everything she said or did into the model and expounding pseudo-psychiatric theories about her mental health and behavior. He began to bombard the woman with screenshots of his ChatGPT interactions and copy-pasted AI-generated text, in which the chatbot armchair-diagnosed her with personality disorders and insisted that she was concealing her real feelings and behavior through coded language. The bot often laced its so-called analyses with flowery spiritual jargon, accusing the woman of engaging in manipulative “rituals.”
Trying to communicate with her fiancé was like walking on “ChatGPT eggshells,” the woman recalled. No matter what she tried, ChatGPT would “twist it.”
“He would send [screenshots] to me from ChatGPT, and be like, ‘Why does it say this? Why would it say this about you, if this is not true?'” she recounted. “And it was just awful, awful things.”
Shortly after moving out, the former fiancé began publishing multiple videos and images a day on social media accusing the woman of an array of abuses: the same bizarre ideas he’d fixated on so extensively with ChatGPT.
In some videos, he stares into the camera, reading from seemingly AI-generated scripts; others feature ChatGPT-generated text overlaid on spiritual or sci-fi-esque graphics. In multiple posts, he describes stabbing the woman. In another, he discusses surveilling her. (The posts, which we’ve reviewed, are intensely disturbing; we’re not quoting directly from them or the man’s ChatGPT transcripts due to concern for the woman’s privacy and safety.)
The ex-fiancé also published revenge porn of the woman on social media, shared her full name and other personal information, and doxxed the names and ages of her teenage children from a previous marriage. He created a new TikTok account dedicated to the harassing content, complete with its own hashtag, and followed the woman’s family, friends, and neighbors, as well as other teens from her kids’ high school.
“I’ve lived in this small town my entire life,” said the woman. “I couldn’t leave my house for months… people were messaging me all over my social media, like, ‘Are you safe? Are your kids safe? What is happening right now?'”
A woman sued OpenAI last week, alleging that ChatGPT reinforced the obsessive, violent delusions of her stalker, her ex-boyfriend.
This woman's claims (as detailed by TechCrunch, left) are chillingly similar to those of a completely different woman whose story Futurism reported on in February (right):