Doe had broken up with the man in 2024, and he used ChatGPT to process the split, according to emails and communications cited in the lawsuit. Rather than push back on his one-sided account, the chatbot repeatedly cast him as rational and wronged, and her as manipulative and unstable. He then took those AI-generated conclusions off the screen and into the real world, using them to stalk and harass her. This manifested in several AI-generated, clinical-looking psychological reports that he distributed to her family, friends, and employer.
For months, her then-fiancé and partner of several years had been fixating on her and their relationship with OpenAI's ChatGPT. In mid-2024, she explained, they'd hit a rough patch as a couple; in response, he turned to ChatGPT, which he'd previously used for general business-related tasks, for "therapy." Before she knew it, she recalled, he was spending hours each day talking with the bot, funneling everything she said or did into the model and expounding pseudo-psychiatric theories about her mental health and behavior. He started to bombard the woman with screenshots of his ChatGPT interactions and copy-pasted AI-generated text, in which the chatbot can be seen armchair-diagnosing her with personality disorders and insisting that she was concealing her real feelings and behavior through coded language. The bot often laced its so-called analyses with flowery spiritual jargon, accusing the woman of engaging in manipulative "rituals." Trying to communicate with her fiancé was like walking on "ChatGPT eggshells," the woman recalled. No matter what she tried, ChatGPT would "twist it." "He would send [screenshots] to me from ChatGPT, and be like, 'Why does it say this? Why would it say this about you, if this is not true?'" she recounted. "And it was just awful, awful things."
Shortly after moving out, the former fiancé began publishing multiple videos and images a day on social media accusing the woman of an array of alleged abuses, the same bizarre ideas he'd fixated on so extensively with ChatGPT. In some videos, he stares into the camera, reading from seemingly AI-generated scripts; others feature ChatGPT-generated text overlaid on spiritual or sci-fi-esque graphics. In multiple posts, he describes stabbing the woman. In another, he discusses surveilling her. (The posts, which we've reviewed, are intensely disturbing; we're not quoting directly from them or the man's ChatGPT transcripts due to concern for the woman's privacy and safety.) The ex-fiancé also published revenge porn of the woman on social media, shared her full name and other personal information, and doxxed the names and ages of her teenage children from a previous marriage. He created a new TikTok account dedicated to harassing content, complete with its own hashtag, and followed the woman's family, friends, and neighbors, as well as other teens from her kids' high school. "I've lived in this small town my entire life," said the woman. "I couldn't leave my house for months... people were messaging me all over my social media, like, 'Are you safe? Are your kids safe? What is happening right now?'"
A woman sued OpenAI last week, alleging that ChatGPT reinforced the obsessive, violent delusions of her stalker, her ex-boyfriend.
This woman's claims (as detailed by TechCrunch, left) are chillingly similar to those of a completely different woman whose story Futurism reported on in February (right):