We welcome the @europarl.europa.eu's decision to back a landmark ban on #NudificationApps - technology that has been weaponised to create non-consensual intimate imagery, causing devastating harm to victims.
This is a crucial step >> ec.europa.eu/commission/p...
#EndSexualViolenceNow
The UK just CRIMINALIZED nudification apps: AI tools that generate fake nude images of women without consent. It's a start. But thousands still run freely in the US. When will Congress protect us? #NudificationApps #EndAIAbuse #CriminalizeNotNormalize
www.gov.uk/government/n...
🧵 5/6
More likely, the first steps would be enforcement notices or fines, which could run into hundreds of millions of dollars given X's global turnover.
#DRAGON #Grok #onlinesafetyact #nudificationapps #Ofcom
🧵 4/6
Importantly, Ofcom has stressed that a ban would be a last resort. The regulator has to follow due process, build a legally robust case, & give X the opportunity to respond. Otherwise it risks being overturned in court.
#DRAGON #Grok #onlinesafetyact #nudificationapps #Ofcom
🧵 1/6
The government has made clear it would support the regulator if it ultimately chose to block the platform from operating in the UK – effectively a ban.
#DRAGON #Grok #onlinesafetyact #nudificationapps #Ofcom
Following the spread of AI-generated sexual images of women & children produced via X's built-in Grok tool, Ofcom has launched its most serious investigation yet under the Online Safety Act.
🔗 tinyurl.com/3x7xtnvs
🔗 tinyurl.com/3tfk55k3
🧵👇
#DRAGON #Grok #onlinesafetyact #nudificationapps #Ofcom
🧵 7/7
The aim is to move beyond reacting to abuse after the fact, & instead prevent harm from happening in the first place.
#DRAGON #onlinesafety #onlineharms #nudificationapps #OnlineSafetyAct
🧵 6/7
Alongside the ban, the government has said it will work with safety-tech companies to develop stronger technical protections, including tools that can detect & block intimate imagery before it is created or shared.
#DRAGON #onlinesafety #onlineharms #nudificationapps #OnlineSafetyAct
🧵 5/7
The @iwf.org.uk has reported that nearly one in five young people seeking help through its Report Remove service had images that were manipulated or altered.
#DRAGON #onlinesafety #onlineharms #nudificationapps #OnlineSafetyAct
🧵 4/7
Experts have warned that nudification tools are increasingly being used to create fake nude images of women & girls, including children, with devastating psychological & reputational impacts.
#DRAGON #onlinesafety #onlineharms #nudificationapps #OnlineSafetyAct
🧵 3/7
The decision follows growing concern from child protection organisations & campaigners about the real-world harm caused by these apps.
#DRAGON #onlinesafety #onlineharms #nudificationapps #OnlineSafetyAct
🧵 2/7
While creating explicit deepfakes without consent is already a criminal offence under the Online Safety Act, this change goes further by targeting the tools themselves, not just their misuse.
#DRAGON #onlinesafety #onlineharms #nudificationapps #OnlineSafetyAct
🧵 1/7
The move, part of a wider strategy to halve violence against women and girls, will make it illegal to create, supply or profit from technology designed to generate these images.
#DRAGON #onlinesafety #onlineharms #nudificationapps #OnlineSafetyAct
Last month, the UK government announced plans to ban "nudification" apps – AI tools that can digitally remove clothing from images to create non-consensual, sexually explicit deepfakes.
🔗 tinyurl.com/3ybdskky
🧵👇
#DRAGON #onlinesafety #onlineharms #nudificationapps #OnlineSafetyAct