Hashtag: #AIisracist

Why am I not surprised?
“In effect, AI would go on to replicate bias resulting from gaps in AI training data.”
#AIisracist

Why ai age verification can be racist #aiisracist #ageverification #onlinesafetyactuk (YouTube video by Lewis Wilcock, commentary)

Why ai age verification can be racist #aiisracist #ageverification #onlinesafetyactuk #promosky #tiktok #youtube

TikTok: vm.tiktok.com/ZNduTYaX2/

YouTube: youtube.com/shorts/rXb3C...

An infographic titled “Real-Life Horror Stories” with an eye emoji (👀) at the beginning. The background is a dark cyberpunk-style design with glitch effects. The text provides three real examples:
📌 2018: Amazon’s AI falsely flagged Congress members as criminals. ACLU tested Amazon’s facial recognition software (Rekognition), and it falsely matched 28 Congress members to mugshot photos. The errors disproportionately targeted Black & Brown politicians.
📌 2019: Detroit police arrested an innocent Black man based on faulty AI. Facial recognition software misidentified a driver’s license photo as a shoplifter, leading to his wrongful arrest and 30 hours of detention.
📌 2025: CBP One’s AI blocked Black migrants from applying for asylum. CBP One’s facial recognition app—used at the U.S.-Mexico border—failed to recognize Black faces, locking them out of the process. DHS shut it down in January 2025 after massive failures.
The handle “@ELI5BSI” is displayed at the bottom in neon glitch text.


📌 2018: Amazon’s AI falsely flagged Congress members as criminals.
📌 2019: Detroit police arrested an innocent Black man based on faulty AI.
📌 2025: CBP One’s AI blocked Black migrants from seeking asylum.
These aren’t “glitches.” This is how the system works. #AIisRacist
