#AutomatedRacism
Automated Racism @AmnestyUK found out that 3/4 of police forces across the UK are using technology to try to “predict crime” - and almost no one knows about it. Learn more:

Last year, we contributed to the #AutomatedRacism report on police use of predictive technology, which fuels bias and discrimination. We call for a ban on harmful tech. Join us at #CommunitiesForJustice to #StopAutomatedRacism now! ✊🏽 www.amnesty.org.uk/knowledge-hu...


#PoppyBourke joined #Liberty's legal team in 2025 to focus on privacy and technology litigation. Join us at #CommunitiesForJustice as Poppy discusses #Liberty's work, police technology, and our rights facing #AutomatedRacism and #injustice. #WeCopWatch

Police start live facial recognition trial at London stations The British Transport Police initiative has been described by campaigners as "authoritarian".

www.bbc.co.uk/news/article...
It comes as little surprise to any of us that #BTP has deployed a facial recognition trial despite a landmark legal challenge in the High Court and despite knowing the technology's harms. This is #automatedinjustice in action. This is #automatedracism.


#BashartMalik is a film director, speaker, and advocate for equality in the arts. He chairs the #DiverseArtistsNetwork and directed #IamJudah. Join us at #CommunitiesForJustice as he discusses the harms of police technology and how we can challenge #AutomatedRacism together.


#JohnPegram has been fighting for his rights since 2018. Join us at #CommunitiesForJustice as he discusses copwatching, stop-and-search practices, #AutomatedRacism, and the #OffenderManagementApp, sharing his powerful lived experiences. 🕵️‍♂️ #BristolCopWatch #WeCopWatch

Automated racism: How police data and algorithms code discrimination into policing - Police and Human Rights Resources

policehumanrightsresources.org/automated-ra...

If you haven't checked out #AmnestyInternational's #AutomatedRacism report, it's a must-read. Predictive policing systems can harm communities. Proud to have contributed to this important work. #WeCopWatch 📚✊🏾

Surveillance is not safeguarding. Think Family Education (TFE) is an app that has been in use in schools for several years; The Bristol Cable first reported on TFE in 2021. The app is connected to a council and police database.

We are shining a light on Think Family Education. Join us on 28/2/26 at the #ECC for a discussion with #BashartMalik on why #AutomatedRacism has no place in education. Please share and donate to support the film! #EndThinkSurveillance ✊📚

Stop Automated Racism Amnesty International UK found out that 3/4 of police forces across the UK are using technology to try to “predict crime”, but it is having racist and discriminatory impacts. Almost no one knows about it. Learn more:

www.amnesty.org.uk/predictive-p...

This year, we're amplifying our fight against #AutomatedRacism. Explore the #AmnestyInternational report on predictive policing to grasp its detrimental effects on our communities. 📊✨ #WeCopWatch

Road mapping paths to accountability By John Pegram, founder, caseworker, public speaker Afternoon, fellow copwatchers! As the end of 2025 approaches at a rate of knots, it seems only right to update you all on current developments wi…

We are challenging the police’s unlawful use of #AI. Join us on 28.2.26 at the #ECC to demystify and shine a light on #automatedracism #WeCopWatch

copwatchersorg.wordpress.com/2025/11/08/r...

Road mapping paths to accountability By John Pegram, founder, caseworker, public speaker Afternoon, fellow copwatchers! As the end of 2025 approaches at a rate of knots, it seems only right to update you all on current developments wi…

copwatchersorg.wordpress.com/2025/11/08/r...

Today is #HumanRightsDay. #OMA violates the presumption of innocence and Article 8 rights, breaching #PublicSectorEqualityDuty. It is racial profiling in action. John and #Liberty are holding the police accountable for #AutomatedRacism. 🚨

Bristol Copwatch November Community Meeting Join us with guest speakers in creating a space to share, learn and get involved in local community groups.

www.eventbrite.co.uk/e/bristol-co...

Our next community meeting is this Thursday at the #ECC! Join speakers John Pegram, Habib Kadiri, Ken Hinds, and Bashart Malik to discuss stopping #AutomatedRacism, anti-gang "precision" stop and search, and more! We see you because #WeCopWatch. ✊🏽

Amnesty says Bristol police prediction software 'supercharges racism' It is a damning 120-page report into the use by police of computer programmes that predict crime and criminals

www.bristolpost.co.uk/news/bristol...

We are concerned about the police's use of predictive technology and its potential harm. For a broader perspective, check out the #AutomatedRacism report. 📊 #WeCopWatch

Privatising the algorithm: When predictive policing moves beyond the state The proliferation of algorithmic technologies such as live facial recognition risks deepening the very abuses stop and search rules aim to limit

www.stop-watch.org/news-opinion...

The rise of algorithmic technologies like live facial recognition may exacerbate the very issues that stop-and-search regulations seek to address. #AutomatedInjustice #AutomatedRacism


Avon and Somerset Police are being held to account for use of technology they should never have been given access to at all. The constabulary has risk scored over 364,000 people and has said nothing. It is #automatedinjustice. It is #automatedracism in action. #WeCopWatch

Modern policing has always been predictive. We are Amnesty International UK. We are ordinary people from across the world standing up for humanity and human rights.

www.amnesty.org.uk/blogs/human-...

From Colonial Times to Modern Predictive Policing #automatedracism #automatedinjustice

Transport for London rejects Amnesty predictive policing advert Transport for London (TfL) has rejected Amnesty International's advert about predictive policing for Elephant and Castle tube station.

“It deepens wounds, and it makes you feel like you’re being criminalised.” www.selondoner.co.uk/news/1905202... #automatedracism #WeCopWatch

Resilient. Hello. I thought I should check in as I’m aware I missed a week last week writing this blog of mine. I was overseas for Easter, seeing my mum in Spain, which was, as always, good, and it was …

thelittlefighterwithabigheart.com/2025/04/26/r...

#WeCopWatch #AutomatedRacism #Trauma #NoToSection60

@stopwatchuk.bsky.social @netpol.org @amnestyuk.bsky.social


Our founding member John Pegram is now taking legal action against Avon and Somerset Police for risk scoring via the Offender Management App. This is about all of us. If you believe you have been risk scored, we'd like to hear from you.
#WeCopWatch #AutomatedRacism

Covert Racism in AI: How Language Models Are Reinforcing Outdated Stereotypes | Stanford HAI Despite advancements in AI, new research reveals that large language models continue to perpetuate harmful racial biases, particularly against speakers of African American English.

Excited about the Trump regime's push to automate federal jobs and services with AI.

#automatedracism

Stop Automated Racism Amnesty International UK found out that 3/4 of police forces across the UK are using technology to try to “predict crime”, but it is having racist and discriminatory impacts. Almost no one knows about...

Are you interested in contributing to a film highlighting the impact of #PredictivePolicing on our lives?
The #AutomatedRacism report revealed "dangerous discrimination" in police #AI tech! If you see the same issues we do, we would love to hear from you.
www.amnesty.org.uk/predictive-p...
