#StopAutomatedRacism
What is ADM? 31 March 2026 - We have updated this draft guidance to reflect changes to the UK GDPR following the Data (Use and Access) Act 2025 (DUAA).

"One key risk to mitigate is bias and discrimination in algorithmic systems, which may reflect historical inequalities. This can lead to unfair outcomes for individuals." The #ICO agree then that the police's use of #AI is unlawful. #StopAutomatedRacism

ico.org.uk/for-organisa...

Post image

#WeCopWatch #databreach #stopautomatedracism #ReimaginePublicSafety

Preview
Police AI chief admits crime-fighting tech will have bias but vows to tackle it Exclusive: NCA’s Alex Murray says he hopes new £115m police AI centre can limit unfairness found in tools

"It's not Robocop." We think the point is being missed. Addressing #MachineBias is not enough. We need police transparency and accountability regarding their tools and community impact. Public scrutiny is essential. #StopAutomatedRacism 🕵️‍♂️⚖️ www.theguardian.com/technology/2...

Preview
Automated Racism @AmnestyUK found out that 3/4 of police forces across the UK are using technology to try to “predict crime” - and almost no one knows about it. Learn more:

Last year, we contributed to the #AutomatedRacism report on police use of predictive technology, which fuels bias and discrimination. We call for a ban on harmful tech. Join us at #CommunitiesForJustice to #StopAutomatedRacism now! ✊🏽 www.amnesty.org.uk/knowledge-hu...

Preview
Road mapping paths to accountability By John Pegram, founder, caseworker, public speaker Afternoon, fellow copwatchers! As the end of 2025 approaches at a rate of knots, it seems only right to update you all on current developments wi…

copwatchersorg.wordpress.com/2025/11/08/r...

"We’re a frightening distance from anything close to accountability." Always have been. #WeCopWatch #StopAutomatedRacism

Preview
When AI meets law enforcement: The future of predictive policing New artificial intelligence systems are reshaping how police operate, offering efficiency gains while sparking debate among legal and ethics experts.

"Critics worry about racial and socioeconomic biases, mass-surveillance privacy violations, and a lack of transparency. " everywhere we go, it's the same #policestory #stopautomatedracism #WeCopWatch

www.thestreet.com/technology/w...

Preview
Labour's plans for predictive policing are full of the usual racism There is no way for predictive policing to be done in an ethical manner - punishing people for crimes that haven't happened is dystopian

www.thecanary.co/uk/analysis/...

There is no way for predictive policing to be done in an ethical manner. This approach inherently involves punishing individuals for crimes that have not yet occurred, creating a chilling and dystopian reality. #stopautomatedracism #automatedinjustice

Post image

#SeanMorrison is the investigations lead at the #BristolCable. Join us at #CommunitiesForJustice as Sean discusses #Section60 and shows you how to raise a #SAR to challenge #predictivepolicing. #WeCopWatch #stopautomatedracism

Preview
End Pre-Crime Data and content is being weaponised to criminalise people without cause, fuelled by facial recognition technology, AI and surveillance.

www.openrightsgroup.org/campaign/res...

Data and content are being weaponised to unjustly criminalise people, driven by facial recognition tech, AI, and surveillance. 🚨 #WeCopWatch #automatedinjustice #stopautomatedracism

Post image

Tickets are now on sale for #CommunitiesForJustice! Join us for Know Your Rights workshops, speaker panels, and discussions. Pay what you can! #WeCopWatch #StopAutomatedRacism
www.eventbrite.co.uk/e/communitie...

Preview
Public law litigation in the automated state - Public Law Project How can we use transparency mechanisms to make sure the public sector's automated decision-making is fair and lawful?

publiclawproject.org.uk/latest/publi...

Ensuring fairness in automated decisions requires transparency, accountability, and community engagement. Policing by machine is a failing experiment and not fit for purpose. #StopAutomatedRacism #WeCopWatch

Post image

www.eventbrite.co.uk/e/communitie...
Tickets are now on sale for our upcoming event #CommunitiesForJustice, offered on a "pay what you can" sliding scale. Join us for Know Your Rights workshops, speaker panels, and discussions in two rooms! #WeCopWatch #stopautomatedracism

Preview
AI to help police catch criminals before they strike Government launches AI crime prevention challenge to support safer streets.

Join us for an #AICrimeMapping discussion at #CommunitiesForJustice on 28/2/26. We're aware of the dangers and harms this tech poses to communities. #MassSurveillance and #criminalisation are not going to make anyone safer. #stopautomatedracism

www.gov.uk/government/n...

Preview
Home Office admits facial recognition tech issue with black and Asian subjects Calls for review after technology found to return more false positives for ‘some demographic groups’ on certain settings

www.theguardian.com/technology/2...
#Liberty will address facial recognition and #PredictivePolicing at our event, #CommunitiesForJustice, on February 28, 2026, at the #ECC. These technologies threaten our civil liberties and human rights. #StopAutomatedRacism ✊🏽

Preview
Road mapping paths to accountability By John Pegram, founder, caseworker, public speaker Afternoon, fellow copwatchers! As the end of 2025 approaches at a rate of knots, it seems only right to update you all on current developments wi…

copwatchersorg.wordpress.com/2025/11/08/r...

"Shouldn’t AI ethics frameworks involve diverse committees, including those with lived experience of the harms of policing? There has been no transparency or consultation with any of us." #stopautomatedracism
#OffenderManagementApp #Liberty

Preview
Suspicion by design: How AI and section 60 are reshaping everyday policing Both algorithmic and section 60 policing have well-documented racial biases and abuses. Now, newer technologies such as AI and LFR add another layer, argues Tom Dixon

www.stop-watch.org/news-opinion...

Algorithmic policing and section 60 practices are known for their racial biases and abuses. Tom Dixon argues that newer technologies like #AI and #LFR exacerbate this issue. #WeCopWatch #stopautomatedracism

Preview
ALT: a black background with the words hold police accountable in red

If you believe you have been impacted by predictive policing in your community or through the education system, we would like to hear from you.

We are building a roadmap and community path to justice. We see you because #WeCopWatch #EndThinkSurveillance #StopAutomatedRacism

Preview
Road mapping paths to accountability By John Pegram, founder, caseworker, public speaker Afternoon, fellow copwatchers! As the end of 2025 approaches at a rate of knots, it seems only right to update you all on current developments wi…

copwatchersorg.wordpress.com/2025/11/08/r...
Bristol Copwatch advocates for justice and police accountability. Name the only sentence without a full stop, and you’ll find the Offender Management App. #WeCopWatch #stopautomatedracism


"There is the impact such technology has on our lives, and the harms of policing we must consider. Perpetual criminalisation is something I went through in my youth.." #stopautomatedracism

Preview
Road mapping paths to accountability By John Pegram, founder, caseworker, public speaker Afternoon, fellow copwatchers! As the end of 2025 approaches at a rate of knots, it seems only right to update you all on current developments wi…

copwatchersorg.wordpress.com/2025/11/08/r...
If you're considering action against predictive policing or believe you've been risk-scored by the Offender Management App, check out our Sunday long read, "Road mapping paths to accountability." You'll never walk alone. #StopAutomatedRacism

Preview
Predictive Policing and Subject Access Requests By John Pegram, case worker, public speaker, founder If you’ve been keeping tabs on all things police in Bristol and across Avon and Somerset you’d be right in thinking there’s a …

Look out for our latest blog post, "Road mapping paths to accountability", a follow-up to "Predictive policing and subject access requests" over the weekend. #WeCopWatch #stopautomatedracism

copwatchersorg.wordpress.com/2024/04/23/p...

Preview
Predictive Policing and Community Organising – Justice, Equity & Technology

justice-equity-technology.org/predictive-p...

This article is worth your time if you want to learn how to mobilise your community against predictive policing. Stay tuned for updates on our blog this weekend! #WeCopWatch #EndThinkSurveillance #StopAutomatedRacism

Preview
Stop Automated Racism Amnesty International UK found out that 3/4 of police forces across the UK are using technology to try to “predict crime”, but it is having racist and discriminatory impacts. Almost no one knows about it. Learn more:

The police are creating biased and discriminatory intelligence profiles based on individuals' pasts, assuming mistakes will be repeated and ignoring the possibility of change. #WeCopWatch #StopAutomatedRacism

www.amnesty.org.uk/predictive-p...

Preview
Why ‘Predictive’ Policing Must be Banned The UK Government is trying to use algorithms to predict which people are most likely to become killers using sensitive personal data of hundreds of thousands of people.

www.openrightsgroup.org/blog/why-pre...

The UK Government is attempting to use algorithms to forecast which individuals may become violent offenders, utilising sensitive personal data from hundreds of thousands of people. #automatedinjustice #stopautomatedracism

Preview
Modern policing has always been predictive. We are Amnesty International UK. We are ordinary people from across the world standing up for humanity and human rights.

www.amnesty.org.uk/blogs/human-...

In today's world, advanced technologies and algorithms are enhancing the police's predictive capabilities while entrenching and amplifying institutional racism and discrimination. #StopAutomatedRacism

Preview
Public law litigation in the automated state - Public Law Project How can we use transparency mechanisms to make sure the public sector's automated decision-making is fair and lawful?

publiclawproject.org.uk/latest/publi...

"A new project from Public Law Project, funded by the Nuffield Foundation, will investigate how transparency mechanisms can be adapted to facilitate fair, lawful, and non-discriminatory automated decision-making in the public sector." #stopautomatedracism

Preview
Predictive policing AI is on the rise − making it accountable to the public could curb its harmful effects AI that anticipates where crimes are likely to occur and who might commit them has a troubling track record. Democratic accountability could shine a light on the technology and how it’s used.

In 2021, four residents sued the county, resulting in a settlement in which the sheriff's office acknowledged rights violations. It's the same police story: the Offender Management App violates human rights. #StopAutomatedRacism

theconversation.com/predictive-p...

Preview
Predictive Policing Subject Access Request SAR Template WHAT IS A DATA SUBJECT ACCESS REQUEST? A data subject access request (DSAR) is a request you can make to an organisation to find out what information they hold about you (known as your ‘personal data...

docs.google.com/document/d/1...

Start from here. #WeCopWatch #StopAutomatedRacism
