"One key risk to mitigate is bias and discrimination in algorithmic systems, which may reflect historical inequalities. This can lead to unfair outcomes for individuals." The #ICO agree then that the police's use of #AI is unlawful. #StopAutomatedRacism
ico.org.uk/for-organisa...
"It's not Robocop." We think the point is being missed. Addressing #MachineBias is not enough. We need police transparency and accountability regarding their tools and community impact. Public scrutiny is essential. #StopAutomatedRacism 🕵️♂️⚖️ www.theguardian.com/technology/2...
Last year, we contributed to the #AutomatedRacism report on police use of predictive technology, which fuels bias and discrimination. We call for a ban on harmful tech. Join us at #CommunitiesForJustice to #StopAutomatedRacism now! ✊🏽 www.amnesty.org.uk/knowledge-hu...
copwatchersorg.wordpress.com/2025/11/08/r...
"We’re a frightening distance from anything close to accountability." Always have been. #WeCopWatch #StopAutomatedRacism
"Critics worry about racial and socioeconomic biases, mass-surveillance privacy violations, and a lack of transparency. " everywhere we go, it's the same #policestory #stopautomatedracism #WeCopWatch
www.thestreet.com/technology/w...
www.thecanary.co/uk/analysis/...
There is no way for predictive policing to be done in an ethical manner. This approach inherently involves punishing individuals for crimes that have not yet occurred, creating a chilling and dystopian reality. #stopautomatedracism #automatedinjustice
#SeanMorrison is the investigations lead at the #BristolCable. Join us at #CommunitiesForJustice as Sean discusses #Section60 and shows you how to raise a #SAR to challenge #predictivepolicing. #WeCopWatch #stopautomatedracism
www.openrightsgroup.org/campaign/res...
Data and content are being weaponised to unjustly criminalise people, driven by facial recognition tech, AI, and surveillance. 🚨 #WeCopWatch #automatedinjustice #stopautomatedracism
Tickets are now on sale for #CommunitiesForJustice! Join us for Know Your Rights workshops, speaker panels, and discussions. Pay what you can! #WeCopWatch #StopAutomatedRacism
www.eventbrite.co.uk/e/communitie...
publiclawproject.org.uk/latest/publi...
Ensuring fairness in automated decisions requires transparency, accountability, and community engagement. Policing by machine is a failing experiment and not fit for purpose. #StopAutomatedRacism #WeCopWatch
www.eventbrite.co.uk/e/communitie...
Tickets are now on sale for our upcoming event #CommunitiesForJustice, offered on a "pay what you can" sliding scale. Join us for Know Your Rights workshops, speaker panels, and discussions in two rooms! #WeCopWatch #stopautomatedracism
www.eventbrite.co.uk/e/communitie...
Join us for an #AICrimeMapping discussion at #CommunitiesForJustice on 28/2/26. We're aware of the dangers and harms this tech poses to communities. #MassSurveillance and #criminalisation are not going to make anyone safer. #stopautomatedracism
www.gov.uk/government/n...
www.theguardian.com/technology/2...
#Liberty will address facial recognition and #PredictivePolicing at our event, #CommunitiesForJustice, on February 28, 2026, at the #ECC. These technologies threaten our civil liberties and human rights. #StopAutomatedRacism ✊🏽
copwatchersorg.wordpress.com/2025/11/08/r...
"Shouldn’t AI ethics frameworks involve diverse committees, including those with lived experience of the harms of policing? There has been no transparency or consultation with any of us." #stopautomatedracism
#OffenderManagementApp #Liberty
www.stop-watch.org/news-opinion...
Algorithmic policing and section 60 practices are known for their racial biases and abuses. Tom Dixon argues that newer technologies like #AI and #LFR exacerbate this issue. #WeCopWatch #stopautomatedracism
If you believe you have been impacted by predictive policing in your community or through the education system, we would like to hear from you.
We are building a roadmap and community path to justice. We see you because #WeCopWatch #EndThinkSurveillance #StopAutomatedRacism
copwatchersorg.wordpress.com/2025/11/08/r...
Bristol Copwatch advocates for justice and police accountability. Name the only sentence without a full stop, and you’ll find the Offender Management App. #WeCopWatch #stopautomatedracism
"There is the impact such technology has on our lives, and the harms of policing we must consider. Perpetual criminalisation is something I went through in my youth.." #stopautomatedracism
copwatchersorg.wordpress.com/2025/11/08/r...
If you're considering action against predictive policing or believe you've been risk-scored by the Offender Management App, check out our Sunday long read, "Road mapping paths to accountability." You'll never walk alone. #StopAutomatedRacism
Look out for our latest blog post, "Road mapping paths to accountability", a follow-up to "Predictive policing and subject access requests" over the weekend. #WeCopWatch #stopautomatedracism
copwatchersorg.wordpress.com/2024/04/23/p...
justice-equity-technology.org/predictive-p...
This article is worth your time if you want to learn how to mobilise your community against predictive policing. Stay tuned for updates on our blog this weekend! #WeCopWatch #EndThinkSurveillance #StopAutomatedRacism
The police are creating biased and discriminatory intelligence profiles based on individuals' pasts, assuming mistakes will be repeated and ignoring the possibility of change. #WeCopWatch #StopAutomatedRacism
www.amnesty.org.uk/predictive-p...
www.openrightsgroup.org/blog/why-pre...
The UK Government is attempting to leverage algorithms to forecast which individuals may become violent offenders, utilising sensitive personal data from hundreds of thousands of people. #automatedinjustice #stopautomatedracism
www.amnesty.org.uk/blogs/human-...
In today's world, advanced technologies and algorithms are enhancing the predictive capabilities of policing while entrenching and amplifying institutional racism and discrimination. #StopAutomatedRacism
publiclawproject.org.uk/latest/publi...
"A new project from Public Law Project, funded by the Nuffield Foundation, will investigate how transparency mechanisms can be adapted to facilitate fair, lawful, and non-discriminatory automated decision-making in the public sector." #stopautomatedracism
In 2021, four residents sued the county, resulting in a settlement in which the sheriff's office acknowledged violations of their rights. It's the same police story here: the Offender Management App violates human rights. #StopAutomatedRacism
theconversation.com/predictive-p...
docs.google.com/document/d/1...
Start from here. #WeCopWatch #StopAutomatedRacism