Posts by PROBabLE Futures: Probabilistic AI Systems in Law Enforcement

PROBabLE Futures were pleased to see Prof. Marion Oswald, MBE, support the 2026 NPCC CCTV and PCMA Conference through her contribution to the programme.
Marion’s presentation drew on her extensive experience and insight, highlighting the critical importance of CCTV and video evidence within investigations.
Sharing knowledge openly and honestly is at the heart of PROBabLE Futures’ work, and opportunities like this conference are vital for strengthening understanding, confidence, and good practice across the policing and investigative community.
Events such as these reinforce the collective commitment to ensuring that the use of video technologies is effective, proportionate, and trusted, and it was encouraging to see such strong engagement from colleagues across the sector.
We are grateful to the National Police Chiefs' Council and PCMA organisers for the invitation and for the photographs captured during Marion’s session. It is always a privilege to contribute to conversations that shape the future of evidence, technology, and public trust.
Great to see work connected to National Police Chiefs' Council priorities and the wider Responsible AI UK programme helping to shape evidence‑led, responsible approaches to emerging technologies.
#PROBabLEFutures #RAiUK #NPCC #PCMA #CCTV #VideoEvidence #DigitalEvidence #Policing #CriminalJustice

Great to see Prof. Marion Oswald, MBE, Temitope Lawal, PhD and Angela Paul, PhD (PROBabLE Futures) have their article published by the Centre for Emerging Technology and Security (CETaS):
'Challenges of Remote AI Weapons Detection in Policing'
Their article explores some of the key technical, ethical and operational considerations around the use of AI in policing:
👉 cetas.turing.ac.uk/publications...
Many thanks to CETaS for publishing this timely research, and congratulations to Marion, Temi and Angela on this contribution.
#CETaS #NPCC #PROBabLEFutures #RAiUK #Policing #EmergingTechnology #ResearchImpact

We’ve submitted PROBabLE Futures’ response to the Civil Justice Council consultation on “The Use of AI for Preparing Court Documents” - Dr. Kyri N. Kotsoglou / Temitope Lawal, PhD / Prof. Marion Oswald, MBE
Our response argues that existing principles of professional responsibility already provide a clear and workable framework for accountability. The key question isn’t whether AI was used, but who is responsible for what is submitted to the court.
We also raise serious concerns about AI involvement in witness statements, translation, and expert evidence, and caution against over‑prescriptive disclosure rules that risk distraction rather than clarity.
Read the full response here: research.northumbria.ac.uk/probablefutu...

📢 New Consultation Response Published
We’ve submitted our response to the UK Government’s consultation on a new legal framework for law enforcement use of biometrics and facial recognition.
Our team highlights the need for a future‑proof, risk‑based regulatory model that ensures scientific validity, human oversight, and strong public accountability, across both public and private actors.
🔗 Read the full response: research.northumbria.ac.uk/probablefutu...
#Biometrics #FacialRecognition #TechGovernance #AIRegulation #ResponsibleAI #PROBabLEFutures #RAiUK

PROBabLE Futures PhD Researchers Nneoma Ogbonna and Aleeyah Mahmood explore whether the UK’s upcoming facial recognition regulation should extend beyond policing to cover private‑sector use.
With FRT now common in shopping centres, transport hubs and retail spaces, the paper highlights rising risks around bias, misidentification, opaque watchlists and blurred accountability, making the case for actor‑neutral, risk‑based regulation.
👏 Huge well done to Nneoma and Aleeyah for this important contribution to the national conversation on biometric governance.
🔗 Read the paper: research.northumbria.ac.uk/probablefutu...
#RAi #PROBabLEFutures #FacialRecognition #Biometrics #TechGovernance #AIRegulation #DigitalRights #SurveillanceStudies #ResponsibleAI #AlgorithmicAccountability #PublicTrust #TechPolicy

Update from the Canterbury Church Policing Research (CCPR) Conference 2026
Nneoma attended the CCPR Conference at Canterbury Christ Church University this week and shared some valuable reflections from a packed programme on policing culture, wellbeing, leadership, and AI.
Key takeaways included:
• Honest discussions on the limits of tools like ILAS, including false positives/negatives and limited workload reduction
• Encouraging conversations with the Swiss Police Institute around AI use, emerging guidance, and future international collaboration
• Repeated concerns about training culture in policing and what this means for the practical application of AI guidance
• A cross‑jurisdictional perspective on human‑rights impact assessments of AI
• Emerging risks around AI‑hacktivism and the vulnerability of law‑enforcement AI systems
• Strong endorsement of the Responsible AI Checklist developed by Marion and Muffy, and insights from Project Odyssey mapping AI literacy across UK policing
🏆 And a final win: Nneoma took home the poster prize!
A really insightful conference with important implications for responsible AI in policing.
🔗 Read the full blog here: research.northumbria.ac.uk/probablefutu...
#CCPR2026 #PolicingResearch #AppliedResearch #EvidenceBasedPolicing #RAi

This piece reflects on insights from our recent PROBabLE Futures workshop, where experts from tech, justice, and academia explored how historical lessons can guide responsible AI adoption in policing.
From analogies like the Police Dog model to critical discussions on Large Language Models and governance frameworks, the blog highlights why learning from past technologies is key to managing the speed and scale of AI today.
#ResponsibleAI #LawEnforcement #RAi #PROBabLEFutures #Collaboraite #Palantir #OSIRT #Microsoft #PAConsulting