Researchers supporting flagship project on artificial intelligence in law enforcement

Researchers from the University of Northampton are part of a UK-wide team of academics who are investigating the future use of probabilistic AI in law enforcement.

Probabilistic AI uses randomness and uncertainty to make predictions, and the team behind the project wants to make sure that data used in the criminal justice system is of high quality and does not result in miscarriages of justice.
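For readers unfamiliar with the term, the short Python sketch below (our illustration with made-up numbers, not code from the project) shows what a prediction with uncertainty can look like in practice: a simple Beta-Bernoulli model reports an estimated rate together with its spread, rather than a single hard yes-or-no answer.

    def beta_bernoulli_posterior(successes: int, failures: int, prior_a: float = 1.0, prior_b: float = 1.0):
        """Return the posterior mean and standard deviation of an unknown rate."""
        a = prior_a + successes                       # updated 'success' pseudo-count
        b = prior_b + failures                        # updated 'failure' pseudo-count
        mean = a / (a + b)                            # point estimate of the rate
        var = (a * b) / ((a + b) ** 2 * (a + b + 1))  # variance of the Beta posterior
        return mean, var ** 0.5

    # Example with made-up counts: 8 'matches' out of 10 noisy historical records.
    mean, sd = beta_bernoulli_posterior(successes=8, failures=2)
    print(f"estimated rate ~ {mean:.2f} +/- {sd:.2f}")  # the uncertainty matters as much as the estimate

Reporting the spread alongside the estimate is precisely the kind of output the project wants scrutinised before it informs law-enforcement decisions.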

The four-year interdisciplinary research project, entitled ‘Probable Futures – Probabilistic AI Systems in Law Enforcement Futures’, was announced during the CogX Festival in Los Angeles – a flagship event which brought together industry leaders, changemakers and policy makers from around the world to address the question ‘how do we get the next 10 years right?’.


Dr Claire Paterson-Young, Associate Professor from the Institute for Social Innovation and Impact (ISII) at the University of Northampton, is leading the Social Impact and Participatory element of the research and is supported by ISII Researcher Michael Maher.

UON researchers ensure AI does not compromise the criminal justice system

They are part of the wider team which is led by Northumbria Law School’s Professor Marion Oswald MBE and includes researchers from Glasgow, Leicester, Cambridge and Aberdeen.

Dr Paterson-Young said: “There’s no doubt that artificial intelligence has brought about new ways of doing things and, as it develops, it will have a significant impact on the criminal justice systems across the globe.

“That impact is unknown at present so now is the right time to make sure we have robust ways to scrutinise future developments to ensure that the efficiency and precision of policing is not compromised and that trust in the criminal justice system is not degraded.”


Professor Oswald added: “Our project, working alongside our law enforcement, third sector and commercial partners, will develop a framework to understand the implications of uncertainty and to build confidence in future Probabilistic AI in law enforcement, with the interests of justice and responsibility at its heart.

“We’re excited to work with RAi UK and the other Keystone projects to consider the implications of our research and our framework for other complex domains with probabilistic AI, such as healthcare, and to learn from their research.”

Professor Gopal Ramchurn, Chief Executive Officer of RAi UK, said they were excited to be launching the project in Los Angeles in the presence of sector professionals from around the world and added: “The concerns around AI are not just for governments and industry to deal with. It is important that AI experts engage with researchers from other disciplines and policy makers to ensure that we can better anticipate the issues that will be caused by AI.

“Our keystone projects will do exactly that and work with the rest of the AI ecosystem to bring others to our cause and amplify the impact of the research to maximise the benefit of AI to everyone in society”.


Funding has been awarded by Responsible AI UK (RAi UK), and the project forms one of the pillars of its £31 million programme, which will run for four years. RAi UK is backed by UK Research and Innovation (UKRI) through the UKRI Technology Missions Fund and EPSRC.

Find out more at the Responsible AI UK website.