School of Law

UK police forces ‘supercharging racism’ with predictive policing

An interview from Amnesty International with Dr Daragh Murray was featured in an article for Computer Weekly on predictive police systems in the UK.

[Image: Two London Metropolitan Police officers in high-vis jackets seated on parked motorcycles.]

Predictive policing systems are used by police forces across the UK. They rely on artificial intelligence and algorithms, fed by police datasets, to assess the likelihood of criminal behaviour in individuals or geographic locations. A new report from Amnesty International has found that these systems are "supercharging racism" by unfairly targeting poor and racialised communities, which have been vastly overrepresented in past police data. Amnesty claims this creates a negative feedback loop: certain groups and areas are policed more heavily, generating yet more data that is fed back into the systems and reinforces pre-existing discrimination.

In an Amnesty interview referenced in the article, Dr Daragh Murray, Senior Lecturer and IHSS Fellow at Queen Mary University of London, argues that these systems are harmful because they are based on correlation rather than causation. “Essentially you’re stereotyping people, and you’re mainstreaming stereotyping, you’re giving a scientific objective to stereotyping,” he said.

Read the full article on Computer Weekly.

