Perils of predictive policing
Amnesty publishes a report warning of the perils of predictive policing
February 2025
Many TV detective series have technology at their core as our heroes vigorously pursue the wrongdoers. CCTV footage is scrutinised for the criminals' movements, DNA evidence is obtained and, of course, fingerprints are taken. The storylines of countless detective series feature forensic evidence as a key component of police detection. These series are reassuring: they show law enforcement officers using every technique, scientific and technological, to keep us all safe and lock up the bad guys. Using science and algorithms to enable police forces to predict crime must surely be a good idea?
It is not. The Amnesty report, and other research, explain the problems and risks in detail. One of the persistent biases in the justice system is racism, and Keon West's book The Science of Racism (Picador, 2025) is worth reading on this. The author takes the reader through copious peer-reviewed research, conducted over many years in different countries, demonstrating the extent of racism. Examples include many CV studies (US: résumé) in which identical CVs, differing only in names that signal the candidates' ethnicity, produce markedly different results. There are similar examples from medicine and academia. Racism is endemic and persists. As Keon West acknowledges, a similar book could be written about how women are treated differently.
The Amnesty report notes that Black people are twice as likely to be arrested, three times as likely to be subject to force and four times as likely to be subject to stop and search as white people. With such bias in place, the risk is that predictive policing simply perpetuates existing prejudice. The concern partly centres on the use of skin colour, where people live and socio-economic background as predictive inputs.
People have a deep faith in technology. On a recent Any Answers? programme (on BBC Radio 4), in a debate about the death penalty and the problem of mistakes, several people showed a touching faith in DNA in particular, implying that mistakes cannot happen. People are mesmerised by the white-suited forensic officers on television, who convey a sense of science and certainty. Technology, however, is only as good as the human systems that use it. There have been many wrongful arrests and prison sentences of innocent people despite DNA, fingerprints, CCTV and all the rest. Mistakes are made. The worry is that predictive policing could amplify discrimination.
People who are profiled have no way of knowing that they have been. Details of the systems the police and others are using need to be published; the report notes that the police are reluctant to do this. What is the legal basis for effectively labelling people because of their skin colour, where they live and their socio-economic status?
The police are keen on the idea, and around 45 forces use it. The evidence for its effectiveness is doubtful; the risks are considerable.
#policing #predictivePolicing #Racism