May 28 2019
Cops Using ‘Minority Report’ Predictive Policing Could See INNOCENT People FRAMED
Police forces are using “Minority Report-style” computer programs to predict where and when crime will happen – and innocent people are being targeted.
The sci-fi movie Minority Report depicted a world in which police arrested people for crimes that had been predicted but not yet committed. Now, according to campaign group Liberty, real life is coming to resemble that fiction.
A report by Liberty called "Policing by Machine" reveals the widespread use of biased "predictive policing", which the group claims threatens everyone's rights and freedoms.
The report collates the responses to 90 Freedom of Information requests sent to every police force in the UK.
There are two types of predictive policing: predictive mapping programs and individual risk assessment programs. Both kinds of program also "learn" over time, becoming more autonomous in the predictions they make without being explicitly reprogrammed.
Predictive mapping programs use police data about past arrests to identify high-risk "hot spots", and officers are then directed to patrol those areas.
But Liberty says these tend to be areas that already experience “over-policing”.
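The feedback loop Liberty describes can be illustrated with a toy simulation (the numbers and the allocation rule here are hypothetical, not any real force's system). Two areas have an identical underlying crime rate, but one starts with more recorded arrests because it was historically patrolled more heavily. If patrols are then allocated in proportion to recorded arrests, the biased starting data perpetuates itself:

```python
# Hypothetical illustration of a predictive-mapping feedback loop:
# two areas with the SAME underlying crime rate, but area A begins
# with more recorded arrests due to heavier historical patrolling.
true_crime_rate = {"A": 0.5, "B": 0.5}   # identical in both areas
recorded_arrests = {"A": 60, "B": 40}    # biased historical data

TOTAL_PATROLS = 100

for year in range(5):
    total = sum(recorded_arrests.values())
    for area in recorded_arrests:
        # "Predictive mapping": patrols allocated in proportion to past arrests.
        patrols = TOTAL_PATROLS * recorded_arrests[area] / total
        # More patrols in an area -> more of its (identical) crime gets recorded.
        recorded_arrests[area] += int(patrols * true_crime_rate[area])

share_a = recorded_arrests["A"] / sum(recorded_arrests.values())
print(f"Area A's share of recorded arrests after 5 rounds: {share_a:.0%}")
# prints "Area A's share of recorded arrests after 5 rounds: 60%"
```

Despite both areas having exactly the same crime rate, area A keeps attracting 60% of the patrols and 60% of the recorded arrests year after year: the model never corrects the initial bias, because its training data is a product of where police chose to look.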
Hannah Couchman, policy and campaigns officer at Liberty, told Daily Star Online: “It’s not the case that over-policing is a response to higher crime levels, it’s that police resources are being disproportionately focused on certain areas and certain groups – for example, stop and search powers are disproportionately deployed against people from BAME communities.
“This biased approach to policing is also reflected in the data collected by police, and this influences future strategy.”
Individual risk assessment programs use data to predict how a person will behave, and even whether they are likely to commit certain crimes or fall victim to them.
She added: "Predictive mapping feeds off historical crime data, and this doesn't present a true picture of where crime is happening.
"There is a case for using data to address issues in policing and criminal justice, including bias, but rights issues surrounding privacy, free expression and discrimination are not being properly addressed in a way that makes this tech part of a legitimate policing strategy."
Liberty says the biggest problem with these programs is the lack of transparency.
Police claim that a human will oversee the computer programs, but even the officers deploying the technology will be unable to fully explain how it arrives at its conclusions.
This means people can't hold the programs to account or properly challenge the predictions they make.