UK Police forces under fire from human rights group over use of crime prediction software

Wiseman Admin, 2019, Criminal Law

According to human rights organisation Liberty, police forces in the UK are using computer algorithms and other forms of crime prediction software.

The human rights group said it sent 90 Freedom of Information requests last year to discover which forces use the technology.

It is believed that at least 14 police forces in the UK have previously used, are currently using or are planning to use the programs, which assess a person's likelihood of committing crime or falling victim to it, and predict where it could be committed.

Liberty claims that the algorithms are trained on historical data gathered through biased policing practices, with the danger that the AI-led programs learn from that information and perpetuate the bias in future autonomous predictions.

Forces including the Metropolitan, Kent, West Midlands and Greater Manchester police have experimented with two kinds of software: predictive mapping programs and individual risk assessment programs.

Mapping programs are used to identify 'hot spots' of high-risk crime based on police data about past offences. Because that data reflects where arrests were made rather than where crime actually occurred, it can skew the results, presenting locations where people from black, Asian and minority ethnic (BAME) communities are more likely to be arrested as higher-crime areas.
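To see why this feedback loop matters, consider a deliberately simplified Python sketch. It is entirely hypothetical and not based on any force's actual software: five areas have identical underlying crime rates, but one starts with more recorded arrests simply because it has historically been patrolled more heavily. A mapping model that ranks areas by recorded arrests keeps flagging that area, which attracts more patrols, which generate more records, reinforcing the original bias.

```python
import random

# Hypothetical illustration only (not any force's actual system): five areas
# with the SAME underlying crime rate, but area 0 starts with more recorded
# arrests because it has historically been patrolled more heavily.
TRUE_CRIME_RATE = 10                     # incidents per area per period, identical everywhere
AREAS = 5
arrest_history = [30, 10, 10, 10, 10]    # biased starting data: area 0 over-represented

random.seed(1)

def predict_hot_spots(history, top_k=2):
    """Naive 'predictive mapping': flag the areas with the most recorded arrests."""
    ranked = sorted(range(len(history)), key=lambda a: history[a], reverse=True)
    return set(ranked[:top_k])

for period in range(5):
    hot_spots = predict_hot_spots(arrest_history)

    for area in range(AREAS):
        # Crime actually occurs at the same rate everywhere...
        incidents = TRUE_CRIME_RATE
        # ...but arrests depend on patrol presence: predicted 'hot spots' get
        # extra coverage, so more of their incidents end up in the recorded data.
        detection_rate = 0.6 if area in hot_spots else 0.3
        arrests = sum(random.random() < detection_rate for _ in range(incidents))
        arrest_history[area] += arrests

    print(f"period {period}: hot spots {sorted(hot_spots)}, history {arrest_history}")

# Area 0 keeps being flagged: the model 'learns' the original patrolling bias,
# not the underlying crime rate, and each round of patrols reinforces the record.
```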

Hannah Couchman, Advocacy and Policy Officer for Liberty, said: “Predictive policing is sold as innovation, but the algorithms are driven by data already imbued with bias, firmly embedding discriminatory approaches in the system.

“Life-changing decisions are being made about us that are impossible to challenge. In a democracy which should value policing by consent, red lines must be drawn on how we want our communities to be policed.”
