Report claims GMP's use of algorithms and data is 'supercharging racism'
ITV News' Tasha Kacheri explains what the report from Amnesty International found and how they say 'predictive policing' is discriminating against certain groups.
A new report from Amnesty International UK claims to have exposed the grave dangers to society posed by ‘predictive policing’ systems and technology used across almost three quarters of the UK’s police forces.
They say the report – ‘Automated Racism – How police data and algorithms code discrimination into policing’ – demonstrates how these systems are in flagrant breach of the UK’s national and international human rights obligations.
Amnesty found that at least 33 police forces – including Greater Manchester Police - across the UK have used predictive profiling or risk prediction systems. Of these forces, 32 have used geographic crime prediction, profiling, or risk prediction tools, and 11 forces have used individual prediction, profiling, or risk prediction tools.
Amnesty categorises the predictive systems into two types:
Location-Based Prediction is used to measure the likelihood of crimes being committed in geographic locations in the future.
Individual Profiling is used to flag individuals as potential offenders, based on their past actions. These profiles are not public and Amnesty claims they often do not rely on concrete evidence.
Amnesty argues that areas such as Manchester with high populations of Black and racialised people are repeatedly targeted by police, leading to more incidents being recorded which are then fed into the algorithms to create a "vicious cycle".
Amnesty's Director for Racial Justice Ilyas Nagdee said: "It's a vicious cycle. We're seeing that over-policing in particular communities is leading to discriminatory data which is then fed into the systems.
"That cycle repeats and repeats compounding discrimination for these marginalised communities and risking human rights violations."
The report also claims that Greater Manchester Police uses 'gang profiling' to identify suspicious individuals who could be affiliated with gangs and have the potential to commit crimes.
Amnesty claims that these profiles rely on suspicion or even ‘perception’ without objective evidence of offending.
Greater Manchester Police deny working from any gang profiling database, saying they instead work with communities to identify vulnerable people and potential offenders.
Sacha Deshmukh, Chief Executive at Amnesty International UK, said: "The use of predictive policing tools violates human rights. The evidence that this technology keeps us safe just isn't there. The evidence that it violates our fundamental rights is clear as day.
“These systems have been built with discriminatory data and only serve to supercharge racism.”
The report also highlights that predictive policing practices can lead to mass surveillance and infringement on freedoms such as privacy, expression, and association. It further warns that individuals and communities subjected to these systems may have no clear means to challenge law enforcement decisions based on algorithmic risk assessments.
While Amnesty’s findings are critical, the police maintain that data-driven tools help allocate resources efficiently and prevent crime.
In a statement, the National Police Chiefs’ Council acknowledged institutional racism within policing and emphasised ongoing efforts to address disparities.
One element of this so-called gang profiling is the tactic of banning people from events in Manchester because they are perceived to be linked with gangs.
In one instance, the XCalibre Task Force sought to exclude people from a cultural event based on its data-based profiling of their alleged involvement in gangs.
Amnesty and the charity Kids of Colour say Greater Manchester Police used discriminatory data when it sent letters in 2022 banning several young people from Manchester’s Caribbean Carnival.
Greater Manchester Police say they have not used banning letters in the past two years and have no plans to use them again.
The chair of the National Police Chiefs’ Council has previously admitted that policing is ‘institutionally racist’ – but says "hotspot policing and visible targeted patrols are the bedrock of community policing."
An NPCC spokesperson said: "Policing uses a wide range of data to help inform its response to tackling and preventing crime, maximising the use of finite resources. As the public would expect, this can include concentrating resources in areas with the most reported crime.
“We are working hard to improve the quality and consistency of our data to better inform our response, ensuring that all information and new technology is held and developed lawfully, ethically and in line with the Data Ethics Authorised Professional Practice (APP)."
In the year ending March 2023 there were 24.5 stops and searches for every 1,000 Black people, 9.9 stops and searches for every 1,000 people with mixed ethnicity, 8.5 for every 1,000 Asian people – and 5.9 for every 1,000 white people.
The majority of stops and searches in the UK – 69% – lead to no further action.
Deshmukh added: “No matter our postcode or the colour of our skin, we all want our families and communities to live safely and thrive.
“These tools to 'predict crime' harm us all by treating entire communities as potential criminals, making society more racist and unfair."
Amnesty International is urging the UK Government and devolved administrations to prohibit the use of predictive policing systems. The organisation’s recommendations include:
A ban on predictive policing technologies.
Greater transparency, including a public register detailing the systems used by authorities.
Clear accountability mechanisms to allow individuals to challenge decisions influenced by predictive algorithms.
In response to the report, a Greater Manchester Police spokesperson said: “Our priority is preventing crime and stopping people coming to harm. We have a range of ways we proactively do this, including engaging with communities and working with partners and charities to divert people away from criminality.
“Proactive policing is a particularly vital part of how we have brought violent crime down so 1,600 fewer victims came to harm last year across GM. We focus our local and specialist resources in the areas where a combination of community information and recent reporting suggests there is risk of people being subjected to harm.
“The work of the XCalibre Task Force in the past two decades has seen us engage with areas of Manchester where violent crime has previously blighted communities. It does not work from any ‘gang profiling’ database and instead uses information from the community to inform the life-changing work it continues to achieve alongside local community groups and agencies.”
An NPCC spokesperson said: "It is our responsibility as leaders to ensure that we balance tackling crime with building trust and confidence in our communities whilst recognising the detrimental impact that tools such as stop and search can have, particularly on Black people. The Police Race Action Plan is the most significant commitment ever by policing in England and Wales to tackle racial bias in its policies and practices, including an ‘explain or reform’ approach to any disproportionality in police powers.
“The national plan is working with local forces and driving improvements in a broad range of police powers, from stop and search and the use of Taser through to officer deployments and road traffic stops. The plan also contains a specific action around data ethics, which has directly informed the consultation and equality impact assessment for the new APP.”