Apple plans to scan iPhones for child sexual abuse images

Apple has unveiled plans to scan photo libraries stored on iPhones in the US for known images of child sexual abuse.

The technology, called neuralMatch, will scan images before they are uploaded to iCloud Photos and compare them against a database of known child abuse imagery compiled by the National Center for Missing and Exploited Children (NCMEC).

The known images will be converted into "hashes", numerical codes that can be "matched" to an image on an Apple device.

If a strong enough "match" is found, a human reviewer will assess the images and report the user to law enforcement, the company said.
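Apple has not published the matching code, but the flow described above (hash each photo, compare the hash against a database of known hashes, and escalate to human review only once enough matches accumulate) can be sketched roughly as follows. The hash function, database contents and threshold below are placeholder assumptions for illustration, not Apple's actual algorithm or parameters.

```python
import hashlib
from pathlib import Path

# Placeholder hash: the real system uses a perceptual hash of the image's
# content that tolerates edits; a cryptographic digest is used here only
# to illustrate the flow of hashing photos and checking them against a
# database of known-bad hashes.
def image_hash(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

# Hypothetical set of hashes of known abuse imagery (in the real system,
# derived from the NCMEC database).
KNOWN_HASHES: set[str] = set()

# Hypothetical threshold: how many matches an account must accumulate
# before it is surfaced to a human reviewer. The value is an arbitrary
# example, not a figure stated in the article.
MATCH_THRESHOLD = 30

def count_matches(photo_library: list[Path]) -> int:
    """Count how many photos in the library match the known-hash database."""
    return sum(1 for photo in photo_library if image_hash(photo) in KNOWN_HASHES)

def should_escalate(photo_library: list[Path]) -> bool:
    # Only accounts whose match count crosses the threshold would ever
    # reach human review; a single stray match stays below it.
    return count_matches(photo_library) >= MATCH_THRESHOLD
```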

The user’s account will be disabled and the NCMEC notified if child abuse is confirmed.

The system will be rolled out first in the US this autumn, with plans for a wider introduction, including in the UK.

Apple said the system had an "extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account".

In a blog post, the technology giant said the technology is part of a new suite of child protection systems that would "evolve and expand over time".

It added that the technology will also be able to detect edited but visually similar versions of the original images.
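That robustness to edits is characteristic of perceptual hashing, in which visually similar images produce hashes that differ in only a few bits. As a rough illustration (not Apple's actual algorithm), two such bit-string hashes could be compared by Hamming distance, with anything inside a small tolerance treated as a match.

```python
# Illustrative only: compare two 64-bit perceptual hashes by Hamming
# distance, so a cropped, recompressed or recoloured copy whose hash
# differs in only a few bits still counts as a match. The 10-bit
# tolerance is an arbitrary example value, not an Apple parameter.
def hamming_distance(hash_a: int, hash_b: int) -> int:
    return bin(hash_a ^ hash_b).count("1")

def is_near_match(hash_a: int, hash_b: int, tolerance: int = 10) -> bool:
    return hamming_distance(hash_a, hash_b) <= tolerance
```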

Child protection groups have welcomed the news, but concerns have been raised that the technology could be broadened to scan phones for prohibited content or even political speech.

Matthew Green, a security researcher at Johns Hopkins University, said: "Regardless of what Apple's long term plans are, they've sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users' phones for prohibited content.

"Whether they turn out to be right or wrong on that point hardly matters. This will break the dam — governments will demand it from everyone."

Apple, however, has insisted the technology offers "significant" privacy benefits over current techniques because it only learns about users' photos if they have a collection of known child sexual abuse material in their iCloud Photos account.