Anglia Ruskin University given funding to tackle online child abuse through AI
Anglia Ruskin University (ARU) has won Government funding to help develop a new way of tackling online child abuse.
The partnership is one of five projects from across the UK and Europe to receive backing as part of the £555,000 Safety Tech Challenge Fund, administered by the Department for Digital, Culture, Media & Sport and the Home Office.
The money will help scientists work on Artificial Intelligence (AI) technology to block specific video content from ever being filmed on a device.
Social media companies are increasingly using end-to-end encryption, which improves privacy for users but simultaneously makes the detection of illegal content more difficult for law enforcement agencies.
The Safety Tech Challenge Fund is focusing on initiatives to tackle child sexual abuse material despite this growth in end-to-end encryption.
ARU's Policing Institute for the Eastern Region (PIER) will work with SafeToNet to expand its SafeToWatch AI technology.
SafeToWatch uses AI to block specific video content from being created at source.
Rather than relying on buy-in from third parties, such as social media companies, SafeToWatch uses a device's camera app to identify inappropriate images and prevent them from being filmed.
Professor Samantha Lundrigan, Director of PIER at ARU, said: "PIER is delighted to be supporting SafeToNet on the development of its ground-breaking SafeToWatch technology.
"The SafeToWatch tool will involve the development of device-level artificial intelligence to prevent the uploading and sharing of indecent images.
"This is crucial for improving the protection of children, particularly as the use of end-to-end encryption continues to grow."
ARU will be given an initial £85,000 over the next five months to develop this technology so it can be trained to recognise child sexual abuse material in real time and prevent it from being created.
In future, SafeToWatch could potentially be installed as standard on any smart device.