Esther Ghey: Ofcom online safety rules step in right direction but more needs to be done

The regulator wants social media firms to use photo ID and change algorithms to prevent children seeing harmful content

Social media platforms must take action to stop their algorithms recommending harmful content to children, and put robust age-checking measures in place to protect them, Ofcom has said.

Esther Ghey, the mother of murdered teenager Brianna Ghey, has been campaigning for an age limit on smartphone usage and tighter controls on access to social media.

Speaking to Good Morning Britain, she said the regulator's draft proposals are a step in the right direction but need to be "less ambiguous".

The regulator has published its draft children’s safety codes of practice, which set out how it expects online services to meet their new legal responsibilities to protect children online under the Online Safety Act.

The online safety laws require sites that can be accessed by children to assess the risks their platforms pose to them and take action to reduce those risks. Services found to be in breach of the codes could be hit with large fines.

Ofcom is the new regulator for the sector, and has published a range of draft codes of practice in recent months. The new rules are set to come into full force towards the end of this year.

The codes will require services to carry out robust age verification to stop children accessing harmful material. They will also require that recommendation algorithms – such as TikTok's "For You" page – do not serve dangerous or potentially harmful content to children.

Platforms which can be accessed by children and have a higher risk of harmful content appearing must make sure their algorithms filter out the most harmful content from children’s feeds.

The draft codes also require firms to have content moderation systems and processes in place, and ensure that swift action is taken against harmful content. Search engines are expected to have a “safe search” option for use by children.

Alongside the regulations being mooted by Ofcom, Brianna's mother also called for tech firms to produce a smartphone designed for children - one that restricts content for users.

Molly Russell died after viewing content related to depression, suicide and self harm on social media Credit: Family handout

Child online safety campaigner Ian Russell, the father of 14-year-old Molly Russell who took her own life in November 2017 after viewing harmful material on social media, also said more still needed to be done to protect young people from online harms.

In his role as chair of online safety charity, the Molly Rose Foundation, Mr Russell said: “Ofcom’s task was to seize the moment and propose bold and decisive measures that can protect children from widespread but inherently preventable harm.

“The regulator has proposed some important and welcome measures, but its overall set of proposals need to be more ambitious to prevent children encountering harmful content that cost Molly’s life.

“It’s over six years since Molly’s death, but the reality is that very little has yet changed. In some respects, the risks for teens have actually got worse.

“That’s why it’s hugely important that the next prime minister commits to finish the job and strengthen the Online Safety Act to give children and families the protection they deserve.”

Ofcom chief executive, Dame Melanie Dawes, said: “We want children to enjoy life online. But, for too long, their experiences have been blighted by seriously harmful content which they can’t avoid or control.

“In line with new online safety laws, our proposed Codes firmly place the responsibility for keeping children safer on tech firms.

“Our measures, which go way beyond current industry standards, will deliver a step-change in online safety for children in the UK. Once they are in force we won’t hesitate to use our full range of enforcement powers to hold platforms to account."

Ofcom's proposals suggest algorithms should also reduce the visibility of lower risk, but still potentially harmful, material. Credit: PA

Sir Peter Wanless, chief executive of the children’s charity the NSPCC, said the draft code was a “welcome step in the right direction” towards protecting children online.

“The draft codes set appropriate, high standards and make it clear that all tech companies will have work to do to meet Ofcom’s expectations for keeping children safe,” he said.

“Tech companies will be legally required to make sure their platforms are fundamentally safe by design for children when the final code comes into effect, and we urge them to get ahead of the curve now and take immediate action to prevent inappropriate and harmful content from being shared with children and young people."

Michelle Donelan said the measures would bring about ‘fundamental change’ for children in the UK Credit: Leon Neal/PA

Technology Secretary, Michelle Donelan, said: “When we passed the Online Safety Act last year, we went further than almost any other country in our bid to make the UK the safest place to be a child online.

“Once in place, these measures will bring in a fundamental change in how children in the UK experience the online world. I want to assure parents that protecting children is our number one priority and these laws will help keep their families safe.

“To platforms, my message is engage with us and prepare. Do not wait for enforcement and hefty fines – step up to meet your responsibilities and act now.”
