Welsh woman shares her story of being groomed online as social media sites face tougher laws

A Gwynedd woman has spoken out about her experiences of online grooming as tech firms face tougher laws to ensure their platforms are safer for users.

New legislation would force social media sites to do more to protect children from being exposed to things such as grooming, bullying and pornography.

Mared Parry was 14 years old when she started receiving messages on social media sites from men in their 20s.

The teenager initially felt reassured because the messages were friendly in tone and came from people with whom she shared mutual friends online.

But as time went on, the messages took a more sinister turn, and Mared was manipulated into sexual exchanges online.

She said the chats made her feel "like a grown-up" - but as an adult now realises that she was pushed into situations she didn't feel comfortable with.

"I didn't recognise it for what it was at the time," Mared said.

"They'd pressure me into sending semi-naked pictures and I just did it, because it didn't even cross my mind that it was grooming."

She never met any of the men in person, but knew other girls who did, believing the men to be their boyfriends.

Mared, now 24, did not fully realise she had been groomed until she had left her teenage years behind.

She now shares her story and campaigns for change in a bid to protect other youngsters, who Mared fears are more at risk than ever during lockdown.

"We've had nearly a whole year of being in our houses; everyone using technology," she said.

"A lot of damage can be done over the internet. Who knows what's going on behind closed doors? It's terrifying to think."

Social media firms are facing stricter rules to ensure they protect users, especially children and the vulnerable. Credit: PA Images

Mared says she doesn't think enough has been done historically to protect people online, but proposed new laws could force social media platforms to "clean up their act".

Under the rules, regulator Ofcom will have the power to fine companies up to £18 million or 10% of their global turnover - whichever is higher - for failing to abide by a duty of care to their users, particularly children and the vulnerable.

It will also have the power to block non-compliant services from being accessed in the UK.

The proposed legislation will apply to any company in the world hosting user-generated content online which is accessible by people in the UK, or enables them to interact with others online.

A small group of high-profile platforms will face tougher responsibilities under a two-tier system, with Facebook, TikTok, Instagram and Twitter to be placed in Category 1 as the companies with the largest online presences and most features deemed high-risk.

In addition, companies will be asked to assess what content or activity on their platform is legal but could pose a risk of harm to adults.

The legislation is expected before Parliament next year.