Help end child abuse online

Children's lives are online. We need the Online Safety Bill in place to protect them from online abuse. We can't do this alone.

Join us

Each day that passes without strong protections online puts children at risk of abuse.

In the past, children could ride in a car without a seatbelt, but the government changed the law to protect them. Why should children's safety online be any different? More than 1 in 7 children aged 11-18 have been asked to send sexual messages or images of themselves online. This needs to stop.

We need the government to deliver an Online Safety Bill that keeps children safe from abuse online. 

Join us

Frida's story

When I was 13, a man in his 30s contacted me on Facebook.

He started telling me he liked me, which was such a nice change from all the abuse I was getting at school. Then he asked for some explicit photos.

I felt like I could trust him. It went on for years.*

Here's how we want the Online Safety Bill to protect children:

Children use many apps every day to socialise and play games. But so do abusers who can easily friend and follow them, and trace them from app to app. This is what we call ‘cross-platform risk’.

The government must do more to make the tech companies work together to tackle the way grooming and abuse can spread across the internet.

In the Online Safety Bill, we need to see the regulator, Ofcom, require companies to proactively share information about offenders, threats to children’s safety, or new features that could lead to child abuse.

We want the Online Safety Bill to recognise that even if certain apps and sites aren’t intended for children, tech companies should know that children are using them, and put measures in place to protect them from seeing harmful content.

We’ve seen the draft Bill and it doesn’t go far enough. The Bill says that only companies with a ‘significant’ number of children on their apps would have to protect them from seeing harmful content. We’re worried this high threshold could mean some sites won’t have to actively protect children.

The government have said this threshold of child users will be up to the regulator, Ofcom, to decide. But if we want to protect generations of children from abuse online, this needs to be much more clearly set out in the Bill.

The Online Safety Bill will make tech companies responsible for spotting risks to children’s safety on their platforms. Companies that don’t do this could face fines. But we don’t think this goes far enough to bring companies into line.

We need the government to amend the Bill so that it properly holds to account the senior managers who decide how safely their sites and apps are designed. Where they allow serious crimes against children to happen on their platforms, it’s only right that tech bosses be subject to the toughest criminal penalties.

We want a Named Person Scheme, which makes individuals at tech companies personally liable where they fail to uphold their duty of care. This happens in the financial services sector, and we don’t think protecting children online is any less important. 

The government needs to listen to children when they say they’re being harmed online. During last year’s Covid-19 lockdowns, our NSPCC helpline saw a 60% increase in contacts from people worried about child sexual abuse online.

We need to make sure children have a voice – with a powerful champion to represent them and make sure their needs aren’t drowned out by big tech companies.

Right now, there is nothing in the Bill to make sure children are heard. And it’s simply not good enough for children who’ve experienced abuse, or are at risk of it, to be denied that powerful voice.

What we've achieved with your support so far

We fixed the Flaw in the Law – but it wasn’t enough

In 2014, it wasn’t illegal for an adult to send a child a sexual message. 50,000 people signed our Flaw in the Law petition calling on the government to make online grooming a crime.

The government listened, and sending sexual messages to a child was made a criminal offence under the Serious Crime Act 2015. But social media platforms still don’t have safety measures in place to stop groomers.

Tech companies still need to protect their users

In April 2018 we started a petition asking the government to bring in laws to make social media platforms protect young users from sexual abuse online. 

In under a year we had an incredible 46,000 signatures – and the government launched the Online Harms White Paper in April 2019. But the proposed bill still hasn’t passed – and children still aren’t safe from online grooming.

We're so close – help us pass the Online Safety Bill

In December 2020, the government announced that a future Online Safety Bill will place a legal duty of care on tech companies to protect young people on their platforms. Their goal is to make the UK the safest place in the world to be a child online.

The draft Online Safety Bill is being discussed in Parliament, and with child sex offences at a record high, it's more important than ever that tech companies do all they can to protect their young users.

Do you want more detail?

Read our latest report on online safety here

If you’re an MP you can find our parliamentary briefing here

*DISCLAIMER

This is a true story but names have been changed to protect identities and photographs have been posed by models.