Government must protect children on social media

New laws must be put in place to force social media sites to keep children safe

We are urging the new Government to ensure that all social networks follow a universal set of rules to protect children from online abuse.

Social media sites' own security rules are failing to keep children safe. They are not doing enough to tackle issues such as child abuse, grooming, hate speech, and cyber-bullying on their platforms.

That's why we're calling for new laws which will force social networks to protect children online, whichever sites they use.

What we're calling for

We want to see the Government create a rulebook that would be enforced by an independent regulator. These rules would ensure that social media companies do three things:

  • provide Safe Accounts for under-18s with extra protections built in
  • create grooming and bully alerts to flag up sinister behaviour
  • hire an army of dedicated online child safety guardians

How will these rules protect children?

Social media platforms must be required by law to give under-18s Safe Accounts with extra protections built in. These should include:

    • High privacy settings by default. Location settings should be switched off, and accounts should not be public or searchable by phone number or email. There should be privacy prompts when sharing personal information.
    • Control over who you connect with. Followers must be approved by the young person. Video chat and live streaming should be restricted to the young person's contacts.
    • Clear and child-friendly rules and reporting buttons. They should be easy to find and easy to read.

Automatically flagging harmful behaviour.

Behaviour such as sending large numbers of friend requests to unknown people, or a high rejection rate of adults' friend requests to under-18s, can be flagged automatically to moderators to review and act on.
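A rule like this could be expressed as a simple heuristic. The sketch below is purely illustrative, not any platform's real system; the thresholds, counts and function name are invented for the example:

```python
# Toy heuristic (illustrative only): flag accounts that send many friend
# requests to strangers, or adults whose requests to under-18s are mostly
# rejected. All thresholds here are invented examples.

def should_flag(*,
                requests_to_strangers: int,
                requests_to_minors: int,
                rejections_by_minors: int,
                stranger_threshold: int = 50,
                rejection_rate_threshold: float = 0.8) -> bool:
    """Return True if the account's behaviour should be queued for a moderator."""
    # Rule 1: a burst of requests to people the sender has no connection with.
    if requests_to_strangers >= stranger_threshold:
        return True
    # Rule 2: an adult whose requests to under-18s are overwhelmingly rejected.
    if requests_to_minors >= 10:
        rejection_rate = rejections_by_minors / requests_to_minors
        if rejection_rate >= rejection_rate_threshold:
            return True
    return False
```

In practice a platform would tune such thresholds against real behaviour data; the point is that the signals the campaign describes are straightforward to compute from data platforms already hold.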

Groomer notifications.

Patterns of grooming or abusive language can be tracked automatically to pick up on sinister messages, which can then be reviewed by moderators. This works in a similar way to email prompts that remind you to attach a document if you've used the word 'attachment'.

When grooming behaviour is detected, a notification should be sent to the child so they pause and reflect on their contact with that person, and support can be offered where appropriate.
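In the spirit of the 'attachment' prompt analogy, the mechanism can be sketched as simple pattern matching. This is a toy illustration only; real grooming detection uses far richer signals, and the phrases, patterns and wording below are invented for the example:

```python
import re

# Illustrative sketch only: a toy pattern matcher. These phrases are
# invented examples, not a real watch-list.
WARNING_PATTERNS = [
    re.compile(p, re.IGNORECASE)
    for p in [r"\bdon'?t tell (your )?(mum|dad|parents)\b",
              r"\b(our|a) (little )?secret\b"]
]

def review_message(text: str) -> dict:
    """Return moderation flags and, if needed, a prompt shown to the child."""
    matched = [p.pattern for p in WARNING_PATTERNS if p.search(text)]
    return {
        "flag_for_moderator": bool(matched),   # send to a human to review
        "child_prompt": ("Does this conversation feel right to you? "
                         "You can block this person or talk to someone you trust.")
                        if matched else None,
    }
```

The two outputs mirror the two responses described above: the message is queued for a moderator, and the child receives a gentle prompt to pause and reflect.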

Hire experts in child protection

Every social media company must hire dedicated child safety moderators. They must also make public the number of reports they receive and how moderation decisions are made.

Harmful, violent, abusive or adult content can be proactively filtered using key words – either to block it for Safe Accounts or to issue a pop-up warning to young people.
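The two responses described here, blocking content outright for Safe Accounts or issuing a warning, can be sketched as a minimal keyword filter. The keyword list and function name are invented for illustration; real filtering would be far more sophisticated:

```python
# Illustrative sketch only: a minimal keyword filter with the two responses
# described in the text. The keyword list is an invented example.
BLOCKED_KEYWORDS = {"gore", "self-harm"}

def filter_content(text: str, *, safe_account: bool) -> str:
    """Return 'allow', 'warn', or 'block' for a piece of content."""
    words = text.lower().split()
    if any(keyword in words for keyword in BLOCKED_KEYWORDS):
        # Safe Accounts never see the content; other young people's
        # accounts get a pop-up warning before viewing.
        return "block" if safe_account else "warn"
    return "allow"
```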

"It had no strict controls which led to lots of hurtful messages being spread about people, which I believe contributed to people self-harming or just feeling negative about themselves."
16-year-old girl, reviewing ASKfm

Peter Wanless, chief executive of NSPCC, says:

"The internet can be a wonderful resource for young people to learn, socialise and get support. But leaving social media sites free to make up their own rules when it comes to child safety is unacceptable.

"We need legally enforceable universal safety standards that are built in from the start.

"We've seen time and time again social media sites allowing violent, abusive or illegal content to appear unchecked on their sites, and in the very worst cases children have died after being targeted by predators or seeing self-harm films posted online.

"Enough is enough. Government must urgently bring in safe accounts, groomer alerts and specially trained child safety moderators as a bare minimum to protect our children. And websites that fail to meet these standards should be sanctioned and fined."