Child sexual abuse crimes reach record levels – here’s how the Online Safety Bill can effectively tackle grooming

Our new analysis shows online child sexual abuse (CSA) crimes have increased by more than a quarter since the government promised to bring in new laws to protect children on social media.

  • Freedom of Information data1 reveals that grooming and CSA material offences recorded by police have increased by 27% since Ministers promised to legislate (from 2018/19 to 2020/21).
  • This comes as we set out solutions to ensure the Online Safety Bill strongly tackles grooming and online CSA ahead of its second reading on 19 April 2022.
  • Our Time to Act report commends the government for bringing forward this legislation but recommends key changes to convert Ministers’ ambition to protect children into a world-leading response to the online CSA threat.



Research shows public support for tackling CSA

Polling2 shows UK adults strongly want a Bill that addresses CSA. A YouGov survey finds four in five UK adults think it’s very important the Bill tackles online grooming and CSA images.

The poll found:

  • 87% think it’s important that the Online Safety Bill tackles the sharing of CSA images and four in five (81%) think it’s very important3.
  • 88% think it’s important it tackles online grooming and four in five (81%) think it’s very important4.
  • Four in five (82%) think social media companies should have a legal duty to work with each other to prevent online grooming from happening across multiple platforms5.
  • 86% want companies to understand how groomers and child abusers use their sites to abuse children or share CSA material and take action to prevent it by law5.

Frida*, who was 13 when she was groomed by a man on Facebook before being abused on encrypted WhatsApp, has been campaigning to make social media safer for children.

She said:

"The government now have a responsibility to ensure this legislation works and makes tech executives do everything in their power to address how their sites contribute to grooming.

No one took responsibility for the abuse I suffered except me. Not the man who abused me or anyone at the tech firms that enabled him."

The Time to Act report addresses the following areas

Offenders use social media to form networks, advertise a sexual interest in children and signpost illegal CSA content hosted on third party sites.

Amending the Bill to fight the ways offenders facilitate abuse online could prevent millions of interactions with accounts that contribute to CSA.

Techniques offenders use include:

  • Tribute sites: Setting up fake social media profiles of CSA survivors known to those with a sexual interest in children - these received 6 million interactions in just three months of 2021.
  • Facebook offender groups: Abusers also use hidden Facebook groups aimed at those with a sexual interest in children, including groups celebrating children’s eighth, ninth and tenth birthdays; some have up to 50,000 members. Many remain live despite being reported to Meta6.
  • Carefully edited CSA videos: Abusers use a detailed understanding of what platforms will and won’t take down to post edited videos of real abuse scenes that get around content moderation rules and aren’t considered illegal.

We welcome that private messaging will be in scope of the Bill as this is where most CSA takes place on social media.

This approach could be strengthened by giving Ofcom the power to require firms to use technology, such as PhotoDNA, to identify CSA material and grooming in private messaging. PhotoDNA is already deployed by virtually all major sites.

Ministers have an opportunity to improve the legislation, so it does more to disrupt how abusers groom children on multiple social media sites and games.

We suggest amending the Bill so companies must:

  • Address cross-platform harm when meeting their safety duties.
  • Risk assess how their sites contribute to cross-platform abuse.
  • Legally cooperate with each other to disrupt grooming.

The Time to Act report also addresses several other areas where the Online Safety Bill can be strengthened, including providing far-reaching protection to children from harmful content.

We’re concerned that problematic sites like OnlyFans and Telegram could claim they’re exempt from the duty to protect children from harmful content.

The “child use test” should be removed so that any service likely to be accessed by children is in scope.

Do you want more detail?

Read our report on online safety here.

If you're an MP, you can find our parliamentary briefing here.

Peter Wanless, NSPCC CEO, said:

"This historic Online Safety Bill can finally force tech companies to systemically protect children from avoidable harm.

With online child abuse crimes at record levels and the public rightly demanding action, it’s crucial this piece of legislation stands as a key pillar of the child protection system for decades to come.

Today’s NSPCC report sets out simple but targeted solutions for the Bill to be improved to stop preventable child sexual abuse and to finally call time on years of industry failure."

Abuse can stop with a call to the NSPCC Helpline. Will you help us answer every call?

*DISCLAIMER

Names have been changed to protect identities. Any photographs are posed by models.


References

  1. NSPCC regularly submits Freedom of Information requests to UK police forces for recorded offences of Sexual Communication with a Child and indecent images. In 2018/19 the combined total number of recorded offences was 24,964. In 2020/21 it was 31,600. This is a 27% increase.

    Offence                                        2018/19   2019/20   2020/21   Total
    Indecent images (Child abuse material)         19,631    21,342    25,281    66,254
    Sexual Communication with a Child (Grooming)    5,333     5,954     6,319    17,606
    Total online child sex offences                24,964    27,296    31,600    83,860

  2. All polling figures, unless otherwise stated, are from YouGov Plc. Total sample size was 2,501 adults. Fieldwork was undertaken between 4 and 7 February 2022. The survey was carried out online. The figures have been weighted and are representative of all UK adults (aged 18+).

  3. Asked ‘How important, if at all, do you think it is that the Online Safety Bill tackles each of the following? Sharing of child abuse images’, 81% said it was very important and 6% fairly important, making total important 87%.

  4. Asked ‘How important, if at all, do you think it is that the Online Safety Bill tackles each of the following? Online grooming, i.e. contacting children for the purposes of sexual abuse’, 81% said it was very important and 7% fairly important, making total important 88%.

  5. Asked ‘Do you think social media companies should or should not be required to do each of the following by law? Work with other social media companies to prevent online grooming happening across multiple sites, e.g. work together to share information on how their sites are used for abuse’, 82% said they should have to do this by law. Asked whether they should have to ‘Understand how groomers and child abusers use their sites to abuse children or share child abuse material and take action to prevent it’, 86% said they should have to do this by law.

  6. Putnam, L. (2022) ‘Facebook Has a Child Predation Problem’, Wired, 13 March 2022. Based on subsequent discussions with Prof Lara Putnam at the University of Pittsburgh.