Over 3,000 new grooming offences recorded since last year

We're calling for mandatory rules that ensure social media networks keep children safe online.

Figures show that 3,171 offences have been recorded across 80 platforms in England and Wales1 since a new anti-grooming law, which criminalises sexual communication with a child, was introduced last year. This amounts to almost 9 grooming offences per day on average.

Where police disclosed the gender and age of the victim2, records also reveal:

  • girls aged 12-15 were recorded in 62% of cases
  • under-11s were recorded in nearly a quarter of cases.

This news comes ahead of our annual flagship conference, How safe are our children? Growing up online, which begins on Wednesday 20 June and explores the potential risks the online world poses to children and young people. The conference also marks the launch of our annual report, How safe are our children?

Social media and online safety

For the 2,097 offences where police recorded the method used to communicate with a child:

  • Facebook, Snapchat or Instagram were used in 70% of cases
  • Facebook, Snapchat and Instagram were the top three most-recorded sites.

"They were so manipulative, but you don’t even notice it. Looking back at it now, it’s scary to think that I sent semi-naked pictures to older guys. It could have gone a lot further."
Mared Parry, from North Wales, was sent sexual messages by men 10 years older than her on Facebook when she was just 14.


Wild West Web: our campaign

Following our Wild West Web campaign, Matt Hancock, Secretary of State for Digital, Culture, Media and Sport, has announced that new laws will be brought in to regulate social networks, helping to keep children safe and prevent offences such as grooming.

We’re calling for Mr Hancock to bring in:

1. an independent regulator for social networks with fining powers

2. a mandatory code which introduces Safe Accounts for children; grooming alerts using algorithms; and fast-tracking of reports to moderators which relate to child safety

3. mandatory transparency reports forcing social networks to disclose how many safety reports they get, and how they deal with those reports.

Learn more about our campaign

Worried about a child?

Contact our trained helpline counsellors for help, advice and support.

0808 800 5000

Report a concern

Peter Wanless, NSPCC Chief Executive, said:

“These numbers are far higher than we had predicted, and every single sexual message from an adult to a child can have a huge impact for years to come.

“Social networks have been self-regulated for a decade and it’s absolutely clear that children have been harmed as a result.

“I urge Digital Secretary Matt Hancock to follow through on his promise and introduce safety rules backed up in law and enforced by an independent regulator with fining powers.

“Social networks must be forced to design extra protections for children into their platforms, including algorithms to detect grooming to prevent abuse from escalating.”


  1. All 43 police forces in England and Wales and British Transport Police were asked for the number of offences recorded under s.15A of the Sexual Offences Act 2003 between 3 April 2017 and 2 April 2018. In total, 41 of the 44 police forces gave a full or partial response for the 12-month period. One police force responded to the request after the NSPCC's How Safe Are Our Children 2018 annual report went to print, so nine months of data from that force are printed in the report, compared with a full 12 months in this press release.

  2. Age and gender of the victim were disclosed in 2,343 instances. Police are not always able to identify the victim in each case, and some cases involve multiple victims.