Facebook end-to-end encryption plans make stopping child abuse online harder

Facebook is turning back the clock on children’s safety with new plans that will make it harder to detect and disrupt child abuse online.

We’re calling for the Government to take action on the risks of end-to-end encryption, after our new data revealed that Facebook-owned apps were used in over half (52%) of online child sex crimes where the platform was known.1


In one year, there were over 9,470 instances of child sex abuse images and online child sex offences recorded by police (where the means of communication was known). Of these:

  • 52% took place on Facebook-owned apps
  • over a third took place on Instagram
  • a further 13% took place on Facebook and Messenger.

Now, with Facebook planning to proceed with end-to-end encryption, there’s a risk that, without the necessary safeguards in place, many of these cases could go unreported. We’re calling on Culture Secretary Oliver Dowden to strengthen the Online Safety Bill to decisively tackle the biggest threat to children online: abuse in private messaging.

The Government needs to give Ofcom the power to take early and meaningful action against firms who put children at risk through dangerous design choices.  



What is end-to-end encryption and what risks does it pose?

End-to-end encryption means that only the devices taking part in a conversation can decrypt and read the messages; not even the platform carrying them can see their content. While this is valuable for privacy, among other benefits, it also poses serious risks to child safety.

Child protection experts, law enforcement, and Facebook themselves have said end-to-end encryption will prevent them from identifying and disrupting child abuse on their services. End-to-end encryption provides abusers with a shield that keeps online child abuse hidden from view. 
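At a technical level, the "only the devices can read it" property comes from the two endpoints agreeing a secret key that the platform relaying the messages never learns. The toy sketch below shows the shape of that idea using a textbook Diffie-Hellman exchange; it is illustrative only (hand-rolled, deliberately simplified, and not secure), whereas real messengers use vetted protocols such as Signal's.

```python
# Toy sketch of end-to-end encryption: the endpoints agree a shared secret
# (textbook Diffie-Hellman) and the relaying server never learns it.
# Illustrative only; NOT real cryptography.
import hashlib
import secrets

P = 2**127 - 1   # public prime modulus (toy-sized; real systems use vetted groups)
G = 3            # public generator

# Each device keeps a private value and publishes only G^x mod P.
alice_private = secrets.randbelow(P - 2) + 2
bob_private = secrets.randbelow(P - 2) + 2
alice_public = pow(G, alice_private, P)
bob_public = pow(G, bob_private, P)

# Both endpoints derive the same shared secret; the server in the middle,
# which only ever sees the public values, cannot.
alice_secret = pow(bob_public, alice_private, P)
bob_secret = pow(alice_public, bob_private, P)
assert alice_secret == bob_secret

def keystream(secret: int, length: int) -> bytes:
    """Derive a toy keystream from the shared secret (not secure)."""
    stream = b""
    counter = 0
    while len(stream) < length:
        stream += hashlib.sha256(f"{secret}:{counter}".encode()).digest()
        counter += 1
    return stream[:length]

def encrypt(secret: int, plaintext: bytes) -> bytes:
    # XOR with the keystream; decryption is the same operation.
    return bytes(a ^ b for a, b in zip(plaintext, keystream(secret, len(plaintext))))

message = b"hello from alice"
ciphertext = encrypt(alice_secret, message)        # all the server ever sees
assert ciphertext != message
assert encrypt(bob_secret, ciphertext) == message  # only the other endpoint can read it
```

Because the content is unreadable to the platform itself, any scanning that currently detects abuse images or grooming in messages stops working once this design is applied.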

Private messaging is the most common avenue for abusers to contact children. Last month, the Office for National Statistics revealed that in three quarters of cases where a child has been approached by someone they don’t know online, strangers made contact via direct message.2

WhatsApp accounts for 1 in 10 instances of online child sexual abuse in which Facebook’s apps were involved. But it makes up only 1.3% of child abuse tip-offs from Facebook to the NCA, because Facebook cannot see the content of end-to-end encrypted messages to report any abuse.3

The Government must act now to make sure tech firms cannot roll out features or products that increase the risk of harm to children. 

Take action today 

We’ve helped propose a new law to make tech companies accountable for abuse that happens on their platforms. 

This new Online Safety Bill will help keep children and young people safe online.

22742-exp-2023-03.jpg

You can help us pass this Bill in our Wild West Web campaign. All it takes is a quick email to your local MP, and we even have a template you can use. 

The Online Safety Bill puts in place new rules to make sure tech companies put safety first, including:

  • an independent regulator with powers to investigate and fine, enforcing rules that keep children safe on social platforms
  • responsibility placed on tech companies to keep young people safe online, with fines of up to £20 million and bans for boardroom directors who fail to do so
  • a requirement that social media companies build platforms that are safer by design, with safer accounts for young users and straightforward processes for reporting and dealing with abuse.

Take action now


References

  1. The NSPCC sent a Freedom of Information request to 46 police forces across England, Wales, Scotland and the Channel Islands, asking for a breakdown of the platform(s) used to commit sexual offences or indecent image offences against children under 18 between 1 October 2019 and 30 September 2020. A total of 35 police forces responded.

  2. ONS figures show that in 74% of cases where a child was contacted online by someone they had never met in person, the contact was made privately, by direct message.

  3. Last year, the National Crime Agency (NCA) received just under 24,000 child abuse reports from Facebook and Instagram, but only 308 from WhatsApp.