Support for end-to-end encryption increases if child safety can be protected

An NSPCC/YouGov survey[1] found 33% of UK adults support using end-to-end encryption on social media and messaging services, but this jumps to 62%[2] if it’s rolled out only if and when tech firms can ensure children’s safety is protected.


Private messaging is where most child sexual abuse happens online, and we're calling for an urgent reset of debates on end-to-end encryption to help keep children safe.

End-to-end encryption means only the devices communicating have the ability to decrypt and read the messages. While this is useful for privacy, it also presents risks for child safety and means abuse can go unnoticed online.
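
For readers who want a concrete, simplified picture of what this means technically, the Python sketch below uses the open-source PyNaCl library to illustrate public-key "box" encryption: each device keeps its own private key, and the service relaying the message only ever handles ciphertext it cannot read. This is an illustrative example only, not the specific protocol used by Facebook, WhatsApp or any other platform.

    # Illustrative sketch only: public-key "box" encryption with PyNaCl,
    # not the actual protocol used by any particular messaging service.
    from nacl.public import PrivateKey, Box

    # Each device generates its own key pair; the private key never leaves the device.
    alice_key = PrivateKey.generate()
    bob_key = PrivateKey.generate()

    # Alice encrypts for Bob using her private key and Bob's public key.
    ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"hello from Alice")

    # The platform relaying the message sees only opaque ciphertext.
    # Only Bob, holding his private key, can decrypt and read it.
    plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
    assert plaintext == b"hello from Alice"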

A major NSPCC roundtable event attended by the Home Secretary, Priti Patel, will bring together child protection, civil society and law enforcement experts from the UK, US, Canada, Ireland, and Australia. The reset of the debate will focus on showing how end-to-end encryption takes away platforms’ ability to find abuse in private messaging, and how this can be avoided. 

Currently, major tech firms use a range of technologies to identify child abuse images and to detect grooming and sexual abuse in private messages. But Facebook’s proposals to introduce end-to-end encryption on Facebook Messenger and Instagram would render these tools useless, with an estimated 70% of global child abuse reports lost. In 2018 these reports resulted in 2,500 arrests and 3,000 children being safeguarded in the UK.[3]

The debate around end-to-end encryption has increasingly become an ‘either/or’ argument skewed in favour of adult privacy over the safety and privacy rights of children. However, the latest polling suggests public support for balancing the safety of children with the privacy of all users, including the children who have been sexually abused.

  • More than half (55%) of UK adults believe the ability to detect child abuse images is more important than the right to privacy
  • Nearly a third (32%) think they are equally important
  • Only 4% say privacy should be prioritised over safety[4]
  • 92% support social networks and messaging services having the technical ability to detect child abuse images on their sites[5]
  • 91% support a technical ability to detect adults sending sexual images to children on their services[6]

At the roundtable event, we will share new research and analysis about the implications of end-to-end encryption for child protection and call for tech firms to refocus their approach through safer design features and investment in technology.

Tech firms should strive to achieve a new balance that properly weighs the benefits and risks of end-to-end encryption, underpinned by legal safeguards through regulation. 

  • The needs of all users, including children, must be considered
  • Children’s safety mustn’t be characterised as a simple trade-off against adults’ privacy
  • We need to reflect children’s digital rights under international law
  • Tech firms should respect the full range of fundamental rights at stake, and not prioritise some over others
  • Design features that can increase the risks end-to-end encryption poses to children should be scrutinised, e.g. Facebook algorithms that suggest children as friends to adults, or plans to auto-delete messages on WhatsApp

NSPCC Report: End-to-End Encryption - Understanding the impacts for child safety online

NSPCC Discussion Paper: Private messaging and the rollout of end-to-end encryption - The implications for child protection

Sir Peter Wanless, NSPCC Chief Executive, said:

“Private messaging is the frontline of child sexual abuse but the current debate around end-to-end encryption risks leaving children unprotected where there is most harm.

“The public wants an end to rhetoric that heats up the issue but shines little light on a solution, so it’s in firms’ interests to find a fix that allows them to continue to use tech to disrupt abuse in an end-to-end encrypted world.

“We need a coordinated response across society, but ultimately Government must be the guardrail that protects child users if tech companies choose to put them at risk with dangerous design choices.”

Help us end the Wild West Web


We want to make tech companies accountable for the abuse happening on their platforms. We helped propose a new law, the Online Harms Bill, to make this happen.

But we need your support to make this law a reality and keep children safe from online abuse.

Take action today


References

  1. YouGov surveyed 2,125 UK adults between 31st December and 4th January 2021. The survey was carried out online. The figures have been weighted and are representative of all UK adults (aged 18+).

  2. When asked to what extent they support social media and messaging services using end-to-end encryption, there was 33% net support, rising to 62% when asked if they support social media and messaging services using end-to-end encryption if and when they can ensure children’s safety is protected.

  3. Statistics from Home Office (2019) Factsheet: encryption.

  4. Asked “Which ONE, if any, of the following statements comes closest to your view?”, 32% said the right to privacy and the ability to identify child abuse images are both equally important, 55% said the ability to identify child abuse images is more important than the right to privacy on social media, and 4% said the right to privacy on social media is more important than the ability to identify child abuse images.

  5. Asked if they support social networking and messaging services having a technical ability to detect child abuse images on their sites (e.g. to actively find child abuse images), there was 92% net support.

  6. Asked if they support social networking and messaging services having a technical ability to detect adults sending sexual images to children on their services, there was 91% net support.