Poll shows widespread public support for stronger laws to protect children from online abuse


An NSPCC/YouGov survey has found that 9 in 10 respondents want social media networks and messaging services to be designed safely for children, and want tech bosses to be held responsible for safety.

The poll, which questioned more than 2,000 adults,1 found overwhelming backing for new laws to keep children safe on social media. The survey found that:


  • 90% of respondents want firms to have a legal responsibility to detect child abuse, such as grooming, that takes place on their sites.2
  • 80% believed that tech bosses should be fined for failing to make their sites safe.3
  • 70% supported making it a legal requirement for platforms to assess the risks of child abuse on their services and take steps to address them.4
  • Only 8% of adults thought sites were regularly designed safely for children.5

Demand for change – our latest report

In 2019 we set out detailed proposals for an online safety bill, and since then we have been a leading voice for social media regulation. This week we released a report, ‘Delivering a Duty of Care’, which assessed the government’s plans for legislation against the six tests we created to measure whether the legislation will deliver online safety. Our report found that the government is failing on 9 of the 27 indicators, and that tougher measures are needed to tackle sexual abuse.

The results of the poll suggest that most UK adults recognise this, and so do we. Our report highlights that online safety can be achieved, not just by making tech firms legally responsible for what happens on their platforms, but also by:

  • clamping down on the “digital breadcrumbs” dropped by abusers to guide others towards illegal material. These include videos of children just moments before or after they are sexually abused - so-called ‘abuse image series’ - that are widely available on social media.
  • giving Ofcom the ability to tackle cross-platform risks, where groomers target children across the different sites and games they use - something firms have strongly resisted.
  • getting government to commit to senior management liability. This would make tech directors personally responsible for decisions, drive cultural change and provide a strong deterrent.
  • making censure, fines and, in some cases, criminal sanctions the penalty for bosses who fail to make the online world a safe place for children.

NSPCC Chief Executive Sir Peter Wanless is now urging the Culture Secretary, Oliver Dowden, to listen and ensure that the landmark Online Safety Bill convincingly tackles online child abuse.


Sir Peter Wanless said: “Today’s polling shows the clear public consensus for stronger legislation that hardwires child protection into how tech firms design their platforms.

“Mr Dowden will be judged on whether he takes decisions in the public interest and acts firmly on the side of children with legislation ambitious enough to protect them from avoidable harm.

“For too long children have been an afterthought for Big Tech but the Online Safety Bill can deliver a culture change by resetting industry standards and giving Ofcom the power to hold firms accountable for abuse failings.”


References

  1. YouGov surveyed 2,125 adults between 31 December 2020 and 4 January 2021.

  2. 90% net supported the statement “Social networking and messaging services having a legal responsibility to detect child abuse taking place on their sites (i.e. by law)”.

  3. Asked “To what extent would you support or oppose the following? If social media companies consistently fail to protect children from child abuse...” – 80% net supported the statement “Senior managers/ directors of social media companies being fined if their company consistently fails to protect children from child abuse” and 78% net supported the statement “Senior managers/ directors of social media companies being prosecuted (i.e. taken to court) if their company consistently fails to protect children from child abuse”.

  4. Asked “Which, if any, of the following do you think should be a legal requirement (i.e. by law) for social media/ messaging companies?” – 70% said yes to “Assess the risks of child abuse on their services, and take steps to address them.”

  5. Asked “And in general, how often, if at all, do you think social networking and messaging services are currently doing each of the following?” – 8% said they thought they regularly “Design their sites to be safe for children”.

    Asked to what extent they agree or disagree with the statement “Social networking and messaging services should be designed to be safe for children (i.e. children’s safety must be considered and services accessed by children should not include features that can cause them harm)” – 90% net agreed.