Remove nude images shared online

Information about Report Remove, a tool to help young people report images and videos online.

Removing nude images - information for parents

For a child or young person, having a sexual image or video of themselves shared online can be a distressing situation. This can be difficult for parents and carers too, but there are ways you can support your child. If they’re under 18, they can use Report Remove.

We have more advice to help you understand the risks, and how to support your child if they’ve been sending, sharing or receiving nude images.

What is Report Remove?

Report Remove is a tool that allows young people to report a nude image or video of themselves that has been shared online, to see if it’s possible to get it taken down. Provided by Childline and the IWF, it keeps the young person informed at each stage of their report and provides further support where necessary.

How does Report Remove work?

To use Report Remove, children just need to follow three steps:

  1. Follow the instructions to confirm their age. If they’re 13 or older, they can choose to prove their age using an age verification service called Yoti; this is optional, and they’ll need some ID to do it.
  2. Log in or create a Childline account so they can receive updates on their report.
  3. Report and remove: share the image or video securely with the IWF, where a specialist analyst will review it and work to have it removed if it breaks the law. They will give it a digital fingerprint to help spot the image or video across the internet and take it down.

Childline will let the young person know the outcome of their report and provide further support where needed. They’re also always welcome to speak to a counsellor about how they feel, whether online via 1-2-1 chat or Childline email, or via the free confidential helpline on 0800 1111. More information about each of the ways to speak to a Childline counsellor is available on the Childline website.

More about Report Remove

The Internet Watch Foundation (IWF), a UK charity, is there to help remove sexual images and videos of under 18s posted online.

Once the IWF has confirmed that an image or video is against the law and can be removed from the internet, it gives it a digital fingerprint, called a hash. The hash can be used to find copies of the image online, even if they have been cropped or slightly changed, so they can be removed.

The IWF keeps a copy of the hash so that, if the image is uploaded to the internet again, it can be quickly removed or, in some cases, stopped from appearing online at all.

Yoti is an identity verification platform that helps Report Remove users prove their age. If children choose to do this, there are two options: the Yoti app or an in-browser portal. Children can use the following ID to confirm their age with Yoti:

  • UK passport or biometric residence permit
  • driving licence or provisional licence
  • Young Scot Card (Yoti app only)
  • CitizenCard (Yoti app only).

If they don’t have any of these, they can get a young person’s ID card for £5 using the promo code NSPCC5. Make sure they choose ‘Yoti CitizenCard’ on the first page.

The only information that Yoti shares with the NSPCC or IWF about children who prove their age is confirmation that they’re under 18.

A nude image must include one of the following to be removed:

  • nude or semi-nude sexual posing
  • someone nude or semi-nude touching themselves in a sexual way
  • any sexual activity involving a child
  • someone hurting someone else sexually
  • sexual activity that includes animals.

The IWF is the organisation in the UK that is allowed to assess illegal sexual images of children in order to get them taken down, and it can only require companies to take down images of children that are illegal. Children are asked to prove their age using ID because it means the IWF can be certain the image is of a person under 18, and can therefore get it taken down from more places.

Children under 13 are not asked to prove their age, as the IWF’s trained analysts have the visual expertise to assess that the person in the image is a child; this assessment can be more challenging when the child in the image is older.

If a child does not have ID, they should still use Report Remove. This is because the IWF will still make an assessment of the age of the person in the image, and in many cases they will be able to use this assessment to be certain that it is a child. If IWF can’t be certain the content is of a child, it can still ask tech companies to take it down. This means that the content could still be removed from lots of places and that the young person can still choose to access emotional support from Childline.

Images that the IWF assesses as illegal are hashed (given a digital fingerprint), and that hash is added to international hash lists used by major social media platforms and internet service providers.

As a result, if the image is reshared on the public web (not on the dark web, and not behind paywalls or login walls), it is likely to be found, even if it has been changed slightly. Depending on how the hash match was made, the IWF can then send a takedown notice to the website or internet service provider, or the image may be blocked automatically.

However, images shared on encrypted messaging apps like WhatsApp would not be found, as there is currently no way to spot these images once they have been end-to-end encrypted.

If your child has a problem using Report Remove, you can suggest that they try one of the following:

  • using their default browser (the browser that opens automatically when they click a web link) and closing private browsing if they’re using the Yoti app
  • logging in to their Childline locker and sending an email to Childline explaining the problem; the team will be able to help.

Childline counsellors are always there for children and want to help. Children can talk to them about anything they’re going through.

As an adult, if you come across a nude or sexual image or video of a child, you can report it directly to the IWF. However, if your child has had an image shared online, we recommend that they use Report Remove as Childline will be able to support them.

Because Report Remove is a self-reporting tool, the young person needs to be involved in making the report. If the child is 13-17, they can choose to prove their age themselves using Yoti. You can support a young person to use Report Remove. Things like helping them find some ID and going through the process together can be really reassuring.

Children can use Report Remove on their own and don’t need to tell an adult they have made a report. Childline will know if a young person has made a report and will send them a message offering further support so the child doesn’t feel alone.

While sharing nude images of children is illegal, the law exists to protect children, so they should not worry about the police being involved if they make a report.

The only time the police may become involved is if the report suggests that the child or someone else is at immediate risk of serious harm, or if the police have come across the reported content for a reason outside of Report Remove (for example, if it was reported by the school) and want to check that the child is OK. This could happen whether or not the child uses Report Remove.

Report Remove is designed for the child to remove their image and doesn’t actively involve the police. If you or your child would like to contact the police or are worried your child or another child is at risk, you can contact the police directly. If you are worried about how someone is communicating with a child online, you can report this to CEOP.

Children are supported by Childline throughout the reporting process, with offers of support from Childline’s counselling service and resources on the Childline website. Once the IWF has assessed the report, Childline will send a message to the young person’s locker letting them know whether the IWF could take the content down, and will offer further support, including other routes young people can try to get content removed if it wasn’t illegal.

It's important to talk to your child about the risks of sharing sexual images or videos and let them know they can come to you if someone's pressuring them to send or share nudes, or if they're worried about something. We have guidance about how to talk to your child about sharing nudes.

You can also show your child the advice that Childline has about sharing nudes.

Remember, you can always contact the NSPCC Helpline for support with helping a young person who has had a nude or sexual image or video shared online.

Need advice about online safety?

If you’re stuck, not sure what to do, or if you’re worried about your child, you can also contact our trained helpline counsellors on 0808 800 5000.

Childline also has lots of information about online and mobile safety that will help you and your child.