
Sharing Nudes, Sextortion and Deepfakes: What Schools Need to Know in 2026

Last week I came across one of my old lesson plans on online safety, from around 2010. I opened it up and read through it to see whether there was anything I could reuse in an upcoming session. It was quite shocking to see how tame it was, and how narrow its focus. It was mostly about the dangers of — which is obviously still important and still very much part of what we teach. But reading it, I was struck by how quaint my approach seemed compared with what young people are dealing with now.

Nudification apps and AI-generated imagery

There are now apps and websites that can take a normal, clothed photo of someone and generate a realistic nude image from it. No skill needed. One photo is enough. In 2023, a group of boys aged 13 to 15 in a small Spanish town did exactly this to twenty of their female classmates, using pictures from Instagram. The UK government actually cited that case when they announced plans to ban these apps.

Since early 2025, creating a sexually explicit deepfake has been a criminal offence in the UK — up to two years in prison. And under the 2026 DfE guidance, a synthetic image of a child is treated the same as a real photograph. Creating it, having it, sharing it — all criminal.

Pupils need to know that these tools exist, that using them is a criminal offence even if they think it's just banter, and that having the image on their phone is enough. If a young person has been depicted in one of these images, the CEOP Report Remove tool can help get it taken down.

Sextortion

Sextortion used to be relatively rare. Now it's being run by organised criminal networks, often overseas, targeting young people on social media. They make contact, build trust quickly, get an image, and then threaten to share it unless money is paid.

Boys are disproportionately targeted, which a lot of people don't realise. Most of the image-sharing education in schools is still framed around girls. That needs to change.

Pupils need to know that these are professional criminals, not other teenagers. Paying never makes it stop. They should tell someone they trust and report to CEOP. And they're not in trouble for coming forward.

For staff, the thing to think about is how you handle the disclosure. When a child tells one person they trust, and before they know it five adults in the building are involved, they've lost control of who knows. That can feel just as violating as the original incident. So think about your disclosure pathways — who the child told, who genuinely needs to know, and how you keep their sense of control intact.

Sharing nudes

Image sharing between young people hasn't gone away — it's just been overshadowed by the AI and sextortion stories. The UKCIS updated their guidance in February 2024 to include deepfakes and AI-generated content.

The legal position is clear: possessing or sharing a nude or semi-nude image of someone under 18 is a crime, even if it's of yourself, even if everyone involved consented, even if it was AI-generated. That includes forwarding, screenshotting, or saving. The 2026 DfE guidance says pupils should understand they won't be in trouble for asking for help, which is the right message, though delivering on it depends on school culture as much as school policy.

One thing I'd flag for DSLs: make sure you understand Outcome 21. When police record an image-sharing incident under Outcome 21, the crime is logged but no formal action is taken and it's unlikely to show up on future vetting checks. It's the thing that stops a single incident from following a child around for years. Too many DSLs either haven't heard of it or don't feel confident explaining it to parents.

The 2026 guidance

The updated DfE statutory guidance now requires schools to teach specifically about deepfakes for the first time — how common they are, how they cause harm, and how to spot them. It also explicitly covers AI-generated imagery alongside photographic images, and requires teaching that online content can normalise misogyny and violence.

If your online safety curriculum was written before 2024, you'll almost certainly have gaps.

Trusted resources

  • UKCIS guidance (updated Feb 2024): Sharing nudes and semi-nudes — the definitive incident response framework for schools (gov.uk)
  • CEOP Report Remove: For young people who need an intimate image taken down (childline.org.uk)
  • Childline: 0800 1111 — free, confidential support for children and young people
  • Internet Watch Foundation: For reporting AI-generated child sexual abuse material (iwf.org.uk)
  • DfE 2026 statutory guidance: Online safety curriculum content points (gov.uk)
  • Children's Commissioner report (April 2025): Nudification tools and sexually explicit deepfakes — the most detailed UK research on how these tools affect children

If your school needs support updating your programme or training staff ahead of September 2026, get in touch.

Want help with RSE training?

Talk to us about how Tailor Education can support your school.

Get in touch