
AI-Generated CSAM

AI-Generated Child Sexual Abuse Material Doesn’t Make the Harm Less ‘Real’.

Technology-assisted child sexual abuse (TACSA) happens when someone uses technology to enable sexual abuse of a child. It can take many forms - including grooming, coercion and image-based abuse - and can include sexual images created or altered using AI or 'nudifying' tools.

Smart tech brings new opportunities - and with them, new ways for some people to misuse it to abuse and harm children. This guide helps parents and carers have calm, practical conversations with children about online abuse and the impact of AI, and explains what to say, what to do if you have concerns, and where to get help.

This matters because there is no hierarchy of abuse: TACSA is child sexual abuse and it is just as harmful as other forms of child sexual abuse. It can also bring additional technology-related impacts - especially shame, fear, confusion, and a loss of control when imagery exists, resurfaces or is further manipulated.

 

So where does AI come in?


Artificial Intelligence (AI) can be used to create highly realistic child sexual abuse material (CSAM). This can be done by altering existing photos or videos or by creating entirely new AI-generated sexual abuse content.

Ways AI may be used to harm children and young people include:

Manipulating images of a real child

AI can be misused to create or alter sexual abuse imagery using a child’s likeness - often from an existing photo.

The technology can make this easier to produce and repeat, increasing access to abusive content. Claims that AI-generated or AI-altered imagery is ‘less harmful’ are false: a real child can be harmed profoundly, including through new victimisation when their image is manipulated, and re-victimisation when further material is generated using the likeness of known victims.

Threats and blackmail (sexual extortion)

AI-generated or AI-altered imagery can be used to threaten or pressure a child - including demands for money, further contact, or ongoing control. Offenders may create a sexualised image from an everyday photo and use it as leverage, meaning a child may not have shared any image at all before the abuse begins.


Child-on-child harm


In some cases, under-18s may use so-called ‘nudifying’ tools to create images of their peers, thinking it is ‘just a joke’. They might also do so to deliberately cause upset and distress.

What to look out for (without needing ‘proof’)


You don’t need to understand the technology to take action. Look for changes such as:

  • Sudden anxiety, withdrawal, sleep disturbance, or distress after notifications
  • Avoiding school or usual activities
  • Seeming frightened about something being shared, or worried about getting into trouble
  • Becoming unusually secretive and upset (not just normal privacy)

If you’re unsure, it’s still okay to ask gently and offer support.

What to do next


This is a short, practical guide to the key steps you can take to help you respond, get support, and take action. For more detailed guidance, see the Marie Collins Foundation (MCF) resources: Conversations with your child about technology-assisted harm and Finding out your child has been harmed through technology-assisted child sexual abuse.

Step 1: Steady yourself first (before you speak)

If you’ve just found out - whether your child has told you, or you’ve discovered something - your first job is to regulate your reaction. Shock, anger, disbelief and overwhelm are normal, but your child will often be watching your response closely.

Quick reset: try ‘square breathing’ - breathe in for 4, hold for 4, out for 4, hold for 4.

 

Step 2: Work out which situation you’re in - disclosure or discovery

The first few minutes matter, and what you do next depends on how you found out.

 If this is a disclosure (your child has told you)

Your aim: keep the door open and make it safe to keep talking.

  • Give them your full attention (even if it’s an inconvenient moment).
  • Lead with support, not questions. Examples include:
    • “Thank you for telling me - I know that must have been difficult.”
    • “You’ve done the right thing telling me.”
    • “You’re not in trouble. It’s my job to keep you safe.”
  • Avoid 'why' questions - they can feel like an accusation and increase shame.
  • If AI manipulation is mentioned, avoid describing the images as ‘fake’ or ‘not real’, as this diminishes the impact. Refer to them as AI-generated instead.

 

If this is a discovery (you found something or suspect something)

Your aim: avoid a confrontational moment that shuts communication down.

  • Choose a time and place that reduces defensiveness (a walk or car journey can help).
  • Keep your tone calm and your body language soft (sit at their level if you can).
  • Start with gentle, open prompts:
    • “I’ve noticed something that’s made me worried. I want to understand what’s been happening.”
    • “This may be uncomfortable, and I want you to know that I’m on your side - I’m here to help.”
  • Avoid launching into accusations, punishments or ultimatums.

 

Step 3: Lead with support (not an interrogation)

Whichever situation you’re in, your child’s ability to accept help often depends on whether they feel believed and safe.

  • Listen calmly and let them speak in their own words.
  • Keep reassurance simple: “I’m glad you told me. We can take this step by step.”
  • Don’t push for detail; you don’t need “the full picture” to start supporting them.

 

Step 4: Think carefully before checking devices

It’s natural to want to check their phone immediately - but pause and ask yourself: Do I know what I’ll do if I find something?

Preferred approach: ask your child to show you, keeping them in control of the process. You may not learn everything, but you protect the relationship and keep future help-seeking open.

 

Step 5: Don’t share images - and keep key details safe

If images or videos are involved:

  • Don’t forward, share, or store them ‘for advice’.
  • If it’s safe, keep key details noted somewhere:
    • usernames / platform
    • URLs / dates / times
    • any messages showing pressure or threats

If possible, encourage your child not to delete everything straight away - it can help with reporting.

 

Step 6: Get support and take action

It’s normal to feel conflicted about involving others, but some action will be needed for your child’s safety and recovery. You can take this one step at a time.

 

Who to tell (support around your child)

  • If peers are involved, contact the school and ask for the Designated Safeguarding Lead (DSL).
  • If you’re unsure what to do next, you can talk it through with Marie Collins Foundation (MCF): 01765 688827.

 

Where to report (formal routes)

 

Step 7: Keep daily life going alongside recovery

It’s understandable to want to clamp down hard, but blanket bans on tech or going out can be counterproductive in the long term.

  • Keep conversations open.
  • Focus on rebuilding safety and trust.
  • Keep normal family talk going (meals, plans, everyday chat). Normality supports recovery.

 

Step 8: Put support around you as well

You may feel guilt, anger, grief, or self-blame. It is not your fault and not your child’s fault - blame belongs with the abuser. Try to avoid making big decisions while emotions are at their peak, and consider speaking to a trusted friend or professional.
