
The Issue

The Nature of Technology-Assisted Child Sexual Abuse (TACSA)


TACSA can take many forms, including the distribution of child sexual abuse material, sexual harassment, exposure to sexually explicit material, extortion, and grooming.

Abusers use various methods to engage children, such as through social media and gaming. Once contact occurs, the abuser can use a range of tactics to coerce or control the child.


Many victims talk about being ‘duped’ by their abuser, when the reality is that the abuser was highly skilled, using honed tactics to manipulate and control. The abuser may be a stranger or someone known to the child. It is important to remember that most child sexual abuse happens within the home, perpetrated by someone close to the child. Technology has made it easier for those who wish to target and use children for their own gratification by increasing access to victims and creating a false sense of safety that lowers inhibitions.

The online environment has also led to children being increasingly exposed to harmful content. At the Marie Collins Foundation we call this societal grooming. It has skewed the perception of what is ‘normal’, a phenomenon that not only makes it difficult for children to recognise when something is ‘not right’, but also provides abusers with the opportunity to appear to be behaving within the boundaries of what is acceptable and normal.

Scale of the problem


Understanding the true extent of child sexual abuse, in all its manifestations, is a perennial problem. It is the most hidden form of abuse of children and the least spoken about by many child victims. Most cases of child sexual abuse do not come to the attention of professionals. Sexual abuse occurs across all social classes, geographic areas, and ethnic and cultural groups. Victims are both boys and girls, and abuse can happen to any child regardless of age.

Sadly, the numbers relating to technology-assisted child sexual abuse can often feel overwhelming given the vast number of reports received globally. Although this is believed to be a significantly underreported crime, the growth in the production of abusive images has been exponential in recent years, exacerbated by the COVID-19 pandemic.


36 million reports to the National Center for Missing & Exploited Children (NCMEC) in 2023 relating to Child Sexual Abuse Material (CSAM), including a rise in reports of financial sextortion and the use of AI.

257% increase in reports of Child Sexual Abuse (CSA) involving sexual extortion, or 'sextortion', in the first six months of 2023 compared with the whole of 2022 (IWF, 2024)

1,058% increase in indecent and abusive images of 7–10-year-old children since the UK went into lockdown during the pandemic (IWF, 2023)

6,000% increase in online CSA in the EU in the last 10 years (Child Helpline International, 2023)

275,652 URLs (webpages) confirmed as containing CSAM. Each URL could contain one, ten, hundreds or thousands of individual images or videos (IWF Annual Report, 2023)

20,254 AI-generated images were found posted to one dark web CSAM forum in a one-month period (IWF, 2023)

1 in 4 children who had a potentially harmful experience online turned to no-one for support (Thorn, 2021)

Between 555,000 and 850,000 UK-based individuals pose varying degrees of sexual risk to children (NCA Strategic Threat Assessment, 2021)

The work of the Marie Collins Foundation has never been needed more, with reports such as the WeProtect Global Alliance’s Global Threat Assessment indicating that the sustained growth of online sexual abuse is outstripping our global capacity to respond (WeProtect Global Alliance, 2021).

It is important to remember that these are not just numbers. These are children whose abuse or images are being shared online, which further increases the likelihood of additional harm and trauma to that child.


The challenges


One of the myths regarding TACSA is that online child sexual abuse has less impact and is of less immediate concern than offline abuse. Research has shown this is not the case, with findings indicating that the consequences of TACSA are at least as severe and harmful as those of offline sexual abuse. Where images or videos of the abuse are created, the permanency and lack of control over who sees them leave significant and long-term impacts on victims and survivors. They are revictimised every time the material is viewed.

Professionals often lack the confidence, knowledge, and skills to recognise and respond effectively to risks or instances of TACSA. They feel children know more about technology than they do and therefore feel powerless to talk to children about this form of harm.

Child sexual abuse is silenced and denied by society. People prefer to think it does not happen, but it does. This makes the subject difficult to talk about and, as a result, difficult to obtain funding to support recovery services and to gain public support to improve the situation.

TACSA is often seen as the child’s fault for engaging with the abuser, which silences victims further. Prevention is better than cure, but to date prevention has not worked. That is why MCF is here: to assist and support recovery.


Evolving Technology


As technology evolves, so do the ways in which abusers can access and interact with children.

Virtual Reality

Sometimes referred to as the metaverse.   

This is defined in the Oxford English Dictionary as “a virtual-reality space in which users can interact with a computer-generated environment and other users.”  People can interact in real time through realistic avatars.  Whilst this creates opportunities for fun, it also brings opportunities for abuse.  The interactions involve an immersive virtual environment, which can seem very realistic. 

 

There have been reports of people being verbally and sexually harassed in this space, including ‘virtual rape’.  

‘Virtual reality’s focus on creating a simulated immersive experience may cause harassing behaviours to feel more realistic, and therefore potentially more traumatic’.
Freeman et al. 2022 (Disturbing the Peace: Experiencing and Mitigating Emerging Harassment in Social Virtual Reality)

Professionals were already struggling to work with children recovering from technology-assisted child sexual abuse before the emergence of VR. This development will have huge implications for our work going forward.


Deepfake


A deepfake is an image, video, sound, voice, or GIF which has been manipulated by a computer to superimpose someone’s face, body, or voice onto something else. Users without sophisticated technological skills can easily access, create, and distribute deepfakes. 
 
Deepfakes can be used to harass, bully, or abuse a victim. 

  • Bullying - used in cases of cyberbullying to deliberately mock, taunt, or inflict public embarrassment on victims.
  • Extortion and Exploitation - used to create incriminating, embarrassing, or suggestive material. Some are so good it can be difficult to distinguish between them and the real thing.
  • Revenge - used as retaliation or vengeance, typically associated with the end of a relationship or a refusal to enter a sexual relationship with the perpetrator.
  • Homophobic abuse - used to ‘out’ the person or as an attempt to ‘destroy their reputation.’ For young people struggling with their sexual orientation, being depicted in any sexualised deepfakes may be particularly distressing.
  • Image-Based Sexual Abuse - images of children harvested and used to generate sexualised deepfakes.  

 

We know that deepfake software can be used to digitally remove clothing from images of victims. In some cases, there are commercial services where users can pay to have images professionally manipulated.

“Deepfakes present unique challenges for victim identification. Technology can be used to obscure the face of the child in material depicting genuine abuse, making identification much harder. In other cases, the face of a different child might be superimposed onto the original image, meaning law enforcement waste time trying to identify the wrong child.” (INHOPE)


Conclusion


The number of TACSA cases reported or discovered is increasing at an alarming rate, as is the complexity of the cases we are seeing. Technology is developing in ways we could never have imagined, and with these developments come great new opportunities. However, they also bring new dangers. It is only through working together and listening to the voices of victims and survivors that we can ensure children’s safety from technology-assisted child sexual abuse.
