As an organisation that supports victims of technology-assisted child sexual abuse, Marie Collins Foundation is furious that Meta has announced it is now rolling out end-to-end encryption as standard on Facebook Messenger and Instagram Direct, despite repeated concerns about the impact this will have on our ability to protect children and bring perpetrators to justice.
Marie Collins Foundation, along with other child protection organisations, has campaigned over the last two years to stop this from happening until appropriate safeguards are in place. By this we mean the ability for platforms to actively search for offenders who are distributing sexually explicit images of children, videos of children being sexually abused (sometimes to order) and instances of grooming, and the ability to report and remove this content. When we have challenged the steps they are taking to ‘protect’ as inadequate, we have not been listened to, nor have our concerns been taken seriously.
The information that Meta and other tech platforms obtained through scanning their platforms or receiving reports from users contributed to over 800 arrests of child sex abusers and the safeguarding of around 1,200 children each month, according to UK law enforcement. Meta was the biggest contributor. In 2022 there were 31.8 million reports of child sexual abuse material made to NCMEC; of these, 21.1 million came from Facebook and 5 million from Instagram. That Meta has worked in partnership with child protection agencies to safeguard children in the past is to be applauded. That it will no longer be doing so is horrifying.
The ability for law enforcement to identify these children when the images are reported is a vital safeguard. The ability for law enforcement to track and identify perpetrators and bring them to justice protects children. The ability to block and take down known images significantly aids recovery and reduces the likelihood of revictimisation of the children depicted in them. Every time an image is viewed, the child is abused again. Turning off the ability to detect child sexual abuse on Facebook Messenger, and later Instagram Direct, will have a huge negative impact on the health and wellbeing of children who have been victims of technology-assisted child sexual abuse.
Child sexual abuse is one of the least talked about and most silenced forms of abuse. Society silences the victims by turning a blind eye to what is being done to some children. Perpetrators silence children by telling them it is their fault and nobody will believe them. The shame and guilt that children carry is huge. When this abuse is recorded and shared, that shame and guilt is magnified. The thought of the image being out there for others to view causes the child intense anxiety and a feeling of being unsafe.
Meta is shamefully shirking their responsibility to protect children and placing it instead onto children themselves by relying on user reporting. How does a child report when they do not know that they are being groomed and abused? How does a young child report that images of their parent abusing them have just been shared? How does a child report when they are being silenced by threats?
Meta is giving those who wish to sexually harm children a safe place to hide. Meta will no longer be able to detect the abuse, and we know that encrypted platforms are the preferred place for perpetrators to engage in child sexual abuse activities, knowing they will not be caught.
“Why are we reducing our ability to find images of child sexual abuse, find those who are sharing them and stand up for the rights of children to be protected? Children should not be made responsible for protecting themselves. That is our responsibility. Society’s responsibility. Tech companies’ responsibility. Today is a sad day for those of us who care for our children. People should be outraged.” Victoria Green, CEO, Marie Collins Foundation.
“I am incredibly disappointed in Meta’s decision. For two years I have been speaking out about the impact of this plan on victims and survivors of technology-assisted child sexual abuse like myself. I am a supporter of strong privacy measures, but very much feel that my right to privacy is being ignored. I have no control over the distribution of the images created of my sexual abuse when I was 13. I have to rely on companies like Meta taking responsibility for detecting and removing them for me. They created the platforms that enable the sharing; surely, therefore, they have a responsibility to stop this abusive practice? They will no longer be able to do this, which is, frankly, devastating.” Rhiannon-Faye McDonald, Head of Advocacy, Marie Collins Foundation.