Marie Collins Foundation Analysis
The Australian Government has announced a significant new measure: from today, 10 December 2025, major social media platforms must prevent children under 16 from holding accounts. The duty falls squarely on tech companies - backed by substantial penalties - not on children or their parents.
The instinct behind the legislation is understandable. Policymakers and parents want stronger protections from harmful content, addictive design, and technology-assisted child sexual abuse (TACSA). The Marie Collins Foundation shares that ambition and does not oppose age limits in principle.
However, from a TACSA perspective, the implications of a blanket 16+ threshold are more complex than they first appear. Existing evidence and frontline experience suggest that restricting children’s access to one digital environment does not remove the risk of online sexual exploitation; instead, risk is likely to shift into less visible, less regulated spaces where children are harder to safeguard and support.
MCF’s position remains: children deserve digital environments that are safe by design, not environments that become safer only because children of a certain age are excluded. Our concern is that, without wider safety-by-design reforms and accessible routes to support, an under-16 ban on social media is unlikely to reduce TACSA and may create new risks. Below, we outline five key considerations for evaluating the under-16 restriction through a TACSA lens.
1. What the Australian law does - and does not do
The law introduces a Social Media Minimum Age, requiring platforms to take ‘reasonable steps’ to prevent under-16s from creating or maintaining accounts.
Key features:
- Children and parents are not penalised.
- The legal responsibility lies with platforms.
- This is not a ban on smartphones or wider internet access.
- Several services (e.g., gaming and messaging platforms) may initially fall outside scope.
The policy goal is clear: to reduce children's exposure to harmful content and to the design features that facilitate exploitation. The challenge is whether excluding children can provide the same protection as redesigning the platforms themselves.
2. TACSA risk is displaced, not removed
TACSA does not occur only on social media. Research and casework consistently show that:
- Grooming and exploitation occur across multiple digital environments, including messaging apps, games, livestreaming platforms, and private channels.
- Offenders follow children into whichever environments allow contact.
- When one space becomes harder to access, children often move to less visible, less regulated, and higher-risk environments.
A ban may change where abuse occurs, but not whether it occurs. This mirrors findings from contextual safeguarding: restricting access to one environment often increases vulnerability in another, particularly where oversight is weaker.
3. Hidden use and circumvention increase safeguarding challenges
Age-gating on platforms is frequently bypassed, and early reporting on the Australian ban already shows that under-16s can work around the new age checks in several ways. Experts have noted that children may use VPNs, fake IDs, or appearance changes, or may ask an older person to complete facial age-scans on their behalf. There are already reports of under-16s being incorrectly verified as over 16, as well as reports that children remain active simply because certain platforms have not yet fully implemented the new restrictions.
Under a stricter 16+ regime, it is therefore likely that some children will still access social media, but in less visible ways:
- using accounts registered with an adult age
- using accounts belonging to older siblings or peers, or - in grooming contexts - accounts created or controlled by offenders who “help” them get online
- avoiding safety tools or reporting mechanisms for fear that using them would expose that they should not be on the platform
These patterns reduce visibility, increase isolation, and make intervention significantly harder - particularly for younger children already experiencing sexual coercion or extortion.
4. The TACSA disclosure gap: why routes into support matter
Non-disclosure is one of the most significant challenges in TACSA. Many children do not recognise what has happened to them as abuse, feel unable to tell an adult, or fear repercussions from a perpetrator.
Evidence shows that:
- Children rarely approach formal services first.
- Help-seeking often begins with small, exploratory steps.
- These steps frequently occur in mainstream digital spaces - viewing survivor-informed content, passively seeking information, or engaging anonymously.
- Such early digital touchpoints can be a bridge to recognising harm and accessing professional support.
If under-16s are excluded from major social platforms, it will be essential to ensure that alternative, accessible routes to information and support are available and clearly signposted. Without this, there is a risk that:
- early opportunities to recognise abuse may be reduced
- survivor-informed content and support signposting may become less accessible
- children who circumvent restrictions will do so in less safe spaces where safeguarding measures are limited
The new restrictions may also create an additional barrier to disclosure: if a child knows they are not supposed to be on a platform, they may be far less likely to seek help if something harmful happens there. In practice, this can also affect discovery. When abuse comes to light before a child has chosen to disclose - for example, through a message or image being found - the child may deny what has happened, minimise it, or say it was consensual, often because they fear getting into trouble or being disbelieved.
A system that increases fear of 'being caught' online therefore risks strengthening these trauma-based responses, making both disclosure and discovery more complex and less likely to lead to timely support.
5. The 'cliff-edge' problem: what happens when a child turns 16?
A further concern is the abrupt transition created by an under-16 prohibition. On their 16th birthday, young people would suddenly gain access to environments that:
- remain built around adult engagement models
- include high-risk features such as private messaging that allows instant contact from unknown accounts, rapid image sharing, and recommendation systems that can quickly surface more of the same harmful or sexualised content once it has been viewed
- continue to host grooming, coercion, extortion, and image-based abuse
- often lack effective mechanisms to detect or interrupt TACSA
Although the policy draws a line at 16, it is important to recognise that young people remain children until 18 in safeguarding and child-rights frameworks, and therefore still require environments designed with child safety in mind.
Children who have been excluded from these platforms may enter them without experience, preparation, or awareness of how these systems work.
A protective digital ecosystem would:
- ensure platforms are safe by design for all children
- give young people age-appropriate, supported opportunities to understand digital platforms before gaining full access
- avoid sudden exposure to high-risk features where offenders can initiate contact
Without this, turning 16 becomes a point of heightened vulnerability: young people enter complex, high-risk online environments in a single step, while platform design and offender behaviour remain unchanged.
What a more effective TACSA response looks like:
- Safety by design. Platforms must redesign high-risk features, including private messaging that allows instant contact from unknown accounts, rapid image sharing, algorithmic systems that repeatedly promote harmful or sexualised content once it has been viewed, and search functions that enable offender access.
- Proportionate, privacy-respecting age assurance. Useful as part of a broader safety system, but insufficient on its own.
- Strengthening digital literacy. Children, parents and professionals need skills to recognise grooming, coercion, image-based abuse and online manipulation.
- Embedding lived experience participation. Victims and survivors of TACSA must have a voice in shaping digital-safety policy.
Conclusion: Safety is achieved through system change, not exclusion
MCF understands why many parents welcome an under-16 ban. The desire to keep children safe online is universal and deeply felt. But from a TACSA standpoint, exclusion alone is unlikely to:
- stop abuse
- close the disclosure gap
- reach the most vulnerable children
Instead, exclusion risks displacing harm into less visible spaces, reducing access to early support, and weakening incentives for platforms to redesign the systems that facilitate abuse.
Children deserve digital environments that are safe because they are designed to be safe - not because children have been removed from them.