3. Why businesses must rethink their approach to young people’s data
Data law trends 2026
In brief
Governments around the world are accelerating efforts to regulate the digital experience of young people, from targeted age-gating laws to sprawling content regulations like the UK’s Online Safety Act (UK OSA). The global regulatory tapestry is increasingly complex. While there’s growing support for age-appropriate design codes (AADCs) and a wave of new policy proposals, we’re also seeing more deliberate divergence in legal approaches, and new attempts to apply existing laws to novel issues.
We predict this tapestry will only get more complicated before it gets simpler. Not only will current AADCs, age assurance rules and content moderation laws amplify compliance costs and enforcement risk for companies in the short term, but strategic competition between governments to set the agenda will drive even more divergence. Privacy and free speech concerns may provide some friction, but broad-based support for improving the online experience of children seems set to drive ongoing change.
Age assurance rules are expanding, but diverging
A global trend toward regulating minors’ access to online services and content through age assurance measures is accelerating. A key driver of complexity is that these laws differ in both focus and technical requirements.
The UK OSA requires providers to restrict children’s access to various services that allow certain ‘harmful’ content (e.g. pornography and content relating to suicide). In the US, there are similar efforts in various states to restrict children’s access to specific harmful content – although current efforts are largely focused on pornography.
Several US states have also enacted or proposed laws focused on service and feature access, for example requiring parental consent for minors to use social media services and restricting platforms’ use of algorithmic feeds. Australia has taken similar action, with its ‘social media ban’ for children set to come into force in December 2025.
In September 2025, the European Commission announced that an expert panel would deliver comprehensive advice on a social media ban for children under 16 by the end of the year. The European Commission president, Ursula von der Leyen, explicitly referenced Australia as a pioneer that Europe would be closely watching.
> The European Commission is also aiming for a harmonized approach to age assurance across the EU by providing a blueprint for an age verification solution that may voluntarily be adopted by EU Member States.
>
> Theresa Ehlen, Partner
Despite political momentum, important privacy, free speech and feasibility concerns remain – especially when biometric checks or digital IDs are involved. Jurisdictions are seeking to address these issues in different ways. The European Commission has recently funded a tender for the envisioned EU-wide age verification solution.
In parallel, several EU Member States have begun rolling out or testing national age verification tools, often tied to digital identity systems. In the UK, Ofcom and the Information Commissioner’s Office (ICO) are maintaining an ongoing dialogue and have issued guidance on their aligned approach. The UK government has also signaled an intention to actively enable innovation at the intersection of identity verification and privacy, through reforms to the UK Data Protection Act under the Data (Use and Access) Act, which establishes a comprehensive framework for digital verification services.
The US Supreme Court’s decision in Free Speech Coalition v Paxton is likely to embolden further legislation, as it confirmed a more permissive standard of constitutional review for age verification requirements governing access to pornographic material.
With these initiatives advancing, businesses – especially platforms for young people – face a rapidly evolving challenge: navigating fragmented age assurance laws that demand different platform designs, accessibility standards and infrastructures across jurisdictions.
Age-appropriate design codes – an increasingly popular policy tool
AADCs are emerging as a powerful policy instrument, setting clear requirements for how online services should handle young users’ data. Common features include requiring high privacy settings by default, greater transparency, age assurance mechanisms and restrictions on profiling and targeted advertising. However, the scope of protection, age thresholds and enforcement mechanisms differ significantly between schemes.
In the US, an increasing number of states are advancing their own AADCs, modeled in part on the UK ICO’s Children’s Code – which was recently put on a statutory footing. These efforts are encountering significant First Amendment challenges. Critics argue that such laws may infringe on free speech rights by requiring platforms to restrict or alter content based on user age, effectively compelling speech or imposing broad limits on lawful expression.
Beginning in 2023, the European Commission sought to use the voluntary codes of conduct mechanism under the EU’s Digital Services Act (EU DSA) to bring platforms behind AADCs. A special group began developing a new EU code of conduct on age-appropriate design (the ‘BIK+ Code’). As of this year, the European Commission appears focused on setting its own approach rather than following other countries. In its guidelines on the protection of minors, the Commission set out its interpretation of Article 28 EU DSA, including requirements addressing engaging design features and safeguards for AI chatbots integrated into online platforms.
Elsewhere, countries are using the AADC concept to inform local approaches. Australia is developing a legally binding Children’s Online Privacy Code modeled directly on the UK framework. In contrast, Singapore has strengthened existing protections by interpreting its data privacy laws through advisory guidelines for children’s data.
India, home to the world’s largest youth population, is expected to include specific and stringent rules on the processing of children’s data in its draft Digital India Act, although the draft is currently on hold.
While current free speech challenges in the US may temporarily slow momentum, a broader global push may ultimately establish de facto global standards, reducing the practical significance of any individual laws that do not come into force.
Expansive content regulations are likewise focused on children’s safety
Alongside rules for age assurance and design, a parallel trend sees governments implementing expansive content moderation regimes, with children’s safety often cited as the central justification for broad new duties.
> Frameworks like the UK OSA and the EU DSA create comprehensive new obligations for online services, but their differing approaches introduce another layer of regulatory fragmentation.
>
> Rachael Annear, Partner
The UK OSA is arguably the most prescriptive. It requires services with a UK user base to have systems and processes in place to reduce illegal content and, crucially, material deemed ‘harmful to children.’ This creates stringent obligations on platforms to conduct robust risk assessments and adopt measures such as notice-and-takedown frameworks and, in certain cases, automated content moderation tools.
The EU DSA takes a different, though equally comprehensive, approach. Rather than defining specific categories of ‘harmful’ content, it focuses on process and systemic risk. Platforms must swiftly remove illegal content once identified and, under Article 28(1) EU DSA, implement special protection measures for minors, including a ban on targeted advertising based on their data.
Recent guidelines from the European Commission on how services should approach the protection of minors under Article 28(1) have added significantly more depth to compliance expectations. These efforts are likely to be reinforced by renewed initiatives to pass new EU laws targeting child sexual abuse material.
As enforcement under these laws ramps up – and as other jurisdictions study the UK OSA and EU DSA as potential models – businesses face the likelihood of yet another layer of regulatory divergence.
Current enforcement trends paint a complex picture
Across Europe and the US, enforcement of child safety and privacy laws has predictably ramped up as more laws have come into force, regulators have received boosted funding, and the public and lawmakers have pressed more aggressively for action.
In the EU and UK, while a number of large services have implemented new or upgraded age assurance measures, child-focused regulatory actions have increased sharply. The Irish Data Protection Commission issued billions of euros in fines between 2022 and 2024, with a notable uptick in cases involving minors.
Italy’s Garante temporarily blocked access to a well-known AI service in 2023, in part due to concerns about the platform’s lack of age verification. Similarly, Coimisiún na Meán (CnaM), Ireland’s media regulator, recently opened an investigation into X for allegedly failing to apply age assurance under the Irish Online Safety Code in relation to pornographic material on the platform. This aligns with the broader strategy CnaM announced in April 2025, which placed issues affecting children at the center of its regulatory agenda for the coming years.
In the US, enforcement is more fragmented, involving actions by federal and state regulators as well as private litigants. In early 2025, the Federal Trade Commission reached a US$20m settlement with Cognosphere, LLC over allegations that its mobile game was deceptive and failed to obtain required parental consent for minors’ use. Last year, the Texas Attorney General launched a data privacy enforcement initiative, and his office has since announced investigations into multiple tech platforms related to children’s data. Multiple private class actions against TikTok and its parent company over children’s use of the platform have been consolidated into an ongoing multidistrict litigation in federal court in California.
> In the future, enforcement will be shaped by overlapping laws – privacy, consumer protection and content moderation – creating legal complexity.
>
> Theresa Ehlen, Partner
A common challenge in enforcement will be evidentiary – particularly proving causation between platform design and harm to minors when alleged harms are primarily psychological. The ongoing ramp-up in enforcement warrants close attention to how regulators and private plaintiffs apply new and existing regulations.
Looking ahead
As we move forward, we anticipate that more jurisdictions will introduce laws aimed at regulating children’s online experiences. As these regulations evolve, we expect:
- Escalation of enforcement action – regulators, fueled by increased funding and public demand, will increasingly use a combination of privacy, consumer protection and content moderation laws to impose stricter operational requirements on platforms and issue larger penalties.
- Divergence in jurisdictional approaches – regions are pursuing individual courses of action, with policy goals at times prioritized over the creation of a unified international standard.
- Heightened compliance costs – companies, particularly those with younger audiences, will face pressure to invest in age assurance infrastructures and adapt platform designs to meet diverging legal requirements across jurisdictions.
The regulation of children’s digital experience is shifting from a patchwork of isolated efforts to a more systematic, yet highly divergent, global framework. While the shared goal is child safety, the way it is being implemented is creating jurisdictional and legal conflicts. Much of the burden of navigating this complexity falls on companies – service providers must proactively embed robust compliance and assurance frameworks directly into product development and design from the outset. Within this fragmented ecosystem, a strategic rather than reactive approach to child safety is no longer optional – it is paramount.