Blog
White Label Consultancy | 6th February 2026
Child Digital Safety Meets Data Protection: The UAE’s New Child Digital Safety Law
Introduction
The UAE has recently issued Federal Decree-Law No. 26 of 2025 on Child Digital Safety (the “CDS Law”), a significant step towards enhancing children’s digital safety and privacy rights. Notably, the law places children’s privacy at the forefront of digital regulation, treating the handling of minors’ personal data as a core risk area rather than a secondary compliance issue.
The law’s introduction is particularly timely, reflecting growing global concern about children’s exposure to privacy and cybersecurity risks not only through social media, but also across emerging AI-enabled environments such as immersive virtual games, metaverse platforms, and even connected toys. Children’s digital interactions expose them to heightened privacy-related risks, such as excessive data collection, behavioural profiling, addictive usage patterns, and sometimes unintended exposure to harmful material. This is because their behavioural data is frequently used to personalise content and optimise engagement, often without transparent explanation or any clear understanding of how their data influences automated systems.
The CDS Law therefore reflects a broader recognition by regulators worldwide that the processing of children’s personal data should require enhanced safeguards and privacy-by-design controls to reflect the heightened privacy and digital safety risks children face.
Scope, Application & Wider Policy Framework
The CDS Law, which came into force on 1 January 2026, provides a one-year transition period, with full compliance expected by January 2027. The law sets a structured framework to protect children from privacy intrusions, exploitation, and exposure to harmful or inappropriate digital content. “Child” is defined under the law as any individual under 18 years of age. It applies to digital platforms and internet service providers (ISPs) that operate in, or are directed at, users in the UAE. Effectively, both local and foreign technology platforms fall within the CDS Law’s scope so long as children in the UAE can access them.
The CDS Law is part of a broader legislative focus on child protection that complements other UAE laws, such as the Child Rights Law (Wadeema’s Law), which establishes that children have a general right to privacy. It also sits alongside the UAE’s Federal Personal Data Protection Law (PDPL), which established a comprehensive, GDPR-style privacy regime for personal data processing. While the PDPL provides a general data protection framework, it does not contain detailed child-specific privacy provisions. The CDS Law therefore fills an important regulatory gap by bringing children’s data protection and digital risk squarely into focus within the UAE’s evolving privacy landscape.
Privacy Focused Obligations Under the CDS Law
A central feature of the CDS Law is its dedicated privacy and data protection framework, enshrined in Articles 7 and 10. Notably, the law imposes restrictions on processing children’s data, particularly for children under 13. Platform providers are generally prohibited from collecting, processing, publishing, or sharing such children’s personal data unless they obtain explicit, documented, and verifiable parental consent; provide an accessible mechanism for withdrawing that consent; and clearly disclose their personal data processing practices to both the child and their caregiver in a privacy policy (Article 7). That policy must be presented in a clear, age-appropriate, and easily understandable manner, reinforcing transparency obligations towards both children and guardians.

Moreover, the law emphasises privacy-by-default settings for children’s accounts, restricting access to children’s data to authorised personnel and limiting it to what is necessary to provide the service (Article 10).
The law largely restricts, and in some cases prohibits, the commercial exploitation of children’s data, including profiling and targeted advertising beyond permitted purposes. Under Article 7, where a child is under 13, digital platforms must refrain entirely from using the child’s data for commercial purposes, such as serving targeted electronic advertising to the child or tracking the activity of the child’s personal accounts for purposes exceeding the originally authorised purpose. Platforms must also provide parental control tools, enabling parents to set time limits and usage restrictions, alongside general controls on targeted advertising for all children under 18 (Articles 7 and 10).

Collectively, these provisions reflect international best practices in child data protection and underscore the importance of privacy-by-design and risk-based governance for digital services accessible to minors.
Implications for Digital Platforms
The CDS Law brings children’s online safety and privacy to the forefront of compliance considerations for digital platforms. Digital platforms and ISPs operating in, or directed at users in, the UAE must demonstrate that children’s safety and privacy are embedded within their software, enshrining the principle of privacy by design and by default. As a starting point, they should configure default and initial privacy settings that ensure the highest level of protection for children’s accounts.
Organisations that collect or process children’s personal data must demonstrate that child safety is integrated into their technical and organisational measures and system architecture. This includes implementing effective and proportionate age-verification or age-assurance mechanisms based on age group, and ensuring that platform features, privacy settings, and user experiences adapt when a user is identified as a child. Platforms should map child-related data flows, reassess profiling and recommendation systems, and integrate child safety considerations into DPIAs, product reviews, and vendor onboarding.
AI features designed to drive engagement or behavioural advertising towards children warrant particular scrutiny, especially where they may contribute to excessive use or harmful exposure. Article 10 also requires age-based access controls and feature restrictions, including mechanisms to limit or disable functionalities that may encourage excessive interaction or prolonged engagement by children.
Conclusion
For businesses, child digital safety is no longer a peripheral compliance consideration. Rather, it has become a core digital risk and governance priority, with direct implications for product strategy, AI deployment, data monetisation models, and cross-border operations.
The UAE’s Child Digital Safety Law reflects a wider global shift in regulatory thinking: child protection, data protection, and AI governance are increasingly intertwined. For digital platforms, children’s data can no longer be treated as a standard user segment. Instead, it requires enhanced privacy safeguards, safety-by-design architecture, and ongoing, risk-based oversight.
Organisations that proactively embed child-centric privacy controls and robust governance frameworks into their digital services will be better positioned to manage regulatory exposure, maintain user trust, and operate responsibly in an environment where expectations around children’s digital rights and protections continue to intensify.