This comes as part of the ICO’s ongoing efforts to ensure that children under the age of 13 are not accessing services that are not designed for them. The ICO has also begun engaging directly with high-risk platforms, including TikTok, Snapchat, Instagram, Facebook, YouTube, and X (formerly Twitter), to assess their current age assurance practices. These companies have been asked to demonstrate their compliance with the ICO’s expectations within the next two months.
The issue
The digital age of consent under UK data protection law is 13: if you process the personal data of a child under the age of 13, parental consent is required. The ICO’s call to action is part of its Children’s Code strategy, which aims to ensure platforms prioritise the safety and privacy of children. In an open letter addressed to these platforms, the ICO highlighted that its Children’s Code strategy work identified the inadequacy of current practices, such as relying on self-declaration to verify users’ ages. This method is easily bypassed and exposes under 13s to risks, including the unlawful collection and use of their personal data without appropriate safeguards.
Background
The ICO’s Children’s Code is a statutory code. The ICO takes it into account when considering whether an online service has complied with its data protection obligations under UK data protection laws. The code can also be used in evidence in court proceedings, and the courts must take its provisions into account wherever relevant. Generally, if you don’t conform to the standards in the code, you are likely to find it more difficult to demonstrate that your processing is fair and complies with UK data protection laws.
The Children’s Code applies to relevant information society services (ISS) which are likely to be accessed by children. An ISS is any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services. Age verification and parental consent mechanisms should be compatible with the code’s approach to age-appropriate application. If you verify age and parental authority, you need to do so in a privacy-friendly way.
What does the ICO expect?
The ICO emphasises that modern, privacy-conscious age assurance technologies are now widely available and therefore, should be implemented without delay. Examples of such technologies include facial age estimation, digital identification, and one-time photo matching. These tools provide a more accurate and secure way to verify user ages while complying with UK data protection laws.
Most platforms in the UK already set a minimum age of 13 for users, but the ICO points out that failing to enforce this minimum age breaches UK data protection laws. Where social media and video sharing platforms allow under 13s to access their services, they generally have no legal basis for processing the personal data of these children under UK data protection laws without parental consent.
The ICO expects social media and video-sharing platforms to adopt robust age assurance measures to uphold their own terms of service and protect children. If your service is not suitable for children under a minimum age set out in your terms of service, the ICO states that you should prevent access by children under that minimum age by implementing an effective age gate. Such measures must comply with data protection principles, including being lawful, fair, proportionate, and secure, while also collecting the minimum necessary personal data.
Regulatory action
The ICO has made it clear that it will monitor industry practices and is prepared to take further regulatory action if necessary, such as reprimands and fines of up to £17.5m or 4% of annual turnover for the previous year, whichever is greater. Recent enforcement actions, such as fines issued to Reddit (£14.47 million) and MediaLab (owners of Imgur) (£247,590), underscore the ICO’s commitment to holding platforms accountable for failing to protect children’s personal data and for allowing access to services which are not meant for them.
The ICO’s efforts to improve online safety are supported by its strategic collaboration with His Majesty’s Government (HMG) under a Memorandum of Understanding (MoU). The MoU, led by the Department for Science, Innovation and Technology and the Cabinet Office, formalises the ICO’s partnership with the government to protect personal data while enabling responsible innovation. The ICO has also highlighted the importance of robust age assurance standards through initiatives like the Age Check Certification Scheme (ACCS). This scheme tests and certifies age verification technologies such as biometric verification and age estimation software to ensure compliance with data protection and privacy standards.
What next?
The ICO recognises that protecting children online requires coordinated efforts across the regulatory landscape. It is working closely with Ofcom, which enforces the Online Safety Act, to address these challenges. A joint statement from the two regulators, outlining their coordinated approach to online safety and data protection, is expected in March 2026.
HMG is also consulting on children’s use of digital technology, including setting a minimum social media age, restricting risky features like autoplay, raising the digital age of consent, improving age verification, making mobile phone guidance in schools statutory, and offering clearer parental controls and guidance. This consultation closes on 26 May 2026.
The ICO is also concerned about how platforms process children’s data to generate recommendations, particularly when it leads to harmful or addictive content. Investigations into TikTok and Meta regarding their recommender systems are ongoing, demonstrating the ICO’s focus on ensuring that children’s personal data is used responsibly.
While this open letter currently applies only to social media and video-sharing platforms, it is anticipated that similarly robust age assurance measures will be expected of other platforms and services likely to be accessed by children but not meant for them, such as online marketplaces, dating apps, diet and health technologies, and ticketing platforms for age-restricted events. These platforms, though not designed for younger users, may attract them, and may soon be required to take proactive steps to prevent underage access by implementing effective safeguards rather than relying on self-declaration alone.
If you would like more information, please feel free to reach out to one of our dedicated data protection and interactive entertainment lawyers, or if you would like to keep up to date on the latest in data protection, please subscribe to our quarterly newsletter, The Data Download.