The UK’s data protection regulator calls for urgent action to strengthen age assurance measures on social media and video-sharing platforms

On 12 March 2026, the UK’s data protection regulator, the Information Commissioner’s Office (ICO) (soon to become the Information Commission), published an open letter to social media and video-sharing platforms operating in the UK, calling on them to urgently strengthen their age assurance measures.

This comes as part of the ICO’s ongoing efforts to ensure that children under the age of 13 are not accessing services that are not designed for them. The ICO has also begun engaging directly with high-risk platforms, including TikTok, Snapchat, Instagram, Facebook, YouTube, and X (formerly Twitter), to assess their current age assurance practices. These companies have been asked to demonstrate their compliance with the ICO’s expectations within the next two months.

The issue

The digital age of consent under UK data protection laws is 13: if you process the personal data of a child under the age of 13, parental consent is required. The ICO’s call to action is part of its Children’s Code strategy, which aims to ensure platforms prioritise the safety and privacy of children. In its open letter to these platforms, the ICO highlighted that its Children’s Code strategy work identified the inadequacy of current practices, such as relying on self-declaration to verify users’ ages. This method is easily bypassed and exposes under-13s to risks, including the unlawful collection and use of their personal data without appropriate safeguards.

Background

The ICO’s Children’s Code is a statutory code which the ICO takes into account when considering whether an online service has complied with its obligations under UK data protection laws. The code can also be used in evidence in court proceedings, and the courts must take its provisions into account wherever relevant. Generally, if you do not conform to the standards in the code, you are likely to find it more difficult to demonstrate that your processing is fair and complies with UK data protection laws.

The Children’s Code applies to relevant information society services (ISS) which are likely to be accessed by children. An ISS is any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services. Age verification and parental consent should be compatible with the code’s approach to age-appropriate application: if you verify age and parental authority, you need to do so in a privacy-friendly way.

What does the ICO expect?

The ICO emphasises that modern, privacy-conscious age assurance technologies are now widely available and therefore, should be implemented without delay. Examples of such technologies include facial age estimation, digital identification, and one-time photo matching. These tools provide a more accurate and secure way to verify user ages while complying with UK data protection laws.

Most platforms in the UK already set a minimum age of 13 for users, but the ICO points out that failing to enforce this minimum age breaches UK data protection laws. Where social media and video sharing platforms allow under 13s to access their services, they generally have no legal basis for processing the personal data of these children under UK data protection laws without parental consent.

The ICO expects social media and video-sharing platforms to adopt robust age assurance measures to uphold their own terms of service and protect children. If your service is not suitable for children under a minimum age set out in your terms of service, the ICO states that you should prevent access by children under that minimum age by implementing an effective age gate. Such measures must comply with data protection principles, including being lawful, fair, proportionate, and secure, while also collecting the minimum necessary personal data.

Regulatory action

The ICO has made it clear that it will monitor industry practices and is prepared to take further regulatory action if necessary, such as reprimands and fines of up to £17.5 million or 4% of annual turnover for the previous year, whichever is greater. Recent enforcement actions, such as the fines issued to Reddit (£14.47 million) and MediaLab (owner of Imgur) (£247,590), underscore the ICO’s commitment to holding platforms accountable for failing to protect children’s personal data and allowing access to services which are not meant for them.

The ICO’s efforts to improve online safety are supported by its strategic collaboration with His Majesty’s Government (HMG) under a Memorandum of Understanding (MoU). The MoU, led by the Department for Science, Innovation and Technology and the Cabinet Office, formalises the ICO’s partnership with the government to protect personal data while enabling responsible innovation. The ICO has also highlighted the importance of robust age assurance standards through initiatives like the Age Check Certification Scheme (ACCS). This scheme tests and certifies age verification technologies such as biometric verification and age estimation software to ensure compliance with data protection and privacy standards.

What next?

The ICO recognises that protecting children online requires coordinated efforts across the regulatory landscape. It is working closely with Ofcom, which enforces the Online Safety Act, to address these challenges. A joint statement from the two regulators, outlining their coordinated approach to online safety and data protection, is expected in March 2026.

HMG is also consulting on children’s use of digital technology, including setting a minimum social media age, restricting risky features like autoplay, raising the digital age of consent, improving age verification, making mobile phone guidance in schools statutory, and offering clearer parental controls and guidance. This consultation closes on 26 May 2026.

The ICO is also concerned about how platforms process children’s data to generate recommendations, particularly when it leads to harmful or addictive content. Investigations into TikTok and Meta regarding their recommender systems are ongoing, demonstrating the ICO’s focus on ensuring that children’s personal data is used responsibly.

While this open letter currently applies only to social media and video-sharing platforms, it is anticipated that similarly robust age assurance measures will be expected of other platforms and services likely to be accessed by children but not meant for them: for example, online marketplaces, dating apps, diet and health technologies, and ticketing platforms for age-restricted events. Although these platforms are not designed for younger users, they may attract them, and they may soon be required to take proactive steps to prevent underage access by implementing effective safeguards rather than relying on self-declaration alone.

If you would like more information, please feel free to reach out to one of our dedicated data protection and interactive entertainment lawyers, or if you would like to keep up to date on the latest in data protection, please subscribe to our quarterly newsletter, The Data Download.

AUTHORS

Nadia Ahmed Senior Associate

Nadia is a senior associate specialising in data protection, privacy and information law.

She advises on compliance with data protection laws and information laws, including the UK and EU General Data Protection Regulation (GDPR), the Data Protection Act 2018, the Freedom of Information Act (FOIA) and codes of practice issued by the ICO and other data protection regulators.

She assists clients with data protection agreements/addendums (DPAs), data protection impact assessments (DPIAs), and drafting and reviewing privacy policies, cookie policies and cookie banners. Nadia also handles contentious data protection matters, such as communications with the ICO, personal data breaches and data subject requests, including data subject access requests (DSARs). She keeps clients informed of changes to data protection laws and updated guidance from data protection regulators, and provides training to legal teams and employees on data protection best practices. Nadia has also been seconded to help ensure that GDPR and information law procedures are effective and meet the necessary standards.

Nadia works with a wide range of clients, from small businesses to large corporations, to help them understand their legal obligations and develop data protection strategies and programmes for compliance with data protection laws. Such clients include those in the fashion and retail sector, streaming services, gaming, technology and more.

Nadia has completed the IAPP’s Certified Information Privacy Professional/Europe (CIPP/E) certification and is a member of the Society for Computers and Law.

Sacha Wilson Partner

Sacha is a commercial and regulatory lawyer with particular expertise in advertising, digital and data privacy. He is head of the firm’s cross-departmental advertising practice.

Sacha advises clients from a variety of sectors, including some of the world’s best known brands, agencies and platforms. He is ranked for advertising and digital media in both The Legal 500 and Chambers and Partners and is recognised as one of the UK’s leading advertising lawyers.

Sacha advises on a range of commercial transactions and has particular expertise in advertising-related agreements (such as creative agency, media planning and buying, production and brand partnerships). He is particularly well known for his expertise in digital marketing and adtech.

Sacha also has expertise in general advertising compliance (including prize promotions, native advertising and influencer marketing) as well as ecommerce and online consumer regulations.

Sacha also works within the firm’s retail and technology practices and regularly advises well-known retail brands on a range of retail-focused commercial agreements including distribution, licensing, and franchise agreements, as well as clients across a range of industries on tech focused agreements such as software development, SaaS, and IT services contracts.

In relation to data privacy, Sacha has advised on all the key compliance areas, and has worked with a large number of clients on their data protection compliance programmes. He has particular expertise in the data privacy aspects of marketing, adtech and digital media. He frequently advises on the compliance aspects of adtech vendor arrangements, programmatic advertising, and mobile apps.

Sacha also has expertise in the legal issues associated with AI, particularly in the context of advertising and marketing. He regularly advises clients on the privacy, IP, contractual and regulatory issues associated with the use and deployment of AI for a range of purposes in the advertising and marketing industries.

Kostyantyn Lobov Partner

Kostyantyn is a partner and co-head of the firm's interactive entertainment practice.

Kostyantyn advises on all aspects of intellectual property, advertising and regulatory issues. Being a litigator by background, he can advise at all stages of a dispute; from early-stage strategy and negotiations to litigation in the High Court and beyond. A significant part of his practice involves co-ordinating advice for projects spanning multiple jurisdictions. He also advises on the practical application of advertising codes and investigations by regulators such as the ASA, CMA and Trading Standards.

Kostyantyn works extensively with clients in the video game and esports sectors, including studios and publishers of all sizes. His wider client base includes brand owners, production companies and various members of the creative industries, tech startups, importers and distributors. They range in size from SMEs to multinational corporations with large in-house legal teams.

Kostyantyn is recognised as a leading lawyer in The Legal 500, IP Stars, and Chambers and Partners.

Sophie Lewis Associate

Sophie is an associate specialising in commercial and regulatory work in the firm's interactive entertainment practice.

Sophie advises video game studios, developers, publishers and platforms on digital regulation and compliance. Her practice covers consumer law, advertising regulation, AI law, age ratings, loot boxes, gambling regulation, virtual currencies, micro-transactions, subscription models and online safety. Sophie has direct experience dealing with regulators, including responding to ASA investigations and completing a six-month secondment in the Consumer Enforcement Team at the UK's Competition and Markets Authority. She often coordinates multi-country projects for clients expanding internationally, translating complex regulatory frameworks into practical compliance plans.

She also supports clients on a wide spectrum of commercial contracts, including development agreements, publishing agreements, IP licences, EULAs, terms and conditions, influencer agreements, talent contracts, subscription agreements, NDAs and other service agreements. Sophie has a keen personal interest in video gaming and is passionate about supporting this sector. She regularly provides training on games regulation and breaks down complicated compliance issues for a range of audiences, from developers to legal teams and business operations.

Sophie is recognised as a key lawyer and leading associate in Legal 500 for video games.