EU AI Act transparency obligations: latest developments and key requirements

A core set of requirements imposed by the EU AI Act (the Act) concerns transparency obligations for the AI systems that businesses provide or deploy.

The majority of the Act's obligations are expected to apply from 2 August 2026. The European Parliament, however, has agreed a proposal that would delay the obligations imposed in respect of high risk AI systems. The remaining provisions of the Act remain largely unaffected, and businesses should operate on that basis, noting that breaching these obligations can result in a fine of up to EUR 15 million or 3% of their total worldwide annual turnover for the preceding financial year (whichever is higher).

The Act raised a number of questions around how companies would comply with their transparency obligations. This led to the creation of a draft code of practice, the Code of Practice on Marking and Labelling of AI-generated Content (the Code), which integrates feedback from hundreds of participants and observers, including industry, academia and other stakeholders.

The Code of Practice on marking and labelling of AI-generated content

The second draft of the Code was published on 3 March 2026 and a final version is expected by June 2026. The Code remains subject to further amendment, but sets out four key requirements to demonstrate compliance (a simple marking sketch follows the list):

  1. multi-layered marking through metadata embedding, imperceptible watermarking, or fingerprinting/logging;
  2. a free interface or publicly available tool, offered by providers, enabling users and third parties to verify whether content is AI-generated;
  3. technical solutions for marking and detection that are effective and reliable; and
  4. continuous testing and improvement to keep pace with real-world developments.
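To make the first two requirements more concrete, here is a minimal, illustrative Python sketch of machine-readable marking and verification, assuming the Pillow imaging library is available. The metadata field names are hypothetical and the approach is deliberately simplistic; real deployments would be expected to combine standardised provenance metadata with imperceptible watermarking and fingerprinting/logging, as the Code contemplates.

```python
from PIL import Image
from PIL.PngImagePlugin import PngInfo


def mark_as_ai_generated(src_path: str, dst_path: str, generator: str) -> None:
    """Embed an illustrative, machine-readable marker in a PNG's text metadata."""
    image = Image.open(src_path)
    metadata = PngInfo()
    metadata.add_text("ai_generated", "true")      # hypothetical field name
    metadata.add_text("ai_generator", generator)   # e.g. the model or tool used
    image.save(dst_path, pnginfo=metadata)


def is_marked_ai_generated(path: str) -> bool:
    """Check whether the marker written above is present and set to 'true'."""
    image = Image.open(path)
    text_chunks = getattr(image, "text", {})       # only PNG files expose text chunks
    return text_chunks.get("ai_generated") == "true"
```

Metadata of this kind is easily stripped when content is re-encoded or shared, which is precisely why the draft Code pairs it with imperceptible watermarking, fingerprinting/logging and a publicly available verification interface.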

The transparency obligations

The Code is underpinned by the underlying transparency obligations in the Act.

The extent of these obligations depends on factors such as whether the AI system is classified as limited or high risk, and whether you are a provider or a deployer.

For limited risk AI systems:

If you are a provider

A ‘provider’ is a company, individual, public authority, agency or body that: (a) develops, or procures the development of, an AI system or general-purpose AI model; and (b) places it on the market or puts it into service under its own name or trademark. In other words, this applies to those who set out to create, or procure the creation of, an AI system.

Providers of limited risk AI systems must comply with three core transparency requirements.

  1. AI systems must be designed to inform individuals that they are engaging with an AI system;
  2. Providers must ensure that outputs are marked in a machine-readable format and are detectable as artificially generated or manipulated; and
  3. Technical solutions employed must be effective, interoperable, robust and reliable.

The question of how providers can satisfy these requirements has been a recurring area of discussion, prompting the European Commission to step in and provide guidance via the voluntary code of practice on the transparency of AI-generated content, which we discuss in further detail above.

If you are a deployer

In contrast, a ‘deployer’ is a company, individual, public authority, agency or body using an AI system under its authority, except where the AI system is used in a personal non-professional activity.

Given that deployers are effectively users with little to no control over the AI system, they are subject to far fewer disclosure requirements. The Act only imposes obligations on deployers of three specific types of AI systems:

  1. emotion recognition or biometric categorisation systems;
  2. deepfakes, where the system generates or manipulates image, audio or video content; or
  3. systems generating or manipulating text published to inform the public on matters of public interest.

For high risk AI systems:

If you are a provider

Unsurprisingly, the Act imposes the most obligations on this category. In general, these include requirements for providers to supply instructions for safe use and information about accuracy, robustness, and cybersecurity. Individuals overseeing such systems must be suitably qualified to understand the system’s capacities and limitations, and providers must maintain various recordkeeping and risk management protocols.

If you are a deployer

As above, deployers face fewer obligations than providers, but a broader set than for limited risk systems, reflecting the higher risk involved. These include the implementation of specific governance, monitoring, transparency and impact assessment requirements. The key obligations can be grouped under two headings:

Operational obligations

The deployer must take appropriate measures to ensure the high-risk AI system is used in accordance with its instructions for use, ensure that input data is relevant and sufficiently representative for the system’s intended purpose, and monitor the system’s operation so that it can inform the provider of any risks or serious incidents it identifies.

Control and risk management obligations

A deployer must conduct a fundamental rights impact assessment (FRIA) before deploying the system, assign human oversight to individuals with the necessary competence and training, regularly monitor the AI system for risks, and keep the logs automatically generated by the AI system for at least six months.

Future outlook

The trajectory is unmistakable: the Act positions transparency as a core principle, which will shape design choices, user interfaces and governance processes. Organisations will be expected to comply with the Code and the transparency obligations that underpin it.

Companies leveraging AI along their supply chain should therefore prioritise embedding and documenting transparency measures that can withstand both regulatory and legal scrutiny, while ensuring alignment with wider IP governance and strategic commercial decisions.

For more information on the EU AI Act and the Code and how they might impact your business, contact Sacha Wilson and Jacky Lai.

Unfair contract terms in consumer contracts: new draft guidance from the CMA

If you deal with consumers, then you need to know how consumer law applies to your contract terms and notices.

Ten years on from the introduction of the Consumer Rights Act 2015 (the CRA), the Competition and Markets Authority (the CMA) is revising its current guidance on unfair contract terms.

The draft guidance aims to be more accessible, helping businesses better understand and comply with the CRA. The consultation closed on 19 March 2026. Once finalised, the new guidance will replace the existing guidance on unfair contract terms.

Which terms are unfair?

Contract terms are unfair if they tilt the rights and responsibilities excessively in favour of the supplier. The law currently uses a ‘fairness test’ by looking at the words in the contract, taking into consideration what is being sold, how a term relates to other terms in the contract, and all the circumstances at the time the term was agreed.

Certain terms and notices giving rise to particular concerns are ‘blacklisted’ and deemed unsuitable for use with consumers. These include terms that exclude or restrict liability for death or personal injury resulting from negligence, or that exclude or restrict a consumer’s statutory rights and any associated remedies. Blacklisted terms are never enforceable against a consumer.

What are the key changes in the draft guidance?

Enhanced CMA enforcement powers under the DMCC:

The updated guidance integrates the Digital Markets, Competition and Consumers Act 2024 (the DMCC), enabling the CMA to impose penalties directly, without going to court, on businesses that use prohibited, non-transparent or unfair terms or notices. Fines may be up to 10% of a company’s global turnover or £300,000 (whichever is higher).

Transparency – more than words:

Transparency now covers not just the content itself but also its presentation, requiring clear fonts and headings that follow a logical structure, supported by explanations of terms which may be complex or challenging to understand.

Fairness and consumer behaviour:

The requirement of ‘good faith’ should include a behavioural dimension. Suppliers must consider consumer psychology and avoid exploiting consumer biases – for instance, consumers’ tendency not to read standard terms thoroughly, or to underestimate future costs such as renewal or termination fees. Campaigns emphasising quick benefits, such as a free trial, while using tactics to minimise attention to future costs will face greater scrutiny. Automatic renewal of subscriptions is also specifically noted as an area of concern, with the DMCC’s new subscription provisions (to enter into force no later than August 2026) adding further obligations.

The role of advertising:

Advertising is explicitly incorporated into the fairness assessment, requiring consistency between terms and marketing claims. Small print that removes or curtails more prominent claims, failure to highlight key terms during the marketing process, or inconsistency between marketing claims and the contract terms could each give rise to an unfair commercial practice. Statements made by a supplier that a consumer is likely to see may also be treated as terms of the contract.

Exclusions and variations to the contract:

Vague language such as “liability is excluded so far as the law permits” will not remedy an unfair clause. Terms allowing a supplier to vary the contract, such as by changing the description or price of the goods or services, may now be deemed unfair if they are overly wide in scope or could result in changes the customer would not expect.

What are the key takeaways for consumer businesses?

The draft guidance makes clear that unfair, onerous or significantly unbalanced terms will be closely scrutinised. Suppliers should ensure that lines of communication with customers are clear, transparent and easy to understand.

Contract terms should similarly be reviewed to make sure that they strike a reasonable balance and do not prejudice consumers, including by providing reasonable protections around cancellation and refund rights.

For more information on how the new guidance will impact your consumer contracts, contact Sacha Wilson and Jacky Lai.

UK direct marketing laws made easier for charities

The UK’s new Data (Use and Access) Act 2025 will change the direct marketing laws to make it easier for charities to send electronic marketing, without express consent, to existing supporters and to people who have expressed an interest in the charity.

This is referred to as the “soft opt-in” rule, which is currently relied on by many commercial businesses and will be amended to extend its scope to charities.

How can charities rely on soft opt-in?

Charities will be able to send electronic marketing, such as emails, text messages or direct messages on social media, without a person’s consent, provided that the following conditions are met (a simple eligibility check is sketched after this list):

  • The sole purpose of electronic marketing is to further the charity’s own charitable purpose(s)
  • The charity collected the contact details directly from the person themselves
  • The charity collected the contact details when a person:
    • expressed an interest in one or more of the charitable purposes; or
    • offered or provided support to further one or more of those purposes
  • People are given a simple and free of charge way of opting out of direct marketing at the time of:
    • collecting their contact details; and
    • every subsequent direct marketing message thereafter
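By way of illustration only, the Python sketch below shows how these conditions might be checked in a fundraising database before a marketing message is sent. The data model and field names are hypothetical and simplified; it is not a complete compliance check.

```python
from dataclasses import dataclass


@dataclass
class Contact:
    # Hypothetical CRM fields recording how and when the contact details were collected.
    collected_directly_by_charity: bool    # details came from the person themselves
    expressed_interest_or_supported: bool  # interest in, or support for, a charitable purpose
    offered_opt_out_at_collection: bool    # free, simple opt-out offered when details were collected
    has_opted_out: bool                    # the person has since opted out


def can_rely_on_soft_opt_in(contact: Contact, furthers_charitable_purpose: bool) -> bool:
    """Return True only if every soft opt-in condition listed above appears to be met."""
    return (
        furthers_charitable_purpose
        and contact.collected_directly_by_charity
        and contact.expressed_interest_or_supported
        and contact.offered_opt_out_at_collection
        and not contact.has_opted_out
    )
```

Every message actually sent in reliance on the rule must, in addition, carry its own simple and free opt-out mechanism.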

How can charities start to rely on soft opt-in?

The UK’s data protection regulator, the Information Commissioner, has stated that this change allowing charities to rely on the “soft opt-in” rule is planned to commence from January 2026.

What is the latest from the UK regulators on soft opt-in?

The Information Commissioner has produced draft guidance and launched a consultation on the new rules aiming to gather feedback from charities. The consultation runs from 16 October to 27 November 2025 and details can be found here.

What can charities do now to prepare?

  • Review your privacy policy to inform people of the reliance on “soft opt-in”
  • Review your consent mechanisms and plan the changes needed to rely on “soft opt-in”
  • Review your current opt-out mechanism and plan the changes needed to rely on “soft opt-in”
  • Ensure you have a do not contact list of people who have opted out of receiving direct marketing
  • Review existing marketing lists to separate people who have given their consent to electronic marketing from people who will receive it under the “soft opt-in” rule
  • Train staff on how to respond to queries and complaints from people about the direct marketing
  • Implement policies and procedures to ensure staff know how to implement “soft opt-in” and the rules around data protection

The Data (Use and Access) Act receives Royal Assent, bringing change to the UK’s data protection regime

On 19 June 2025, the UK’s Data (Use and Access) Act 2025 (the “DUA Act”) received Royal Assent.

This new legislation updates the UK’s current data protection regime, which comprises the UK General Data Protection Regulation (the “UK GDPR”), the Data Protection Act 2018 and the Privacy and Electronic Communications Regulations (the “PECR”). The DUA Act will come into force in phases, expected to commence at two, six and twelve months after Royal Assent, giving you time to implement the necessary data protection related changes to your organisation.

What does the DUA Act change and how does it impact organisations?

New ‘recognised legitimate interests’ lawful basis: when you use personal data for legitimate interests, you need to balance the impact on the people whose personal data you use against the benefits arising from that use – this is commonly done by way of a legitimate interest assessment (“LIA”). However, the DUA Act now includes a list of recognised legitimate interests, meaning that for those interests you do not need to complete an LIA. This list will be in Schedule 4 to the DUA Act, which inserts a new annex into the UK GDPR, and includes interests such as:

  1. Sharing personal data if a public authority confirms it’s needed for their public task
  2. Using personal data to safeguard national security, public security, or defence
  3. Using personal data to respond to emergencies under the Civil Contingencies Act 2004
  4. Using personal data to detect, investigate, prevent crime, or prosecute offenders
  5. Using personal data to protect vulnerable individuals from physical, mental and emotional harm or neglect and support their well-being

A new ‘assumption of compatibility’: under the purpose limitation principle, if you re-use personal data you have already collected for a different purpose, you must ensure the new purpose is compatible with the purpose you initially collected it for. However, the DUA Act now includes a list of reuses of personal data that are assumed compatible with the original purpose. This list will be in Schedule 5 to the DUA Act which inserts a new annex to the UK GDPR. You can reuse previously consented personal data for a new purpose if necessary for one of the reasons below, but only if it’s not reasonable to obtain fresh consent, such as using personal data to:

  1. assess or collect taxes or duties; or
  2. comply with legal requirements.

‘Soft opt-in’ for charities: if you’re a charity, the DUA Act allows you to send electronic mail and SMS marketing to people whose personal data you collect when they support, or offer support or express an interest in, your work – provided you offered them a chance to opt out when you collected their personal data and you provide a chance to opt out in every electronic communication thereafter.

New cookie exemptions: the DUA Act allows you to set some types of cookies without having to get consent. Currently, you must get consent for all non-strictly necessary cookies. The list of exemptions will be in Schedule 12 to the DUA Act, which inserts a new schedule into PECR, so you won’t need consent where the cookie or similar technology is used for one of the following purposes (a simple decision helper is sketched after this list):

  1. the sole purpose of carrying out transmission of a communication over an electronic communications network;
  2. the non-exhaustive examples of strictly necessary purposes listed in the schedule, including security, fraud prevention, fault detection and authentication;
  3. the sole purpose of enabling a service provider to collect information for statistical purposes about how their online service is used;
  4. the sole purpose of enabling a service to adapt its appearance or functions in accordance with someone’s preferences; and
  5. the sole purpose of working out the subscriber or user’s geographical location when they request emergency assistance.
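As a rough illustration of how these exemptions might be reflected in a consent management configuration, the Python sketch below maps a cookie's declared purpose to whether consent is still required. The purpose labels are hypothetical shorthand for the Schedule 12 categories listed above, not statutory wording.

```python
# Hypothetical shorthand for the exempt purposes listed above (Schedule 12 to the DUA Act).
EXEMPT_PURPOSES = {
    "transmission",        # solely to carry out transmission of a communication
    "strictly_necessary",  # e.g. security, fraud prevention, fault detection, authentication
    "statistics",          # first-party statistics about how the online service is used
    "appearance",          # adapting appearance or functions to the user's preferences
    "emergency_location",  # locating a user who requests emergency assistance
}


def consent_required(purpose: str) -> bool:
    """Consent is still needed for any purpose outside the exempt list (e.g. advertising)."""
    return purpose not in EXEMPT_PURPOSES


print(consent_required("statistics"))   # False under the assumed new exemption
print(consent_required("advertising"))  # True: advertising cookies still need consent
```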

Reasonable and proportionate searches under data subject access requests (DSARs): the DUA Act makes it clear that you only have to make reasonable and proportionate searches when someone asks for access to their personal data.

Complaints: individuals have certain rights in relation to their personal data, such as the rights to be informed, of access, to object, and to erasure, restriction and rectification. The DUA Act introduces a right for people to complain to organisations and competent authorities if they think that their personal information has been used in a way that doesn’t comply with the law. This is similar to the complaints procedure for Freedom of Information requests. It places an obligation on organisations and competent authorities to:

  1. help people to make complaints, requiring them to take steps such as providing an electronic complaints form; and
  2. acknowledge complaints within 30 days and advise the complainant of the outcome without undue delay.

They must also take appropriate steps in the meantime, such as making enquiries into the subject matter of the complaint and keeping the complainant informed about progress.

Using personal data for scientific research: the DUA Act makes it clearer when you can use personal data for the purposes of scientific research, including commercial scientific research. It makes the following clarifications:

  1. People can give “broad consent” to an “area of scientific research” rather than “specific” consent – as long as: the exact purpose was unknown at the time of consent, the consent aligns with recognised ethical standards for the research area, and individuals are given the option to consent to only part of the processing.
  2. You can re-use people’s personal data for scientific research without giving them a privacy notice, if that would involve a disproportionate effort, so long as you protect their rights in other ways and still explain what you’re doing by publishing the notice on your website.  

Automated decision making – personal data: previously, decisions based solely on automated processing of personal data were restricted unless they were necessary for a contract between you and the individual, permitted by UK law or done with consent from the individual. Now, the DUA Act removes this restriction and allows an organisation to make solely automated decisions in a wider range of situations as long as it has appropriate safeguards in place – such safeguards include:

  1. providing the individual with information about the decision;
  2. allowing that person to make representations about the decision;
  3. enabling that person to obtain human intervention about the decision; and
  4. enabling that person to contest the decision.

There is no change to the restrictions around decisions based solely on automated processing of special categories of personal data – they are still restricted unless you have consent from the individual or the processing is necessary for reasons of substantial public interest under the Data Protection Act 2018.

International transfers: various changes have been made to help make transferring personal data internationally easier. For example:

  1. The protection standard for transferring data now requires that it “is not materially lower” than UK GDPR and Data Protection Act 2018 standards (previously, it required that “the protection of natural persons guaranteed by the UK GDPR is not undermined”). This is referred to as the data protection test.
  2. Schedule 7 of the DUA Act formalises the requirement for an organisation to do a transfer risk assessment for transfers subject to appropriate safeguards (such as standard contractual clauses). It does this by saying that an organisation must meet the data protection test “reasonably and proportionately”.

There are also some operational and terminology changes. For example, adequacy decisions are now called “transfers approved by regulations”; the Secretary of State must consider specific factors under the data protection test and carry out ongoing monitoring instead of a four-year review; the Secretary of State gains new powers to recognise and introduce other transfer mechanisms; and there are minor adjustments and restructuring to the existing transfer requirements.

PECR breaches and enforcement: there are changes to the rules under PECR, including:

  1. a change to the time period within which communications providers need to inform the ICO of a personal data breach, from “without undue delay” or within 24 hours, to “without undue delay and, where feasible, not later than 72 hours after having become aware of it”, aligning it with the UK GDPR requirement to report a personal data breach;
  2. removing the requirement to establish that a contravention under PECR has caused substantial damage and distress; and
  3. allowing the ICO to impose monetary penalties up to a maximum of £17.5m for certain failures to comply, aligning it with the UK GDPR monetary penalty cap.

Changes to the ICO: there are multiple changes around the structure and powers of the UK’s data protection regulator, the Information Commissioner’s Office, such as:

  1. The ICO can compel individuals working for or on behalf of organisations to attend interviews and answer questions if there is suspected non-compliance or an offence under data protection law.
  2. An extension to the time for the ICO to issue penalty notices after a notice of intent from six months to six months or as soon as reasonably practicable.

Are there any new compliance requirements you have to meet?

Yes:

  1. If you provide an online service that is likely to be used by children, the DUA Act explicitly requires you to take their needs into account when you decide how to use their personal information. You should already satisfy this requirement if you conform to the ICO’s Age Appropriate Design Code.
  2. If you don’t already do so, the DUA Act requires you to take steps to help people who want to make complaints about how you use their personal data, such as providing an electronic complaints form. You must acknowledge complaints within 30 days and respond to them ‘without undue delay’.

Next steps

  1. Familiarise yourself with the changes that the DUA Act makes to data protection laws.
  2. Map out how the DUA Act can make your organisation’s compliance with data protection laws easier – for example by drawing up “should do” and “must do” lists.
  3. Introduce a complaints escalation mechanism to allow individuals to complain to your organisation if they feel that the organisation has not complied with data protection laws.
  4. Implement a data protection compliance programme accordingly.

If you would like more information, please feel free to reach out to one of our dedicated data protection lawyers, or if you would like to keep up to date on the latest in data protection, please subscribe to our quarterly newsletter, The Data Download.

The UK’s Data (Use and Access) Bill passes as Lords concede on a push for AI transparency to protect creative industries

On 11 June, the House of Lords debated amendments to the Data (Use and Access) Bill (the Bill), marking the culmination of an extensive “ping-pong” process between the House of Lords and the House of Commons regarding protections for copyright holders in the context of artificial intelligence (AI).

What was the debate about?

  • The Government’s commitment to protecting copyright holders remains, but it argues that it cannot act prematurely without completing consultations on the issue. Emphasising the importance of transparency, enforcement and remuneration, it insisted on following due process, which includes analysing over 11,500 consultation responses and establishing technical and parliamentary working groups.
  • Several Lords, including Baroness Kidron and Lord Berkeley of Knighton, expressed frustration at the Government’s inaction. They argued that immediate transparency measures are needed to protect copyright holders from exploitation by AI companies. The creative sector fears that AI systems are using copyrighted works without consent or compensation, which could undermine the livelihoods of artists, writers, musicians and others.

What happened?

In an effort to ensure transparency and incentivise AI developers to comply with copyright law, Lord Berkeley of Knighton introduced a new amendment to the Bill requiring AI developers to disclose which copyrighted works they use for training and how they access them, unless a licence has been agreed with rights holders.

Lord Berkeley ultimately withdrew his amendment, citing a desire to maintain the dignity of the House and avoid further unnecessary divisions. However, he and others urged the Government to take the concerns of the creative industries seriously and act swiftly to address them.

What will happen next?

The Bill now awaits Royal Assent and, once in force, it will reform elements of the UK GDPR and the Privacy and Electronic Communications Regulations – from introducing a list of recognised legitimate interests to adding new exceptions to the consent requirements for cookies and similar technologies.

It should be noted that, while the UK’s adequacy decision from the EU allowing the free flow of personal data transfers has been extended to 27 December 2025, the Bill does introduce changes to the UK GDPR which ultimately lead to a departure from the EU GDPR. As such, we wait eagerly to see whether the EU decides that the UK’s data protection regime will continue to offer materially equivalent protections in order to maintain the free flow of transfers between the UK and EU.

If you would like more information, please feel free to reach out to one of our dedicated data protection lawyers, or if you would like to keep up to date on the latest in data protection, please subscribe to our quarterly newsletter, The Data Download.

Legal Directions podcast: career insights from Sacha Wilson and Lisa Lacuna

In the latest episode of the Legal Directions podcast, partner and head of our advertising group Sacha Wilson and senior people manager Lisa Lacuna introduce Harbottle & Lewis and discuss the work we do, what life is like at the firm and what aspiring trainees can do to maximise their chances of success when applying for training contracts.

Sacha gives an overview of his career, explaining his journey to partnership before providing an insight into advising clients in a few of our key sectors, including advertising and the wider media and entertainment industry. He outlines the type of work he and his team do, from contract negotiations for brands and advertising agencies to data protection compliance and regulatory projects, as well as the challenges and opportunities brought by emerging technologies such as NFTs and the legal issues that arise when working with AI or in areas like influencer marketing.

Lisa discusses the application process for training contracts, providing guidance for those looking to apply and sharing key tips to consider when invited to interview.

To find out more about applying for a training contract or what it’s like to work at Harbottle & Lewis, visit our Grad Hub.

The Legal Directions podcast is hosted by the University of Liverpool Law School and is aimed at students, paralegals and other aspiring legal professionals keen to gain insight, advice and inspiration from those working in the legal industry. You can listen to this episode here.

Reel trouble: the ICO reprimands Greater Manchester Police for CCTV failings

On 29 May 2025 the ICO reprimanded Greater Manchester Police (GMP) for failures in handling sensitive CCTV footage of a custody detainee, exposing gaps in data protection practices. The case highlights outdated policies, inadequate training, and procedural failings that led to missing footage.

Background

The data subject was held in custody at Pendleton Police Station for 48 hours in February 2021, during which time CCTV was in operation. GMP became aware of serious allegations made against officers via local media and requested that Pendleton Police Station retain the personal data of the data subject. This was beyond the documented retention period of 90 days; the procedures in place at the time allowed for retention for a period of up to six years.

During the process of retaining the personal data, the personal data should have been quality checked to ensure its security. GMP had received multiple Data Subject Access Requests (DSARs) from the individual concerned, and it was only when GMP came to comply with the request to release the captured footage that the footage was quality checked.

Following a technical issue, in which one of the discs containing some of the data would not initially play, it was established on 19 May 2022 that two hours of footage was missing from the personal data set originally retained in 2021.

On 23 August 2023, GMP stated that, despite all attempts, it was unable to recover the missing two hours of footage. This led GMP to self-report a personal data breach to the ICO on 5 September 2023.

Findings

Following an assessment of information provided by both the Independent Office for Police Conduct and GMP, which were conducting separate investigations with different scopes, the ICO identified two main failures leading to this lack of a quality check:

  • A misunderstanding at the time between staff, each believing that the other had conducted a quality check
  • A lack of any policies or guidelines at the time within GMP, identifying that quality checks were required, coupled with a lack of appointed responsibility for this task

Therefore, the ICO considers that the GMP failed to take the following actions:

  1. Provide the data subject with their personal data without undue delay and by the end of the applicable period of one month. This is because following the expiry of any exemptions in place to the right of access, GMP was not able to release all applicable personal data to the individual within the timeframe or to date. GMP did not provide the ICO with any evidence that it notified the data subject of any such extension.
  2. Ensure that appropriate technical or organisational measures were in place to protect against the accidental loss of the CCTV data it was processing in 2021. The ICO considers that, had GMP had an appropriate standard operating procedure (SOP) in place, with clearly defined and delegated responsibilities for quality checking any backed-up personal data, the risk of this breach would have been mitigated. GMP failed to deploy an adequate SOP designed to encompass the processing and retention of personal data beyond 90 days. The operating procedure that was in place had been developed in 2017 and had not been reviewed or amended since that time. In line with good practice, SOPs should be reviewed and updated, if necessary, once every 12 months.
  3. Conduct a data protection impact assessment (DPIA) in relation to their CCTV systems. A DPIA should have been conducted in compliance with section 64 of DPA 2018. A DPIA would have crucially assisted GMP in identifying shortfalls in their technical and organisational measures at the time.
  4. Provide GMP’s custody officer with data protection training, despite having a data protection training regime in place that was supposed to provide all staff members with data protection training during their induction.

There were issues with the CCTV system itself such as:

  • The CCTV system, in operation at the time, was only able to download captured footage for retention in half-hour or one-hour segments. This placed GMP staff at substantial risk of human error.
  • The CCTV system did not save the half-hour/one-hour segments in chronological order, resulting in it being difficult to identify if all required footage had been captured.
  • The CCTV system did not have any inbuilt alerts, identifying any errors that may have occurred during the back-up process.

Mitigating and remedial steps taken by GMP

The ICO took into account the following:

  • GMP, at the time of the breach, had a form of authorisation requirement in place. This required the signed authorisation of an officer of the rank of inspector or above to allow the appropriate team access to the footage recorded on the server (held for 90 days before automatic overwrite).
  • Any footage retained was stored by GMP in sealed evidence bags at the time. This ensured there was no break in the evidence chain, during the period the footage was held by GMP and Pendleton Police Station.
  • GMP proactively invested in its surveillance and security system infrastructure in 2023. This resulted in a significant upgrade to its system capabilities.
  • GMP has introduced a strictly regulated process to ensure that only authorised force personnel had access to the footage held within the CCTV server. Access was restricted to qualified officers within the criminal justice and custody branch of GMP.
  • GMP has informed the ICO of improvements to their security when managing DSARs from individuals. GMP advised that these requests are now administered centrally within their Information Access team. Where a DSAR is submitted, custody officers contact the relevant custody unit as soon as possible with urgent instructions as to how the footage is to be retained, so this is not overwritten. The footage is automatically uploaded to a dedicated local folder for DSARs. This location can only be accessed by authorised officers within the custody branch.
  • Auditing of footage has been vastly improved. This provides a comprehensive account of which officers have accessed the footage, copied it to disc or the location of the server, with date stamps.
  • GMP has already improved its SOP, which has undergone a complete rewrite. GMP will ensure that the new procedure is circulated across the force and is reviewed on an annual basis.

Action

Taking into account all the circumstances of this case, including the mitigating factors and remedial steps, the ICO decided to issue a reprimand to GMP. The ICO set out certain recommendations which do not form part of the reprimand and as such are not legally binding. Such recommendations include:

  • When formulating a replacement for the current processes, GMP should create an appropriate SOP, detailing how any retained personal data should be quality checked.
  • When developing the SOP, the roles and responsibilities for such checks should be clearly defined.
  • Under section 64 of the DPA 2018, GMP is required to have a DPIA in place for this processing. GMP should develop a DPIA for this processing without delay if they haven’t done so already.
  • GMP should deploy appropriate technical and administrative processes to monitor that all staff receive appropriate data protection training, which is refreshed at least every two years (recommended every year), in line with good practice. Staff should be trained and regularly refreshed on how to identify a personal data breach.
  • All breaches should be reported to GMP’s Information Access team/Data Protection Officer for assessment and documentation.
  • GMP should always keep a written record/assessment regarding their rationale not to inform the ICO of a breach.

Comment

While the ICO’s decision to reprimand, rather than fine, GMP reflects its Public Sector Approach – which avoids penalising taxpayer-funded organisations to prevent a “double hit” on victims and the public – this enforcement underscores the critical importance of protecting highly sensitive data, such as CCTV footage, where mishandling can lead not only to a data breach but a failure to respond to a data subject’s request. The key takeaway is to ensure measures are in place to comply with data protection laws in relation to CCTV such as access procedures, retention policies, security measures, staff training and data protection impact assessments.

If you would like more information, please feel free to reach out to one of our dedicated data protection lawyers, or if you would like to keep up to date on the latest in data protection, please subscribe to our quarterly newsletter, The Data Download.

Joint Controllers, TC Strings, and OpenRTB: Unpacking the Belgian Market Court’s Appeal Decision on IAB Europe’s TCF

On 14 May 2025, the Belgian Market Court (part of the Brussels Court of Appeal) delivered a landmark judgment in the case concerning IAB Europe’s Transparency and Consent Framework (TCF).

The case centred on allegations that IAB Europe violated the General Data Protection Regulation (GDPR, or AVG in Dutch) through its data processing practices within the TCF. This judgment follows an earlier decision by the Belgian Data Protection Authority (APD), which found several breaches of the GDPR and imposed a €250,000 fine on IAB Europe.

CASE BACKGROUND

IAB Europe is an international non-profit association that aims to support compliance in the digital advertising and marketing sector. It developed the TCF to promote adherence to the GDPR when websites or applications use the OpenRTB protocol.

On 2 February 2022, the APD found that IAB Europe’s TCF violated GDPR and fined IAB €250,000. Key findings included:

  • The TC String (user preferences signal) is personal data.
  • IAB Europe is a joint controller for both the creation and subsequent processing of the TC String.
  • Lack of a valid legal basis for processing TC Strings as the TCF did not obtain explicit and informed consent from users, nor could it rely on legitimate interests due to the large-scale and intrusive nature of the data processing involved.
  • Failure to fulfil transparency obligations, including not adequately informing users about its role as a data controller, the purposes of data processing, or the recipients of their data.
  • Inadequate security measures and lack of mechanisms to prevent manipulation of consent signals.
  • Failure to conduct data protection impact assessments.
  • Failure to appoint a data protection officer. 
  • Incomplete register of processing activities.

On 4 March 2022, IAB Europe challenged the APD’s decision before the Belgian Market Court, disputing its designation as a joint controller and the APD’s analysis that the TC String is personal data.

On 7 September 2022, the Belgian Market Court made an interim ruling, confirming the procedural irregularities in the APD’s investigation. It referred two preliminary questions to the CJEU:

  • Does the TC String constitute personal data under GDPR?
  • Is IAB Europe a joint controller for processing TC Strings and subsequent data uses?

On 7 March 2024, the CJEU judgment confirmed that:

  • the TC String may constitute personal data if:
    1. It is associated with other data points (e.g., IP address) that can identify a user.
    2. IAB Europe has reasonable means to access such data.
  • IAB Europe may be a joint controller for the creation and use of TC Strings if it influences the processing’s purposes and means.
  • IAB Europe is not a joint controller for subsequent processing (e.g. personalised advertising) by third parties.

The case was sent back to the Belgian Market Court for factual verification and further examination, which this article explains.

FINDINGS OF THE MARKET COURT

Are TC Strings Personal Data?

TC Strings are unique codes containing users’ consent preferences.

The Market Court referenced the preliminary ruling of the CJEU in March 2024, which clarified that TC Strings, when linked to identifiers such as IP addresses, allow for user identification.

In paragraph 48 of the judgment, the Market Court stated that “the fact that IAB Europe itself would not have the reasonable means to proceed with Identification because it cannot make the link between a TC String and the IP address and would not have direct access to the personal data, is in itself irrelevant”.

As such, the Market Court confirmed that a TC String is personal data within the meaning of Article 4(1) of the GDPR.
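For readers unfamiliar with the mechanics, the Python sketch below shows, in deliberately simplified form, how a consent signal of this kind can be packed into a compact string: per-purpose choices become a bitfield that is base64url-encoded. This is not the actual TCF encoding (the real TC String format is far richer) and the purpose names are hypothetical, but it illustrates why such a string, once combined with an identifier such as an IP address, can relate to an identifiable individual.

```python
import base64

# Hypothetical, simplified purpose list; the real TCF defines its own numbered purposes.
PURPOSES = ["store_access_info", "basic_ads", "personalised_ads_profile", "measure_ad_performance"]


def encode_preferences(consents: dict[str, bool]) -> str:
    """Pack per-purpose consent flags into a bitfield and base64url-encode it."""
    bits = 0
    for i, purpose in enumerate(PURPOSES):
        if consents.get(purpose, False):
            bits |= 1 << i
    raw = bits.to_bytes((len(PURPOSES) + 7) // 8, "big")
    return base64.urlsafe_b64encode(raw).decode().rstrip("=")


def decode_preferences(token: str) -> dict[str, bool]:
    """Recover the per-purpose consent flags from the encoded string."""
    padded = token + "=" * (-len(token) % 4)
    bits = int.from_bytes(base64.urlsafe_b64decode(padded), "big")
    return {p: bool((bits >> i) & 1) for i, p in enumerate(PURPOSES)}


# The string alone says nothing about identity; it becomes personal data when it is
# stored or shared alongside identifiers such as an IP address or a device ID.
token = encode_preferences({"store_access_info": True, "basic_ads": False})
print(token, decode_preferences(token))
```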

Is there any processing of personal data?

IAB Europe, as the managing organisation and central figure in the digital ecosystem, determines the storage and dissemination of the TC String.

Under the TCF Technical Specifications, the TC String is shared with Consent Management Platforms (CMPs) in two ways:

  • By storing it in a shared global consent cookie on IAB Europe’s consensu.org domain; or
  • By storing it in a CMP-chosen system for service-specific consent signals.

The Market Court found that storing the TC String in a shared cookie and making it available via the consensu.org domain clearly constitutes processing of personal data under GDPR.

The Market Court further explained that, regardless of the consent cookie or domain, processing of personal data occurs in the TCF, including:

  • User preferences being collected by CMPs (along with the user’s IP address);
  • User preferences being structured and ordered in a TC String; and
  • The TC String being stored, distributed, and shared with TCF participants.

Should IAB Europe’s role in the TCF be considered that of a Data Controller?

Paragraphs 62-75 of the judgment confirm that IAB Europe has real decision-making power over both the purposes and the means of processing, given its overriding control over the operation of the TCF:

  • IAB Europe acknowledges its responsibility for the TCF in its own documentation – such as the “Frequently Asked Questions” on the TCF (version 2.0). Note that this judgment focuses only on v2.0, as IAB Europe’s TCF v2.2 already includes updates intended to address the compliance concerns raised.
  • In determining the purposes and means of these processing operations, IAB Europe exercises a decisive influence. IAB Europe shares a purpose with the other participants in the processing of personal data: to ensure that user preferences are captured in a structured way and then shared with all other participants. Even though many TCF participants may be competitors, when it comes to the processing of user preferences under the TCF they all have similar interests, which are also similar to those of IAB Europe.

The Market Court states that “the concept of a data controller in this case just does have to be interpreted broadly, since IAB Europe is the only one who, as it itself states, manages and administers the TCF and can therefore resolve the issues identified by the Dispute Resolution Chamber, after consultation with all other EU regulators.”

The Market Court confirmed that IAB Europe is a joint data controller with TCF participants for storing the consent preferences of the affected users in the TC String.

If yes, is IAB Europe a Joint Controller for the processing of personal data in the context of OpenRTB?

The Market Court assessed whether IAB Europe, through the TCF, “influences” the further processing of personal data under OpenRTB.

The APD argued that IAB Europe’s TCF and OpenRTB are inherently interconnected. It claimed that IAB Europe facilitates an ecosystem where consent preferences are collected and shared for further processing by third parties (e.g. publishers and adtech vendors). As such, the APD considered IAB Europe and participating organisations to be joint controllers for both the collection and dissemination of consent data.

The Market Court identified inconsistencies in the APD’s reasoning. Although the APD acknowledged that IAB Europe does not act as a data controller for processing under OpenRTB, it nevertheless implied such responsibility in its decision. The Market Court found that the appellants had limited the scope of their arguments to the TCF, that no evidence was provided to establish IAB Europe as a joint controller for OpenRTB processing, and that IAB Europe lacked influence over this stage of data use.

It concluded that the APD failed to demonstrate that IAB Europe acts as a joint data controller for processing operations under OpenRTB, as not all processing stages fall under its control.

OUTCOME

The Market Court upheld the €250,000 fine imposed by the APD, deeming it proportionate and justified under Article 83 of the GDPR. It also confirmed the corrective measures requiring IAB Europe to bring its processing activities into compliance.

The Market Court dismissed most of IAB Europe’s grievances but acknowledged procedural flaws in the initial decision. It upheld the APD’s sanctions regarding TCF operations but clarified that IAB Europe is not responsible for OpenRTB operations – annulling the APD’s decision in part.

IAB Europe is ordered to pay the costs of proceedings, estimated at €7,848.84, and other contributions totalling €424.

IMPLICATIONS

This Judgment clarifies that even entities without direct access to personal data can be held accountable as data controllers if they influence the purposes and means of processing.

For the adtech industry, this ruling reinforces the GDPR principles and in particular supports the requirements to:

  • carefully examine consent mechanisms to ensure they are transparent, freely given, specific, informed and unambiguous;
  • ensure the use of consent frameworks like the TCF does not create ambiguity about their own roles and accountability in data processing operations;
  • provide users with clear, accessible, and understandable information about how their data is processed; and
  • minimise the processing of personal data by leveraging contextual advertising, privacy-enhancing technologies, and aggregated or pseudonymised datasets instead of third party cookies.

Less healthy food advertising restrictions pushed to early 2026

Today, the Government announced that it intends to delay the effective date of the less healthy food regulations. The regulations, which ban TV ads for less healthy food or drink from being shown before 9pm and ban online ads for these products, were due to come into force on 1 October this year.

Following heavy lobbying from the industry around the implications of ‘brand advertising’ (i.e. advertising a brand/company name even if unhealthy products were not shown), the Government has announced that it intends to make and lay a Statutory Instrument (SI) to explicitly exempt ‘brand advertising’ from the restrictions. To allow time to consult on the draft SI, the formal date that these new restrictions come into force has been extended from 1 October 2025 to 5 January 2026.

However, as per a voluntary agreement with the Government, advertisers and broadcasters have made a public commitment to comply with the restrictions as though they were still coming into force on 1 October 2025. This means that, from 1 October 2025, the Government has said it would expect adverts for specific identifiable less healthy products not to be shown on TV between 5:30am and 9pm, or at any time online.

This is a positive development for advertisers, particularly those that largely or wholly advertise products falling within the ‘less healthy product’ category, who (it is expected) will be able to continue to advertise their branding, without showing such less healthy products, at any time. This is of course subject to how the Government defines ‘brand advertising’, which we expect to be clarified before the restrictions come into force on 5 January 2026 (subject to Parliamentary approval). In the meantime, advertisers will still be expected to comply with the general restriction and, as of 1 October 2025, no longer advertise identifiable less healthy products on TV between 5:30am and 9pm or at any time online.

Changes to consumer laws and B2C engagement take effect

The Digital Markets, Competition and Consumers Act 2024 (DMCCA) came into force on 1 January 2025 and is now in effect, bringing with it the most significant changes to consumer law since the Consumer Rights Act 2015.

Snapshot of the DMCCA

Outright ban on “unfair commercial practices”. The DMCCA overhauls existing consumer protections under the Consumer Protection from Unfair Trading Regulations and introduces several new provisions aimed at enhancing consumer rights and processes. This includes the outright banning of certain unfair commercial practices, such as drip pricing and fake or concealed incentivised consumer reviews.

Changes to subscription rules. The DMCCA also tightens the rules around B2C subscription contracts, adding new requirements for subscription services to comply with; however, these changes are not expected to come into force until spring 2026.

A strengthened role for the CMA. The CMA will now be able to directly investigate suspected infringements and issue enforcement notices without the need for lengthy court proceedings. The DMCCA also gives the CMA the ability to impose penalties of up to 10% of global turnover. This is a significant shift from the previous regime, which largely required court involvement for enforcement action.

Phased implementation

The first set of changes relating to consumer law is now in effect, and the CMA has published guidance on the unfair commercial practices that are banned by the DMCCA and subject to enforcement action. It is worth noting that many of these “unfair commercial practices” are not new in principle; the main difference now is that the CMA has the ability to investigate and impose penalties for breaches of these rules. The CMA has also published guidance on how it will enforce the DMCCA. For the first 12 months, the CMA will target behaviours that are particularly harmful to consumers, such as aggressive sales practices that prey on consumers in vulnerable positions, fees that are hidden until late in the buying process, information given to consumers that is objectively false, unfair and unbalanced contract terms, and fake reviews.

What can we expect next?

The CMA will likely start the first wave of its investigations and enforcement by focusing on the “most egregious” breaches of the DMCCA. The CMA has indicated that it will consult further on drip pricing this year, including in relation to fixed-term period contracts. We expect this further guidance on drip pricing to be published this autumn. Look out for our further articles on the impacts of the DMCCA on influencer marketing, prize draws and competitions, and subscription services.