Data protection update

This update covers key developments, including the ICO-HMG memorandum of understanding on data protection, new provisions under the Data (Use and Access) Act, and guidance on international data transfers and age assurance. It also reports significant enforcement action, including fines for unsolicited marketing, misuse of biometric data, and breaches involving children’s data, alongside global concerns over AI and high-profile investigations.

General updates

  • On 8 January, the Information Commissioner’s Office (ICO) and His Majesty’s Government (HMG) signed a Memorandum of Understanding (MOU) to formalise their shared commitment to improving data protection standards. The MOU provides for the appointment of a Government Chief Data Officer to oversee data protection risks and compliance across HMG departments, with key governance boards, such as the Transformation Board and Government Security Board, monitoring data protection risks and progress.
  • On 3 February, the ICO opened formal investigations into X Internet Unlimited Company (XIUC) and X.AI LLC (X.AI) covering their processing of personal data in relation to the Grok artificial intelligence system and its potential to produce harmful sexualised image and video content.
  • On 5 February, most of the remaining data protection provisions of the Data (Use and Access) Act came into force. The exceptions are the requirement for organisations to have a complaints procedure, which is due to commence on 19 June 2026, and some ICO governance provisions, which will follow at a later date. The provisions now in force include the right to carry out only a “reasonable and proportionate” search in response to data subject access requests. In addition, the maximum fine under the Privacy and Electronic Communications Regulations is no longer £500,000 but now matches the UK GDPR maximum of up to £17.5 million or 4% of global turnover (whichever is greater).
  • On 23 February, privacy regulators from around the world issued a joint statement addressing mounting concerns over artificial intelligence (AI) systems that create realistic images and videos of identifiable individuals without their consent.
  • On 11 February 2026, the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) issued a joint opinion on the European Commission’s Digital Omnibus Regulation proposal, which seeks to streamline digital regulations, reduce administrative burdens, and enhance competitiveness across the EU. The EDPB and EDPS strongly oppose proposed changes to the definition of personal data, warning that they could narrow its scope, weaken privacy protections, and create legal uncertainty. The regulators nevertheless back several provisions aimed at reducing administrative burdens, including raising thresholds for mandatory data breach notifications and extending deadlines for reporting.
  • On 25 April, John Edwards, the UK’s Information Commissioner, announced that he has temporarily stepped back from his role while the ICO conducts an independent investigation into unspecified “HR matters.” Edwards, who has held the position since January 2022, confirmed his cooperation with the inquiry in a LinkedIn post.

Latest guidance

  • On 15 January 2026, the Information Commissioner’s Office released updated guidance on international transfers of personal data under the UK GDPR. Key updates include a three-step test for restricted transfers and explanations of roles and responsibilities, particularly for complex, multi-layered transfer scenarios.
  • On 12 March 2026, the Information Commissioner’s Office published an open letter to social media and video-sharing platforms operating in the UK, calling on them to urgently strengthen their age assurance measures.
  • On 25 March 2026, Ofcom and the Information Commissioner’s Office released a joint statement outlining regulatory expectations for age assurance measures under the Online Safety Act and UK data protection laws. The statement aims to help online services protect children from harmful content and data risks while ensuring compliance with both legal frameworks.
  • On 31 March, the ICO called on businesses to review their use of automated decision-making in recruitment to ensure compliance with data protection laws and to protect jobseekers from unfair or biased outcomes.
  • On 29 April 2026, the Information Commissioner’s Office (ICO) released its finalised guidance on Storage and Access Technologies alongside an update on its online tracking strategy. This guidance addresses the application of the Privacy and Electronic Communications Regulations and, where relevant, the UK GDPR to technologies such as cookies, tracking pixels, device fingerprinting, and similar tools. It incorporates updates following two consultations and amendments introduced by the Data (Use and Access) Act 2025.
  • On 14 April 2026, the European Data Protection Board announced a new Data Protection Impact Assessment template to simplify compliance with the General Data Protection Regulation and promote consistency across Europe.

Latest enforcement action

  • On 15 January, the Information Commissioner’s Office fined Allay Claims Ltd £120,000 for sending over 4 million unsolicited marketing SMS messages between February 2023 and February 2024. These messages promoted PPI tax refund services and were sent without valid consent or compliance with the ‘soft opt-in’ exemption. Allay argued that recipients were existing customers who had engaged with the company in 2019 and signed terms of engagement, which it believed satisfied the ‘soft opt-in’ exemption. However, aggravating circumstances included the fact that Allay had previously been investigated by the ICO in 2020 for PECR breaches and, despite that investigation and the complaints, failed to suspend its marketing activities, resulting in further complaints. Also relevant was the distress caused to recipients, as unsolicited marketing is intrusive and can lead to financial harm, particularly in the context of PPI tax refund services, which often involve high fees and hidden charges.
  • On 2 January, The President of the Personal Data Protection Office (Poland’s data protection authority) imposed a fine of PLN 978,128 (approximately €232,379) on T. S.A. for the failure to ensure the independence of the Data Protection Officer (DPO) and the absence of measures to prevent conflicts of interest in the DPO’s role. The DPO of T. S.A. simultaneously held a managerial role (Director V.) and other positions within the company. The company’s history of GDPR violations was considered an aggravating factor, as it demonstrated ongoing compliance challenges. The company resolved the identified issues by restructuring the DPO’s role before the administrative proceedings concluded. This led to a 40% reduction in the fine.
  • On 29 January, the Italian Data Protection Authority (GPDP) fined e-Campus Online University €50,000 for unlawfully using facial recognition technology to verify student attendance during a teacher qualification course. The university processed biometric data without a valid legal basis, relying on invalid consent while failing to conduct a proper Data Protection Impact Assessment (DPIA) before implementation. The GPDP highlighted several violations of GDPR, including unnecessary data retention, lack of alternatives for students, and the power imbalance inherent in requiring biometric data for course participation. While the university cooperated with the investigation and ceased using the system, the fine reflected the serious nature of processing sensitive biometric data and the large number of students affected.
  • On 13 February, the ICO and Ofcom responded to an open letter from approximately 20 MPs urging the ICO to investigate Tattle Life for potential breaches of data protection laws after the death of a social media influencer’s 16-year-old daughter.
  • The ICO confirmed it has an ongoing investigation into Tattle Life, examining its compliance with data protection laws. These include obligations to process personal data lawfully, transparently, and fairly, and to address user requests for data rectification or erasure. While the ICO does not have the authority to shut down websites, it can issue enforcement notices to ensure compliance if data protection violations are identified.
  • On 19 February, the ICO won its appeal in a landmark case against DSG Retail Limited. The dispute originated from a 2020 ICO fine of £500,000 imposed on DSG after a cyber-attack compromised the personal data of at least 14 million individuals. Despite appeals by DSG to the First-tier Tribunal and Upper Tribunal, the ICO sought further clarification on a critical point of data protection law by appealing to the Court of Appeal in 2024. The court clarified that the duty to protect personal data applies even if the stolen data cannot directly identify individuals, recognising the broader harm caused by cyber-attacks.
  • On 3 February, the ICO reprimanded Staines Health Group for sending excessive medical details about a terminally ill patient to their insurance company, Vitality. A patient at the NHS GP surgery was diagnosed with a terminal illness and made a claim to their insurer. The insurer, on behalf of the patient, subsequently requested that five years of medical history be sent to the patient to review, before being sent to the insurer in order to progress the claim. However, instead of five years of medical history being sent to the patient, Staines Health Group sent 23 years of medical records directly to the insurer. The patient believed the excessive disclosure of unnecessary medical records led to a reduction in the payout of their claim.
  • On 3 February, the ICO issued a monetary penalty of £100,000 to TMAC Ltd for making calls promoting alarm systems and monitoring services to individuals registered with the Telephone Preference Service.
  • On 4 February, the ICO issued a Penalty Notice to MediaLab.AI, Inc. fining it £247,590 for UK GDPR breaches relating to children’s data and the absence of a DPIA. The ICO found unlawful processing of under-13s’ data without valid parental consent and a failure to complete a DPIA for high-risk processing affecting under-18s between 27 September 2021 and 30 September 2025.
  • On 23 February 2026, the ICO issued a Penalty Notice to Reddit, Inc of £14,472,500 for UK GDPR breaches involving children’s personal data and failure to complete a DPIA.

The AI-enabled threat landscape: real world lessons from lawyers, PR and cybersecurity experts

In collaboration with Sodali & Co and LevelBlue, we have produced a new report offering vital insights into AI-driven cybercrime. Designed for non-technical executives and board members, it highlights key threats, practical talking points, and actionable steps to support discussions with risk, legal, and cyber security teams.

AI is transforming the cyber threat landscape, enabling faster, cheaper and more personalised attacks while lowering the entry barrier for malicious actors. These risks pose significant financial, operational and reputational challenges for businesses.

EU AI Act Transparency Obligations: latest developments and key obligations

A core requirement imposed by the EU AI Act (the Act) concerns transparency obligations for the AI systems used.

The majority of the Act is expected to come into force on 2 August 2026. The European Parliament, however, has agreed a proposal that would delay the obligations imposed in respect of high-risk AI systems. The remaining provisions of the Act remain largely unaffected, and businesses should operate on that basis, noting that breaching these obligations can result in a fine of up to EUR 15 million or 3% of total worldwide annual turnover for the preceding financial year (whichever is higher).
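The “whichever is higher” cap works out as a simple maximum of the two figures. A minimal, purely illustrative sketch (the function name and example turnover figures are our own, not from the Act):

```python
def max_transparency_fine(worldwide_annual_turnover_eur: float) -> float:
    """Illustrative cap for transparency breaches under the Act:
    the higher of EUR 15 million or 3% of total worldwide annual
    turnover for the preceding financial year."""
    return max(15_000_000.0, 0.03 * worldwide_annual_turnover_eur)

# A company with EUR 2 billion turnover: 3% = EUR 60m, so the cap is EUR 60m.
print(max_transparency_fine(2_000_000_000))  # 60000000.0
# A company with EUR 100 million turnover: 3% = EUR 3m, so the flat EUR 15m applies.
print(max_transparency_fine(100_000_000))    # 15000000.0
```

The same shape of calculation applies to the other turnover-based caps mentioned in this update, with the flat amount and percentage substituted accordingly.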

The Act raised a number of questions around how companies would comply with their transparency obligations. This led to the creation of a draft code of practice (the “Code of Practice on Marking and Labelling of AI-generated content” (the Code)), integrating feedback from hundreds of participants and observers including industry, academia and other stakeholders.

The Code of Practice on marking and labelling of AI-generated content

The second draft of the Code was published on 3 March 2026 and a final version is expected by June 2026. The Code is subject to further amendments, but sets out four key requirements to demonstrate compliance:

  1. multi-layered marking through metadata embedding, imperceptible watermarking, or fingerprinting/logging;
  2. providers having to offer a free interface or publicly available tool enabling users and third parties to verify whether content is AI-generated;
  3. technical solutions for marking and detection must be effective and reliable; and
  4. continuous testing and improvement to keep pace with real-world developments.
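The marking and verification requirements above can be illustrated with a short sketch. This is not a real compliance tool: the field names and the sidecar-record approach are our own assumptions, standing in for the embedded metadata or imperceptible watermarking that the draft Code contemplates:

```python
import hashlib

AI_MARK_KEY = "ai_generated"  # hypothetical field name, not from the Act or the Code

def mark_output(content: bytes, model_name: str) -> dict:
    """Attach a machine-readable provenance record to generated content.
    A real implementation would embed this in the file's metadata or as a
    watermark; a detached record is used here only to show the idea."""
    return {
        AI_MARK_KEY: True,
        "model": model_name,
        "sha256": hashlib.sha256(content).hexdigest(),  # ties the record to the content
    }

def is_marked_ai_generated(record: dict, content: bytes) -> bool:
    """The kind of check a free, publicly available verification tool
    (requirement 2 above) might perform."""
    return (record.get(AI_MARK_KEY) is True
            and record.get("sha256") == hashlib.sha256(content).hexdigest())

record = mark_output(b"generated image bytes", model_name="example-model")
print(is_marked_ai_generated(record, b"generated image bytes"))  # True
print(is_marked_ai_generated(record, b"tampered bytes"))         # False
```

Hashing the content into the record means the mark cannot simply be copied onto unrelated material, which gestures at the “effective and reliable” requirement in point 3.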

The transparency obligations

The Code is underpinned by the underlying transparency obligations in the Act.

The extent of these obligations is influenced by different factors, such as whether the AI system is classified as limited or high risk, and whether you are a provider or a deployer.

For limited risk AI systems:

If you are a provider

A ‘provider’ is a company, individual, public authority, agency or body that: (a) develops, or procures the development of, an AI system or general-purpose AI model; and (b) places it on the market or puts it into service under its own name or trademark. In other words, this applies to those who set out to create, or procure the creation of, an AI system.

Providers of limited risk AI systems must comply with three core transparency requirements.

  1. AI systems must be designed to inform individuals that they are engaging with an AI system;
  2. Providers must ensure that outputs are marked in a machine-readable format and are detectable as artificially generated or manipulated; and
  3. Technical solutions employed must be effective, interoperable, robust and reliable.

The question of how providers can satisfy these requirements has been a recurring area of discussion, such that the European Commission has stepped in to provide guidance via the voluntary code of practice on the transparency of AI-generated content. We discuss this in further detail below.

If you are a deployer

In contrast, a ‘deployer’ is a company, individual, public authority, agency or body using an AI system under its authority, except where the AI system is used in a personal non-professional activity.

Given that deployers are effectively users with little to no control over the AI system, they are subject to far fewer disclosure requirements. The Act only imposes obligations on deployers of three specific types of AI systems:

  1. emotion recognition or biometric categorisation systems;
  2. deepfakes, where the system generates or manipulates image, audio or video content; or
  3. systems generating or manipulating text published to inform the public on matters of public interest.

For high risk AI systems:

If you are a provider

Unsurprisingly, the Act imposes the most obligations for this category. In general, it includes requirements for providers to supply instructions for safe use and information about accuracy, robustness, and cybersecurity. Individuals overseeing such systems must be suitably qualified to understand the system’s capacities and limitations, and providers must maintain various recordkeeping and risk management protocols.

If you are a deployer

As with limited risk systems, deployers face fewer obligations than providers, but a broader set than applies to limited risk systems, reflecting the higher level of risk involved. These include specific governance, monitoring, transparency and impact assessment requirements. The key obligations can be grouped under two headings:

Operational obligations

The deployer must implement appropriate measures to ensure that the high-risk AI system is used in accordance with the relevant instructions for use and that input data is relevant and sufficiently representative for the intended purpose of the system, and must monitor the system’s operation so that it can inform the provider if it identifies any risks or serious incidents.

Control and risk management obligations

A deployer must conduct a fundamental rights impact assessment (FRIA) before deploying the system, assign human oversight to individuals with the necessary competence, train and regularly monitor the AI system for risks, and keep the logs of the AI system in an automatic and documented manner for at least six months.

Future outlook

The trajectory is unmistakable: the Act positions transparency as a core principle, which will shape design choices, user interfaces and governance processes. Organisations will be expected to comply with both the Code and the transparency obligations that underpin it.

Companies leveraging AI along their supply chain should therefore prioritise embedding and documenting transparency measures that can withstand both regulatory and legal scrutiny, while ensuring alignment with wider IP governance and strategic commercial decisions.

For more information on the EU AI Act and the Code and how they might impact your business, contact Sacha Wilson and Jacky Lai.

Unfair contract terms in consumer contracts: new draft guidance from the CMA

If you deal with consumers, then you need to know how consumer law applies to your contract terms and notices.

Ten years on from the introduction of the Consumer Rights Act 2015 (the CRA), the Competition and Markets Authority (the CMA) is revising its current guidance on unfair contract terms.

The draft is aimed at making the guidance more accessible, helping businesses better understand and comply with the CRA. The consultation closed on 19 March 2026. Once finalised, the new guidance will replace the existing guidance on unfair contract terms.

Which terms are unfair?

Contract terms are unfair if they tilt the rights and responsibilities excessively in favour of the supplier. The law currently uses a ‘fairness test’ by looking at the words in the contract, taking into consideration what is being sold, how a term relates to other terms in the contract, and all the circumstances at the time the term was agreed.

Certain terms and notices giving rise to particular concerns are ‘blacklisted’ and deemed unsuitable for use with consumers. These include terms that exclude or restrict liability for death or personal injury resulting from negligence, or that exclude or restrict a consumer’s statutory rights and any associated remedies. Blacklisted terms are never enforceable against a consumer.

What are the key changes in the draft guidance?

Enhanced CMA enforcement powers under the DMCC:

The updated guidance integrates the Digital Markets, Competition and Consumers Act 2024 (the DMCC), enabling the CMA to impose penalties, without going to court, on businesses that use prohibited, non-transparent or unfair terms or notices. Fines may be up to 10% of a company’s global turnover or £300,000 (whichever is higher).

Transparency – more than words:

Transparency now covers not just the content of terms but also their presentation, requiring clear fonts and headings that follow a logical structure, supported by explanations of terms which may be complex or challenging to understand.

Fairness and consumer behaviour:

The requirement of ‘good faith’ should include a behavioural dimension. Suppliers must consider consumer psychology and avoid exploiting consumer biases, for instance consumers’ tendency not to read standard terms thoroughly, or to underestimate future costs such as renewal or termination fees. Campaigns emphasising quick benefits, such as a free trial, while using tactics to minimise attention to future costs will face greater scrutiny. Automatic renewal of subscriptions is also specifically noted as an area of concern, with the DMCC’s new subscription provisions (to enter into force no later than August 2026) adding further obligations.

The role of advertising:

Advertising is explicitly incorporated into the fairness assessment, requiring consistency between terms and marketing claims. Small print which removes or curtails more prominent claims, failure to highlight key terms during the marketing process, or inconsistency between marketing claims and the contract terms could give rise to an unfair commercial practice. Statements made by a supplier that a consumer is likely to see may also be treated as terms of the contract.

Exclusions and variations to the contract:

Vague language such as “liability is excluded so far as the law permits” will not remedy an unfair clause; and terms allowing a supplier to vary terms such as changing the description or price of the services or goods may now be deemed unfair should they be overly wide in scope or result in changes that may be unexpected to the customer.

What are the key takeaways for consumer businesses?

The draft guidance makes clear that unfair, onerous or significantly unbalanced terms will be closely scrutinised. Suppliers should ensure that lines of communication with customers are clear, transparent and easy to understand.

Contract terms should similarly be reviewed to make sure that they strike a reasonable balance without prejudicing consumers by including reasonable protections around cancellation or refund rights.

For more information on how the new guidance will impact your consumer contracts, contact Sacha Wilson and Jacky Lai.

Government IT contracts: how to challenge the procurement process

If your business enters into contracts with public sector entities for the provision of IT or related services, you will be familiar with the public sector tender and procurement processes. But are you familiar with what can be done to challenge the outcome of those processes?

Whether it is an issue with the application of the scoring criteria, or how the process has been conducted, your business may have the ability to challenge contract awards.

However, in order to do so effectively, your business will need to move quickly and ensure that it deploys the various legal tools available to it strategically.

What is the relevant legislation?

In 2025, the Procurement Act 2023 (the Act) came into force. This represented the most significant development to UK public procurement laws for over 30 years, replacing the well-established EU-founded regime under the Public Contracts Regulations 2015 (the PCR).

How long do you have to bring a claim?

The period during which a legal claim can be brought under the Act is very short and remains largely unchanged from the PCR. In summary:

  • If you are a supplier seeking to challenge an award, the period to bring a claim is just 30 days from when you knew, or ought reasonably to have known, of the circumstances giving rise to the claim. However, this may be extended by up to three months where the court considers there is a good reason to do so.
  • If you are a supplier seeking to set aside a contract that has been entered into, the period to bring a claim is 30 days from the date you knew, or ought to have known, of the circumstances giving rise to a claim, subject to a long stop date of six months from the date the contract was entered into.

However, the parties can enter into a standstill agreement which, in effect, extends the limitation period, allowing the parties an opportunity to resolve the dispute.
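The deadlines above amount to simple date arithmetic, and suppliers often diarise them immediately. A rough sketch (illustrative only: the month-counting convention here is a simplification, not the court rules, and the function names are our own):

```python
from datetime import date, timedelta

def claim_deadline(knowledge_date: date) -> date:
    """30 days from when the supplier knew, or ought reasonably to have
    known, of the circumstances giving rise to the claim."""
    return knowledge_date + timedelta(days=30)

def set_aside_long_stop(contract_date: date) -> date:
    """Long stop of six months from contract execution, approximated by
    moving the month forward. Month-end dates (e.g. 31 August) would
    need care; a real diary system would follow the court rules on how
    months are counted."""
    month = contract_date.month + 6
    year = contract_date.year + (month - 1) // 12
    month = (month - 1) % 12 + 1
    return contract_date.replace(year=year, month=month)

print(claim_deadline(date(2026, 3, 1)))       # 2026-03-31
print(set_aside_long_stop(date(2026, 3, 1)))  # 2026-09-01
```

The point of the sketch is how little time 30 days leaves: by the time a supplier has digested the assessment summary, much of the window may already have elapsed.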

Can you prevent the authority from entering into a contract with another supplier whilst you challenge the decision?

Under the previous regime, contracting authorities were required to observe a 10-day waiting period following the issue of a ‘standstill letter’ to all tendering suppliers before entering into a contract with the preferred supplier. Claims issued prior to contract execution would trigger an automatic suspension of the procurement process.

The Act reduces the standstill period from 10 to eight working days, with the period now triggered by the contract award notice instead of the issue of a standstill letter. Claimants are no longer entitled to the benefit of automatic suspension up until the date of contract execution. This is a significant shift from the previous position and impacts upon strategic considerations.
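The eight-working-day standstill can likewise be diarised. A rough sketch, assuming weekends are the only non-working days (it ignores bank holidays and the precise statutory counting rules, so treat the result as indicative only):

```python
from datetime import date, timedelta

def standstill_end(award_notice_date: date, working_days: int = 8) -> date:
    """Counts forward the given number of working days from the contract
    award notice, skipping Saturdays and Sundays. Illustrative only."""
    current = award_notice_date
    remaining = working_days
    while remaining > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 .. Friday=4
            remaining -= 1
    return current

# A notice on Monday 2 March 2026 gives a standstill ending Thursday 12 March 2026.
print(standstill_end(date(2026, 3, 2)))  # 2026-03-12
```

With the suspension no longer automatic up to contract execution, any challenge effectively has to be ready within this window.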

What information do you have about the decision-making process?

There are various ways to find out more about the decision-making process. In particular, contracting authorities must publish a Contract Award Notice on a central digital platform and provide an assessment summary to each supplier that submitted an assessed tender.

The assessment summary must include: (a) the scores awarded for each criterion; (b) an explanation of those scores; and (c) in respect of unsuccessful suppliers, the reasons why the contract was not awarded to them, together with the corresponding information at (a) and (b) for the successful tender.

The enhanced disclosure requirements are a positive development for suppliers looking for substantive grounds on which to base a potential challenge.

What remedies can you obtain when challenging an award?

In many cases, compromise solutions are reached with the relevant authority without a claim needing to be issued. However, if you do pursue a claim, the remedies available remain mostly unchanged from the previous regime. There are two main categories:

Pre-contractual remedies:

Where a contract has been awarded but not yet executed, a successful challenge may result in the court granting one of the following orders:

  • an order setting aside the relevant decision or action (including the decision to award the contract);
  • an order requiring the contracting authority to take specified action (such as reconsidering a decision previously made);
  • an order for damages (which may be granted in addition to any other order, and has historically encompassed lost profits arising from the breach and/or wasted bid costs); or
  • such other order as the court considers appropriate.

Post-contractual remedies:

Where the awarded contract has been executed, the available remedies are limited to damages and/or an order setting aside the contract (subject to certain conditions in the Act).

What does this mean for suppliers?

If you are concerned about a procurement decision, then given the short timeframes for challenge, it is critical to seek legal advice at the earliest possible opportunity to allow your advisors time to evaluate the claim and devise and deploy the optimum strategy.

The Act’s emphasis on transparency and a level playing field, together with the new obligations it places on contracting authorities, expands the scope for potential challenges.

You will, however, need to navigate the reduced standstill period, which now runs for eight working days from the contract award notice, and the fact that automatic suspension is no longer available up until the date of contract execution.

If you would like to find out more about how to make procurement challenges, contact Lizzie Williams and Jacky Lai.

Government consultation: the reshaping of sports sponsorships?

The Department for Culture, Media and Sport has this week announced a plan to consult on a ban on unlicensed gambling operators sponsoring British sports teams. This will form part of the government’s consultation on sports sponsorship, to be launched in the spring.

This could bring about an intriguing change in sports sponsorship, particularly in respect of the sponsorship of Premier League football clubs, several of which have unlicensed gambling operator brands on the front of their shirts. While Premier League members have voluntarily committed to removing all gambling branding from the front of shirts by the end of the current season, there was an assumption that those brands would move to shirt sleeves and other club inventory.

With a political wind behind the announced consultation to tackle the illegal gambling market, it seems more of a case of ‘when’ a ban on unlicensed gambling operators will come into force, rather than ‘if’.

This may create greater opportunities for other brands and sectors to increase their presence in football, both on front of shirts and across wider club inventory freed up by a departure of unlicensed gambling operators.

It could see a return of more alcohol brands (possibly promoting low or non-alcoholic products) to the Premier League – Guinness returned to the front of a football shirt for the first time since 1986 as part of its sponsorship of WSL2 club Bristol City Women using its Guinness 0.0 brand.

A local focus could also become more prevalent drawing on the historical and geographical connections between club and local sponsors – P&O Cruises landed on the front of shirt for Southampton last year in the Premier League.

Alternatively, there may be an opportunity for both established and challenger brands who have not previously partnered with football clubs to enter the market. This may be at a reduced price compared to current levels given the potential amount of inventory that could be available.

In any case, front of football shirts might look a little different in the not too distant future.

Managing risks and opportunities with AI

In a GC100 poll of 106 companies in September 2024, 8% of respondents reported they already regularly used Co-Pilot and Teams Premium for transcription of initial draft minutes.

Since then, there has been an influx of providers in the market that can prepare agendas, summarise discussions, and draft lists of action points. Before employing such AI tools in your company, it is essential to consider whether the use of AI is appropriate, and, if so, whether all the necessary risk-mitigation steps have been taken.

What are the key risks?

  • AI tools lack the ability to differentiate between different types of contexts and tones, cannot exercise discretion, and may fail to properly identify commercial nuances; their presence may also stifle candid discussion.
  • Confidential and sensitive discussions may become disclosable in future litigation or regulatory contexts, as part of subject access requests, or during due diligence processes. AI tools may not be able to identify legally privileged information, which could lead to an inadvertent loss of privilege.
  • Without careful review, there is a risk AI would not deliver a formal record of decisions made that is accurate and impartial.
  • When using AI tools that rely on third-party providers, organisations face the risk of data breaches and confidential information leaks.

How can you mitigate these risks?

  • The most effective risk-mitigation measure is to ensure human review by an employee of an appropriate level. This ensures there is a clear, accurate, and concise outcome with careful consideration being given for any inclusion of commercially sensitive or legally privileged information.
  • Companies should conduct thorough due diligence on any AI providers, including understanding the extent to which the provider uses data inputted by users for AI development or training activities and any processes that can lead to a leak of confidential company information.
  • Companies should also ensure suppliers are compliant with all relevant data protection regulations and have adequate security systems in place to ensure their AI tools do not increase the company’s exposure to cyber-attacks/data breaches/leaks.
  • Organisations should establish robust consent and communication processes regarding the use of AI tools internally and externally. Companies should think carefully about having AI tools as a default setting and providing the opportunity/awareness around opting out of such use.

Key takeaway

Consider carefully whether the benefits of using AI tools outweigh the significant risks. If AI tools are used, human oversight, comprehensive diligence on AI providers, and clear guidance are essential to protect organisations.

Data protection update

The Data (Use and Access) Act 2025 (DUA Act) becomes UK law

The DUA Act, which received Royal Assent on 19 June 2025, reforms UK data protection laws and will be implemented in phases. Key changes include:

  • A new “recognised legitimate interests” lawful basis, exempting certain data uses (e.g., crime prevention and protecting vulnerable individuals) from a balancing test.
  • Exemptions from the requirement to obtain user consent for non-strictly necessary cookies deployed for low-risk purposes, such as fraud prevention.
  • Complaints must now be raised with data controllers first before escalating to the ICO, so policies and procedures should be updated to include a complaints and escalation mechanism for data protection issues.

UK’s adequate data protection law status likely to be extended to December 2031

UK GDPR, a UK-specific version of the EU GDPR, was deemed “adequate” by the EU, allowing the free flow of data between the UK and the EU (post-Brexit). The adequacy decision, initially due to expire in June 2025, was extended to December 2025 following the DUA Act, which made changes to data protection rules in the UK. The European Commission reviewed the UK’s updated data protection framework and concluded it still meets the “essential equivalence” standard, likely extending adequacy until 27 December 2031, with reviews every four years.

Court of Appeal decision on compensation claims for personal data breaches

On 22 August 2025, the Court of Appeal delivered a significant judgment in Farley and Others v Paymaster (1836) Ltd (trading as Equiniti) [2025] EWCA Civ 1117. The case arose from the misaddressing of annual benefit statements (ABS) for 432 police pension scheme members, which were sent to outdated addresses. Claimants alleged distress and anxiety over the potential misuse of their data. While 14 confirmed their ABS had been accessed by unauthorised third parties, the High Court had ruled that proof of third-party disclosure was necessary.

The Court of Appeal reversed this decision, holding third-party disclosure is not essential for data protection claims. Mishandling personal data itself constitutes an infringement of GDPR rights. Compensation is recoverable for non-material damage, including anxiety, if the fear of misuse is objectively reasonable. Hypothetical or speculative fears cannot be compensated. The case now returns to the High Court to assess the reasonableness of the appellants’ fears and any psychiatric injuries.

UK’s digital ID scheme

The scheme aims to simplify access to government and private services (e.g., welfare, childcare, renting), reduce identity fraud, streamline verification, and toughen employment checks. It is centred on free digital IDs stored securely on phones and protected by biometric security. The data held will include name, date of birth, nationality/residency status, and a photograph, with address potentially added following consultation. Employers will be required to check digital IDs for right-to-work purposes, but the police will not be able to demand to see a digital ID. The UK Government states that the data will be stored on devices with encryption and that credentials can be revoked if a device is lost or stolen. The scheme will be accessible via assistive technologies and physical alternatives, with support for non-smartphone users. A public consultation is planned for later in the year, with rollout expected by the end of the current Parliament.

ICO call for views on regulating online advertising, legitimate interests, data protection complaints and online safety

  • On 7 July 2025, the ICO launched a consultation on its approach to regulating online advertising under the Privacy and Electronic Communications Regulations (PECR). Open until 7 September 2025, it sought views on balancing privacy, innovation, and economic growth. Consent remains mandatory for high-risk practices like behavioural advertising, but the ICO aims to identify low-risk advertising activities (e.g., ad delivery and fraud prevention). It plans to outline non-enforcement areas and safeguards in early 2026.
  • On 30 July 2025, the ICO launched a consultation on guidance for using profiling tools in online safety systems under the Online Safety Act 2023. Open until 31 October 2025, it focuses on lawful, fair, and transparent use of AI tools for detecting harmful behaviours like grooming and fraud. It highlights compliance with UK GDPR, DPA 2018, and PECR, emphasising lawfulness, transparency, data minimisation, and safeguarding children.
  • On 21 August 2025, the ICO launched public consultations on DUA Act amendments:
    • Recognised Legitimate Interest: This lawful basis allows processing personal data for pre-approved public interest purposes like safeguarding and emergencies. Consultation closes 30 October 2025.
    • Complaints: By June 2026, organisations must establish formal processes for complaints regarding personal data handling. Consultation ends 19 October 2025.

ICO enforcement actions

  • AFK Letters Co Ltd: Fined £90,000 on 14 April 2025 for breaching PECR regulation 21 by making 95,277 unsolicited marketing calls in 2023. Issues included inadequate consent documentation and wider non-compliance.
  • 23andMe: Fined £2.31 million on 17 June 2025 for a data breach exposing sensitive genetic and health data of 155,592 UK users. Failures included weak security measures and lack of mandatory multi-factor authentication.
  • Capita: Penalty Notices issued to Capita plc (£8 million) and Capita Pension Solutions Limited (£6 million) for a data breach. A cyberattack in March 2023 exposed the data of over 6.6 million individuals, including sensitive health data and financial details.

Harbottle & Lewis advises Ovation Rights on landmark acquisition of Sir Richard Stilgoe’s theatrical rights catalogue

We have advised Ovation Rights on its acquisition of Sir Richard Stilgoe’s rights in some of the most successful stage musicals of all time, including The Phantom of the Opera and Starlight Express. The transaction represents one of the most significant acquisitions of theatrical rights in recent years, marking a pivotal step in recognising the enduring value of theatrical rights and the legacies behind them.

Founded and led by producer Jamie Hendry and former Amazon executive Philip Green, Ovation Rights introduces the scale and strategy ordinarily seen with music catalogue acquisitions to the creators and rightsholders of major plays and musicals. The company collaborates closely with authors, composers, lyricists and estates, working as custodians to protect legacies, honour artistic visions and passionately champion works.

Our team was led by partners Neil Adleman and Charles Leveque, with support from managing associate Teresa Walker and associates Emma Riggs and David Jones and with partner David Scott advising on corporate tax matters.

On working with Harbottle & Lewis, Jamie Hendry commented: “Working with Harbottle & Lewis has been exceptional throughout this transaction. The team understood the unique nature of theatrical rights acquisitions and provided invaluable guidance as we navigated this deal. Their expertise in entertainment law, combined with their commercial understanding of our vision for Ovation Rights, made them the perfect partners for this acquisition. As we continue to build our portfolio and support artists’ legacies, we know we have trusted advisors who share our commitment to protecting and championing the works that have shaped global theatre.”

Neil Adleman added: “We are delighted to have supported the Ovation Rights team on this groundbreaking acquisition. This transaction demonstrates the significant value and enduring appeal of theatrical rights, and we look forward to seeing Ovation Rights continue to invest in artists and rightsholders as custodians of these remarkable legacies.”

Court of Appeal verdict in exclusion clause dispute

In February 2025 the Court of Appeal (by a 2:1 majority) dismissed an appeal brought by EE against Virgin Mobile in relation to a significant claim arising out of a telecommunications supply agreement.

The Court of Appeal agreed with the first instance decision that the exclusion clause excluded EE’s entire £24.6m loss of profit claim against Virgin Mobile.

EE claimed that it had suffered loss and damage in the amount of £24.6m as a result of Virgin Mobile breaching an exclusivity obligation in the telecommunications supply agreement, because EE had lost the revenue that it would have received from Virgin Mobile under the terms of the agreement had the exclusivity obligation not been breached.

Virgin Mobile denied breaching the agreement as alleged but argued that, in any event, EE’s claim was precluded because it was, in substance, a claim for anticipated profits. It therefore fell within the scope of the exclusion clause in the agreement which provided that “Neither Party shall be liable to the other in respect of … anticipated profits”.

EE argued that this interpretation could not be correct because, amongst other things, on the facts which occurred EE had neither a wasted expenditure claim nor a good argument for an injunction, so excluding the loss of profits claim would leave EE without an effective remedy, creating commercial absurdity and defeating the main purpose of the agreement.

The majority of the Court of Appeal rejected this argument because the specific facts which occurred, where no alternative remedy was viable, were not known to the parties when they entered into the agreement and therefore should not affect its interpretation. It was held that, applying the proper legal principles, the exclusion clause did preclude EE’s entire claim.

However, the Court of Appeal did not reach this conclusion easily, and indeed Phillips LJ dissented, noting that “it would be surprising if the parties intended that [Virgin Media] could breach the key exclusivity provision, unlawfully diverting its customers to a third party supplier, without incurring liability to pay EE damages reflecting the loss of revenue resulting from that breach”.

This case provides a further example of the unpredictability of the interpretation of exclusion clauses and the importance of clear, future-proof, contract drafting.