Data protection update

This update covers key developments including the ICO-HMG memorandum of understanding on data protection, newly commenced provisions of the Data (Use and Access) Act, guidance on international data transfers and age assurance, and significant enforcement actions, such as fines for unsolicited marketing, misuse of biometric data, and breaches involving children’s data, alongside global concerns over AI and several high-profile investigations.

General updates

  • On 8 January, the Information Commissioner’s Office (ICO) and His Majesty’s Government (HMG) signed a Memorandum of Understanding (MOU) formalising their shared commitment to improving data protection standards. The MOU includes appointing a Government Chief Data Officer to oversee data protection risks and compliance across HMG departments, while key governance boards, such as the Transformation Board and Government Security Board, will monitor data protection risks and progress.
  • On 3 February, the ICO opened formal investigations into X Internet Unlimited Company (XIUC) and X.AI LLC (X.AI) covering their processing of personal data in relation to the Grok artificial intelligence system and its potential to produce harmful sexualised image and video content.
  • On 5 February, most of the remaining data protection provisions of the Data (Use and Access) Act came into force, except for the requirement for organisations to have a complaints procedure, which is due to commence on 19 June 2026, and some ICO governance provisions, which will follow at a later date. The provisions now in force include the requirement to carry out only a “reasonable and proportionate” search in response to data subject access requests, and an increase in the maximum fine under the Privacy and Electronic Communications Regulations from £500,000 to match the UK GDPR cap of up to £17.5 million or 4% of global turnover (whichever is the greater).
  • On 23 February, privacy regulators from around the world issued a joint statement addressing mounting concerns over artificial intelligence (AI) systems that create realistic images and videos of identifiable individuals without their consent.
  • On 11 February 2026, the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) issued a joint opinion on the European Commission’s Digital Omnibus Regulation proposal, which seeks to streamline digital regulations, reduce administrative burdens, and enhance competitiveness across the EU. The EDPB and EDPS strongly oppose proposed changes to the definition of personal data, warning that they could narrow its scope, weaken privacy protections, and create legal uncertainty. They do, however, back several provisions aimed at reducing administrative burdens, including raising thresholds for mandatory data breach notifications and extending reporting deadlines.
  • On 25 April, John Edwards, the UK’s Information Commissioner, temporarily stepped back from his role while the ICO conducts an independent investigation into unspecified “HR matters”. Edwards, who has held the position since January 2022, confirmed his cooperation with the inquiry in a LinkedIn post.
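The increased PECR fine cap noted above works in the same way as the UK GDPR cap: the maximum is whichever is greater of the fixed sum and the turnover-based figure. A minimal illustrative sketch (the function name and use of GBP figures are our own assumptions, not anything prescribed by the legislation):

```python
def max_pecr_fine(global_annual_turnover_gbp: float) -> float:
    """Illustrative calculation of the new PECR maximum fine, which
    now mirrors the UK GDPR cap: the greater of a fixed £17.5 million
    or 4% of global annual turnover."""
    FIXED_CAP_GBP = 17_500_000
    TURNOVER_RATE = 0.04
    return max(FIXED_CAP_GBP, TURNOVER_RATE * global_annual_turnover_gbp)
```

For a business turning over £1 billion, the turnover-based figure (£40 million) exceeds the fixed cap; for smaller businesses, the £17.5 million figure applies.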

Latest guidance

  • On 15 January 2026, the Information Commissioner’s Office released updated guidance on international transfers of personal data under the UK GDPR. Key updates include a three-step test for restricted transfers and explanations of roles and responsibilities, particularly in complex, multi-layered transfer scenarios.
  • On 12 March 2026, the UK’s data protection regulator, the Information Commissioner’s Office, published an open letter to social media and video-sharing platforms operating in the UK, calling on them to urgently strengthen their age assurance measures.
  • On 25 March 2026, Ofcom and the Information Commissioner’s Office released a joint statement outlining regulatory expectations for age assurance measures under the Online Safety Act and UK data protection laws. The statement aims to help online services protect children from harmful content and data risks while ensuring compliance with both legal frameworks.
  • On 31 March, the ICO called on businesses to review their use of automated decision-making in recruitment to ensure compliance with data protection laws and to protect jobseekers from unfair or biased outcomes.
  • On 29 April 2026, the Information Commissioner’s Office (ICO) released its finalised guidance on Storage and Access Technologies alongside an update on its online tracking strategy. This guidance addresses the application of the Privacy and Electronic Communications Regulations and, where relevant, the UK GDPR to technologies such as cookies, tracking pixels, device fingerprinting, and similar tools. It incorporates updates following two consultations and amendments introduced by the Data (Use and Access) Act 2025.
  • On 14 April 2026, the European Data Protection Board announced a new Data Protection Impact Assessment template to simplify compliance with the General Data Protection Regulation and promote consistency across Europe.

Latest enforcement action

  • On 15 January, the Information Commissioner’s Office fined Allay Claims Ltd £120,000 for sending over 4 million unsolicited marketing SMS messages between February 2023 and February 2024. These messages promoted PPI tax refund services and were sent without valid consent or compliance with the ‘soft opt-in’ exemption. Allay argued that recipients were existing customers who had engaged with the company in 2019 and signed terms of engagement, which it believed satisfied the ‘soft opt-in’ exemption. However, aggravating circumstances included a previous ICO investigation into Allay in 2020 for PECR breaches; despite that investigation and subsequent complaints, Allay failed to suspend its marketing activities, resulting in further complaints. The ICO also cited the distress caused to recipients, as unsolicited marketing is intrusive and can lead to financial harm, particularly in the context of PPI tax refund services, which often involve high fees and hidden charges.
  • On 2 January, the President of the Personal Data Protection Office (Poland’s data protection authority) imposed a fine of PLN 978,128 (approximately €232,379) on T. S.A. for failing to ensure the independence of the Data Protection Officer (DPO) and for the absence of measures to prevent conflicts of interest in the DPO’s role. The DPO of T. S.A. simultaneously held a managerial role (Director V.) and other positions within the company. The company’s history of GDPR violations was considered an aggravating factor, as it demonstrated ongoing compliance challenges. The company resolved the identified issues by restructuring the DPO’s role before the administrative proceedings concluded, which led to a 40% reduction in the fine.
  • On 29 January, the Italian Data Protection Authority (GPDP) fined e-Campus Online University €50,000 for unlawfully using facial recognition technology to verify student attendance during a teacher qualification course. The university processed biometric data without a valid legal basis, relying on invalid consent while failing to conduct a proper Data Protection Impact Assessment (DPIA) before implementation. The GPDP highlighted several violations of GDPR, including unnecessary data retention, lack of alternatives for students, and the power imbalance inherent in requiring biometric data for course participation. While the university cooperated with the investigation and ceased using the system, the fine reflected the serious nature of processing sensitive biometric data and the large number of students affected.
  • On 13 February, the ICO and Ofcom responded to an open letter from approximately 20 MPs urging the ICO to investigate Tattle Life for potential breaches of data protection laws, following the death of a social media influencer’s 16-year-old daughter.
  • The ICO confirmed it has an ongoing investigation into Tattle Life, examining its compliance with data protection laws. These include obligations to process personal data lawfully, transparently, and fairly, and to address user requests for data rectification or erasure. While the ICO does not have the authority to shut down websites, it can issue enforcement notices to ensure compliance if data protection violations are identified.
  • On 19 February, the ICO won its appeal in a landmark case against DSG Retail Limited. The dispute originated from a 2020 ICO fine of £500,000 imposed on DSG after a cyber-attack compromised the personal data of at least 14 million individuals. Despite appeals by DSG to the First-tier Tribunal and Upper Tribunal, the ICO sought further clarification on a critical point of data protection law by appealing to the Court of Appeal in 2024. The court clarified that the duty to keep personal data secure applies even if the stolen data cannot directly identify individuals, recognising the broader harm caused by cyber-attacks.
  • On 3 February, the ICO reprimanded Staines Health Group for sending excessive medical details about a terminally ill patient to their insurance company, Vitality. A patient at the NHS GP surgery was diagnosed with a terminal illness and made a claim to their insurer. The insurer, on behalf of the patient, requested that five years of medical history be sent to the patient for review before being forwarded to the insurer to progress the claim. However, instead of sending five years of medical history to the patient, Staines Health Group sent 23 years of medical records directly to the insurer. The patient believed the excessive disclosure of unnecessary medical records led to a reduction in the payout of their claim.
  • On 3 February, the ICO issued a monetary penalty of £100,000 to TMAC Ltd for making calls promoting alarm systems and monitoring services to individuals registered with the Telephone Preference Service.
  • On 4 February, the ICO issued a Penalty Notice to MediaLab.AI, Inc. fining it £247,590 for UK GDPR breaches relating to children’s data and the absence of a DPIA. The ICO found unlawful processing of under-13s’ data without valid parental consent and a failure to complete a DPIA for high-risk processing affecting under-18s between 27 September 2021 and 30 September 2025.
  • On 23 February 2026, the ICO issued a Penalty Notice to Reddit, Inc of £14,472,500 for UK GDPR breaches involving children’s personal data and failure to complete a DPIA.

Data protection update

The Data (Use and Access) Act 2025 (DUA Act) becomes UK law

The DUA Act, which received Royal Assent on 19 June 2025, reforms UK data protection laws and will be implemented in phases. Key changes include:

  • A new “recognised legitimate interests” lawful basis, exempting certain data uses (e.g., crime prevention and protecting vulnerable individuals) from a balancing test.
  • Exemptions from the requirement to obtain user consent before deploying non-strictly-necessary cookies for low-risk purposes, such as fraud prevention.
  • Complaints must now be raised with data controllers before escalating to the ICO, so policies and procedures should be updated to include a complaints and escalation mechanism for data protection issues.

UK’s adequate data protection law status likely to be extended to December 2031

UK GDPR, a UK-specific version of the EU GDPR, was deemed “adequate” by the EU, allowing free data flow between the UK and the EU post-Brexit. The adequacy decision, initially expiring in June 2025, was extended to December 2025 following the DUA Act, which made changes to data protection rules in the UK. The European Commission reviewed the UK’s updated data protection framework and concluded it still meets the “essential equivalence” standard, likely extending adequacy until 27 December 2031, with reviews every four years.

Court of Appeal decision on compensation claims for personal data breaches

On 22 August 2025, the Court of Appeal delivered a significant judgment in Farley and Others v Paymaster (1836) Ltd (trading as Equiniti) [2025] EWCA Civ 1117. The case arose from the misaddressing of annual benefit statements (ABS) for 432 police pension scheme members, sent to outdated addresses. Claimants alleged distress and anxiety over potential misuse of their data. While 14 confirmed their ABS had been accessed by unauthorised third parties, the High Court had ruled proof of third-party disclosure was necessary.

The Court of Appeal reversed this decision, holding third-party disclosure is not essential for data protection claims. Mishandling personal data itself constitutes an infringement of GDPR rights. Compensation is recoverable for non-material damage, including anxiety, if the fear of misuse is objectively reasonable. Hypothetical or speculative fears cannot be compensated. The case now returns to the High Court to assess the reasonableness of the appellants’ fears and any psychiatric injuries.

UK’s digital ID scheme

The scheme aims to simplify access to government and private services (e.g., welfare, childcare, renting), reduce identity fraud, streamline verification, and toughen employment checks. It is centred on free digital IDs stored securely on phones and protected by biometric security. The data held includes name, date of birth, nationality/residency status, and a photo; an address may be added post-consultation. Employers will be required to check digital IDs for right-to-work purposes, but the police will not be able to demand to see the digital ID. The UK Government states that the data will be stored on devices with encryption and that credentials can be revoked if a device is lost or stolen. The scheme will be accessible with assistive technologies and physical alternatives, plus support for non-smartphone users. A public consultation is planned for later in the year, with rollout expected by the end of the current Parliament.

ICO call for views on regulating online advertising, legitimate interests, data protection complaints and online safety

  • On 7 July 2025, the ICO launched a consultation on its approach to regulating online advertising under the Privacy and Electronic Communications Regulations (PECR). Open until 7 September 2025, it sought views on balancing privacy, innovation, and economic growth. Consent remains mandatory for high-risk practices like behavioural advertising, but the ICO aims to identify low-risk advertising activities (e.g., ad delivery and fraud prevention). It plans to outline non-enforcement areas and safeguards in early 2026.
  • On 30 July 2025, the ICO launched a consultation on guidance for using profiling tools in online safety systems under the Online Safety Act 2023. Open until 31 October 2025, it focuses on lawful, fair, and transparent use of AI tools for detecting harmful behaviours like grooming and fraud. It highlights compliance with UK GDPR, DPA 2018, and PECR, emphasising lawfulness, transparency, data minimisation, and safeguarding children.
  • On 21 August 2025, the ICO launched public consultations on DUA Act amendments:
    • Recognised Legitimate Interest: This lawful basis allows processing personal data for pre-approved public interest purposes like safeguarding and emergencies. Consultation closes 30 October 2025.
    • Complaints: By June 2026, organisations must establish formal processes for complaints regarding personal data handling. Consultation ends 19 October 2025.

ICO enforcement actions

  • AFK Letters Co Ltd: Fined £90,000 on 14 April 2025 for breaching PECR regulation 21 by making 95,277 unsolicited marketing calls in 2023. Issues included poor consent documentation and non-compliance.
  • 23andMe: Fined £2.31 million on 17 June 2025 for a data breach exposing sensitive genetic and health data of 155,592 UK users. Failures included weak security measures and lack of mandatory multi-factor authentication.
  • Capita: A Penalty Notice issued to Capita plc (£8 million) and Capita Pension Solutions Limited (£6 million) for a data breach. A cyberattack in March 2023 exposed the data of over 6.6 million individuals, including sensitive health data and financial details.

UK direct marketing laws made easier for charities

The UK’s new Data (Use and Access) Act 2025 changes direct marketing law to make it easier for charities to send electronic marketing, without express consent, to existing supporters and to people who have expressed an interest in the charity.

This is referred to as the “soft opt-in” rule, which is currently relied on by many commercial businesses and will be broadened in scope to cover charities.

How can charities rely on soft opt-in?

Charities can send electronic marketing, such as emails, text messages, or direct messages on social media, without a person’s consent, provided that:

  • The sole purpose of electronic marketing is to further the charity’s own charitable purpose(s)
  • The charity collected the contact details directly from the person themselves
  • The charity collected the contact details when a person:
    • expressed an interest in one or more of the charitable purposes; or
    • offered or provided support to further one or more of those purposes
  • People are given a simple and free of charge way of opting out of direct marketing at the time of:
    • collecting their contact details; and
    • every subsequent direct marketing message thereafter

How can charities start to rely on soft opt-in?

The UK’s data protection regulator, the Information Commissioner, has stated that this change allowing charities to rely on the “soft opt-in” rule is planned to commence from January 2026.

What is the latest from the UK regulators on soft opt-in?

The Information Commissioner has produced draft guidance and launched a consultation on the new rules aiming to gather feedback from charities. The consultation runs from 16 October to 27 November 2025 and details can be found here.

What can charities do now to prepare?

  • Review your privacy policy to inform people of the reliance on “soft opt-in”
  • Review your consent mechanisms and plan the changes needed to rely on “soft opt-in”
  • Review your current opt-out mechanism and plan the changes needed to rely on “soft opt-in”
  • Ensure you have a do not contact list of people who have opted out of receiving direct marketing
  • Review existing marketing lists to separate people who have given their consent to electronic marketing and people who will be sent it using the “soft opt-in” rule
  • Train staff on how to respond to queries and complaints from people about the direct marketing
  • Implement policies and procedures to ensure staff know how to implement “soft opt-in” and the rules around data protection
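The list-review steps above can be sketched as a simple segmentation routine. This is a hypothetical illustration only; the field names (`explicit_consent`, `expressed_interest_or_support`) and data shapes are our own assumptions, not any standard schema:

```python
# Hypothetical sketch: segmenting a charity's marketing list for the
# soft opt-in rule, while honouring a do-not-contact list.

def segment_marketing_list(contacts, do_not_contact):
    """Split contacts into those marketed under explicit consent and
    those marketed under soft opt-in, suppressing anyone opted out."""
    suppressed = {email.lower() for email in do_not_contact}
    consented, soft_opt_in = [], []
    for contact in contacts:
        if contact["email"].lower() in suppressed:
            continue  # opted out: never contact
        if contact.get("explicit_consent"):
            consented.append(contact)
        elif contact.get("expressed_interest_or_support"):
            soft_opt_in.append(contact)
        # otherwise: no recorded basis for marketing, so exclude
    return consented, soft_opt_in
```

Each person lands in at most one segment, and anyone on the do-not-contact list is excluded regardless of any recorded consent.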

Safeguarding your business in the wake of the ChatGPT share breach

In today’s fast-paced digital landscape, businesses are increasingly leveraging Artificial Intelligence (AI) tools such as OpenAI’s ChatGPT to streamline operations.

However, recent developments surrounding the now-discontinued “share” feature of ChatGPT should serve as a critical reminder of the importance of robust data governance and proactive measures to safeguard sensitive information, such as personal data and confidential business information.

What happened?

OpenAI recently faced scrutiny after its “share” feature in ChatGPT appeared to inadvertently expose private conversations to public search engines such as Google. While the feature allowed users to share chat links, discrepancies in the user interface and terms across platforms (e.g., Web, iOS, Android) led to confusion over whether shared chats were private or publicly discoverable. Although OpenAI has since removed the feature and requested the removal of indexed links from search engines stating it was a “short-lived experiment”, researchers have alleged that over 100,000 conversations, many containing personal data, were archived and remain accessible in some instances.

At the time of writing, it is also reported that chats from X.com’s “Grok” platform have been exposed online, highlighting a common risk within the industry.

Why it matters to your business

This issue underscores the risks associated with using AI tools and highlights potential vulnerabilities that could expose sensitive company or client data. For businesses, the key takeaways are:

  • Personal data: Conversations shared through AI platforms may include personal data about your employees, customers or clients. There are several data protection compliance issues that must be considered prior to sharing personal data with AI platforms from meeting transparency requirements via privacy policies to carrying out supplier due diligence on your data processing agreements with AI platforms.
  • Confidential information: As with personal data, conversations can be shared through AI platforms about your internal strategy, or intellectual property. Once shared outside of your business, such information can be challenging to remove entirely.
  • Reputational damage: Data leaks can severely impact your brand’s reputation, erode client trust, and lead to loss of business.
  • Regulatory implications: Mishandling of sensitive data could result in non-compliance with data protection laws such as the UK GDPR, leading to fines and legal challenges. Such fines can be up to £17.5m or 4% of your annual turnover (whichever is the greater).
  • Legal claims: Clients or other individuals whose data is exposed may bring legal claims for breach of contract, breach of confidence, privacy or their data protection rights, and complain to the data protection regulator. Some larger data breaches have also attracted attempts to start ‘class-action’ claims.

What should you do?

If your organisation uses AI tools such as OpenAI’s ChatGPT, now is the time to review and strengthen your policies and practices. Below are some actionable steps to consider:

1. Implement an AI usage policy

If you haven’t already, establish a clear AI usage policy within your organisation. This should cover:

  • Approved AI tools and platforms
  • Guidelines on the type of information that can be inputted into AI systems
  • Specific processes for sharing data generated by AI tools

2. Train employees

Educate employees on the risks of using AI tools and ensure they understand how to use these platforms responsibly. Emphasise the importance of avoiding inputting personal data or confidential data into AI systems.

3. Conduct data audits

Review your organisation’s use of AI tools to identify any potential exposure of data. If you suspect that data may have been shared via ChatGPT’s “Share” feature, investigate whether these links have been indexed and take immediate steps to request their removal.

4. Monitor evolving AI risks

AI technology evolves rapidly, and so do its associated risks. Stay updated on developments in the AI space, including how tools such as ChatGPT handle data and privacy.

5. Seek legal support

If your business is impacted by the ChatGPT share breach or similar issues, legal advice can help you assess your exposure, address potential liabilities, and implement stronger safeguards.

How we can help

We understand the complex intersection of technology, data, and the law. Our team of experts can assist you with:

  • Drafting and implementing AI usage policies tailored to your business
  • Conducting data audits to assess your organisation’s risk exposure
  • Advising on regulatory compliance and potential liabilities
  • Supporting you with incident response and remediation in the event of a data breach, regulatory involvement, and legal claims

If you have any questions about how the OpenAI ChatGPT share breach might affect your business or need assistance in implementing preventative measures, please don’t hesitate to contact one of our specialists.

New measures announced to tackle ransomware attacks: what does this mean for businesses?

On 22 July, the UK government unveiled a set of measures designed to curb ransomware attacks and protect critical public and private sector services. Following public consultation, these steps aim to dismantle the business model of cyber criminals while fortifying national resilience against cyber threats.

Ransomware, a form of malicious software, is used by cyber criminals to encrypt victims’ systems or steal data, only unlocking access upon payment of a ransom. This cybercrime costs the UK economy millions of pounds annually, with recent high-profile attacks demonstrating risks ranging from operational disruption to life-threatening consequences.

Key Proposals

  1. Targeted ban on ransomware payments: aimed at public sector bodies, including local government and critical national infrastructure (CNI) operators, this ban intends to eliminate the financial motivation for ransomware attacks on essential services. Nearly 72% of respondents supported this targeted ban, with many agreeing it would reduce funds flowing to criminals and dissuade attacks. However, concerns about implementation, the need for clear guidance, and potential exemptions for life-threatening scenarios were raised.
  2. Ransomware payment prevention regime: this regime would require victims to report their intent to pay ransoms, allowing the Government to assess and potentially block payments to sanctioned groups. Feedback was mixed, with 47% supporting an economy-wide approach, but concerns were highlighted around thresholds creating loopholes for attackers. Respondents also stressed the importance of guidance and support for compliance, particularly for small businesses.
  3. Mandatory incident reporting regime: this proposal mandates victims to report ransomware incidents within 72 hours, followed by a detailed report within 28 days. It received strong backing, with 63% agreeing to an economy-wide mandatory reporting system. Respondents noted that such a regime would strengthen intelligence gathering and law enforcement’s ability to address ransomware threats. However, concerns were raised about reporting burdens on individuals and smaller organisations.
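The two reporting deadlines in the proposed regime reduce to straightforward date arithmetic. A minimal sketch, assuming both windows run from the point the incident is discovered:

```python
from datetime import datetime, timedelta

def ransomware_reporting_deadlines(discovered_at: datetime) -> dict:
    """Illustrative calculation of the proposed reporting deadlines:
    an initial notification within 72 hours of discovery and a
    detailed report within 28 days."""
    return {
        "initial_notification_due": discovered_at + timedelta(hours=72),
        "detailed_report_due": discovered_at + timedelta(days=28),
    }
```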

Next Steps

The Government is proceeding with developing these measures, taking into account the feedback received. Key actions include:

  • Publishing detailed guidance to clarify the scope and implementation of the proposals
  • Exploring proportional penalties and tailored compliance measures for organisations of different sizes and sectors
  • Strengthening victim support services, including expert guidance, operational updates, and intelligence sharing
  • Maintaining the proposed 72-hour reporting window for initial incident notifications

Read more about the Government’s position here and the outcome of the consultation here. If you would like more information, please feel free to reach out to one of our dedicated cyber security lawyers, or if you would like to keep up to date on the latest in data protection, please subscribe to our quarterly newsletter, The Data Download, and watch our recent webinar here.

The Data (Use and Access) Act receives Royal Assent, bringing change to the UK’s data protection regime

On 19 June 2025, the UK’s Data (Use and Access) Act 2025 (the “DUA Act”) received Royal Assent.

This new legislation updates the UK’s current data protection regime, which comprises the UK General Data Protection Regulation (the “UK GDPR”), the Data Protection Act 2018 and the Privacy and Electronic Communications Regulations (the “PECR”). The DUA Act will come into force in phases, expected to commence at two, six and twelve months after Royal Assent, giving you time to implement the necessary data protection changes in your organisation.

What does the DUA Act change and how does it impact organisations?

New ‘recognised legitimate interests’ lawful basis: when you use personal data for legitimate interests, you need to balance the impact on the people whose personal data you use, against the benefits arising from that use – this is commonly done by way of a legitimate interest assessment (“LIA”). However, the DUA Act now includes a list of recognised legitimate interests which means for such interests you don’t need to complete an LIA. This list will be in Schedule 4 to the DUA Act which inserts a new annex to the UK GDPR and includes interests such as:

  1. Sharing personal data if a public authority confirms it’s needed for their public task
  2. Using personal data to safeguard national security, public security, or defence
  3. Using personal data to respond to emergencies under the Civil Contingencies Act 2004
  4. Using personal data to detect, investigate, prevent crime, or prosecute offenders
  5. Using personal data to protect vulnerable individuals from physical, mental and emotional harm or neglect and support their well-being

A new ‘assumption of compatibility’: under the purpose limitation principle, if you re-use personal data you have already collected for a different purpose, you must ensure the new purpose is compatible with the purpose you initially collected it for. However, the DUA Act now includes a list of reuses of personal data that are assumed compatible with the original purpose. This list will be in Schedule 5 to the DUA Act which inserts a new annex to the UK GDPR. You can reuse previously consented personal data for a new purpose if necessary for one of the reasons below, but only if it’s not reasonable to obtain fresh consent, such as using personal data to:

  1. assess or collect taxes or duties; or
  2. comply with legal requirements.

‘Soft opt in’ for charities: if you’re a charity, this allows you to send electronic mail and SMS marketing to people whose personal data you collected when they supported, offered support to, or expressed an interest in your work, provided you offered them a chance to opt out when you collected their personal data and you provide a chance to opt out in every electronic communication thereafter.

New cookie exemptions: the DUA Act allows you to set some types of cookies without having to get consent. Currently, you must get consent for all non-strictly necessary cookies. The list of exemptions will be in Schedule 12 to the DUA Act which inserts a new schedule to PECR, so you won’t need consent where the cookie or similar technology is for:

  1. the sole purpose of carrying out transmission of a communication over an electronic communications network;
  2. the non-exhaustive examples of strictly necessary purposes listed in the schedule, including security, fraud prevention, fault detection and authentication;
  3. the sole purpose of enabling a service provider to collect information for statistical purposes about how their online service is used;
  4. the sole purpose of enabling a service to adapt its appearance or functions in accordance with someone’s preferences; and
  5. the sole purpose of working out the subscriber or user’s geographical location when they request emergency assistance.
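The five exemption categories above can be illustrated as a simple lookup. The shorthand purpose labels below are our own invention for illustration and carry no statutory meaning:

```python
# Illustrative-only sketch: maps a shorthand cookie purpose to whether
# user consent is still likely needed under the new DUA Act exemptions.
# The purpose labels are our own shorthand, not statutory terms.

CONSENT_EXEMPT_PURPOSES = {
    "transmission",        # carrying out transmission of a communication
    "strictly_necessary",  # e.g. security, fraud prevention, authentication
    "statistics",          # first-party usage statistics
    "appearance",          # adapting appearance/functions to preferences
    "emergency_location",  # locating a user requesting emergency assistance
}

def consent_required(purpose: str) -> bool:
    """Return True if the cookie purpose still requires user consent."""
    return purpose not in CONSENT_EXEMPT_PURPOSES
```

Any purpose outside the exempt set, such as behavioural advertising, still requires consent.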

Reasonable and proportional search under data subject access requests (DSARs): it makes it clear that you only have to make reasonable and proportionate searches when someone asks for access to their personal data.

Complaints: individuals have certain rights, such as the right to be informed and the rights to access, object, erase, restrict and rectify their personal data. The DUA Act introduces a right for people to complain to organisations and competent authorities if they think their personal information has been used in a way that doesn’t comply with the law. This is similar to the complaints procedure under freedom of information law. It places an obligation on organisations and competent authorities to:

  1. help people to make complaints, requiring them to take steps such as providing an electronic complaints form; and
  2. acknowledge complaints within 30 days and advise the complainant of the outcome without undue delay.

They must also take appropriate steps in the meantime, such as making enquiries into the subject matter of the complaint and keeping the complainant informed about progress.

Using personal data for scientific research: the DUA Act makes it clearer when you can use personal data for the purposes of scientific research, including commercial scientific research. It makes the following clarifications:

  1. People can give “broad consent” to an “area of scientific research” rather than “specific” consent – as long as: the exact purpose was unknown at the time of consent, the consent aligns with recognised ethical standards for the research area, and individuals are given the option to consent to only part of the processing.
  2. You can re-use people’s personal data for scientific research without giving them a privacy notice, if that would involve a disproportionate effort, so long as you protect their rights in other ways and still explain what you’re doing by publishing the notice on your website.  

Automated decision making – personal data: previously, decisions based solely on automated processing of personal data were restricted unless they were necessary for a contract between you and the individual, permitted by UK law or done with consent from the individual. Now, the DUA Act removes this restriction and allows an organisation to make solely automated decisions in a wider range of situations as long as it has appropriate safeguards in place – such safeguards include:

  1. providing the individual with information about the decision;
  2. allowing that person to make representations about the decision;
  3. enabling that person to obtain human intervention about the decision; and
  4. enabling that person to contest the decision.

There is no change to the restrictions around decisions based solely on automated processing of special categories of personal data – they are still restricted unless you have consent from the individual or the processing is necessary for reasons of substantial public interest under the Data Protection Act 2018.

International transfers: various changes have been made to help make transferring personal data internationally easier. For example:

  1. The protection standard for transferring data now requires that it “is not materially lower” than UK GDPR and Data Protection Act 2018 standards (previously, it required that “the protection of natural persons guaranteed by the UK GDPR is not undermined”). This is referred to as the data protection test.
  2. Schedule 7 of the DUA Act formalises the requirement for an organisation to carry out a transfer risk assessment for transfers subject to appropriate safeguards (such as standard contractual clauses). It does this by saying that an organisation must meet the data protection test “reasonably and proportionately”.

There are also operational and terminology changes. For example, adequacy decisions are now called “transfers approved by regulations”; the Secretary of State must consider specific factors for the data protection test and monitor approvals on an ongoing basis instead of the previous four-year review period; and the Secretary of State gains new powers to recognise and introduce other transfer mechanisms, alongside minor adjustments and restructuring of existing transfer requirements.

PECR breaches and enforcement: there are changes to the rules under PECR, including:

  1. extending the time period within which communications providers must inform the ICO of a personal data breach from “without undue delay or within 24 hours” to “without undue delay and, where feasible, not later than 72 hours after having become aware of it”, aligning it with the UK GDPR requirement to report a personal data breach;
  2. removing the requirement to establish that a contravention under PECR has caused substantial damage or distress; and
  3. allowing the ICO to impose monetary penalties up to a maximum of £17.5m for certain failures to comply, aligning it with the UK GDPR monetary penalty cap.

Changes to the ICO: there are multiple changes around the structure and powers of the UK’s data protection regulator, the Information Commissioner’s Office, such as:

  1. The ICO can compel individuals working for or on behalf of organisations to attend interviews and answer questions if there is suspected non-compliance or an offence under data protection law.
  2. An extension of the time for the ICO to issue a penalty notice after a notice of intent, from six months to six months or as soon as reasonably practicable thereafter.

Are there any new compliance requirements you have to meet?

Yes:

  1. If you provide an online service that is likely to be used by children, the DUA Act explicitly requires you to take their needs into account when you decide how to use their personal information. You should already satisfy this requirement if you conform to the ICO’s Age Appropriate Design Code.
  2. If you don’t already do so, the DUA Act requires you to take steps to help people who want to make complaints about how you use their personal data, such as providing an electronic complaints form. You must acknowledge complaints within 30 days and respond to them “without undue delay”.

Next steps

  1. Familiarise yourself with the changes that the DUA Act makes to data protection laws.
  2. Map out how the DUA Act can make your organisation’s compliance with data protection laws easier – such as “should do” and “must do” lists.
  3. Introduce a complaints escalation mechanism to allow individuals to complain to your organisation if they feel that the organisation has not complied with data protection laws.
  4. Implement a data protection compliance programme accordingly.

If you would like more information, please feel free to reach out to one of our dedicated data protection lawyers, or if you would like to keep up to date on the latest in data protection, please subscribe to our quarterly newsletter, The Data Download.

The UK’s Data (Use and Access) Bill passes as the Lords concede on a push for AI transparency to protect creative industries

On 11 June, the House of Lords debated amendments to the Data (Use and Access) Bill (the Bill), marking the culmination of an extensive “ping-pong” process between the House of Lords and the House of Commons regarding protections for copyright holders in the context of artificial intelligence (AI).

What was the debate about?

  • The Government’s commitment to protecting copyright holders remains but it argues it cannot act prematurely without completing consultations on the issue. Emphasising the importance of transparency, enforcement and remuneration, it insisted on following due process, which includes analysing over 11,500 consultation responses and establishing technical and parliamentary working groups.
  • Several Lords, including Baroness Kidron and Lord Berkeley of Knighton, expressed frustration at the Government’s inaction. They argued that immediate transparency measures are needed to protect copyright holders from exploitation by AI companies. The creative sector fears that AI systems are using copyrighted works without consent or compensation, which could undermine the livelihoods of artists, writers, musicians and others.

What happened?

In an effort to ensure transparency and incentivise AI developers to comply with copyright law, Lord Berkeley of Knighton introduced a new amendment to the Bill requiring AI developers to disclose which copyrighted works they use for training and how they access them, unless a licence has been agreed with rights holders.

Lord Berkeley ultimately withdrew his amendment, citing a desire to maintain the dignity of the House and avoid further unnecessary divisions. However, he and others urged the Government to take the concerns of the creative industries seriously and act swiftly to address them.

What will happen next?

The Bill now awaits Royal Assent. Once in force, it will reform elements of the UK GDPR and the Privacy and Electronic Communications Regulations – from introducing a list of recognised legitimate interests to adding new exceptions to the consent requirements for cookies and similar technologies.

It should be noted that while the UK’s adequacy decision from the EU, which allows the free flow of personal data transfers, has been extended to 27 December 2025, the Bill introduces changes to the UK GDPR that ultimately lead to a departure from the EU GDPR. As such, we eagerly await the EU’s decision on whether the UK’s data protection regime continues to offer materially equivalent protections, so as to maintain the free flow of transfers between the UK and EU.

If you would like more information, please feel free to reach out to one of our dedicated data protection lawyers, or if you would like to keep up to date on the latest in data protection, please subscribe to our quarterly newsletter, The Data Download.

Reel trouble: the ICO reprimands Greater Manchester Police for CCTV failings

On 29 May 2025, the ICO reprimanded Greater Manchester Police (GMP) for failures in handling sensitive CCTV footage of a custody detainee, exposing gaps in data protection practices. The case highlights outdated policies, inadequate training, and procedural failings that led to missing footage.

Background

The data subject was held in custody at Pendleton Police Station for 48 hours in February 2021, during which CCTV was in operation. GMP became aware of serious allegations made against officers via local media and requested that Pendleton Police Station retain the personal data of the data subject. This was beyond the documented retention period of 90 days; the procedures in place at the time allowed retention for a period of up to six years.

During the retention process, the personal data was to be quality checked to ensure its security. GMP had received multiple Data Subject Access Requests (DSARs) from the individual concerned, and when GMP was able to comply with the request to release the captured footage, the footage was then quality checked.

Following a resolved technical issue, where one of the discs containing some of the data would not initially play, it was established on 19 May 2022 that two hours of footage were missing from the personal data set originally retained in 2021.

On 23 August 2023, GMP stated that, despite all attempts, it was unable to recover the missing two hours of footage. This led GMP to self-report a personal data breach to the ICO on 5 September 2023.

Findings

Following an assessment of information provided by both the Independent Office for Police Conduct and GMP, which were conducting separate investigations with different scopes, the ICO identified two main failures leading to the lack of a quality check:

  • A misunderstanding at the time between staff, each believing that the other had conducted a quality check; and
  • A lack of any policies or guidelines within GMP at the time identifying that quality checks were required, coupled with a lack of appointed responsibility for this task.

Therefore, the ICO considers that GMP failed to take the following actions:

  1. Provide the data subject with their personal data without undue delay and by the end of the applicable period of one month. This is because following the expiry of any exemptions in place to the right of access, GMP was not able to release all applicable personal data to the individual within the timeframe or to date. GMP did not provide the ICO with any evidence that it notified the data subject of any such extension.
  2. Ensure that appropriate technical or organisational measures were in place to protect against the accidental loss of the CCTV data it was processing in 2021. The ICO considers that had GMP had an appropriate standard operating procedure (SOP) in place, with clearly defined and delegated responsibilities for quality checking any backed-up personal data, the risk of this breach would have been mitigated. GMP failed to deploy an adequate SOP designed to encompass the processing and retention of personal data beyond 90 days. The operating procedure that was in place had been developed in 2017 and had not been reviewed or amended since that time. In line with good practice, SOPs should be reviewed and updated, if necessary, at least once every 12 months.
  3. Conduct a data protection impact assessment (DPIA) in relation to its CCTV systems. A DPIA should have been conducted in compliance with section 64 of the DPA 2018. A DPIA would have crucially assisted GMP in identifying shortfalls in its technical and organisational measures at the time.
  4. Provide GMP’s custody officer with data protection training, despite having a data protection training regime in place which was supposed to provide all staff members with data protection training during induction.

There were issues with the CCTV system itself such as:

  • The CCTV system, in operation at the time, was only able to download captured footage for retention in half-hour or one-hour segments. This placed GMP staff at substantial risk of human error.
  • The CCTV system did not save the half-hour/one-hour segments in chronological order, resulting in it being difficult to identify if all required footage had been captured.
  • The CCTV system did not have any inbuilt alerts, identifying any errors that may have occurred during the back-up process.

Mitigating and remedial steps taken by GMP

The ICO took into account the following:

  • GMP, at the time of the breach, required a form of authorisation: the signed authorisation of an officer ranked inspector or above to allow the appropriate team access to the footage recorded on the server (held for 90 days before automatic overwrite).
  • Any footage retained was stored by GMP in sealed evidence bags at the time. This ensured there was no break in the evidence chain, during the period the footage was held by GMP and Pendleton Police Station.
  • GMP made a proactive investment in its surveillance and security system infrastructure in 2023, resulting in a significant upgrade to its system capabilities.
  • GMP has introduced a strictly regulated process to ensure that only authorised force personnel have access to the footage held within the CCTV server. Access is restricted to qualified officers within the criminal justice and custody branch of GMP.
  • GMP has informed the ICO of improvements to their security when managing DSARs from individuals. GMP advised that these requests are now administered centrally within their Information Access team. Where a DSAR is submitted, custody officers contact the relevant custody unit as soon as possible with urgent instructions as to how the footage is to be retained, so this is not overwritten. The footage is automatically uploaded to a dedicated local folder for DSARs. This location can only be accessed by authorised officers within the custody branch.
  • Auditing of footage has been vastly improved. This provides a comprehensive account of which officers have accessed the footage, copied it to disc or the location of the server, with date stamps.
  • GMP has already improved its SOP. The operating procedure has undergone a complete rewrite. GMP will ensure that the new procedure is circulated across the force and reviewed on an annual basis.

Action

Taking into account all the circumstances of this case, including the mitigating factors and remedial steps, the ICO decided to issue a reprimand to GMP. The ICO set out certain recommendations which do not form part of the reprimand and as such are not legally binding. Such recommendations include:

  • When formulating a replacement for the current processes, GMP should create an appropriate SOP, detailing how any retained personal data should be quality checked.
  • When developing the SOP, the roles and responsibilities for such checks should be clearly defined.
  • Under section 64 of the DPA 2018, GMP is required to have a DPIA in place for this processing. GMP should develop a DPIA for this processing without delay if they haven’t done so already.
  • GMP should deploy appropriate technical and administrative processes to monitor that all staff receive appropriate data protection training, which is refreshed at least every two years (recommended every year), in line with good practice. Staff should be trained and regularly refreshed on how to identify a personal data breach.
  • All breaches should be reported to GMP’s Information Access team/Data Protection Officer for assessment and documentation.
  • GMP should always keep a written record/assessment regarding their rationale not to inform the ICO of a breach.

Comment

While the ICO’s decision to reprimand, rather than fine, GMP reflects its Public Sector Approach – which avoids penalising taxpayer-funded organisations to prevent a “double hit” on victims and the public – this enforcement underscores the critical importance of protecting highly sensitive data, such as CCTV footage, where mishandling can lead not only to a data breach but a failure to respond to a data subject’s request. The key takeaway is to ensure measures are in place to comply with data protection laws in relation to CCTV such as access procedures, retention policies, security measures, staff training and data protection impact assessments.

If you would like more information, please feel free to reach out to one of our dedicated data protection lawyers, or if you would like to keep up to date on the latest in data protection, please subscribe to our quarterly newsletter, The Data Download.

Joint Controllers, TC Strings, and OpenRTB: Unpacking the Belgian Market Court’s Appeal Decision on IAB Europe’s TCF

On 14 May 2025, the Belgian Market Court (part of the Brussels Court of Appeal) delivered a landmark judgment in the case concerning IAB Europe’s Transparency and Consent Framework (TCF).

The case centred on allegations that IAB Europe violated the General Data Protection Regulation (GDPR, or AVG in Dutch) through its data processing practices within the TCF. This judgment follows an earlier decision by the Belgian Data Protection Authority (APD), which found several breaches of the GDPR and imposed a €250,000 fine on IAB Europe.

CASE BACKGROUND

IAB Europe is an international non-profit association that aims to support compliance in the digital advertising and marketing sector. It developed the TCF to promote adherence to the GDPR when internet sites or applications use the OpenRTB protocol.

On 2 February 2022, the APD found that IAB Europe’s TCF violated GDPR and fined IAB €250,000. Key findings included:

  • The TC String (user preferences signal) is personal data.
  • IAB Europe is a joint controller for both the creation and subsequent processing of the TC String.
  • Lack of a valid legal basis for processing TC Strings as the TCF did not obtain explicit and informed consent from users, nor could it rely on legitimate interests due to the large-scale and intrusive nature of the data processing involved.
  • Failure to fulfil transparency obligations and not adequately informing users about its role as a data controller, the purposes of data processing, or the recipients of their data.
  • Inadequate security measures and lack of mechanisms to prevent manipulation of consent signals.
  • Failure to conduct data protection impact assessments.
  • Failure to appoint a data protection officer. 
  • Incomplete register of processing activities.

On 4 March 2022, IAB Europe challenged the APD’s decision before the Belgian Market Court, disputing its role as a joint controller and the APD’s legal analysis on the TC String being personal data.

On 7 September 2022, the Belgian Market Court made an interim ruling, confirming the procedural irregularities in the APD’s investigation. It referred two preliminary questions to the CJEU:

  • Does the TC String constitute personal data under GDPR?
  • Is IAB Europe a joint controller for processing TC Strings and subsequent data uses?

On 7 March 2024, the CJEU judgment confirmed that:

  • the TC String may constitute personal data if:
    1. It is associated with other data points (e.g., IP address) that can identify a user.
    2. IAB Europe has reasonable means to access such data.
  • IAB Europe may be a joint controller for the creation and use of TC Strings if it influences the processing’s purposes and means.
  • IAB Europe is not a joint controller for subsequent processing (e.g. personalised advertising) by third parties.

The case was sent back to the Belgian Market Court for factual verification and further examination, which this article explains.

FINDINGS OF THE MARKET COURT

Are TC Strings Personal Data?

TC Strings are unique codes containing users’ consent preferences.

The Market Court referenced the preliminary ruling of the CJEU in March 2024, which clarified that TC Strings, when linked to identifiers such as IP addresses, allow for user identification.

In paragraph 48 of the judgment, the Market Court stated that “the fact that IAB Europe itself would not have the reasonable means to proceed with Identification because it cannot make the link between a TC String and the IP address and would not have direct access to the personal data, is in itself irrelevant”.

As such, the Market Court confirmed that a TC String is personal data within the meaning of Article 4(1) of the GDPR.
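For context on why a TC String can be linked back to a user at all: under the public IAB TCF v2 specification, a TC String is a base64url-encoded bit field whose core segment starts with a 6-bit version number, followed by timestamps, CMP identifiers and the encoded consent choices. A minimal sketch, assuming only that publicly documented layout (this is not a full TCF decoder), reads just the version field:

```python
import base64

def tc_string_version(tc_string: str) -> int:
    """Read the version field (first 6 bits) of a TCF core TC String.

    Assumes the publicly documented IAB TCF v2 layout: the core segment
    (before any '.'-separated extra segments) is base64url-encoded, and
    its first 6 bits hold the format version. Illustrative sketch only.
    """
    core = tc_string.split(".")[0]          # core segment comes first
    padded = core + "=" * (-len(core) % 4)  # restore stripped base64 padding
    raw = base64.urlsafe_b64decode(padded)
    return raw[0] >> 2                      # top 6 bits of the first byte

# A hand-built byte 0b000010_00 carries version 2 in its top 6 bits:
print(tc_string_version("CA"))  # 2
```

The string itself carries no direct identifier, which is precisely why the Court’s point matters: identification arises only once the string is combined with other data, such as an IP address, held elsewhere in the ecosystem.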

Is there any processing of personal data?

IAB Europe, as the managing organisation and central figure in the digital ecosystem, determines the storage and dissemination of the TC String.

Under the TCF Technical Specifications, the TC String is shared with Consent Management Platforms (CMPs) in two ways:

  • By storing it in a shared global consent cookie on IAB Europe’s consensu.org domain; or
  • By storing it in a CMP-chosen system for service-specific consent signals.

The Market Court found that storing the TC String in a shared cookie and making it available via the consensu.org domain clearly constitutes processing of personal data under GDPR.

The Market Court further explained that, regardless of the consent cookie or domain, processing of personal data occurs in the TCF, including:

  • User preferences being collected by CMPs (along with the user’s IP address);
  • User preferences being structured and ordered in a TC String; and
  • The TC String being stored, distributed, and shared with TCF participants.

Should IAB Europe’s Role in the TCF be considered as a Data Controller?

Paragraphs 62-75 of the judgment confirm that IAB Europe has real decision-making power over both the purposes and means of processing, given its overriding control over the operation of the TCF:

  • IAB Europe acknowledges its responsibility for the TCF in its own documentation – such as “Frequently Asked Questions” on the TCF (version 2.0) – noting that this judgment only focusses on v2.0 as IAB Europe’s TCF v2.2 already includes updates to address compliance concerns raised.
  • IAB Europe exercises a decisive influence in determining the purpose and means of these processing operations. It shares a purpose with the other participants in the processing of personal data: ensuring that user preferences are captured in a structured way and then shared with all other participants. Even though many TCF participants may be competitors, when it comes to the processing of user preferences under the TCF they all have similar interests, which are also similar to those of IAB Europe.

The Market Court states that “the concept of a data controller in this case does have to be interpreted broadly, since IAB Europe is the only one who, as it itself states, manages and administers the TCF and can therefore resolve the issues identified by the Dispute Resolution Chamber, after consultation with all other EU regulators.”

The Market Court confirmed that IAB Europe is a joint data controller with TCF participants for storing the consent preferences of the affected users in the TC String.

If yes, is IAB Europe a Joint Controller for the processing of personal data in the context of OpenRTB?

The Market Court assessed whether IAB Europe with the TCF “influences” the further processing of personal data under OpenRTB.

The APD argued that IAB Europe’s TCF and OpenRTB are inherently interconnected. It claimed that IAB Europe facilitates an ecosystem where consent preferences are collected and shared for further processing by third parties (e.g. publishers and adtech vendors). As such, the APD considered IAB Europe and participating organisations to be joint controllers for both the collection and dissemination of consent data.

The Market Court identified inconsistencies in the APD’s reasoning. Although the APD acknowledged that IAB Europe does not act as a data controller for processing under OpenRTB, it nevertheless implied such responsibility in its decision. The Market Court found that the Appellants had limited the scope of their arguments to the TCF, that no evidence was provided to establish IAB Europe as a joint controller for OpenRTB processing, and that it lacked influence over this stage of data use.

It concluded that the APD failed to demonstrate that IAB Europe acts as a joint data controller for processing operations under OpenRTB, as not all processing stages fall under its control.

OUTCOME

The Market Court upheld the €250,000 fine imposed by the APD, deeming it proportionate and justified under Article 83 of the GDPR. It also confirmed the corrective measures requiring IAB Europe to bring its processing activities into compliance.

The Market Court dismissed most of IAB Europe’s grievances but acknowledged procedural flaws in the initial decision. It upheld the APD’s sanctions regarding TCF operations but clarified that IAB Europe is not responsible for OpenRTB operations – annulling the APD’s decision in part.

IAB Europe was ordered to pay the costs of the proceedings, estimated at €7,848.84, and other contributions totalling €424.

IMPLICATIONS

This Judgment clarifies that even entities without direct access to personal data can be held accountable as data controllers if they influence the purposes and means of processing.

For the adtech industry, this ruling reinforces the GDPR principles and in particular supports the requirements to:

  • carefully examine consent mechanisms to ensure they are transparent, freely given, specific, informed and unambiguous;
  • ensure the use of consent frameworks like the TCF does not create ambiguity about their own roles and accountability in data processing operations;
  • provide users with clear, accessible, and understandable information about how their data is processed; and
  • minimise the processing of personal data by leveraging contextual advertising, privacy-enhancing technologies, and aggregated or pseudonymised datasets instead of third party cookies.

The UK’s data protection regulator publishes a new code of conduct for UK private investigators and litigation services

On 13 November, the Information Commissioner’s Office (ICO) approved and published a new sector-owned code of conduct – the Association of British Investigators Limited (ABI) UK GDPR Code of Conduct for Investigative and Litigation Support Services (Code).

What is the Code?

The Code seeks to address key challenges faced by investigators and enable code members to demonstrate compliance with specific areas of data protection law in the provision of investigative and litigation support services.

It aims to provide sector-specific guidance and to increase accountability in handling personal data. As such, adhering to the Code helps demonstrate compliance with data protection laws in the UK.

The Code includes advice, guidance, and practical examples in relation to:

  • the roles and responsibilities of investigators;
  • how to conduct Data Protection Impact Assessments;
  • identification of the lawful basis for processing personal data;
  • Legitimate Interests Assessments including for invisible processing such as covert surveillance, tracking devices, background checks and social media monitoring; and
  • consent to share when tracing and locating individuals in certain cases.

How does the Code help your private investigation or litigation service?

  • Public confidence: Verified adherence to the Code is intended to give confidence to users and subjects of investigative and litigation support services. It demonstrates that Code members comply with key aspects of data protection law and operate to a high standard in key areas.
  • Reduce risk and enforcement action: Showing compliance with the Code reduces the risks of enforcement action from the ICO. This means you are less likely to receive fines, reprimands or other regulatory action in the event of a breach of data protection laws.
  • Due diligence carried out by users: Users of investigation and litigation services (particularly other businesses who are controllers) should be carrying out diligence on service providers. Your prospective clients may check whether you adhere to the Code when they are carrying out due diligence prior to instructing you.

Can I sign up to the Code? If so, how?

Investigators and litigation services can voluntarily sign up to the Code. Code membership is managed by an independent, ICO-approved and UKAS-accredited monitoring body, and Code members must satisfy the monitoring body that they meet the requirements set out in Appendix I to the Code. Such requirements include:

  • Administrative evidence: Such as registration with the ICO, basic DBS disclosure, two references, financial checks and a CV.
  • Training: Satisfactory completion and maintenance of data protection training to the level comparable to the ABI UK GDPR compliance workshop, or training to an equivalent standard on the areas covered by the Code – including data protection impact assessments, lawful bases and more.
  • Roles and responsibilities: Evidence that the Code member has documented and communicated to its client the roles and responsibilities in respect of the data processing undertaken in the delivery of Code services. This could be evidenced for example by providing a copy of the client engagement letter and/or contract.
  • Case extracts: Samples of Data Protection Impact Assessments, lawful bases relied on, and Legitimate Interests Assessments, in particular in relation to children. The Code notes that Code members must not maintain a register of criminal convictions.
  • Complaints: Evidence of any data protection complaints received by the Code member from individuals and the steps the Code member took to respond. Where relevant, evidence that, in relation to monitoring body investigations of alleged breaches of the Code, the Code member has communicated with the monitoring body in accordance with the Code and its cooperation criteria.

The Code builds on the existing standards and criteria required for ABI membership. However, Code members are not required to be ABI members: Code membership is available to any sector agency that meets the Code member criteria in Appendix I to the Code, whether affiliated to the ABI or not.

What to do next?

We can assist you with your data protection compliance programme ahead of signing up to the Code. The following checklist describes the compliance steps we suggest you cover:

  • Registration with the ICO: As a data controller, you are obliged to pay a fee to the ICO, the amount of which depends on the size of your organisation.
  • Records of processing activity: This document explains what data you process, how, who it is shared with and why. This is a legal requirement under the GDPR (in most cases) but, in any case, is a necessary exercise in order to satisfy the other requirements below.
  • Privacy policies: Such as website privacy policy, employees privacy policy, recruitment privacy policy, privacy policy for users and third parties subject to the services – this is to comply with transparency requirements.
  • Cookie audit: A cookie policy and consent mechanism (cookie banner) – the mechanism through which you obtain consent to set cookies. A good cookie banner will be tailored to your needs and allow users to choose which types of cookies they accept. This is a requirement under the electronic marketing rules.
  • Assessments: Such as Data Protection Impact Assessments, Legitimate Interests Assessments and Transfer Risk Assessments – this is to demonstrate your compliance and prove accountability.
  • Supplier onboarding checklist and procedure and template data sharing clauses: To ensure you have carried out due diligence on any third parties you engage to help deliver your services.
  • Data protection rights procedure: This document sets out how to manage DSARs and other requests in relation to an individual’s data. Dealing with these requests is a legal requirement; getting it wrong can lead to fines and reputational damage.
  • Security incident management policy: This document sets out what each team needs to do in the event of a data breach. Responding to breaches appropriately is a legal requirement; getting it wrong can lead to fines and reputational damage.
  • Regular privacy training: We can provide introductory or further training sessions depending on what your staff have already received. To comply with your security obligations, you must train staff so that human error is avoided to the extent possible and they understand the GDPR’s requirements.
  • Data handling policy: This policy explains why data protection is important and how you and your staff comply with data protection laws on a day-to-day basis.
  • BYOD and acceptable use policy: This policy contains rules on how employees may use their personal devices for work, including acceptable use practices.
  • Data security policy: This policy documents how you keep data safe from an organisational and technical perspective.
  • Data retention policy: This document explains how long you keep each type of data.

If you would like more information, please feel free to reach out to one of our dedicated data protection lawyers, or if you would like to keep up to date on the latest in data protection, please subscribe to our newsletter, The Data Download, here.

Further details about the Code can be found here.