Data & Privacy eBulletin: Autumn 2020

19 October 2020

Welcome to our Autumn 2020 Data & Privacy eBulletin, where we look at the ongoing impact of the Schrems II decision, COVID-19 data privacy implications and other recent data developments.

ICO Commissioner Elizabeth Denham’s term to end July 2021

Elizabeth Denham, who has faced some press criticism for working from Canada over these COVID-affected months, has confirmed that she will step down when her term comes to an end in July 2021.

Remember Brexit?

2020 is not just the year of COVID; it is also the last year in which the UK is governed by the EU’s data protection regime. Unless a different deal is agreed before the end of the year, the transition period will end on 31 December 2020 and, from 1 January 2021, the UK will be subject only to the UK’s implementation of the GDPR.

This will have a number of data protection implications, including:

  • EU-UK and UK-EU data transfers: we’ve covered this topic in previous eBulletins; essentially, under current arrangements, data transfers from the UK to the EU will continue to be permitted (as the UK has confirmed that it will grant the EU “adequacy”), but EU-UK transfers are at risk unless the EU grants adequacy to the UK. There is no sign of that at the moment (and it may have become more difficult following a recent CJEU decision restricting permitted EU surveillance practices), and using model clauses to govern such transfers raises many of the same questions that arise when using such contracts to govern EU-US data transfers post-Schrems II.
  • Appointment of Representatives: if a UK-based controller or processor meets the criteria for having to appoint an EU Representative under Article 27 of the GDPR, it must have one in place prior to 1 January 2021. Equally, an EU-based controller or processor meeting the same criteria in relation to the UK will need to appoint a UK Representative before the end of this year.
  • Regulatory relationships: UK businesses are used to having to deal with the ICO in relation to data protection issues affecting their business. Where a business has customers or operations in the EU, it will have an increasing exposure to national EU regulators.

With a couple of months left to go there is time to make the necessary arrangements, but the clock is ticking.

ICO’s Children’s Code came into force on 2 September

The ICO’s Age Appropriate Design Code, known as the Children’s Code, came into force on 2 September 2020 (with a 12-month implementation period). The Children’s Code is a statutory code of practice that applies to organisations providing online services and products targeted at, or likely to be accessed by, anyone under 18. Social media platforms, educational websites, streaming services, online games and apps, and connected toys are all likely to be within scope.

The Code emphasises a “privacy by design” approach, focusing on the best interests of the child.

Organisations are required:

  • to switch off privacy-intrusive settings by default (e.g. location) and ensure other settings are “high privacy” by default;
  • not to deploy “nudge” techniques or tools which might encourage children to turn off or weaken their privacy settings;
  • to provide prominent, accessible and user-friendly privacy tools to enable children to exercise their rights and report concerns;
  • to provide concise and prominent privacy information to users in clear language and in formats suited to the age of the child (including information about any parental controls or monitoring); and
  • to collect and retain only the minimum amount of personal data necessary.

If your service is for, or is likely to be accessed by, children, you will need to take steps to comply with this Code.

This may include:

  • reviewing privacy information and tools offered to users and considering if they are appropriate for the age range of the child users;
  • reviewing the default privacy settings of your service and considering whether you need to adjust these in line with the standards mentioned above; and
  • reviewing your DPIA for the service and updating it to ensure you have documented the steps taken to comply with the Code. If you consider that your services are not likely to be accessed by children, you should also document that in your DPIA so that you are able to justify your decision if required to do so by the ICO.

Schrems II fallout continues

Following the Schrems II decision (see our previous eBulletin for further detail), key developments include:

  • Regulators rushing to provide guidance: the EDPB has produced a set of FAQs addressing some of the key questions raised by the invalidation of Privacy Shield and, more generally, by Schrems II in relation to EU-US data transfers. A number of national regulators in the EU have also produced their own guidance, but the ICO has not, to date, gone any further than pointing to the EDPB guidance and stating that it will provide “practical and pragmatic support.”
  • Cases continue: on 17 August 2020, noyb (none of your business) announced that it had filed 101 complaints with supervisory authorities in 30 EU and EEA member states against EU controllers and Google and Facebook in the US, over EU-US data transfers which noyb states are in contravention of the CJEU’s ruling in Schrems II.

COVID-19: Contact tracing apps

The ICO has published a statement which acknowledges the importance of contact tracing to support the fight against COVID-19 and welcomes the development of an effective app.

Significantly less labour-intensive and arguably more efficient than traditional methods, contact tracing apps have been heralded as a key tool in the fight against COVID-19. However, these apps have also raised a number of data protection issues due to their extensive collection and use of highly sensitive personal data. In order to maintain public and political confidence in such apps, it is vital that any COVID-19 contact tracing app complies with the key principles of the GDPR:

  1. Lawfulness, fairness and transparency – consent will, in most cases, be the lawful basis for processing data.
  2. Purpose limitation – any personal data collected must only be used for the specific purposes disclosed to the user.
  3. Data minimisation – only the type and amount of personal data which is necessary to achieve the core purpose should be collected. In most cases the ICO has suggested that location identifying data should not be collected.
  4. Accuracy – personal data must be kept accurate and up to date, and the app should allow individuals to exercise the related rights to rectification and erasure of personal data.
  5. Storage limitation – personal data collected by an app should only be stored for the minimum amount of time required to achieve the relevant purpose.
  6. Security – given the highly sensitive nature of the data and the potential for misuse, any app must employ state-of-the-art cryptographic and security techniques.
  7. Accountability – Data Protection Impact Assessments should be conducted prior to any processing taking place given the high risk to the rights and freedoms of individuals.

ICO launches guidance on AI and data protection

The ICO has published guidance to help organisations understand how data protection principles will apply to the development and deployment of Artificial Intelligence systems. It includes recommendations on best practice and technical measures.

The guidance covers four main areas:

Accountability and governance

  • in most cases, a DPIA will be legally required in relation to AI activity. The ICO acknowledges that it can be difficult to describe the processing activity of AI systems, especially where they involve complex models and data sources. Although not a legal requirement, the ICO recommends maintaining two versions of the DPIA – one containing the technical description for specialist audiences and the other containing a high-level description focusing, in particular, on how the personal data inputs relate to the outputs affecting individuals. The latter will also help organisations fulfil their obligation to explain AI decisions to individuals.

Lawfulness, fairness, and transparency of processing

  • organisations will need to break down and separate each distinct processing operation involved in the AI system and ensure there is a lawful basis for each one. It is likely to be necessary to distinguish between the lawful basis appropriate for training the AI system and that appropriate for its deployment;
  • in terms of fairness, statistical accuracy and the risks of bias need to be addressed in the development and procurement of the AI system; and
  • in terms of transparency, the ICO has developed, alongside The Alan Turing Institute, separate detailed guidance on “explaining decisions made with AI”.

Data minimisation and security

  • the ICO acknowledges that AI systems pose a data minimisation challenge, as they require large amounts of data to train – organisations should consider whether they can use synthetic data to train systems, anonymise data, or convert it into less human-readable formats.

Compliance with data subject rights

  • organisations will have to assess how individuals will be able to exercise their rights in relation to personal data used in the life-cycle of an AI system; and
  • particular attention should be paid to automated decision-making with legal or similarly significant effects. As well as informing individuals about the logic involved, organisations need to implement suitable safeguards when processing personal data in this manner, including providing individuals with the right to obtain human intervention, express their opinion and contest the decision.

Data breaches & enforcement

Enforcement news

  • BA’s ICO fine reduced from £183m to £20m: The £20m fine is considerably smaller than the £183m that the ICO originally said it intended to issue back in 2019, the reduction reflecting “the economic impact of COVID-19.” However, it is still the largest penalty issued by the ICO to date. The incident took place in 2018, when customers’ personal and credit card data were compromised.
  • Direct marketing calls can contravene more than just data protection laws: The ICO has fined CPS Advisory Ltd £130,000 for making more than 100,000 direct marketing calls about pensions without lawful authority. The ICO found that the company was not a trustee or manager of a pension scheme, nor was it authorised by the FCA. The ban on cold calls in relation to pensions (which includes emails and texts) was introduced in January 2019.
  • 21m+ unsolicited emails: Koypo Laboratories, a “lead generator specialising in scientific customer acquisition” was fined £100k for instigating the transmission of 21,166,574 unsolicited emails between 1 March 2017 and 31 March 2018.
  • Check the effectiveness of your “consents”: Price comparison and technology company Decision Technologies Limited has been fined £90k; between 12 July 2017 and 23 May 2018 it sent 14,986,423 direct marketing emails (with a further 1,136,647 emails estimated), in circumstances where there were issues as to whether the consent obtained was freely given, specific and informed. The ICO considered that Decision Technologies failed to take reasonable steps to prevent the contraventions.
  • 270k unsolicited cold calls to Telephone Preference Service registrants: Rain Trading Ltd has been fined £80k for making 270,774 unsolicited direct marketing calls to subscribers on the TPS register without valid consent between 1 January 2018 and 29 November 2018.
  • €35m fine for H&M in relation to unlawful employee monitoring: Although not an ICO action, this fine by the Hamburg DPA is notable not just for its size but also for its context. In “welcome back talks” with employees returning from periods of absence, H&M managers recorded various details of the employees’ personal lives, which were then accessible by other managers. Take great care over what information you gather about your employees, and with whom you share it.

Significant recent cases

Marriott faces class action over data breach (Bryant v Marriott International Inc and others)

A class action has been filed in the High Court against Marriott International following the data breach investigated by the ICO in 2019.

Martin Bryant, founder of technology and media consultancy Big Revolution, is leading the claim on behalf of English- and Welsh-domiciled guests after more than 300 million customer records from Marriott’s global database, potentially including passport and credit card details, were compromised in a hack between 2014 and 2018.

Around seven million British guest records were compromised by the hack, according to the UK ICO, which last year proposed to fine Marriott £99.2m. Concerns have been raised, however, regarding the robustness of the ICO’s approach to enforcement, given the multiple extensions it agreed during its investigation of the Marriott breach.

Facial-recognition technology in breach of human rights and data protection legislation (R (Bridges) v Chief Constable of South Wales Police and others, Court of Appeal)

The appeal concerned the lawfulness of the use of automated facial-recognition (AFR) technology by the South Wales Police Force. The police deployed AFR on about 50 occasions between May 2017 and April 2019 at public events, matching faces against watchlists of various persons, including persons wanted on warrants and persons suspected of having committed crimes.

Mr Bridges was not included on a watchlist but had brought a claim for judicial review on the basis that the use of AFR was unlawfully intrusive, including in relation to his privacy rights under Article 8 of the European Convention on Human Rights and data protection legislation in the UK.

On 4 September 2019, the Divisional Court dismissed the claim on all grounds. Mr Bridges appealed to the Court of Appeal, and the appeal was allowed in part.

Some key points to note:

  • It was held that the police’s interference with the Article 8 privacy rights was not in accordance with the law. There was no clear guidance on where AFR could be used and who could be put on a watchlist. This was too broad a discretion to afford to the police officers to meet the standard required by Article 8(2).
  • However, the benefits of AFR were potentially great and the impact on Mr Bridges was minor, so the use of AFR was held to be a proportionate interference under Article 8(2).
  • It was found that the police had not carried out an adequate Data Protection Impact Assessment.

Any organisation wanting to use facial-recognition technology should consider the judgment carefully. It is clear that any use of the technology must undergo a Data Protection Impact Assessment which recognises that privacy rights are likely to be infringed.
