Welcome to our Autumn 2020 Data & Privacy eBulletin, where we look at the ongoing impact of the Schrems II decision, COVID-19 data privacy implications and other recent data developments.
ICO Commissioner Elizabeth Denham’s term to end July 2021
Elizabeth Denham, who has faced some press criticism for working from Canada over these COVID-affected months, has confirmed that she will step down when her term comes to an end in July 2021.
ICO’s children’s code has entered into force
Schrems II fallout continues
Following the Schrems II decision (see our previous eBulletin for further detail), a couple of key developments include:
- Regulators rushing to provide guidance: the EDPB has produced a set of FAQs attempting to address some of the key questions raised by the invalidation of Privacy Shield, and by Schrems II more generally, in relation to EU-US data transfers. A number of national regulators in the EU have also produced their own guidance, but the ICO has not, to date, gone further than endorsing the EDPB guidance and stating that it will provide “practical and pragmatic support.”
- Cases continue: on 17 August 2020, noyb (none of your business) announced that it had filed 101 complaints with supervisory authorities in 30 EU and EEA member states against EU controllers and Google and Facebook in the US, over EU-US data transfers which noyb states are in contravention of the ECJ’s ruling in Schrems II.
COVID-19: Contact tracing apps
The ICO has published a statement which acknowledges the importance of contact tracing to support the fight against COVID-19 and welcomes the development of an effective app.
Significantly less labour intensive and arguably more efficient than traditional methods, contact tracing apps have been heralded as a key tool in the fight against COVID-19. However, these apps have also raised a number of data protection issues due to their extensive collection and use of highly sensitive personal data. In order to maintain public and political confidence in such apps, it is vital that any COVID-19 contact tracing app complies with the key principles of the GDPR:
- Lawfulness, fairness and transparency – consent will, in most cases, be the lawful basis for processing data.
- Purpose limitation – any personal data collected must only be used for the specific purposes disclosed to the user.
- Data minimisation – only the type and amount of personal data necessary to achieve the core purpose should be collected. The ICO has suggested that, in most cases, location-identifying data should not be collected.
- Accuracy – personal data must be accurate and, where necessary, kept up to date; the app should therefore allow individuals to exercise rights such as rectification and erasure of personal data.
- Storage limitation – personal data collected by an app should only be stored for the minimum amount of time required to achieve the relevant purpose.
- Security – given the highly sensitive nature of the data and the potential for misuse, any app must employ state-of-the-art cryptographic and security techniques.
- Accountability – Data Protection Impact Assessments should be conducted prior to any processing taking place given the high risk to the rights and freedoms of individuals.
ICO launches guidance on AI and data protection
The ICO has published guidance to help organisations understand how data protection principles will apply to the development and deployment of Artificial Intelligence systems. It includes recommendations on best practice and technical measures.
The guidance covers four main areas:
Accountability and governance
- in most cases, a DPIA will be legally required in relation to AI activity. The ICO acknowledges that it can be difficult to describe the processing activity of AI systems, especially where they involve complex models and data sources. Although not a legal requirement, the ICO recommends maintaining two versions of a DPIA – one containing the technical description for specialist audiences and the other containing a high-level description focusing, in particular, on how the personal data inputs relate to the outputs affecting individuals. The latter will also help organisations fulfil their obligation to explain AI decisions to individuals.
Lawfulness, fairness, and transparency of processing
- organisations will need to break down and separate each distinct processing operation involved in the AI system and ensure there is a lawful basis for each one. It is likely to be necessary to distinguish between the lawful basis appropriate for the training of the AI versus the deployment;
- in terms of fairness, statistical accuracy and the risks of bias need to be addressed in the development and procurement of the AI system; and
- in terms of transparency, the ICO has developed, alongside The Alan Turing Institute, separate detailed guidance on “explaining decisions made with AI”.
Data minimisation and security
- the ICO acknowledges that AI systems pose a data minimisation challenge as they require a large amount of data to train – organisations should consider whether they can use synthetic data to train systems, or anonymise data, or convert it into less human readable formats.
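By way of illustration only (this sketch is not part of the ICO guidance, and the names and salt value are hypothetical), a simple pseudonymisation step of the kind contemplated above might replace direct identifiers with salted hashes before data is used for training:

```python
import hashlib

# Hypothetical illustration: replace a direct identifier (an email address)
# with a salted SHA-256 hash so the training data is less human readable.
# Note: pseudonymised data is still personal data under the GDPR; true
# anonymisation requires considerably more than hashing.
SALT = b"example-salt-keep-secret-and-rotate"


def pseudonymise(identifier: str) -> str:
    """Return a salted SHA-256 hex digest of the identifier."""
    return hashlib.sha256(SALT + identifier.encode("utf-8")).hexdigest()


records = [
    {"email": "alice@example.com", "score": 0.82},
    {"email": "bob@example.com", "score": 0.41},
]

# Drop the raw identifier and keep only the pseudonym and the training feature.
training_data = [
    {"id": pseudonymise(r["email"]), "score": r["score"]} for r in records
]
```

The design point is data minimisation: the model-training pipeline never sees the raw identifier, while the controller can still link records back if the salt is held securely elsewhere.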
Compliance with data subject rights
- organisations will have to assess how individuals will be able to exercise their rights in relation to personal data used in the life-cycle of an AI system; and
- particular attention should be paid to automated decision making with legal or similar effects. As well as informing individuals about the logic involved, organisations need to implement suitable safeguards when processing personal data in this manner including providing individuals with the right to obtain human intervention, express their opinion and contest the decision.
Data breaches & enforcement
- BA’s ICO fine reduced from £183m to £20m: The fine of £20m, considerably smaller than the £183m the ICO originally said it intended to issue back in 2019, was reduced as a result of “the economic impact of COVID-19.” However, it is still the largest penalty issued by the ICO to date. The incident took place in 2018, when customers’ personal and credit card data were compromised.
- Direct marketing calls can contravene more than just data protection laws: The ICO has fined CPS Advisory Ltd £130,000 for making more than 100,000 direct marketing calls about pensions without lawful authority. The ICO found that the company was not a trustee or manager of a pension scheme nor was it authorised by the FCA. The ban on cold-calls in relation to pensions (which includes emails and texts) was introduced in January 2019.
- 21m+ unsolicited emails: Koypo Laboratories, a “lead generator specialising in scientific customer acquisition” was fined £100k for instigating the transmission of 21,166,574 unsolicited emails between 1 March 2017 and 31 March 2018.
- Check the effectiveness of your “consents”: Price comparison and technology company Decision Technologies Limited has been fined £90k; between 12 July 2017 and 23 May 2018 it sent 14,986,423 direct marketing emails (with a further 1,136,647 emails estimated) in reliance on consent that was not freely given, specific and informed. The ICO considered that Decision Technologies failed to take reasonable steps to prevent the violations.
- 270k unsolicited cold calls to Telephone Preference Service registrants: Rain Trading Ltd has been fined £80k for making 270,774 unsolicited direct marketing calls to subscribers on the TPS register without valid consent between 1 January 2018 and 29 November 2018.
- €35m fine for H&M in relation to unlawful employee monitoring: Although not an ICO action, this fine by the Hamburg DPA is notable not just for its size but also for its context. In “welcome back talks” with employees returning from periods of absence, H&M managers recorded various details of the employees’ personal lives, which were then accessible by other managers. Take great care over what information you gather about your employees, and with whom you share it.
Significant recent cases
Marriott faces class action over data breach (Bryant v Marriott International Inc and others)
A class action has been filed in the High Court against Marriott International following the data breach investigated by the ICO in 2019.
Martin Bryant, founder of technology and media consultancy Big Revolution, is leading the claim for English and Welsh-domiciled guests after more than 300 million customer records from Marriott’s global database, potentially including passport and credit card details, were hacked between 2014 and 2018.
Around seven million British guest records were compromised by the hack, according to the UK ICO, which last year proposed to fine Marriott £99.2m. Concerns, however, have been raised regarding the robustness of the ICO’s approach to enforcement given the multiple extensions agreed by the ICO in relation to the investigation of the Marriott data breach.
Facial-recognition technology in breach of human rights and data protection legislation (R (Bridges) v Chief Constable of South Wales Police and others, Court of Appeal)
The appeal concerned the lawfulness of the use of automated facial-recognition (AFR) technology by South Wales Police. The force deployed AFR on around 50 occasions between May 2017 and April 2019 at public events, using watchlists of various persons, including persons wanted on warrants and persons suspected of having committed crimes.
Mr Bridges was not included on a watchlist but had brought a claim for judicial review on the basis that the use of AFR was unlawfully intrusive, including in relation to his privacy rights under Article 8 of the European Convention on Human Rights and data protection legislation in the UK.
On 4 September 2019, the Divisional Court dismissed the claim on all grounds. Mr Bridges appealed to the Court of Appeal and the appeal was upheld in part.
Some key points to note:
- It was held that the police’s interference with the Article 8 privacy rights was not in accordance with the law. There was no clear guidance on where AFR could be used and who could be put on a watchlist. This was too broad a discretion to afford to the police officers to meet the standard required by Article 8(2).
- However, the benefits of AFR were potentially great and the impact on Mr Bridges was minor, so the use of AFR was proportionate under Article 8(2).
- It was found that the police had not carried out an adequate Data Protection Impact Assessment.
Any organisation wanting to use facial-recognition technology should consider the judgment carefully. It is clear that any use of the technology must be preceded by a Data Protection Impact Assessment which properly recognises and assesses the risks to individuals’ privacy rights.