Authors: Ariane Adam and Mia Leslie
Created: 2022-09-27
Last updated: 2023-09-18
UK GDPR: key to ensuring non-discrimination in the use of data
Data is fast becoming one of the most important commodities in the UK’s economy, making up about four per cent of its gross domestic product in 2020.1 This summer, the publication and subsequent swift shelving of the Data Protection and Digital Information (DPDI) Bill went some way towards demystifying the Conservative party’s intention to cut the ‘red tape’ of the UK General Data Protection Regulation (UK GDPR) in favour of unlocking data’s economic value.
As we face uncertainty about how Liz Truss’s newly formed cabinet will proceed with the bill, we are reminded that our current data protection framework, albeit imperfect, has been a safeguard in the face of overcollection and overprocessing of data by government bodies. We focus here on two examples that illustrate the risks of watering down such protections, particularly to marginalised individuals and communities.
Between April and November 2020, an unpublished, blanket Home Office policy was in force under which the phones of migrants who arrived in the UK by small boat were seized. In the course of proceedings before the High Court, the Home Office conceded that this policy was unlawful because it did not provide a lawful basis for the processing, including extraction, of data pursuant to the Data Protection Act (DPA) 2018.2 Among other concessions, the Home Office admitted that its data protection impact assessments (DPIAs) did not properly assess the risks to the rights and freedoms of data subjects, rendering them unlawful.3 The intervention by Privacy International highlighted the disproportionate effect of the policy: seizing and looking through someone’s phone, possibly the most extensive intimate record of their private life, requires serious justification, which was absent.4
The case illustrates, inter alia, the importance of DPIAs in ensuring that policies properly assess the risks to the rights of individuals and therefore comply with the law – a point also made in a 2020 challenge to South Wales Police’s (SWP’s) use of automated facial recognition (AFR) technology.5 The Court of Appeal found that because SWP’s impact assessment was written on the basis that article 8 of the European Convention on Human Rights was not infringed by its use of AFR technology, when in fact it was, the DPIA did not comply with DPA 2018 s64 and the policy was therefore unlawful.
It is worrying, therefore, that the requirement to thoroughly consider the risks to the rights and freedoms of data subjects has been framed as a burden on public bodies and businesses.6 Without rigorous risk and impact analysis, disproportionate and therefore discriminatory processing could be carried out before the possibility of harm has been evaluated and mitigated.
Another dangerous pattern is the increased use of automation by the Department for Work and Pensions and the disproportionate effect this is having on those who receive benefits. Big Brother Watch has identified an ‘over-reliance on intrusive data processing and algorithmic systems that pose serious privacy and equalities risks to individuals in the welfare system, despite insufficient evidence of genuine benefits’.7 At present, there is an alarming absence of transparency around these systems, and a reluctance from public bodies to disclose information to the public on the development, deployment and monitoring of automated tools.
The outcome of the Department for Digital, Culture, Media and Sport’s September 2021 consultation Data: a new direction,8 read alongside the DPDI Bill as it was published in July, indicates plans to seriously limit people’s ability to access information about how their own personal data is being collected and used. The bill’s provisions would lower the threshold for data controllers to refuse requests from individuals to know what data is held on them, and limit the right to know if personal data will be used for purposes other than those for which it was originally collected. Moreover, the bill would reduce the existing protections against solely automated decision-making, allowing exceptions to the requirement for human oversight in a much wider range of contexts.
Earlier this year, the Justice and Home Affairs Committee published its anticipated report on new technologies and the application of the law,9 stressing that without transparency, there is no accountability when things go wrong. The committee also drew attention to a concerning pattern in the types of human behaviour that are policed by algorithmic technology, as put by Professor Karen Yeung: ‘We are looking at high-volume data that is mostly about poor people, and we are turning it into prediction tools about poor people.’10
There is a pressing need to preserve robust data protections in law to ensure that the use of data is fair, transparent and accountable. Without such protections, unfair, disproportionate and unlawful practices could fall through the gaps, leaving marginalised individuals and communities exposed to even higher levels of intrusive data collection and processing, where they are likely to suffer significant harm.
Inadequate protection of personal data, as in the instances above, risks exacerbating structural inequalities and creating new ones.
 
1     UK Digital Strategy, Department for Digital, Culture, Media and Sport, June 2022, page 11.
2     R (HM) v Secretary of State for the Home Department; R (MA and KH) v Secretary of State for the Home Department [2022] EWHC 695 (Admin); July/August 2022 Legal Action 38 at para 6.
3     It is striking that the Home Office initially denied the policy’s existence when responding to the claims for judicial review. The High Court has ordered a further hearing to address the Home Office’s apparent failure to comply with its duty of candour. At the time of writing, the hearing had not yet taken place.
4     ‘PI intervenes in judicial review to support asylum seekers against the UK home secretary’s seizure and extraction of their mobile phones’, Privacy International, 31 January 2022.
5     R (Bridges) v Chief Constable of South Wales Police [2020] EWCA Civ 1058; November 2020 Legal Action 21.
6     Data: a new direction – government response to consultation, Department for Digital, Culture, Media and Sport, 17 June 2022; updated 23 June 2022, para 2.2.
7     Poverty panopticon: the hidden algorithms shaping Britain’s welfare state, Big Brother Watch, 20 July 2021, page 8.
8     Data: a new direction – government response to consultation, ibid, introduction.
9     Technology rules? The advent of new technologies in the justice system. First report of session 2021–22, HL Paper 180, 30 March 2022.
10    Ibid, para 28, page 17.