Authors: Rachel Solomon
Created: 2023-08-25
Last updated: 2023-09-25
New data legislation will weaken our rights
At a time when public and private bodies collect and process ever more of our personal data, it is disappointing that the UK government’s Data Protection and Digital Information (No 2) Bill proposes to weaken – rather than strengthen – our rights.
The bill stands as an example of how data protection laws are failing to keep pace with rapidly developing technologies[1] – including automated and artificial intelligence (AI) systems – that risk causing ever greater problems for our rights to privacy (article 8) and equality (article 14) under the European Convention on Human Rights (ECHR).
Evidence heard by the Joint Committee on Human Rights in 2019 suggested that even with the UK General Data Protection Regulation in its current form, individuals’ rights in the UK were not being adequately enforced.[2] In a survey by the EU Agency for Fundamental Rights (FRA) in early 2020, 34 per cent of companies in the UK said they used technologies that depend on AI, and 20 per cent said they were planning to use AI.[3] Furthermore, a recent SS&C Blue Prism report on the adoption of intelligent automation in the public sector found that in 2022, 51 per cent of UK public sector organisations described their use of intelligent automation as either moderate or heavy, compared with eight per cent in 2019.[4] Public Law Project’s (PLP’s) Tracking Automated Government (TAG) Register identifies a number of these automated tools, which are used by organisations including the Home Office, the Department for Work and Pensions, and local authorities.
Of further concern are recent FRA report findings indicating that many businesses already using or thinking about using AI do not fully know how it affects people’s rights and cannot explain how their algorithms use people’s data.[5] At a time when the use of automation is clearly accelerating, and with glaring gaps in the safeguards around AI use, it is more important than ever that steps are taken to protect, not weaken, rights.
However, the following two measures illustrate how the bill will do the opposite:
Lowering the threshold for refusing a data subject’s request for access, erasure or rectification
A request may currently be refused if it is ‘manifestly unfounded’ (Data Protection Act 2018 s53), but the bill (clause 8; see also clause 34) seeks to change this standard to ‘vexatious or excessive’. Controllers are likely to have wider discretion to refuse requests by data subjects. The change may also cause widespread delays, with AWO estimating that it may take up to 20 months to resolve even basic breaches of data rights.[6]
This could directly contravene privacy protections under ECHR article 8, which include requirements that access to one’s data be ‘effective and accessible’ and that data be provided within a reasonable time frame. Breaches of this right could lead to serious consequences, as the right of access is a gateway right – without knowing what data an organisation holds and how it is being processed, one cannot exercise any other data rights.
Reversing the requirement for human oversight in automated decisions that have legal or other significant effects
This is a concerning change (clause 12 of the bill), as automated decisions carry a higher risk of being discriminatory – for example, because of a system’s design, or because the training data is unrepresentative (which is more likely to be the case for already marginalised groups). Furthermore, the lack of transparency around the ways in which these algorithms make decisions can make it difficult to prove discrimination. This change could therefore lead to serious and discriminatory effects on people’s lives, including disproportionate impacts on marginalised groups, with implications under ECHR article 14 (right to non-discrimination).[7]
This direction of travel diverges from international consensus and existing rights protection. For example, the Council of Europe’s Modernised Convention for the Protection of Individuals with regard to the Processing of Personal Data includes a right, under article 9(1)(a), ‘not to be subject to a decision significantly affecting him or her based solely on an automated processing of data without having his or her views taken into consideration’. While the UK signed this convention in 2018, it is not clear when – or whether – it plans to ratify it (even though the Information Commissioner’s Office recommended its ratification in early 2021[8]).
Another example is the European Commission’s proposal, in April 2021, for an increased policy and legislative focus on AI. The AI Act has been progressing through the EU legislative process,[9] signalling that the EU and its member states intend to move towards stronger data- and AI-focused protections. Following Brexit, the UK will not have to comply with these policies, which could lead to a significant divergence in data protection between the UK and the EU (and could, in turn, impede data flows between the two[10]).
At a time when there is a clear impetus towards strengthening protections elsewhere, the UK should not be diluting data subjects’ rights. Technological advances and protections for data subjects cannot be divorced from one another without a risk of systemic human rights breaches.
 
1. ‘Regulation and legislation lag behind constantly evolving technology’, Bloomberg Law, 27 September 2019.
2. The right to privacy (article 8) and the digital revolution. Third report of session 2019, HC 122/HL Paper 14, 3 November 2019, para 78, page 62.
3. Getting the future right – artificial intelligence and fundamental rights, FRA, 2020, figure 1, page 26.
4. Adoption of intelligent automation in the public sector, SS&C Blue Prism, 2023, page 4.
5. ‘Technological advances and data protection should go hand-in-hand’, FRA news item, 28 January 2021.
6. Data Protection and Digital Information Bill: impact on data rights, AWO, September 2022; updated March 2023, para 18.
7. Public Law Project evidence submission on the Data Protection and Digital Information Bill (No 2), May 2023, paras 24–38.
8. The Information Commissioner’s response to the International Trade Committee Inquiry into Digital Trade and Data, DTD0029, February 2021, para 40.
9. ‘AI rules: what the European Parliament wants’, European Parliament news, 21 October 2020; updated 20 June 2023.
10. Zach Meyers and Camino Mortera-Martinez, The three deaths of EU-UK data adequacy, Centre for European Reform, 15 November 2021.