Authors: John Walsh
Created: 2020-08-14
Last updated: 2023-11-07
The dangers of digital
John Walsh considers some of the risks to access to justice posed by the rise of remote hearings during the pandemic and by the use of algorithms in judicial decision-making.
COVID-19 has led to the adoption of remote hearings across many courts and tribunals in place of conventional hearings where judges, participants and the public convene in one place. This has huge implications for practitioners and judges, but equally for parties to litigation and for the efficacy of the judicial process as a whole.
The physical courtroom is embedded in the UK as a cardinal feature of open justice. The courthouse serves as a public space for the dispensation of justice. Remote hearings, already underway, do not fit easily with the demands of open justice. If there is no dedicated courtroom, to what can the public have easy access? How can remote hearings be public hearings? How can journalists maintain their habit of dipping into and out of courts, reporting what is interesting and ignoring the remainder? The effect of Coronavirus Act 2020 s55 and Sch 25 and Civil Procedure Rules 1998 Practice Direction 51Y is that remote hearings are deemed public hearings where the proceedings are livestreamed and where journalists can log into the hearings.
The problems with remote hearings
The prospect of remote hearings being rolled out presents severe challenges to the rule of law and to the ability of the judiciary and representatives to provide an acceptable service. A number of practical issues arise. First, the interests of confidentiality require that judges and representatives work from a secure and private place. It is too easily assumed that the representative, or even the judge, can engage fully from a sitting room or library at home; such facilities are not available to all. Many participants live with families and dependants, and an exclusive space for them to occupy and use cannot be taken for granted.
Second, the principle of open justice will be compromised, as will the freedom to access the courts and the interests of transparency. The challenges become more acute if an appellant or witness is expected to give evidence remotely with or without the assistance of an interpreter.
Recent research on the impact of remote hearings reports positive experiences from professional participants while acknowledging difficulties, including those identified above.1 The upshot is that remote hearings are here to stay and will be a growing feature of the court system. Two findings stand out: the lay parties’ experience of alienation from the process, and the disadvantages of video communication as compared with face-to-face communication.
Remote hearings are, by definition, more bureaucratic, with an even greater focus on the professional participants: the lawyers and the judge. Further, Albert Mehrabian’s research on the communication of feelings and attitudes found that the words spoken account for only seven per cent of the meaning conveyed, with tone of voice accounting for 38 per cent and body language for 55 per cent.2 Non-verbal communication is thus a significant part of the message, and much of it is lost in a remote hearing.
Remote hearings on a limited scale have been in use in the UK immigration courts since around 2000, and in the US since 1996.3 In the UK, they have been restricted to bail applications where the applicant is connected by video link to the courtroom where the immigration judge sits, accompanied by a Home Office representative and the applicant’s representative. An interpreter, if needed, is present in the courtroom, as are would-be sureties for bail. The quality of the connection varies, as does the suitability of the video suites in the detention centres.
In-depth research has been carried out on remote immigration hearings in the US, where they are deployed routinely.4 Around one-third of all immigration detainees there have their hearings against deportation or removal considered remotely. The judge, representatives and court personnel gather in the courtroom while the detainee is connected by video from the detention centre. The research found that judges were more inclined to reject the appeals of remote detainees than those of in-person detainees.5 The reason, the research found, was not a predisposition of judges to dismiss such appeals, but that remote detainees were less likely to engage in the process: their feeling of isolation was accompanied by hopelessness and scepticism that their case was being taken seriously. An Australian study in 2013 likewise suggested that video hearings have a dehumanising effect on participants.6
Research on remote immigration hearings in England and Wales found that bail was granted to ‘in-person’ applicants more frequently than to remote applicants (40 per cent as against 26 per cent in one period, and 70 per cent as against 39 per cent in a later period).7 Lord Wilson, in R (Kiarie and Byndloss) v Secretary of State for the Home Department [2017] UKSC 42, commented (at para 67):
There is no doubt that, in the context of many appeals against immigration decisions, live evidence on screen is not as satisfactory as live evidence given in person from the witness box.
Algorithms and artificial intelligence
Remote court hearings are only one of a number of options opened up by technological advances. In the end, a remote hearing involves only a different way of transmitting evidence and representations: the judge still makes the decision exclusively on the basis of the evidence and representations, however presented, and the analysis of the evidence remains in the judge’s hands, aided by the representatives’ arguments. The use of artificial intelligence (AI) – the development of systems that simulate human intelligence – for the resolution of disputes is a very different creature from what occurs in a remote hearing.
AI has no place in court hearings at present. While it is making major inroads into the economy, public services, medicine and education, the courts have so far been immune from its impact. Its effect on decision-making more generally, however, raises important issues. In May 2019, OECD member countries adopted the OECD Principles on Artificial Intelligence, which call, among other things, for transparency and for respect for the rule of law wherever AI systems are deployed.
The immediate difficulty facing the application of these principles is that AI systems are being developed by organisations on a business model. Investors in the emerging systems will wish to keep secret the programs that drive them. When someone seeks to understand the reasons for a particular result – for example, in order to review or appeal the decision – they will not be able to access the processes used, and so will be unable to understand the reasoning behind it. An individual decision-maker can be interrogated as to the reasons for a decision; not so a computer system whose developer will not disclose the code. The reasoning will, by definition, be opaque. The rule of law will be threatened.
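To make the opacity point concrete, here is a deliberately toy sketch in Python, using the scikit-learn library; the features, data and model are invented for illustration and stand for no real system discussed in this article. A trained ‘black box’ classifier returns a bare outcome, with nothing a reviewing party could interrogate as reasons:

# Illustrative toy only: invented features and data; not any real system.
from sklearn.ensemble import RandomForestClassifier

# Hypothetical historic decisions: [age, prior_applications, months_detained]
X_train = [[25, 0, 2], [40, 3, 10], [31, 1, 5], [52, 4, 14]]
y_train = ["grant", "refuse", "grant", "refuse"]

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# The output is a bare label produced by hundreds of decision trees.
# There is no statement of reasons for a party to interrogate on review.
print(model.predict([[38, 2, 8]]))  # e.g. ['refuse'] - but why?

Even with full access to such a model, extracting intelligible reasons from it is difficult; without access, as argued above, review is impossible.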
For several years, Durham Constabulary and computer science academics have been developing the Harm Assessment Risk Tool (HART),8 designed to predict whether suspects are at low, moderate or high risk of committing further crimes within a two-year period. The algorithm is one of the first to be used by police forces in the UK. It does not decide whether suspects should be kept in custody; rather, it is intended to help police officers decide whether a person should be referred to a rehabilitation programme.
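Schematically – and only schematically, since the cut-off points and values below are invented and this is not Durham’s actual tool – a triage tool of this kind maps a predicted probability of reoffending onto advisory bands, leaving the decision with the officer. A minimal Python sketch:

# Schematic sketch only: the thresholds below are invented; this is not HART.
def risk_band(probability_of_reoffending: float) -> str:
    """Map a predicted probability of reoffending onto three advisory bands."""
    if probability_of_reoffending >= 0.7:  # invented cut-off
        return "high"
    if probability_of_reoffending >= 0.3:  # invented cut-off
        return "moderate"
    return "low"

# The band is decision support, not a custody decision: an officer,
# not the algorithm, decides on referral to a rehabilitation programme.
for p in (0.15, 0.45, 0.82):
    print(p, "->", risk_band(p))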
The Law Society produced a detailed report in 2019 examining the widespread use of algorithms by police forces and the prison service in England and Wales.9 It points out that facial recognition systems, DNA profiling, predictive crime mapping and mobile phone data extraction are all examples of algorithmic systems currently in use in the criminal justice system. Uncritical and unexplained use of algorithms has serious implications for fundamental human rights and the integrity of the justice system. Within the right framework, however, the report concluded, algorithmic systems can deliver a range of benefits, such as efficiency, efficacy, auditability and consistency.
A ‘streaming algorithm’ used to process visa applications has recently been withdrawn by the Home Office in the face of a judicial review challenge by the Joint Council for the Welfare of Immigrants and others, who assert that the algorithm is racist in its impact.10 In R (Bridges) v Chief Constable of South Wales Police [2020] EWCA Civ 1058, the Court of Appeal recently held that the widespread use of automated facial recognition (AFR) software to capture facial images at major events, such as football matches, and to process them with other information by way of algorithms, breached the claimant’s right to private life under article 8(1) of the European Convention on Human Rights, since the practice lacked a sufficient legal basis, and was also in breach of the Equality Act 2010 and the Data Protection Act 2018.
A decision generated by AI lacks a human element: there is no flexibility or room for adjustment, and no capacity to respond to concepts such as fairness and justice. These are highly flexible concepts, and it is doubtful whether that flexibility could be replicated in a computer program. Lord Sales, in the Sir Henry Brooke Lecture for BAILII (12 November 2019), Algorithms, artificial intelligence and the law, stated (following consideration of the challenges posed by AI):
Underlying all these challenges are a series of inter-connected problems regarding (i) the lack of knowledge, understanding and expertise on the part of lawyers (I speak for myself, but I am not alone), and on the part of society generally; (ii) unwillingness on the part of programming entities, mainly for commercial reasons, to disclose the program coding they have used, so that even with technical expertise it is difficult to dissect what has happened and is happening; and (iii) a certain rigidity at the point of the interaction of coding and law, or rather where coding takes the place of law (page 5).
Summary
The impact of technological change on the justice system continues to grow. The most recent example is the widespread use of remote hearings in response to the pandemic. They have attracted many positive comments and are here to stay as a feature of our judicial system. However, there are serious limitations to their effectiveness, and they present a significant challenge to open justice. There is a real risk that the lay client’s involvement in a remote hearing will be significantly diminished, leading to increasing alienation from the process.
Remote hearings, however, represent only one aspect of the use of technology. While AI is unlikely to take the place of judges in the courtroom, it will play an ever-increasing role in primary decision-making in public and private organisations. The onward march of algorithms and AI into decision-making in general, including the judicial process, carries the danger of accentuating a sense of anomie among court users: a feeling of being unable to control, or even influence, decisions bearing on them.
 
1 Dr Natalie Byrom, Sarah Beardon and Dr Abby Kendrick, The impact of COVID-19 measures on the civil justice system, Civil Justice Council/The Legal Education Foundation, May 2020, and Remote hearings in the family justice system: a rapid consultation, Nuffield Family Justice Observatory, May 2020.
2 Albert Mehrabian, Non-verbal communication, Transaction Publishers, 1972; Routledge, 2017.
3 Ingrid V Eagly, ‘Remote adjudication in immigration’, Northwestern University Law Review, vol 109 no 4, 2015, page 945.
4 ‘Remote adjudication in immigration’, page 933.
5 ‘Remote adjudication in immigration’, figure 10, page 966.
6 Emma Rowden et al, Gateways to justice: design and operational guidelines for remote participation in court proceedings, University of Western Sydney, 2013.
7 Immigration bail hearings: a travesty of justice? Observations from the public gallery, Campaign to Close Campsfield, 2011, table 1, page 25, and Still a travesty: justice in immigration bail hearings: second report from the Bail Observation Project, Campaign to Close Campsfield, 2013, table 5, page 33.
8 Matt Burgess, ‘UK police are using AI to inform custodial decisions – but it could be discriminating against the poor’, Wired, 1 March 2018.
9 Algorithms in the criminal justice system, The Law Society, June 2019.
10 Henry McDonald, ‘Home Office to scrap “racist algorithm” for UK visa applicants’, Guardian, 4 August 2020.