Artificial intelligence – a cautionary tale
The decision in the case of R (Ayinde) v London Borough of Haringey has just been published. Although in one sense it is run of the mill, it is also quite an extraordinary case. It was a claim for judicial review by a homeless, very ill claimant against an uncooperative local authority. He badly needed a home. Specialists confirmed that he had serious kidney problems requiring invasive procedures for both diagnosis and therapy. As he was homeless, he was at significant risk, and his GP was very concerned about the Claimant’s survival. In July 2023 the GP wrote to the Council confirming all of this and saying that there was a high risk of stroke or heart attack. The Claimant needed strict blood pressure monitoring, and homelessness increased the risk he faced tenfold. The GP continued: “With a home he would get the care he needs and be able to get renal angiography leading to potential renal angioplasty... He cannot do without a home”.

The judge in the High Court, however, found that the Council “ignored all the Claimant’s medical evidence both before and during the course of these proceedings and I find that remarkable and less than ideal.” Such litigation is not, though, uncommon and so, unfortunately, in that respect this is not an unusual case. The defendant Council was hopeless in its conduct of the litigation. It did not put in a defence or comply with the other normal procedural requirements for such cases. The judge described the “wholesale breach of court orders” by the Council, and so ultimately it was barred from defending the claim. Fortunately, however, as the case progressed the homeless Claimant was in fact given accommodation and so was no longer, as the judge described it, “street homeless”. For him, at least, the case had a satisfactory ending. And the merits of the claim were on his side. As the judge said, “The submission was a good one. The medical evidence was strong. The ground was potentially good.” Justice was done.

And yet this case is now famous – at least in legal circles – and for all the wrong reasons. Strangely, what makes this case out of the ordinary is not what the Council did or did not do. It is what the Claimant’s lawyers – the solicitors and the barrister – put in their legal submissions. In essence, the lawyers based a number of their legal submissions on five fake legal cases, including even a supposed Court of Appeal case. All the cases had mundane names and proper-looking citations, and looked very much like cases that would exist. But they were all invented. There were no law reports. And what made this all the more extraordinary is that the strength of the Claimant’s case was such that there was no need to fabricate these cases. The High Court judgment makes it clear that the legal points being made were so straightforward that there were real authorities which could have been used. As the judge said in respect of one of the examples, “[t]he problem with that paragraph was not the submission that was made, which seems to me to be wholly logical, reasonable and fair in law, it was that the case of Ibrahim does not exist, it was a fake”. Why invent fictional cases when real-life cases existed which would support the same point? We do not actually know. The judge did not need to find out how it had come about.
All he needed was to see that the cases relied upon were fakes before making an order penalising the Claimant’s lawyers, both the barrister and the solicitors, and requiring notification of their professional misconduct to the Bar Standards Board and the Solicitors Regulation Authority respectively. The costs which they would otherwise have received from Haringey Council for acting for the Claimant were slashed.

The suggestion, however, is that the lawyers relied on artificial intelligence – in other words, that the fake cases were not actually invented by the lawyers but taken from searches using ChatGPT or similar, an LLM prone to hallucination. The judge said he did not need to decide whether this was what happened, but it is the most plausible explanation, given all the circumstances. It may also be the least damaging explanation. If the lawyers concerned had actually taken the time and trouble to create false case references themselves, then the time, effort and degree of fraud involved would mean time in prison for those involved. As it is, to have taken the easy way out by using ChatGPT would show laziness and an unjustified, careless reliance on the magic of artificial intelligence as a research tool, but not necessarily actual fraud.

All this came to light, however, because when the Claimant’s solicitors ultimately applied for costs against the Council, there was a change of staff at Haringey. The person taking over the case, trying to recover the situation, looked, unsuccessfully, for the citations. And when the lawyers for the Claimant were asked for copies of the cases, they tried to cover their tracks. They offered various excuses. The barrister said that she had a box of photocopied case reports relating to this sort of matter, which were also indexed in digital form for copying and pasting into pleadings. As the judge said, quite how you photocopy a non-existent case report was not something he could understand. The Claimant’s solicitors told the Council that the citations could easily be explained and were merely ‘cosmetic errors’. The judge said: “I ask rhetorically: is that a professional way forwards for solicitors of the Supreme Court who have produced fake cases, or who have not spotted that counsel has produced fake cases? Then when they are shown that counsel has produced fake cases, they say: ‘it can be easily explained... They were cosmetic errors’. Well, it has not been explained easily before me. The solicitors hid behind their letters, which I find were unprofessional. These were not cosmetic errors, they were substantive fakes and no proper explanation has been given for putting them into a pleading.”

The judge said it would be negligent for a lawyer to use AI and not double-check the results. But checking the output for faults presumes that you know what it should be saying, and is probably almost as time-consuming as doing the work yourself. Legal research is at the heart of what we do as lawyers and, in doing it, we fix the law in our minds and are better able to apply it in different but analogous circumstances. It is part of our continuing education. If we ask AI to do our work for us, we will not really engage with the law, just skate over the surface. We know that proper legal research is time-consuming and can be expensive. Online legal information services are very expensive or, if free, incomplete in their coverage. Physical law libraries are from a previous era; the Birmingham Law Library no longer exists.
Small law firms and sole practitioners cannot easily afford to spend their time on extensive research. And clients are reluctant to pay for it, because they hold the view that we lawyers should know all of the law anyway. And so, for the unwary, using AI seems the way forward. But in a case where no injustice was caused to the Claimant, and the invented cases made no real difference to the outcome, the Court has nonetheless sanctioned lawyers for the misuse of AI. Using AI in legal research is not only arguably a waste of time; it can also now lead to substantial personal cost to the lawyers and to the possible loss of their livelihood.

11 May 2025
Paul Buckingham