
Predictive justice: when algorithms pervade the law

To most people, the term “predictive justice” evokes a science fiction short story by Philip K. Dick, The Minority Report, in which “precogs” predict future crimes. But it also covers a complex reality. In the United States, judges use software to assess a suspect’s likelihood of reoffending. Elsewhere in the world, emerging start-ups offer to anticipate the outcome of litigation and the compensation likely to be awarded. Legal Tech offers many advantages (automation of repetitive tasks for lawyers, diversion of disputes, reduction of judicial risk, etc.), but not without risk: justice could become sheeplike, unfair and dehumanized.

June 2017

Are algorithms compatible with justice? The American citizen Eric Loomis believes they are not. He was sentenced to six years in prison for eluding the police in a stolen car. The Wisconsin court based its judgment, in part, on the high risk of recidivism calculated by software called Compas. Loomis argues that he did not receive a fair trial because he was not granted access to the software’s algorithm. His claim was dismissed by the Supreme Court of the State of Wisconsin, and he has now filed an appeal with the Supreme Court of the United States.

Digital tools

Welcome to the world of “predictive justice.” Several American states use software programs such as Compas to decide whether a suspect should be incarcerated before trial, or to assess the likelihood of recidivism, which may alter the sentence. In the United Kingdom, the Durham police will soon be equipped with similar software, the Harm Assessment Risk Tool (Hart), a program that determines whether a suspect should be held in pre-trial detention. Developed with the University of Cambridge, it takes into account approximately thirty different factors. In France, there is no question at the moment of introducing this kind of tool into criminal law, but start-ups such as Predictice, Supra Legem or Case Law Analytics offer solutions to anticipate a case’s chances of success and the amounts of compensation in civil proceedings.

The use of digital tools by courts is nothing new: digital technology entered the courtroom a good fifteen years ago. This “cyberjustice,” well described in a recent report by the European Commission for the Efficiency of Justice (CEPEJ), includes facilitating access to justice, improving communication between courts and legal professionals, assisting the judge and administering courts. But cyberjustice must be distinguished from predictive justice, which appeared more recently at the crossroads of artificial intelligence, big data and open data.

Indeed, thanks to steady progress in artificial intelligence, to machines capable of processing increasingly large amounts of data and to public data disclosure policies, Legal Tech start-ups have entered the market of justice with a double promise: to facilitate the work of legal professionals and to reduce legal uncertainty. “The idea is to annihilate legal research,” admitted Nicolas Bustamante during a symposium on predictive justice at the Catholic University of Lille. “Our goal is to use artificial intelligence to automate repetitive tasks so that lawyers can focus on their primary added value: advising clients and legal inventiveness.”

The long hours spent digging through case law are over: Legal Tech offers dedicated legal search engines (Casetext, Ross, etc.). These start-ups aim to become more than simple legal databases. Their algorithms support research in natural language and adapt to users by learning from their searches. By transforming case law into data, some Legal Tech tools even offer statistical analysis that lawyers can use to assess the likelihood of success in a particular case, to determine the range of compensation awarded in similar cases in the past, or to adapt their strategy by using arguments known to be convincing. In this way, these tools aim to reduce legal uncertainty. Louis Larret-Chahine, one of the founders of Predictice (software currently being tested by the Courts of Appeal of Rennes and Douai and by the Lille Bar), promises to “put an end to this justice that was unpredictable, random and disparate across the country and move towards something a bit more logical, scientific or, at the very least, a little more controllable.”

The question ultimately boils down to whether justice should be predictable. “The predictive function isn’t new in itself: it is an integral part of the very nature of the law which makes social relations predictable,” recalls Antoine Garapon, Secretary General of the Institute of Higher Studies on Justice, Paris. But “a rule isn’t predictable if one doesn’t know the rule of application of this rule. These second-rank rules are much more difficult to find and formalize than the first ones. That’s why they provide a margin of appreciation to lawyers,” according to the magistrate. If law is intended to provide a certain degree of predictability, justice must be dispensed on a case-by-case basis.

Nevertheless, “predicting court decisions has always been the goal of every lawyer and academic consultant. Those who turn to one or the other of these actors expect from them, with more or less hope, a prescience of jurisprudence,” underlines Professor Bruno Dondero. By allowing them to do so more reliably and systematically, these programs tend to resolve conflicts by means other than going before a judge. Indeed, if both parties know very precisely what they can hope for when initiating a contentious action, they are likely to opt for an alternative settlement. “Legal protection insurers were the first players to correctly read this new situation, and they are today among our best customers. They use this tool to discourage their clients from seeking litigation,” reported Louis Larret-Chahine during the Lille conference.


Legal diversion has obvious advantages: less congestion, faster proceedings, lower cost. The legal profession itself is likely to be profoundly altered. Lawyers “will no longer serve the sole interests of one party but, together with their colleague, strive to find a reasonable solution in the common interest of both parties,” predicted Stéphane Dhonte, president of the Bar Council of Lille. Given that half of the decisions handed down each year by the European Court of Human Rights (ECHR) concern procedures that are too slow at the national level, diversion is more than welcome, provided of course that it serves the interests of litigants, and not only those of legal professionals. Some, like Antoine Garapon, will regret an evolution that “transforms law practitioners into auxiliaries of economic strategies and sees in the judgment the sign of a failure of a reasonable and modern regulation of disputes.”

From the rule of law to the norm of application?

One of the main risks posed by predictive justice is the substitution of the norm of application for the rule of law. “These programs, like the common law, will highlight the rule of judicial precedent. But they must never erase the rule written by the legislator. Otherwise, we face the risk of reversing our hierarchy of norms,” warns Stéphane Dhonte. This is why these programs are generally more developed in common-law countries (United States, United Kingdom) than in continental-law countries. But whatever the system, it is problematic to place legal data, the primary characteristics of a dispute and its contextual elements on the same level. “For big data, law and jurisprudence are simple facts, on the same level as the specifics of a file or the temperament of a judge,” regrets Antoine Garapon.

Another major risk is a herd effect. These tools provide an in-depth analysis of previous case law. A judge will be able to observe, for example, that 90% of his colleagues have taken the same decision in similar cases. He may feel pressure to do the same, or feel relieved of the responsibility of taking a personal decision by following the majority. The CEPEJ is concerned that “with this type of device, the judge’s ruling may just as well be consolidated as biased by overdetermination or inertia effects.” The judge’s independence and freedom can therefore be threatened by software that could push judges toward group conformism. “Judicial decision-making tools must be designed and perceived as an auxiliary aid to judicial decision-making, facilitating the judge’s work, and not as a constraint,” advocates the CEPEJ. “Respect for the principle of independence requires that everyone can, and therefore should, take a personal decision as the result of reasoning that they must be able to assume in their personal capacity, regardless of the computer tool.” The judge must remain in control of the procedure at all levels.

This committee, attached to the Council of Europe, also recommends paying attention to the nature and use of data when it is not strictly related to case law, warning, for example, against the risk of using the identity of judges for profiling purposes. Will judges be rated like restaurants or hotels? In the United States, the start-up Lex Machina, specializing in intellectual property cases and bought in 2015 by LexisNexis, already provides statistics on court decisions broken down by judge, as well as information on opposing counsel. This could encourage the development of forum shopping: applicants choosing a jurisdiction based on the procedural or substantive characteristics of the court.

It is all the more important that judges remain in charge of the procedure given that algorithms are not always transparent and that questions arise as to the quality of the data on which they are based. Although Northpointe refuses to reveal its Compas algorithm, an extensive investigation by the American site ProPublica showed that suspects’ skin color leads to different results. ProPublica’s analysis of Compas found that “the formula was more likely to falsely flag black defendants as future criminals, whereas white defendants were more likely to be mislabeled as low risk.” The software reproduces the discrimination already at work within the country.

Is there not a real risk that conservative algorithms based on past case law will block the evolution and improvement of the law? In other words, should the past dictate the future? These algorithms work from previous situations: that is a real limitation. If judges and lawyers allow themselves to be guided by the results of these programs, the adaptation of law to society may slow down. However, for Yannick Meneceur, a magistrate seconded to the Council of Europe, the rapid change of legal rules is itself a technical limitation for algorithms. Legal rules do not evolve in a linear manner, as in the hard sciences, where new rules complement older ones without necessarily invalidating them. In continental-law systems, if the fundamental text changes, all jurisprudence based on that text is discarded.

Hence, predictive justice must follow several rules. First, it should be transparent. “If predictive justice doesn’t want to end up being considered a divinatory art, as mysterious and intimidating as the ancient oracles, it must disclose its algorithms instead of hiding behind trade secrecy,” warns Antoine Garapon. Secondly, the supposed neutrality of algorithms shouldn’t fool us. Algorithms partly mirror the data that feed them, as illustrated by the corruption of Microsoft’s AI chatbot on Twitter: barely 24 hours after its launch, the AI started tweeting racist comments.

It is also necessary to ensure that procedural fairness and adversarial debate are respected. If only a handful of powerful firms have access to these digital tools, parties may find themselves on unequal footing before a judge. “The adversarial principle and the equality of arms must be guaranteed in the same way as in non-digitized procedures with regard to the technological tools made available or used by the different parties on their own initiative during a trial,” according to the CEPEJ.

Finally, these technologies should never dehumanize justice. “We have a major responsibility,” warned Stéphane Dhonte in Lille. “These software programs of the future can prove excellent decision-making tools, whether judicial or amicable, but they shouldn’t divert us from our common objectives: quality, human and individualized justice.” Ideally, lawyers freed from repetitive tasks should be able to use the extra time to better serve their clients, while judges, handling fewer cases, will be able to spend more time on those that come before them. “Legal Tech offers more time to focus on each individual case,” believes Bruno Cathala, first president of the Court of Appeal of Douai. Others are less optimistic: “I am unconvinced by the idea that extra time will allow us to judge better,” says Antoine Garapon. He further observes: “The more tools, instruments or techniques are available to free up our time, the more we dedicate it to tasks that densify it, and the less time we have...”