• David C. Donald

Be watchful as decision-making is translated into algorithms


Members of the legal profession know quite a lot about the methods and dangers of delegating decision-making power and the controls that are effective against abuse of this power. For centuries, lawyers have concerned themselves with placing checks and balances on the decisions of those exercising power. During the 20th century, lawyers also began to concern themselves with expert or scientific decision-making in administrative bodies exercising power delegated from elected government. Similar problems were considered when duties were developed and applied to corporate directors making decisions in large enterprises. A principal aim of supervising decision-making has been to rein in arbitrary discretion by subjecting it to the rule of law, the control of an agreed regulation or an approved system of regular procedures.

Behavioral psychology has shown that so much arbitrary nonsense finds its way into spontaneous human decision-making that regulatory frameworks of control appear urgently needed. Legal scholarship has shifted toward quantitative research so that we now have a deep and steadily increasing body of empirical data on which to model optimal decision-making. The step from such models to an algorithm is short. Such translation of decisions into algorithms was pioneered in the birth of algorithmic trading, which now constitutes the majority of developed market activity.

Advanced machine learning means that computers can adjust their algorithms to evolving circumstances, autonomously adapting the decision-making process frozen in the algorithm to new situations, so that something very similar to human reasoning emerges (in automated form).

The great advantage of this will be the elimination of ugly humanity: algorithmic trading replaced the testosterone-driven bouts of exuberance and panic found in the fraternity binges of earlier markets. Perhaps algorithmic decision-making can replace class and race prejudice in human regulators, taking with it the uncertainty arising from whether a human decision-maker sees someone like himself in the defendant or has just eaten some sugar (or rather needs some).

The great weakness of algorithmically packaged decision-making is opacity. The ‘black box’ makes the decision, but we do not see what original command was injected into the box, or how the box itself was designed. When a robot refuses our payment for lack of a ZIP code or a vending machine rejects our crumpled dollar we cannot demand to speak with its programmer. The algorithm designed so carefully at a distant point in the production process now operates on a vast scale at such speed that its hardwired prejudices can become “weapons of math destruction,” as Cathy O'Neil calls them. The actual decision is many steps removed from the process of reasoning that eventually found its way into the algorithm. Layers of camouflage and an aura of mechanical perfection separate what might have been a very arbitrary decision from those suffering it.

As more and more of the decisions made in our legal systems are translated into algorithms, the risk of injustice inflicted at a great distance and on a grand scale increases. The cost savings, convenience, and (to many) welcome escape from interacting with unpleasant humans that such algorithmic decision-making offers are difficult to resist. Such systems will quickly gather very considerable momentum.

Lawyers should view algorithms as simply another procedure for supervised decision-making, and they should pay close attention to them. It is crucial that people who are skilled at evaluating the objectivity of decisions and the regulations designed to ensure it, including ordinary lawyers and judges, but also jurisprudence scholars, philosophers of law, human rights experts and constitutional scholars, get involved in the design of important algorithms from the very beginning. There is nothing ‘luddite’ about wanting justice, even when the process is automated. Once an algorithm is in operation, it will be far more difficult to unwind.

David Donald, Hong Kong


Copyright © 2018 All Rights Reserved. Faculty of Law, The Chinese University of Hong Kong