Cass R. Sunstein, Governing by Algorithm? No Noise and (Potentially) Less Bias, 71 Duke L.J. 1175 (2022).


Abstract: As intuitive statisticians, human beings suffer from identifiable biases, cognitive and otherwise. Human beings can also be “noisy,” in the sense that their judgments show unwanted variability. As a result, public institutions, including those that consist of administrative prosecutors and adjudicators, can be biased, noisy, or both. Both bias and noise produce errors. Algorithms eliminate noise, and that is important; to the extent that they do so, they prevent unequal treatment and reduce errors. In addition, algorithms do not use mental shortcuts; they rely on statistical predictors, which means that they can counteract or even eliminate cognitive biases. At the same time, the use of algorithms by administrative agencies raises many legitimate questions and doubts. Among other things, algorithms can encode or perpetuate discrimination, perhaps because their inputs are based on discrimination, perhaps because what they are asked to predict is infected by discrimination. But if the goal is to eliminate discrimination, properly constructed algorithms nonetheless have a great deal of promise for administrative agencies.
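
The distinction the abstract draws between bias (systematic error) and noise (unwanted variability across judgments of the same case) can be illustrated numerically. The sketch below is a minimal illustration, not drawn from the article: the damage figures, the stipulated "true" award, and the single algorithmic output are all hypothetical, chosen only to show that a deterministic algorithm can still be biased while having zero noise.

```python
import statistics

# Hypothetical illustration (not from the article): awards for the *same* case,
# assessed by several human adjudicators and by one deterministic algorithm.
true_award = 100_000                                         # stipulated "correct" outcome

human_awards = [80_000, 95_000, 120_000, 70_000, 135_000]    # varies from judge to judge
algorithm_award = 90_000                                     # identical on every run

def bias(judgments, truth):
    """Systematic error: how far the average judgment sits from the truth."""
    return statistics.mean(judgments) - truth

def noise(judgments):
    """Unwanted variability: spread of judgments around their own average."""
    return statistics.pstdev(judgments)

print("Humans    -> bias:", bias(human_awards, true_award),
      " noise:", round(noise(human_awards)))
print("Algorithm -> bias:", bias([algorithm_award], true_award),
      " noise:", noise([algorithm_award]))
# In this toy example the human panel is unbiased on average but quite noisy,
# while the algorithm under-awards by 10,000 yet has zero noise, because it
# returns the same answer every time.
```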