Review: Nicholas Heron on Lorraine Daston

Critique of Pure Mindlessness

In the 1969 postscript to The Structure of Scientific Revolutions, Thomas Kuhn distinguished two different senses in which ‘paradigm’, the technical term his book popularised, had been used. On the one hand, a paradigm denoted the ‘entire constellation of beliefs, values, techniques, and so on shared by the members of a given community’. In this first sense, paradigm was employed sociologically, an application Kuhn regretted in retrospect. ‘Disciplinary matrix’ became his preferred locution for the shared commitments defining a specific scientific community. 

On the other hand (and in a stricter sense), paradigm denoted only one element in that constellation: a model or an example that could ‘replace explicit rules as a basis for the solution of the remaining puzzles of normal science’. In this second, ‘deeper’ understanding of the term, paradigms were equated with the concrete problem-solutions (the example Kuhn gives is Newton’s Second Law of Motion, typically written f = ma) enshrined in a scientific community’s textbooks, lectures and laboratory exercises, which scientists learned to apply in their research. 

Identification of paradigms in this second sense constituted the most novel and least understood contribution of his book, Kuhn claimed. Scientists, he observed, learn to solve new problems by modelling them on previous problem-solutions. They learn, that is, to see new problems as analogous to old ones, and to solve the former by applying the same schema that had worked in the latter. Scientific learning thus proceeded by means of the application of shared exemplars, and not on the basis of a set of rules secured in advance through scholarly consensus. Paradigms guide scientific research, Kuhn argued, even in the absence of any set of rules that would find expression in them.  

The use of paradigms thus entailed a special kind of knowledge, one that could be acquired only through practice. For his critics, this was no knowledge at all. But Kuhn insisted that this knowledge without rules met almost all the criteria of knowledge as conventionally understood. It was transmitted via education and acquired through training. It had won out over historical rivals in determining its group’s current environment. It was subject to change or modification when no longer fit for purpose.  

Yet in one crucial respect it was quite different, Kuhn conceded. Its exponents possessed no direct access to what they knew – there were no rules or generalisations, abstracted from the exemplars themselves, with which to express this knowledge. For Kuhn’s critics, this was tantamount to making individual intuition the basis of scientific knowledge. It exposed his book to charges of glorifying subjectivity or, even worse, irrationality. Nonetheless, both Kuhn and his critics took for granted the existence of a fundamental opposition between paradigms and rules. 

It is the apparent self-evidence of this opposition that the historian of science Lorraine Daston seeks to unsettle in her fascinating recent study, Rules: A Short History of What We Live By. Approaching the problem historically, Daston argues that, until relatively recently, paradigm was in fact one of rule’s prevailing meanings. Thus, as late as the mid-eighteenth century, the first entry for ‘Règle’ in the Enlightenment Encyclopédie could still observe that rule and model may in some situations be employed interchangeably, as for example in ‘the life of Our Lord is the rule or model for Christians’.  

This furnishes the platform for her double intervention. Firstly, she seeks to recover the ‘lost coherence’ of the pre-modern category of rule in European culture, which for centuries accommodated meanings – rule-as-model and rule-as-algorithm – subsequently deemed antithetical to one another. Secondly, she attempts to trace the progressive narrowing of the meaning of rules, whereby algorithms not only dethroned models as the pre-eminent rules but simultaneously caused the latter to appear impervious to rational scrutiny. It is this second move that gives Daston’s analysis its critical edge. For it enables her to inscribe her history of rules in the broader context of a history of rationality. 


Arundo donax is a giant perennial cane that has grown in the wetlands of the Mediterranean for millennia. Kanon, the ancient Greek word for ‘rule’, derives from the Semitic name for this plant, whose large, unbending stalks were used in the antique world as measuring rods. Three distinct semantic clusters, Daston argues, branched from this ‘reedy stem’. Since Greco-Roman antiquity, rules have figured, alternately and simultaneously, as: (i) tools of measurement and computation, (ii) models to be emulated, and (iii) laws that divide and demarcate. The first and third of these meanings remain instantly recognisable to denizens of the twenty-first century. But the second strikes us as odd, Daston contends; as Kuhn’s attempt to define his paradigms as knowledge without rules shows, models have come to seem the very antithesis of rules. How to account for this extraordinary transformation? 

At first glance, no body of rules appears more severe, or more punctiliously observed, than those of a Benedictine monastery. When to rise in the morning to pray, what time to gather for meals, what to eat and drink on which occasion, and how much: every aspect of monastic existence, down to the smallest, most insignificant detail, is governed according to the rules meticulously recorded in the seventy-three chapters of the Rule of Saint Benedict. ‘If micro-managers have a patron saint,’ Daston observes, ‘it is surely Saint Benedict.’ 

But the Benedictine rule also has its living embodiment in the exceptional person of the abbot, Christ’s representative in the monastery, who is continually enjoined to adapt its rigour to the specificity of circumstances. No sooner are the precepts promulgated than their exceptions multiply. Chapter 40, for example, states that one cup of wine per day should be sufficient for each monk. But then it immediately adds: ‘if the circumstances of the place, or their work, or the heat of summer require more, let the superior be free to grant it.’ The rule, properly understood, refers to monastic life in its totality, not to each individual prescription. The abbot does not merely enforce the rule; he exemplifies it through exercising his discretion (discretio). It is in this respect that he furnishes a model or rule to be emulated. Through emulation, the brethren too learn to use discretion, applying judgement to translate their experience, by analogy, to new cases – much like Kuhn’s trainee scientists. 

It is this discretionary power that would come to be excluded from the ambit of modern rules, Daston argues. In a sense, the very aim of modern rules, she observes, is to minimise discretion, understood now as arbitrary whim or caprice. Yet the abbot’s discretion was anything but arbitrary; it involved a kind of practical wisdom – a technē or an ars, to invoke the ancient philosophical register – that knows how to adjust the general rule to the particular case. It formed an integral part of the rule, not a departure from it.  

Pre-modern rules thus included their exceptions. As distinct from their modern counterparts, they were ‘thick’, in Daston’s apposite expression. Although still formulated in the imperative mood, they were routinely embellished with qualifications, caveats, and provisos. They could never be so thick as to replace experience. But nor could they be so thin as to obviate the need for reflection. Instead, they shuttled back and forth between abstract theory and concrete application. Even the rules of early modern mathematics, such as the Rule of Three, which used cross-multiplication to determine the value of an unknown variable, necessitated working through examples. Simple formulae including two similars and one contrary (‘if 3 Avignon florins are worth 2 royal francs, how much would 20 Avignon florins be worth?’ is the fifteenth-century example Daston cites) could be generalised from one case to another. Early modern mathematics too required tallying knowledge with experience, reflection with practice. 
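
To see what working such a rule involved, the fifteenth-century florin problem can be set out in modern notation (the symbolism, of course, is ours, not the period’s): multiply the middle term by the last and divide by the first.

\[
\frac{3\ \text{florins}}{2\ \text{francs}} = \frac{20\ \text{florins}}{x}
\qquad\Longrightarrow\qquad
x = \frac{2 \times 20}{3} = \frac{40}{3} \approx 13\tfrac{1}{3}\ \text{royal francs}.
\]

The arithmetic is the easy part; the skill the worked examples were meant to impart was recognising which of the three given quantities played which role.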

Initially, Daston claims, the indeterminate character of the pre-modern rule also held for what would in time become the modern rule par excellence: the algorithm. As with pre-modern rules more generally, employing algorithms for the purposes of calculation entailed coordinating head with hand, knowing with doing. Even algorithms, the pre-eminently thin rules, were originally thick. Typically amalgams of other algorithms, they were built up out of memorised techniques, acquired through practice and accumulated layer by layer. Sedimented in their composite form was a movement from problem to solution and back again. Only through repeated application to particulars did they acquire their generality. In their combination of species and genera, Daston claims, they resembled nothing so much as the taxonomies of natural history. 

In the history of rules, what made the algorithm the site where head and hand finally separated was the discovery of the principle of the division of labour. In his 1776 Inquiry into the Nature and Causes of the Wealth of Nations, Adam Smith famously used the ‘trifling’ example of pin manufacture to illustrate the great benefits of this principle. Were all the constituent parts of a pin to be made by just one man, Smith observed, he could scarcely produce a single item per day. But when the trade was divided into its roughly eighteen distinct operations, each performed by different hands, a workshop of ten men, he estimated, could produce upwards of forty-eight thousand pins a day.  

Smith limited his demonstration of the ‘universal opulence’ resulting from this principle to instances of physical work. Later proponents of the idea, such as the English polymath Charles Babbage (1791–1871), extended its application to mental operations as well, specifically to processes of calculation and computation. When broken down according to the division of labour, basic calculation and computation could be mechanised, Babbage and others quickly realised. That is, they could be performed by so-called mechanicals, the lowest class of manual labourers, who had previously toiled with their hands alone. Indeed, experiments soon showed that it was the least mathematically proficient workers who made the fewest mistakes. 

The algorithms mechanicals learned to handle no longer required the exercise of analogical reasoning. The problems to which they could be applied were fixed and specified down to the last detail. The context for their use was sealed off and hence insulated from outside variables. The equations they contained demanded only the most elementary arithmetic of addition and subtraction. It was the division of labour, and not the use of machines (the development of which, Smith thought, this principle encouraged), that enabled the algorithms themselves to become merely mechanical. In the age of artificial intelligence, we are accustomed to thinking in terms of sentient machines. But a necessary, though by no means sufficient, condition for this later transformation was making mindless the hitherto mindful tasks that algorithms performed. 
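
What ‘addition and subtraction alone’ could accomplish is nicely illustrated by the method of finite differences, the technique behind the great table-making projects and, later, Babbage’s Difference Engine: once a polynomial’s initial differences are known, every further value of its table follows by addition, with no multiplication and no judgement required. Here is a minimal sketch in Python (an illustration of the general technique only; the function names are ours):

def difference_table(f, start, degree):
    """Initial value and leading finite differences of f at `start`."""
    values = [f(start + i) for i in range(degree + 1)]
    diffs = []
    while values:
        diffs.append(values[0])
        values = [b - a for a, b in zip(values, values[1:])]
    return diffs

def tabulate(diffs, count):
    """Extend the table `count` steps using addition alone."""
    diffs = list(diffs)
    out = []
    for _ in range(count):
        out.append(diffs[0])
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]  # each level absorbs the one below it
    return out

# A table of squares: f(x) = x**2 has constant second difference 2.
print(tabulate(difference_table(lambda x: x * x, 0, 2), 8))
# -> [0, 1, 4, 9, 16, 25, 36, 49]

The worker executing the loop need not know what a square is – which is precisely the point.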


Nine years prior to her history of rules, Daston co-authored, with five fellow historians and social scientists, a study titled How Reason Almost Lost Its Mind: The Strange Career of Cold War Rationality. It positioned the Cold War period as an important chapter in the longer history of reason. For millennia, the authors explain, reason and rationality had been understood to be roughly synonymous with one another, or at least to refer to functionally ordered capacities, with rationality cast in the subservient role of instrument relative to the higher intellectual faculty of reason. The specific achievement of Cold War rationality was not only to have opened a gap between the two, but to have inverted the usual hierarchy of relations. The great dream of the social scientists and military strategists who form the subject of that book was to liberate a pure rationality that would operate independently of reason.  

Rules play a significant part in this story. In the atomic age, it was argued, decisions had become too consequential to be left to the determination of fallible human reason alone. What was needed was a kind of formalisation of decision-making – one that, at the limit, could extinguish the rogue individual human element altogether. This entailed reconceiving human beings as rational agents whose behaviour could be accurately predicted insofar as it could be modelled according to a strictly delimited set of rules. And the prototypical rule in these accounts, since it admitted of no ambiguity, was the algorithm.  

For centuries restricted to the domain of arithmetic, the ‘humble’ algorithm suddenly burst its bounds to assume centre stage in this vast reconfiguration of knowledge. Although often allied to conflicting research programmes, the participants in the debates over Cold War rationality – the economists, sociologists, cognitive scientists, behavioural psychologists and game theorists – were unified by their shared conviction that rationality could be specified in terms of algorithmic rules. 

What made this modest tool of calculation a candidate for redefining rationality was the very feature that would have immediately disqualified it in the eyes of its earlier proselytisers. For these thinkers, the fact that mindless machines, such as Babbage’s difference and analytical engines, could be programmed to perform astonishing feats of calculation made the algorithm the most unlikely of avatars for higher intelligence. Reason remained the exclusive privilege of the machine’s maker – certainly not of the machine itself, still less of the rules that enabled its functioning. Consistent with the progressive social philosophies of the period, the division of cognitive labour, in its early formulations, aimed to free the most accomplished mathematical minds for pursuits more elevated than the humdrum work of computing. 

But Cold War rationalists had their utopian visions as well. In an increasingly turbulent world, with the most violent conflict in human history in recent memory, abstract formal models such as the ubiquitous prisoner’s dilemma were applied to a wide range of real-world situations, with the aim of reducing the complex of choices and decisions available to rational actors to a finite ensemble of rules. For these thinkers, it was the rules themselves, independently of any ratiocination, that had become the basis of rationality. With this subtle shift in perspective, enacted simultaneously across multiple disciplinary fields and extending to the highest levels of public administration, order could be restored to a chaotic environment. An unruly world could be made ‘rulier’. 
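
The prisoner’s dilemma gives a flavour of the reduction involved. Under the standard textbook payoffs (the numbers below are the conventional illustration, not figures from Daston), defection is each player’s best reply whatever the other does, so the ‘rational’ outcome can be read mechanically off the matrix. A minimal sketch in Python:

# Standard prisoner's dilemma payoffs (higher is better for each player).
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}

def best_reply(opponent_move):
    """The row player's best response, computed purely from the matrix."""
    return max(("cooperate", "defect"),
               key=lambda move: PAYOFFS[(move, opponent_move)][0])

# Defection dominates: it is the best reply to either move, even though
# mutual cooperation would leave both players better off.
assert best_reply("cooperate") == "defect"
assert best_reply("defect") == "defect"

Once the payoffs are fixed, the ‘decision’ reduces to a table lookup: rationality as rule-following, without ratiocination.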

One of the principal aims of Daston’s study is to show that this transformation could not have taken place without a significant narrowing of the meaning of rules. The thick and flexible rules of the pre-modern world, which included their exceptions, first had to give way to the thin and rigid rules of its modern counterpart, which tolerated none.  

Yet there was nothing inevitable about this transformation, she contends. It was not the consequence of some inexorable logic of modernity; indeed, the modernisation paradigm was itself one of Cold War rationality’s most pervasive tropes. Whether general or specific in scope and application, algorithmic rules require a fixed and stable frame of reference to guarantee their validity. And such uniformity could only be the contingent and provisional result of a coordinated effort of technological foresight and political will. From the regulation of traffic through to the standardisation of orthography, considerable energy and resources were expended to make a world governable by thin rules. 

The resulting stability was purchased at significant cost, according to Daston. Arguably the book’s most penetrating shadow theme is the demotion of judgement and discretion concomitant with the ascent of rules. In his Kritik der Urteilskraft [Critique of the Power of Judgement] of 1790, Immanuel Kant could still frame his definition of artistic genius in terms of a hierarchical distinction between rule and rules. Genius, he wrote, is the talent that ‘gives the rule to art’. That is, it is a talent for producing that for which no preceding rule can be given. Yet for precisely this reason, he continued, its products must serve as ‘models’, i.e. ‘standards or rules’, against which others may be judged. In this respect, it differs in kind (and not merely in degree) from scientific erudition or technical expertise, because the latter can be learned by rules.  

The new precedence assumed by rules in modern societies upended this hierarchy completely. It is not just that the algorithm has replaced the model as the pre-eminent rule. As the response to Kuhn’s privileging of paradigms attests, algorithmic rationality has rendered suspect the very intelligence that sanctions departing from the letter of a rule in order to realise it more effectively. The faculty once claimed to be coterminous with reason itself could now be dismissed as mere intuition and subjectivity, and as such inscrutable and resistant to analysis. In the interest of maintaining well-ordered polities, modern societies seek to narrow the margin for judgement and discretion to a minimum.  

Success in creating ‘islands of uniformity, stability, and predictability’, Daston writes in one of the book’s most eloquent passages, has ‘fostered the dream of rules without exceptions, without equivocations, without elasticity’. Rules, no matter how minor and insignificant, that would assume the same constancy as the most all-encompassing and intransigent rules of all: the so-called laws of nature. But exceptions can never be eliminated completely. No matter how total it might seem, the triumph of thin rules is not irreversible. When the background conditions for thin rules collapse, Daston observes – for example, during a global pandemic – thick rules return with a vengeance. Only it can no longer be taken for granted that we still possess the cognitive skills required to adapt them. 

Works Cited
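
Daston, Lorraine. Rules: A Short History of What We Live By. Princeton: Princeton University Press, 2022.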

Encyclopédie, ou dictionnaire raisonné des sciences, des arts et des métiers, etc. Edited by Denis Diderot and Jean le Rond d’Alembert. ARTFL Encyclopédie Project (Autumn 2022 Edition), edited by Robert Morrissey and Glenn Roe. University of Chicago. s.v. “Regle, modele,” http://encyclopedie.uchicago.edu/.

Erickson, Paul, Judy L. Klein, Lorraine Daston, Rebecca Lemov, Thomas Sturm, and Michael D. Gordin. How Reason Almost Lost Its Mind: The Strange Career of Cold War Rationality. Chicago: University of Chicago Press, 2013.

Kant, Immanuel. Critique of the Power of Judgment. Edited by Paul Guyer. Translated by Paul Guyer and Eric Matthews. Cambridge: Cambridge University Press, 2000.

Kuhn, Thomas S. The Structure of Scientific Revolutions, 4th ed. Chicago: University of Chicago Press, 2012.

Regula S.P.N. Benedicti, https://www.thelatinlibrary.com/benedict.html.

Smith, Adam. An Inquiry into the Nature and Causes of the Wealth of Nations. Edited by R.H. Campbell and A.S. Skinner. 2 vols. Oxford: Clarendon Press, 1976.