## Arthur C. Clarke’s tertium non datur

### 2 March 2013

**Saturday**

**A**rthur C. Clarke is best remembered for his science fiction stories, but many of his dicta and aphorisms have become common currency and are quoted and repeated to the point that their connection to their source is sometimes lost. (Clarke’s thought ranged widely and, interestingly, Clarke identified himself as a **logical positivist**.) Recently I quoted one of Clarke’s well-known sayings in **Happy Birthday Nicolaus Copernicus!**, as follows:

“Two possibilities exist: either we are alone in the Universe or we are not. Both are equally terrifying.”

quoted in

*Visions: How Science Will Revolutionize the Twenty-First Century* (1999) by Michio Kaku, p. 295

**I**n so saying, Clarke asserted a particular case of what is known as the logical law (or principle) of the **excluded middle**, which is also known as *tertium non datur*: the idea that, given a proposition and its negation, either one or the other of them must be true. This is also expressed in propositional logic as **“P or not-P”** (**“P v ~P”**). The principle of

*tertium non datur* is not to be confused with the **principle of non-contradiction**, which can be formulated as **“~(P & ~P).”**

**E**ven stating *tertium non datur* is controversial, because there are narrowly logical formulations as well as ontological formulations of potentially much greater breadth. This, of course, is what makes the principle fascinating and gives it its philosophical depth. Moreover, the **principle of the excluded middle** is subtly distinct from the **principle of bivalence**, though the two usually work in conjunction. Whereas the law of the excluded middle states that, of a proposition and its negation, one or the other must be true, the principle of bivalence states that there are only two propositional truth values: true and false.
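By way of illustration, both classical principles can be checked mechanically: in two-valued logic a formula is a tautology just in case it comes out true under every assignment of truth values. Here is a minimal sketch in Python (the function names are merely illustrative):

```python
from itertools import product

def is_tautology(formula, num_vars=1):
    """True if the formula holds under every classical (two-valued) valuation."""
    return all(formula(*vals) for vals in product([True, False], repeat=num_vars))

excluded_middle = lambda p: p or (not p)           # P v ~P
non_contradiction = lambda p: not (p and (not p))  # ~(P & ~P)

assert is_tautology(excluded_middle)
assert is_tautology(non_contradiction)
assert not is_tautology(lambda p: p)  # a bare P, by contrast, is contingent
```

Note that this brute-force check itself presupposes bivalence: the valuations range over exactly two truth values.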

**T**o get started, here is the principle of the excluded middle as formulated in *The Cambridge Dictionary of Philosophy* edited by Robert Audi:

**principle of excluded middle**, the principle that the disjunction of any (significant) statement with its negation is always true; e.g., ‘Either there is a tree over 500 feet tall or it is not the case that there is such a tree’. The principle is often confused with the principle of bivalence.

*The Cambridge Dictionary of Philosophy*, second edition, General Editor Robert Audi, 1999, p. 738

**A**nd to continue the Oxbridge axis, here is the formulation from Simon Blackburn’s *The Oxford Dictionary of Philosophy*:

**excluded middle, principle (or law) of** The logical law asserting that either p or not-p. It excludes middle cases such as propositions being half correct or more or less right. The principle directly asserting that each proposition is either true or false is properly called the law of bivalence.

The Oxford Dictionary of Philosophy, Simon Blackburn, Oxford University Press, 1996, p. 129

**F**or more partisan formulations, we turn to other sources. Mario Bunge formulated a narrowly syntactical conception of the law of the excluded middle in his *Dictionary of Philosophy*, which is intended to embody a scientistic approach to philosophy:

**EXCLUDED MIDDLE** A logical truth or tautology in ordinary (classical) logic: For every proposition p, p v ~p.

Dictionary of Philosophy, Mario Bunge, Prometheus Books, 1999, p. 89

**B**y way of contrast, in D. Q. McInerny’s **Being Logical: A Guide to Good Thinking** we find a strikingly ontological formulation of the law of the excluded middle:

“Between being and nonbeing there is no middle state. Something either exists or it does not exist; there is no halfway point between the two.”

D. Q. McInerny, *Being Logical: A Guide to Good Thinking*, Part Two, The Basic Principles of Logic, 1. First Principles, p. 26

**W**hat these diverse formulations bring out for us is the difficulty of separating logical laws about how formal systems are to be constructed from ontological laws about how the world is constructed. In bringing out this difficulty, they also show us the relation between the law of the excluded middle and the principle of bivalence: the logical intuition that there are only two possible truth values for any one proposition — true or false — is closely tied to our logical intuition that, of these two values, one or the other (but not both, which qualification is the principle of non-contradiction) must hold for any given meaningful proposition.

**T**he powerful thing about Clarke’s observation is that it appeals to this admixture of logical intuitions and empirical intuitions, and in so doing seems to say something very compelling. Indeed, since I am myself a realist, and I think it can be shown that there is a fact of the matter that makes propositions true or false, I think that Clarke not only said something powerful, he also said something *true*: either there are extraterrestrial intelligences or there are not. It is humbling to contemplate either possibility: ourselves utterly alone in a vast universe with no other intelligent species or civilizations, or some other alien intelligence out there somewhere, unknown to us at present, but waiting to be discovered — or to discover us.

**A**lthough these logical intuitions are powerful, and have shaped human thought from its earliest times to the present day, the law of the excluded middle has not gone unquestioned, and indeed Clarke’s formulation gives us a wonderful opportunity to explore the consequences of the difference between constructive and non-constructive reasoning in terms of a concrete example.

**T**o formulate the existence or non-existence of extraterrestrials in the form of a logical law like the law of the excluded middle makes the implicit realism of Clarke’s formulation obvious as soon as we think of it in these terms. In imagining the possibilities of our cosmic isolation or an unknown alien presence our terror rests on our intuitive, visceral feeling of realism, which is as immediate to us as the intuitions rooted in our own experiences as bodies.

**T**he constructivist (at least, most species of constructivist, but not necessarily all) must deny the validity of the *tertium non datur* formulation of the presence of extraterrestrials, and in so doing the constructivist must pretend that our visceral feelings of realism are misleading or false, or must simply deny that these feelings exist. None of these are encouraging strategies, especially if one is committed to understanding the world by getting to the bottom of things rather than denying that they exist. Not only am I a realist, but I also believe strongly in the attempt to do justice to our intuitions, something that I have described in two related posts, **Doing Justice to Our Intuitions** and **How to Formulate a Philosophical Argument on Gut Instinct**.

**I**n **P or not-P** (as well as in subsequent posts concerned with constructivism, **What is the relationship between constructive and non-constructive mathematics?**, **Intuitively Clear Slippery Concepts**, and **Kantian Non-constructivism**) I surveyed constructivist and non-constructivist views of *tertium non datur* and suggested that constructivists and non-constructivists need each other, as each represents a distinct point of view on formal thought. Formal thought is enriched by these diverse perspectives.

**B**ut whereas non-constructive thought, which is largely composed of classical realism, can accept both the constructivist and non-constructivist points of view, the many varieties of constructivism usually explicitly deny the validity of non-constructive methods and seek to systematically limit themselves to constructive methods and constructive principles. Most famously, L. E. J. Brouwer (whom I have previously discussed in **Saying, Showing, Constructing** and **One Hundred Years of Intuitionism and Formalism**) formulated the philosophy of mathematics we now know as intuitionism, which is predicated upon the explicit denial of the law of the excluded middle. Brouwer, and those following him such as Heyting, sought to formulate mathematical and logical reasoning without the use of *tertium non datur*.

**B**rouwer and the intuitionists (at least as I interpret them) were primarily concerned to combat the growing influence of Cantor and his set theory in mathematics, which seemed to them to license forms of mathematical reasoning that had gone off the rails. Cantor had gone too far, and the intuitionists wanted to rein him in. They were concerned about making judgments about infinite totalities (in this case, sets with an infinite number of members), which the law of the excluded middle, when applied to the infinite, allows one to do. This seems to give us the power to make deductions about matters we can neither conceive nor even (as it is sometimes said) survey. “Surveyability” became a buzzword in the philosophy of mathematics after Wittgenstein began using it in his posthumously published *Remarks on the Foundations of Mathematics*. Although Wittgenstein was not himself an intuitionist *sensu stricto*, his work set the tone for constructivist philosophy of mathematics.

**G**iven the intuitionist rejection of the law of the excluded middle, it is not correct to say that there either is intelligent alien life in the universe or there is not intelligent alien life in the universe; to meaningfully make this statement, one would need to actually observe (inspect, survey) all possible locations where such alien intelligence might reside, and only after seeing it for oneself could one legitimately claim that there is or is not alien intelligence in the universe. For an example closer to home, it has been said that an intuitionist will deny the truth of the statement “either it is raining or it is not raining” without looking out the window to check and see. This can strike one as merely perverse, but we must take the position seriously, as I will try to show with the next example.
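The intuitionist’s scruple can be loosely illustrated by admitting a third truth value, “undetermined,” for statements not yet verified, as in Kleene’s strong three-valued logic. (This is only an approximation, since intuitionistic logic is not truth-functional and is not captured by any finite set of truth values, but it is enough to show how “P or not-P” can fail to come out true.) A sketch in Python:

```python
# Kleene's strong three-valued logic: T (true), F (false), and U (undetermined),
# the last for statements we have not yet verified, such as "it is raining"
# before we look out the window. Truth values are encoded numerically so that
# negation is 1 - p and disjunction is max.
T, U, F = 1, 0.5, 0

def k_not(p):
    return 1 - p

def k_or(p, q):
    return max(p, q)

assert k_or(T, k_not(T)) == T  # determinately true P: excluded middle holds
assert k_or(F, k_not(F)) == T  # determinately false P: excluded middle holds
assert k_or(U, k_not(U)) == U  # undetermined P: "P or not-P" is itself undetermined
```

Until we look out the window, “it is raining” takes the value U, and so does “either it is raining or it is not raining.”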

**A**lready in classical antiquity, Aristotle brought out a striking feature of the **law of the excluded middle**, in a puzzle sometimes known as the “sea battle tomorrow.” The idea is simple: either there will be a sea battle tomorrow, or there will *not* be a sea battle tomorrow. We may not know anything about this battle, and as of today we do not even know if it will take place, but we can nevertheless confidently assert that either it will take place or it will *not* take place. This is the law of the excluded middle as applied to *future contingents*.

**O**ne way to think of this odd consequence of the law of the excluded middle is that when it is projected beyond the immediate circumstances of our ability to ascertain its truth by observation it becomes problematic. This is why the intuitionists reject it. Aristotle extrapolated the law of the excluded middle to the future, but we could just as well extrapolate it into the past. Historians do this all the time (either Alexander cut the Gordian Knot or Alexander did *not* cut the Gordian Knot), but because of our strong intuitive sense of historical realism this does not feel as odd as asserting that something that hasn’t happened yet either will happen or will not happen.

**I**n terms of Clarke’s dichotomy, we could reformulate Aristotle’s puzzle about the sea battle tomorrow in terms of the discovery of alien intelligence tomorrow: either we will receive an alien radio broadcast tomorrow, or we will *not* receive an alien broadcast tomorrow. There is no third possibility. One way or another, the realist says, one of these propositions is true, and one of them is false. We do not know, today, which one of them is true and which one of them is false, but that does not mean that they do not possess definite truth values. The intuitionist says that the assertion today that we will or will not receive an alien radio broadcast is meaningless until tomorrow comes and we turn on our radio receivers to listen.

**T**he intuitionists thus have an answer to this puzzling paradox that remains a problem for the realist. This is definitely a philosophical virtue for intuitionism, but, like all virtues, it comes at a price. It is not a price I am willing to pay. This path can also lead us to determinism — assuming that all future contingents have a definite truth value implies that they are set in stone — but I am also not a determinist (as I discussed in **The Denial of Freedom as a Philosophical Problem**), and so this intersection of my realism with my libertarian free willist orientation leaves me with a problem that I am not yet prepared to resolve. But that’s what makes life interesting.

**. . . . .**

**. . . . .**

**. . . . .**

## The Church-Turing Thesis and the Asymmetry of Intuition

### 23 November 2012

**Friday**

**W**hat is the Church-Turing Thesis? The Church-Turing Thesis, also called Church’s Thesis, Church’s Conjecture, or the Church-Turing Conjecture, is an idea from theoretical computer science that emerged from research in the foundations of logic and mathematics. It ultimately bears upon what can be computed, and thus, by extension, upon what a computer can do (and what a computer cannot do).

Note: For clarity’s sake, I ought to point out that Church’s Thesis and Church’s Theorem are distinct. Church’s Theorem is an established theorem of mathematical logic, proved by Alonzo Church in 1936, that there is no decision procedure for logic (i.e., there is no method for determining whether an arbitrary formula in first order logic is a theorem). But the two – Church’s Theorem and Church’s Thesis – are related: both follow from the exploration of the possibilities and limitations of formal systems and the attempt to define these in a rigorous way.

**E**ven to state Church’s Thesis is controversial. There are many formulations, and many of these alternative formulations come straight from Church and Turing themselves, who framed the idea differently in different contexts. Also, when you hear computer science types discuss the Church-Turing thesis you might think that it is something like an engineering problem, but it is essentially a philosophical idea. What the Church-Turing thesis is *not* is as important as what it is: it is *not* a theorem of mathematical logic, it is *not* a law of nature, and it is *not* a limit of engineering. We could say that it is a *principle*, because the word “principle” is ambiguous and thus covers the various formulations of the thesis.

**T**here is an article on the Church-Turing Thesis at the **Stanford Encyclopedia of Philosophy**, one at Wikipedia (of course), and even a website dedicated to a critique of the Stanford article, **Alan Turing in the Stanford Encyclopedia of Philosophy**. All of these are valuable resources on the Church-Turing Thesis, and well worth reading to gain some orientation.

**O**ne way to formulate Church’s Thesis is that *all effectively computable functions are general recursive*. Both “effectively computable functions” and “general recursive” are *technical* terms, but there is an important difference between these technical terms: “effectively computable” is an intuitive conception, whereas “general recursive” is a formal conception. Thus one way to understand Church’s Thesis is that it asserts the identity of a formal idea and an informal idea.

**O**ne of the reasons that there are many alternative formulations of the Church-Turing thesis is that there are several formally equivalent formulations of recursiveness: recursive functions, Turing computable functions, Post computable functions, representable functions, lambda-definable functions, and Markov normal algorithms among them. All of these are formal conceptions that can be rigorously defined. For the other term that constitutes the identity that is Church’s thesis, there are also several alternative formulations of effectively computable functions, and these include other intuitive notions like that of an algorithm or a procedure that can be implemented mechanically.
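The formal half of the identity can be given a concrete flavor with a small sketch: the familiar recursive definitions of addition and multiplication, built from nothing but zero, successor, and recursion, in the style of the primitive recursive functions (rendered here in Python purely for illustration):

```python
def succ(n):
    """The successor function, one of the basic building blocks."""
    return n + 1

def add(m, n):
    # add(m, 0) = m ;  add(m, succ(n)) = succ(add(m, n))
    return m if n == 0 else succ(add(m, n - 1))

def mul(m, n):
    # mul(m, 0) = 0 ;  mul(m, succ(n)) = add(mul(m, n), m)
    return 0 if n == 0 else add(mul(m, n - 1), m)

assert add(3, 4) == 7
assert mul(3, 4) == 12
```

Each function is defined by its value at zero together with a rule for passing from n to n + 1; that such definitions can be made fully rigorous is what qualifies “general recursive” as a formal conception, in contrast to the intuitive notion of effective computability.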

**T**hese may seem like recondite matters with little or no relationship to ordinary human experience, but I am surprised how often I find the same theoretical conflict played out in the most ordinary and familiar contexts. The dialectic of the formal and the informal (i.e., the intuitive) is much more central to human experience than is generally recognized. For example, the conflict between intuitively apprehended free will and apparently scientifically unimpeachable determinism is a conflict between an intuitive and a formal conception that both seem to characterize human life. Compatibilist accounts of determinism and free will may be considered the “Church’s thesis” of human action, asserting the identity of the two.

**I**t should be understood here that when I discuss *intuition* in this context I am talking about the kind of mathematical intuition I discussed in **Adventures in Geometrical Intuition**, although the idea of mathematical intuition can be understood as perhaps the narrowest formulation of that intuition that is the polar concept standing in opposition to formalism. Kant made a useful distinction between *sensory intuition* and *intellectual intuition* that helps to clarify what is intended here, since the very idea of intuition in the Kantian sense has become lost in recent thought. Once we think of intuition as something given to us in the same way that sensory intuition is given to us, only without the mediation of the senses, we come closer to the operative idea of intuition as it is employed in mathematics.

**M**athematical thought, and formal accounts of experience generally speaking, of course, seek to capture our intuitions, but this formal capture of the intuitive is itself an intuitive and essentially creative process even when it culminates in the formulation of a formal system that is essentially inaccessible to intuition (at least in parts of that formal system). What this means is that intuition can know itself, and know itself to be an intuitive grasp of some truth, but formality can only know itself as formality and cannot cross over the intuitive-formal divide in order to grasp the intuitive even when it captures intuition in an intuitively satisfying way. We cannot even understand the *idea* of an intuitively satisfying formalization without an intuitive grasp of all the relevant elements. As Spinoza said that *the true is the criterion both of itself and of the false*, so we can say that the intuitive is the criterion both of itself and of the formal. (And given that, today, truth is primarily understood formally, this is a significant claim to make.)

**T**he above observation can be formulated as a general principle such that the intuitive can grasp all of the intuitive and a portion of the formal, whereas the formal can grasp only itself. I will refer to this as the *principle of the asymmetry of intuition*. We can see this principle operative both in the Church-Turing Thesis and in popular accounts of Gödel’s theorem. We are all familiar with popular and intuitive accounts of Gödel’s theorem (since the formal accounts are so difficult), and it is not unusual for such accounts to make claims for the limitative theorems that go far beyond what they formally demonstrate.

**A**ll of this holds also for the attempt to translate traditional philosophical concepts into scientific terms — the most obvious example being free will, supposedly accounted for by physics, biochemistry, and neurobiology. But if one makes the claim that consciousness is *nothing but* such-and-such physical phenomenon, it is impossible to cash out this claim in any robust way. The science is quantifiable and formalizable, but our concepts of mind, consciousness, and free will remain stubbornly intuitive and have not been satisfyingly captured in any formalism — the determination of any such satisfying formalization could only be determined by intuition and therefore eludes any formal capture. To “prove” determinism, then, would be as incoherent as “proving” Church’s Thesis in any robust sense.

**T**here certainly are interesting philosophical arguments on both sides of Church’s Thesis — that is to say, both its denial and its affirmation — but these are arguments that appeal to our intuitions and, most crucially, our idea of ourselves is intuitive and informal. I should like to go further and to assert that the idea of the self *must* be intuitive and cannot be otherwise, but I am not fully confident that this is the case. Human nature can change, albeit slowly, along with the human condition, and we could, over time — and especially under the selective pressures of **industrial-technological civilization** — shape ourselves after the model of a formal conception of the self. (In fact, I think it very likely that this is happening.)

**I** cannot even say — I would not know where to begin — what would constitute a formal self-understanding of the self, much less any kind of understanding of a *formal self*. Well, *maybe* not. I have written elsewhere that the doctrine of the punctiform present (not very popular among philosophers these days, I might add) is a formal doctrine of time, and in so far as we identify internal time consciousness with the punctiform present we have a formal doctrine of the self.

**W**hile the above account is one to which I am sympathetic, this kind of formal concept — I mean the punctiform present as a formal conception of time — is very different from the kind of formality we find in physics, biochemistry, and neuroscience. We might assimilate it to some mathematical formalism, but this is an abstraction made concrete in subjective human experience, not in physical science. Perhaps this partly explains the **fashionable anti-philosophy** that I have written about.

**. . . . .**

**. . . . .**

**. . . . .**

## One Hundred Years of Intuitionism and Formalism

### 14 October 2012

**Sunday**

**A** message to the foundations of mathematics (FOM) listserv by **Frank Waaldijk** alerted me to the fact that today, 14 October 2012, is the one hundredth anniversary of Brouwer’s inaugural address at the University of Amsterdam, **“Intuitionism and Formalism.”** (I have discussed Frank Waaldijk earlier in **P or Not-P** and **What is the Relationship Between Constructive and Non-Constructive Mathematics?**)

**I** have called this post “One Hundred Years of Intuitionism and Formalism” but I should have called it “One Hundred Years of Intuitionism” since, of the three active contenders as theories for the foundations of mathematics a hundred years ago, only intuitionism is still with us in anything like its original form. The other contenders — formalism and logicism — are still with us, but in forms so different that they no longer resemble any kind of programmatic approach to the foundations of mathematics. In fact, it could be said that logicism was gradually transformed into technical foundational research, primarily logical in character, without any particular programmatic content, while formalism, following in a line of descent from Hilbert, has also been incrementally transformed into mainstream foundational research, but primarily mathematical in character, and also without any particular programmatic or even philosophical content.

**T**he very idea of “foundations” has come to be questioned in the past hundred years — though, as I commented a few days ago in **The Genealogy of the Technium**, the early philosophical foundationalist programs continue to influence my own thinking — and we have seen that intuitionism has been able to make the transition from a foundationalist-inspired doctrine to a doctrine that might be called mathematical “best practices.” In contemporary philosophy of mathematics, one of the most influential schools of thought for the past couple of decades or more has been to focus not on theories of mathematics, but rather on mathematical practices. Sometimes this is called “neo-empiricism.”

**I**ntuitionism, I think, has benefited from the shift from the theoretical to the practical in the philosophy of mathematics, since intuitionism was always about making a distinction between the acceptable and the unacceptable in logical principles, mathematical reasoning, proof procedures, and all those activities that are part of the mathematician’s daily bread and butter. This shift has also made it possible for intuitionism to distance itself from its foundationalist roots at a time when foundationalism is on the ropes.

**B**rouwer is due some honor for his prescience in formulating intuitionism a hundred years ago — and intuitionism came almost fully formed out of the mind of Brouwer as syllogistic logic came almost fully formed out of the mind of Aristotle — so I would like to celebrate Brouwer on this, the one hundredth anniversary of his inaugural address at the University of Amsterdam, in which he formulated so many of the central principles of intuitionism.

**B**rouwer was prescient in another sense as well. He ended his inaugural address with a quote from Poincaré that is well known in the foundationalist community, since it has been quoted in many works since:

“Les hommes ne s’entendent pas, parce qu’ils ne parlent pas la même langue et qu’il y a des langues qui ne s’apprennent pas.”

**T**his might be (very imperfectly) translated into English as follows:

“Men do not understand each other because they do not speak the same language and there are languages that cannot be learned.”

**W**hat Poincaré called *men not understanding each other* Kuhn would later and more famously call incommensurability. And while we have always known that men do not understand each other, it had been widely believed before Brouwer that at least mathematicians understood each other because they spoke the same universal language of mathematics. Brouwer said that his exposition revealed “the fundamental issue, which divides the mathematical world.” A hundred years later the mathematical world is still divided.

**F**or those who have not studied the foundations and philosophy of mathematics, it may come as a surprise that the past century, which has been so productive of research in advanced mathematics — arguably going beyond all the cumulative research in mathematics up to that time — has also been a century of conflict during which the ideas of mathematics as true, certain, and necessary — ideas that had been central to a core Platonic tradition of Western thought — have all been questioned and largely abandoned. It has been a raucous century for mathematics, but also a fruitful one. A clever mathematician with a good literary imagination could write a mathematical analogue of Mandeville’s *Fable of the Bees* in which it is precisely the polyglot disorder of the hive that made it thrive.

**T**hat core Platonic tradition of Western thought is now, even as I write these lines, dissipating just as the illusions of the philosopher, freed from the cave of shadows, dissipate in the light of the sun above.

**B**rouwer, like every revolutionary (and we recall that it was Weyl, who was sympathetic to Brouwer, who characterized Brouwer’s work as a revolution in mathematics), wanted to do away with an old, corrupt tradition and to replace it with something new and pure and edifying. But in the affairs of men, a revolution is rarely complete, and it is, far more often, the occasion of schism than conversion.

**M**any were converted by Brouwer; many are still being converted today. As I wrote above, intuitionism remains a force to be reckoned with in contemporary mathematical thought in a way that logicism and formalism cannot claim to be such a force. But the conversions and subsequent defections left a substantial portion of the mathematical community unconverted and faithful to the old ways. The tension and the conflict between the old ways and the new ways has been a source of creative inspiration.

**P**recisely that moment in history when the very nature of mathematics was called into question became the same moment in history when mathematics joined technology in exponential growth.

**M**ars is the true muse of men.

**. . . . .**

**. . . . .**

**. . . . .**

**. . . . .**

## Parsimonious Formulations

### 30 December 2009

**Wednesday**

**T**he principle of parsimony — also called Ockham’s Razor, after William of Ockham who gave the principle some of its most compelling formulations — is among the most venerable of principles in human thought. This must be one of the few medieval philosophical principles that remains a staple of thought even today. Few but Thomists would be able to make it through the bulk of the *Summa Theologiae*, and far fewer still would find much in it with which they could agree, but there are parts of Ockham that can be read like a contemporary. Ockham is among the very few medieval writers of whom we can say this, and he shares this status with the canonical texts of classical antiquity.

**N**ot long ago in **A Formulation of Naturalism** I cited Hallett’s book **Cantorian Set Theory and Limitation of Size** for its treatment of what Hallett called Cantor’s finitism, i.e., Cantor’s treatment of transfinite numbers as being like finite numbers as far as this methodological analogy could be made to hold. I suggested that a similar approach could be used to characterize naturalism in terms of materialism: we can treat naturalism like materialism by way of a methodological analogy that is employed as long as it can be made to work. Later, in **Two Thoughts on Naturalism**, I suggested that naturalism could be given a similar treatment vis-à-vis mechanism.

**S**uch formulations — the transfinite in terms of the finite, and naturalism in terms of materialism or mechanism — are minimalist formulations. Conceptual minimalism makes the most it can from the fewest resources. This is an application of the principle of parsimony. It has always been felt most strongly in the formal sciences. Axiomatization is an expression of this spirit of minimalism. Łukasiewicz’s reduction of the propositional calculus to a single axiom is another expression of the spirit of parsimony, as is the Polish notation for symbolic logic that he first formulated. The later Russell’s formulations in terms of “minimum vocabularies” must be counted a part of the same tradition, though Russell’s parsimonious roots go much deeper and are perhaps expressed most profoundly in his theory of descriptions.
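Łukasiewicz’s parenthesis-free notation is itself a small monument to parsimony: because every connective is written before its arguments, no punctuation is needed to fix the parse. A minimal evaluator in Python (an illustration only, using N for negation, K for conjunction, A for disjunction, and C for implication):

```python
def evaluate(formula, valuation):
    """Evaluate a propositional formula in Polish (prefix) notation."""
    def parse(i):
        ch = formula[i]
        if ch == 'N':                        # negation: one argument
            val, j = parse(i + 1)
            return (not val), j
        if ch in 'KAC':                      # binary connectives: two arguments
            left, j = parse(i + 1)
            right, k = parse(j)
            if ch == 'K':
                return (left and right), k
            if ch == 'A':
                return (left or right), k
            return ((not left) or right), k  # C: material implication
        return valuation[ch], i + 1          # a propositional variable
    value, _ = parse(0)
    return value

# "ApNp" is "p v ~p", true under either valuation of p:
assert evaluate('ApNp', {'p': True}) is True
assert evaluate('ApNp', {'p': False}) is True
```

The economy is real: the prefix position of each operator does the work that parentheses do in the usual notation, so the grammar of formulas needs nothing but the symbols themselves.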

**T**he language of parsimony is pervasive throughout contemporary logic and mathematics, such as when one says that, for example, Von Neumann–Bernays–Gödel set theory is a conservative extension of Zermelo–Fraenkel set theory (ZF). There is even a conservativity theorem of mathematical logic that formalizes this approach to parsimony. Perhaps counter-intuitively, a conservative extension of a theory extends the language of a theory without extending the theorems that can be derived from the original (unextended) theory. Michael Dummett is sometimes credited with originating the idea of a conservative extension (by Neil Tennant, for example), and he wrote in his *Frege: Philosophy of Mathematics* that:

“The notion of a conservative extension makes sense only if the theory to be extended is formulated in a language more restricted than that of the extended theory.” (p. 297)

**I**t sounds puzzling at first, but it shouldn’t surprise us. Quine noted that the more we conserve on the elements of our theory, the larger the apparatus of derivation must become, and vice versa: there is an inverse relationship between the two.

**T**he short **Wikipedia article on conservative extensions** observes as follows:

“a conservative extension of a consistent theory is consistent. Hence, conservative extensions do not bear the risk of introducing new inconsistencies. This can also be seen as a methodology for writing and structuring large theories: start with a theory, *T*_{0}, that is known (or assumed) to be consistent, and successively build conservative extensions *T*_{1}, *T*_{2}, … of it.”
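Dummett’s point can be put precisely. Writing L for the language of the original theory T, and L′ for the richer language of the extension T′, a standard formulation runs:

```latex
T' \text{ is a conservative extension of } T
\quad\Longleftrightarrow\quad
\text{for every sentence } \varphi \text{ of } L:\;
T' \vdash \varphi \;\Longrightarrow\; T \vdash \varphi .
```

The new vocabulary of L′ may make proofs shorter or more perspicuous, but it yields no new theorems statable in the old language, which is just the sense in which such an extension is parsimonious.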

**T**hus the methodologically parsimonious tool of conservative extensions has implications for theoretical work overall. One can imagine an entire theoretical discipline given over to gradual and incremental extensions of an originally modest theory, which implies a model of theoretical thought innocent of Kuhnian paradigm shifts and revolutions in knowledge.

**O**f course, all parsimonious theories must rely upon some original bold insight upon which later conservative extensions can build. Cantor’s informal insights into set theory and transfinite numbers begat such an embarrassment of riches that almost all subsequent mathematical thought has consisted of various restrictions and codifications of Cantor’s intuitive and informal ideas. There is scarcely anything in the history of science to compare with it, except for Darwin’s conceptual breakthrough to natural selection. But mathematical theory and biological theory are developed so differently that the resemblance of these two insights, followed by decades (and, I would guess, coming centuries) of elaboration and qualification, is easier to miss than to see.

**T**here is an implicit recognition in the conceptualization of parsimonious formulations of the power of more sweeping formulations, the proactive character of conceptual innovation that goes beyond accepted formulations, even while there is at the same time an implicit recognition of the danger and perhaps also irresponsibility of such theorizing.

**S**ome time ago I noted in **Exaptation of the Law** that the law has an intrinsic bias in favor of the past that makes it a conservative force in society. With the law, this influence is concrete and immediate, often deciding the fates of individuals. It strikes me now that the minimalism and parsimony of much (if not most) formal thought is intrinsically conservative in an intellectual sense, and constitutes the ontological equivalent of bias in favor of the past. This intrinsic bias of formal thought is likely to be less concrete and immediate than that of the law, but no less pervasive.

**. . . . .**

**. . . . .**

**. . . . .**