24 November 2013
The world, we are learning every day, is a very large place. Or perhaps I should say that the universe is a very large place. It is also a very complex and strange place. J. B. S. Haldane famously said: “I have no doubt that in reality the future will be vastly more surprising than anything I can imagine. Now my own suspicion is that the Universe is not only queerer than we suppose, but queerer than we can suppose.” (Possible Worlds and Other Papers, 1927, p. 286) In other words, human beings, no matter how valiantly they attempt to understand the universe, may not be cognitively equipped to understand it; our minds may not be the kind of minds that can understand the kind of place that the world is.
This idea of our inability to understand the world in which we find ourselves (an admirably humble Copernican insight that we might call metaphysical modesty, and which stands in contrast to epistemic hubris) has received many glosses since Haldane’s time. Most notable (notable, at least, from my perspective) are the evolutionary gloss, the quantum physics gloss, and the philosophical gloss. I will consider each of these in turn.
In terms of evolution, there is no reason to suppose that descent with modification in a context of a struggle for vital resources on the plains of Africa (the environment of evolutionary adaptedness, or EEA) is going to produce minds capable of understanding higher dimensional spatial manifolds or quantum physics at microscopic scales that differ radically from the macroscopic scales of ordinary human perception. Alvin Plantinga (about whom I wrote some time ago in A Note on Plantinga, inter alia) has used this argument for theological purposes. However, there is no intrinsic reason that a mind born in the mud and the muck cannot raise itself above its origins and come to contemplate the world in Copernican terms. The evolutionary argument cuts both ways, and since we have ourselves as the evidence of an organism that can raise itself from strictly survival behavior to forms of thought that have nothing to do with survival, from the perspective of the weak anthropic principle this is proof enough that natural selection can result in such a mind.
In terms of quantum theory, we are all familiar with famous quotes from its leading lights as to the essential incomprehensibility of the theory. For example, Richard Feynman said, “I think I can safely say that nobody understands quantum mechanics.” However, I have observed (in The limits of my language are the limits of my world and elsewhere) that recent research is making strides in working around the epistemic limitations of quantum theory, revealing its uncertainties to be not absolute and categorical, but rather subject to careful and painstaking narrowing that renders the uncertainty a little less uncertain. I anticipate two developments that will emerge from the further elaboration of quantum theory: 1) the finding of ways to gradually and incrementally chip away at an absolutist conception of uncertainty (as just mentioned), and 2) the formulation of more adequate intuitions to make quantum theory more palatable to the human mind.
In terms of philosophy, Colin McGinn’s book Problems in Philosophy: The Limits of Inquiry formulates a position which he calls Transcendental Naturalism:
“Philosophy is an attempt to get outside the constitutive structure of our minds. Reality itself is everywhere flatly natural, but because of our cognitive limits we are unable to make good on this general ontological principle. Our epistemic architecture obstructs knowledge of the real nature of the objective world. I shall call this thesis transcendental naturalism, TN for short.” (pp. 2-3)
I have previously written about McGinn’s work in Transcendental Non-Naturalism and Naturalism and Object Oriented Ontology, inter alia. On McGinn’s view, our ability to get outside the constitutive structure of our minds is severely limited at best, and so our ability to understand the world as it is in itself is likewise limited.
While our cognitive abilities are admittedly limited (for all the reasons discussed above, as well as other reasons not discussed), these limits are not absolute, but rather admit of revision. McGinn’s position as stated above implies a false dichotomy between staying within the constitutive structure of our minds and getting outside it. This is a classic case of facing the sheer cliff of Mount Improbable: while it is impossible to get outside our cognitive architecture in one fell swoop, we can little by little transgress the boundaries of our cognitive architecture, each time ever-so-slightly expanding our capacities. Incrementally, over time, we improve our ability to stand outside those limits that once marked the boundaries of our cognitive architecture. Thus, in an ironic twist of intellectual history, the evolutionary argument, rather than demonstrating metaphysical modesty, proves to be the key to limiting the limitations on the human mind.
All of this is related to one of the central problems in the philosophy of science of our time — the whole Kuhnian legacy that is the framework of so much contemporary philosophy of science. Copernican revelations and revolutions, which formerly disturbed our anthropocentric bias every few hundred years, now occur with alarming frequency. The difference today, of course, is that science is much more advanced than it was with past Copernican revelations and revolutions — it has much more advanced instrumentation available to it (as a result of the STEM cycle), and we have a much better idea of what to look for in the cosmos.
It was a shock to almost everyone to have it scientifically demonstrated that the universe is not static and eternal, but dynamic and changing. It was a shock when quantum theory demonstrated the world to be fundamentally indeterministic. This is by now a very familiar narrative. In fact, it is so familiar that it has been expropriated (dare I say exapted?) by the obscurantists and irrationalists of our time, who point at continual changes in scientific knowledge as “proof” that science doesn’t give us any “truth” because it changes. The assumption here is that change in scientific knowledge demonstrates the weakness of science; in fact, change in scientific knowledge is the strength of science. Scientific knowledge is what I have elsewhere called an intelligent institution in so far as it is institutionalized knowledge, but that institution is formulated with an internal mechanism that facilitates the re-shaping of the institution itself over time. That mechanism is the scientific method.
It is important to see that the overturning of familiar conceptions of the world — some of which are ancient and some of which are not — is not arbitrary. Less comprehensive conceptions are being replaced by more comprehensive conceptions. The more comprehensive our perspective on the world, the greater the number of anomalies we must face, and the greater the number of anomalies we face, the more likely it is that our theories will be overturned, or at least partially falsified. But to ask whether theory change is rational or irrational is to frame the wrong debate: what ought to concern us is how well our theories account for the ever-larger world revealed to us through the ever-more comprehensive methods of science, not how well our theories conform to our presuppositions about rationality. As we get the science right, reason will follow, shaping new intuitions and formulating new theories.
Our ability to discover (and to understand) ever greater scales of the universe is contingent upon our growing intellectual capabilities, which are cumulative. Just as in the STEM cycle science begets technologies that beget industries that create better scientific instruments, so too on a purely intellectual level the intellectual capabilities of one generation are the formative context of the intellectual capabilities of the next generation, which allows the later generation to exceed the earlier generation. Concepts are the tools of the mind, and we use our familiar concepts to create the next generation of concepts, which latter are both more refined and more powerful than the former, in the same way as we use each generation of tools to build the next generation of tools, which makes each generation of tools better than the last (except for computer software — but I expect that this degradation in the practicability of computer software is simply the software equivalent of planned obsolescence).
Our current generation of tools — both conceptual and technological — is daily revealing to us the inadequacy of our past conceptions of the world. Several recent discoveries have in particular called into question our understanding of the size of the world, especially in so far as the world is defined in terms of its origins in the Big Bang. For example, the discovery of hyperclusters suggests physical structures of the universe that are larger than the upper limit set for physical structures by contemporary cosmological theories (cf. ‘Hyperclusters’ of the Universe — “Something is Behaving Very Strangely”).
In a similar vein, writing of the recent discovery of a “large quasar group” (LQG) as much as four billion light years across, the article The Largest Discovered Structure in the Universe Contradicts Big-Bang Theory Cosmology states:
“This LQG challenges the Cosmological Principle, the assumption that the universe, when viewed at a sufficiently large scale, looks the same no matter where you are observing it from. The modern theory of cosmology is based on the work of Albert Einstein, and depends on the assumption of the Cosmological Principle. The principle is assumed, but has never been demonstrated observationally ‘beyond reasonable doubt’.”
This formulation gets the order of theory and observation wrong. The cosmological principle is not a principle that can be proved or disproved by evidence; it is a theoretical idea that is used to give structure and meaning to observations, to organize observations into a theoretical whole. The cosmological principle belongs to theoretical cosmology; recent discoveries such as hyperclusters and large quasar groups belong to observational cosmology. While the two — i.e., theoretical and observational — cannot be separated in the practice of science, they are not identical: theoretical methods are distinct from observational methods.
Thus the cosmological principle may be helpful or unhelpful in organizing our knowledge of the cosmos, but it is not the kind of thing that can be falsified in the same way that, for example, a theory of planetary formation can be falsified. That is to say, the cosmological principle might be opposed to (falsified by) another principle that negates the cosmological principle, but this anti-cosmological principle will similarly belong to an order not falsifiable by empirical observations.
The discoveries of hyperclusters and LQGs are particularly problematic because they question some of the fundamental assumptions and conclusions of Big Bang cosmology, which is, essentially, the only large scale cosmological model in contemporary science. Big Bang cosmology is the explanation for the structure of the cosmos that was formulated in response to the discovery of the red shift, which implies that, on the largest observable scales, the universe is expanding. It is important to add the qualification, “on the largest observable scales” because stars within a given galaxy are circulating around the galaxy, and while a given star may be moving away from another given star, it is also likely to be moving toward yet some other star. And, even at larger scales, not all galaxies are receding from each other. It is fairly well known that galaxies collide and commingle; the Helmi stream of our own Milky Way is the result of a long past galactic collision, and at some far time in the future the Milky Way itself will merge with the larger Andromeda galaxy, and be absorbed by it.
Cosmology during the period of the big bang theory (a period in which we still find ourselves today) is in some respects like biology before Darwin. Almost all biology before Darwin was essentially theological, but no one had a better idea so biology had to wait to become a science capable of methodologically naturalistic formulations until after Darwin. The big bang theory was, on the contrary, proposed as a scientific theory (not merely bequeathed to us by pre-scientific tradition), and most scientists working within the big bang tradition have formulated the Big Bang in meticulously naturalistic terms. Nevertheless, once the steady state theory was overthrown, no one really had an alternative to the big bang theory, so all cosmology centered on the Big Bang for lack of imagination of alternatives — but also due to the limitations of the scientific instruments, which at the time of the triumph of the big bang theory were much more modest than they are today.
As disconcerting as it was to discover that the cosmos does not embody an eternal order, that it is expanding and has a history of violent episodes, and that it is much larger than an “island universe” comprising only the Milky Way, the observations that we need to explain today are no less disconcerting in their own way.
Here is how Leonard Susskind describes our contemporary observations of the expanding universe:
“In every direction that we look, galaxies are passing the point at which they are moving away from us faster than light can travel. Each of us is surrounded by a cosmic horizon — a sphere where things are receding with the speed of light — and no signal can reach us from beyond that horizon. When a star passes the point of no return, it is gone forever. Far out, at about fifteen billion light years, our cosmic horizon is swallowing galaxies, stars, and probably even life. It is as if we all live in our own private inside-out black hole.”
Leonard Susskind, The Black Hole War: My Battle with Stephen Hawking to Make the World Safe for Quantum Mechanics, New York, Boston, and London: Little, Brown and Company, 2008, pp. 437-438
This observation has not yet been sufficiently appreciated. What lies beyond Susskind’s cosmic horizon is unobservable, just as anything that disappears beyond the event horizon of a black hole has become unobservable, and that places such matters beyond the reach of science understood in a narrow sense of observation. But as I have noted above, in the practice of science we cannot disentangle the theoretical and the observational, though the two are not the same. While our observations come to an end at the cosmic horizon, our principles encounter no such boundary. Thus it is that we naturally extrapolate our science beyond the boundaries of observation, but if we get our scientific principles wrong, anything beyond the boundary of observation will be wrong as well, and will be incapable of correction by observation.
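Susskind’s figure of “about fifteen billion light years” corresponds roughly to what cosmologists call the Hubble radius, the distance at which the recession velocity implied by the Hubble law reaches the speed of light. A back-of-the-envelope sketch, using an illustrative value of the Hubble constant (H0 ≈ 70 km/s/Mpc; the precise value remains a matter of ongoing measurement, so the result is only approximate):

```python
# The Hubble law: recession velocity v = H0 * d.
C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # Hubble constant, km/s per megaparsec (illustrative value)
LY_PER_MPC = 3.2616e6  # light-years per megaparsec

def recession_velocity(distance_mpc):
    """Recession velocity (km/s) of a galaxy at the given distance in Mpc."""
    return H0 * distance_mpc

# Distance at which recession reaches the speed of light: the Hubble radius,
# a rough stand-in for the cosmic horizon Susskind describes.
hubble_radius_mpc = C_KM_S / H0
hubble_radius_bly = hubble_radius_mpc * LY_PER_MPC / 1e9

print(round(hubble_radius_bly, 1))  # ≈ 14.0 billion light-years
```

That the simple linear law lands within a billion light-years of Susskind’s figure is what makes the “inside-out black hole” image more than a metaphor.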
Science in the narrow sense must, then, come to an end with observation. But this does not satisfy the mind. One response is to deny the mind its satisfaction and refuse to pass beyond observation. Another response is to fill the void with mythology and fiction. Yet another response is to take up the principles on their own merits and consider them in the light of reason. This response is the philosophical response, and we see that it is a rational response to the world that is continuous with science even when it passes beyond science.
. . . . .
. . . . .
. . . . .
4 April 2010
Theses on the Occasion of Easter Sunday
A Theoretical Account of Ritualized Celebration
1. Distinctions must be made among myth, ritual, and celebration.
1.1 Myth, ritual, and celebration, though distinct, are logically related.
1.11 A celebration is an occasion for a ritual,
A ritual is an opportunity to participate in a myth,
Therefore a celebration is an occasion in which to participate in a myth.
Q. E. D.
1.2 Rituals of burial are older than agricultural rituals of life-death-rebirth, even extending to other species (Neanderthals, now extinct), and may well be the origin of life-death-rebirth rituals.
2. Among the most ancient of continually observed celebrations is that of the life-death-resurrection of the Year-God, eniautos daimon.
2.1 The celebration of the life and re-birth of the Year-God, eniautos daimon, is at least as old as settled, agrarian society.
2.11 Agriculture and the written word together produced settled, historical civilization.
2.12 Settled historical civilization has defined the norm of human history from the Neolithic Agricultural Revolution to the Industrial Revolution.
2.2 Settled agrarian society coincides with the origins of civilization.
2.21 The celebration of the life and re-birth of the Year-God, eniautos daimon, coincides with the origins of civilization.
3. Once the breakthrough to history has been made by way of the written word, it is the nature of historical civilization to commemorate nodal points of the year, whether with solemnities, festivities, or both.
3.1 Historical civilization is predicated upon the presumed value of the history that brings that civilization into being.
3.2 Nodal points of the year celebrated in historical civilizations are observed as validation of their historicity through the performance of rituals.
3.21 In a temperate climate, summer and winter solstices and spring and fall equinoxes are nodal points of the year.
4. The mythology of a settled, agricultural civilization emerges from the same regularities of nature observed of necessity by agricultural peoples.
4.1 The calendrics of celebration emerges from the regularities of nature observed of necessity by agricultural peoples.
4.11 The mythology and calendar of celebrations of settled, agricultural civilizations come from the same source.
4.2 Celebrations are the points of contact between the two parallel orders of mythological events and the actual historical calendar.
4.21 A civilization validates its mythology by establishing a correspondence between mythological events and historical events.
4.3 Enacting a myth in historical time, by way of a ritual, makes that myth literal truth by giving to it a concrete embodiment.
5. Easter is one species of the genus of life-death-rebirth celebrations.
5.1 The particular features of the Easter celebration are the result of the adaptive radiation of the dialectic of sacrifice and resurrection.
6. Easter is that species of life-death-rebirth celebration specific to Christendom.
6.1 Christendom was primarily a construction of the Middle Ages.
6.11 Christendom was the legacy of Medieval Europe that disappeared with the passing of medieval civilization but which, like the Roman Empire before it, is with us still and remains a touchstone of the Western tradition.
6.12 Christendom was an empire of the spirit and of the cross as Rome was an empire of the will and of the sword.
6.13 To have once been Roman, and then to have been Christian, and finally to have become modern, is the condition of Western man.
6.2 Easter is a celebration specific to civilization, the civilized celebration par excellence.
7. The naturalistic civilization that is emerging from the consequences of the Industrial Revolution represents the first significant change in the social structure of human society since the Neolithic Agricultural Revolution.
7.1 With the advent of the Industrial Revolution, we have ceased to be an agrarian society.
7.2 For the first time in history, life-death-rebirth celebrations face interpretation by a non-agrarian society.
7.21 Not only should we not hesitate to find new meanings in ancient celebrations, of which Easter represents the latest adaptive radiation, but rather we should actively and consciously seek meanings relevant to the present in such celebrations.
8. As the painters of the renaissance drew upon the traditions of pagan antiquity already at that time a thousand years out of date, so too the post-Christian Western civilization will draw upon the traditions of Christendom for hundreds if not thousands of years to come.
8.1 The period of time that we have come to call the modern era — roughly the past five hundred years — has not been the modern era proper but rather has been the period of the formation of modernity.
8.2 Modernity simpliciter has but begun.
. . . . .
. . . . .
. . . . .
30 December 2009
The principle of parsimony — also called Ockham’s Razor, after William of Ockham who gave the principle some of its most compelling formulations — is among the most venerable of principles in human thought. This must be one of the few medieval philosophical principles that remains a staple of thought even today. Few but Thomists would be able to make it through the bulk of the Summa Theologiae, and far fewer still would find much in it with which they could agree, but there are parts of Ockham that can be read like a contemporary. Ockham is among the very few medieval writers of whom we can say this, and he shares this status with the canonical texts of classical antiquity.
Not long ago in A Formulation of Naturalism I cited Hallett’s book Cantorian Set Theory and Limitation of Size for its treatment of what Hallett called Cantor’s finitism, i.e., Cantor’s treatment of transfinite numbers as being like finite numbers as far as this methodological analogy could be made to hold. I suggested that a similar approach could be used to characterize naturalism in terms of materialism: we can treat naturalism like materialism by way of a methodological analogy that is employed as long as it can be made to work. Later, in Two Thoughts on Naturalism, I suggested that naturalism could be given a similar treatment vis-à-vis mechanism.
Such formulations — the transfinite in terms of the finite, and naturalism in terms of materialism or mechanism — are minimalist formulations. Conceptual minimalism makes the most it can from the fewest resources. This is an application of the principle of parsimony. It has always been felt most strongly in the formal sciences. Axiomatization is an expression of this spirit of minimalism. Łukasiewicz’s reduction of the propositional calculus to a single axiom is another expression of the spirit of parsimony, as is the Polish notation for symbolic logic that he first formulated. The later Russell’s formulations in terms of “minimum vocabularies” must be counted a part of the same tradition, though Russell’s parsimonious roots go much deeper and are perhaps expressed most profoundly in his theory of descriptions.
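Łukasiewicz’s parenthesis-free notation is itself a small exercise in parsimony: placing each connective before its arguments makes brackets unnecessary. A minimal sketch of an evaluator for it (the connective letters C, N, K, A follow the standard Polish conventions; the function itself is only illustrative):

```python
from itertools import product

def eval_polish(formula, valuation):
    """Evaluate a formula in Łukasiewicz's parenthesis-free (Polish) notation:
    C = implication, N = negation, K = conjunction, A = disjunction;
    lowercase letters are propositional variables looked up in `valuation`."""
    def parse(tokens):
        head = next(tokens)
        if head == "N":
            return not parse(tokens)
        if head in "CKA":
            a, b = parse(tokens), parse(tokens)  # both subformulas must be consumed
            if head == "C":
                return (not a) or b
            if head == "K":
                return a and b
            return a or b
        return valuation[head]  # a propositional variable
    return parse(iter(formula))

# CCpqCNqNp, i.e. (p -> q) -> (not-q -> not-p): contraposition, a tautology
print(all(eval_polish("CCpqCNqNp", dict(zip("pq", v)))
          for v in product([False, True], repeat=2)))  # True
```

The absence of any bracket-matching in the parser is the point: prefix position alone determines the scope of every connective.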
The language of parsimony is pervasive throughout contemporary logic and mathematics, as when one says that Von Neumann–Bernays–Gödel set theory (NBG) is a conservative extension of Zermelo–Fraenkel set theory (ZF). There is even a conservativity theorem of mathematical logic that formalizes this approach to parsimony. Perhaps counter-intuitively, a conservative extension of a theory extends the language of the theory without adding any new theorems statable in the language of the original (unextended) theory. Michael Dummett is sometimes credited with originating the idea of a conservative extension (by Neil Tennant, for example), and he wrote in his Frege: Philosophy of Mathematics that:
“The notion of a conservative extension makes sense only if the theory to be extended is formulated in a language more restricted than that of the extended theory.” (p. 297)
It sounds puzzling at first, but it shouldn’t surprise us. Quine noted that the more we conserve on the elements of our theory, the larger the apparatus of derivation must become, and vice versa: there is an inverse relationship between the two.
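Stated formally (in the standard notation of mathematical logic, not Dummett’s own), where L(T) is the language of the base theory:

```latex
T' \text{ is a conservative extension of } T
\quad\Longleftrightarrow\quad
\text{for every sentence } \varphi \in \mathcal{L}(T):\;
T' \vdash \varphi \;\Longrightarrow\; T \vdash \varphi .
```

The restriction of the quantifier to sentences of L(T) is what resolves the apparent puzzle: the extension may prove many new things, but only about its new vocabulary.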
The short Wikipedia article on conservative extensions observes as follows:
“a conservative extension of a consistent theory is consistent. Hence, conservative extensions do not bear the risk of introducing new inconsistencies. This can also be seen as a methodology for writing and structuring large theories: start with a theory, T0, that is known (or assumed) to be consistent, and successively build conservative extensions T1, T2, … of it.”
Thus the methodologically parsimonious tool of conservative extensions has implications for theoretical work overall. One can imagine an entire theoretical discipline given over to gradual and incremental extensions of an originally modest theory, which implies a model of theoretical thought innocent of Kuhnian paradigm shifts and revolutions in knowledge.
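In propositional logic this methodology can be checked mechanically, since (by completeness) conservativity can be tested semantically: an extension is conservative just in case its models, projected onto the old vocabulary, are exactly the models of the base theory. The tiny theories below are my own illustrative examples, not anything from the literature:

```python
from itertools import product

def models(axioms, symbols):
    """All truth assignments over `symbols` satisfying every axiom."""
    return {vals for vals in product([False, True], repeat=len(symbols))
            if all(ax(dict(zip(symbols, vals))) for ax in axioms)}

def project(axioms, symbols, old_symbols):
    """Models of a theory, projected onto the old vocabulary."""
    idx = [symbols.index(s) for s in old_symbols]
    return {tuple(m[i] for i in idx) for m in models(axioms, symbols)}

# Base theory T0 over {p, q}: the single axiom p -> q
T0 = [lambda v: (not v["p"]) or v["q"]]

# Extension A adds a new symbol r with axiom q -> r: conservative.
TA = T0 + [lambda v: (not v["q"]) or v["r"]]

# Extension B further adds r -> p: NOT conservative, since it forces q -> p,
# a new theorem in the old vocabulary {p, q}.
TB = TA + [lambda v: (not v["r"]) or v["p"]]

base = models(T0, ["p", "q"])
projA = project(TA, ["p", "q", "r"], ["p", "q"])
projB = project(TB, ["p", "q", "r"], ["p", "q"])

print(projA == base)  # True: extension A proves nothing new about p, q
print(projB == base)  # False: extension B shrinks the old theory's models
```

Building T1, T2, … in the Wikipedia passage’s sense amounts to requiring that each projection test of this kind succeed at every step.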
Of course, all parsimonious theories must rely upon some original bold insight upon which later conservative extensions can build. Cantor’s informal insights into set theory and transfinite numbers begat such an embarrassment of riches that almost all subsequent mathematical thought has consisted of various restrictions and codifications of Cantor’s intuitive and informal ideas. There is scarcely anything in the history of science to compare with it, except for Darwin’s conceptual breakthrough to natural selection. But mathematical theory and biological theory are developed so differently that the resemblance of these two insights, each followed by decades (and, I would guess, coming centuries) of elaboration and qualification, is easier to miss than to see.
There is, in the conceptualization of parsimonious formulations, an implicit recognition of the power of more sweeping formulations, of the proactive character of conceptual innovation that goes beyond accepted formulations, even while there is at the same time an implicit recognition of the danger, and perhaps also the irresponsibility, of such theorizing.
Some time ago I noted in Exaptation of the Law that the law has an intrinsic bias in favor of the past that makes it a conservative force in society. With the law, this influence is concrete and immediate, often deciding the fates of individuals. It strikes me now that the minimalism and parsimony of much (if not most) formal thought is intrinsically conservative in an intellectual sense, and constitutes the ontological equivalent of bias in favor of the past. This intrinsic bias of formal thought is likely to be less concrete and immediate than that of the law, but no less pervasive.
. . . . .
. . . . .
. . . . .