Monday


academic silos

Contemporary scholarship is a hierarchy of specializations, though the hierarchy is not always obvious. A typical idiom employed today to describe specialization is that of “academic silos,” as though each academic specialization were tightly circumscribed by very high walls rarely breached. The idiom of “silos” points not to a hierarchy, but to a landscape of separate but equal and utterly isolated disciplines.

There are several taxonomies of the academic disciplines that arrange them hierarchically, as in the unity of science movement of twentieth-century logical empiricism, which sought to reduce all the sciences to physics. This isn’t what I have in mind when I say that contemporary scholarship is a hierarchy of specializations. I am, rather, recurring to an idea that appeared in the work of Alfred North Whitehead, and which was picked up by Buckminster Fuller (of geodesic dome fame).

We can think of Buckminster Fuller as a proto-techno-philosopher, and we know that techno-philosophy disdains the philosophical tradition and seeks to treat traditional philosophical problems de novo from the perspective of science and technology. In one of the rare instances of borrowing by techno-philosophy from traditional philosophy, Buckminster Fuller quoted Alfred North Whitehead, who was a bona fide philosopher.

In R. Buckminster Fuller’s Utopia or Oblivion: The Prospects for Humanity (Chapter 2, “The Music of the New Life”), Fuller identifies what he called “Whitehead’s dilemma,” following an observation made by Alfred North Whitehead about the accelerating pace of specialization in higher education. The dilemma is that the best and brightest students are channeled into specialized studies, and these studies become more specialized as they progress. But there remains a need for a coordinating function among specializations, though all the best minds have already been channeled into specialist studies. That means that the dullest minds that remain are left with the task of the overall coordination of specialist disciplines.

Whitehead formulated his dilemma in terms of academic specialization and governmental coordination of society, but there are “big picture” coordinating functions that have nothing to do with government. This is most especially evident in what I have called the epistemic overview effect, which is concerned with the “big picture” of knowledge. A comprehensive understanding of some specialist discipline, no less than an overall coordinating function, demands a grasp of the big picture. But the rise of specialization militates against comprehensive understanding in its widest aspect — where it is most needed.

The role of specialization in contemporary scholarship is ironic in historical perspective. It is ironic because, today, more students than ever before in history throng more institutions of higher learning than ever before existed in history, and the traditional ideal of higher education was that of creating a well-rounded individual who had some degree of sophistication across a spectrum of scholarship. Specialization was once the function of the trades (something Whitehead also noted, cf. his Adventures of Ideas, Part One, Chap. 4 “Aspects of Freedom,” Section V; Whitehead’s distinction in this section between profession and craft is instructive). An individual either went on to further academic education in order to understand the wider relationship between the sciences and the humanities, or one entered a trade school or an apprenticeship program and specialized in learning some skill or craft.

It would not be going too far to say that, if you want to understand the big picture, the last person you should talk to is a specialist. A specialist may simply refuse to talk about the big picture, or, if they do talk about the big picture, it will be through the lens of their specialty, which can be highly misleading as regards the big picture. Thus the big picture may be characterized as a body of knowledge in which there are no specialists and no experts. Can there be experts in comprehensive knowledge? Is it possible to specialize in the big picture? How would one go about specializing in the big picture, such that one’s neglect of the detail pursued by the special sciences would be a principled neglect, adopted in order to focus on the details and patterns that emerge exclusively from an attempt to grasp the whole of the world, or the whole of the universe? This kind of specialization sounds counter-intuitive, but we must make the effort to formulate such a conception.

While the idea is prima facie counter-intuitive, we should immediately recognize that specializing in the big picture is nothing other than a particular application of the general principle of scientific abstraction. Science constructs abstract, simplified, idealized models of the world in order to understand processes and phenomena that, in the fullness of their presence, are far too complex to be known in their totality. Recall that Wordsworth said we murder to dissect. The world in itself is intractable; the world of science is made tractable through abstraction; abstraction is the price that we pay for understanding. We must learn to pay that price willingly, if not cheerfully.

In asking if it is possible to specialize in the big picture, I am also in a sense asking if it is possible to think rigorously about the big picture, thus we can also ask: Is it possible to think about the big picture with a clear scholarly conscience? Big picture thinking often invites careless and sloppy formulations, and this has brought big picture thinking into disrepute among those who wish to distance themselves from careless and sloppy thinking — which is to say, almost all contemporary philosophers, who take a special pride in the rigor of their formulations. And this is a rigor largely due to the kind of specialization that Whitehead identified.

There is a kind of implicit contrition in the contemporary philosophical passion for rigor and precision, since much traditional philosophy now seems painfully muddled and unclear, and this has been a stick that scientists have used to beat philosophers, and with which they have justified their fashionable anti-philosophy. But scientists, too, are guilty on this account. And whereas philosophers committed their sins against rigor in the past, scientists are committing their sins against rigor in the present. The pronouncements of scientists upon extra-scientific questions are an admirable attempt at comprehensive understanding, but they almost always take place in a context that ignores the history of the question addressed.

History, I think, is essential to the big picture. Indeed, I will go further and suggest that the emerging discipline of Big History offers the possibility of a discipline that can specialize in the big picture with the hope of rigorous formulations. We have need of such a discipline. At the 2014 IBHA conference, David Christian in his keynote address (titled “Can I study everything, please?”) expressed quite vividly the origins of his own interest in what would become big history in an experience of disappointment. He talked about going to school as a child with an initial sense of excitement that his big questions would be answered, only to find that his big questions were shunted aside.

How do you talk about the whole of time without inviting scholarly ridicule by those who have spent their entire careers seeking to accurately portray some small fragment of the whole? Is it possible to speak at this level of generality and still be “right” in any relevant sense? Big History seeks to be just such a discipline, and the big historians have done a remarkable job in integrating the results of the special sciences into a coherent whole. I have made the claim that big history need not reject any more specialized scholarship, but provides the overall framework within which all specialized studies can find a place. Big history is a “big tent” in which all scholarship can find a place.

Big History is now an established (albeit youthful) branch of historiography, but it could be more than this. Where Big History remains weak is in its theoretical formulations, and this is not a surprise. While Big Historians seek to portray philosophy and the humanities as part of the sweeping story of human civilization (itself a part of a larger cosmic history), they do not draw upon philosophy and the humanities in the same way that they draw upon the special sciences. There is, as yet, no philosophy of big history, and that means that there is, as yet, no systematic attempt to clarify and to extend the conceptions upon which Big History relies in its formulations. This remains to be done.

. . . . .

Monday


Studies in Formalism:

The Synoptic Perspective in Formal Thought


In my previous two posts on the overview effect — The Epistemic Overview Effect and The Overview Effect as Perspective Taking — I discussed how we can take insights gained from the “overview effect” — what astronauts and cosmonauts have experienced as a result of seeing our planet whole — and apply them to other areas of human experience and knowledge. Here I would like to try to apply these insights to formal thought.

The overview effect is, above all, a visceral experience, something that the individual feels as much as understands, and you may wonder how I could possibly find a connection between a visceral experience and formal thinking. Part of the problem here is simply the impression that formal thought is distant from human concerns, that it is cold, impersonal, unfeeling, and, in a sense, inhuman. Yet for logicians and mathematicians (and now, increasingly, also for computer scientists) formal thought is a passionate, living, and intimate engagement with the world. Truly enough, this is not an engagement with the concrete artifacts of the world, which are all essentially accidents due to historical contingency, but rather an engagement with the principles implicit in all things. Aristotle, ironically, formalized the idea of formal thought being bereft of human feeling when he asserted that mathematics has no ethos. I don’t agree, and I have discussed this Aristotelian perspective in The Ethos of Formal Thought.

And yet. Although Aristotle, as the father of logic, had more to do with the origins of formal thought than any other human being who has ever lived, the Aristotelian denial of an ethos to formal thought does not do justice to our intuitive and even visceral engagement with formal ideas. To get a sense of this visceral and intuitive engagement with the formal, let us consider G. H. Hardy.

Late in his career, the great mathematician G. H. Hardy struggled to characterize what he called mathematically significant ideas, which is to say, what makes an idea significant in formal thought. Hardy insisted that “real” mathematics, which he distinguished from “trivial” mathematics, and which presumably engages with mathematically significant ideas, involves:

“…a very high degree of unexpectedness, combined with inevitability and economy.”

G. H. Hardy, A Mathematician’s Apology, section 15

Hardy’s appeal to parsimony is unsurprising, yet the striking contrast of the unexpected and the inevitable is almost paradoxical. One is not surprised to hear an exposition of mathematics in deterministic terms, which is what inevitability is, but if mathematics is the working out of rigid formal rules of procedure (i.e., a mechanistic procedure), how could any part of it be unexpected? And yet it is. Moreover, as Hardy suggested, “deep” mathematical ideas (which we will explore below) are unexpected even when they appear inevitable and economical.

It would not be going too far to suggest that Hardy was trying his best to characterize mathematical beauty, or elegance, which is something that is uppermost in the mind of the pure mathematician. Well, uppermost at least in the minds of some pure mathematicians; Gödel, who was as pure a formal thinker as ever lived, said that “…after all, what interests the mathematician, in addition to drawing consequences from these assumptions, is what can be carried out” (Collected Works Volume III, Unpublished essays and lectures, Oxford, 1995, p. 377), which is an essentially pragmatic point of view, in which formal elegance would seem to play little part. Mathematical elegance has never been given a satisfactory formulation, and it is an irony of intellectual history that the most formal of disciplines relies crucially on an informal intuition of formal elegance. Beauty, it is often said, is in the eye of the beholder. Is this true also for mathematical beauty? Yes and no.
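Hardy did not leave the matter entirely abstract: his own exhibits in the Apology are Euclid’s proof of the infinitude of the primes and Pythagoras’s proof of the irrationality of the square root of two. The first is worth setting down, since it displays unexpectedness, inevitability, and economy all at once (the wording of this sketch is mine, not Hardy’s):

```latex
% Euclid's theorem, one of Hardy's own examples of a mathematically
% significant idea: no finite list exhausts the primes.
Suppose $p_1, p_2, \ldots, p_n$ were all of the primes, and let
$N = p_1 p_2 \cdots p_n + 1$. Each $p_i$ divides $N - 1$, so none
divides $N$; but $N > 1$ has at least one prime divisor, which is
therefore a prime missing from the list. Hence the primes cannot be
finite in number. \qed
```

The conclusion is forced upon us (inevitability), the device of multiplying all the primes together and adding one comes from nowhere (unexpectedness), and the whole argument fits into three sentences (economy).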

If a mathematically significant idea is inevitable, we should be able to anticipate it; if unexpected, it ought to elude all inevitability, since the inevitable ought to be predictable. One way to try to capture the ineffable sense of mathematical elegance is through paradox — here, the paradox of the inevitable and the unexpected — in a way not unlike the attempt to seek enlightenment through the contemplation of Zen koans. But Hardy was no mystic, so he persisted in his attempted explication of mathematically significant ideas in terms of discursive thought:

“There are two things at any rate which seem essential, a certain generality and a certain depth; but neither quality is easy to define at all precisely.”

G. H. Hardy, A Mathematician’s Apology, section 15

Although Hardy repeatedly expressed his dissatisfaction with his formulations of generality and depth, he nevertheless persisted in his attempts to clarify them. Of generality Hardy wrote:

“The idea should be one which is a constituent in many mathematical constructs, which is used in the proof of theorems of many different kinds. The theorem should be one which, even if stated originally (like Pythagoras’s theorem) in a quite special form, is capable of considerable extension and is typical of a whole class of theorems of its kind. The relations revealed by the proof should be such as to connect many different mathematical ideas.” (section 15)

And of mathematical depth Hardy hazarded:

“It seems that mathematical ideas are arranged somehow in strata, the ideas in each stratum being linked by a complex of relations both among themselves and with those above and below. The lower the stratum, the deeper (and in general more difficult) the idea.” (section 17)

This would account for the special difficulty of foundational ideas, of which the most renowned example would be the idea of sets, though there are other candidates to be found in other foundational efforts, as in category theory or reverse mathematics.

Hardy’s metaphor of mathematical depth suggests foundations, or a foundational approach to mathematical ideas (an approach which reached its zenith in the early twentieth century in the tripartite struggle over the foundations of mathematics, but which has since fallen into disfavor). Depth, however, suggests the antithesis of a synoptic overview, although both the foundational perspective and the overview perspective seek overarching unification, one from the bottom up, the other from the top down. These perspectives — bottom up and top down — are significant, as I have used these motifs elsewhere as an intuitive shorthand for constructive and non-constructive perspectives respectively.

Few mathematicians in Hardy’s time had a principled commitment to constructive methods, and most employed non-constructive methods with little hesitation. Intuitionism was only then getting its start, and the full flowering of constructivistic schools of thought would come later. It could be argued that there is a “constructive” sense to Zermelo’s axiomatization of set theory, but this is of the variety that Gödel called “strictly nominalistic constructivism.” Here is Gödel’s attempt to draw a distinction between nominalistic constructivism and the sense of constructivism that has since overtaken the nominalistic conception:

…the term “constructivistic” in this paper is used for a strictly nominalistic kind of constructivism, such as that embodied in Russell’s “no class theory.” Its meaning, therefore, is very different from that used in current discussions on the foundations of mathematics, i.e., from both “intuitionistically admissible” and “constructive” in the sense of the Hilbert School. Both these schools base their constructions on a mathematical intuition whose avoidance is exactly one of the principal aims of Russell’s constructivism… What, in Russell’s own opinion, can be obtained by his constructivism (which might better be called fictionalism) is the system of finite orders of the ramified hierarchy without the axiom of infinity for individuals…

Kurt Gödel, Kurt Gödel: Collected Works: Volume II: Publications 1938-1974, Oxford et al.: Oxford University Press, 1990, “Russell’s Mathematical Logic (1944),” footnote, Author’s addition of 1964, expanded in 1972, p. 119

This profound ambiguity in the meaning of “constructivism” is a conceptual opportunity — there is more that lurks in this idea of formal construction than is apparent prima facie. That what Gödel calls a “strictly nominalistic kind of constructivism” coincides with what we would today call non-constructive thought demonstrates the very different conceptions of what it has meant to mathematicians (and other formal thinkers) to “construct” an object.

Kant, who is often called a proto-constructivist (though I have identified non-constructive elements in Kant’s thought in Kantian Non-Constructivism), does not invoke construction when he discusses formal entities, but instead formulates his thoughts in terms of exhibition. I think that this is an important difference (indeed, I have a long unfinished manuscript devoted to this). What Kant called “exhibition” later philosophers of mathematics came to call “surveyability” (“Übersichtlichkeit“). This latter term is especially due to Wittgenstein; Wittgenstein also uses “perspicuous” (“Übersehbar“). Notice that in both of the terms Wittgenstein employs for surveyability — Übersichtlichkeit and Übersehbar — we have “Über,” usually (or often, at least) translated as “over.” Sometimes “Über” is translated as “super,” as when Nietzsche’s Übermensch is translated as “superman” (although the term has also been translated as “over-man,” inter alia).

There is a difference between Kantian exhibition and Wittgensteinian surveyability — I don’t mean to conflate the two, or to suggest that Wittgenstein was simply following Kant, which he was not — but for the moment I want to focus on what they have in common, and what they have in common is the attempt to see matters whole, i.e., to take in the object of one’s thought in a single glance. The actual practice of seeing matters whole is a bit more complicated, especially since in English we commonly use “see” to mean “understand,” and there is a whole range of visual metaphors for understanding.

The range of possible meanings of “seeing” accounts for a great many of the different formulations of constructivism, which may distinguish between what is actually constructible in fact, that which it is feasible to construct (this use of “feasible” reminds me a bit of “not too large” in set theories based on the “limitation of size” principle, which is a purely conventional limitation), and that which can be constructed in theory, even if not constructible in fact, or if not feasible to construct. What is “surveyable” depends on our conception of what we can see — what might be called the modalities of seeing, or the modalities of surveyability.

There is an interesting paper on surveyability by Edwin Coleman, “The surveyability of long proofs,” (available in Foundations of Science, 14, 1-2, 2009) which I recommend to the reader. I’m not going to discuss the central themes of Coleman’s paper (this would take me too far afield), but I will quote a passage:

“…the problem is with memory: ‘our undertaking’ will only be knowledge if all of it is present before the mind’s eye together, which any reliance on memory prevents. It is certainly true that many long proofs don’t satisfy Descartes-surveyability — nobody can sweep through the calculations in the four color theorem in the requisite way. Nor can anyone do it with either of the proofs of the Enormous Theorem or Fermat’s Last Theorem. In fact most proofs in real mathematics fail this test. If real proofs require this Cartesian gaze, then long proofs are not real proofs.”

Edwin Coleman, “The surveyability of long proofs,” in Foundations of Science, 14 (1-2), 2009

For Coleman, the received conception of surveyability is deceptive, but what I wanted to get across by quoting his paper was the connection to the Cartesian tradition, and to the role of memory in seeing matters whole.

The embodied facts of seeing, when seeing is understood as the biophysical process of perception, were a concern to Bertrand Russell in the construction of a mathematical logic adequate to the deduction of mathematics. In the Introduction to Principia Mathematica Russell wrote:

“The terseness of the symbolism enables a whole proposition to be represented to the eyesight as one whole, or at most in two or three parts divided where the natural breaks, represented in the symbolism, occur. This is a humble property, but is in fact very important in connection with the advantages enumerated under the heading.”

Bertrand Russell and Alfred North Whitehead, Principia Mathematica, Volume I, second edition, Cambridge: Cambridge University Press, 1963, p. 2

…and Russell elaborated…

“The adaptation of the rules of the symbolism to the processes of deduction aids the intuition in regions too abstract for the imagination readily to present to the mind the true relation between the ideas employed. For various collocations of symbols become familiar as representing important collocations of ideas; and in turn the possible relations — according to the rules of the symbolism — between these collocations of symbols become familiar, and these further collocations represent still more complicated relations between the abstract ideas. And thus the mind is finally led to construct trains of reasoning in regions of thought in which the imagination would be entirely unable to sustain itself without symbolic help.”

Loc. cit.

Thinking is difficult, and symbolization allows us to — mechanically — extend thinking into regions where thinking alone, without symbolic aid, would not be capable of penetrating. But that doesn’t mean symbolic thinking is easy. Elsewhere Russell develops another rationalization for symbolization:

“The fact is that symbolism is useful because it makes things difficult. (This is not true of the advanced parts of mathematics, but only of the beginnings.) What we wish to know is, what can be deduced from what. Now, in the beginnings, everything is self-evident; and it is very hard to see whether one self-evident proposition follows from another or not. Obviousness is always the enemy to correctness. Hence we invent some new and difficult symbolism, in which nothing seems obvious. Then we set up certain rules for operating on the symbols, and the whole thing becomes mechanical. In this way we find out what must be taken as premiss and what can be demonstrated or defined.”

Bertrand Russell, Mysticism and Logic, “Mathematics and the Metaphysicians”
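Russell’s observation that, once rules for operating on the symbols are set up, “the whole thing becomes mechanical” can be exhibited in a few lines of code. The sketch below is only an illustration of my own (the toy premises and the single rule of modus ponens are assumptions for the example, drawn from nothing in Russell): given the rule, deduction proceeds by blind pattern-matching on symbols.

```python
# A minimal sketch of "mechanical" symbol manipulation: forward chaining
# with a single rule (modus ponens) over propositional atoms. The toy
# premises below are illustrative only.

def forward_chain(facts, implications):
    """Close a set of atomic facts under modus ponens.

    facts: set of atoms known to be true, e.g. {"p"}
    implications: list of (antecedent, consequent) pairs, e.g. [("p", "q")]
    """
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedent, consequent in implications:
            if antecedent in derived and consequent not in derived:
                derived.add(consequent)  # one application of modus ponens
                changed = True
    return derived

if __name__ == "__main__":
    premises = {"p"}
    rules = [("p", "q"), ("q", "r"), ("s", "t")]
    print(forward_chain(premises, rules))  # {'p', 'q', 'r'}; 't' is never reached
```

Nothing in the loop understands the propositions; it merely matches and rewrites symbols, which is precisely the labor-saving mechanism Russell describes.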

Russell formulated the difficulty of thinking even more strongly in a later passage:

“There is a good deal of importance to philosophy in the theory of symbolism, a good deal more than at one time I thought. I think the importance is almost entirely negative, i.e., the importance lies in the fact that unless you are fairly self conscious about symbols, unless you are fairly aware of the relation of the symbol to what it symbolizes, you will find yourself attributing to the thing properties which only belong to the symbol. That, of course, is especially likely in very abstract studies such as philosophical logic, because the subject-matter that you are supposed to be thinking of is so exceedingly difficult and elusive that any person who has ever tried to think about it knows you do not think about it except perhaps once in six months for half a minute. The rest of the time you think about the symbols, because they are tangible, but the thing you are supposed to be thinking about is fearfully difficult and one does not often manage to think about it. The really good philosopher is the one who does once in six months think about it for a minute. Bad philosophers never do.”

Bertrand Russell, Logic and Knowledge: Essays 1901-1950, 1956, “The Philosophy of Logical Atomism,” I. “Facts and Propositions,” p. 185

Alfred North Whitehead, coauthor of Principia Mathematica, made a similar point more colorfully than Russell, in a passage which I recently quoted in The Algorithmization of the World:

“It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them. Operations of thought are like cavalry charges in a battle: they are strictly limited in number, they require fresh horses, and must only be made at decisive moments.”

Alfred North Whitehead, An Introduction to Mathematics, London: Williams & Norgate, Chap. V, pp. 45-46

This quote from Whitehead follows a lesser known passage from the same work:

“…by the aid of symbolism, we can make transitions in reasoning almost mechanically by the eye, which otherwise would call into play the higher faculties of the brain.”

Alfred North Whitehead, An Introduction to Mathematics, London: Williams & Norgate, Chap. V, p. 45

In other words, the brain is saved effort by mechanizing as much reason as can be mechanized. Of course, not everyone is capable of these kinds of mechanical deductions made possible by mathematical logic, which is especially difficult.

Recent scholarship has only served to underscore the difficulty of thinking, and the steps we must take to facilitate our thinking. Daniel Kahneman in particular has focused on the physiological effort involved in thinking. In his book Thinking, Fast and Slow, Kahneman distinguishes between two cognitive systems, which he calls System 1 and System 2, which are, respectively, that faculty of the mind that responds immediately, on an intuitive or instinctual level, and that faculty of the mind that proceeds more methodically, according to rules:

Why call them System 1 and System 2 rather than the more descriptive “automatic system” and “effortful system”? The reason is simple: “Automatic system” takes longer to say than “System 1” and therefore takes more space in your working memory. This matters, because anything that occupies your working memory reduces your ability to think. You should treat “System 1” and “System 2” as nicknames, like Bob and Joe, identifying characters that you will get to know over the course of this book. The fictitious systems make it easier for me to think about judgment and choice, and will make it easier for you to understand what I say.

Daniel Kahneman, Thinking, Fast and Slow, New York: Farrar, Straus, and Giroux, Part I, Chap. 1

While such considerations do not appear to have explicitly concerned Russell, Russell’s concern for economy of thought implicitly embraced this idea. One’s ability to think must be facilitated in any way possible, including the shortening of names — in purely formal thought, symbolization dispenses with names altogether and contents itself with symbols only, usually introduced as letters.

Kahneman’s book, by the way, is a wonderful review of cognitive biases that cites many of the obvious but often unnoticed ways in which thought requires effort. For example, if you are walking along with someone and you ask them in mid-stride to solve a difficult mathematical problem — or, for that matter, any problem that taxes working memory — your companion is likely to come to a stop when focusing mental effort on the work of solving the problem. Probably everyone has had experiences like this, but Kahneman develops the consequences systematically, with very interesting results (creating what is now known as behavioral economics in the process).

Formal thought is among the most difficult forms of cognition ever pursued by human beings. How can we facilitate our ability to think within a framework of thought that taxes us so profoundly? It is the overview provided by the non-constructive perspective that makes it possible to take a “big picture” view of formal knowledge and formal thought, which is usually understood to be a matter entirely immersed in theoretical details and the minutiae of deduction and derivation. We must take an “Über” perspective in order to see formal thought whole. We have become accustomed to thinking of “surveyability” in constructivist terms, but it is just as valid in non-constructivist terms.

In P or not-P (as well as in subsequent posts concerned with constructivism: What is the relationship between constructive and non-constructive mathematics?, Intuitively Clear Slippery Concepts, and Kantian Non-constructivism) I surveyed constructivist and non-constructivist views of tertium non datur — the central logical principle at issue in the conflict between constructivism and non-constructivism — and suggested that constructivists and non-constructivists need each other, as each represents a distinct point of view on formal thought.

In P or not-P, cited above, I quoted French mathematician Alain Connes:

“Constructivism may be compared to mountain climbers who proudly scale a peak with their bare hands, and formalists to climbers who permit themselves the luxury of hiring a helicopter to fly over the summit …the uncountable axiom of choice gives an aerial view of mathematical reality — inevitably, therefore, a simplified view.”

Conversations on Mind, Matter, and Mathematics, Changeux and Connes, Princeton, 1995, pp. 42-43

In several posts I have taken up this theme of Alain Connes and have spoken of the non-constructive perspective (which Connes calls “formalist”) as being top-down and the constructive perspective as being bottom-up. In particular, in The Epistemic Overview Effect I argued that in addition to the possibility of a spatial overview (the world entire seen from space) and a temporal overview (history seen entire, after the manner of Big History), there is an epistemic overview, that is to say, an overview of knowledge, perhaps even the totality of knowledge.

If we think of those mathematical equations that have become sufficiently famous to be known outside mathematics and physics — as well as some that should be more widely known, but are not, like the generalized continuum hypothesis and the expression of epsilon zero — they all have not only the terseness that Russell noted in the quotes above in regard to symbolism, but also many of the qualities that G. H. Hardy ascribed to what he called mathematically significant ideas.
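For readers who have not met them, the two less famous examples just mentioned each fit on a single line. These are the standard formulations:

```latex
% The generalized continuum hypothesis (GCH), for every ordinal \alpha:
2^{\aleph_\alpha} = \aleph_{\alpha + 1}

% Epsilon zero, the least ordinal fixed point of exponentiation to base omega:
\varepsilon_0 = \sup\{\omega,\ \omega^{\omega},\ \omega^{\omega^{\omega}},\ \ldots\},
\qquad \omega^{\varepsilon_0} = \varepsilon_0
```

Each is a single line of symbols with an entire foundational program standing behind it.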

It is primarily non-constructive modes of thought that give us a formal overview and which make it possible for us to engage with mathematically significant ideas, and, more generally, with formally significant ideas.

. . . . .

Note added Monday 26 October 2015: I have written more about the above in Brief Addendum on the Overview Effect in Formal Thought.

. . . . .

Formal thought begins with Greek mathematics and Aristotle’s logic.

. . . . .

Studies in Formalism

1. The Ethos of Formal Thought

2. Epistemic Hubris

3. Parsimonious Formulations

4. Foucault’s Formalism

5. Cartesian Formalism

6. Doing Justice to Our Intuitions: A 10 Step Method

7. The Church-Turing Thesis and the Asymmetry of Intuition

8. Unpacking an Einstein Aphorism

9. The Overview Effect in Formal Thought

10. Einstein on Geometrical intuition

11. Methodological and Ontological Parsimony (in preparation)

12. The Spirit of Formalism (in preparation)

. . . . .

Wittgenstein’s Tractatus Logico-Philosophicus was part of an early twentieth century efflorescence of formal thinking focused on logic and mathematics.

. . . . .

The Retrodiction Wall

23 October 2013

Wednesday



Prediction in Science

One of the distinguishing features of science as a system of thought is that it makes testable predictions. The fact that scientific predictions are testable suggests a methodology of testing, and we call the scientific methodology of testing experiment. Hypothesis formation, prediction, experimentation, and resultant modification of the hypothesis (confirmation, disconfirmation, or revision) are all essential elements of the scientific method, which constitutes an escalating spiral of knowledge as the scientific method systematically exposes predictions to experiment and modifies its hypotheses in the light of experimental results, which leads in turn to new predictions.

The escalating spiral of knowledge that science cultivates naturally pushes that knowledge into the future. Sometimes scientific prediction is even formulated in reference to “new facts” or “temporal asymmetries” in order to emphasize that predictions refer to future events that have not yet occurred. In constructing an experiment, we create a new set of facts in the world, and then interpret these facts in the light of our hypothesis. It is this testing of hypotheses by experiment that establishes the concrete relationship of science to the world, and this is also a source of limitation, for experiments are typically designed in order to focus on a single variable and to that end an attempt is made to control for the other variables. (A system of thought that is not limited by the world is not science.)

Alfred North Whitehead captured this artificial feature of scientific experimentation in a clever line that points to the difference between scientific predictions and predictions of a more general character:

“…experiment is nothing else than a mode of cooking the facts for the sake of exemplifying the law. Unfortunately the facts of history, even those of private individual history, are on too large a scale. They surge forward beyond control.”

Alfred North Whitehead, Adventures of Ideas, New York: The Free Press, 1967, Chapter VI, “Foresight,” p. 88

There are limits to prediction, and not only those pointed out by Whitehead. The limits to prediction have been called the prediction wall. Beyond the prediction wall we cannot penetrate.


The Prediction Wall

John Smart has formulated the idea of a prediction wall in his essay, “Considering the Singularity,” as follows:

With increasing anxiety, many of our best thinkers have seen a looming “Prediction Wall” emerge in recent decades. There is a growing inability of human minds to credibly imagine our onrushing future, a future that must apparently include greater-than-human technological sophistication and intelligence. At the same time, we now admit to living in a present populated by growing numbers of interconnected technological systems that no one human being understands. We have awakened to find ourselves in a world of complex and yet amazingly stable technological systems, erected like vast beehives, systems tended to by large swarms of only partially aware human beings, each of which has only a very limited conceptualization of the new technological environment that we have constructed.

Business leaders face the prediction wall acutely in technologically dependent fields (and what enterprise isn’t technologically dependent these days?), where the ten-year business plans of the 1950’s have been replaced with ten-week (quarterly) plans of the 2000’s, and where planning beyond two years in some fields may often be unwise speculation. But perhaps most astonishingly, we are coming to realize that even our traditional seers, the authors of speculative fiction, have failed us in recent decades. In “Science Fiction Without the Future,” 2001, Judith Berman notes that the vast majority of current efforts in this genre have abandoned both foresighted technological critique and any realistic attempt to portray the hyper-accelerated technological world of fifty years hence. It’s as if many of our best minds are giving up and turning to nostalgia as they see the wall of their own conceptualizing limitations rising before them.

Considering the Singularity: A Coming World of Autonomous Intelligence (A.I.) © 2003 by John Smart (This article may be reproduced for noncommercial purposes if it is copied in its entirety, including this notice.)

I would like to suggest that there are at least two prediction walls: synchronic and diachronic. The prediction wall formulated above by John Smart is a diachronic prediction wall: it is the onward-rushing pace of events, one following the other, that eventually defeats our ability to see any recognizable order or structure of the future. The kind of prediction wall to which Whitehead alludes is a synchronic prediction wall, in which it is the outward eddies of events in the complexity of the world’s interactions that make it impossible for us to give a complete account of the consequences of any one action. (Cf. Axes of Historiography)


Retrodiction and the Historical Sciences

Science does not live by prediction alone. While some philosophers of science have questioned the scientificity of the historical sciences because they could not make testable (and therefore falsifiable) predictions about the future, it is now widely recognized that the historical sciences don’t make predictions, but they do make retrodictions. A retrodiction is a prediction about the past.

The Oxford Dictionary of Philosophy by Simon Blackburn (p. 330) defines retrodiction thusly:

retrodiction The hypothesis that some event happened in the past, as opposed to the prediction that an event will happen in the future. A successful retrodiction could confirm a theory as much as a successful prediction.

I previously wrote about retrodiction in the historical sciences in Of What Use is Philosophy of History in Our Time?, The Puppet Always Wins, and Futurism without predictions.

As with predictions, there is also a limit to retrodiction, and this is the retrodiction wall. Beyond the retrodiction wall we cannot penetrate.

I haven’t been thinking about this idea for long enough to fully understand the ramifications of a retrodiction wall, so I’m not yet clear about whether we can distinguish between diachronic retrodiction and synchronic retrodiction. Or, rather, it would be better to say that the distinction can certainly be made, but that I cannot think of good contrasting examples of the two at the present time.


Effective History

We can define a span of accessible history that extends from the retrodiction wall in the past to the prediction wall in the future as what I will call effective history (by analogy with effective computability). Effective history can be defined in a way that is closely parallel to effectively computable functions, because all of effective history can be “reached” from the present by means of finite, recursive historical methods of inquiry.
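The analogy can be made a little more explicit. The following notation is only a sketch of my own, a first pass at a definition rather than anything settled: if K(t) is the body of knowledge available at time t, then effective history is the interval between the two walls, each located as a function of that knowledge.

```latex
% A sketch of effective history, by analogy with effective computability.
% K(t) is the body of knowledge available at time t; w_r and w_p locate
% the retrodiction wall and the prediction wall relative to that knowledge.
\mathrm{EH}(t) \;=\; \bigl[\, w_r(K(t)),\; w_p(K(t)) \,\bigr],
\qquad w_r(K(t)) \;\le\; t \;\le\; w_p(K(t))
```

On this sketch the walls are functions of our knowledge rather than fixed dates.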

Effective history is not fixed for all time, but expands and contracts as a function of our knowledge. At present, the retrodiction wall is the Big Bang singularity. If anything preceded the Big Bang singularity we are unable to observe it, because the Big Bang itself effectively obliterates any observable signs of any events prior to itself. (Testable theories have been proposed that suggest the possibility of some observable remnant of events prior to the Big Bang, as in conformal cyclic cosmology, but this must at present be regarded as only an early attempt at such a theory.)

Prior to the advent of scientific historiography as we know it today, the retrodiction wall was fixed at the beginning of the historical period narrowly construed as written history, and at times the retrodiction wall has been quite close to the present. When history experiences one of its periodic dark ages that cuts it off from its historical past, little or nothing may be known of a past that was once familiar to everyone in a given society.

The emergence of agrarian-ecclesiastical civilization effectively obliterated human history before itself, in a manner parallel to the Big Bang. We know that there were caves that prehistorical peoples visited generation after generation for time out of mind, over tens of thousands of years — much longer than the entire history of agrarian-ecclesiastical civilization, and yet all of this was forgotten as though it had never happened. This long period of prehistory was entirely lost to human memory, and was not recovered again until scientific historiography discovered it through scientific method and empirical evidence, and not through the preservation of human memory, from which prehistory had been eradicated. And this did not occur until after agrarian-ecclesiastical civilization had lapsed and entirely given way to industrial-technological civilization.

We cannot define the limits of the prediction wall as readily as we can define the limits of the retrodiction wall. Predicting the future in terms of overall history has been more problematic than retrodicting the past, and equally subject to ideological and eschatological distortion. The advent of modern science compartmentalized scientific predictions and made them accurate and dependable — but at the cost of largely severing them from overall history, i.e., human history and the events that shape our lives in meaningful ways. We can make predictions about the carbon cycle and plate tectonics, and we are working hard to be able to make accurate predictions about weather and climate, but, for the most part, our accurate predictions about the future dispositions of the continents do not shape our lives in the near- to mid-term future.

I have previously quoted a famous line from Einstein: “As far as the laws of mathematics refer to reality, they are not certain; and as far as they are certain, they do not refer to reality.” We might paraphrase this Einstein line in regard to the relation of mathematics to the world, and say that as far as scientific laws of nature predict events, these events are irrelevant to human history, and in so far as predicted events are relevant to human beings, scientific laws of nature cannot predict them.


Singularities Past and Future

As the term “singularity” is presently employed — as in the technological singularity — the recognition of a retrodiction wall in the past complementary to the prediction wall in the future provides a literal connection between the historiographical use of “singularity” and the use of the term “singularity” in cosmology and astrophysics.

Theorists of the singularity hypothesis place a “singularity” in the future which constitutes an absolute prediction wall beyond which history is so transformed that nothing beyond it is recognizable to us. This future singularity is not the singularity of astrophysics.

If we recognize the actual Big Bang singularity in the past as the retrodiction wall for cosmology — and hence, by extension, for Big History — then an actual singularity of astrophysics is also at the same time an historical singularity.

. . . . .

I have continued my thoughts on the retrodiction wall in Addendum on the Retrodiction Wall.

. . . . .

Some Thoughts for Labor Day

6 September 2010

Monday


Labor Day is one of the major holiday weekends on the US calendar. Memorial Day and Labor Day bookend the summer at its beginning and ending, and are widely celebrated here as families pile into their cars for the long weekend and often go to a lake, a river, the ocean, or a mountain to get away from the infrastructure of industrialized civilization. While Memorial Day remains a time when people are widely aware of the origin and meaning of the holiday, there is very little reflection on the meaning of Labor Day, except for some politicized speeches at picnics sponsored by unions.

The contemporary political left in the US makes much of the violence of US labor history, and its virtual elision from textbooks, if not also public consciousness. In this, they are right. Very few people who do not already have an ideological inclination to study labor history have any idea of the hard-fought and hard-won battles over labor and the rights of labor in US history. It is a fascinating story that I will not attempt to recount or even summarize here.

These hard-fought and hard-won battles, unlike many historical disputes that have dropped out of public consciousness, had real results. While it is by no means easy to be a laborer, the working class (which, as I have observed elsewhere, is almost everyone today) has protections that have been written into law. Social security, medicare, unemployment insurance, worker’s compensation insurance, and a variety of protections, as well as legal mechanisms for workers to seek a redress of grievances against employers, have changed the way companies do business, and have changed the lives of workers. We are so familiar with these programs that we scarcely think of them any more except as budgetary items, but they represent the legal institutionalization of the labor movement. Certainly worker benefits are much more generous in Europe, but Europe has a very different culture than the US, and the programs that function in Europe probably would not work very well on this side of the Atlantic.

It is deceptive to speak of “workers” because, as I noted above, almost everyone today is a worker. A year and a half ago in Responses to Recession, Left and Right I quoted the otherwise notoriously muddle-headed metaphysical philosopher Alfred North Whitehead on this point, since he seemed to understand with preternatural clarity the condition of man in industrialized society:

“In any large city, almost everyone is an employee, employing his working hours in exact ways predetermined by others. Even his manners may be prescribed. So far as sheer individual freedom is concerned, there was more diffused freedom in the City of London in the year 1633, when Charles the First was King, than there is to-day in any industrial city of the world.”

Alfred North Whitehead, “The Study of the Past – its Uses and its Dangers,” Harvard Business Review (Volume XI, number 4, 1933)

To this I added the following:

Quite true. Even the heads of multinational corporations, CEOs, CFOs, CIOs, and all the other chiefs and captains of industry are in fact employees of a corporation who are paid a salary for their efforts. While such individuals are celebrated both in the business press and in popular culture as larger than life (figures such as the Hedge Fund Manager were, until recently, nearly legendary characters of contemporary life), they are still employees, and they serve at the pleasure of shareholders and boards of directors. Moreover, they can be dismissed, and of late they actually have been dismissed.

Despite the near universal condition of being a worker today in the industrialized world, people by and large do not want to identify themselves as workers, much less as laborers. This image problem may ultimately be more relevant to the future of the labor movement than declining union membership and declining pay and benefits relative to economic growth.

While in no sense a scientific poll, some time ago I formulated a detailed survey that I posted on Craigslist in Portland, which included questions about how people self-identify in relation to their work, and even from my modest effort it was very obvious that people did not want to call themselves “laborers” or even members of the working class. Needless to say, equally small numbers were willing to identify themselves as proletarians, although I have noticed over the past couple of years the ironic use of “prole” (an abbreviation for “proletariat”) when people want to both acknowledge their role as workers and to criticize the ways in which their work lives are compromised by stultifying policies and procedures.

While people hesitate to self-identify as workers and laborers, they know that they must labor to live in an industrialized economy. And, more often than not, they know that they must accept compromises in terms of what work they can do that will support them. Matthew B. Crawford in his Shop Class as Soulcraft: An Inquiry into the Value of Work (which I discussed in Back to shop class!) returns to this theme several times:

“As against the confused hopes for the transformation of work along emancipatory lines, we are recalled to the basic antagonism of economic life: work is toilsome and necessarily serves someone else’s interests.”

Matthew B. Crawford, Shop Class as Soulcraft: An Inquiry into the Value of Work, p. 52

Crawford also emphasized the trades as a dependable way to make a living, and this is a brave thing to say today, since the relentless advice given to young people is to get a degree (really, a credential) and move as rapidly as possible into the professional classes. But the problem (earning a living), and the practical response to it (getting work that actually pays the bills), are neither of them new. I was interested to find this in Samuel Butler’s The Way of All Flesh:

“Professions are all very well for those who have connection and interest as well as capital, but otherwise they are white elephants. How many men do not you and I know who have talent, assiduity, excellent good sense, straightforwardness, every quality in fact which should command success, and who yet go on from year to year waiting and hoping against hope for the work which never comes?”

This is a theme that runs throughout the book, and we find it again near the end:

“Being a gentleman is a luxury which I cannot afford, therefore I do not want it. Let me go back to my shop again, and do things for people which they want done and will pay me for doing for them. They know what they want and what is good for them better than I can tell them.”

I couldn’t find the exact quote that I wanted from Butler’s hilarious and still all-too-true novel, but these two quotes give a flavor of his opinion on the matter.

Doing things for people which they want done and will pay others for doing for them is often a wake-up call as to what the world really values. Some of these economic valuations border on the absurd, and certainly don’t seem like an optimal use of labor. Recently on the BBC there was a very interesting story about the high unemployment rate in Latvia, Fears over Latvia brain drain as economy struggles. Damien McGuinness of BBC News in Riga interviewed one Martins Neimanis, a civil engineer who was hoping to get work, “picking strawberries or packing vegetables in England.”

Sometimes, despite the effort we put into improving ourselves, we are more valued for the physical labor that we are capable of doing than for anything else. Here’s a personal example: recently I bought firewood from a neighbor who has a small photography business. The photography business in Portland is not doing very well at present, but he can sell firewood. So I bought firewood from him, because this is what I needed. I haven’t ever patronized his photography business. And I am in the same boat myself, so to speak. I had to self-publish my books because no one wants them, and I publish everything on this forum for free and count myself lucky if anyone bothers to read it. No one is going to pay me for it. But I can get paid for manual labor, so I do what I can get paid to do, though I would much rather be paid to think and to write, and I am probably much better at thinking and writing than I am at manual labor, but, by and large, others are not willing to pay for it.

Matthew B. Crawford in his above-mentioned Shop Class as Soulcraft: An Inquiry into the Value of Work made great sport of recent talk about the “creative class,” and deservedly so. This is a target that invites an arrow. Our above reflections make plain one of the weaknesses of trying to focus an economy on the creative class: mostly people are unwilling to pay for what the creative class produces, whereas they are willing to pay for shelter, food, and clothing. This can be a great disappointment to those of us who would like to be paid for our creative efforts, but it is a fact of life that cannot be wished away.

. . . . .

The Doors of Intellection

23 February 2009


“If the doors of perception were cleansed every thing would appear to man as it is, infinite. For man has closed himself up, till he sees all things through narrow chinks of his cavern.”

William Blake, The Marriage of Heaven and Hell


William Blake, The Marriage of Heaven and Hell, Plate 14

William Blake is the source of the famous phrase “the doors of perception” — it is from his wonderfully Faustian The Marriage of Heaven and Hell — though most readers will connect the phrase to the novel by Huxley. Huxley, of course, took his title from Blake. Since in yesterday’s Algorithms of Ecstasy I mentioned religious experiences induced, at least in part, by chemical means, it is appropriate to mention in this connection Huxley’s drug-addled visions of “Love as the primary and fundamental cosmic fact” (in a letter to Humphry Osmond). Huxley represents a dead end in the scientific pursuit of the absolute; Huxley represents science that has lost its objectivity and has ceased to operate according to methodological naturalism, and therefore ceased to be science.

What Blake has observed about the doors of perception holds good also for the doors of intellection: if the doors of intellection were cleansed every thing would appear to man as it is, infinite. For man has closed himself up, till he understands all things through narrow chinks of his cavern.

I have just finished listening to Richard Dawkins’ The God Delusion on CD. From a philosophical perspective, the book is highly problematic, but Dawkins is quite explicitly coming from a scientific perspective, and he knows it. He often has difficulty concealing his contempt for philosophical argumentation, and this makes the book problematic as he takes on many paradigmatically philosophical questions and does so from a scientific standpoint. Thus much of the book is at cross purposes with its intended subject matter.


I mention Dawkins not to criticize him, however — many others have already done so, and I genuinely enjoyed the book — but to take up some of the themes with which he closes. The book has a fine peroration, and I was pleased with this as many authors on such subjects don’t bother to craft a good closing so that the book just lurches to a halt without any sense of climax and resolution. Dawkins delivers nicely on this score.

In the last few pages Dawkins introduces a number of notions, among them the Middle World and the motif of our sense perception being like the slit of light admitted by a burka. The Middle World is the familiar world of things not too large (like the objects of cosmology), not too small (like the objects of quantum mechanics), and not too fast (like objects approaching the speed of light), so that they obey the familiar laws that seem to hold for the greater part of things of our experience.

The final sentences of Dawkins’ book thus proclaim, “Could we, by training and practice, tear off our black burka, and achieve some kind of intuitive — as well as just mathematical — understanding of the very small, the very large, and the very fast? I genuinely don’t know the answer, but I am thrilled to be alive at a time when humanity is pushing against the limits of understanding. Even better, we may eventually discover that there are no limits.”

Dawkins is here suggesting the equivalent for physical science of opening the doors of intellection, as well as the doors of perception. Prior to this passage the motif of the burka is used to emphasize the narrow range of phenomena to which our senses give us access, and he rightly generalizes to the possibility of understanding that similarly throws off the limits imposed by the anthropocentric origins of our ideas about the world.

However, there are limits. We already know this, and we can prove it. Dawkins’ mention of “just mathematical” in this passage — as though to say “mere mathematical” — provides a clue as to the false hopefulness of this otherwise inspiring conclusion. There is a highly developed branch of mathematical logic that deals explicitly and systematically with what are called the “limitative theorems,” i.e., theorems of formal logic that demonstrate the logical limits of our thinking. The formal treatment of these limits is daunting, but it has been well put (intuitively so, no less) by Wittgenstein: “The limits of my language mean the limits of my world.” (Tractatus Logico-Philosophicus, 5.6)

The limitative theorems are especially interesting in relation to Dawkins’ book, given an amusing formulation once given to the most famous of them, viz. Gödel’s incompleteness theorems:

“Suppose we loosely define a religion as any discipline whose foundations rest on an element of faith, irrespective of any element of reason which may be present. Quantum mechanics for example would be a religion under this definition. But mathematics would hold the unique position of being the only branch of theology possessing a rigorous demonstration of the fact that it should be so classified.”

F. De Sua, “Consistency and completeness — a résumé,” American Mathematical Monthly, 63 (1956)
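Behind De Sua’s joke stands a precise result. In its standard modern form (Rosser’s strengthening of Gödel’s original theorem, with consistency alone as the hypothesis) it can be stated in a single line:

```latex
% Gödel's first incompleteness theorem, in Rosser's strengthened form.
% Q is Robinson arithmetic; "recursively axiomatizable" means the set of
% axioms is effectively decidable.
\text{If } T \supseteq \mathsf{Q} \text{ is consistent and recursively
axiomatizable, then there is a sentence } G_T \text{ of the language of }
T \text{ such that } T \nvdash G_T \text{ and } T \nvdash \lnot G_T.
```

This is the sense in which the limits of the doors of intellection are not merely a failure of training or imagination: they can be proved from within the formal systems themselves.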

The kind of intuitive mastery of concepts that originate in mathematics and in recent advanced work in the physical sciences is difficult, but we have ample evidence that it is achievable. The concept of zero was once advanced mathematics; today it is elementary, and most people experience little difficulty in mastering the concept. The truth table method for semantic decision procedures was advanced logic when Wittgenstein wrote his Tractatus; now it is familiar fare for elementary logic textbooks.
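The truth table method is, indeed, entirely mechanical, which is part of why it could migrate from advanced logic into the elementary textbooks. A minimal sketch makes the point (the formula tested here, Peirce’s law, is my own choice of example):

```python
# The truth table method as a mechanical semantic decision procedure:
# enumerate every assignment of truth values and evaluate the formula.
from itertools import product

def is_tautology(variables, formula):
    """formula maps a dict of variable names to truth values onto a bool."""
    return all(
        formula(dict(zip(variables, values)))
        for values in product([True, False], repeat=len(variables))
    )

# Material implication, and Peirce's law ((p -> q) -> p) -> p,
# a classical tautology.
def implies(a, b):
    return (not a) or b

peirce = lambda v: implies(implies(implies(v["p"], v["q"]), v["p"]), v["p"])

print(is_tautology(["p", "q"], peirce))  # True: holds under all four assignments
```

Peirce’s law is also a reminder that the mechanical method settles only classical validity; the intuitionist, who rejects the law, would not accept the enumeration of truth values as a proof.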

We create intuitions through the labor of the mind, and once an adequate intuition is obtained we can let the labor fall away as though it had never existed, like the scaffolding that held Michelangelo up to the underside of the ceiling of the Sistine Chapel. That is to say, it is possible to transcend the process by which we arrive at our ideas — the ontogeny of cognition, as it were — and to grasp the idea beyond its own history. When we do this in the social context of the idea (as with the examples above of the concept of zero and the truth table method) we even transcend the phylogeny of cognition. To invoke Wittgenstein again:

My propositions are elucidatory in this way: he who understands me finally recognizes them as senseless, when he has climbed out through them, on them, over them. (He must so to speak throw away the ladder, after he has climbed up on it.)

Wittgenstein, Tractatus Logico-Philosophicus, 6.54

This motif of throwing away the ladder after climbing up it has become widely quoted in philosophical literature. Wittgenstein gives the paradoxical tension between intuitive concept and formal surrogate in its strongest form. Even when the tension does not appear in this radically paradoxical form, it is still present, informing our conceptions of logic, mathematics, and science. Sometimes the intuitive conception comes first, and we struggle to formalize it; sometimes the formal concept comes first, and we struggle to find an intuition adequate to it. In either case, it is a philosophical labor of no mean order (and one rarely appreciated for what it is).

This notion was also given a surprising and equally paradoxical (i.e., counter-intuitive) formulation by Alfred North Whitehead:

“It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them. Operations of thought are like cavalry charges in a battle — they are strictly limited in number, they require fresh horses, and must only be made at decisive moments.”

Alfred North Whitehead, An Introduction to Mathematics, Chapter 5

. . . . .