Sunday


Easter is, at bottom, a holiday that is about ideas. That is one reason that I am fascinated by Easter, and why over the past few years I’ve written many posts about Easter and the Lenten season, including:

The Meaning of Good Friday

Sabbatum Sanctum

Easter Sunday Reflection

Great Monday

Polysematic Good Friday

Theses on Easter

A Palm Sunday Message

Visualizing Easter

And, most recently…

Palm Sunday and April Fools Day

I have been at pains to point out in earlier posts that spring celebrations of the renewal of life seem to be as old as our species, and with this in mind it sounds more than a little odd that I should say that Easter is about ideas, except that Easter has become about ideas because it has been so repeatedly exapted throughout human history. As ideas are the currency of human interaction within civilization, the exaptation of Easter since the advent of civilization has meant the construction of an ideological exaptation mechanism of sufficient power to displace earlier celebrations, along with their established institutions.

It was necessary to overlay a Christian idea on a Pagan idea, and the Pagan idea was overlain on an even more ancient idea — if we take the stages of savagery, barbarism, and civilization (which I recently discussed in Savagery, Barbarism, and Civilization) as our model for the development of the forms by which we conceptualize life, we can see the Christian idea as an idea of medieval European civilization, the Greco-Roman idea that was exapted by Christianity as an idea of the civilization of classical antiquity, the earlier idea exapted by Greco-Roman civilization as an idea of barbarism, and the earlier idea exapted by the barbaric idea as the savage idea — and now we have made it all the way back to “the savage mind” of Lévi-Strauss.

In Christian civilization (i.e., Western civilization), Christmas and Thanksgiving have become more-or-less easily assimilated to the family gatherings that have become identified with these holidays, but Easter does not involve the kind of travel season that we find at Christmas and Thanksgiving. Perhaps this is because everyone has just had their Spring Break and is not in a position to travel again immediately for a holiday family gathering.

At Thanksgiving there is the preparation and consumption of a large meal, while at Christmas there is the trimming of the tree and gift exchange. In a large family these can be undertakings of significant proportions. While from a devotional standpoint these family-based rituals are not central to the holiday, from a sociological standpoint these features are in fact quite central to the holidays, and if we could quantify the amount of time people spent thinking about, planning, and preparing for the practical consequences of Thanksgiving and Christmas and compare this to the time spent thinking about, planning, and preparing for the devotional significance of these holidays, it would probably be pretty obvious what concerns dominated the holidays.

In such cases as Thanksgiving and Christmas, we could say that holidays become exapted by the infrastructure of celebration. The infrastructure of familial celebration can, in turn, become exapted by the practical demands seemingly imposed by major holidays. In one of my least-read posts, Personal Dystopias, I tried to show how these socio-familial concerns can get out of hand and reduce or entirely eliminate any joy felt in the holiday or celebrated event. I believe that this is more common than is generally recognized.

This is, of course, the Protestant in me speaking: for those of a Protestant temperament, the “real” celebration is rigorously defined in devotional terms, and anything that detracts from the intensity of devotional observations is an impiety and indeed an impurity of the will. But knowing that Easter (like most holidays) has layer upon layer of sedimented meaning, and that the ideational content of devotional observance may well be the most superficial “meaning” of the holiday, compels us to respect the oldest and most continuous meaning of the celebration, which is the celebration itself. This recognition, however, of a continuity to the celebration that transcends the changing meanings that have been associated with the holiday is itself an idea — another perspective that one might bring to the celebration.

The history of Easter is the history of the exaptations of a holiday continuously celebrated since human beings have been celebrating holidays, and as civilization has added to the complexity of the forms by which we conceptualize life, the history of the exaptations of Easter has become a history of the exaptation of ideas.

. . . . .

Happy Easter… whatever it happens to mean to you!

. . . . .


. . . . .

Grand Strategy Annex

. . . . .

Tuesday


Jean Piaget

One of the important ideas from Piaget’s influential conception of cognitive development is that of perspective taking. The ability to coordinate the perspectives of multiple observers of one and the same state of affairs is a cognitive skill that develops with time and practice, and the mastery of perspective taking coincides with cognitive maturity.

From a philosophical standpoint, the problem of perspective taking is closely related to the problem of appearance and reality, since one and the same state of affairs not only appears from different perspectives for different observers, it also appears from different perspectives for one and the same observer at different times. In other words, appearance changes — and presumably reality does not. It is interesting to note that developmental psychologists following Piaget’s lead have in fact conducted tests with children in order to understand at what stage of development they can consistently distinguish between appearance and reality.

Just as perspective taking is a cognitive accomplishment — requiring time, training, and natural development, rather than something that happens suddenly and all at once — so the cognitive maturity of which perspective taking is an accomplishment does not occur all at once. Both maturity and perspective taking continue to develop as the individual develops — and I take it that this development continues beyond childhood proper.

While I find Piaget’s work quite congenial, the developmental psychology of Erik Erikson strikes me as greatly oversimplified, with its predictable crises at each stage of life, and the implicit assumption built in that if you aren’t undergoing some particular crisis that strikes most people at a given period of life, then there is something wrong with you; you ought to be experiencing the right crisis at the right time. That being said, what I find of great value in Erikson’s work is his insistence that development continues throughout the human lifespan, and does not come to a halt after a particular accomplishment of cognitive maturity is achieved.

Piagetian cognitive development in terms of perspective taking can easily be extended throughout the human lifespan (and beyond) by the observation that there are always new perspectives to take. As civilization develops and grows, becoming ever more comprehensive as it does so, the human beings who constitute this civilization are forced to formulate ever more comprehensive conceptions in order to take the measure of the world being progressively revealed to us. Each new idea that takes the measure of the world at a greater order of magnitude presents the possibility of a new perspective on the world, and therefore the possibility of a new achievement in terms of perspective taking.

The perspectives we attain constitute a hierarchy that begins with the first accomplishment of the self-aware mind, which is egocentric thought. Many developmental psychologists have described the egocentric thought patterns of young children, though the word “egocentric” is now widely avoided because of its moralizing connotations. I, however, will retain the term “egocentric,” because it helps to place this stage within a hierarchy of perspective taking.

The egocentric point of departure for human cognition does not necessarily disappear even when it is theoretically surpassed, because we know egocentric thinking so well from the nearly universal phenomenon of human selfishness, which is where the moralizing connotation of “egocentric” no doubt has its origin. An individual may become capable of coordinating multiple perspectives and still value the world exclusively from the perspective of self-interest.

In any case, the purely egocentric thought of early childhood confines the egocentric thinker to a tightly constrained circle defined by one’s personal perspective. While this is a personal perspective, it is also an impersonal perspective in so far as all individuals share this perspective. It is what Francis Bacon called the “idols of the cave,” since every human being “has a cave or den of his own, which refracts and discolours the light of nature.” This has been well described in a passage from F. H. Bradley made famous by T. S. Eliot, because the latter quoted it in a footnote to The Waste Land:

“My external sensations are no less private to myself than are my thoughts or my feelings. In either case my experience falls within my own circle, a circle closed on the outside; and, with all its elements alike, every sphere is opaque to the others which surround it… In brief, regarded as an existence which appears in a soul, the whole world for each is peculiar and private to that soul.”

F. H. Bradley, Appearance and Reality, p. 346, quoted by T. S. Eliot in footnote 48 to The Waste Land, “What the Thunder Said”

I quote this passage here because, like my retention of the term “egocentric,” it can help us to see perspectives in perspective, and it helps us to do so because we can think of expanding and progressively more comprehensive perspectives as concentric circles. The egocentric perspective is located precisely at the center, and the circle described by F. H. Bradley is the circle within which the egocentric perspective prevails.

The next most comprehensive perspective taking beyond the transcendence of the egocentric perspective is the transcendence of the ethnocentric perspective. The ethnocentric perspective corresponds to what Bacon called the “idols of the marketplace,” such that this perspective is, “formed by the intercourse and association of men with each other.” The ethnocentric perspective can also be identified with the sociosphere, which I recently discussed in Eo-, Eso-, Exo-, Astro- as an essentially geocentric conception which, in a Copernican context, should be overcome.

Beyond ethnocentrism and its corresponding sociosphere there is ideocentrism, which Bacon called the “idols of the theater,” and which we can identify with the noösphere. Bacon well described the ideocentric perspective in terms of philosophical systems: “all the received systems are but so many stage-plays, representing worlds of their own creation after an unreal and scenic fashion.” Trans-ethnic communities of ideology and belief, like the world’s major religions and political ideologies, represent the ideocentric perspective.

The transcendence of the ideocentric perspective by way of more comprehensive perspective taking brings us to the anthropocentric perspective, which can be identified with the anthroposphere (still a geocentric and pre-Copernican conception, as with the other -spheres mentioned above). The anthropocentric perspective corresponds to Bacon’s “idols of the tribe,” which Bacon described thus:

“The Idols of the Tribe have their foundation in human nature itself, and in the tribe or race of men. For it is a false assertion that the sense of man is the measure of things. On the contrary, all perceptions as well of the sense as of the mind are according to the measure of the individual and not according to the measure of the universe. And the human understanding is like a false mirror, which, receiving rays irregularly, distorts and discolours the nature of things by mingling its own nature with it.”

Bacon was limited by the cosmology of his time, so that he could not readily identify further idols beyond the anthropocentric idols of the (human) tribe, just as we are limited by the cosmology of our time. Yet because we today have a more comprehensive perspective than Bacon, we can identify a few more stages of more comprehensive perspective taking. Beyond the anthropocentric perspective there is the geocentric perspective, the heliocentric perspective, and even what we could call the galacticentric perspective — as when early twentieth-century cosmologists argued over whether the Milky Way was the only galaxy and constituted an “island universe.” Now we know that there are other galaxies, and we can be said to have transcended the galacticentric perspective.

As I wrote above, as human knowledge has expanded and become more comprehensive, ever more comprehensive perspective taking has come about in order to grasp the concepts employed in expanding human knowledge. There is every reason to believe that this process will be iterated indefinitely into the future, which means that perspective taking also will be indefinitely iterated into the future. (I attempted to make a similar and related point in Gödel’s Lesson for Geopolitics.) Therefore, further levels of cognitive maturity wait for us in the distant future as accomplishments that we cannot yet attain at this time.

This last observation allows me to cite one more relevant developmental psychologist, namely Lev Vygotsky, whose cognitive mediation theory of human development makes use of the concept of a zone of proximal development (ZPD). Human development, according to Vygotsky, takes place within a proximal zone, and not at any discrete point or stage. Within the ZPD, certain accomplishments of cognitive maturity are possible. In the lower ZPD there is the actual zone of development, while in the upper ZPD there lies the potential zone of development, which can be attained through cognitive mediation by the proper prompting of an already accomplished mentor. Beyond the upper ZPD, even if there are tasks yet to be accomplished, they cannot be accomplished within this particular ZPD.

With the development of the whole of human knowledge, we’re on our own. There is no cognitive mediator to help us over the hard parts and assist us in the more comprehensive perspective taking that will mark a new stage of cognitive maturity and possibly also a new zone of proximal development in which new accomplishments will be possible. But this has always been true in the past, and yet we have managed to make these breakthroughs to more comprehensive perspectives of cognitive maturity.

I hope that the reader sees that this is both hopeful and sad. Hopeful because this way of looking at human knowledge suggests indefinite progress. Sad because we will not be around to see the accomplishments of cognitive maturity that lie beyond our present zone of proximal development.

. . . . .


. . . . .

Grand Strategy Annex

. . . . .

Sunday


In what style should we think? It sounds like an odd question. I will attempt to make it sound like a reasonable one.

It would, of course, be preferable (or maybe I should say, “more natural”) to ask, “In what manner should we think?” or simply, “How should we think?” But I have formulated my question as I have in order to refer to Heinrich Hübsch’s essay, “In what style should we build?” (In welchem Style sollen wir bauen? 1828)

Building and thinking are both human activities, and thus both can be assimilated to the formulation of Weyl that I quoted in The Clausewitzean Conception of Civilization:

“The ultimate foundations and the ultimate meaning of mathematics remain an open problem; we do not know in what direction it will find its solution, nor even whether a final objective answer can be expected at all. ‘Mathematizing’ may well be a creative activity of man, like music, the products of which not only in form but also in substance are conditioned by the decisions of history and therefore defy complete objective rationalization.”

Hermann Weyl, Philosophy of Mathematics and Natural Science, Appendix A, “The Structure of Mathematics”

What Weyl here refers to as “mathematizing” can be generalized to human cognition generally speaking, and, if we like, we can generalize all the way to a comprehensive Cartesian conception of thought:

By the word ‘thought’, I mean everything which happens in us while we are conscious, in so far as there is consciousness of it in us. So in this context, thinking includes sensing as well as understanding, willing, and imagining. If I say, ‘I see therefore I am,’ or ‘I walk therefore I am,’ and mean by that the seeing or walking which is performed by the body, the conclusion is not absolutely certain. After all, when I am asleep I can often think I am seeing or walking, but without opening my eyes or moving — and perhaps even without my having any body at all. On the other hand, the conclusion is obviously certain if I mean the sensing itself, or the consciousness that I am seeing or walking, since the conclusion then refers to the mind. And it is only the mind which senses, or thinks about its seeing or walking.

Descartes, Principles of Philosophy, section 9

Do thinking and building have anything in common beyond both being human activities? Is there not something essentially constructive in both activities? (This question is surprisingly apt, because we need to understand what constructive thinking is, but I will return to that later.) Did not Kant refer to the “architectonic” of pure reason, and has it not become commonplace among contemporary cognitive scientists and philosophers of mind to speak of our “cognitive architecture”?

Just taking the term “constructive” in its naïve and intuitive signification, we know that thought is not always constructive. Indeed, it is often said that thought, and especially philosophical thought, must be analytical and critical. Critical thought is not always or invariably destructive, and most of us know the difference between constructive criticism and destructive criticism. Still, thought can be quite destructive. William of Ockham, for example, is often credited with bringing down the Scholastic philosophical synthesis that reached its apogee in Aquinas.

Similar observations can be made about the building trades. While we usually do not include demolition crews among the construction trades, there is a sense in which demolition and construction are both phases in the building process. Combat engineers must be equally trained in the building and demolition of bridges, for example, which demonstrates both the constructive and the destructive aspects of construction engineering.

Just as we have a choice not only in what to build, but in what style we will build, so too we have a choice not only in what we think, but also in how we think. As a matter of historical fact, I think you will find that the thinking of most individuals is not much more than a reaction, or a reflex. People think in the way that comes naturally to them, and they do not realize that they are thinking in a certain style unless they pause to think about their thinking. Well, this would be one way to characterize philosophy: thinking about thinking.

The unthinking way in which most of us think has the consequence of fostering what may be called cognitive monoculture. Individuals rarely step outside the parameters of thought with which they are comfortable, and so they allow their thoughts to follow in the ruts and the grooves left by their ancestors, much as architects, for many generations, reiterated classical building styles for lack of imagination of anything different.

It is probably very nearly impossible that I should write about building and thinking without citing Heidegger, so here is my nearly obligatory Heidegger citation, which, despite my general dislike of Heideggerian thought, suits my purposes quite perfectly:

“We come to know what it means to think when we ourselves try to think. If the attempt is to be successful, we must be ready to learn thinking.”

Martin Heidegger, What is called thinking? Lecture I

I agree with this: a serious attempt at thinking entails that we come to know what it means to think, and moreover we must be ready to learn thinking, and not merely take it for granted. But I find that I do not agree with the very next paragraph in Heidegger:

“As soon as we allow ourselves to become involved in such learning, we have admitted that we are not yet capable of thinking.”

Martin Heidegger, What is called thinking? Lecture I

In fact, we are capable of thinking, though the problem is that we do not really know whether we are thinking well or thinking poorly. When we think about thinking, when we reflect on what we are doing when we are thinking, we will discover that we have been thinking in a particular style, even if we were not aware that we were doing so — much like Molière’s Monsieur Jourdain, who did not know that he had been speaking prose his entire life.

If we pay attention to our thinking, and think critically about our thinking, we stumble across a number of distinctions that we realize can be used to classify the style of thought in which we have been engaged: formal or informal, constructive or non-constructive, abstract or concrete, objective or subjective, theoretical or practical, a priori or a posteriori, empirical or rational. These distinctions define styles of thought, and it is only in reflection that we realize that one or another of these terms has applied to our thought, and thus that we have been thinking in this particular style.

Ideally one would be aware of how one was thinking, and be able to shift gears in the middle of thinking and adopt a different mode of thought as the need or desire arose. The value of knowing how one has been thinking, and realizing the unconscious distinctions one has been making, is that one is now in a position to provide counter-examples to one’s own thought, and one is therefore no longer strictly reliant upon the objections of others who think otherwise than ourselves.

The cognitive monoculture that we uncritically accept before we learn to reflect on our own thinking is more often than not borrowed from the world, and not the product of our own initiative. Are we living, intellectually, so to speak, in a structure built by others? If so, ought we to question or to accept that structure?

This is a theme to which Merleau-Ponty often returned:

“…it is by borrowing from the world structure that the universe of truth and of thought is constructed for us. When we want to express strongly the consciousness we have of a truth, we find nothing better than to invoke a topos noetos that would be common to minds or to men, as the sensible world is common to the sensible bodies. And this is not only an analogy: it is the same world that contains our bodies and our minds, provided that we understand by world not only the sum of things that fall or could fall under our eyes, but also the locus of their compossibility, the invariable style they observe, which connects our perspectives, permits transition from one to the other, and — whether in describing a detail of the landscape or in coming to agreement about an invisible truth — makes us feel we are two witnesses capable of hovering over the same true object, or at least of exchanging our situations relative to it, as we can exchange our standpoints in the visible world in the strict sense.”

Maurice Merleau-Ponty, The Visible and the Invisible

I trust Merleau-Ponty with this idea, but, to put it bluntly, there are many that I would not trust with this idea, since the idea that our cognitive architecture is borrowed from the world that we inhabit can be employed as a strategy to dilute and perhaps even to deny the individual. One could make the case on this basis that we are owned by the past, and certainly there are those who believe that inter-generational moral duties flow in only one direction, from the present to the past, but merely to formulate it in these terms suggests the possibility of inter-generational moral duties that flow from the past to the present.

Certainly by being born into the world we are born into a linguistic and intellectual context at the same time as we are born into an existential context, and this fact has profound consequences. As in the passage from Marx that I have quoted many times:

“Men make their own history, but they do not make it as they please; they do not make it under self-selected circumstances, but under circumstances existing already, given and transmitted from the past.”

Karl Marx, The Eighteenth Brumaire of Louis Napoleon, first paragraph

Marx gives us a particular perspective on this idea, but we can turn it around and, by reformulating Marx, attain to a different perspective on the same idea. Marx takes the making of history to be a unidirectional process, but it goes both ways; men make history, and history makes men:

“Men begin under circumstances existing already, given and transmitted from the past, and make their own history as they please from what they select of the past. The past has no reality but that which men give to it.”

The circumstances transmitted to us from the past are not arbitrary; these circumstances are the sum total of the efforts of previous generations to re-make the world during their lives according to their vision. We live with the consequences of this vision. Moreover, the circumstances we create are then transmitted to the future; this is our legacy, and future generations will do with it as they will.

The architect, too, begins with circumstances existing already, given and transmitted from the past. For Hübsch this is the problem. Hübsch begins his brief treatise with a ringing assertion that architectural thought is dominated by an archaic paradigm:

“Painting and sculpture have long since abandoned the lifeless imitation of antiquity. Architecture has yet to come of age and continues to imitate the antique style. Although nearly everyone recognizes the inadequacy of that style in meeting today’s needs and is dissatisfied with the buildings recently erected in it, almost all architects still adhere to it.”

Heinrich Hübsch, In what style should we build? 1828

In the twenty-first century this is no longer true. Building has been substantially liberated from classical forms. In fact, since Hübsch’s time, a new classicism — international modern — rose, dominated for a short time, and has now been displaced by a bewildering plethora of styles, from an ornately decorative post-modernism to outlandish structures that would have been impossible without contemporary materials technology. There are, to be sure, architectural conventions that remain to be challenged, and in the sphere of urban planning these conventions can be quite rigid because they become embodied in legal codes.

For our time, the most forceful way to understand Hübsch’s question would be, “In what style should we build our cities?” Another way in which Hübsch’s question retains its poignant appeal is in the form that I suggested above: in what style should we think?

Are we intellectually owned by the past? Is there a moral obligation for us to think in the style of our grandfathers? A semi-humorous definition attributed to Benjamin Disraeli has it that, “A realist is a man who insists on making the same mistakes his grandfather did.” Are we obliged to be realists?

Here we see the clear connection between building and thinking. Just as we might think like our grandfathers, so too we might build like our grandfathers. This latter was the concern of Hübsch. That is to say, we can inhabit (and restore, and reconstruct) the intellectual constructions of our forefathers just as we can the material constructions of our forefathers.

It would be entirely possible for us today to construct classical cities on the Greco-Roman model; it is even possible to imagine a traditional Roman house with hot and cold running water and electric kitchen appliances, wired for WiFi. That is to say, we could have our modern conveniences and still continue to build as the past built. We could choose to literally inhabit the structure of the past, as civilization did in fact choose to do for almost a thousand years when classical cities were built to essentially the same plan throughout the ancient world. (See my remarks on this in The Iterative Conception of Civilization.)

A perfectly comfortable dwelling with modern plumbing and electrical appliances added. Why not? Why not build in the style of the past?

We can take the Middle Ages to be for thinking what the modernized Roman house is for living: the role of intellectual authority in medieval thinking was unprecedented and unparalleled. If experience contradicted authority, so much the worse for experience. If a classical text stated that something was the case, and the world seemed at variance with the text, the world was assumed to be in error. As classical antiquity lived with the same buildings for a thousand years, so the Middle Ages lived with the same thoughts for a thousand years. There is no reason that we could not take medieval scholarship, as we might update a Roman house, and add a few modern conveniences — like names for the chemical elements, etc. — and have this perfectly serviceable intellectual context as our own.

A perfectly comfortable way of thinking with a few modern ideas and distinctions added. Why not? Why not think in the style of the past?

Thus the two previous macro-historical stages of Western civilization prior to modernism — namely, classicism and medievalism — represent, respectively, the attempt to build in the style of the past and the attempt to think in the style of the past. It has been the rude character of modernism to focus on the future and to be dismissive of the past. While this attitude can be nihilistic, we can now clearly see how it came about: the other alternatives were tried and found wanting.

. . . . .


. . . . .

Grand Strategy Annex

. . . . .

Friday


In yesterday’s Addendum on Neo-Agriculturalism I made a distinction between political ideas (with which, to use Sartre’s formulation, essence precedes existence) and historical ideas (with which existence precedes essence). Political ideas are formulated as ideas and are packaged and promoted as ideologies to be politically implemented. Historical ideas are driving forces of historical change that are only recognized and explicitly formulated as ideas ex post facto. At least, that was my general idea, though I recognize that a more subtle and sophisticated account is necessary, one that will take account of the shadings of each into the other and acknowledge all manner of exceptions. But I start out (being the theoretician of history that I am) in the abstract, with the idea of the distinction to be further elaborated in the light of evidence and experience.

Also in yesterday’s post I suggested that this distinction between political and historical ideas can be applied to communism, extraterrestrialization, pastoralization, singularization, and neo-agriculturalism. Thinking about this further as I was drifting off to sleep last night (actually, this morning as I was drifting off to sleep after staying awake all night, as is my habit) I realized that this distinction can shed some light on the diverse ways that the term “globalization” is used. In short, globalization can be a political idea or an historical idea.

I have primarily used “globalization” as an historical idea. I have argued from many different perspectives, and in regard to different sets of facts and details, that globalization is nothing other than the unfolding of the Industrial Revolution in those parts of the world where the Industrial Revolution had not yet transformed the life of the people, many of whom until recently, and many of whom still today, live in an essentially agricultural civilization and according to the institutions of agricultural civilization. While it is true that industrialization is sometimes consciously pursued as a political policy (though the earliest appearances of industrialization were completely innocent of any design), politicized industrialization is almost always a failure. Or, the least we can say is that politicized industrialization usually results in unintended consequences outrunning intended consequences. Industrialization happens when a people is historically prepared to make the transition from agricultural civilization to industrialized civilization. This is not a policy that has been implemented, but a response both to internal social pressures and external influences.

In this sense of globalization as the industrialization of the global economies and all the peoples of the world, globalization is not and cannot be planned, is not the result of a policy, and in fact almost any attempt to implement globalization is likely to be counter-productive and result in the antithesis of the intended result (with the same dreary inevitability that utopian dreams issue in dystopian nightmares).

However, this is not the only sense in which “globalization” is used, and in fact I suspect that “globalization” is invoked more often in the popular media as a name for a political idea, not an historical idea. Globalization as a political idea is globalization consciously and intentionally pursued as a matter of policy. It is this sense of globalization that is protested in the streets, found wanting in a thousand newspaper editorials, and occasionally touted by think tanks.

Considering the distinction between political ideas and historical ideas in relation to globalization, I was reminded of something I wrote a few months back in 100 Year Starship Study Symposium Day 2:

If you hold that history can be accurately predicted (at least reasonably accurately), a very different conception of the scope of human moral action must be accepted as compared to a conception of history that assumes (as I do) that we are mostly blindsided by history.

A conception of history dominated by the idea that things mostly happen to us that we cannot prevent (and mostly can’t change) is what I have previously called the cataclysmic conception of history. The antithetical position is that in which the future can be predicted because agents are able to realize their projects. This is different in a subtle and an important way from either fatalism or determinism since this conception of predictability assumes human agency. This is what I have elsewhere called the political conception of history.

What I have observed here in relation to futurist prediction holds also in the case of commentary on current events: if one supposes that everything, or almost everything, happens according to a grand design, then it follows that someone or some institution is responsible for current events. Therefore there is someone to blame.

Of course, the world is more complicated and subtle than this, but we need only acknowledge one exception to an unrealistically picayune political conception of history in order to provide a counter-example demonstrating that not all things happen according to a grand design. Any sophisticated political conception of history will recognize that some things happen according to plan, other things just happen and are not part of any plan, while the vast majority of human action is an attempt, only partly successful, to steer the things that happen into courses preferred by conscious agents. If, then, this is the sophisticated political conception of history, what I just called the “unrealistically picayune political conception of history” may be understood as the vulgar political conception of history (analogous to “vulgar Marxism”). Vulgar politicism is political determinism.

This analysis in turn suggests a parallel distinction between vulgar and sophisticated catastrophism. Vulgar catastrophism maintains dogmatically that everything “merely happens,” that chance and accident rule the world without exception, and that there is no rhyme or reason, no planning or design whatsoever, in the world. From this it follows that human agency is illusory. A sophisticated catastrophism would recognize that things largely happen out of our control, but that we do possess authentic agency and are sometimes able to affect historical outcomes — sometimes, but not always or dependably or inevitably.

In so far as globalization is global industrialization, it is and has been happening to the world and began as a completely unplanned development. Since the advent of industrialization, its global extrapolation has mostly followed from the same principles as its unplanned beginnings, but has occasionally been pursued as a matter of policy. On the whole, the industrialization of the world’s economy today is a development that proceeds apace, and which we can sometimes (although not always) influence in small and subtle ways even while the main contours are beyond direct control. Thus globalization begins as a purely historical idea, and as it develops gradually takes on some features of a political idea. This pattern of development, too, is probably repeated in regard to other historical phenomena.

. . . . .

. . . . .

signature

. . . . .

Grand Strategy Annex

. . . . .

Thursday


In my recent post on neo-agriculturalism I mentioned the back-to-the-land movement that was especially prevalent in the late 1960s and early 1970s. Often the back-to-the-land movement was undertaken (when it was in fact undertaken) as a family affair. In its more radical and ideologically-motivated forms, however, the back-to-the-land movement involved the founding of communes.

Communes are a venerable American tradition. In the nineteenth century there were several American experiments with communes — proving the durability of the “back-to-the-land” movement — the most famous of which was Brook Farm. Brook Farm became famous not least because Nathaniel Hawthorne lived there for a time and based his novel The Blithedale Romance on his experiences there.

A number of utopian currents fed into the nineteenth century vogue for communes, so they were probably doomed from the start. Take a little socialism, mix in Fourierism and some New England transcendentalism, liberally season with naïveté and youthful ideals, and you get a nineteenth century American commune. Since most of these short-lived institutions were founded by intellectuals with more experience of books and writing than of farming and animal husbandry, the stories that come out of these noble social experiments often sound like a frighteningly close anticipation of Orwell’s Animal Farm, where one or a few members of the community (like the workhorse in Orwell’s fictional account) take on the actual burden of engaging in the unpleasant but necessary labor that makes life possible, while the rest shut themselves in their cottages to read and write.

One thing that can be said for the nineteenth century communes is that these visionaries and idealists actually tried to put their visions and ideals into practice. They not only talked the talk, they also tried to walk the walk — at least for a time. Which brings me to my theme: while there are a few experiments in communal living today, relative to the size of the global population these experiments are quite rare.

For those on the political left who favor cooperativism over individualism (the tension between which two I recently discussed in Addendum on Marxist Eschatology), and for those who have strongly advocated for communal living and cooperativist ideals — whether on the basis of a social philosophy or a particular understanding of economics — the establishment of a commune provides the possibility of a concrete experiment in communal living. And almost all of these have been failures. I find this highly significant, as I also find the near absence both of voluntary communism and of any discussion of the failure of communes.

For quite some time I have been meaning to write about the absence of voluntary communism and voluntary communes, which is, sociologically speaking, very interesting. Yes, I know there are a few communes that are functioning, and there are long-term experiments in communal living such as the Kibbutz movement in Israel, but these amount to little when compared to what might have been… or what might yet be. If one really believes that a communal way of life is a good thing, or that the economics of communal living are superior to the economics of anarchic, unplanned and individualist capitalism, then one is free to make common cause with others of similar beliefs and to create a little utopia of one’s own — or rather of the community doing so together, in a spirit of mutual cooperation and shared sacrifice — even in the midst of capitalism.

In the twentieth century — so different from the experience of the nineteenth century — it became the tradition not to voluntarily establish communes, but to attempt to create communal living arrangements by threat of force and military coercion. This was the fundamental idea of what I have called The Stalin Doctrine, which Stalin himself formulated as: “Everyone imposes his own system as far as his army can reach. It cannot be otherwise. If now there is not a communist government in Paris, the cause of this is Russia has no army that can reach Paris in 1945.” This is the paradigm of non-voluntary communism.

These twentieth century “experiments” — which we might call “socialism under duress” — were enormous, catastrophic failures. We must not allow the short-sightedness of contemporary institutions or the nostalgia of memory to attempt to paper over the complete and utter failure of large-scale collectivism. The nation-states that attempted to put collectivism into practice, whether by a complete attempt at communism or a more gradual process of the nationalization of industry and expanding the social welfare state, are still suffering from the effects of this, and will continue to suffer for many decades, if not centuries.

What then of small-scale collectivism? Why should not those who are alive today, who believe strongly in collectivist ideals and who campaign and protest for these ideals, when there are precious few large-scale social experiments under way, get together and try socialism on a voluntary basis, without barbed wire and without armed guards in watchtowers forcing the residents of a presumptively communal society to remain against their will? Why not demonstrate to the world entire that collectivism is not dependent upon The Stalin Doctrine and that a social system need not have an army at its command in order to succeed?

Please don’t try to tell me that it can’t work. We know that one of the few Western institutions that functioned during the Middle Ages was that of cenobitic monasticism, whose isolated and nearly closed communities not only survived, but ultimately thrived in the lawless conditions of medieval Europe. In fact, medieval monastic communities were so successful that they eventually became multi-national corporations that held enormous properties and governed some of the largest industries of the late middle ages. This was why Henry VIII dissolved them and expropriated their properties (and the revenues from these properties) for the crown.

Please don’t try to tell me that communal and cooperativist living must be global or the system simply won’t work, because the same cenobitic monastic communities just mentioned were almost always isolated islands of communal living. And, again, please don’t try to tell me that the initial capital for such an experiment is lacking, because there are quite a few wealthy individuals with collectivist sentiments who could easily sponsor a few hundred acres and a few dozen buildings as the seed for a contemporary voluntary commune.

What is lacking today is not the means or the opportunity to engage in voluntary collectivist living, but the will. The fact of the matter is that individualism has become what Fukuyama has called “a systematic idea of political and social justice” much more so than the idea of liberal democracy, and this is because individualism is the practical implementation of what Fukuyama has called “The Drive for Dignity.” People today rarely if ever advocate individualism as a political philosophy — it sounds selfish when expressed explicitly — but they don’t need to advocate for individualism when they live its doctrines 24/7.

Whether in the heyday of non-voluntary communism during the twentieth century, or among those who protest today for collectivist ideals, communism always seems to be something for other people. Just as the Kim dynasty has lived in personal luxury while the people of North Korea starve, or Presidente Gonzalo lived in an upscale Lima apartment while directing the Maoist insurgency in Peru, or the Nomenklatura enjoyed the privileges of the elite under the Soviet Union, or the Princelings (children of communist party leaders) in China use their connections to become wealthy, those with presumably the greatest stake in collectivist living never want to live collectively themselves.

It is important to point out that when we speak of voluntary or non-voluntary communism we are talking about a social arrangement that can be chosen or rejected. In the sense in which Marx discussed communism, and the sense in which I have recently written about communism in Marxist Eschatology and Addendum on Marxist Eschatology, communism is an historical force that is larger than the individual, and not something that can be chosen or rejected.

Thus we are talking about two fundamentally different things here:

1. communism as a political idea, which as such behaves according to the presuppositions of political society, being chosen by individuals or imposed by force, and…

2. communism as an historical idea, which as such is a category of historical understanding whereby we interpret and understand the large-scale movements and patterns of human society.

The distinction is a subtle one, because a political idea often emerges from an historical idea implicit within a given political milieu, while an historical idea will often be used to analyze political ideas. But the difference, while subtle, is important, because the two kinds of ideas are opposed as contraries: with a political idea, essence precedes existence, while with an historical idea, existence precedes essence.

We should expect to find that the other possible futures that I have discussed alongside communism — extraterrestrialization, pastoralization, singularization, and now also neo-agriculturalism — will be expressed as both political ideas and historical ideas. And, in fact, when we pause to think it over, we do find that there are those thinking in political terms who want to foster the creation of a society that embodies these historical movements, while there are others thinking in historical terms of these possibilities as ideas already present in history and only discovered upon analysis.


Saturday


Yesterday in Marxist Eschatology I wrote:

Marx is the greatest exemplar of a perennial tradition of human thought that has been with us from the beginning and which will be with us as long as civilization and human life endures. This tradition wasn’t always called Marxism, and it won’t always be called Marxism, but the perennial tendency will remain. There will always be individuals who are attracted to the perennial idea that Marx represents, and as of the present time Marx remains the most powerful advocate of these ideas.

While on my other blog in Marx and Fukuyama I wrote:

With Marx, we can identify a “bend in the road” of history at which point Marx might be proved right or wrong. For some people — wrongly to my mind — this point was identified as the end of the Cold War. To my mind, it is the full industrialization of the world’s economy. Thus Marx’s thesis has the virtue of falsification.

This calls for a little clarification, since if interpreted uncharitably it might be found contradictory for Marxism to be a perennial idea and to be falsifiable, since what distinguishes a perennial idea is that it is not falsifiable — at least, not in a robust sense of falsification.

Karl Popper was the philosopher who formulated falsifiability as a criterion of scientificity (I’m not certain he was the first, but he has definitely been the most influential in advancing the idea of falsifiability, especially in contradistinction to the logical positivist emphasis on the verifiability criterion), and he discussed Marx at some length. Here’s a nice summary from one of Popper’s later works:

“As I pointed out in my Open Society, one may regard Marx’s theory as refuted by events that occurred during the Russian Revolution. According to Marx the revolutionary changes start at the bottom, as it were: means of production change first, then social conditions of production, then political power, and ultimately ideological beliefs, which change last. But in the Russian Revolution the political power changed first, and then the ideology (Dictatorship plus Electrification) began to change the social conditions and the means of production from the top. The reinterpretation of Marx’s theory of revolution to evade this falsification immunized it against further attacks, transforming it into the vulgar-Marxist (or socioanalytic) theory which tells us that the ‘economic motive’ and the class struggle pervade social life.”

Karl Popper, Unended Quest, “Early Studies,” p. 45

I should point out that I agree with Popper’s arguments, and that Marxism construed in the narrow sense that Popper construed it was falsified by the events of the Russian Revolution. Lenin’s “weakest link of capitalism” theory was instrumental in the reinterpretation of Marxism that Popper mentioned. Beyond Lenin, Mao made even more radical changes by shifting the focus from the industrial proletariat to the agricultural peasant. It is a testament to the extent to which the twentieth century was not fully industrialized that it was Maoism rather than Marxism or Leninism that was the form of communism that reached the masses during the last century.

However, I think that there is a species of Marxism that lies between Popper’s narrowly conceived Marxism and the vulgar Marxism reinterpreted in the light of apparent falsification, and this is a Marxism that has been generalized beyond the historically specific conditions of the Russian Revolution, and even beyond the Cold War, which had almost nothing to do with democracy or communism and almost everything to do with national rivalry and the great game of power politics.

I have called a generalized Marxism a species of Marxism, and herein lies the clue to the distinction between Marxism and a perennial idea in the strict sense. Marxism (of one variety or another) is a species that falls under the genus of collectivist political thought. The latter — collectivist political thought — is a perennial idea, and lies beyond falsification. It is neither true nor false, but an ongoing influence, just like its implied contrary, which is individualist political thought. Individualism also lies beyond falsification, and is neither true nor false but remains an ongoing influence in human affairs.

Most forms of capitalism are individualist in orientation, though not all: oligarchical capitalist societies (like medieval Venice) had little to do with individualism. Thus a generalization of capitalism does not always lead to individualism. A generalization of capitalism, depending on its subtle differences in tone of market activity from one society to another, may lead to individualism, but it may also lead to a profoundly hierarchical crony capitalism, or to some other socio-economic formation.

Speaking generally of ideas — not just communism and capitalism, and indeed not just political and economic ideas but all ideas — the generalization of an historically situated and therefore specific idea usually leads to a perennial idea if the generalization is sufficiently radical. The generalization of capitalism may or may not lead to individualism, but it will eventually lead to some perennial idea which lies beyond falsification, whether that idea is patriarchalism or something else. The generalization of Marxism, I think, leads more directly to a perennial form of collectivist thought, which at its greatest reach of generality is scarcely distinguishable from a vague sentimental connection to others.

The species of Marxism that I have posited — midway between Marxism narrowly conceived and Marxism generalized to the point of a vague feeling of cooperative common cause — is falsifiable, but it is not falsifiable by experiment. It is only falsifiable by history. It shares this property with other theses in the philosophy of history. This is one of the fundamental distinctions between the natural sciences and at least some of the historical sciences: theses in some of the historical sciences are falsifiable, but they are not falsifiable on demand. One can only wait and see if they are eventually falsified. With the passage of time the inductive evidence for an as-yet-unfalsified thesis in the philosophy of history increases, but the thesis is never confirmed. Thus the philosophy of history, contrary to most expectations, is the most science-like of the branches of philosophy.


Plato’s Guardians

6 December 2011

Tuesday


Plato more or less founded the Western tradition of philosophical inquiry, and the spirit of Plato still looms large. Platonism (that is to say, Plato’s theory of ideas) still has legs.

Everyone is familiar with the famous passage in Plato’s Republic, Book VI, where he introduces the philosopher-king:

“…neither cities nor States nor individuals will ever attain perfection until the small class of philosophers whom we termed useless but not corrupt are providentially compelled, whether they will or not, to take care of the State, and until a like necessity be laid on the State to obey them; or until kings, or if not kings, the sons of kings or princes, are divinely inspired with a true love of true philosophy. That either or both of these alternatives are impossible, I see no reason to affirm: if they were so, we might indeed be justly ridiculed as dreamers and visionaries.”

Of course philosophers are routinely understood to be useless dreamers and visionaries — some take this as a sign that the Republic is intentionally ironic and consciously “utopian” in an unrealistic sense — and it is partly this image of the useless philosopher that makes philosophers an easy target for fashionable anti-philosophy.

Today we should not wish for philosopher-kings, but it is revelatory of Plato’s hostility to democracy that he formulated his utopian political leadership in terms of kingship rather than in terms of the democracy that his city of Athens had made famous throughout the ancient world, and for which it is still famous today — and rightly so. But for Plato, it was democratic Athens, agitated to a frenzy by demagogues, that was responsible for the execution of Socrates, “the wisest, and justest, and best of all the men,” according to Plato.

Today we should wish for philosopher-citizens, from whose ranks are democratically chosen philosopher-legislators, philosopher-judges, and philosopher-presidents, which latter appoint philosopher-cabinet members and so forth. That certainly sounds like a meritocratic ideal, and one for which I could work up a certain level of enthusiasm, though I doubt it would appeal to many, or indeed appear even vaguely plausible or realistic to many. Again the figure of the philosopher as a useless dreamer and visionary haunts us.

Plato’s vision of an ideal republic, however, is not entirely or exclusively informed by monarchical institutions. There are, after all, the Guardians.

Plato’s republic includes an elite class of individuals who are reserved for political rule — the guardians. At the end of Book V Plato describes the way of life of the guardians of the republic.

“Then let us consider what will be their way of life, if they are to realize our idea of them. In the first place, none of them should have any property of his own beyond what is absolutely necessary; neither should they have a private house or store closed against any one who has a mind to enter; their provisions should be only such as are required by trained warriors, who are men of temperance and courage; they should agree to receive from the citizens a fixed rate of pay, enough to meet the expenses of the year and no more; and they will go and live together like soldiers in a camp. Gold and silver we will tell them that they have from God; the diviner metal is within them, and they have therefore no need of the dross which is current among men, and ought not to pollute the divine by any such earthly admixture; for that commoner metal has been the source of many unholy deeds, but their own is undefiled. And they alone of all the citizens may not touch or handle silver or gold, or be under the same roof with them, or wear them, or drink from them. And this will be their salvation, and they will be the saviours of the State. But should they ever acquire homes or lands or moneys of their own, they will become housekeepers and husbandmen instead of guardians, enemies and tyrants instead of allies of the other citizens; hating and being hated, plotting and being plotted against, they will pass their whole life in much greater terror of internal than of external enemies, and the hour of ruin, both to themselves and to the rest of the State, will be at hand. For all which reasons may we not say that thus shall our State be ordered, and that these shall be the regulations appointed by us for guardians concerning their houses and all other matters?”

This doesn’t sound like a whole lot of fun, and I don’t think that many persons would enjoy this level of discipline in service to the state. Indeed, Plato seems to have thought as much himself, since in the earlier passage quoted above he suggested that philosophers would have to be compelled to rule the state. In other words, philosophers are to be drafted against their will into service of the state.

This sounds altogether too much like Rousseau talking about people being “forced to be free” and makes the modern individual more than a little uncomfortable. And all the comparisons of the guardians’ way of life with that of soldiers makes it all sound rather regimented and self-sacrificing. If you ask persons to engage in sacrifice, do not expect to be overwhelmed by volunteers. Individuals do in fact sacrifice, we know this from countless historical examples, but it is bad form to ask for sacrifice. It is also likely to be unsuccessful.

In contrast to these austere and off-putting images I would like to suggest another interpretation of Plato’s guardians. This is not an interpretation that has any textual basis in Plato’s Republic, but which is suggested by the actual lives of the philosophers who might have once become philosopher-kings, or who today might become philosopher-citizens. While Plato did not approach his guardians in this way, Plato certainly would have known the character of philosophers, and so I think that Plato may well have had another sense, which I will suggest below, in mind, even if it does not come across in the Republic.

Philosophers are fascinated by ideas, and especially by abstract ideas. There was formerly also a class of natural philosophers, but natural philosophers have since become natural scientists. Natural philosophers once upon a time, and natural scientists today, are fascinated by ideas also, but more by empirical ideas than abstract ideas. The key here is the sense of wonder and fascination in things, which is coupled with an unparalleled moral imperative to understand the world on its own terms.

It can sound a bit grim to speak in terms of a “moral imperative,” but there is nothing at all grim about being fascinated by ideas. Take my word for it: just as some people are excited by betting on horses, others by the prospect of an especially good meal, and a few by watching an especially hard-fought boxing match, just so philosophers are similarly excited by ideas.

The philosopher delves into ideas and immerses himself in them not for his own pleasure — though there is pleasure in it — but for the sake of the idea itself. This pure pleasure at one remove is a powerful force. In the lives of philosophers it is a force expressed in an abstract realm, but in terms of human nature it is a universal called forth by different stimuli in different individuals. If the state could harness this enthusiasm for its own ends, as states today have learned to harness incentives for economic growth, then the state would be in a position to have individuals plunge themselves headfirst into the problems of state for the pure desire of thinking them through and coming to the optimal solution if there is one, or an understanding of why there is no solution if one is lacking.

This, I think, is a better way to understand Plato’s guardians: men and women who are absolutely fascinated by the problems of the state and who immerse themselves in thinking through matters usually consigned to the instinctive and intuitive reactions of primarily political men. This would be a much more powerful force than I think most people would realize, like the harnessing of incentive in a capitalistic economy as mentioned above.

Here is an example that comes close to my meaning. Russian nuclear scientist Constantine Checherov appeared in the Nova episode “Inside Chernobyl’s Sarcophagus,” in which he described his experience as a scientist entering the reactor at Chernobyl that was destroyed in the explosion. Despite great personal danger and a cataclysm that affected many thousands, his response to his discoveries was that of scientific wonder:

“Maybe it’s bad of me, but I must admit, as a researcher I was filled with joy — when I realized exactly what I’d found, it was sheer delight. It’s comparable to the excitement of a scientist studying volcanic lava. It’s incredibly interesting, inspirational.”

And at Deixant Rastre we find this additional paean to scientific curiosity and epistemic joy from Checherov:

“Nobody orders me to do this, nobody forces me to do it. When I enter the fourth reactor nobody and nothing can disturb me. There are no people around checking the radiation dose that I get there. I am in another world, a world of freedom — of pure euphoria and joy. I was the very first person in the world to see the reactor from the inside.”

This is the pure euphoria and joy of a nuclear scientist in the face of what to others is an unspeakable horror. So, if you can imagine it, transpose this same attitude from physical theory to social theory, and you will have the response of the guardian as I would have the guardians understood.

Imagine, if you can, a republic governed by citizen-philosophers — guardians who are philosophical technocrats in charge of the state apparatus. If a terrible calamity befalls the state — a destructive earthquake, a financial panic, an epidemic, etc. — instead of placing themselves in front of television cameras and emoting to beat the band, telling the victims that they feel their pain while telling the survivors that they share their joy, our Platonic guardians of the state respond by viewing the calamity as primarily an intriguing intellectual challenge to be met. How can institutions be constructed that can adequately respond to such calamities in the future? What is the most rational allocation of state resources in time of calamity? How can the needs of those adversely affected be met most rapidly and systematically? These are the questions with which our philosopher-citizens will immediately engage, and seek to produce practical results.

A government by such guardians would be like government by think tank, although the thinkers would be chosen for their intrinsic sense of inquiry and abstract thought rather than conformity to any one ideological point of view, as is the case with most think tanks today.

To some, the meditations of such a political think tank in service to the state and its citizens could appear cold, bloodless, passionless, and calculating. As off-putting as it sounds, the best philosophical thought has exactly this character, which is why the best philosophers are often believed to be cold and distant individuals even when they are persons of great passion and of profound feelings for their fellow man.

Greatness of thought is a singular thing, and is rarely understood. In fact, it is most often misunderstood. If formulated in terms of a good analogy, it might make sense to more people. For example, someone who is a genius at picking horses at the races may not be a “nice” person, may lack social skills, may not ingratiate himself in polite company, but he is good at picking horses. And at the betting track, this is all that matters.

Similarly, greatness of thought is what is needed in deliberation over the great matters of state, and the individual who has most perfected his intellect to penetrate the mysteries of these great matters may be no better company than a picker of winners at the track, but in matters of state, the likeability or unlikeability of such an individual is irrelevant.

Guardians as pure philosophers might not be much fun to party with, but they would be formidable as pilots of the ship of state.

. . . . .

signature

. . . . .

Grand Strategy Annex

. . . . .

. . . . .

Theses on Easter

4 April 2010

Sunday


Theses on the Occasion of Easter Sunday

A Theoretical Account of Ritualized Celebration


1. Distinctions must be made among myth, ritual, and celebration.

1.1 Myth, ritual, and celebration, though distinct, are logically related.

1.11 A celebration is an occasion for a ritual,
A ritual is an opportunity to participate in a myth,
Therefore a celebration is an occasion in which to participate in a myth.
Q. E. D.
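The derivation in 1.11 is a hypothetical syllogism, and its logical skeleton can be made fully explicit. Here is a minimal sketch in Lean, treating the three terms as propositions and the two premises as implications; the names are illustrative glosses on the theses, not part of them:

```lean
-- Thesis 1.11 as a hypothetical syllogism: from "celebration → ritual"
-- and "ritual → participation in a myth", infer
-- "celebration → participation in a myth".
variable (Celebration Ritual MythParticipation : Prop)

example
    (h₁ : Celebration → Ritual)           -- a celebration is an occasion for a ritual
    (h₂ : Ritual → MythParticipation)     -- a ritual is an opportunity to participate in a myth
    : Celebration → MythParticipation :=  -- hence a celebration is an occasion in which to participate in a myth
  fun c => h₂ (h₁ c)
```

The proof term is simply the composition of the two premises, which is all the Q.E.D. amounts to.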

1.2 Rituals of burial are older than agricultural rituals of life-death-rebirth, even extending to other species (Neanderthals, now extinct), and may well be the origin of life-death-rebirth rituals.

2. Among the most ancient of continually observed celebrations is that of the life-death-resurrection of the Year-God, eniautos daimon.

2.1 The celebration of the life and re-birth of the Year-God, eniautos daimon, is at least as old as settled, agrarian society.

2.11 Agriculture and the written word together produced settled, historical civilization.

2.12 Settled historical civilization has defined the norm of human history from the Neolithic Agricultural Revolution to the Industrial Revolution.

2.2 Settled agrarian society coincides with the origins of civilization.

2.21 The celebration of the life and re-birth of the Year-God, eniautos daimon, coincides with the origins of civilization.

3. Once the breakthrough to history has been made by way of the written word, it is the nature of historical civilization to commemorate nodal points of the year, whether with solemnities, festivities, or both.

3.1 Historical civilization is predicated upon the presumed value of the history that brings that civilization into being.

3.2 Nodal points of the year celebrated in historical civilizations are observed as validation of their historicity through the performance of rituals.

3.21 In a temperate climate, summer and winter solstices and spring and fall equinoxes are nodal points of the year.

4. The mythology of a settled, agricultural civilization emerges from the same regularities of nature observed of necessity by agricultural peoples.

4.1 The calendrics of celebration emerges from the regularities of nature observed of necessity by agricultural peoples.

4.11 The mythology and calendar of celebrations of settled, agricultural civilizations come from the same source.

4.2 Celebrations are the points of contact between the two parallel orders of mythological events and the actual historical calendar.

4.21 A civilization validates its mythology by establishing a correspondence between mythological events and historical events.

4.3 Enacting a myth in historical time, by way of a ritual, makes that myth literal truth by giving to it a concrete embodiment.

5. Easter is one species of the genus of life-death-rebirth celebrations.

5.1 The particular features of the Easter celebration are the result of the adaptive radiation of the dialectic of sacrifice and resurrection.

6. Easter is that species of life-death-rebirth celebration specific to Christendom.

6.1 Christendom was primarily a construction of the Middle Ages.

6.11 Christendom was the legacy of Medieval Europe that disappeared with the passing of medieval civilization but which, like the Roman Empire before it, is with us still and remains a touchstone of the Western tradition.

6.12 Christendom was an empire of the spirit and of the cross as Rome was an empire of the will and of the sword.

6.13 To have once been Roman, and then to have been Christian, and finally to have become modern, is the condition of Western man.

6.2 Easter is a celebration specific to civilization, the civilized celebration par excellence.

7. The naturalistic civilization that is emerging from the consequences of the Industrial Revolution represents the first significant change in the social structure of human society since the Neolithic Agricultural Revolution.

7.1 With the advent of the Industrial Revolution, we have ceased to be an agrarian society.

7.2 For the first time in history, life-death-rebirth celebrations face interpretation by a non-agrarian society.

7.21 Not only should we not hesitate to find new meanings in ancient celebrations, of which Easter represents the latest adaptive radiation, but rather we should actively and consciously seek meanings relevant to the present in such celebrations.

8. As the painters of the Renaissance drew upon the traditions of pagan antiquity, already at that time a thousand years in the past, so too will post-Christian Western civilization draw upon the traditions of Christendom for hundreds if not thousands of years to come.

8.1 The period of time that we have come to call the modern era — roughly the past five hundred years — has not been the modern era proper but rather has been the period of the formation of modernity.

8.2 Modernity simpliciter has but begun.


. . . . .


. . . . .

Grand Strategy Annex

. . . . .

Defunct Ideas

11 January 2010

Monday


Coronation of Louis VIII and Blanche of Castile at Reims in 1223; a miniature from the Grandes Chroniques de France, painted in the 1450s, and replete with the symbolism of the age, a tribute to medieval ideas.

A few days ago in Ideas: Blindness and Illusion I discussed the adequacy of conceptual schemes and ontological inventories of the furniture of the universe. There I suggested that a rigorous adherence to both the principle of parsimony — avoiding ontological indulgences — and the principle of adequacy — avoiding ontological impoverishment — would yield us the most accurate result in terms of seeing the world for what it is, neither more nor less than what it is.

This, of course, is a great over-simplification and obviously inadequate. For starters, there is no definite number of things in the world, including the fact that there is no definite number of ideas in the world. I can think of at least three reasons for this (there may be others, but this is what occurs to me as I write). First, a rigorous definition of what constitutes an individual would be necessary for a rigorous inventory of the number and kind of individuals that exist, and it is by no means obvious what is an individual and what is not; this is a function of vagueness. Second, also connected with vagueness, even given a definition of what constitutes an individual, there will be many cases that, due to vagueness, remain ambiguous. Third, and certainly not least, the inventory of the world is not static, but dynamic.

The world is not frozen in any one state of affairs. If we consider populations of biological entities, for example, we know that there are always some individuals being born while other individuals are dying. Populations can be stable, but they must be viewed statistically in terms of averages and approximations. There is no Platonic form of the number of people on the planet. You can make that number precise by formulating a number of conventions, but the number would be constantly changing and the conventions adopted would in some cases be arbitrary.

What holds for human bodies also holds for ideas, with some exceptions (you can count this as an example of naturalism’s minimalist materialism that I have written about on several occasions). There is no fixed number of ideas, but ideas grow in number disproportionately to populations of non-abstract objects. Plato was nearly right on this, at least: ideas, once they emerge, are nearly eternal.

There are probably a few cases when ideas have emerged in history, played a role in human societies, and then disappeared, but we cannot prove this. Once an idea enters circulation, even if it is later abandoned, any record of the idea preserves that idea. In the earliest portions of human history, especially in prehistory (when, by definition, there are no written records that might preserve the ideas that enjoyed currency in early societies), an idea may have been conceived and subsequently lost to the vast stretches of time that have since intervened. It is likely that ideas were lost to history during the Greek dark ages, when the art of writing disappeared in places and had to be reinvented. Societies at this stage of development (think of the heroic world of Homer) had reached a stage of sophistication and complexity such that many ideas would have been in circulation, but these societies had not yet reached the stability or resilience of contemporary civilization, and thus much may have been lost to history.

Since the beginning of the historical period proper, few ideas have been lost to history once they entered general circulation. If an individual hits on a great idea but forgets it, or does not communicate it, or the papers upon which he wrote it were scattered, burned, or lost upon his death, countless ideas of this sort may be and are still being lost to history. But once an idea enters Popper’s third world and takes on a life of its own, beyond the mind of a particular individual, our present information technologies will preserve such an idea indefinitely. There is no reason to believe that, if civilization lasts (and maintains its continuity) for another ten thousand or even a hundred thousand years, some future individual would not be able to educate themselves in the ideas in circulation, for example, in Elizabethan England.

Which, at last, brings us to defunct ideas. Just because an idea has been preserved, and any interested, sentient party (not even necessarily human) can, through appropriate research, form an adequate conception of the idea, does not mean that that idea is a living option. An idea that has fallen out of use, no longer widely circulates in current human societies, and which is no longer a force in shaping the lives of individuals or populations, I call a defunct idea. (We could, alternatively, call them dead ideas, by analogy with dead languages, i.e., languages of which there is scholarly knowledge but which are no longer spoken by a population.)

There are many defunct ideas. For example, I would say that the idea of the divine right of kings (and some of the ideas clustered with it, like royal absolutism) is a defunct idea. Certainly there are those who still believe in individuals being divinely anointed for some purpose in life, and certainly there are still kings that rule countries (though not many any more), but as a topic that has the power to move men to passionate debate and armed conflict, the divine right of kings is no longer a “mover and shaker” in the world of ideas. All we need do is compare it to an idea that truly has currency — like democracy, communism, or revolution — to understand the difference.

Some of the ideas closely clustered with the idea of the divine right of kings, such as constitutionalism, are not defunct. In fact, they are very much alive because they won out in the historical contest between a dying idea and an emerging idea. In the early modern period royal absolutism was an old idea; while it would yet reach its apogee under Louis XIV in France, it was nevertheless already a dying idea. At the same time, constitutionalism, as an alternative to monarchical government, was a new and exciting idea, at times as weak and as defenseless as a new-born babe, but soon enough to grow into its maturity and to replace the dying ideas of the medieval past.

We need here to distinguish further between ideas that are properly defunct and ideas that are points of reference for contemporary societies (ideas that are, in other words, embodied ideas) but are not made explicit. Such implicit ideas would include classic Enlightenment conceptions such as the perfectibility of man. If you asked the typical man-in-the-street today about the perfectibility of man, he probably wouldn’t know what you were talking about. But if you explained the idea, he would almost certainly have an opinion on it one way or the other. In other words, even if it is not given an explicit formulation, the perfectibility of man is a live option in today’s world. People not only think about it, they respond to it and may well have strong feelings about it.

As civilization continues in existence, retaining its continuity of tradition, the list of defunct ideas will and must grow. It is not likely that, once an idea becomes defunct, it can reenter circulation, but it is possible, over the long term, that societies could change so dramatically that a once-defunct idea could again rise from the dead and take an active role in society. It could be argued that the emergence of religious fundamentalism at the end of the twentieth century represents the recrudescence of a defunct idea. Many scholars of fundamentalism insist upon the modernity of this historical development, and I have some sympathy with this position, but it could be argued that the idea behind fundamentalism is that of fideism, and that fideism is a perennial idea.

. . . . .


. . . . .

Grand Strategy Annex

. . . . .
