14 February 2017
Nietzsche’s Big History
One of the most succinct formulations of Big History of which I am aware is a brief paragraph from Nietzsche:
“In some remote corner of the universe, poured out and glittering in innumerable solar systems, there once was a star on which clever animals invented knowledge. That was the highest and most mendacious minute of ‘world history’ — yet only a minute. After nature had drawn a few breaths the star grew cold, and the clever animals had to die.”
“On Truth and Lie in an Extra-Moral Sense,” Friedrich Nietzsche, fragment from the Nachlass, 1873; translated by Walter Kaufmann
…and in the original German:
In irgend einem abgelegenen Winkel des in zahllosen Sonnensystemen flimmernd ausgegossenen Weltalls gab es einmal ein Gestirn, auf dem kluge Tiere das Erkennen erfanden. Es war die hochmütigste und verlogenste Minute der “Weltgeschichte”: aber doch nur eine Minute. Nach wenigen Atemzügen der Natur erstarrte das Gestirn, und die klugen Tiere mußten sterben.
Über Wahrheit und Lüge im außermoralischen Sinne, Friedrich Nietzsche, 1873, aus dem Nachlaß
This passage has been translated several times, so, for purposes of comparison, here is another translation:
“In some remote corner of the universe that is poured out in countless flickering solar systems, there once was a star on which clever animals invented knowledge. That was the most arrogant and the most untruthful moment in ‘world history’ — yet indeed only a moment. After nature had taken a few breaths, the star froze over and the clever animals had to die.”
On Truth and Lying in an Extra-Moral Sense (1873), edited and translated with a critical introduction by Sander L. Gilman, Carole Blair, and David J. Parent, New York and Oxford: Oxford University Press, 1989
Bertrand Russell, who rarely passed over an opportunity to criticize Nietzsche in the harshest terms, expressed a tragic interpretation of human endeavor that is quite similar to Nietzsche’s capsule big history:
“That Man is the product of causes which had no prevision of the end they were achieving; that his origin, his growth, his hopes and fears, his loves and his beliefs, are but the outcome of accidental collocations of atoms; that no fire, no heroism, no intensity of thought and feeling, can preserve an individual life beyond the grave; that all the labours of the ages, all the devotion, all the inspiration, all the noonday brightness of human genius, are destined to extinction in the vast death of the solar system, and that the whole temple of Man’s achievement must inevitably be buried beneath the debris of a universe in ruins–all these things, if not quite beyond dispute, are yet so nearly certain, that no philosophy which rejects them can hope to stand. Only within the scaffolding of these truths, only on the firm foundation of unyielding despair, can the soul’s habitation henceforth be safely built.”
Bertrand Russell, “A Free Man’s Worship”
Even closer to Nietzsche, in both style and spirit, is the passage that immediately precedes this in the same essay by Russell, told, as with Nietzsche, in the form of a parable:
“For countless ages the hot nebula whirled aimlessly through space. At length it began to take shape, the central mass threw off planets, the planets cooled, boiling seas and burning mountains heaved and tossed, from black masses of cloud hot sheets of rain deluged the barely solid crust. And now the first germ of life grew in the depths of the ocean, and developed rapidly in the fructifying warmth into vast forest trees, huge ferns springing from the damp mould, sea monsters breeding, fighting, devouring, and passing away. And from the monsters, as the play unfolded itself, Man was born, with the power of thought, the knowledge of good and evil, and the cruel thirst for worship. And Man saw that all is passing in this mad, monstrous world, that all is struggling to snatch, at any cost, a few brief moments of life before Death’s inexorable decree. And Man said: ‘There is a hidden purpose, could we but fathom it, and the purpose is good; for we must reverence something, and in the visible world there is nothing worthy of reverence.’ And Man stood aside from the struggle, resolving that God intended harmony to come out of chaos by human efforts. And when he followed the instincts which God had transmitted to him from his ancestry of beasts of prey, he called it Sin, and asked God to forgive him. But he doubted whether he could be justly forgiven, until he invented a divine Plan by which God’s wrath was to have been appeased. And seeing the present was bad, he made it yet worse, that thereby the future might be better. And he gave God thanks for the strength that enabled him to forgo even the joys that were possible. And God smiled; and when he saw that Man had become perfect in renunciation and worship, he sent another sun through the sky, which crashed into Man’s sun; and all returned again to nebula.
“‘Yes,’ he murmured, ‘it was a good play; I will have it performed again.’”
Here Russell, unlike Nietzsche, gives theological meaning to the spectacle, however heterodox that meaning may be; I can easily imagine someone preferring Russell’s theological version to Nietzsche’s secular version, though both highlight the meaninglessness of human endeavor in a thermodynamic universe.
Our sun — a star among stars — will be a relatively early casualty in the heat death of the universe. While the life of the sun is orders of magnitude beyond the life of the individual human being, as soon as we understood that the sun’s life will pass through predictable stages of stellar evolution, we understood that the sun, like any human being, was born, will shine for a time, and then will die, and, when the sun dies, everything that is dependent upon the light of the sun for life will die also. It is only if we can make ourselves independent of the sun that we will not inevitably share the fate of the sun.
The idea that the sun is a star among stars, and that any star will do in terms of supporting human life, is embodied in a quote attributed to Wernher von Braun by Tom Wolfe and reported in Bob Ward’s book about von Braun:
“The importance of the space program is not surpassing the Soviets in space. The importance is to build a bridge to the stars, so that when the Sun dies, humanity will not die. The Sun is a star that’s burning up, and when it finally burns up, there will be no Earth… no Mars… no Jupiter.”
Quoted in Bob Ward, Dr. Space: The Life of Wernher von Braun, Chapter 22, p. 218; a footnote gives as the source “Transcript, NBC’s Today program, New York, November 11, 1998”
Wernher von Braun had seized upon the essential insight of existential risk mitigation, as had many involved in the space program from its inception. As soon as one adopts a naturalistic understanding of the place of humanity in the universe, and once technology develops to a point at which its extrapolation offers human beings options and alternatives within the universe, anyone will draw the same conclusion. Another quote from von Braun makes the same point in another way:
“…man’s newly acquired capability to travel through outer space provides us with a way out of our evolutionary dead alley.”
Bob Ward, Dr. Space: The Life of Wernher von Braun, Annapolis, US: Naval Institute Press, 2013.
I have previously written about the idea that humanity is a solar species, but the fact that humanity and the biosphere from which we derive have been utterly dependent upon solar insolation has been an accident of history. Any sun will do. We can, accordingly, re-conceive humanity as a stellar species, the kind of species that requires a star and its planetary system to make a home for itself. In this sense, all species of planetary endemism are stellar species.
Even this idea of migration to another star, and of any other star being as good as the sun, is ultimately too narrow. Our sun, or any star, can be the source of energy that powers our civilization, but it can easily be seen that substitute forms of energy could equally well power the future of our civilization, and that it has merely been an historical contingency — a matter of our planetary endemism — that we have been dependent upon a single star, or upon any star, for our energy needs.
This more radical and farther-reaching vision is embodied in a quote attributed to Ray Bradbury by Oriana Fallaci:
“Don’t let us forget this: that the Earth can die, explode, the Sun can go out, will go out. And if the Sun dies, if the Earth dies, if our race dies, then so will everything die that we have done up to that moment. Homer will die. Michelangelo will die, Galileo, Leonardo, Shakespeare, Einstein will die, all those will die who now are not dead because we are alive, we are thinking of them, we are carrying them within us. And then every single thing, every memory, will hurtle down into the void with us. So let us save them, let us save ourselves. Let us prepare ourselves to escape, to continue life and rebuild our cities on other planets: we shall not be long of this Earth! And if we really fear the darkness, if we really fight against it, then, for the good of all, let us take our rockets, let us get well used to the great cold and heat, the no water, the no oxygen, let us become Martians on Mars, Venusians on Venus, and when Mars and Venus die, let us go to the other solar systems, to Alpha Centauri, to wherever we manage to go, and let us forget the Earth. Let us forget our solar system and our body, the form it used to have, let us become no matter what, lichens, insects, balls of fire, no matter what, all that matters is that somehow life should continue, and the knowledge of what we were and what we did and learned: the knowledge of Homer and Michelangelo, of Galileo, Leonardo, Shakespeare, of Einstein! And the gift of life will continue.”
Oriana Fallaci, If the Sun Dies, New York: Atheneum, 1966, pp. 14-15
Fallaci refers to this as a “prayer,” and indeed we might see this as a prayer or a catechism of the Space Age — not a belief, not merely belief, but an imperative ever-present in the hearts and minds of those who have fully imbibed the spirit of the age and who seek to carry that spirit forward with evangelical fervor, proselytizing to the masses and bringing them to the True Faith through purity of will and vision — another way of saying naïveté.
Do the clever animals have to die? No, not yet. Not if they are clever enough to move on to another planet, another star, another galaxy. Not if they are clever enough to change themselves so that, when the changed conditions of the universe in which they exist no longer allow the lives of clever animals to continue, what the clever animals have achieved can be preserved in some other way, and they themselves can be preserved in another form.
. . . . .
. . . . .
. . . . .
. . . . .
. . . . .
9 August 2012
I can remember the first time that I came to realize that history is a powerful tool for conveying an interpretation. History isn’t just an account of the past, a chronicle of names, dates, and places, that becomes distorted only when the facts are selected and organized according to some idea that was no part of the facts as they occurred. History is always a selection of past facts, and always organized according to some idea or other. No history can be complete and include all facts, so every history is partial, and a partial selection of relevant facts means that there must be some principle of selection. It is this principle of selection that is the idea governing even the most objective of histories.
This realization that history is always an interpretation came to me when I was writing extensively on the history of logic (some time in the early 1990s, I think). This may seem an unlikely point of origin for an essentially political realization, but the history of logic, no less than the history of princes and thrones and battles, is a human, all-too-human story, with its distinctive protagonists who each put forward their particular version of the events that make up the history of logic, versions which in the most tendentious accounts culminate in the work of the individual formulating the narrative.
What is true for logic is true in spades for the histories of less abstract and more human, all-too-human stories. The narratives we rely on to orientate ourselves within the world — narratives of our own personal history, narratives of our families, narratives of our communities, nation-states, cultures, civilizations, and species — are interpretations of events even when every event incorporated in the narrative is objectively and unproblematically true. Meaning and value are given to facts and events when they are made part of a story that has meaning and value for those who create stories, those who transmit stories, and those who listen to stories.
Traditional narrative history tells a story; when you begin a story, you already know what kind of story you’re going to tell — whether it’s a romance or a comedy or a tragedy — since for any of these genres a successful telling of the story requires that the genre be “set up” in the very first lines of the tale. This has been made particularly clear by Hayden White’s detailed typology of narratives in his book Metahistory, in which he sedulously distinguishes modes of emplotment, argumentation and ideology.
Even while traditional narrative history has continued to dominate popular historical writing, academic historiography has moved ever further away from narrative models of historical exposition. In several posts I have mentioned the influence of Braudel and the Annales school of historiography, which, influenced by mid-century structuralism on the European continent (think Claude Lévi-Strauss), brought a much more “scientific” approach to writing history. Braudel’s writing is so accomplished that we scarcely notice he is writing more as a scientist than an historian, but this development was only to continue and to escalate as scientific historiography migrated to the New World and had the resources of Big Science upon which to draw.
While scientific historiography sets the gold standard for objectivity and the veracity of the facts employed, science writers tend to be much less sophisticated and subtle writers than traditional historians, so when the inevitable popularizations of ideas in the vanguard of science emerge they tend to be penned with the kind of naïve optimism one would expect of the Enlightenment, with a generous admixture of theological posturing and ham-handed moralizing (I have briefly addressed the latter two in Higgs: what was left unsaid). The result is that when scientific historiography enters the marketplace of ideas, it, too, is freighted with meanings and values that are independent of the facts presented, although the scientific framework of the discovery and exposition of the facts sometimes conceals the moral message.
Well, none of this should really be new to any of us. Any sophisticated reader is already aware of the cautions I have formulated above about interpretations versus facts, and already in the nineteenth century Nietzsche put the whole matter in a particularly unambiguous formulation when he said that, “Against that positivism which stops before phenomena, saying ‘there are only facts,’ I should say: no, it is precisely facts that do not exist, only interpretations.” Nevertheless, my recent reflections have once again impressed me with the importance of this observation.
I have mentioned in several posts how much Sartre’s lecture Existentialism is a Humanism has influenced my thinking over the years. I was reflecting on this again recently, and the lesson that I took away from this most recent review was the importance of taking responsibility for our interpretations, including if not especially our interpretations of history.
Here is a passage from Sartre that I quoted previously in Of moral choices and existential choices, in which Sartre has just told a story of how a student came to him to ask whether he should stay at home to be a comfort to his mother or if he should leave to join the resistance:
“…I can neither seek within myself for an authentic impulse to action, nor can I expect, from some ethic, formulae that will enable me to act. You may say that the youth did, at least, go to a professor to ask for advice. But if you seek counsel — from a priest, for example — you have selected that priest; and at bottom you already knew, more or less, what he would advise. In other words, to choose an adviser is nevertheless to commit oneself by that choice. If you are a Christian, you will say, consult a priest; but there are collaborationists, priests who are resisters and priests who wait for the tide to turn: which will you choose? Had this young man chosen a priest of the resistance, or one of the collaboration, he would have decided beforehand the kind of advice he was to receive. Similarly, in coming to me, he knew what advice I should give him, and I had but one reply to make. You are free, therefore choose, that is to say, invent. No rule of general morality can show you what you ought to do: no signs are vouchsafed in this world.”
Jean-Paul Sartre, Existentialism is a Humanism
By concluding this passage with, “no signs are vouchsafed in this world,” Sartre is not only saying that each must take responsibility for explicit decisions and actions, but also for our identification of signs and what we make of them. Contrary to Sartre’s declaration of the absence of signs, I think that most people do sincerely believe that signs are vouchsafed in this world. I have come to think of this belief in signs as a way to avoid responsibility for one’s interpretations. If one says, e.g., “a rainbow appeared in the sky as I was contemplating suicide, and I realized that this was a sign from on high that I should not kill myself,” one is surrendering one’s autonomy even while acting — the moral equivalent of keeping one’s cake and eating it too.
I don’t think that most people have a problem with the explicit judgments they formulate when they say things like, “I think…” or “I believe…” or “I have decided to…” since these are clear statements of personal responsibility for one’s decisions and actions. But interpretations can be much more subtle — in some cases, perhaps in many cases, interpretations are so subtle that they are difficult to understand as interpretations rather than as cold, hard facts.
Individuals who have never had their Weltanschauung called into question are particularly vulnerable to giving their interpretations an air of facticity. In so far as travel can place an individual into a situation in which everything formerly taken for granted is questioned (something I touched upon in Being the Other), one of the virtues of travel is to make one aware of one’s Weltanschauung, and to know that there is nothing necessary about the particular interpretations that one gives to particular states of affairs.
Of course, travel in and of itself is not enough. Some people, when they travel, surround themselves with their compatriots so that they are never exposed to an unaccustomed world without the support of like-minded fellows. People do exactly the same thing without bothering to travel: i.e., always surrounding themselves with like-minded individuals and never placing themselves in a situation in which their beliefs can be radically questioned — or even gently questioned.
Thus we see that the work of taking responsibility for our interpretations is the painful work of self-knowledge, even to the point of self-alienation. For this, few have the requisite hardihood. But we must try.
For those who do possess the intestinal fortitude for self-examination that reveals interpretations as interpretations, stripping them of their spurious facticity, there is an added aesthetic benefit: it is from this point of view, seeing the world for what it is, that we are able to see, and to forget the name of the thing one sees.
The uninterpreted world — what Husserl called the prepredicative world — is an ideal, and as an ideal it is likely to be elusive and difficult of accomplishment. But that is no argument against it. As Spinoza said, all things excellent are as difficult as they are rare. Taking full responsibility for our interpretations is both difficult and rare, but it is a noble ideal to pursue.
. . . . .
. . . . .
. . . . .
8 April 2012
Easter is, at bottom, a holiday that is about ideas. That is one reason that I am fascinated by Easter, and why over the past few years I’ve written many posts about Easter and the Lenten season.
I have been at pains to point out in earlier posts that spring celebrations of the renewal of life seem to be as old as our species, and with this in mind it sounds more than a little odd that I should say that Easter is about ideas, except that Easter has become about ideas because it has been so repeatedly exapted throughout human history. As ideas are the currency of human interaction within civilization, the exaptation of Easter since the advent of civilization has meant the construction of an ideological exaptation mechanism of sufficient power to displace earlier celebrations with their established institutions.
It was necessary to overlay a Christian idea on a Pagan idea, and the Pagan idea was overlain on an even more ancient idea — if we take the stages of savagery, barbarism, and civilization (which I recently discussed in Savagery, Barbarism, and Civilization) as our model for the development of the forms by which we conceptualize life, we can see the Christian idea as an idea of medieval European civilization, the Greco-Roman idea that was exapted by Christianity as an idea of the civilization of classical antiquity, the earlier idea exapted by Greco-Roman civilization as an idea of barbarism, and the earlier idea exapted by the barbaric idea as the savage idea — and now we have made it all the way back to “the savage mind” of Lévi-Strauss.
In Christian civilization (i.e., Western civilization), Christmas and Thanksgiving have become more-or-less easily assimilated to the family gatherings that have become identified with these holidays, but Easter does not involve the kind of travel season that we find at Christmas and Thanksgiving. Perhaps this is because everyone has just had their Spring Break and is not in a position to travel again immediately for a holiday family gathering.
At Thanksgiving there is the preparation and consumption of a large meal, while at Christmas there is the trimming of the tree and the gift exchange. In a large family these can be undertakings of significant proportions. While from a devotional standpoint these family-based rituals are not central to the holiday, from a sociological standpoint they are in fact central, and if we could quantify the amount of time people spend thinking about, planning, and preparing for the practical consequences of Thanksgiving and Christmas, and compare this to the time spent thinking about, planning, and preparing for the devotional significance of these holidays, it would probably be pretty obvious which concerns dominate the holidays.
In such cases as Thanksgiving and Christmas, we could say that holidays become exapted by the infrastructure of celebration. The infrastructure of familial celebration can, in turn, become exapted by the practical demands seemingly imposed by major holidays. In one of my least-read posts, Personal Dystopias, I tried to show how these socio-familial concerns can get out of hand and reduce or entirely eliminate any joy felt in the holiday or celebrated event. I believe that this is more common than is generally recognized.
This is, of course, the Protestant in me speaking: for those of a Protestant temperament, the “real” celebration is rigorously defined in devotional terms, and anything that detracts from the intensity of devotional observations is an impiety and indeed an impurity of the will. But knowing that Easter (like most holidays) has layer upon layer of sedimented meaning, and that the ideational content of devotional observance may well be the most superficial “meaning” of the holiday, compels us to respect the oldest and most continuous meaning of the celebration, which is the celebration itself. This recognition, however, of a continuity to the celebration that transcends the changing meanings that have been associated with the holiday is itself an idea — another perspective that one might bring to the celebration.
The history of Easter is the history of the exaptations of a holiday continuously celebrated since human beings have been celebrating holidays, and as civilization has added to the complexity of the forms by which we conceptualize life, the history of the exaptations of Easter has become a history of the exaptation of ideas.
. . . . .
Happy Easter… whatever it happens to mean to you!
. . . . .
. . . . .
. . . . .
1 April 2012
How often does Palm Sunday fall on April Fools’ Day? It must happen with a certain (predictable) regularity, I would guess, since April Fools’ Day falls within what we might call the parameters of Easter. No doubt someone, somewhere, has made the calculation and can give a definite answer to the question. Since Easter is a moveable feast, and it carries all of Passiontide with it, including Palm Sunday and Good Friday, all these days move around the Gregorian calendar like wanderers seeking a place to rest.
Easter must be calculated, since it falls on the first Sunday after the full moon following the vernal equinox in the northern hemisphere. And Easter is the still point in the turning world of moveable feasts in the Christian calendar, because all the other moveable feasts are calculated in number of days before or after Easter. The calculation of the date of Easter is an astronomical task that requires some expertise. Copernicus was among the few in early modern Europe who possessed the expertise to arrive at a better calculation.
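The rule just stated can be made concrete. The sketch below uses the anonymous Gregorian computus (often called the Meeus/Jones/Butcher algorithm) — a modern formulation of the ecclesiastical tables, not anything Copernicus himself worked with — to return the Gregorian date of Easter; subtracting seven days then gives Palm Sunday, which lets anyone answer the opening question for a given year.

```python
from datetime import date, timedelta

def easter(year: int) -> date:
    """Gregorian Easter by the anonymous (Meeus/Jones/Butcher) computus."""
    a = year % 19                       # position in the 19-year Metonic cycle
    b, c = divmod(year, 100)            # century and year-within-century
    d, e = divmod(b, 4)
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30  # epact: age of the ecclesiastical moon
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7
    m = (a + 11 * h + 22 * l) // 451    # correction for late full moons
    month, day = divmod(h + l - 7 * m + 114, 31)
    return date(year, month, day + 1)

# Palm Sunday is one week before Easter.
print(easter(2012))                      # 2012-04-08
print(easter(2012) - timedelta(days=7))  # 2012-04-01, i.e. April Fools' Day
```

A comprehension such as `[y for y in range(1900, 2100) if easter(y) == date(y, 4, 8)]` then counts the years in which Palm Sunday falls on 01 April, since that happens exactly when Easter falls on 08 April.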
The accumulating errors of the Julian calendar had, over the centuries, contributed to confusion and unnecessary complexity in the calculation of dates. It was possible to continue with the old system, but the whole process could be streamlined by a root-and-branch rethinking. This is what Copernicus provided. He did not limit himself to local and parochial concerns, but attempted to get the cosmology right so that it agreed with astronomical observations, and this in turn could bring the calendar into accord with both cosmology and astronomy.
Copernicus, like Darwin, long delayed the publication of his book De revolutionibus orbium coelestium, not least because he was, like Darwin, concerned about the reaction it would cause. The story is that Copernicus received a copy of the first printed edition of his book on his deathbed, roused himself from a stroke-induced torpor long enough to recognize his life’s work, and then passed away. The fears of both men were justified.
Copernicus’ calendar reform had some unintended consequences. This is perhaps the ultimate April Fools’ joke. While it is true that Copernicus himself completed only the first step from geocentric cosmology to heliocentric cosmology, and that we have since gone far beyond heliocentric cosmology even to the point that today any center of the world at all is questionable, it is probably also true that Copernicus’ reform extended as far as cosmological knowledge extended in his time. In its context, the Copernican revolution was radical and complete.
Now we know that neither earth nor sun nor galaxy nor galactic cluster nor super cluster nor the universe itself is the center of anything. There is no center — or, rather, everywhere is the center, which amounts to the same thing, and this coincides with the perennial insights of mysticism and mythology.
The Copernican revolution is still unfolding. The slow, gradual, cumulative process of attaining Copernican conceptions continues today. It is worth noting that the revolution began at the rarefied intellectual level of cosmology, so that a Copernican conception of cosmology itself preceded a Copernican conception of any of the special sciences. Indeed, in Eo-, Exo-, Astro- I recently argued that we are only now able to formulate Copernican conceptions of the sciences, which have, to date, received mostly geocentric formulations.
The calculation of the date of Easter turned out to be one of the truly deconstructive episodes in Western history, when the unraveling of what had seemed to be a single intellectual thread eventually meant the unraveling of a world entire. Copernicus was the first deconstructionist.
. . . . .
. . . . .
. . . . .
24 April 2011
The other kind of visualization
Last year in Virtuoso of Visualization I discussed the innovative visual aids employed by Hans Rosling to better explain complex economic issues. In several other posts I have discussed spatiality, especially in relation to fractals, which also involves the use of visual aids to improve understanding (for example, in Fractals and the Banach-Tarski Paradox). In these discussions, visualization means employing geometrical intuition as an aid in the exposition of an idea. We are, after all, highly visually-oriented organisms, so that an appeal through visualization is more likely to be effective than any appeal that neglects the role of images in our thinking.
However, to speak of visualization simpliciter, without further qualification, risks misunderstanding because of the role that visualization has come to have in popular culture. I have been using “visualization” in a sense almost entirely divorced from the significance of the term in popular culture, but today I want to invoke this other sense of visualization.
The other kind of visualization, sometimes called creative visualization, has a book devoted to it, Creative Visualization: Use the Power of Your Imagination to Create What You Want in Your Life by Shakti Gawain, who describes creative visualization in this way:
“Creative visualization is magic in the truest and highest meaning of the word. It involves understanding and aligning yourself with the natural principles that govern the workings of our universe, and learning to use these principles in the most conscious and creative way.”
On a more practical level, the Wikipedia article on creative visualization characterizes it in this way:
“Creative visualization is the technique of using one’s imagination to visualize specific behaviors or events occurring in one’s life.”
The Wikipedia article cites Wallace Wattles (1860–1911) and his book The Science of Getting Rich as an important antecedent to creative visualization as it is known today. However, we can go back much farther than that.
Early modern visualization
We can go back, in fact, to the earliest part of the early modern period in European history, and particularly we should consider the work of Ignatius Loyola, the Spiritual Exercises.
Loyola’s Spiritual Exercises is one of the most remarkable texts from the western religious tradition. It is essentially a handbook for spiritual directors to guide the meditations of those of whom they have charge. The Spiritual Exercises have all the concrete detail and specificity that we expect from creative visualization. In the Fifth Meditation of the First Week the saint tells us in detail how to imagine “the length, breadth, and depth of hell.”
I suspect that these spiritual exercises are quite effective if followed faithfully. The saint, before he was a saint, was a military man, and no doubt of a practical, concrete turn of mind. Exercises like this helped him, and those whose spiritual development he influenced, to focus on practical, concrete images of religious devotion. It is a bit like the visions of Julian of Norwich shorn of the mystical accretions.
The religious life of Ignatius Loyola is sometimes referred to as an intensely interior spirituality. He shares this quality with his contemporary, Saint Teresa of Avila. One of her many books is called The Interior Castle, and it too gives detailed, concrete images to religious meditation. These two saints are sometimes treated as exemplars of saintly, selfless piety, but what we already see in these texts — the Spiritual Exercises and The Interior Castle — is the emergence of a robust inner life that is the condition for the emergence of individuality — that distinctive non-institutional institution of Western civilization.
Descartes, who died almost exactly a hundred years after Ignatius Loyola, and who is widely seen as a philosophical exemplar of individualism, proving the existence of the world entire on the basis of his subjective assertion cogito ergo sum, attended a Jesuit school. It is tempting to suppose that Descartes, while in school, was directed through a course of spiritual meditation based on the Spiritual Exercises. In the philosophical works of Descartes, especially the early Rules for the Direction of the Mind — which we might call, analogously to Saint Ignatius Loyola, Intellectual Exercises — we find the same orientation to interiority and the same careful, methodical approach.
It was the keen, poignant subjectivity of figures like Saint Teresa and Saint Ignatius Loyola that laid the groundwork and made possible the rigorous subjectivity of Descartes. Philosophical individualism has its roots in an intense devotional individualism, conceived selflessly, but individualistic in effect.
A guided meditation on Easter
Ignatius Loyola’s suggestions for the visualization of Easter do not appear until near the end of his Spiritual Exercises, and, in keeping with the book’s handbook-like character, the instructions are much less detailed than those we find in the earlier sections of the book. If we take the earlier part of the book as our guide — which is certainly what its author intended — it is not difficult to fill in the details ourselves. Indeed, the saint himself wrote:
“In the following Contemplations let one go on through all the Mysteries of the Resurrection, in the manner which follows below, up to the Ascension inclusive, taking and keeping in the rest the same form and manner in all the Week of the Resurrection which was taken in all the Week of the Passion.”
So it is the “form and manner” that we derive from the earlier meditations, and if we follow this method — what we might call, analogously to Descartes, Rules for the Direction of Spirit — we will arrive at a concrete and detailed appreciation of the Resurrection no less than of any other events not directly accessible to us.
Descartes is well known for his negative view of history. In his Discourse on the Method for Rightly Conducting the Reason and Seeking for Truth in the Sciences (usually simply called the Discourse on Method) Descartes wrote:
“…I believed I had already given enough time to languages and even to reading ancient books as well, and to their histories and stories. For talking with those from other ages is almost the same as traveling. It is good to know something about the customs of various people, so that we can judge our own more sensibly and do not think everything different from our own ways ridiculous and irrational, as those who have seen nothing are accustomed to do. But when one spends too much time traveling, one finally becomes a stranger in one’s own country, and when one is too curious about things which went on in past ages, one usually lives in considerable ignorance about what goes on in this one. In addition, fables make us imagine several totally impossible events as possible, and even the most faithful histories, if they neither change nor increase the importance of things to make them more worth reading, at the very least almost always omit the most menial and less admirable circumstances, with the result that what is left in does not depict the truth. Hence, those who regulate their habits by the examples which they derive from these histories are prone to fall into the extravagances of the knights of our romances and to dream up projects which exceed their powers.”
It is remarkable how Descartes’ criticism of traditional historiography should take the form of noting the absence of “the most menial and less admirable circumstances,” which is precisely what the recent movement determined to write history “from the bottom up” has attempted to address. The implicit Cartesian historiography that would emerge from addressing Descartes’ criticisms would also presumably embody the interiority and subjectivity that marked Cartesian thought, and which seems to fall in the same tradition as the intense interiority of Ignatius Loyola and Teresa of Avila.
In several posts (for example, The End of the End of the World) I have mentioned Collingwood’s conception of an a priori historical imagination, which is, in Collingwood’s words, an “activity which, bridging the gaps between what our authorities tell us, gives the historical narrative or description its continuity.” This exercise of the a priori historical imagination, in a devotional rather than an historical context, is what Ignatius Loyola was formulating in his Spiritual Exercises, and it embodies the implicit historiography of Cartesianism.
In a more mundane and familiar context, images furnish a point of departure for the imagination, providing the individual a window through which to view the past. Icons are just such images for specifically devotional contexts. Kenneth Clark, in his Civilisation: A Personal View, remarked that the Middle Ages constituted a Civilization of the Image, which was in time superseded in northern Europe by a Civilization of the Word, which began in an orgy of iconoclasm. In the early modern period, Ignatius Loyola and Teresa of Avila still belonged unambiguously to the Civilization of the Image, and it is by way of the appropriation of the image that interiority is directed to the outer world, as it is through the creation of images that the world is in turn subordinated to the interiority of the soul.
Interiority and exteriority exist in a relationship of escalating co-evolution, so that subjectivity must supplement the world as much as the world must supplement subjectivity.
. . . . .
. . . . .
17 April 2011
Palm Sunday is upon us again. There is a traditional Swedish holiday song that tells us that Christmas joy will last ’til Easter, and Easter joy will last ’til Christmas. For the ecclesiastical calendar, Christmas and Easter have been the devotional centers around which the liturgical year revolves. Each holiday has its attendant holiday season — Advent and Yuletide for Christmas and Passiontide and Holy Week for Easter — which is itself broken into an expectant before and a fulfillment that follows.
There is an admirable symmetry to these holiday seasons, with expectation and fulfillment of life followed by the expectation and fulfillment of death (and rebirth). One would have difficulty formulating a more profound account and celebration of human life if one tried to do so consciously and explicitly. Instead, this emerged from Western history unconsciously and implicitly, and eventually came to dominate the year in Christendom.
Last year in The Devotional Meaning of Palm Sunday I suggested that Christ’s entry into Jerusalem on Palm Sunday was an “upping of the ante” and represented the apotheosis of what some historians now call “The Jesus Movement” while Christ was still the living figurehead of his movement. The pace of the narrative of Christ’s ministry picks up with Palm Sunday and culminates on Good Friday with Christ’s trial and crucifixion.
Thus the liturgical calendar, not only schematically but also in terms of narrative, is quite perfectly adapted to the human need for a chronology by which to celebrate. The year before last, in A Meditation on the Occasion of Palm Sunday, I suggested that we can…
“push our a priori imagination to the limits of its possibility in attempting to understand points of view distinct from that egocentric point of view native and natural to each one of us. To this end, thinking through history from both directions, thinking of the present in terms of the past and the past in terms of the present, is one place to start.”
This means not only extrapolating our intellectual innovations backward through history and re-interpreting the past in terms of present knowledge, but also the extrapolation of past traditions into the future and interpreting our epistemic and technological innovations, and all the unexpected unknown unknowns of history in terms of our past.
The maturation of agricultural civilization in an Axial Age that provided to us a robust structure for human life is an achievement for which we should have proper reverence, and something from which we can learn in perpetuity, even if the axialization of industrial civilization eventually produces a tradition that supersedes and supplants that of agricultural civilization.
In order to do this, in order to draw fully upon the past, we need to understand the past, and most of all we need to formulate a comprehensive theoretical understanding of past traditions. I made a first essay in this direction last year in my Theses on Easter, which I hope at some point to revise, expand, and extend. For the moment, however, I merely remind myself of the need to follow up on this.
. . . . .
. . . . .
12 March 2011
The scale of destruction and suffering caused by the earthquake and tsunami that has just struck the Tōhoku region of northeastern Japan (2011 Sendai earthquake and tsunami — 東北地方太平洋沖地震) cannot but remind us of other natural disasters, some of them in the recent past, and some long past. It is likely that the Sumatra-Andaman earthquake of 26 December 2004 was the worst natural disaster that has occurred (or will occur) in my lifetime, in terms of total casualties, with almost a quarter million dead, most as a result of the tsunami that followed the earthquake.
The most famous earthquake and tsunami in Western history is the disaster that struck Lisbon in 1755. I previously mentioned this in The Rational Reconstruction of Cities. I mentioned in that post Nicholas Shrady’s book, The Last Day: Wrath, Ruin, and Reason in the Great Lisbon Earthquake of 1755, which presented in some detail the intellectual controversy that emerged following the disaster. Shrady cited the work of Gabriel Malagrida, whose 1756 pamphlet, “An Opinion on the True Cause of the Earthquake” (“Juizo da verdadeira causa do terramoto”), argued that the disaster in Lisbon was divine retribution for the sins of the people of Lisbon:
“Learn, Oh Lisbon, that the destroyers of our houses, palaces, churches, and convents, the cause of the death of so many people and of the flames that devoured such vast treasures, are your abominable sins, and not comets, stars, vapours and exhalations, and similar natural phenomena. Tragic Lisbon is now a mound of ruins. Would that it were less difficult to think of some method of restoring the place; but it has been abandoned, and the refugees from the city live in despair. As for the dead, what a great harvest of sinful souls such disasters send to Hell! It is scandalous to pretend the earthquake was just a natural event, for if that be true, there is no need to repent and to try to avert the wrath of God, and not even the Devil himself could invent a false idea more likely to lead us all to irreparable ruin. Holy people had prophesied the earthquake was coming, yet the city continued in its sinful ways without a care for the future. Now, indeed, the case of Lisbon is desperate. It is necessary to devote all our strength and purpose to the task of repentance. Would to God we could see as much determination and fervour for this necessary exercise as are devoted to the erection of huts and new buildings! Does being billeted in the country outside the city areas put us outside the jurisdiction of God? God undoubtedly desires to exercise His love and mercy, but be sure that wherever we are, He is watching us, scourge in hand.”
There are probably people who continue to think such things today, and a few who say so in private, but this is not the dominant narrative today. We certainly bear traces of a past dominated by an eschatological conception of history, but civilization has largely moved beyond this. Now, whether we like it or not, or whether we know it or not, Occidental civilization today embodies a naturalistic conception of history. The transition from medievalism to modernism was also a transition from an eschatological Weltanschauung to a naturalistic Weltanschauung. This does not mean that we have “solved” the problems of an earlier era, but only that we have moved on to other problems.
As I said above, we bear the traces of our history, and some of us bear these traces more heavily than others. Recently I have been listening to Bart D. Ehrman’s book God’s Problem: How the Bible Fails to Answer Our Most Important Question — Why We Suffer. Ehrman is a serious scholar of early Christianity, and, by his own account, someone who started out as a sincerely and devoutly believing Christian, only to find that he had lost his faith after many years of study and confronting the problem of suffering.
“Over the years I’ve talked with a lot of people about issues pertaining to suffering, and I am struck by the kinds of reactions I get.”
After briefly discussing avoidance of the issue of suffering altogether and those responses to the problem of suffering that he considers to be answers that are “too pat” to be satisfying, Ehrman moves on to a third category of responses to suffering that he cannot accept:
“Other people — including some of my brilliant friends — realize why it’s a religious problem for me but don’t see it as a problem for themselves. In its most nuanced form (and for these friends everything is extremely nuanced), this view is that religious faith is not an intellectualizing system for explaining everything. Faith is a mystery and an experience of the divine in the world, not a solution to a set of problems.” (p. 15)
In Ehrman’s formulation of this view that he cannot accept, there is an echo of a famous passage from Hume, who wrote in his essay on miracles:
“…we may conclude, that the Christian Religion not only was at first attended with miracles, but even at this day cannot be believed by any reasonable person without one. Mere reason is insufficient to convince us of its veracity: and whoever is moved by Faith to assent to it, is conscious of a continued miracle in his own person, which subverts all the principles of his understanding, and gives him a determination to believe what is most contrary to custom and experience.”
This passage from Hume has become a standard point of reference not only for those who followed Hume and formulated the naturalistic conception of the world that dominates our thinking today, but also for fideists who see in it the last remaining legitimate form of an uncompromising expression of faith: faith is a miracle, and therefore proof in and of itself of what faith wants to believe. Thus this passage from Hume is, at once, both purple passage and locus classicus.
Ehrman writes that he respects this view and sometimes wishes that he could share it, but ultimately he cannot share it. He writes, “The God that I once believed in was a God who was active in this world.” He makes it clear that his confrontation with suffering was crucial to his loss of faith, and he further makes explicit that his remarks about his faith are all in the past tense, so there is no question of an equivocation in his belief; he is not about to say that he will change his mind if only someone can show him an intellectually legitimate way to formulate the problem of human suffering.
Ehrman is living a conundrum. He has abandoned the faith that the world has a particular metaphysical and eschatological structure, but he hangs on to the idea that our suffering must be expressed in eschatological terms, and our response to that suffering must also be expressed in eschatological terms. If it is not so expressed, according to Ehrman, it is avoidance, too pat, or a naturalism that he cannot share. But he has nothing to offer in place of naturalism, except strong feelings of its inadequacy.
I have encountered this attitude elsewhere, and when I thought about it I realized that I had written about it. In my post Cosmic War: An Eschatological Conception I wrote:
“Because a cosmic war does not occur in a cosmic vacuum, but it occurs in an overall conception of the world, the grievances too occur within this overall conception of history. If we attempt to ameliorate grievances formulated in an eschatological context with utilitarian and pragmatic means, no matter what we do it will never be enough, and never be right. An eschatological solution is wanted to grievances understood eschatologically, and that is why, in at least some cases, religious militants turn to the idea of cosmic war. Only a cosmic war can truly address cosmic grievances.”
Ehrman does not express himself in terms of war, but there is a close parallel between those who reject utilitarian and pragmatic assistance because it does not come wrapped up like an eschatological care package, and those who cannot accept a naturalistic conception of human suffering because it does not answer their deepest needs and their longing to do justice to a noble and honorable conception of man — a conception rooted in an eschatological conception of history that is no longer defensible in rational terms.
For Ehrman, human suffering is a cosmic grievance, and a cosmic grievance can only be addressed by a cosmic remedy. I don’t think that Ehrman is alone in this. Indeed, what makes his view interesting is that he is able to give eloquent expression to something that is sharply and poignantly felt by many who do not have the means to express themselves so well.
The question, then, as I see it, is not how to give the proper cosmic response to the cosmically formulated dilemma, but rather this: is our modern naturalism merely a superficial overlay, so that the vital forces that drive life remain profoundly and unalterably eschatological, or is the kind of attitude Ehrman expresses typical of a transitional age, indicating that we still have a long way to go in coming to terms with the naturalism formulated by visionaries like Hume? There are, of course, other possibilities as well — and interesting possibilities at that. The only reason I am going to bring this post to an end at this time is not because I am satisfied with what I have said, but only because I am tired.
. . . . .
. . . . .
25 November 2010
Human nature is not well known for spontaneous expressions of gratitude. To give thanks is, on the contrary, a thing of sufficient rarity that we usually know it (and usually remember it) when we receive a sincere and authentic “Thank you.” William Blake wrote that, “Gratitude is heaven itself,” and if we know anything, we certainly know that the City of Man in this world is not to be mistaken for the City of God in the other world. Thus it is all the more astonishing that there should be a national day of giving thanks.
The Spanish for Thanksgiving Day is El Día de Acción de Gracias. The literal translation of this would be “The Day of an Act of Thanks,” but we can easily see that the Spanish for “thanks” — “gracias” — is etymologically related to the same root (the Latin gratia) from which we get the word “grace” in English. Thus it would be fair (at least in a sense) to translate the Spanish as An Act of Grace. This is a poetic and even a beautiful formulation.
What ought a national day of acts of grace to consist of? Theologically we know that grace is an unmerited gift. But a gift requires both a giver and a receiver. Thus for every act of grace understood as the bequest of a gift, there is at the same time an act of grace understood as the receipt of a gift. Thus grace, presumably among the purest of the theological virtues, has a transactional character. In the language of phenomenology, grace has an intentional structure: there can be no giver without a receiver, and no receiver without a giver. Thus grace implies a community.
The transactional nature of grace makes it resemble what economists call Say’s Law. Say’s Law of Markets was named after the French economist Jean-Baptiste Say, who, however, was not the first to formulate it. Here is Say’s formulation of the economic law that bears his name:
It is worthwhile to remark that a product is no sooner created than it, from that instant, affords a market for other products to the full extent of its own value. When the producer has put the finishing hand to his product, he is most anxious to sell it immediately, lest its value should diminish in his hands. Nor is he less anxious to dispose of the money he may get for it; for the value of money is also perishable. But the only way of getting rid of money is in the purchase of some product or other. Thus the mere circumstance of creation of one product immediately opens a vent for other products.
Say’s law is much more familiar to us in its aphoristic forms, such as, “Supply creates its own demand,” which also implies that, “Demand creates its own supply.” The reasoning behind this is that every purchase is a sale, and every sale is a purchase. Thus there is an intentionality of economic activity that mirrors the intentionality of grace, and equally implies a community. The intentionality of economic activity implies an economic community; the intentionality of grace implies a moral community. We are, all of us, embedded in both communities. It is a judgment upon our character how far these two communities overlap and intersect, to employ a Wittgensteinian locution to express the family resemblance of economic exchange and gift exchange.
However, the exchange mechanism of Say’s Law would not appear to apply to the transactions of grace, since the transactions of grace embody an asymmetry. There can be no gratitude without a prior act of generosity, but there can be an act of generosity that is not followed by an act of gratitude. Thus it would seem that there can be a general glut of generosity, but not of gratitude. No doubt this is precisely why Blake said that “Gratitude is heaven itself.”
In The Moral Truth of Re-Gifting I wrote, “What one gives is a function of what one can give, and there are conditions for the possibility of giving. One can only give that which one possesses.” All giving, I now realize, is a form of re-gifting, and the transactional nature of grace raises the act of re-gifting to a higher level of complexity. But however complex the hierarchy of giving and receiving becomes, it boils down to a few simple moral truths. Among these we may count this moral truth: One can only give gratitude in return for generosity if one has gratitude to give. And to give the gift of gratitude is perhaps the gift of greater rarity.
. . . . .
. . . . .
23 June 2010
In my forty-five years I have learned at least one moral truth, and it is this:
The more one gives, the more one is fulfilled; the more one demands, the more empty one is.
I will not attempt to give an exposition, much less an explanation of this that I consider to be a moral truth, but will for the moment leave it in its aphoristic form, for the interested reader to consider and come to his or her own understanding of it. A moral truth won in the world is worth a thousand unheeded admonitions from others, and one form of the winning of a truth is coming to one’s own understanding of it.
Since I am not going to attempt an exposition, what am I going to do with this that I have identified as a moral truth? If one agrees with me that this is a moral truth, and that generosity is the way to (naturalistic) beatitude, then the question of the good life becomes: what will one give, and how?
What one gives is a function of what one can give, and there are conditions for the possibility of giving. One can only give that which one possesses. What does the individual possess? Our material possessions are the result of chance, a matter of the birth lottery. While we would do well to be generous in a material sense, there is a more profound sense of generosity that goes beyond the material. Ultimately what we have to give is the gift we have ourselves been given.
It is the gift that the individual possesses that he is empowered to give. Perhaps there are individuals born with no gifts at all, but certainly the greater part of humanity consists of individuals, each of whom possesses some gift, some unique talent, some utterly unexpected, unpredictable, and unprecedented ability. It is this gift that the individual possesses that is the most valuable thing that the individual can give. What the source of an individual’s gift is, no one can say. But it is a gift, and in returning this gift to the world by giving generously of oneself, one is essentially re-gifting. This is the moral truth of re-gifting.
The spiral of generosity is a virtuous circle; the world is improved by it. The spiral of demand is a vicious circle, and the world is degraded by it. By relating oneself to the world in the attitude of generosity, one makes the presumption of participating in the world on the basis of giving, rather than on the basis of demanding, and this is a far better way of establishing one’s relationship to the world. If you remain skeptical, make an experiment of it: spend a week (or a month, or a year) only making demands, and then assess your life, and honestly look at the consequences of your demands. Were they fulfilled? Were you fulfilled by having your demands met by others? Then spend a week (or a month, or a year) only practicing generosity (to the extent that this is practically possible). Again, assess your life, and honestly ask yourself the outcome of your experience. Have you received more from life from demanding or from giving?
. . . . .
. . . . .