Saturday


Manifest Destiny personified

In my last post, Taking Responsibility for Our Interpretations, I wanted to emphasize how both individuals and political wholes (social groups) seek to abdicate their responsibilities by cloaking them in a specious facticity, so that an interpretation of the world is treated as if it were something more than or other than a mere interpretation. One of the most common ways of doing this in relation to history is to formulate an interpretation of history, whether personal or social, as “destiny.”

We are all painfully familiar with loaded terms from historiography like “destiny,” “progress,” “inevitability,” and the like. We find them impartially on the left and the right. In fact, the most strongly ideologically motivated institutions make a practice of most grievously distorting history to fit a particular model that flatters the ideology in question. All one need do is recall the utopian plans of communism and Nazism from the previous century to understand the extent to which visions of the past and the future supposedly inherent in the very nature of things issue in dystopian consequences.

I realize that I’ve engaged with this issue recently in slightly different terms. In Gibbon, Sartre, and the Eurozone I formulated two principles that I called Gibbon’s Principle and Sartre’s Principle. Gibbon’s Principle is that the authority of a social whole is inalienable. Sartre’s Principle is that the authority of the individual is inalienable. In other words, even if a social whole or an individual engages in the pretense of surrendering its autonomy, this is an act of bad faith (mauvaise foi) because the social whole or the individual retains the autonomy to act even as it denies this autonomy to itself. Gibbon’s Principle as applied to history means taking responsibility for the history of social wholes; Sartre’s Principle as applied to history means taking responsibility for the individual’s personal history.

It may seem a bit incredible to compare the benign Eurozone to malevolently utopian visions like communism or Nazism, but the narratives employed to defend the Euro — the inevitability of European integration and its historical irreversibility — are on a par with inherentist narratives that make claims upon history that cannot be sustained. In Gibbon, Sartre, and the Eurozone I compared the attempt to make the Eurozone permanent to the Cuban attempt to incorporate its present socio-political regime as a permanent feature of its constitution, which latter I had discussed in The Imperative of Regime Survival.

It is significant in this connection that the US experienced a traumatic challenge to its national claims of permanence in the form of the Civil War. Had I been alive in the 1860s, I suspect that I would have argued that it was utter folly to craft a national constitution that had provisions for adding to the territories of the United States but no provisions for the peaceful secession of regions that no longer desired to be part of the US. Because there were no peaceful provisions for secession, secession took a militant form, which was answered by militancy on the part of those who believed the Union to be indissoluble.

So am I arguing that the Confederates were right? That would certainly put me in an awkward position. If the South had peacefully seceded from the Union, it is entirely possible that the Balkanization of North America would have yielded a map of minor states such as we find in South America (after the breakup of Gran Colombia), though it is equally possible that the fractured Union would have left only two successor states in North America. Counterfactuals are difficult to argue with any kind of confidence precisely because inherentist and essentialist conceptions of history almost never provide an adequate narrative of what happens.

Regardless of what might have happened, what did in fact happen is that the unity of the US was imposed by force of arms, more or less guaranteeing the US a continental land empire without any power able to seriously challenge it in the Western hemisphere. This likely resulted in the US repeatedly intervening in the internecine quarrels of Europe until the US itself took responsibility for European security, eventually winning the Cold War and becoming the dominant world power. None of this was inevitable, but it has been given the air of inevitability by nationalistic narratives of American exceptionalism.

There is a sense in which the Cuban narrative of a permanent revolutionary government and the Eurozone narrative of indissolubility seek to emulate the apparently successful indissolubility revealed by the US national experience. Who, after all, would not want to be the exception to the mutability of all human things?

. . . . .

signature

. . . . .

Grand Strategy Annex

. . . . .


Thursday


Revisiting my old friend Sartre

I can remember the first time I came to realize that history is a powerful tool for conveying an interpretation. History isn’t just an account of the past, a chronicle of names, dates, and places, that only becomes distorted when the facts are selected and organized according to some idea that was no part of the facts as they occurred. History is always a selection of past facts, and it is always organized according to some idea or other. No history can be complete, including all facts; every history is partial, and a partial selection of relevant facts means that there must be some principle of selection, and it is this principle of selection that constitutes the idea governing even the most objective of histories.

This realization that history is always an interpretation came to me when I was writing extensively on the history of logic (some time in the early 1990s, I think). This may seem an unlikely point of origin for an essentially political realization, but the history of logic, no less than the history of princes and thrones and battles, is a human, all-too-human story, with its distinctive protagonists who each put forward their particular version of the events that make up the history of logic, versions which, in the most tendentious accounts, culminate in the work of the very individual formulating the narrative.

What is true for logic is true in spades for the histories of less abstract and more human, all-too-human stories. The narratives we rely on to orientate ourselves within the world — narratives of our own personal history, narratives of our families, narratives of our communities, nation-states, cultures, civilizations, and species — are interpretations of events even when every event incorporated in the narrative is objectively and unproblematically true. Meaning and value are given to facts and events when they are made part of a story that has meaning and value for those who create stories, those who transmit stories, and those who listen to stories.

Traditional narrative history tells a story; when you begin a story, you already know what kind of story you’re going to tell — whether it’s a romance or a comedy or a tragedy — since for any of these genres a successful telling of the story requires that the genre be “set up” in the very first lines of the tale. This has been made particularly clear by Hayden White’s detailed typology of narratives in his book Metahistory, in which he sedulously distinguishes modes of emplotment, argumentation and ideology.

Even while traditional narrative history has continued to dominate popular historical writing, academic historiography has moved ever further away from narrative models of historical exposition. In several posts I have mentioned the influence of Braudel and the Annales school of historiography, which, influenced by mid-century structuralism on the European continent (think Claude Lévi-Strauss), brought a much more “scientific” approach to writing history. Braudel’s writing is so accomplished that we scarcely notice he is writing more as a scientist than an historian, but this development was only to continue and to escalate as scientific historiography migrated to the New World and had the resources of Big Science upon which to draw.

While scientific historiography sets the gold standard in terms of objectivity and the veracity of the facts employed, science writers tend to be much less sophisticated and subtle than traditional historians, so when the inevitable popularizations of ideas in the vanguard of science emerge, they tend to be penned with the kind of naïve optimism one would expect of the Enlightenment, with a generous admixture of theological posturing and ham-handed moralizing (I have briefly addressed the latter two in Higgs: what was left unsaid). The result is that when scientific historiography enters the marketplace of ideas, it, too, is freighted with meanings and values that are independent of the facts presented, although the scientific framework of the discovery and exposition of the facts sometimes conceals the moral message.

Well, none of this should really be new to any of us. Any sophisticated reader is already aware of the cautions I have formulated above about interpretations versus facts, and already in the nineteenth century Nietzsche put the whole matter in a particularly unambiguous formulation when he said that, “Against that positivism which stops before phenomena, saying ‘there are only facts,’ I should say: no, it is precisely facts that do not exist, only interpretations.” Nevertheless, my recent reflections have once again impressed me with the importance of this observation.

I have mentioned in several posts how much Sartre’s lecture Existentialism is a Humanism has influenced my thinking over the years. I was reflecting on this again recently, and the lesson that I took away from this most recent review was the importance of taking responsibility for our interpretations, including if not especially our interpretations of history.

Here is a passage from Sartre that I quoted previously in Of moral choices and existential choices, in which Sartre has just told a story of how a student came to him to ask whether he should stay at home to be a comfort to his mother or if he should leave to join the resistance:

“…I can neither seek within myself for an authentic impulse to action, nor can I expect, from some ethic, formulae that will enable me to act. You may say that the youth did, at least, go to a professor to ask for advice. But if you seek counsel — from a priest, for example — you have selected that priest; and at bottom you already knew, more or less, what he would advise. In other words, to choose an adviser is nevertheless to commit oneself by that choice. If you are a Christian, you will say, consult a priest; but there are collaborationists, priests who are resisters and priests who wait for the tide to turn: which will you choose? Had this young man chosen a priest of the resistance, or one of the collaboration, he would have decided beforehand the kind of advice he was to receive. Similarly, in coming to me, he knew what advice I should give him, and I had but one reply to make. You are free, therefore choose, that is to say, invent. No rule of general morality can show you what you ought to do: no signs are vouchsafed in this world.”

Jean-Paul Sartre, Existentialism is a Humanism

By concluding this passage with, “no signs are vouchsafed in this world,” Sartre is not only saying that each must take responsibility for explicit decisions and actions, but also for our identification of signs and what we make of them. Contrary to Sartre’s declaration of the absence of signs, I think that most people do sincerely believe that signs are vouchsafed in this world. I have come to think of this belief in signs as a way to avoid responsibility for one’s interpretations. If one says, e.g., “a rainbow appeared in the sky as I was contemplating suicide, and I realized that this was a sign from on high that I should not kill myself,” one is surrendering one’s autonomy even while acting — the moral equivalent of keeping one’s cake and eating it too.

I don’t think that most people have a problem with the explicit judgments they formulate when they say things like, “I think…” or “I believe…” or “I have decided to…” since these are clear statements of personal responsibility for one’s decisions and actions. But interpretations can be much more subtle — in some cases, perhaps in many cases, interpretations are so subtle that they are difficult to understand as interpretations rather than as cold, hard facts.

Individuals who have never had their Weltanschauung called into question are particularly vulnerable to giving their interpretations an air of facticity. In so far as travel can place an individual in a situation in which everything formerly taken for granted is questioned (something I touched upon in Being the Other), one of the virtues of travel is to make one aware of one’s Weltanschauung, and to know that there is nothing necessary about the particular interpretations that one gives to particular states of affairs.

Of course, travel in and of itself is not enough. Some people, when they travel, surround themselves with their compatriots so that they are never exposed to an unaccustomed world without the support of like-minded fellows. People do exactly the same thing without bothering to travel: i.e., always surrounding themselves with like-minded individuals and never placing themselves in a situation in which their beliefs can be radically questioned — or even gently questioned.

Thus we see that the work of taking responsibility for our interpretations is the painful work of self-knowledge, even to the point of self-alienation. For this, few have the requisite hardihood. But we must try.

For those who do possess the intestinal fortitude for self-examination that reveals interpretations as interpretations, stripping them of their spurious facticity, there is an added aesthetic benefit: it is from this point of view, seeing the world for what it is, that we are able to see and to forget the name of the thing one sees.

The uninterpreted world — what Husserl called the prepredicative world — is an ideal, and as an ideal it is likely to be elusive and difficult to attain. But that is no argument against it. As Spinoza said, all noble things are as difficult as they are rare. Taking full responsibility for our interpretations is both difficult and rare, but it is a noble ideal to pursue.

. . . . .

signature

. . . . .

Grand Strategy Annex

. . . . .
