Saturday



In my previous post on The Finality Fallacy I discussed the fallacy of treating open matters as though closed, and quoted Hermann Weyl’s 1932 lectures The Open World as a countervailing point of view. If the world is an open world, an unfinished world, then there will always be unfinished business — no finality, no closure, no resolution, no end of anything — and no beginning either.

Bertrand Russell wonderfully described the ontology implicit in such a conception of the world:

“Academic philosophers, ever since the time of Parmenides, have believed that the world is a unity. This view has been taken over from them by clergymen and journalists, and its acceptance has been considered the touchstone of wisdom. The most fundamental of my intellectual beliefs is that this is rubbish. I think the universe is all spots and jumps, without unity, without continuity, without coherence or orderliness or any of the other properties that governesses love. Indeed, there is little but prejudice and habit to be said for the view that there is a world at all.”

Bertrand Russell, The Scientific Outlook, Part One, Chapter IV. Scientific Metaphysics

There is a subtle difference, of course, between finality and unity; the presumption of unity that Russell mocked could be finitistic or infinitistic in character, but, as I pointed out in my last post, I suspect that Russell and Weyl, whatever their differences, could have agreed that the world is open. Unity may not imply openness, but openness implies the possibility of revision, the possibility of revision implies the iteration of revision, the iteration of revision implies evolution, and evolution implies anti-realism, at least in the essentialist sense of “realism.” Anything that changes gradually over an indefinite period of time may be so altered by its incremental and cumulative change that it becomes something entirely other than what it once was. This, I have argued elsewhere, is the essence of existential viability.

By the same token, there is a subtle difference between finitude and contingency. I can imagine that someone might argue that finitude implies contingency and contingency implies finitude, but I would reject any such argument. The distinction is subtle but important, and I think that it marks the difference between a naturalistic philosophy, which is essentially a philosophy of contingency, and an anthropocentric point of view that reduces the infinitistic contingency of the world to a manageable finitude because human beings are comfortable with finitude. That is to say, I am suggesting that finitistic modes of thought constitute a cognitive bias. But let’s try to penetrate a little further into what self-described finitists have in mind, and let’s try to find an unambiguously finitistic perspective.

I remember running across the phrase “radical finitude” in some of my past reading, so I looked for the original source in which I had first encountered the term and was unable to find it, but I have found many other references to radical finitude. The name that comes up most often in relation to radical finitude is that of Martin Heidegger (on Heidegger cf. my Conduct Unbecoming a Philosopher and Ott on Heidegger). Heidegger is mentioned by Weyl as a representative of the “thesis of the categorical finiteness of man” in the quote from Weyl in my last post, The Finality Fallacy. Here, again, is an abbreviated portion of the section I previously quoted from Weyl, where Weyl singles out Heidegger:

“We reject the thesis of the categorical finiteness of man, both in the atheistic form of obdurate finiteness which is so alluringly represented today in Germany by the Freiburg philosopher Heidegger…”

Here, on the other hand, is a representative exposition of radical finitude that draws upon the Heideggerian tradition:

“Nonbeing as the principle of finitude is non-being understood in its relative and dialectical character through which it becomes a constitutive factor of human being or Dasein himself. Anxiety in its disclosure of nothingness thus brings man to an awareness of his radical finitude, and whatever else is to be said of existentialist philosophy, it must be said that existentialism is an emphatic philosophy of human finitude. The principle of finitude is central to all the existentialist thinkers, and it emerges with particular emphasis in the philosophy of Heidegger. Heidegger interprets this philosophy of human finitude to be, at least in part, a legacy of Kant’s critical philosophy. With his emphasis on the finite character of human reason and his insight into the negativities of moral striving, Kant paved the way for the development of fundamental ontology formulated in terms of finite structures.”

Calvin O. Schrag, Existence and Freedom: Towards an Ontology of Human Finitude, pp. 73-74

According to Schrag, then, it seems that existentialism can be defined in terms of Weyl’s thesis of the categorical finiteness of man. If this is so, and existentialism is “an emphatic philosophy of human finitude,” as Schrag said it was, it might still be possible to define another philosophical position, entirely parallel to existentialism, but which would reject the thesis of the categorical finiteness of man. What would we call this logical complement of existentialism? It doesn’t really matter what we call it, but I’m sure there must be a clever moniker that eludes me at the moment.

Although it doesn’t really matter what we would call the infinitistic complement of existentialism, it does matter that such a philosophy would reject finitism (and its tendency to commit the finality fallacy). With a slight change to Schrag’s formulation, we could say that the complement of existentialism imagined above would be an emphatic philosophy of human contingency. This is a position that I could endorse, even while I would continue to reject a philosophy of human finitude. And this formulation in terms of contingency is not necessarily at odds with non-Heideggerian existentialism.

Sartre’s formulation of existentialism — existence precedes essence — is in no sense intrinsically finitistic. I can imagine that someone might argue that existence is intrinsically finite — that the existential is existential in virtue of being marked out by the boundaries that define its finitude — but I would reject that argument. That same argument could be made for essence (i.e., that essence is intrinsically finite), and thus for the whole idealistic tradition that preceded Sartre, and which Sartre and others saw themselves as overturning. (Heidegger, it should be noted, categorically rejected Sartre’s categorical formulation of existentialism.) The existence that precedes essence may well be an infinitistic existence, just as the essence that precedes existence in the idealistic tradition may well be an infinitistic essence.

To return to one of the roots of existential thought, we find in Nietzsche that it is contingency rather than finitude that is at stake. In a note from 1873 Nietzsche wrote:

“That my life has no aim is evident even from the accidental nature of its origin; that I can posit an aim for myself is another matter.”

Friedrich Nietzsche, The Portable Nietzsche, edited and translated by Walter Kaufmann, New York: Viking, p. 40

Recognition of the contingency of life, and especially (given the anthropocentrism of our human minds) the contingency of human life, is a touchstone of existential thought. Some, as I have noted above, frame contingency in finitistic terms, but as I see it contingency is the infinite context of all existents, stretching out into space and time without end. From this point of view, any finitude is an arbitrary division within the Heraclitean flux of the world, the concordia discors that precedes us, follows us, and surrounds us.

What is the relationship between Nietzschean contingency and Weyl’s openness? I would argue that the open world implies an open life. It was one of the central literary conceits of Plato’s Republic that it is easier to see justice in the large — i.e., in the just state — than to see justice in the small — i.e., in the just man — and this is how Socrates shifts the conversation to an investigation of the ideal state, which, once defined, will give us the image that we need in order to understand the ideally proportioned man. If Plato (and Socrates) are right in this, one might hold that Weyl’s open world can be a guide to the open life.

What would an open life look like? One vision of the open life is described in Charles Dickens’ classic A Christmas Carol, from the mouth of Jacob Marley:

“It is required of every man,” the Ghost returned, “that the spirit within him should walk abroad among his fellowmen, and travel far and wide; and if that spirit goes not forth in life, it is condemned to do so after death. It is doomed to wander through the world — oh, woe is me! — and witness what it cannot share, but might have shared on earth, and turned to happiness!”

Charles Dickens, A Christmas Carol, “Marley’s Ghost”

This is the open life of the individual — to walk abroad, literally and metaphorically — and to share what can be shared. The open life of the species is yet another question — a question mid-way between the open world and the individual open life — and one that might simply be answered by asserting that an open humanity is the sum total of open human lives, if one regards humanity as nothing in itself and reducible to its individual instances.

This is the point at which I may perhaps lose my reader, because what I would like to suggest is that the open life for humanity is another way to understand transhumanism. Transhumanism is the openness of humanity to revision, and openness to revision implies iterated revision, iterated revision implies evolution, and the evolution of humanity implies an essentially different humanity in the future than humanity today.

What I have come to realize since writing my last post is that human finitude is one manifestation of human contingency, and, like any contingency, it is subject to revision by future contingencies. Again, our finitude, so far as it extends, is a contingency, and therefore, like any contingency, is subject to change.

The critics of transhumanism who have tried to find ways to praise suffering and death, and who go out of their way to argue that human life only has meaning and value in virtue of its limitation, overlook the role of contingency in human life. They pretend that human life is final, and that its contingent features are essential to humanity, if not necessary to the definition of what it means to be human — which is to say, they commit the finality fallacy. For the prophets of wholesome loss, humanity is finished.

Human being is no more final than any other form of being. The openness of human being means that human viability is predicated upon contingency, and that we must evolve or perish.

. . . . .

signature

. . . . .

Grand Strategy Annex

. . . . .


Freedom and Ressentiment

1 September 2013

Sunday


Friedrich Nietzsche (1844–1900)


Sometimes when I am asked my favorite book I reply that it is Nietzsche’s Genealogy of Morals, which is the most systematic of his books on ethics and which gives his most detailed exposition of ressentiment. I reread the third essay in the book today — “What is the meaning of ascetic ideals?” — keeping in mind as I did so what I wrote about freedom the day before yesterday in Theory and Practice of Freedom.

To give a flavor of Nietzsche’s argument I want to cite a couple of passages from the book that I take to be particularly crucial. Firstly, here is the passage in which Nietzsche introduces the idea of ressentiment becoming creative and creating its own values:

“The beginning of the slaves’ revolt in morality occurs when ressentiment itself turns creative and gives birth to values: the ressentiment of those beings who, denied the proper response of action, compensate for it only with imaginary revenge. Whereas all noble morality grows out of a triumphant saying ‘yes’ to itself, slave morality says ‘no’ on principle to everything that is ‘outside’, ‘other’, ‘non-self ’: and this ‘no’ is its creative deed.”

Nietzsche, Friedrich, On the Genealogy of Morality, edited by Keith Ansell-Pearson, translated by Carol Diethe, Cambridge University Press, 1994, 2007, p. 20

Near the end of the book, Nietzsche reiterates one of his central themes, that man would rather will nothing than not will:

“It is absolutely impossible for us to conceal what was actually expressed by that whole willing that derives its direction from the ascetic ideal: this hatred of the human, and even more of the animalistic, even more of the material, this horror of the senses, of reason itself, this fear of happiness and beauty, this longing to get away from appearance, transience, growth, death, wishing, longing itself — all that means, let us dare to grasp it, a will to nothingness, an aversion to life, a rebellion against the most fundamental prerequisites of life, but it is and remains a will! …And, to conclude by saying what I said at the beginning: man still prefers to will nothingness, than not will…”

Nietzsche, Friedrich, On the Genealogy of Morality, edited by Keith Ansell-Pearson, translated by Carol Diethe, Cambridge University Press, 1994, 2007, p. 120

One of the themes that occurs throughout Nietzsche’s works is the critique of nihilism — Nietzsche finds nihilism in much that others fail to recognize as such, while Nietzsche himself has been accused of nihilism because of his iconoclasm. The immediately preceding passage strikes me as one of Nietzsche’s most powerful formulations of unexpected and unrecognized nihilism: willing nothing.

I think Nietzsche primarily had institutional religion in mind, especially those institutionalized religions that put a priestly caste in power (whether directly or indirectly), but there are plenty of examples of thoroughly secular forms of ressentiment developing to the point of creating their own values, and I think one of the principal forms of secular ressentiment is the denial, the repudiation, or the rejection of freedom. The denial of freedom is a particularly pure form of the nihilistic will saying “No!” to life, since life, in the living of it, is all about freedom — we realize our freedom in the dizziness that is dread, and make our choices in fear and trembling. Many people quite literally become physically ill when faced with a momentous choice — so great a role does the idea of freedom play in our thoughts that our thoughts are manifested physically.

The denial of freedom takes many forms. For example, it often takes the form of determinism, and determinism itself can take many forms. On my other blog I wrote about determinism from the point of view of the denial of freedom as a philosophical problem — something I wanted to do to counter the prevalent attitude that asks why so many people believe in their own free will. This approach seems to me incredibly perverse; the more reasonable question is to ask why so many people believe they do not have free will. Now, Nietzsche himself was a determinist, so he likely would not be sympathetic to what I’m saying here, but that does not stop us from applying Nietzsche’s own ideas to himself (something Max Scheler also did in his book on Ressentiment).

Probably the most common form that the denial of freedom takes is a rationalization of a failure to take advantage of one’s freedoms. This is a much more subtle denial of freedom than determinism, and in fact assumes the reality of free will. If the palpable reality of freedom, and the potential upsets to the ordinary business of life that it presents, were not all-too-real, there would be no need to formulate elaborate rationales for not taking advantage of one’s freedom and opting for a life of conformity and servile acquiescence to authority.

Understanding that freedom is honored more in the breach than the observance was a well-trodden path in twentieth century thought. Although Freud had deterministic sympathies, his theories of reason as the mere rationalization of what the unconscious was going to do anyway incorporate both determinist and free willist assumptions. The denial of freedom is a central theme in Sartre’s work (the spirit of seriousness and the idea of bad faith are both important forms of the denial of freedom), and through Freud and Sartre the influence on twentieth century thought and literature was profound. I have previously cited the role of Gooper Pollitt in Tennessee Williams’ Cat on a Hot Tin Roof as a paradigm of inauthenticity (in Existential Due Diligence).

All one need do is look around at the world we’ve made, with all its laws and statutes, its codes and regulations, its institutions and rules, its traditions and customs — it would be entirely possible to pass an entire lifetime in this context without realizing, much less exercising, one’s freedom. And these are only passive discouragements. When it comes to active discouragements to freedom, every nay-sayer, every pessimist, every wagging finger, every shaming tactic, every snide and cynical comment is an attempt to dissuade us from enjoying our freedom and entering into the same self-chosen misery of all those who have systematically extirpated all traces of freedom from their own lives.

Everyone who has given up freedom in their own life understandably resents seeing the exercise of freedom in the lives of others, and when this resentment turns creative it gives birth to every imaginable form of slander of freedom and of praise of servility — whether to a cause or to a movement or to an individual or to an institution — not to mention endless rationalizations of why the refusal of freedom isn’t really a refusal of freedom. Don’t believe it. Don’t believe any of it. Don’t buy into it. There is nothing in this world that is worth surrendering your freedom for — no matter how highly it is praised or how enthusiastically it is celebrated — this praise and this celebration of unfreedom is nothing but the creative response of ressentiment directed against freedom.

. . . . .

Tuesday


Day before yesterday in Philosophies of the Secret Garden I discussed a passage in Nietzsche where he compared his philosophical efforts to tending a secret garden, and I suggested that there are “secret gardens” in both science and philosophy that fall between Kuhnian normal science (or philosophy) and revolutionary science (or philosophy). Some of what I said applies to military doctrine, though the intrinsic properties of an essentially social experience make it a slightly different case than the essentially solitary activity of philosophy. This makes the example of science particularly interesting, since it occupies a position between philosophy and doctrine.

A philosopher and a mathematician can work in near isolation. Most, as a contingent matter of fact, do not work in complete isolation, preferring the stimulus afforded by interaction with like-minded thinkers, but some do in fact isolate themselves, and often this is purposeful. Descartes reputedly moved his residence repeatedly in order to avoid unannounced callers. Even today there are some well-known thinkers who work in near isolation. Perhaps the most famous example in the present age is Grigori Perelman, the mathematician who proved the Poincaré conjecture.

Some creative undertakings demand the contributions of many persons and many talents. One cannot produce a show on Broadway or a film in Hollywood without the collective efforts of a great many people. One can write a screenplay in isolation, but it will never be produced as a film without the participation of others. Similarly, a visionary architect can design a building in isolation, but without the efforts and cooperation of a great many others, his buildings will never get built. The isolated novelist or philosopher or mathematician can hope that their work will survive and resonate with future ages, even if it falls flat in their own time, but the more that a creative expression is communal, like film or architecture, the less likely this will happen, or, if it does happen, that it will resemble the vision of the isolated visionary.

Military doctrine — whether strategic, operational, or tactical — is a social art, like film or architecture. As a social art, military doctrine is less open to the work of an isolated genius. There certainly is normal doctrine and revolutionary doctrine, parallel to normal science and revolutionary science, but there is far less latitude for a secret garden of strategy. Furthermore, doctrine is not only a social art, it is also an overwhelmingly contingent art that has little to do with necessary, a priori truths. Doctrine is learned from particular, empirical states of affairs. This knowledge can, of course, be acquired in isolation, like a knowledge of philosophy or literature, but the most recent developments are not likely to be widely available, and in fact most of the relevant details may be classified, or, if not classified, certainly difficult to access.

Having made the case for doctrine as a social art, and acknowledged the difficulty of acquiring knowledge of doctrine in isolation, not to mention the near impossibility of attracting any interest in such an effort, it remains to point out that, while difficult and rare, it still remains possible for there to be a secret garden of strategy, and the very possibility of this, as slim as it is, presents the possibility of a game-changing confrontation with established doctrine. No one can afford to neglect the possibility, since it presents the aspect of a strategic shock that could upset accepted calculations.

As I noted above, individual pursuits like literature present no great difficulties to the individual enthusiast. Science was once like this, and science was once primarily the pursuit of gentlemen amateurs. Some of these gentlemen amateurs made great contributions, and the greatest of them — Charles Darwin — not only made contributions, but probably changed the way that science is done and effected a conceptual revolution as profound as that of Copernicus. Elsewhere I have called this the heroic conception of science — an individual, working alone, on a project that would transform the world, knowing that if the project is made public precipitously, it will certainly invite ridicule rather than foment revolution. Darwin knew well, as Nietzsche counseled, how to keep silent long enough.

Today science is mostly Big Science, but it isn’t all Big Science. There remains the possibility of the heroic individual scientist going against the establishment, which pursues the iterative conception of science with an army of scientists, organized in a top-down hierarchical structure that resembles military organization more than it resembles the solitary efforts that gave us the discoveries of Galileo, Newton, and Einstein.

One could say that the more institutionalized science becomes, the more resources it will have at its command, and therefore the more difficult it would be for any individual to make a meaningful contribution to science outside this structure. But at the same time as institutionalized Big Science has many resources and an army of contributors jointly pursuing the same end, the spirit of individual initiative is weakened and the institution becomes vulnerable to group think that simply dismisses anything outside its purview as irrelevant and uninteresting. Institutionalized power carries with it the ability to pursue and attain ends that lie far beyond the ability of the individual, but it also carries with it the risk of stifling innovation.

To return to my distinction above between social arts and solitary arts, what could be more of a social art than politics? And is not politics the very soul of institutionalized power, being institutionalized power in its purest form, unencumbered by any desire other than power? As a nearly perfect exemplification of a social art, it ought to be the case that only those with extensive knowledge and experience within the social milieu that defines the art of politics would possess the particular epistemic background that would make it possible for such an individual to make innovations within the field. But what we find in fact is that politics is the most uncreative of the arts, in fact nearly hostile to innovation, and those who have been in it the longest are the most impervious to new ideas. Thus in the case of the social art of politics, institutionalized ossification so dominates political discourse that trying something new has become a near impossibility — indeed, as I have observed elsewhere, it literally takes a revolution to effect political change.

Just as the intensely social milieu of political thought takes a revolution even to implement small changes, so too the intensely social milieu of military thought requires the military equivalent of a revolution in order to effect changes. However, while in politics social conflicts are primarily resolved within a single social system, military conflicts are primarily resolved in a contest between different social systems, except in the case of civil wars. This is an important distinction. The political life of a political entity may become so institutionalized that change becomes unthinkable, but the military life of a political entity can be decided from without, by those who have no stake whatsoever in the welfare of that political entity, and who may even seek its dissolution.

. . . . .

A Theory of Gift Exchange

25 December 2011

Sunday


There is a tension and a paradox at the heart of gift exchange. The pure gift is given without thought of reciprocity and out of generosity for its own sake, but the very idea of a gift exchange implies that two or more parties will engage in a ritualized and mutual transfer of gifts. Thus in gift exchange, there is an expectation of reciprocity and even of symmetry: gifts are expected to be of roughly equal value, except in special circumstances. And the fact that there are (tacitly acknowledged) exceptions to axiological symmetry is a nod to the fact that gift exchange is a calculation, and nothing is farther from the spirit of a pure gift than the spirit of calculation.

Still, we try, and sometimes we are successful. Moreover, sometimes we are surprised by the unexpected generosity of the other. Sometimes we are presented with a gift completely unexpectedly, from someone with whom we had no expectation of gift exchange. In such circumstances, we are not prepared to give anything in exchange, so we are “forced” to accept the pure gift, for to refuse it would be inexcusable in most circumstances.

Still, unexpected gifts are sometimes refused. It is not uncommon for an individual who looks to another as a potential romantic partner to give an unexpected gift as a way to announce their interest in that person. In a sense, announcing one’s interest in another with a gift is much like announcing one’s intentions and expectations of future exchange. Thus such a gift may well be refused because of the implicit contract of future exchange that it suggests. A charitable person will be polite about this and say things like, “Oh, I could never accept such a present, it is much too expensive.” Perhaps they might also add, “I would feel obligated if I accepted a gift like this.”

In this way — and while I have used the example of a gift exchange between potential romantic partners, it is by no means limited to the romantic dyad — even a forced pure gift, exempt of necessity from immediate exchange, can in fact be part of a gift exchange when understood in a larger context, and therefore fails to constitute a pure gift.

The model of a pure gift is grace, and so the pure gift may be exclusive to theological contexts. While a pure gift may be a rare thing, to what extent can we approximate a pure gift? May we come so close to approximating a pure gift that our gift is, for all practical purposes, a pure gift? I suspect so, though I expect that the more closely a gift approximates the ideal of the pure gift, the rarer it becomes.

Nietzsche wrote the following in his Mixed Opinions and Maxims:

How duty acquires splendor.–The means for changing your iron duty to gold in everyone’s eyes is this: always keep a little more than you promise.

This last suggestion — always keep a little more than you promise — has stayed with me ever since I first read this many years ago. Nietzsche formulates this in the context of duty, and duty is not only a reciprocal and symmetrical relationship, it is usually felt to be a burden. Nietzsche has asked himself, “How can this burden be transformed into something welcome to all?” The surprising answer he gives to this self-posed question is, “Transform duty into a gift” — for when you keep a little more than you promise you go beyond duty in keeping your duty. This is a Nietzschean transvaluation of values of an unexpected sort.

One finds this same idea of Nietzsche’s in the New Testament, in the famous admonition, “And whosoever shall compel thee to go a mile, go with him twain.” (Nietzsche was, after all, a preacher’s son, and he absorbed some of the radical character of the gospels.) In the Holy Land during classical antiquity, a Roman soldier could legally compel a Jew to carry his pack for one mile. Only one mile was required, after which the Jew impressed into such service was free. Obviously, the Jew felt this law to be a humiliation. The obvious response to such an unjust law would be its repeal. The radical response, the response of the gospels, is not only to go the mile without complaint, but to go an extra mile.

If we can inform our duties, including our duties of gift exchange, with the spirit of keeping a little more than you promise or going the extra mile, our duties are transformed into gifts, and since these gifts grow out of duties, they are unexpected and therefore approximate pure gifts. But there are few who have the moral strength to overcome the feeling of humiliation and moreover to transform this humiliation into what Nietzsche called “gold.” Thus the world is more marked by impure than by pure gifting.

Impure gift exchanges are undertaken with no pretense of being given out of pure generosity. Most gifts are of this kind. When an employer pays an employee more than he strictly has to pay that employee, and the employee works harder than strictly expected, economists call this a “gift exchange.” I realized recently that we can narrow this and call it an “economic gift exchange” and contrast it to other forms of gift exchange.

Recently in the Financial Times I read the following in David Pilling’s column, Modern China is yearning for a new moral code:

“Day to day, most Chinese people are able to put aside the broader moral confusion to perform the little acts of kindness and decency that make a society function.”

What Pilling describes here as the little acts that make a society function constitute what we might call a social gift exchange: in a smoothly functioning society individual citizens do more than a narrow interpretation of their social and legal duty would stipulate, and the presumptive exchange for this is to live in a society that is better than one which merely fulfills the minimum social and legal requirements.

There are many kinds of social gift exchange: a teacher who makes an extra effort to teach and a student who makes an extra effort to learn, a social worker who goes the extra mile for a client and a client who responds by making an extra effort, or people who yield the right of way for each other in traffic — pedestrians, bicyclists, and drivers doing so mutually, and on the large scale (and, just as importantly, with the anonymity) of urban population densities.

There are also many instances when social gift exchange fails, and people behave rather badly toward each other. I’m sure that anyone reading this can recall many instances from their own experience when others gave only the minimum but demanded an extra measure for themselves in return, or when people engaged in conflict over who is and who is not abiding by the minimum social and legal standards.

This idea of social gift exchange can be used to define a healthy and fully functioning society: in a healthy society, social gift exchange is routine and unexceptional; in a pathological society, social gift exchange is mostly absent. In the big picture, it is to be expected that most societies fall somewhere in the middle of this continuum stretching between the pathological and the healthy, though it is also to be expected that in smaller, culturally and ethnically homogeneous societies social gift exchange functions more easily because of shared expectations and a shared understanding — what sociologists call normative consensus.

Thus out of the impure giving of gift exchange, even when undertaken in a spirit of reciprocity and axiological symmetry, something truly noble and socially beneficial can emerge. A social gift exchange that is moreover informed by the spirit of capitalism, that prioritizes delayed gratification as the sure way to wealth, is all the more powerful, since one is not looking for an immediate and symmetrical payoff of one’s social gifting.

As I see it, then, what Marx called “callous ‘cash payment’” and “icy water of egotistical calculation” may be in fact the thin edge of the wedge for a more just and even a more human social and moral order. Of course, as we practice it, it is imperfect in the extreme. Whether or not we might perfect social gift exchange brings us to the traditional question of the perfectibility of man. Yet, short of the perfectibility of man, we can still sensibly ask about the betterment of man.

This, then, is the challenge for the large and diverse societies of contemporary nation-states (presumptively established on the basis of nationhood, but in fact established on the basis of the territorial principle in law): how can social gift exchange be encouraged to flourish in a context in which shared expectations and shared understanding may be absent?

. . . . .

signature

. . . . .

Grand Strategy Annex

. . . . .

The Last Civilization

10 February 2011

Thursday


The subtitle of Nietzsche’s Beyond Good and Evil, one of his most influential books, is Prelude to a Philosophy of the Future, and in a strangely moving passage (section 214 of the same book) he referred collectively to himself and his readers as “first born of the twentieth century.” Nietzsche was not a “futurist” in the sense we know the term today, but his philosophy was centered on the future.

Nietzsche’s conception of a future Übermensch who would supersede humanity as we know it today is of course one of the most well known and indeed notorious aspects of Nietzsche’s thought. In fact, just last night I watched the very entertaining and informative documentary Protagonist, in which a now-reformed bank robber repeatedly stated that during his years of crime he believed himself to be a Nietzschean Superman. Whether you admire or despise the idea of the Übermensch, this was Nietzsche’s vision for what the future might be at its best. But this wasn’t the only future imagined by Nietzsche. He also imagined a worst case scenario for the future, and this worst case scenario was the Last Man (in German: der letzte Mensch).

In a couple of comments to my posts, Greg Lawson has drawn particular attention to Nietzsche’s Last Man. Mr. Lawson noted that Nietzsche’s Last Man appears in the title of Francis Fukuyama’s The End of History and the Last Man, so the idea retains a certain currency. Nietzsche’s exposition of the Last Man occurs in Section 5 of the Prologue of Thus Spoke Zarathustra:

They have something whereof they are proud. What do they call it, that which maketh them proud? Culture, they call it; it distinguisheth them from the goatherds.

They dislike, therefore, to hear of ‘contempt’ of themselves. So I will appeal to their pride.

I will speak unto them of the most contemptible thing: that, however, is THE LAST MAN!

And thus spake Zarathustra unto the people:

It is time for man to fix his goal. It is time for man to plant the germ of his highest hope.

Still is his soil rich enough for it. But that soil will one day be poor and exhausted, and no lofty tree will any longer be able to grow thereon.

Alas! there cometh the time when man will no longer launch the arrow of his longing beyond man—and the string of his bow will have unlearned to whizz!

I tell you: one must still have chaos in one, to give birth to a dancing star. I tell you: ye have still chaos in you.

Alas! There cometh the time when man will no longer give birth to any star. Alas! There cometh the time of the most despicable man, who can no longer despise himself.

Lo! I show you THE LAST MAN.

“What is love? What is creation? What is longing? What is a star?” — so asketh the last man and blinketh.

The earth hath then become small, and on it there hoppeth the last man who maketh everything small. His species is ineradicable like that of the ground-flea; the last man liveth longest.

“We have discovered happiness” — say the last men, and blink thereby.

They have left the regions where it is hard to live; for they need warmth. One still loveth one’s neighbour and rubbeth against him; for one needeth warmth.

Turning ill and being distrustful, they consider sinful: they walk warily. He is a fool who still stumbleth over stones or men!

A little poison now and then: that maketh pleasant dreams. And much poison at last for a pleasant death.

One still worketh, for work is a pastime. But one is careful lest the pastime should hurt one.

One no longer becometh poor or rich; both are too burdensome. Who still wanteth to rule? Who still wanteth to obey? Both are too burdensome.

No shepherd, and one herd! Every one wanteth the same; every one is equal: he who hath other sentiments goeth voluntarily into the madhouse.

“Formerly all the world was insane,” — say the subtlest of them, and blink thereby.

They are clever and know all that hath happened: so there is no end to their raillery. People still fall out, but are soon reconciled—otherwise it spoileth their stomachs.

They have their little pleasures for the day, and their little pleasures for the night, but they have a regard for health.

“We have discovered happiness,” — say the last men, and blink thereby. —

And here ended the first discourse of Zarathustra, which is also called “The Prologue”: for at this point the shouting and mirth of the multitude interrupted him. “Give us this last man, O Zarathustra,”—they called out—”make us into these last men! Then will we make thee a present of the Superman!” And all the people exulted and smacked their lips. Zarathustra, however, turned sad, and said to his heart:

“They understand me not: I am not the mouth for these ears.

Too long, perhaps, have I lived in the mountains; too much have I hearkened unto the brooks and trees: now do I speak unto them as unto the goatherds.”

Nietzsche’s focus is on the contemptible Last Man himself, and his fellow last men, but I will observe that the Last Man, if and when he emerges from history, will not emerge in a vacuum. The Last Man will be a product of the Last Civilization. The Last Civilization, like the Last Man, is contemptible, and smugly self-satisfied in its contemptible status. Like the fool which Solomon said delights in his folly, so too the Last Man delights in his contemptible nature, and the Last Civilization delights in the Last Men it has produced disporting themselves as the contemptible creatures they are. As the Last Man sees himself as the ultimate product of civilization, after which nothing more can possibly follow, so the Last Civilization understands itself as the ultimate civilization, and misunderstands its ultimacy as an expression of its “higher” nature.

Is it possible to discern in the present whether man is becoming the Last Man or the Superman? And has our civilization turned a crucial corner to head decisively either in the direction of the Last Civilization or in the direction of Higher Civilization? Not long ago in The Very Idea of Higher Civilization I argued that contemporary industrialized civilization has not yet even begun to compete with the excellence of classical antiquity or the high points of medieval civilization. To date, industrialized civilization is not a peer competitor with any civilization of the past.

This worries me, and I hope that it worries you, too. Industrialized civilization seems to be producing the conditions for the Last Man to someday reign, and therefore seems to be transforming itself into the Last Civilization. A simple, uninterrupted development of current trends would issue in precisely this fate. If contemporary industrialized civilization does not eventually produce the conditions of its self-transcendence and thereby justify itself through the creation of truly great works of civilization, distinctive of its milieu, then we will certainly evolve into the Last Man. Continued mediocrity is sufficient for the Last Man to triumph and to create (and be created by) the Last Civilization.

. . . . .

I have long had it on my mind to write about the Last Man, and also to write about structural forces in industrialized civilization that tend toward the degradation of excellence. I had not planned to bring these two ideas together; this is something that just happened to occur to me today. So I still have (at least) two more posts to write on these topics separately, but these thoughts are not yet sufficiently mature to expose them to the light of day.

. . . . .

signature

. . . . .

Grand Strategy Annex

. . . . .

Thursday


Asked to cite some examples of institutions, few of us would be likely to include language among them; but language is an institution, and moreover the institutions of language are the institutions of communication, cooperation, reasoning, and understanding. In so far as human experience involves communication, cooperation, reasoning, and understanding (inter alia), it is pervasively linguistic. That is to say, human experience is institutionalized in language.

I find the institutionalization of human experience in language interesting at present because language provides an excellent example of the distinction between formal institutions, based on an explicit social contract, and informal institutions, based on an implicit social contract, that I recently discussed in Twelve Theses on Institutionalized Power. Roughly speaking, spoken language is an informal institution while written language is a formal institution. We ought also to note in this context that spoken language has a deep history that goes far back into the Paleolithic, may be coextensive with biologically modern human beings, and which may also be shared by other species (both extant and extinct). On the other hand, written language is historically recent (from the perspective of the longue durée), emerging within the Agricultural Paradigm, seems to be exclusively human, and marks the distinction between prehistory and history proper (at least, in traditional historiography).

The institution of language demonstrates quite vividly how implicit social contracts can and do change quite rapidly, and, more importantly, more rapidly than explicit social contracts. The formal institutions of explicit social contracts often possess explicit mechanisms for recognizing change (for example, in relation to language, whether or not an English word appears in the Oxford English Dictionary, or, in French, whether some usage is recognized by L’Académie française) — a due process, as it were, that is most familiar in the case of the explicit social contract of legal codes. The existence of explicit mechanisms for change suppresses spontaneous change, whereas spoken language thrives on spontaneous change.

One of the most familiar ways in which inter-generational conflict is expressed is in the different linguistic usages of older and younger generations. The implicit social contract of spoken language can be spontaneously changed by a single clever remark, coinage, or pronunciation. Since the ordinary business of life is largely driven by the fashion of the moment, a spontaneous change may be picked up and imitated by others quite quickly (this is now known as “going viral”). I read somewhere that the Castilian Spanish shift to pronouncing “z” and soft “c” with a lisp (i.e., pronouncing them as “th,” as in “Barthelona,” which some Castilians say, but no Catalonian says) was the result of the imitation of a particular aristocrat who spoke with a lisp.

With the example in mind of language expressed both as a formal and as an informal institution, it is then interesting to consider socio-political social contracts in this context. I think we find that, as with language, implicit social contracts can and do change with some degree of rapidity, while explicit social contracts tend to change much more slowly. As observed above in relation to the law, if due process must be followed in, for example, changing the constitution of a nation-state, this will happen much more slowly than political opinion changes in those areas of social and political life not subject to formal institutions. At times this tension between formal and informal institutions, and their different rates of change, can result in revolution, when the implicit socio-political contract has changed very rapidly over a large proportion of a population even while the explicit socio-political contract has not changed (or not changed enough to satisfy public opinion).

In a couple of posts (The Totemic Paradigm and Why Revolutions Happen) I have mentioned Nietzsche’s idea of a “morality of mores” (in German: “die Sittlichkeit der Sitte”, also translated as the “morality of custom”), which Nietzsche compellingly described thus:

“…those tremendous eras of ‘morality of custom’ which precede ‘world history’ as the actual and decisive eras of history which determined the character of mankind: the eras in which suffering counted as a virtue, cruelty counted as a virtue, dissembling counted as a virtue, revenge counted as a virtue, denial of reason counted as a virtue, while on the other hand well-being was accounted a danger, desire for knowledge was accounted a danger, peace was accounted a danger, pity was accounted a danger, being pitied was accounted an affront, work was accounted an affront, madness was accounted godliness, and change was accounted immoral and pregnant with disaster!”

Nietzsche, Daybreak, Book I, section 18

In his lectures, Joseph Campbell does not use Nietzsche’s terminology, but it is obvious, in his descriptions of the rituals of early human societies, that he has something very similar in mind, especially in his discussions of what Yeats called the “primary mask” that societies impose upon their members. Many of these rituals of social initiation and communal conformity are horrendous to modern eyes, and they embody much of what Nietzsche described in the above-quoted passage.

The social rituals of proto-civilizations lack the intellectual and conceptual infrastructure to emerge as fully formal institutions; however — and this is important — these institutions were formalized in the only way that it was possible to formalize an institution prior to the emergence of written language and explicit legal codes: by way of ritual. The extreme taboos that applied to the violation of ritual were themselves a reaction to how easily practices can change when there is no permanent point of reference (like a written text) to secure consistency over time. One could argue that pre-literate ritual culture was given its horrendous form precisely because it had to make an unforgettable impression at a time when there was no other way to preserve tradition.

Which brings us back to the evanescent nature of implicit social contracts. When I was musing over the above ideas yesterday, I realized that the only reason that we have in our history the “morality of mores” and horrific initiation rituals is because of the all-too-real and constant possibility of change. That is to say, these are reactionary developments — a social embodiment of the Freudian Verneinung, i.e., the negation that in its violence paradoxically confirms exactly what it seeks to deny: “I had a dream of an old man, but it was not my father!”

The situation of early peoples attempting to preserve their traditions and way of life — preserving life itself, as it were, the only life that they knew — was deeply problematic, and they knew it. They did what they could with their limited technology to preserve what could be preserved, but this presented insuperable problems. Civilization emerged as a “solution” to some of these insuperable problems.

These problems persist today in different forms. I discussed the desire of dictators to preserve their personal or dynastic rule in The Imperative of Regime Survival. There I quoted one of my favorite passages from Gibbon:

“In earthly affairs, it is not easy to conceive how an assembly equal of legislators can bind their successors invested with powers equal to their own.”

Edward Gibbon, History of the Decline and Fall of the Roman Empire, Vol. VI, Chapter LXVI, “Union Of The Greek And Latin Churches.—Part III.

The principle that Gibbon expresses here (a principle I have elaborated elsewhere in Gibbon, Sartre, and the Eurozone) is formulated in terms of formal legal institutions — an assembly of legislators — but it is equally true in pre-literate proto-civilizations that possess only the informal institutions of spoken language and social ritual, both of which, without some method for the preservation of tradition, would rapidly mutate beyond recognition due to the openness to change of informal institutions.

. . . . .

signature

. . . . .

Grand Strategy Annex

. . . . .

Nietzsche on Sexuality

18 September 2010

Saturday


One might suppose that Nietzsche ought to be the last author that we would read for any insight into sexuality, as his own sexual experience was remarkably limited, and may have consisted of only a few visits to a student brothel in his youth. But it is a fallacy, implicit in the instinctive positivism and empiricism of contemporary industrialized civilization, to suppose that it is quantity rather than quality of experience that counts.

One could as well hold that it would have been impossible for Emily Dickinson to have any better insight into the human condition than the average man-in-the-street because of her self-imposed isolation and asceticism, which meant that she had much less experience of the human condition, in simple quantitative terms, than the average person. But it was what Emily Dickinson made of her experience, and not what experience imposed upon her, that made her poetry great — a revelatory insight into the human condition.

It is this sense of what we can make of our experience that I was trying to get at when, a few days ago, I wrote on Twitter:

The active and fertile mind must sip only sparingly at the sweetness of life, for a full draught would overwhelm any sensitive soul.

Just so the active and fertile minds of Nietzsche and Emily Dickinson worked on very limited material, and neither sought to sate themselves upon the world. Which brings us to Nietzsche on sexuality. In his still challenging and influential book Beyond Good and Evil: Prelude to a Philosophy of the Future, Nietzsche wrote:

The degree and kind of a man’s sexuality reach up into the ultimate pinnacle of his spirit. (section 75)

I believe that this is true. I also believe that the corollary is true, namely, that the ultimate pinnacle of man’s spirit reaches deep into the degree and kind of a man’s sexuality. Or, if you prefer, that the degree and kind of a man’s spirit is expressed in the ultimate pinnacle of sexuality.

This is no small matter. From a biological point of view, anything that we do other than reproduce is epiphenomenal to life, and our sexual instincts are a concrete and personal embodiment of this will to live that projects itself onto future generations, willing that they should live also. If, then, there is a unity of that which we have believed to be most bestial in our character and that which we have heretofore believed to be ideal and edifying, this tells us something about who and what we are. We are not divided between a bestial element and a celestial element; we are one and whole.

The unity of body, mind, and spirit is a perennial theme of western thought, just as the fragmentation of body, mind and spirit is the perennial lot of western man. We always seek to recover this unity, and we always fail in the attempt. Both making the attempt and failing in the attempt define a large part of western civilization. And Nietzsche has, in the above quote, given the unity of body, mind, and spirit a particularly beautiful expression, which suggests a particularly beautiful form of failure.

. . . . .

signature

. . . . .

Grand Strategy Annex

. . . . .

. . . . .

Becoming What We Are

18 January 2010

Monday


Today is not the actual birthday of Dr. Martin Luther King, Jr., but it is the day that his birthday is celebrated in the United States as a national holiday, so we will take this occasion to consider an aspect of King’s thought, but only by way of an unlikely digression.

One of the great themes in Nietzsche’s thought is that of becoming what one is. Nietzsche conceived this as the ultimate quest and fulfillment of individuality. The Übermensch is a man who has become what he is. Nietzsche’s intellectual autobiography, Ecce Homo, was subtitled, “How One Becomes What One Is.” In Ecce Homo Nietzsche wrote:

The fact that one becomes what one is, presupposes that one has not the remotest suspicion of what one is. From this standpoint even the blunders of one’s life have their own meaning and value, the temporary deviations and aberrations, the moments of hesitation and of modesty, the earnestness wasted upon duties that lie outside the actual life-task.

One recognizes in this passage many of the familiar devices that Nietzsche uses to shock the reader into thinking for himself, to question not only the conventions of society but also to question the conventions that one tacitly sets up for oneself. Such questions must be asked if one is ever to become what one is. Complacency spells the end of all becoming.

One would not say that Dr. Martin Luther King, Jr. was a Nietzschean figure. He had to have been energetic and driven to achieve what he did, but King was apparently sincere in his Christianity, and African-American Christian churches and traditions were central to the civil rights struggle in the US.

King belonged to a very different intellectual tradition than Nietzsche. Indeed, it would be interesting to know in detail what figures like King and Gandhi thought of Nietzsche, if they ever expressed themselves on the topic, and it is amusing to imagine how Nietzsche might have characterized or even caricatured men like King and Gandhi.

Despite the chasm between King’s Biblically-inspired thought and his American-inspired rhetoric and Nietzsche’s classically-inspired thought and European rhetoric, however, I see at least one common thread: becoming what one is.

King made the motif of becoming what one is central to one of his most famous speeches, the “I Have a Dream” speech, an address delivered at the March on Washington on 28 August 1963. But there is a twist: King’s conception is a communal reformulation of Nietzsche’s individualist imperative of becoming what one is. Nietzsche’s becoming what one is became with King becoming what we are:

I say to you today, my friends, that in spite of the difficulties and frustrations of the moment, I still have a dream. It is a dream deeply rooted in the American dream. I have a dream that one day this nation will rise up and live out the true meaning of its creed: “We hold these truths to be self-evident: that all men are created equal.”

And later,

And if America is to be a great nation this must become true.

There is, ringing in King’s words, the imperative of becoming, of becoming what we are. For America to “…live out the true meaning of its creed…” would be for America to become what it is. And I would argue that for any society or community of people to become what it is to become, the several individuals that constitute that society or community must each and every one become what they are. With this in mind we are better prepared to understand the continuation of the passage from Nietzsche’s Ecce Homo quoted above:

Expressed morally, to love one’s neighbor and to live for others and for other things may be the means of protection employed to maintain the hardest kind of egoism. This is the exceptional case in which I, contrary to my principle and conviction, take the side of the altruistic instincts; for here they are concerned in subserving selfishness and self-discipline.

Thus, even in Nietzsche himself, becoming what one is and becoming what we are would seem to be integral developments.

Now for a step even farther afield. There is a figure in Buddhist thought known as a Bodhisattva. In Mahayana Buddhism (less so in Theravada Buddhism) a Bodhisattva is a partially enlightened being that chooses to delay its own attainment of ultimate enlightenment in order to conduct others on the path to enlightenment. I am fascinated by the very idea, and I cannot think of a similar conception realized in any other tradition. But that does not mean that we cannot appropriate the idea of a Bodhisattva for our own thought.

In the context of becoming what one is, figures like King and Gandhi are secular Bodhisattvas who have taken a non-egoistic path, delaying their own opportunity to become what they are, in order to guide communities toward becoming what they are, and, in so doing, the Bodhisattva and the community so guided together become what we are.

. . . . .

signature

. . . . .

Grand Strategy Annex

. . . . .

Friday


Paul Valéry


“To see is to forget the name of the thing one sees.” This is a quote frequently attributed to Paul Valéry, and the line has a quality that is at once both searching and poetic, making the attribution reasonable. I don’t know if Valéry actually said it (I can’t find the source of the quote), but I think of this line every once in a while: my mind returns to it as to an object of fascination, like an intellectual fetish. A good aphorism is perennially pregnant with meaning, and always repays further meditation.

If seeing is forgetting the name of the thing one sees, and mutatis mutandis for the aesthetic experiences that follow from the other senses — e.g., to taste is to forget the name of the thing one tastes, and so forth — we may take the idea further and insist that it is the forgetting of not only the name but of all the linguistic (i.e., formal) accretions, all categorizations, and all predications, that enables us to experience the thing-in-itself (to employ a Kantian locution). What we are describing is the pursuit of prepredicative experience after the fact (to employ a Husserlian locution).

This is nothing other than the familiar theme of seeking a pure aesthetic experience unmediated by the intellect, undistracted by conceptualization, unmarred by thought — seeing without thinking the seen. In view of this, can we take the further step, beyond the generalization of naming, extending the conceit to all linguistic formalizations, so that we arrive at a pure aesthesis of thought? Can we say that to think is to forget the name of the thing one thinks?

The pure aesthesis of thought, to feel a thought as one feels an experience of the senses, would be thought unmediated by the conventions of naming, categories, predication, and all the familiar machinery of the intellect, i.e., thought unmediated by the accretions of consciousness. It would be thought without all that we usually think of as being thought. Is such thought even possible? Is this, perhaps, unconscious thought? Is Freud the proper model for a pure aesthesis of thought? Possible or not, conscious or not, Freudian or not, the pursuit of such thought would constitute an effort of thought that must enlarge our intellectual imagination, and the enlargement of our imagination is ultimately the enlargement of our world.

Wittgenstein famously wrote that the limits of my language mean the limits of my world (Tractatus Logico-Philosophicus, 5.6 — this is another wonderful aphorism that always repays further meditation). But the limits of language can be extended; we can systematically seek to transcend the limits of our language and thus the limits of our world, or we can augment our language and thus augment our world. Russell, Wittgenstein’s mentor and one-time collaborator, rather than focusing on the limits of the self, developed an ethic of impersonal self-enlargement, i.e., the transgression of limits. In the last chapter of his The Problems of Philosophy Russell wrote:

All acquisition of knowledge is an enlargement of the Self, but this enlargement is best attained when it is not directly sought. It is obtained when the desire for knowledge is alone operative, by a study which does not wish in advance that its objects should have this or that character, but adapts the Self to the characters which it finds in its objects. This enlargement of Self is not obtained when, taking the Self as it is, we try to show that the world is so similar to this Self that knowledge of it is possible without any admission of what seems alien. The desire to prove this is a form of self-assertion and, like all self-assertion, it is an obstacle to the growth of Self which it desires, and of which the Self knows that it is capable. Self-assertion, in philosophic speculation as elsewhere, views the world as a means to its own ends; thus it makes the world of less account than Self, and the Self sets bounds to the greatness of its goods. In contemplation, on the contrary, we start from the not-Self, and through its greatness the boundaries of Self are enlarged; through the infinity of the universe the mind which contemplates it achieves some share in infinity.

The obvious extension of this conception of impersonal self-enlargement to an ethics of thought enjoins the self-enlargement of the intellect, the transgression of the limits of the intellect. It is the exercise of imagination that enlarges the intellect, and a great many human failures that we put down to failures of understanding and cognition are in fact failures of imagination.

The moral obligation of self-enlargement is a duty of intellectual self-transgression. As Nietzsche put it: “A very popular error: having the courage of one’s convictions; rather it is a matter of having the courage for an attack on one’s convictions!”

. . . . .

Since the above I have written more on the same in Of seeing and forgetting…

. . . . .

Bertrand Russell

Bertrand Arthur William Russell (1872 – 1970) formulated an ethic of impersonal self-enlargement.

. . . . .

signature

. . . . .

Grand Strategy Annex

. . . . .

project astrolabe logo smaller

. . . . .

Tuesday


Carl Philipp Gottlieb von Clausewitz (July 1, 1780 – November 16, 1831)


While on the topic of Clausewitz…

Contradictory concepts are locked in a dialectical relation. Logically, this means that a definition of a given concept yields the definition of its contradictory through negation. If war is the contradictory of peace, and peace the contradictory of war, then a negation of a definition of war yields a definition of peace, and a negation of a definition of peace yields a definition of war.

Things are rarely as simple as this in fact; such conceptual neatness is rare. Concepts — especially old concepts with a long history — tend to be complex and to be related by implication to many other concepts. Conceptual pairs like war and peace — sometimes called polar concepts — are assumed to be contradictories when in fact each is richer in content than the mere negation of the other. And what concepts could be older than those of war and peace? The emergence of civilization is nearly identical with the emergence of war in human history, and the idea of peace emerges as a hope immediately following upon the depredations of war.

Thus war and peace are not precisely contradictory, not precisely definable each as the negation of the other. Yet this very tension renders war and peace dialectical concepts, as any attempt to think them through coherently and systematically engages the thinker in an attempt to reconcile internal tensions within each concept. If successful, this process yields a higher synthesis that transcends the limited perspective and scope of previous definitions of the concept and establishes a more comprehensive concept, informed by previous conceptions but more adequate than earlier formulations.

The Dialectic of Conceptual Pairs

Conceptual pairs, like war and peace, that are apparently or superficially contradictory yet integral in fact are common in our intellectual experience. A few weeks ago I mentioned Romero’s distinction between doctrinaire and inorganic democracy. This is a great example of what I am trying to illustrate. What we have are two clusters of concepts that suggest in turn two further contradictory clusters. Doctrinaire democracy is contradicted by non-doctrinaire democracy (each can be defined as the negation of the other), while inorganic democracy is contradicted by organic democracy (which, again, can each be defined as the negation of the other). Thus doctrinaire and inorganic democracy stand in a problematic relationship to each other, as do non-doctrinaire and organic democracy. But systematically setting these concepts within a theoretical context that includes them all may help to illuminate the initial pair of concepts with which we began.

In my Political Economy of Globalization I made similar observations regarding the dialectic of the conceptual pair of globalism and localism:

Globalism is correctly understood as one half of a dialectic, that of globalism and localism, or globalism and tribalism. And this extension of the concept of globalism to the pair of concepts globalism/tribalism emphasizes the departure from twentieth century nationalism that is already becoming a fact of political life: the nation-state appears nowhere in this dialectic. However, the concept of globalism is also extended by another dialectic: that of advocacy and opposition, or globalism/anti-globalism…

Thus the pair of concepts, globalism and anti-globalism, extends the concept of globalism simpliciter, so that the only obvious permutation missing in this twice extended concept of globalism is that of anti-tribalism, and it is here, finally, that we recover the nation-state. For the nation-state is an undeclared anti-tribalism: personal loyalty to chieftain must be abolished so that a territorial loyalty to the nation-state can take its place.

There I also cited section 2 of Nietzsche’s Beyond Good and Evil:

“How could anything originate out of its opposite? For example, truth out of error? Or the will to truth out of the will to deception? Or selfless action out of self-interest? Or the pure sunlike gaze of the sage out of covetousness? Such origins are impossible; whoever dreams of them is a fool, even worse; the things of the highest value must have another, separate origin of their own—they cannot be derived from this transitory, seductive, deceptive, lowly world, from this turmoil of delusion and desire! Rather from the lap of being, the intransitory, the hidden god, the ‘thing-in-itself ’—there must be their basis, and nowhere else!”— This way of judging constitutes the typical prejudice by which the metaphysicians of all ages can be recognized; this kind of valuation looms in the background of all their logical procedures; it is on account of this “belief” that they trouble themselves about “knowledge,” about something that is finally christened solemnly as “the truth.” The fundamental belief of the metaphysicians is the belief in antitheses of values.

What could be more true of the opposites of war and peace? The faith in antithetical values has encouraged us to believe that war and peace are precisely contradictory, but we have seen that the concepts are more complex than that.

The Means and Ends of War

We can easily see how the concept of peace might emerge from the concept of war, or vice versa, from Clausewitz’s famous definition of war as the pursuit of politics by other means. Clausewitz restates this principle throughout On War and gives it several formulations, so that it constitutes a point of reference for his thought and is the locus classicus for what Anatol Rapoport called political war (in contradistinction to eschatological war and catastrophic war).

This Clausewitzian principle inevitably invited the formulation of its inversion by Foucault: “politics is the continuation of war by other means.” (“Society Must be Defended”: Lectures at the Collège de France 1975–1976, p. 15) Thus politics, ideally peaceful, can be transformed into war, and war can be transformed into peace.

In holding that war is the pursuit of politics by other means, Clausewitz implicitly invokes the ends/means distinction, and suggests that the end, aim, and goal of war and politics alike is the same; only the means are different. War is the use of military means — violence — to compel another to do our will, whereas politics employs diplomatic means in the attempt to compel another to do our will. Seen in this context of means and ends, the transformation of war into peace and peace into war becomes obvious. Politicians pursue their ends with diplomacy, and, finding the result unsatisfying, turn to force in the attempt to attain the same ends. The use of force either attains these ends satisfactorily, in which case the war ends, or the ends are not attained, and eventually the war ends because it is seen as ineffectual in attaining the desired ends, and the politicians return to diplomacy in the attempt to secure that which could not be gotten by force.

Omnipresent War

Recent history has been rich in indecisive conflicts — the Colombian civil war, the Lebanese civil war, and the recently settled Sri Lankan civil war — in which the combatants have passed between peace table and battlefield as though through a revolving door. In such contexts, “peace” means little, and the temporary absence of armed conflict is only called peace for lack of a better term.

In so far as peace is an ideal — and we are well familiar with this ideal from literature and art — and not merely the cessation of hostility or the temporary absence of armed conflict, the greater part of the world for the greater part of history has not known peace. It was a tradition among the Romans that the doors to the Temple of Janus — called the Gates of War — would be closed in time of peace. This is said to have happened only five times in the combined history of the Republic and the Empire.

. . . . .