Grand Strategy: Nine Years

8 November 2017


This month marks nine years of Grand Strategy: The View from Oregon. I started regularly posting in November 2008. Since then I have continuously maintained this blog, though my rate of posting has declined, especially over the past couple of years. My reduced rate of posting here is not due to my running out of ideas. On the contrary, I have more material than I can even write down. My posts have become more detailed and in-depth, which requires more research and more care in composition. This also means that I hesitate to post my more half-baked ideas. When I look back on some of my early posts I find things that I would never write today: it is no longer enough for me to suggest an idea; I want to develop the ideas that I present.

Already sensing my hesitation to post half-baked ideas some years ago, and knowing that the key to working out ideas is to maintain a continuous engagement with them (which is best done by writing about them every day), I started a blog on Tumblr, Grand Strategy Annex, where I post more spontaneously, just so that I can keep the ideas flowing without monitoring each word so closely that scholarly conscience prevents me from writing anything at all. I’m glad that I did this, even though it divides my efforts, because I often capture an idea in a quick Tumblr post first, and later incorporate it in a longer post here, or on Medium, or on Centauri Dreams.

In addition to these online writings (and three Twitter accounts), I also keep numerous notebooks in which I write in longhand, and I work on dozens of different manuscripts on my computer. All this material, if collected together, would run to many thousands of pages. And over the past year or so I have discovered that I can accelerate my formulation of ideas even more by always carrying a digital recorder with me. I spend a lot of time each day driving around and running errands, and now I use that time to listen to the ideas that I have recorded on previous days and to elaborate on them in further recordings. That means that I also have hundreds of spoken-word notes that have not been transcribed. So, as I said above, I haven’t run out of ideas.

My approach to philosophy is what in the early modern period was called copia. (Erasmus wrote a short book On Copia of Words and Ideas.) I prioritize the generation of new ideas. I can imagine that, to someone who pursues the other strategy — that of confining oneself to a small number of ideas and spending a lifetime elaborating these in the most detailed and comprehensive manner possible — this sounds like a rather trivial way to think about things. However, I would suggest that one is statistically more likely to hit upon a significant idea by surveying many of them rather than focusing on a familiar few.

A blog is a good way to present the results of a copia strategy in philosophy, but I sometimes have misgivings about the time I put into writing blog posts. I could instead use this time to refine a manuscript. I worry that another ten years of writing blog posts may mean that I never produce anything more substantial. But I have already tried the book strategy. More than ten years ago I produced a couple of books that I self-published (Political Economy of Globalization and Variations on the Theme of Life). I thought (naïvely, as it turns out) that these two books would develop a readership over time, if only I could be patient. This has not happened. I changed my strategy and started writing blog posts instead of books as a compromise. While my blog readership is very small, at least these posts do occasionally get read, and when I post to Paul Gilster’s Centauri Dreams I have gotten as many as a hundred thoughtful comments on a single post. That is real engagement, and it is worth the effort to know that others have read carefully and have responded thoughtfully.

Part of my strategy of writing blog posts, then, follows from my native temperament; some of my strategy follows from my peculiar circumstances. Individuals in an academic or scholarly community, I assume, have others with whom they can have informal conversations in which they can float ideas that are not yet ready for systematic exposition. It is necessary to have a sympathetic ear for this sort of thing, as any tender, young, and inchoate idea can easily be torn apart. What is important is to try to discern within an idea if it has potential. Since I do my work in isolation, I float my ideas here. And what I post here is but a small fragment of the ideas I am working on at any given moment.

I won’t say that I have chosen the right strategy, and I certainly know that I haven’t chosen an optimal strategy, but I have chosen a strategy that is consonant with my own temperament. This consonance plays a role in the development of my ideas. Because I am doing what comes naturally to me, without any extrinsic prompting from any source outside myself, this is something that I can continue to do as long as I have life in me. It does not get old to me; the salt does not lose its savor.

. . . . .

Grand Strategy Annex

. . . . .


Last month, November 2016, marked the eighth anniversary of this blog. My first post, Opening Reflection, was dated 05 November 2008. Since then I have continued to post, although less frequently of late. I have become much less interested in tossing off a post about current events, and more interested in more comprehensive and detailed analyses, though blog posts are rarely associated with comprehensivity or detail. But that’s how I roll.

It is interesting that we have two distinct and even antithetical metaphors to identify non-trivial modes of thought. I am thinking of “dig deep” or “drill down” on the one hand, and, on the other hand, “overview” or “big picture.” The two metaphors are not identical, but each implies a particular approach to non-triviality, with the former implying an immersion in a fine-grained account of anything, while the latter implies taking anything in its widest signification.

Ideally, one would like to be both detailed and comprehensive at the same time — formulating an account of anything that is, at once, both fine-grained and which takes the object of one’s thought in its widest signification. In most cases, this is not possible. Or, rather, we find this kind of scholarship only in the most massive works, like Gibbon’s Decline and Fall of the Roman Empire, or Mario Bunge’s Treatise on Basic Philosophy. Over the past hundred years or so, scholarship has been going in exactly the opposite direction. Scholars focus on a particular area of thought, and then produce papers, each one of which focuses even more narrowly on one carefully defined and delimited topic within a particular area of thought. There is, thus, a great deal of very detailed scholarship, and less comprehensive scholarship.

Previously in Is it possible to specialize in the big picture? I considered whether it is even possible to have a scholarly discipline that focuses on the big picture. This question is posed in light of the implied dichotomy above: comprehensivity usually comes at the cost of detail, and detail usually comes at the cost of comprehensivity.

Another formulation of this dichotomy that brings out other aspects of the dilemma would be to ask whether it is possible to be rigorous about the big picture, or whether it is possible to give a detailed account of the big picture — a fine-grained overview, as it were. I guess this is one way to formulate my ideal: a fine-grained overview — thinking rigorously about the big picture.

While there is some satisfaction in being able to give a concise formulation of my intellectual ideal — a fine-grained overview — I cannot yet say if this is possible, or if the ambition is chimerical. And if the ambition for a fine-grained overview is chimerical, is it chimerical because finite and flawed human beings cannot rise to this level of cognitive achievement, or is it chimerical because it is an ontological impossibility?

While an overview may necessarily lack the detail of a close and careful account of anything, so that the two — overview and detail — are opposite ends of a continuum, implying the ontological impossibility of their union, I do know, on the other hand, that clear and rigorous thinking is always possible, even if it lacks detail. Clarity and rigor — or, if one prefers the canonical Cartesian formulation, clear and distinct ideas — is a function of disciplined thinking, and one can think in a disciplined way about a comprehensive overview. If one allows that a fine-grained overview can be finely grained in virtue of the fine-grained conceptual infrastructure that one employs in the exposition of that overview, then, certainly, comprehensive detail is possible in this respect (even if in no other).

I could, then, re-state my ambition as formulated in my opening reflection such that, “my intention in this forum to view geopolitics through the prism of ideas,” now becomes my intention to formulate a fine-grained overview of geopolitics through the prism of ideas. But, obviously, I now seldom post on geopolitics, and am out to bag bigger game. This is, I think, implicit in the remit of a comprehensive overview of geopolitics. F. H. Bradley famously said, “Short of the Absolute God cannot stop, and, having reached that goal, He is lost, and religion with Him.” We might similarly say, short of big history geopolitics cannot stop, and, having reached that goal, it is lost, and political economy with it.

. . . . .

Grand Strategy Annex

. . . . .


Jean Piaget

One of the important ideas from Piaget’s influential conception of cognitive development is that of perspective taking. The ability to coordinate the perspectives of multiple observers of one and the same state of affairs is a cognitive skill that develops with time and practice, and the mastery of perspective taking coincides with cognitive maturity.

From a philosophical standpoint, the problem of perspective taking is closely related to the problem of appearance and reality, since one and the same state of affairs not only appears from different perspectives for different observers, it also appears from different perspectives for one and the same observer at different times. In other words, appearance changes — and presumably reality does not. It is interesting to note that developmental psychologists following Piaget’s lead have in fact conducted tests with children in order to understand at what stage of development they can consistently distinguish between appearance and reality.

Just as perspective taking is a cognitive accomplishment — requiring time, training, and natural development — and not something that happens suddenly and all at once, so too the cognitive maturity of which perspective taking is a mark does not occur all at once. Both maturity and perspective taking continue to develop as the individual develops — and I take it that this development continues beyond childhood proper.

While I find Piaget’s work quite congenial, the developmental psychology of Erik Erikson strikes me as greatly oversimplified, with its predictable crises at each stage of life, and the implicit assumption built in that if you aren’t undergoing some particular crisis that strikes most people at a given period of life, then there is something wrong with you; you ought to be experiencing the right crisis at the right time. That being said, what I find of great value in Erikson’s work is his insistence that development continues throughout the human lifespan, and does not come to a halt after a particular accomplishment of cognitive maturity is achieved.

Piagetian cognitive development in terms of perspective taking can easily be extended throughout the human lifespan (and beyond) by the observation that there are always new perspectives to take. As civilization develops and grows, becoming ever more comprehensive as it does so, the human beings who constitute this civilization are forced to formulate ever more comprehensive conceptions in order to take the measure of the world being progressively revealed to us. Each new idea that takes the measure of the world at a greater order of magnitude presents the possibility of a new perspective on the world, and therefore the possibility of a new achievement in terms of perspective taking.

The perspectives we attain constitute a hierarchy that begins with the first accomplishment of the self-aware mind, which is egocentric thought. Many developmental psychologists have described the egocentric thought patterns of young children, though the word “egocentric” is now widely avoided because of its moralizing connotations. I, however, will retain the term “egocentric,” because it helps to place this stage within a hierarchy of perspective taking.

The egocentric point of departure for human cognition does not necessarily disappear even when it is theoretically surpassed, because we know egocentric thinking so well from the nearly universal phenomenon of human selfishness, which is where the moralizing connotation of “egocentric” no doubt has its origin. An individual may become capable of coordinating multiple perspectives and still value the world exclusively from the perspective of self-interest.

In any case, the purely egocentric thought of early childhood confines the egocentric thinker to a tightly constrained circle defined by one’s personal perspective. While this is a personal perspective, it is also an impersonal perspective in so far as all individuals share this perspective. It is what Francis Bacon called the “idols of the cave,” since every human being, “has a cave or den of his own, which refracts and discolours the light of nature.” This has been well described in a passage from F. H. Bradley made famous by T. S. Eliot, because the latter quoted it in a footnote to The Waste Land:

“My external sensations are no less private to myself than are my thoughts or my feelings. In either case my experience falls within my own circle, a circle closed on the outside; and, with all its elements alike, every sphere is opaque to the others which surround it… In brief, regarded as an existence which appears in a soul, the whole world for each is peculiar and private to that soul.”

F. H. Bradley, Appearance and Reality, p. 346, quoted by T. S. Eliot in footnote 48 to The Waste Land, “What the Thunder Said”

I quote this passage here because, like my retention of the term “egocentric,” it can help us to see perspectives in perspective, and it helps us to do so because we can think of expanding and progressively more comprehensive perspectives as concentric circles. The egocentric perspective is located precisely at the center, and the circle described by F. H. Bradley is the circle within which the egocentric perspective prevails.

The next most comprehensive perspective taking beyond the transcendence of the egocentric perspective is the transcendence of the ethnocentric perspective. The ethnocentric perspective corresponds to what Bacon called the “idols of the marketplace,” such that this perspective is, “formed by the intercourse and association of men with each other.” The ethnocentric perspective can also be identified with the sociosphere, which I recently discussed in Eo-, Eso-, Exo-, Astro- as an essentially geocentric conception which, in a Copernican context, should be overcome.

Beyond ethnocentrism and its corresponding sociosphere there is ideocentrism, which Bacon called the “idols of the theater,” and which we can identify with the noösphere. Bacon well described the ideocentric perspective in terms of philosophical systems, such that, “all the received systems are but so many stage-plays, representing worlds of their own creation after an unreal and scenic fashion.” Trans-ethnic communities of ideology and belief, like the world’s major religions and political ideologies, represent the ideocentric perspective.

The transcendence of the ideocentric perspective by way of more comprehensive perspective taking brings us to the anthropocentric perspective, which can be identified with the anthroposphere (still a geocentric and pre-Copernican conception, as with the other -spheres mentioned above). The anthropocentric perspective corresponds to Bacon’s “idols of the tribe,” which Bacon described thus:

“The Idols of the Tribe have their foundation in human nature itself, and in the tribe or race of men. For it is a false assertion that the sense of man is the measure of things. On the contrary, all perceptions as well of the sense as of the mind are according to the measure of the individual and not according to the measure of the universe. And the human understanding is like a false mirror, which, receiving rays irregularly, distorts and discolours the nature of things by mingling its own nature with it.”

Bacon was limited by the cosmology of his time, so that he could not readily identify further idols beyond the anthropocentric idols of the (human) tribe, just as we are limited by the cosmology of our time. Yet we do today have a more comprehensive perspective than Bacon, and we can identify a few more stages of more comprehensive perspective taking. Beyond the anthropocentric perspective there is the geocentric perspective, the heliocentric perspective, and even what we could call the galacticentric perspective — as when early twentieth-century cosmologists argued over whether the Milky Way was the only galaxy and constituted an “island universe.” Now we know that there are other galaxies, and we can be said to have transcended the galacticentric perspective.

As I wrote above, as human knowledge has expanded and become more comprehensive, ever more comprehensive perspective taking has come about in order to grasp the concepts employed in expanding human knowledge. There is every reason to believe that this process will be iterated indefinitely into the future, which means that perspective taking also will be indefinitely iterated into the future. (I attempted to make a similar and related point in Gödel’s Lesson for Geopolitics.) Therefore, further levels of cognitive maturity wait for us in the distant future as accomplishments that we cannot yet attain at this time.

This last observation allows me to cite one more relevant developmental psychologist, namely Lev Vygotsky, whose cognitive mediation theory of human development makes use of the concept of a zone of proximal development (ZPD). Human development, according to Vygotsky, takes place within a proximal zone, and not at any discrete point or stage. Within the ZPD, certain accomplishments of cognitive maturity are possible. In the lower ZPD lies the actual zone of development, while in the upper ZPD lies the potential zone of development, which can be attained through cognitive mediation by the proper prompting of an already accomplished mentor. Beyond the upper ZPD, even if there are tasks yet to be accomplished, they cannot be accomplished within this particular ZPD.

With the development of the whole of human knowledge, we’re on our own. There is no cognitive mediator to help us over the hard parts and assist us in the more comprehensive perspective taking that will mark a new stage of cognitive maturity and possibly also a new zone of proximal development in which new accomplishments will be possible. But this has always been true in the past, and yet we have managed to make these breakthroughs to more comprehensive perspectives of cognitive maturity.

I hope that the reader sees that this is both hopeful and sad. Hopeful because this way of looking at human knowledge suggests indefinite progress. Sad because we will not be around to see the accomplishments of cognitive maturity that lie beyond our present zone of proximal development.

. . . . .

Grand Strategy Annex

. . . . .


Yesterday in A Review of Iranian Capabilities I mentioned the current foreign policy debate over the idea of a preventative war against Iran and recounted some of Iran’s known capabilities.

Reflecting on these attempts to make a case for or against preventative war with Iran, I was led back in my thoughts to a post I wrote last summer about what I called The Possible War. In that post I tried to emphasize that ex post facto criticisms of conduct in war — like criticisms of the Allies’ strategic bombing of Germany during the Second World War — presume a parity of capability and opportunity that almost never obtains in fact. Military powers do not engage in ideal wars that meet certain standards; they fight the war that they are able to fight, and this is the possible war.

Moving beyond a description of the possible war, the idea can be formulated as a principle, the principle of possible wars, and the principle is this: in any given conflict, each party to the conflict will fight the war that it is possible for that party to fight. In other words, no party to a conflict is going to fight a war that it is impossible for it to fight. In other words again, no party to a conflict is going to fight a losing war on the basis of peer-to-peer engagement if there is a non-peer strategy that will win the war. This sort of thing makes good poetry, as in The Charge of the Light Brigade, but in so far as it ensures failure in a campaign, it exerts a strong negative selection over military powers that pursue such policies.

A given political entity (whether a state or a non-state entity) will always seek to maximize its advantage by employing its most effective available means against its adversary’s most vulnerable available target. This is what makes war brutal and ugly, and this is why it has been said since ancient times that inter arma enim silent leges.

There is a sense in which this principle of possible wars is simply an extension of the classic twin principles of mass and economy of forces. Each party to a conflict concentrates as much force as it can at a point it believes the adversary to be most vulnerable, and the enemy is simultaneously trying to do the same thing. If we think of concentration as concentration of effort, rather than mere numbers of battalions, and we think of vulnerability as any way in which an enemy can be defeated, and not merely a point on the line that is insufficiently defended, then we have the principle of possible war.

War is not always and inevitably brutal and ugly, and the principle of possible wars helps us to understand why this is the case. Previously in Civilization and War as Social Technologies I discussed how in particular historical circumstances warfare can become highly ritualized and stylized. There I cited the non-Western examples of Samurai sword fighting, retained in Japan long after the rest of the world was fighting with guns, and the Aztec Flower Battle, which combined religious rituals of sacrifice with the honor and prestige requirements of combat. However, there are Western precedents for ritualized combat as well, as when, in the ancient world, each party to a conflict would choose an individual champion and the issue was decided by single combat.

Another example of semi-ritualized combat in Western history is the early modern Condottieri wars on the Italian peninsula. Before the large-scale armies of the French and the Spanish crossed the Alps to pillage and plunder Italy, the peninsula was dominated by wealthy city-states who hired mercenary armies under Condottieri captains to wage war against each other. With two mercenary armies facing each other on the battlefield, there was a strong incentive to minimize casualties, and there are some remarkable stories from the era of nearly bloodless battles.

Another example would be the maneuver warfare of small, professional European armies during the Enlightenment, who sometimes managed to fight limited wars with a minimal impact on non-combatants. This may well have been a cultural response to the horrific slaughter of the Thirty Years War.

In these latter two examples, limited wars were the possible war because a sufficient number of social conventions and normative presuppositions were shared by all parties to the conflict, who were willing to abide by the results of the contest even when a more ruthless approach might have secured a Pyrrhic victory. Under these socio-political conditions, limited wars were possible wars because all parties recognized that it was in their enlightened self-interest not to escalate wars beyond a certain threshold. Such social conventions touching even upon the conduct of war can only be effective in a suitably homogenous cultural region.

After the escalating total wars leading up to the middle of the twentieth century, limited wars emerged again out of fear of crossing the nuclear threshold. Parties to the conflicts were willing to abide by the issue of these limited wars because the alternative was mutually assured destruction. Also, all parties to proxy wars knew they would have another chance at achieving their goals in another theater when the proxy war would shift to another region of the world. Thus limited wars became possible wars because the alternative was unthinkable.

. . . . .

Grand Strategy Annex

. . . . .

A Theory of Gift Exchange

25 December 2011


There is a tension and a paradox at the heart of gift exchange. The pure gift is given without thought of reciprocity and out of generosity for its own sake, but the very idea of a gift exchange implies that two or more parties will engage in a ritualized and mutual transfer of gifts. Thus in gift exchange, there is an expectation of reciprocity and even of symmetry: gifts are expected to be of roughly equal value, except in special circumstances. And the fact that there are (tacitly acknowledged) exceptions to axiological symmetry is a nod to the fact that gift exchange is a calculation, and nothing is farther from the spirit of a pure gift than the spirit of calculation.

Still, we try, and sometimes we are successful. Moreover, sometimes we are surprised by the unexpected generosity of the other. Sometimes we are presented with a gift completely unexpectedly, from someone with whom we had no expectation of gift exchange. In such circumstances, we are not prepared to give anything in exchange, so we are “forced” to accept the pure gift, for to refuse it would be inexcusable in most circumstances.

Still, unexpected gifts are sometimes refused. It is not uncommon for an individual who looks to another as a potential romantic partner to give an unexpected gift as a way to announce their interest in that person. In a sense, announcing one’s interest in another with a gift is much like announcing one’s intentions and expectations of future exchange. Thus such a gift may well be refused for the implicit contract of future exchange that it suggests. A charitable person will be polite about this and say things like, “Oh, I could never accept such a present, it is much too expensive.” Perhaps they might also add, “I would feel obligated if I accepted a gift like this.”

In this way — and while I have used the example of a gift exchange between potential romantic partners, it is by no means limited to the romantic dyad — even a forced pure gift, exempt of necessity from immediate exchange, can in fact be part of a gift exchange when understood in a larger context, and therefore fails to constitute a pure gift.

The model of a pure gift is grace, and so the pure gift may be exclusive to theological contexts. While a pure gift may be a rare thing, to what extent can we approximate one? Might we come so close to approximating a pure gift that our gift is, for all practical purposes, a pure gift? I suspect so, though I expect that the more closely a gift approximates the ideal of the pure gift, the rarer it becomes.

Nietzsche wrote the following in his Mixed Opinions and Maxims:

How duty acquires splendor.–The means for changing your iron duty to gold in everyone’s eyes is this: always keep a little more than you promise.

This last suggestion — always keep a little more than you promise — has stayed with me ever since I first read this many years ago. Nietzsche formulates this in the context of duty, and duty is not only a reciprocal and symmetrical relationship, it is usually felt to be a burden. Nietzsche has asked himself, “How can this burden be transformed into something welcome to all?” The surprising answer he gives to this self-posed question is, “Transform duty into a gift” — for when you keep a little more than you promise you go beyond duty in keeping your duty. This is a Nietzschean transvaluation of values of an unexpected sort.

One finds this same idea of Nietzsche’s in the New Testament, in the famous admonition, “And whosoever shall compel thee to go a mile, go with him twain.” (Nietzsche was, after all, a preacher’s son, and he absorbed some of the radical character of the gospels.) In the Holy Land during classical antiquity, a Roman soldier could legally compel a Jew to carry his pack for one mile. Only one mile was required, after which the Jew impressed into such service was free. Obviously, the Jew felt this law to be a humiliation. The obvious response to such an unjust law would be its repeal. The radical response, the response of the gospels, is not only to go the mile without complaint, but to go an extra mile.

If we can inform our duties, including our duties of gift exchange, with the spirit of keeping a little more than you promise or going the extra mile, our duties are transformed into gifts, and since these gifts grow out of duties, they are unexpected and therefore approximate pure gifts. But there are few who have the moral strength to overcome the feeling of humiliation and moreover to transform this humiliation into what Nietzsche called “golden.” Thus the world is more marked by impure than pure gifting.

Impure gift exchanges are undertaken with no pretense of being given out of pure generosity. Most gifts are of this kind. When an employer pays an employee more than he strictly has to pay that employee, and the employee works harder than they strictly might be expected to work, economists call this a “gift exchange.” I realized recently that we can narrow this and call it an “economic gift exchange” and contrast it with other forms of gift exchange.

Recently in the Financial Times I read the following in David Pilling’s column, Modern China is yearning for a new moral code:

“Day to day, most Chinese people are able to put aside the broader moral confusion to perform the little acts of kindness and decency that make a society function.”

What Pilling describes here as the little acts that make a society function constitutes what we might call a social gift exchange: in a smoothly functioning society individual citizens do more than a narrow interpretation of their social and legal duty would stipulate, and the presumptive exchange for this is to live in a society that is better than one which merely fulfills the minimum social and legal requirements.

There are many kinds of social gift exchange: a teacher who makes an extra effort to teach and a student who makes an extra effort to learn, a social worker who goes the extra mile for a client and a client who responds by making an extra effort, and people who yield the right of way for one another in traffic — as when pedestrians, bicyclists, and vehicles do so mutually and on the large scale (and, just as importantly, with the anonymity) of urban population densities.

There are also many instances when social gift exchange fails, and people behave rather badly with each other. I’m sure that anyone reading this can recall many instances from their own experience when others gave only the minimum but demanded an extra measure for themselves in return, or when people engage in conflict over who is and who is not abiding by the minimum social and legal standards.

This idea of social gift exchange can be used to define a healthy and fully functioning society: in a healthy society, social gift exchange is routine and unexceptional; in a pathological society, social gift exchange is mostly absent. In the big picture, it is to be expected that most societies fall somewhere in the middle of this continuum stretching between the pathological and the healthy, though it is also to be expected that in smaller, culturally and ethnically homogeneous societies social gift exchange functions more easily because of shared expectations and a shared understanding — what sociologists call normative consensus.

Thus out of the impure giving of gift exchange, even when undertaken in a spirit of reciprocity and axiological symmetry, something truly noble and socially beneficial can emerge. A social gift exchange that is moreover informed by the spirit of capitalism, that prioritizes delayed gratification as the sure way to wealth, is all the more powerful, since one is not looking for an immediate and symmetrical payoff of one’s social gifting.

As I see it, then, what Marx called “callous ‘cash payment’” and the “icy water of egotistical calculation” may in fact be the thin edge of the wedge for a more just and even a more humane social and moral order. Of course, as we practice it, it is imperfect in the extreme. Whether or not we might perfect social gift exchange brings us to the traditional question of the perfectibility of man. Yet, short of the perfectibility of man, we can still sensibly ask about the betterment of man.

This, then, is the challenge for the large and diverse societies of contemporary nation-states (presumptively established on the basis of nationhood, but in fact established on the basis of the territorial principle in law): how can social gift exchange be encouraged to flourish in a context in which shared expectations and shared understanding may be absent?

. . . . .


. . . . .

Grand Strategy Annex

. . . . .


Aristotle as portrayed by Raphael

Aristotle claimed that mathematics has no ethos (Metaphysics, Book III, Chap. 2, 996a). Aristotle, of course, was more interested in the empirical sciences than his master Plato, whose Academy presumed and demanded familiarity with geometry — and we must understand that for the ancients, long before the emergence of analytical geometry in the work of Descartes (which allows us to formulate geometry algebraically, hence arithmetically), geometry was always axiomatic thought, rigorously conceived in terms of demonstration. For the Greeks, this was the model and exemplar of all rigorous thought, and for Aristotle this was a mode of thought that lacked an ethos.

Euclid provided the model of formal thought with his axiomatization of geometry. Legend has it that there was a sign over the door of Plato's Academy stating, 'Let no one enter here who has not studied geometry.'

In this, I think, Aristotle was wrong, and I think that Plato would have agreed on this point. But the intuition behind Aristotle’s denial of a mathematical ethos is, I think, a common one. And indeed it has even become a rhetorical trope to appeal to rigorous mathematics as an objective standard free from axiological accretions.

In his famous story within a story about the Grand Inquisitor, Dostoyevsky has the Grand Inquisitor explain how “miracles, mystery, and authority” are used to addle the wits of others.


Our human, all-too-human faculties conspire to confuse us, to addle our wits, when we begin talking about morality, so that the purity and rigor of mathematical and logical thought seem to be called into question if we acknowledge that there is an ethos of formal thought. We easily confuse ourselves with religious, mystical, and ethical ideas, and since the great monument of mathematical thought has been mostly free of this particular species of confusion, the denial of an ethos of formal thought can be understood as a strategy to protect and defend the honor of mathematics and logic, preserving them from the morass that envelops most human attempts, however heroically undertaken, to think clearly.

Kant famously said that he had to limit knowledge to make room for faith.

Kant famously stated in the Critique of Pure Reason that, “I have found it necessary to deny knowledge in order to make room for faith.” I should rather limit faith to make room for rigorous reasoning. Indeed, I would squeeze out faith altogether, and find myself among the most rigorous of the intuitionists, one of whom has said: “The aim of this program is to banish faith from the foundations of mathematics, faith being defined as any violation of the law of sufficient reason (for sentences). This law is defined as the identification (by definition) of truth with the result of a (present or feasible) proof…”

Western asceticism can be portrayed as demonic torment or as divine illumination; the same diversity of interpretation can be given to ascetic forms of reason.

Though here again, with intuitionism (and various species of constructivism generally), we have rigor, denial, asceticism — intuitionistic logic is no joyful wisdom. (An ethos of formal thought need not be an inspiring and edifying ethos.) It is logic with a frown, disapproving, censorious — a bitter medicine justified only because it offers hope of curing the disease of contradiction, contracted when mathematics was shown to be reducible to set theory, and the latter shown to be infected with paradox (as if the infinite hubris of set theory were not alone enough for its condemnation). Is the intuitionist’s hope justified? In so far as it is hope — i.e., hope and not proof, the expectation that things will go better for the intuitionistic program than for logicism — it is not justified.
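The intuitionist’s restraint can be exhibited concretely. The law of excluded middle, P ∨ ¬P, is not an intuitionistic theorem, yet its double negation is, and the latter can be proved without any classical axioms. A minimal sketch in Lean 4, whose core logic (absent the Classical axioms) is constructive; the theorem name is my own:

```lean
-- Excluded middle is not provable intuitionistically, but its double
-- negation is, and this proof invokes no classical axioms:
theorem not_not_em (P : Prop) : ¬¬(P ∨ ¬P) :=
  fun h : ¬(P ∨ ¬P) =>
    -- Given h, any proof p : P would yield P ∨ ¬P via Or.inl,
    -- contradicting h; so we have ¬P, hence P ∨ ¬P via Or.inr,
    -- again contradicting h.
    h (Or.inr (fun p : P => h (Or.inl p)))
```

The asymmetry is the point: the intuitionist will not strip the double negation to recover P ∨ ¬P itself, since that last step is precisely the classical inference being refused.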

Dummett has said that intuitionistic logic and mathematics are to wear their justification on their face:

“From an intuitionistic standpoint, mathematics, when correctly carried on, would not need any justification from without, a buttress from the side or a foundation from below: it would wear its own justification on its face.”

Dummett, Michael, Elements of Intuitionism, Oxford University Press, 1977, p. 2

The hope that contradiction will not arise from intuitionistic methods clearly is no such evident justification. As a matter of fact, empirically and historically verifiable, we know that intuitionism has so far resulted in no contradictions, but this could change tomorrow. Intuitionism stands in need of a consistency proof even more than formalism. There is, in its approach, a faith invested in the assumption that infinite totalities caused the paradoxes, and that once we have disallowed reference to them all will go well. This is a perfectly reasonable assumption, but one which, in so far as it is an article of faith, is at variance with the aims and methods of intuitionism.

And what is a feasible proof, which our ultra-intuitionist would allow? Have we not with “feasible proof” abandoned proof altogether in favor of probability? Again, we will allow them their inconsistencies and meet them on their own ground. But we shall note that the critics of the logicist paradigm fix their gaze only upon consistency, and in so doing reveal again their stingy, miserly conception of the whole enterprise.

“The Ultra-Intuitionistic Criticism and the Antitraditional program for the foundations of Mathematics” by A. S. Yessenin-Volpin (who was arguing for intellectual freedom in the Soviet Union at the same time that he was arguing for a censorious conception of reason), in Intuitionism and Proof Theory, quoted briefly above, is worth quoting more fully:

The aim of this program is to banish faith from the foundations of mathematics, faith being defined as any violation of the law of sufficient reason (for sentences). This law is defined as the identification (by definition) of truth with the result of a (present or feasible) proof, in spite of the traditional incompleteness theorem, which deals only with a very narrow kinds [sic] of proofs (which I call ‘formal proofs’). I define proof as any fair way of making a sentence incontestable. Of course this explication is related to ethics — the notion fair means ‘free from any coercion or fraud’ — and to the theory of disputes, indicating the cases in which a sentence is to be considered as incontestable. Of course the methods of traditional mathematical logic are not sufficient for this program: and I have to enlarge the domain of means explicitly studied in logic. I shall work in a domain wherein are to be found only special notions of proof satisfying the mentioned explication. In this domain I shall allow as a means of proof only the strict following of definitions and other rules or principles of using signs.

Intuitionism and proof theory: Proceedings of the summer conference at Buffalo, N.Y., 1968, p. 3

What is coercion or fraud in argumentation? We find something of an illustration of this in Gregory Vlastos’ portrait of Socrates: “Plato’s Socrates is not persuasive at all. He wins every argument, but never manages to win over an opponent. He has to fight every inch of the way for any assent he gets, and gets it, so to speak, at the point of a dagger.” (The Philosophy of Socrates, Ed. by Gregory Vlastos, page 2)

According to Gregory Vlastos, Socrates used the kind of 'coercive' argumentation that the intuitionists abhor.

What appeal to logic does not invoke logical compulsion? Is logical compulsion unique to non-constructive mathematical thought? Is there not an element of logical compulsion present also in constructivism? Might it not indeed be the more coercive form of compulsion that is recognized alike by constructivists and non-constructivists?

The breadth of the conception outlined by Yessenin-Volpin is impressive, but the essay goes on to stipulate the harshest measures of finitude and constructivism. One can imagine these Goldwaterite logicians proclaiming: “Extremism in the defense of intuition is no vice, and moderation in the pursuit of constructivist rigor is no virtue.” Brouwer, the spiritual father of intuitionism, even appeals to the Law-and-Order mentality, saying that a criminal who has not been caught is still a criminal. Logic and mathematics, it seems, must be brought into line. They verge on criminality, deviancy, perversion.

Quine was no intuitionist by a long shot, but as a logician he brought a quasi-disciplinary attitude to reason and adopted a tone of disapproval not unlike Brouwer’s.

The same righteous, narrow, anathematizing attitude is at work among the defenders of what is sometimes called the “first-order thesis” in logic. Quine sees a similar deviancy in modal logic (which is closely related to intuitionistic logic), which he says was “conceived in sin” — the sin of confusing use and mention. These accusations do little to help us understand logic. We would do well to adopt Foucault’s attitude on these matters: “leave it to our bureaucrats and our police to see that our papers are in order. At least spare us their morality when we write.” (The Archaeology of Knowledge, p. 17)

Foucault had little patience for the kind of philosophical reason that seemed to be asking if our papers are in order, a function he thought best left to the police.
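The relation between modal and intuitionistic logic invoked above can be made precise: Gödel (1933) showed that intuitionistic propositional logic can be embedded in the modal system S4, reading the box as provability. One standard formulation of the translation (variants differ in exactly where the boxes fall):

```latex
% Gödel's translation of intuitionistic propositional logic into S4,
% with \Box A read as "A is provable":
\begin{align*}
  p^{\Box} &= \Box p && \text{($p$ atomic)} \\
  (A \land B)^{\Box} &= A^{\Box} \land B^{\Box} \\
  (A \lor B)^{\Box} &= A^{\Box} \lor B^{\Box} \\
  (A \to B)^{\Box} &= \Box\,(A^{\Box} \to B^{\Box}) \\
  (\neg A)^{\Box} &= \Box\,\neg A^{\Box}
\end{align*}
% A is intuitionistically provable iff A^\Box is a theorem of S4
% (Gödel proved the "only if" direction; the converse is due to
% McKinsey and Tarski).
```

So the sin Quine found in modal logic, whatever its gravity, is a sin shared by the constructivists: the two deviancies are formally intertranslatable.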

The philosophical legacy of intuitionism has been profound yet mixed; its influence has been deeply ambiguous (far from the intuitive certainty, immediacy, clarity, and evident justification that it would like to propagate). There is in intuitionism much in harmony with contemporary philosophy of mathematics: its emphasis on practices, its demand for finite constructivity, its anti-philosophical tenor, its opposition to platonism. The Father of Intuitionism, Brouwer, was, like many philosophers, anti-philosophical even while propounding a philosophy. No doubt his quasi-Kantianism put his conscience at rest in the Kantian tradition of decrying metaphysics while practicing it, and his mysticism gave to mathematics a comforting halo (one which softens and obscures the hard edges of intuitionist rigor in proof theory) of the kind some have found in the excesses of platonism.

L. E. J. Brouwer: philosopher of mathematics, mystic, and pessimistic social theorist

In any case, few followers of Brouwer followed him in his Kantianism and mysticism. The constructivist tradition which grew from intuitionism has proved to be philosophically rich, begetting a variety of constructive techniques and as many justifications for them. Even if few mathematicians actually do intuitionistic mathematics, controversies over the significance of constructivism have a great deal of currency in philosophy. And Dummett is explicit about the place of philosophy in intuitionistic logic and mathematics.

The light of reason serves as an inspiration to us as it shines down from above, and it remains an inspiration even when we are not equal to all that it might ideally demand of us.

Intuitionism and constructivism command our respect in the same way that Euclidean geometry commanded the respect of the ancients: we might not demand that all reasoning conform to this model, but it is valuable to know that rigorous standards can be formulated, as an ideal to which we might aspire if nothing else. And an ideal of reason is itself an ethos of reason, a norm to which formal thought aspires, and which it hopes to approximate even if it cannot always live up to the most exacting standard that it can recognize for itself.

. . . . .

Studies in Formalism

1. The Ethos of Formal Thought

2. Epistemic Hubris

3. Parsimonious Formulations

4. Foucault’s Formalism

5. Cartesian Formalism

6. Doing Justice to Our Intuitions: A 10 Step Method

7. The Church-Turing Thesis and the Asymmetry of Intuition

8. Unpacking an Einstein Aphorism

9. Methodological and Ontological Parsimony (in preparation)

10. The Spirit of Formalism (in preparation)

. . . . .


. . . . .

Grand Strategy Annex

. . . . .


William Blake's 'Angel of Revelation' is a fitting figure of the inspirations of genius.

Thinking again of what I wrote a few days ago in The Mind’s Singular Function, I realized that I should have said that, while the product of inspiration could be considered a memorialization of the singular, it is important to note that it is in no sense an attempt to reproduce, recreate, simulate, or imitate the singular episode of inspiration. In so far as a work of creative expression is a memorialization of the singular, it is an oblique memorialization, and mimesis plays no part in it. This follows from the fact, previously noted, that inspiration is not identical to the products derived from it. This makes of the intellectual singular a haecceitas more absolute than any chance event in the mundane world, for the latter may have representational memorializations. Inspiration is not a representational memorialization.

I realize that my formulations are as yet highly imperfect, but I have at least the idea (more or less), which I can attempt to refine and to apply. One application is the possibility of defining genius, which is usually treated as an ineffable quality of mind. But in view of the character of inspiration as the singular in the sphere of the intellect, genius can be defined as the continual, or near-continual, immersion in the singular. This reminds me of a remark attributed to Blake’s wife, which I may have quoted previously (it is one of my favorite quotes): “I have little of Mr. Blake’s company — he is always in paradise.” This puts the matter succinctly, not to mention personalizing it.

This latter formulation — i.e., genius as immersion in the singular — raises the question of what immersion is. I think that degrees of immersion need to be recognized, but that absolute immersion can be given a relatively simple formulation: it is when the mind is so concentrated on a single focus that the remainder of the world is relegated to the periphery. I call this the “undivided mind”. It is a rare but not unknown state of mind. Another formulation is suggested by the extrapolation that I made of the quote attributed to Paul Valéry, namely, to see is to forget the name of the thing one sees. The idea implicit here can ultimately be pushed beyond the senses to the transcendence of thought itself: to think is to forget the name of the thing one thinks.

The two instances I cited in Interests and Identity, Camus saying near the end of his life that his work had not yet begun and Cézanne saying, also near the end of his life, that he was making slow progress, are perfect examples of genius utterly immersed in the object of its fascination. Kenneth Clark, in discussing Mozart, mentioned the “single-mindedness of genius.” This is of a piece with these fragments from the lives of Camus and Cézanne.

Søren Kierkegaard, passionate Protestant preacher that he was, devoted an entire devotional work to the proposition Purity of heart is to will one thing. Kierkegaard writes in Chapter 3 of this work:

So let us, then, upon the occasion of a time of Confession speak about this sentence: PURITY OF HEART IS TO WILL ONE THING as we base our meditation on the Apostle James’ words in his Epistle, Chapter 4, verse 8: “Draw nigh to God and he will draw nigh to you. Cleanse your hands, ye sinners; and purify your hearts ye double-minded.” For only the pure in heart can see God, and therefore, draw nigh to Him; and only by God’s drawing nigh to them can they maintain this purity. And he who in truth wills only one thing can will only the Good, and he who only wills one thing when he wills the Good can only will the Good in truth.

Let us speak of this, but let us first put out of our minds the occasion of the office of Confession in order to come to an agreement on an understanding of this verse, and on what the apostolic word of admonition “purify your hearts ye double-minded” is condemning, namely, double-mindedness. Then at the close of the talk we may return more specifically to a treatment of the occasion.

What I have above called the undivided mind is here seen as the condition of having transcended double-mindedness — as much a concern for a theologian like Kierkegaard as for a thorough-going naturalist. This famous proposition of Kierkegaard can be given a reformulation much as I reformulated the famous line from Valéry, and Kierkegaard thus extrapolated would run like this: Purity of mind is to think one thing.

The single-mindedness of genius, the immersion of the mind in its object, is the purity of mind that comes from thinking one thing.

. . . . .


. . . . .

Grand Strategy Annex

. . . . .
