One of the most annoying constructions of contemporary sociology is Richard Florida’s conception of the “creative class.” Florida isn’t necessarily wrong in his claims, and indeed I am sympathetic to some of his arguments, though much of his analysis turns upon taking a naïve conception of creativity and moving the goal posts so that this intuitive conception of creativity comes to be bestowed upon patently uncreative individuals who pad the ranks of the corporate hierarchy. By marginalizing a “Bohemian” creative class and putting at the center of his analysis the suits who congratulate themselves on being creative, he has arguably misconstrued the sources of creativity in society, but that is not what I want to focus on today.

Here is how Florida defines his “creative class”:

“I define the core of the Creative Class to include people in science and engineering, architecture and design, education, arts, music, and entertainment whose economic function is to create new ideas, new technology, and new creative content. Around this core, the Creative Class also includes a broader group of creative professionals in business and finance, law, health care, and related fields. These people engage in complex problem solving that involves a great deal of independent judgment and requires high levels of education or human capital. In addition, all members of the Creative Class — whether they are artists or engineers, musicians or computer scientists, writers or entrepreneurs — share a common ethos that values creativity, individuality, difference, and merit.”

Richard Florida, The Rise of the Creative Class, Revisited, second edition, pp. 8-9

Florida’s use of the phrase “high human capital individuals” (employed throughout his book) raises the question of who exactly the low human capital individuals are. Needless to say, formulations like this are self-congratulatory to the point of delusion, because no one who uses the phrase “high human capital individuals” believes themselves to be anything other than a high human capital individual. Here Nietzsche is relevant, though what he said of philosophers must now be applied to sociology: It has gradually become clear to me what every great social science up till now has consisted of — namely, the personal confession of its originator, and a species of involuntary and unconscious memoir.

We need not employ Florida’s annoying formulations. Let’s consider another approach to essentially the same idea. Take, for example, Marx’s version of the “creative class”:

“Milton, who wrote Paradise Lost, was an unproductive worker. On the other hand, a writer who turns out work for his publisher in factory style is a productive worker. Milton produced Paradise Lost as a silkworm produces silk, as the activation of his own nature. He later sold his product for £5 and thus became a merchant. But the literary proletarian of Leipzig who produces books, such as compendia on political economy, at the behest of a publisher is pretty nearly a productive worker since his production is taken over by capital and only occurs in order to increase it.”

Karl Marx, Capital, Vol. 1, London et al.: Penguin, 1976, p. 1044

Clearly, Marx here evinces no romantic notions of the creative genius in isolation, praising the Leipzig hack over the genius of Milton. And this is Florida’s conception of “creativity” in a nutshell, nearly indistinguishable from “productivity” as used in contemporary economics. One can picture Richard Florida reading this passage from Marx and nodding his head with an odd grin on his face.

Suit-and-tie guys who are “knowledge workers” in their own imaginations, but who are in reality time-servers in a corporate hierarchy, are the members of the “creative class” who fulfill the function that Marx assigned to the Leipzig hack. In other words, the same kind of people who, fifty years ago, would have been reading the Financial Times and the Wall Street Journal are the same people who still today are reading the Financial Times and the Wall Street Journal, but now they fancy themselves part of the “creative class” and they take micro-doses of LSD when they go to Burning Man each year to “unleash” their creativity.

But this is exactly the kind of “creative class” that the global economy wants and needs; Marx had put his finger on something important when he raised the Leipzig hack over Milton. The less creative you are, and the more you have adapted yourself to be a creature of the institutions you are serving, the more successful you will be (according to conventional measures of success) and the more money you will make.

The pedestrian fact of the matter is that industry — whether something as flashy as the film industry or something as prosaic as the energy industry — advances mediocrities to its top positions. Usually the top people are mediocrities with some redeeming qualities, or a hint of limited talent, but still mediocrities. The truly creative types know that mediocrities are being advanced beyond them and taking the top positions in the industry, and that there is nothing they can do about this. These truly creative types aren’t living the life of the one percent; indeed, they aren’t living the life of the ten percent. Most of them make less than six figures, and there are probably many plumbers, sheet rockers, electricians, and truck drivers who make a lot more than they do, and who have no massive college debt hanging over their heads.

The Bohemian creatives, the ones actually creating things, find themselves in the position of performing alienated labor at the behest of their corporate masters, who neither understand nor appreciate them. Having failed to learn one of the simplest lessons in life — that you catch more flies with honey than with vinegar — the lowest strata of the creative class spew their resentment at every opportunity. (The dirtbag left today might be thought of as part of a Bohemian fringe of creative types, though at the political end of the creative spectrum.) They are so convinced of their own virtue that they are unable to see or to comprehend that they themselves have become the bitter, punitive gatekeepers that as “creatives” they presume to despise.

Resentment, it seems, flows uphill. Society, by creating a permanently resentful underclass (the basis of the entirety of society, because the underclass holds the jobs that keep industrialized civilization functioning), begets a popular culture derived from this pervasive resentment. That resentment eventually finds its way into the routines of comedians, into television, into films, and ultimately into élite cultural institutions, which imagine themselves setting the cultural and aesthetic agenda but which in fact respond like reactionaries to the authentic energies of the lower classes.

The phenomenon of resentment flowing uphill manifests itself powerfully among the “creative class.” As we have seen, the most creative members of the creative class experience the appearance of fame but the financial reality of entry-level positions, so that they belong to the permanent underclass and its bitterly resentful view of the world, which is a view of the world from the bottom up. They are well aware of their low financial status, and that they do not share in the rewards of the uncreative members of the “creative class.”

Ultimately, the resentment of the creative class and the bourgeoisie becomes, over time, the resentment of the élites, and this is when we know that society is rotten from top to bottom. When those who have been given every advantage and every preferment in life are bitter and angry about their world, clearly something has gone off the rails. Of course, the resentment of the élites is expressed in a distinctive way, filtered through their thinly veiled dog whistles and symbols, but it is not only there to be seen, as clear as day; it is also pervasively present throughout the institutions that they superintend.

Apparently, it isn’t enough to rule the world and to enjoy a standard of living that is the envy of the masses; more than this, one must have the acquiescence of those masses in their subjection to the rule of élites. Mere compliance and conformity are not enough; there is also to be some formal recognition that the élites deserve their status and are making the best choices for the rest of us. (We live in a meritocracy, right?) When this recognition is not forthcoming, we glimpse the resentment of the élites for those they fancy the low human capital individuals.

It is a fascinating commentary on the resentment of the élites who grow out of a “creative class” that Nietzsche’s analysis of ressentiment crucially turns upon creativity:

“The slave revolt in morality begins when ressentiment itself becomes creative and gives birth to values: the ressentiment of natures that are denied the true reaction, that of deeds, and compensate themselves with an imaginary revenge. While every noble morality develops from a triumphant affirmation of itself, slave morality from the outset says No to what is ‘outside,’ what is ‘different,’ what is ‘not itself’; and this No is its creative deed. This inversion of the value-positing eye — this need to direct one’s view outward instead of back to oneself — is of the essence of ressentiment: in order to exist, slave morality always first needs a hostile external world; it needs, physiologically speaking, external stimuli in order to act at all — its action is fundamentally reaction.”

Friedrich Nietzsche, On the Genealogy of Morals, First Essay, section 10

This is a dialectic of creativity, in which creativity has nothing to work on, so it works on nothing — ressentiment is the creation of new values from nothing. It is an ex nihilo morality par excellence. Nietzsche once wrote of the finest flower of ressentiment, as related by Walter Kaufmann:

“Among the exceedingly few discoveries made in recent times concerning the origin of moral value judgments, Friedrich Nietzsche’s discovery of ressentiment as the source of such value judgments is the most profound, even if his more specific claim that Christian morality and in particular Christian love are the finest ‘flower of ressentiment’ should turn out to be false.”

From Walter Kaufmann’s introduction to his translation of On the Genealogy of Morals

Today our finest flower of ressentiment is the resentful élite who rule over us with a bad conscience — the creative class, the powerful, the educated, the well connected, the wealthy — and who never tire of reminding us of how deeply we have disappointed them. This is the kind of contempt that is exhibited when urbanites speak of “white trash” or some similar social construct that expresses the bitter hatred of the privileged for the downtrodden. Both in the US and the UK, the political parties that formerly represented the interests of the working classes have been transformed in the past half century into parties that represent urbanized professionals, and they do not even bother to veil their contempt for the working class, who now appear to them as a distasteful embarrassment at best, a contemptible mass at worst, fit only to be ridiculed and despised.

In a Nietzschean analysis, one would expect the creative few to be de facto Übermenschen, and so possessed of the virtues of the Übermensch — or, if you prefer, the virtù of the Übermensch — and therefore among the least resentful elements in society, because the Übermensch expends his energies. If we were a society dominated by a truly creative class, we should be a society and an economy of supermen, creating new values and spontaneously releasing any pent-up energies; but it is ressentiment that rules the present. Why?

The artificiality of our institutions, which demands that the ruling élites must bend the knee to democratic forms and make a pretense to upholding the rule of law that, in theory, binds their actions no less than ours, constitutes the hostile external world against which the ruling élites react, the Other that is Outside and Different. The creative deed of the élites of the creative class is its emphatic “No!” directed against the world from which it seeks to distinguish itself. Robbed of triumphant affirmation, they must rule without appearing to rule, and the reality of power coupled with its seeming denial is creating new values even now, though these are values that only can be savored in submerged and secret places — that is to say, in the hearts of the members of the creative class.

. . . . .

Grand Strategy Annex

. . . . .



Nietzsche’s Big History

One of the most succinct formulations of Big History of which I am aware is a brief paragraph from Nietzsche:

“In some remote corner of the universe, poured out and glittering in innumerable solar systems, there once was a star on which clever animals invented knowledge. That was the haughtiest and most mendacious minute of ‘world history’ — yet only a minute. After nature had drawn a few breaths the star grew cold, and the clever animals had to die.”

“On Truth and Lie in an Extra-Moral Sense,” Friedrich Nietzsche, Fragment, 1873: from the Nachlass. Translated by Walter Kaufmann

…and in the original German:

In irgend einem abgelegenen Winkel des in zahllosen Sonnensystemen flimmernd ausgegossenen Weltalls gab es einmal ein Gestirn, auf dem kluge Tiere das Erkennen erfanden. Es war die hochmütigste und verlogenste Minute der “Weltgeschichte”: aber doch nur eine Minute. Nach wenigen Atemzügen der Natur erstarrte das Gestirn, und die klugen Tiere mußten sterben.

Über Wahrheit und Lüge im außermoralischen Sinne, Friedrich Nietzsche, 1873, aus dem Nachlaß

This passage has been translated several times, so, for purposes of comparison, here is another translation:

“In some remote corner of the universe that is poured out in countless flickering solar systems, there once was a star on which clever animals invented knowledge. That was the most arrogant and the most untruthful moment in ‘world history’ — yet indeed only a moment. After nature had taken a few breaths, the star froze over and the clever animals had to die.”

On Truth and Lying in an Extra-Moral Sense (1873), edited and translated with a critical introduction by Sander L. Gilman, Carole Blair, and David J. Parent, New York and Oxford: Oxford University Press, 1989

Bertrand Russell, who rarely passed over an opportunity to criticize Nietzsche in the harshest terms, expressed a tragic interpretation of human endeavor that is quite similar to Nietzsche’s capsule big history:

“That Man is the product of causes which had no prevision of the end they were achieving; that his origin, his growth, his hopes and fears, his loves and his beliefs, are but the outcome of accidental collocations of atoms; that no fire, no heroism, no intensity of thought and feeling, can preserve an individual life beyond the grave; that all the labours of the ages, all the devotion, all the inspiration, all the noonday brightness of human genius, are destined to extinction in the vast death of the solar system, and that the whole temple of Man’s achievement must inevitably be buried beneath the debris of a universe in ruins — all these things, if not quite beyond dispute, are yet so nearly certain, that no philosophy which rejects them can hope to stand. Only within the scaffolding of these truths, only on the firm foundation of unyielding despair, can the soul’s habitation henceforth be safely built.”

Bertrand Russell, “A Free Man’s Worship”

Even closer to Nietzsche, in both style and spirit, is the passage that immediately precedes this in the same essay by Russell, told, as with Nietzsche, in the form of a parable:

“For countless ages the hot nebula whirled aimlessly through space. At length it began to take shape, the central mass threw off planets, the planets cooled, boiling seas and burning mountains heaved and tossed, from black masses of cloud hot sheets of rain deluged the barely solid crust. And now the first germ of life grew in the depths of the ocean, and developed rapidly in the fructifying warmth into vast forest trees, huge ferns springing from the damp mould, sea monsters breeding, fighting, devouring, and passing away. And from the monsters, as the play unfolded itself, Man was born, with the power of thought, the knowledge of good and evil, and the cruel thirst for worship. And Man saw that all is passing in this mad, monstrous world, that all is struggling to snatch, at any cost, a few brief moments of life before Death’s inexorable decree. And Man said: ‘There is a hidden purpose, could we but fathom it, and the purpose is good; for we must reverence something, and in the visible world there is nothing worthy of reverence.’ And Man stood aside from the struggle, resolving that God intended harmony to come out of chaos by human efforts. And when he followed the instincts which God had transmitted to him from his ancestry of beasts of prey, he called it Sin, and asked God to forgive him. But he doubted whether he could be justly forgiven, until he invented a divine Plan by which God’s wrath was to have been appeased. And seeing the present was bad, he made it yet worse, that thereby the future might be better. And he gave God thanks for the strength that enabled him to forgo even the joys that were possible. And God smiled; and when he saw that Man had become perfect in renunciation and worship, he sent another sun through the sky, which crashed into Man’s sun; and all returned again to nebula.

“‘Yes,’ he murmured, ‘it was a good play; I will have it performed again.’”

Here Russell, unlike Nietzsche, gives theological meaning to the spectacle, however heterodox that meaning may be; I can easily imagine someone preferring Russell’s theological version to Nietzsche’s secular version, though both highlight the meaninglessness of human endeavor in a thermodynamic universe.

Our sun — a star among stars — will be a relatively early casualty in the heat death of the universe. While the life of the sun is orders of magnitude beyond the life of the individual human being, as soon as we understood that the sun’s life will pass through predictable stages of stellar evolution, we understood that the sun, like any human being, was born, will shine for a time, and then will die, and, when the sun dies, everything that is dependent upon the light of the sun for life will die also. It is only if we can make ourselves independent of the sun that we will not inevitably share the fate of the sun.

The idea that the sun is a star among stars, and that any star will do in terms of supporting human life, is embodied in a quote attributed to Wernher von Braun by Tom Wolfe and reported in Bob Ward’s book about von Braun:

“The importance of the space program is not surpassing the Soviets in space. The importance is to build a bridge to the stars, so that when the Sun dies, humanity will not die. The Sun is a star that’s burning up, and when it finally burns up, there will be no Earth — no Mars — no Jupiter.”

quoted in Dr. Space: The Life of Wernher von Braun, Bob Ward, Chapter 22, p. 218, with a footnote giving as the source, “Transcript, NBC’s Today program, New York, November 11, 1998”

Wernher von Braun had seized upon the essential insight of existential risk mitigation, as had many involved in the space program from its inception. As soon as one adopts a naturalistic understanding of the place of humanity in the universe, and as technology develops to a point at which its extrapolation offers human beings options and alternatives within the universe, anyone will draw the same conclusion. Another quote from von Braun makes the same point in another way:

“…man’s newly acquired capability to travel through outer space provides us with a way out of our evolutionary dead alley.”

Bob Ward, Dr. Space: The Life of Wernher von Braun, Annapolis, US: Naval Institute Press, 2013.

I have previously written about the idea that humanity is a solar species, but the fact that humanity and the biosphere from which we derive have been utterly dependent upon solar insolation is an accident of history. Any sun will do. We can, accordingly, re-conceive humanity as a stellar species, the kind of species that requires a star and its planetary system to make a home for ourselves. In this sense, all species of planetary endemism are stellar species.

Even this idea of immigration to another star, and of any other star being as good as the sun, is ultimately too narrow. Our sun, or any star, can be the source of energy that powers our civilization, but it can easily be seen that substitute forms of energy could equally well power the future of our civilization, and that it has merely been an historical contingency — a matter of our planetary endemism — that we have been dependent upon a single star, or upon any star, for our energy needs.

This more radical and farther-reaching vision is embodied in a quote attributed to Ray Bradbury by Oriana Fallaci:

“Don’t let us forget this: that the Earth can die, explode, the Sun can go out, will go out. And if the Sun dies, if the Earth dies, if our race dies, then so will everything die that we have done up to that moment. Homer will die. Michelangelo will die, Galileo, Leonardo, Shakespeare, Einstein will die, all those will die who now are not dead because we are alive, we are thinking of them, we are carrying them within us. And then every single thing, every memory, will hurtle down into the void with us. So let us save them, let us save ourselves. Let us prepare ourselves to escape, to continue life and rebuild our cities on other planets: we shall not be long of this Earth! And if we really fear the darkness, if we really fight against it, then, for the good of all, let us take our rockets, let us get well used to the great cold and heat, the no water, the no oxygen, let us become Martians on Mars, Venusians on Venus, and when Mars and Venus die, let us go to the other solar systems, to Alpha Centauri, to wherever we manage to go, and let us forget the Earth. Let us forget our solar system and our body, the form it used to have, let us become no matter what, lichens, insects, balls of fire, no matter what, all that matters is that somehow life should continue, and the knowledge of what we were and what we did and learned: the knowledge of Homer and Michelangelo, of Galileo, Leonardo, Shakespeare, of Einstein! And the gift of life will continue.”

Oriana Fallaci, If the Sun Dies, New York: Atheneum, 1966, pp. 14-15

Fallaci refers to this as a “prayer,” and indeed we might see this as a prayer or a catechism of the Space Age — not a belief, not merely belief, but an imperative ever-present in the hearts and minds of those who have fully imbibed the spirit of the age and who seek to carry that spirit forward with evangelical fervor, proselytizing to the masses and bringing them to the True Faith through purity of will and vision — another way of saying naïveté.

Do the clever animals have to die? No, not yet. Not if they are clever enough to move on to another planet, another star, another galaxy. Not if they are clever enough to change themselves so that, when the changed conditions of the universe in which they exist no longer allow the lives of clever animals to continue, what the clever animals have achieved can be preserved in some other way, and they themselves can be preserved in another form.

. . . . .


Leopold von Ranke (1795 – 1886)

In George Orwell’s dystopian classic Nineteen Eighty-Four there occurs a well known passage that presents a frightening totalitarian vision of history:

“And if all others accepted the lie which the Party imposed — if all records told the same tale — then the lie passed into history and became truth. ‘Who controls the past,’ ran the Party slogan, ‘controls the future: who controls the present controls the past.’ And yet the past, though of its nature alterable, never had been altered. Whatever was true now was true from everlasting to everlasting. It was quite simple. All that was needed was an unending series of victories over your own memory. ‘Reality control’, they called it: in Newspeak, ‘doublethink’.”

George Orwell, Nineteen Eighty-Four, Part One, Chapter 3

What Orwell called “…an unending series of victories over your own memory” is something anticipated by Nietzsche, who, however, placed it in the context of pride rather than dissimulation:

“I have done that,” says my memory. “I cannot have done that,” says my pride, and remains inexorable. Eventually — memory yields.

Friedrich Nietzsche, Beyond Good and Evil: Prelude to a Philosophy of the Future, section 68

The phrase above identified as the “party slogan” — Who controls the past, controls the future: who controls the present controls the past — is often quoted out of context to give the misleading impression that this was asserted by Orwell as his own position. This is, rather, the Orwellian formulation of the Stalinist position. (Stalin reportedly hated both Nineteen Eighty-Four and Animal Farm.) The protagonist of Nineteen Eighty-Four, Winston Smith, is himself part of the totalitarian machinery, rewriting past newspaper articles so that they conform to current party doctrine, and re-touching photographs to erase individuals who had fallen out of favor — both of which Stalin presided over in fact.

The idea that control over history entails control over the future, and that control over history is a function of control in the present, constitutes a political dimension to history. Winston Churchill (who is said to have enjoyed Nineteen Eighty-Four as much as Stalin loathed it) came close to this when he reportedly said, “History will be kind to me, for I intend to write it.” This political dimension to history is one of which Orwell and other authors have repeatedly made us aware. There is another political dimension to history that is more difficult to fully appreciate, because it requires much more knowledge of the past to understand.

Fully appreciating the political dimension of history requires more than mere knowledge of the past, which seems empirically unproblematic; it also requires an understanding of the theoretical context of historiography. The name of Leopold von Ranke is not well known outside historiography, but Ranke has had an enormous influence in historiography and this influence continues today even among those who have never heard his name. Here is the passage that made Ranke’s historiographical orientation — the idea of objective and neutral history that we all recognize today — the definitive expression of a tradition of historiographical thought:

“History has had assigned to it the office of judging the past and of instructing the present for the benefit of future ages. To such high offices the present work does not presume; it seeks only to show what actually happened.”

Leopold von Ranke, History of the Latin and Teutonic Nations

The deceptively simple phrase, what actually happened (in German: wie es eigentlich gewesen), became a slogan if not a rallying cry among historians. The whole of the growth of scientific historiography, to which I have referred in many recent posts — Scientific Historiography and the Future of Science and Addendum on Big History as the Science of Time among them — is entirely predicated upon the idea of showing what actually happened.

Sometimes, however, there is a dispute about what actually happened, and the historical record is incomplete or ambiguous, so that to get the whole story we must attempt to fill in the ellipses employing what R. G. Collingwood called the historical a priori imagination (cf. The A Priori Futurist Imagination). Historical extrapolation, placed in this Collingwoodian context, makes it clear that the differing ways in which the historical record is filled in and filled out is due to the use of different a priori principles of extrapolation.

I have noted that diachronic extrapolation is a particular problem in futurism, since it develops historical trends in isolation and thereby marginalizes the synchrony of events. So, too, diachronic extrapolation is a problem in historiography, as it fills in the ellipses of history by a straightforward parsimonious extrapolation — as though one could unproblematically apply Ockham’s razor to history. (The symmetry of diachronic extrapolation in history and futurism nicely reveals how futurism is the history of the future and history the futurism of the past.) The political dimension of history is one of the synchronic forces that represents interaction among contemporaneous events, and this is the dimension of history that is lost when we lose sight of contemporaneous events.

There were always contemporaneous socio-political conflicts that defined the terms and the parameters of past debates; in many cases, we have lost sight of these past political conflicts, and we read the record of the debate on a level of abstraction and generality that it did not have as it occurred. In a sense, we read a sanitized version of history — not purposefully sanitized (although this is sometimes the case), not sanitized for propagandistic effect, but sanitized only due to our limited knowledge, our ignorance, our forgetfulness (at times, a Nietzschean forgetfulness).

Many historical conflicts that come down to us, while formulated in the most abstract and formal terms, were at the time political “hot button” issues. We remember the principles today, and sometimes we continue to debate them, but the local (if not provincial) political pressures that created these conflicts have often all but disappeared, and considerable effort is required to return to these debates and to recover the motivating forces. I have noted in many posts that particular civilizations are associated with particular problem sets, and following the dissolution of a particular civilization, the problems, too, are not resolved but simply become irrelevant — as, for example, the Investiture Controversy, which was important to agrarian-ecclesiastical civilization, but which has no parallel in industrial-technological civilization.

Some of these debates (like that of the Investiture Controversy) are fairly well known, and extensive scholarly research has gone into elucidating the political conflicts of the time that contributed to these debates. However, the fact that many of these past ideas — defunct ideas — are no longer relevant to the civilization in which we live makes it difficult to fully appreciate them as visceral motives in the conduct of public policy.

Among the most well-known examples of politicized historiography is what came to be called the Black Legend, which characterized the Spanish in the worst possible light. In fact, the Spanish were cruel and harsh masters, but that does not mean that every horrible thing said about them was true. But it is all too easy to believe the worst about people of whom one has reason to believe the worst, and to embroider stories with imagined details that become darker and more menacing over time. During the period in which the Black Legend originated, Spain was a world empire with no parallel, enforcing its writ in the New World, across Europe, and even in Asia (notably in the Philippines, named for the Spanish monarch Philip II). As the superpower of its day, Spain was inevitably going to be the target of smears, which only intensified as Spain became the leading Catholic power in the religious wars that so devastated Europe in the early modern period. Catholics called Protestants heretics, and Protestants called the Pope the Antichrist; in this context, political demonization was literal.

There are many Black Legends in history, often the result of conscious and purposeful propagandistic effort. There are also, it should be noted, white legends, also the work of intentional propaganda. White legends whitewash a chequered history — exactly the task that Stalin set for Soviet civilization and which Winston Smith undertook for Oceania.

. . . . .

Philip II of Spain (1527-1598)

. . . . .


. . . . .

Grand Strategy Annex

. . . . .


Friedrich Nietzsche (1844–1900)

Nietzschean Economics: A Utopian Division of Labor

One of my favorite quotes from Nietzsche is not at all well known, though it comes from what is probably Nietzsche’s best-known book, Beyond Good and Evil:

“In the end, it must be as it is and has always been: great things for the great, abysses for the profound, shudders and delicacies for the refined, and in sum, all rare things for the rare.”

“Zuletzt muss es so stehn, wie es steht und immer stand: die grossen Dinge bleiben für die Grossen übrig, die Abgründe für die Tiefen, die Zartheiten und Schauder für die Feinen, und, im Ganzen und Kurzen, alles Seltene für die Seltenen. —”

Nietzsche, Beyond Good and Evil, section 43

In so saying Nietzsche was echoing one of his own earlier pronouncements — something he often did in refining his own formulations. Here is the earlier version of the same idea:

“My Utopia. — In a better arranged society the heavy work and trouble of life will be assigned to those who suffer least through it, to the most obtuse, therefore; and so step by step up to those who are most sensitive to the highest and most sublimated species of suffering and who therefore suffer even when life is alleviated to the greatest degree possible.”

“M e i n e U t o p i e. — In einer besseren Ordnung der Gesellschaft wird die schwere Arbeit und Noth des Lebens Dem zuzumessen sein, welcher am wenigsten durch sie leidet, also dem Stumpfesten, und so schrittweise aufwärts bis zu Dem, welcher für die höchsten sublimirtesten Gattungen des Leidens am empfindlichsten ist und desshalb selbst noch bei der grössten Erleichterung des Lebens leidet.”

Nietzsche, Human, All Too Human, section 462

I cited both of these passages in my Variations on the Theme of Life (in a footnote to section no. 404), where I wrote (with Nietzsche firmly in mind):

“As an educated taste discriminates finer distinctions, appreciates more subtleties, and discerns greater detail, so an educated intellect conceives more clearly, sees in sharper outline, and penetrates deeper than an uneducated intellect. Knowledge sharpens awareness; understanding focuses consciousness.”

I started thinking about Nietzsche’s utopian division of labor again when I was reading a recent column in the Financial Times. The column in question was Lucy Kellaway’s advice column, to which individuals write in questions, and Lucy Kellaway responds, also inviting responses from her readership. This is one of my favorite FT features, and in fact I wrote in to respond to one of these questions last year and my answer was published (anonymously, of course) among a selection of other comments from FT readers.

The question in question, Why can’t I get a job?, was one almost calculated to provoke a response from FT readers:

“In 2009, I graduated from a top-tier US university with a degree in European history and since then I have struggled to find work in the US. I tried civilian intelligence, then finance and venture capital — everything from sales to being a police officer. Now, in despair, I am enlisting in the swollen US military. I believe my liberal arts education has given me a good basis for joining the workforce (I also speak Russian) but it seems employers do not agree. They prefer candidates from a state university with qualifications in business or marketing. What has gone wrong?”

Lucy Kellaway (who, by the way, is an Oxford PPE graduate) responded (in part) as follows:

“In career terms your degree has been a waste of time. It has not prepared you for the workforce at all: writing essays about Bismarck or the causes of the Crimean war is no grounding for the world of spreadsheets and marketing campaigns… The point of a history degree is not to get a job at the end of it but to broaden the mind, to learn to write a proper sentence — something that, though good in itself, is neither necessary nor sufficient to get on in corporate life.”

Many of the reader responses in the FT were more openly derisive of a liberal arts education than was Ms. Kellaway — this is, I suppose, to be expected from a business publication. But Kellaway and her readers are, ultimately, right: a humanistic education has no place in industrialized civilization. Being part of the “workforce” means being able to do the practical things demanded by an industrialized economy, and these things today are dominated by computer and technical and marketing skills. The accomplishments of a traditional humanistic education literally have no place in the world today.

Not only is a liberal education a “waste of time,” as Ms. Kellaway puts it, but it could be argued that it is an actual impediment to fulfilling one’s role in the workforce. It is entirely possible to competently undertake some technical task without any knowledge or appreciation of history, philosophy, literature, poetry, or art. And an awareness of such things may well be a distraction that could obstruct a meticulous and purely instrumental attention to a technical task. Moreover, it is well known that highly educated people are often dissatisfied with their work and are therefore a source of discontent in the lives of coworkers. This may help to explain why the very idea of “higher” civilization has become controversial today, and why industrialized modernity, in terms of its contribution to the tradition of civilization, cannot be considered a peer competitor (or even near-peer competitor) to classical antiquity or medievalism.

As pathetic as the questioner sounds, he has a point also; he, too, is ultimately right. He had probably been told to follow his passion, and he had gotten into a “good” school, but he did not realize that the world is changing at an ever-faster pace, and that sinecures that might have been available in the recent past are rapidly becoming unavailable as the contemporary economy is ruthlessly pared down to a sleek and minimalist functionalism, like the buildings we now have in our cities instead of the gorgeous architecture of ages past.

In my Variations on the Theme of Life, which I quoted above, I also wrote:

“Fire a young man with ambition, fill his mind with an edifying education, swell his heart with proper pride, urge him to dream big dreams, tell him that the world waits like a ripe fruit that comes to meet the hand that plucks it, prepare him for a life of adventure and achievement — then show him the practical impossibility of attaining his ambitions, and you may just as well have shown him the instruments of his martyrdom.” (section 41)

This is what happens to many young people who follow idealistic advice on their career choice, rather than the kind of hard-headed career advice dispensed by readers of the Financial Times. Who can fault them? The young are, by and large, by nature passionate, idealistic and innocent. It is a violation of that innocence to tell them the hard facts of life, and if they are told, they may not listen.

The irrelevance of humanistic education was not always the case. In the ancient world, a humanistic education was central to obtaining a position in political society. The kind of men for whom Aristotle wrote his Nicomachean Ethics — other people (like slaves) didn’t matter and were therefore invisible to ancient philosophical ethics — would have obtained an education in philosophy and rhetoric as a preparation for a public career, which was essentially the only kind of career such men could have. Under such a system — the socio-economic system of the agricultural paradigm — the vast majority of people spent their time farming the land, while only a tiny minority were literate administrators making up urban, civil society.

Now the masses, who once labored on farms, labor in production facilities or in offices, and now they are literate, and they may have a say in the running of the political machinery by which they are ruled. The industrial revolution that created these changed social conditions is still quite young in historical terms. In many places in the world it has occurred within the lifetime of those now living. Traditional social institutions have struggled to keep up with the pace of change dictated by the industrial revolution, and educational institutions are no exception. The ideal of traditional humanistic scholarship is still to be found, like a vestigial trace of an earlier age, but changed conditions are rendering it progressively more marginal with the passage of time.

Whereas once masses unfit for farming but with no other option in life ended up doing agricultural work and drinking themselves into a stupor on holidays in order to forget the misery of their lives, now masses unfit (by and large) for industrial production or office work labor at these tasks because these are the tasks that are available, not because they are the best things for people to be doing (or the things that people do best), and they too drink themselves into a stupor to forget the misery of their lives.

Someone who hates their work is not likely to be a productive and effective worker. Someone who is indifferent to their work is not likely to be much more productive or efficient than someone who outright hates it, but the conditions of labor today virtually guarantee that the greater part of a vastly swollen human population will labor at jobs to which they are indifferent, or which they openly despise.

There is an amusing and probably uncomfortably true description of unmotivated office work on the Asian Failure blog. This is from I almost got fired today!:

“I have absolutely no interest in the well being of the company, or my individual assignments, or my reputation, or even my self preservation for the most part. My general attitude has been to glide just under the radar, and skim by with just enough to keep getting paid and not get fired — but just like a dog you just bring home, I test all the boundaries of what I can get away with first… There are days where I come into work at 11 AM, surf the internet until 5PM. I have two monitors. That means youtube on one monitor, and reading comics and police blotters on the other. Then I work for about 30 minutes, and then I duck out 15 minutes before 6.”

(This is funnier in context; you should read the whole post. I have edited it for my present purposes.)

Probably everyone knows someone — maybe many people — who work at mind-numbing dead-end jobs, or who once had a passion but couldn’t earn a living from it and so went on to more “practical” pursuits. All of this lost passion and lost opportunity to do anything greater is a very real economic loss. An economy that could find a way to truly tap the ambitions and creativity of its population would find itself surging ahead of competitors.

When I think of the people I have known in my life, and reflect, as a kind of thought experiment, on what these individuals might have been capable of doing, I realize how much the right person in the right position could accomplish. Now, I am sure that my labor assignments in my Walter Mitty economy would probably surprise some of the people I have placed in imaginary positions of importance. Nevertheless, I quite sincerely believe that a better distribution of labor is possible under an alternative socio-economic structure, though I cannot say what form that economic system would take. But if such an economic system could, one day in the future, come into being, it would closely resemble the utopian division of labor that Nietzsche considered.

Just to reiterate: if anyone (or any society) can find a way to harness the passion, enthusiasm, and good will that people bring to work that they love, they will have an enormous competitive advantage. While the perennial dream of a better world is often a mere pipe dream, an economy able to tap the full talents of a population, rather than having intelligent and creative people stapling and date-stamping papers, would be a more productive, more profitable, and more resilient economy that would grow at a faster rate than existing economic institutions. These are practical, concrete advantages you can take to the bank, not pipe dreams.

. . . . .

Against My Ruin

27 February 2011


——————I sat upon the shore
Fishing, with the arid plain behind me
Shall I at least set my lands in order?
London Bridge is falling down falling down falling down
Poi s’ascose nel foco che gli affina
Quando fiam uti chelidon — O swallow swallow
Le Prince d’Aquitaine à la tour abolie
These fragments I have shored against my ruins
Why then Ile fit you. Hieronymo’s mad againe.
Datta. Dayadhvam. Damyata.
———-Shantih shantih shantih

T. S. Eliot, The Waste Land, “What the Thunder Said”

The moral innocence of youth is understood to reflect the inexperience of youth, and as time passes and experience accumulates, youth passes and is replaced by the person shaped by the experiences that have robbed that person of youth. Yet we can think of experience in two senses which could be called experiences of agency and experiences of sufferancy (following the distinction I drew between agents — those who act — and sufferants — those who suffer the actions of others — in Agents and Sufferants). Most experiences involve both acting and suffering, but many experiences are predominantly one or the other. When we consider the life experiences that bring us from youth to maturity, we can make a rough distinction between those experiences we initiated and therefore, in a sense, “did” to ourselves, and those experiences that befell us, sometimes the result of what others “did” to us, and sometimes simply the result of what happened to us quite apart from any intentional agency.

One thing that I have learned from middle age is how losses accumulate in life: we suffer more losses the longer we live — losses of all kinds. Now that I understand a little better the reality of loss, I look to those older people that I know (like my parents) and I find myself asking how people can continue to go on as the losses mount. The answer, of course, is that some individuals do not go on. Some among us are overwhelmed by losses and are broken by them, in some sense or other of “break.” Just as there are many senses of loss, so too there are many ways of being broken. (I previously wrote about what it means to be broken in Broken Lives.)

Most of us are not broken. Even those who suffer repeated catastrophic losses may not be catastrophically broken, although the experience of loss certainly changes us even if it does not break us. The little losses that mount over time, like the mass wasting that silently, incrementally levels mountains, break us in small ways, a little bit at a time. We become broken in a thousand minor ways. That is to say, we become damaged. Most of us are damaged, even if we are not broken.

T. S. Eliot, in his repudiated book, After Strange Gods, wrote that “…the damage of a lifetime, and of having been born in an unsettled society, cannot be repaired at the moment of composition.” I came upon this quote in Walter Kaufmann at a time when Eliot’s book was virtually unobtainable. (Now the whole book can be read by all, for free, on the internet.) Kaufmann took this as a sign of Eliot feeling sorry for himself, though with the full text available we can consider a longer quote that doesn’t sound quite so self-pitying:

“No sensible author, in the midst of something that he is trying to write, can stop to consider whether it is going to be romantic or the opposite. At the moment when one writes, one is what one is, and the damage of a lifetime, and of having been born into an unsettled society, cannot be repaired at the moment of composition. The danger of using terms like ‘romantic’ and ‘classic’ — this does not however give us permission to avoid them altogether — does not spring so much from the confusion caused by those who use these terms about their own work, as from inevitable shifts of meaning in context.”

T. S. Eliot, After Strange Gods, p. 26

This is the recognizable voice of Eliot the critic. But Eliot the poet also recognized the toll of loss, and the predictable human reaction to loss, in the final lines of The Waste Land: “These fragments I have shored against my ruins.” While as a critic Eliot had his splenetic moments, Eliot the poet — whether the early poet of The Waste Land or the late poet of the Four Quartets — was much too much the artist to give vent to mere sentimentality. Eliot as a poet is a witness to a moral truth, and not a self-pitying scold.

While even the most passive among us will inevitably suffer losses, merely as a sufferant, one may also suffer losses as a result of taking action and placing oneself in a position of agency. Indeed, failed action is often a pretext for a defeated individual to renounce his agency and profess a cataclysmic or eschatological conception of history in which human beings are understood to suffer only and be almost without ability to act. In this way a Weltanschauung may embody the self-pity of those broken by loss, and a loss can become a pretext for the denial of human agency.

More interesting than a conversion attributable to loss are those losses knowingly suffered as a consequence of agency. One can become broken, damaged, and imperfect even while striving toward the attainment of greater perfection — or especially because of such striving. To pursue a momentous undertaking is to consciously take risk, and to consciously take risk is to be aware of the ever-present possibility of failure. And even if one is successful in one’s momentous undertaking, there will almost certainly be casualties, even if one is not oneself broken. Being the cause of another’s suffering is, in turn, its own particular species of suffering.

At this point we may wish to appeal to what can be called the principle of inoculation, most famously expressed in an aphorism of Nietzsche: “That which does not destroy me makes me stronger.” I do not wish to deny this outright, but it is a principle that admits of qualifications. Often one is stronger in one sense from having suffered adversity, even while in another sense one is damaged.

From a naturalistic perspective, the one observation that can be made here (i.e., the one naturalistic observation that is not merely a reiteration of the brutal facts of life) is that every loss is a selection event, and that those that remain have been selected for. This may be cold comfort with the memory of those selected against still fresh in the mind, but it remains true and can be accepted on some level as a naturalistic form of hope. When we are ready for it. This day may not yet have dawned.

. . . . .

The Last Civilization

10 February 2011


The subtitle of Nietzsche’s Beyond Good and Evil, one of his most influential books, is Prelude to a Philosophy of the Future, and in a strangely moving passage (section 214 of the same book) he referred collectively to himself and his readers as, “first born of the twentieth century.” Nietzsche was not a “futurist” in the sense we know the term today, but his philosophy was centered on the future.

Nietzsche’s conception of a future Übermensch who would supersede humanity as we know it today is of course one of the most well known and indeed notorious aspects of Nietzsche’s thought. In fact, just last night I watched the very entertaining and informative documentary Protagonist, in which a now-reformed bank robber repeatedly stated that during his years of crime he believed himself to be a Nietzschean Superman. Whether you admire or despise the idea of the Übermensch, this was Nietzsche’s vision for what the future might be at its best. But this wasn’t the only future imagined by Nietzsche. He also imagined a worst case scenario for the future, and this worst case scenario was the Last Man (in German: der letzte Mensch).

In a couple of comments to my posts, Greg Lawson has drawn particular attention to Nietzsche’s Last Man. Mr. Lawson noted that Nietzsche’s Last Man appears in the title of Francis Fukuyama’s The End of History and the Last Man, so the idea retains a certain currency. Nietzsche’s exposition of the Last Man occurs in Section 5 of the preface of Thus Spoke Zarathustra:

“They have something whereof they are proud. What do they call it, that which maketh them proud? Culture, they call it; it distinguisheth them from the goatherds.

They dislike, therefore, to hear of ‘contempt’ of themselves. So I will appeal to their pride.

I will speak unto them of the most contemptible thing: that, however, is THE LAST MAN!”

And thus spake Zarathustra unto the people:

It is time for man to fix his goal. It is time for man to plant the germ of his highest hope.

Still is his soil rich enough for it. But that soil will one day be poor and exhausted, and no lofty tree will any longer be able to grow thereon.

Alas! there cometh the time when man will no longer launch the arrow of his longing beyond man—and the string of his bow will have unlearned to whizz!

I tell you: one must still have chaos in one, to give birth to a dancing star. I tell you: ye have still chaos in you.

Alas! There cometh the time when man will no longer give birth to any star. Alas! There cometh the time of the most despicable man, who can no longer despise himself.

Lo! I show you THE LAST MAN.

“What is love? What is creation? What is longing? What is a star?” — so asketh the last man and blinketh.

The earth hath then become small, and on it there hoppeth the last man who maketh everything small. His species is ineradicable like that of the ground-flea; the last man liveth longest.

“We have discovered happiness” — say the last men, and blink thereby.

They have left the regions where it is hard to live; for they need warmth. One still loveth one’s neighbour and rubbeth against him; for one needeth warmth.

Turning ill and being distrustful, they consider sinful: they walk warily. He is a fool who still stumbleth over stones or men!

A little poison now and then: that maketh pleasant dreams. And much poison at last for a pleasant death.

One still worketh, for work is a pastime. But one is careful lest the pastime should hurt one.

One no longer becometh poor or rich; both are too burdensome. Who still wanteth to rule? Who still wanteth to obey? Both are too burdensome.

No shepherd, and one herd! Every one wanteth the same; every one is equal: he who hath other sentiments goeth voluntarily into the madhouse.

“Formerly all the world was insane,” — say the subtlest of them, and blink thereby.

They are clever and know all that hath happened: so there is no end to their raillery. People still fall out, but are soon reconciled—otherwise it spoileth their stomachs.

They have their little pleasures for the day, and their little pleasures for the night, but they have a regard for health.

“We have discovered happiness,” — say the last men, and blink thereby. —

And here ended the first discourse of Zarathustra, which is also called “The Prologue”: for at this point the shouting and mirth of the multitude interrupted him. “Give us this last man, O Zarathustra,”—they called out—”make us into these last men! Then will we make thee a present of the Superman!” And all the people exulted and smacked their lips. Zarathustra, however, turned sad, and said to his heart:

“They understand me not: I am not the mouth for these ears.

Too long, perhaps, have I lived in the mountains; too much have I hearkened unto the brooks and trees: now do I speak unto them as unto the goatherds.”

Nietzsche’s focus is on the contemptible Last Man himself, and his fellow last men, but I will observe that the Last Man, if and when he emerges from history, will not emerge in a vacuum. The Last Man will be a product of the Last Civilization. The Last Civilization, like the Last Man, is contemptible, and smugly self-satisfied in its contemptible status. Like the fool who, Solomon said, delights in his folly, so too the Last Man delights in his contemptible nature, and the Last Civilization delights in the Last Men it has produced, disporting themselves as the contemptible creatures they are. As the Last Man sees himself as the ultimate product of civilization, after which nothing more can possibly follow, so the Last Civilization understands itself as the ultimate civilization, and misunderstands its ultimacy as an expression of its “higher” nature.

Is it possible to discern in the present whether man is becoming the Last Man or the Superman? And has our civilization turned a crucial corner to head decisively either in the direction of the Last Civilization or in the direction of Higher Civilization? Not long ago in The Very Idea of Higher Civilization I argued that contemporary industrialized civilization has not yet even begun to compete with the excellence of classical antiquity or the high points of medieval civilization. To date, industrialized civilization is not a peer competitor with any civilization of the past.

This worries me, and I hope that it worries you, too. Industrialized civilization seems to be producing the conditions for the Last Man to someday reign, and therefore seems to be transforming itself into the Last Civilization. A simple, uninterrupted development of current trends would issue in precisely this fate. If contemporary industrialized civilization does not eventually produce the conditions of its self-transcendence and thereby justify itself through the creation of truly great works of civilization, distinctive of its milieu, then we will certainly evolve into the Last Man. Continued mediocrity is sufficient for the Last Man to triumph and to create (and be created by) the Last Civilization.

. . . . .

I have long had it on my mind to write about the Last Man, and also to write about structural forces in industrialized civilization that tend toward the degradation of excellence. I had not planned to bring these two ideas together; this is something that just happened to occur to me today. So I still have (at least) two more posts to write on these topics separately, but these thoughts are not yet sufficiently mature to expose them to the light of day.

. . . . .

The Genealogy of Ideas

24 October 2010


Previously I discussed idea diffusion in Civilization and Idea Diffusion, but even as I posted that short contribution, I realized its inadequacy. A suitably detailed treatment of idea diffusion and its place in the history of human experience would run to volumes. What we need, perhaps, rather than the traditional history of ideas, is a genealogy of ideas. “Genealogy” in this sense comes from Nietzsche’s use of the term and his implementation of the idea, but it is Foucault who brought this kind of Nietzschean genealogy to maturity.

In his essay “Nietzsche, Genealogy, History” (collected in the volume Language, Counter-Memory, Practice) Foucault wrote:

Genealogy is gray, meticulous, and patiently documentary. It operates on a field of entangled and confused parchments, on documents that have been scratched over and recopied many times…

Genealogy… requires patience and a knowledge of details and it depends on a vast accumulation of source material. Its “cyclopean monuments” are constructed from “discreet and apparently insignificant truths and according to a rigorous method”; they cannot be the product of “large and well-meaning errors.” In short, genealogy demands relentless erudition. Genealogy does not oppose itself to history as the lofty and profound gaze of the philosopher might compare to the molelike perspective of the scholar; on the contrary, it rejects the metahistorical deployment of ideal significations and indefinite teleologies. It opposes itself to the search for “origins.”

Michel Foucault, Language, Counter-Memory, Practice, “Nietzsche, Genealogy, History,” pp. 139-140

These are obviously the principles and practices by which Foucault pursued his scholarly research. And this is exactly what we need for the mind: instead of a history of ideas, as that discipline has been practiced, we need a genealogy of ideas that is as gray and patient and meticulous as the research that Foucault imagines (and which he in fact pursued) in reference to more familiar topics of history.

Equally obviously, I cannot do anything to even approach this in the space of a blog post, except to point out the need for such an approach, and to observe the relationship that a genealogy of ideas would have to the idea of idea diffusion as an historical process. A genealogy of ideas would trace, in detail, the paths of idea diffusion, if there are any such paths in a given case. Ideas diffuse over both time and space. The diffusion leaves a trace along the path those ideas have taken. In time, ideas experience descent with modification, and in space ideas experience adaptive radiation. These processes are not isolated from each other, but rather occur concurrently.

Foucault emphasized the meticulousness and detail required by genealogy, and we need to bring these scholarly habits to the genealogy of ideas. Because it is so difficult to deal with ideas with precision — it requires an unfamiliar effort of thought to do so — ideas have more often been given vague and ambiguous treatment that has caused them to be held in low repute. But if we can bring rigorous habits of mind to the genealogy of ideas, we could contribute to restoring ideas to their proper dignity.

For example, idea diffusion can occur on many different levels. We must pay careful attention to how we count our ideas, and how we place each idea within a hierarchy of ideas, so as not to conflate ideas of different orders of magnitude. Idea diffusion can take place on many different levels because any given particular falls under many different ideas.

How many squares are there on a chess board? It depends upon how we count them, and how we count them will depend upon how narrowly we have defined “square” in this context. Moreover, some definitions will admit of more than one answer because of the vagueness they incorporate, while some definitions will be more precise, and precisely because they are more precise they will exclude instances that are included under broader, less restrictive definitions. On a chess board there are, of course, the individually colored squares, and there are 64 of these. But the chess board taken on the whole is also a square. If we count both the individual squares and the whole, there are 65 squares. But there are also squares made up of 4, 9, 16, and so forth individual colored squares. There is no right or wrong answer here; it is only a matter of setting up a convention upon which we can agree. And once we have agreed upon a rigorously defined convention, we are prepared to treat the question of the number of squares on a chess board with precision.
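The counting conventions described above can be made completely explicit. As a minimal illustrative sketch (the function name is my own), here is the broadest convention, under which a “square” is any k-by-k block of unit squares: a k-by-k square can start at any of (n − k + 1) positions along each axis, so an 8-by-8 board contains 204 squares in all — the 64 individually colored squares and the whole board among them.

```python
def count_squares(n: int) -> int:
    """Count all k-by-k squares on an n-by-n board, for k = 1 up to n.

    Under this convention, a k-by-k square can be positioned with its
    top-left corner at any of (n - k + 1) places per axis, giving
    (n - k + 1) squared placements for each size k.
    """
    return sum((n - k + 1) ** 2 for k in range(1, n + 1))

# On a standard chess board: 64 unit squares + 49 two-by-twos + ... + 1 eight-by-eight
print(count_squares(8))  # 204
```

The narrower convention in the paragraph above — only the 64 unit squares plus the board as a whole, for 65 — is just as defensible; the point is that each answer follows rigorously once the convention is fixed.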

This may seem like a silly exercise, but it is very much to the point. Without rigorous definitions, we will never be capable of thinking precisely about ideas. And given that few people ever make the time or take the effort to formulate rigorous definitions of ideas (except for mathematicians), it follows that ideas are usually not conceived with the requisite precision.

All ideas, and not just chess board squares, are to a greater or lesser extent subject to ambiguity, and therefore can only be treated with precision after we have made the appropriate effort to conceive them rigorously. Yesterday in Epistemological Warfare we remarked upon how all phases of the OODA loop (AKA the Boyd cycle) are theory-laden, therefore subject to interpretation, and therefore potentially ground for divergent observations, divergent orientations, divergent decisions, and divergent actions. This is partly a consequence of the ambiguity of the ideas employed in formulating the OODA loop. The more rigorously we can deal with each element of the cycle, the more we can minimize (though not eliminate) these divergences.

. . . . .
