11 November 2016
When I attempt to look back on my personal history in a spirit of dispassionate scientific inquiry, I find that I readily abandon entire regions of my past in my perhaps unseemly hurry to develop the next idea that I have, eager to see where it leads me. Moreover, contemplating one’s personal history can be a painful and discomfiting experience, so that, in addition to the headlong rush into the future, there is the desire to dissociate oneself from past mistakes, even when these past mistakes were provisional positions, known at the time to be provisional, but which were nevertheless necessary steps in order to begin (as well as to continue) the journey of self-discovery, which is at the same time a journey of discovering the world and of one’s place in the world.
In my limited attempts to grasp my personal history as an essential constituent of my present identity, among all the abandoned positions of my past I find that I understood two important truths about myself early in life (i.e., in my teenage years), even if I did not formulate them explicitly, but only acted intuitively upon things that I immediately understood in my heart-of-hearts. One of these things is that I have never been, am not now, and never will be either of the left or of the right. The other thing is, despite having been told many times that I should have pursued higher education, and despite the fact that most individuals who have the interests that I have are in academia, that I am not cut out for academia, whether temperamentally, psychologically, or socially — notwithstanding the fact that, of necessity, I have had to engage in alienated labor in order to support myself, whereas if I had pursued a career in academia, I might have earned a living by dint of my intellectual efforts.
The autodidact is a man with few if any friends (I could tell you a few stories about this, but I will desist at present). The non-partisan, much less the anti-partisan, is a man with even fewer friends. Adults (unlike childhood friends) tend to segregate along sectional lines, as in agrarian-ecclesiastical civilization we once segregated ourselves even more rigorously along sectarian lines. If you do not declare yourself, you will find yourself outside every ideologically defined circle of friends. And I am not claiming to be in the middle; I am not claiming to strike a compromise between left and right; I am not claiming that I have transcended left and right; I am not claiming that I am a moderate. I claim only that I belong to no doctrinaire ideology.
It has been my experience that, even if you explicitly and carefully preface your remarks with a disavowal of any political party or established ideological position, if you give voice to a view that one side takes to be representative of the other side, they will immediately take your disavowal of ideology to be a mere ruse, and perhaps a tactic in order to gain a hearing for an unacknowledged ideology. The partisans will say, with a knowing smugness, that anyone who claims not to be partisan is really a partisan on the other side — and both sides, left and right alike, will say this. One then finds oneself in overlapping fields of fire. This experience has only served to strengthen my non-political view of the world; I have not reacted against my isolation by seeking to fall into the arms of one side or the other.
This non-political perspective — which I am well aware would be characterized as ideological by others — that eschews any party membership or doctrinaire ideology, now coincides with my sense of great retrospective relief that I did not attempt an academic career path. I have watched with horrified fascination as academia has eviscerated itself in recent years. I have thanked my lucky stars, but most of all I have thanked my younger self for having understood that academia was not for me and for not having taken this path. If I had taken this path, I would myself be subject to the politicization of the academy that in some schools means compulsory political education, increasingly rigid policing of language, and an institution more and more making itself over into the antithesis of the ideal pursuit of knowledge and truth.
But the university is a central institution of western civilization; it is the intellectual infrastructure of western civilization. I can affirm this even as an autodidact who has never matriculated in the university system. I have come to understand, especially in recent years, how it is the western way to grasp the world by way of an analytical frame of mind. The most alien, the most foreign, the most inscrutable otherness can be objectively and dispassionately approached by the methods of scientific inquiry that originated in western civilization. This character of western thought is far older than the scientific revolution, and almost certainly has its origins in the distinctive contribution of the ancient Greeks. As soon as medieval European civilization began to stabilize, the institution of the university emerged as a distinctive form of social organization that continues to this day. Since I value western civilization and its scientific tradition, I must also value the universities that have been the custodians of this tradition. It could even be said that the autodidact is parasitic upon the universities that he spurns: I read the books of academics; I benefit from the scientific research carried on at universities; my life and my thought would not have been possible except for the work that goes on in universities.
It is often said of the Abrahamic religions that they all pray to the same God. So too all who devote their lives to the pursuit of truth pay their respects to the same ancestors: academicians and their institutions look back to Plato’s Academy and Aristotle’s Lyceum, just as do I. We have the same intellectual ancestors, read the same books, and look to the same ideals, even if we approach those ideals differently. In the same way that I am a part of Christian civilization without being a Christian, in an expansive sense I am a part of the intellectual tradition of western civilization represented by its universities, even though I am not of the university system.
As an autodidact, I could easily abandon the western world, move to any place in the world where I was able to support myself, and immerse myself in another tradition, but western civilization means something to me, and that includes the universities of which I have never been a part, just as much as it includes the political institutions of which I have never been a part. I want to know that these sectors of society are functioning in a manner that is consistent with the ideals and aspirations of western civilization, even if I am not part of these institutions.
There are as many autodidacticisms as there are autodidacts; the undertaking is an essentially individual and indeed solitary one, even an individualistic one, hence also essentially an isolated undertaking. Up until recently, in the isolation of my middle age, I had questioned my avoidance of academia. Now I no longer question this decision of my younger self, but am, rather, grateful that this is something I understood early in my life. But that does not exempt me from an interest in the fate of academia.
All of this is preface to a conflict that is unfolding in Canada that may call the fate of the academy into question. Elements at the University of Toronto have found themselves in conflict with a professor at the school, Jordan B. Peterson. Prior to this conflict I was not familiar with Peterson’s work, but I have been watching his lectures available on YouTube, and I have become an unabashed admirer of Professor Peterson. He has transcended the disciplinary silos of the contemporary university and brings together an integrated approach to the western intellectual tradition.
Both Professor Peterson and his most vociferous critics are products of the contemporary university. The best that the university system can produce now finds itself in open conflict with the worst that the university system can produce. Moreover, the institutional university — by which I mean those who control the institutions and who make its policy decisions — has chosen to side with the worst rather than with the best. Professor Peterson noted in a recent update of his situation that the University of Toronto could have chosen to defend his free speech rights, and could have taken this battle to the Supreme Court of Canada if necessary, but instead the university chose to back those who would silence him. Thus even if the University of Toronto relents in its attempts to rein in the freedom of expression of its staff, it has already revealed what side it is on.
There are others fighting the good fight from within the institutions that have, in effect, abandoned them and have turned against them. For example, Heterodox Academy seeks to raise awareness of the lack of viewpoint diversity in contemporary academia. Ranged against those defending the tradition of western scholarship are those who have set themselves up as revolutionaries engaged in the long march through the institutions, along with every department that takes particular pride in training activists rather than scholars, placing indoctrination before education and inquiry.
If freedom of inquiry is driven out of the universities, it will not survive in the rest of western society. When Justinian closed the philosophical schools of Athens in 529 AD (cf. Emperor Justinian’s Closure of the School of Athens) the western intellectual tradition was already on life support, and Justinian merely pulled the plug. It was almost a thousand years before the scientific spirit revived in western civilization. I would not want to see this happen again. And, make no mistake, it can happen again. Every effort to shout down, intimidate, and marginalize scholarship that is deemed to be dangerous, politically unacceptable, or offensive to some interest group, is a step in this direction.
To employ a contemporary idiom, I have no skin in the game when it comes to universities. It may be, then, that it is presumptuous for me to say anything. Mostly I have kept my silence, because it is not my fight. I am not of academia. I do not enjoy its benefits and opportunities, and I am not subject to its disruptions and disappointments. But I must be explicit in calling out the threat to freedom of inquiry. Mine is but a lone voice in the wilderness. I possess no wealth, fame, or influence that I can exercise on behalf of freedom of inquiry within academia. Nevertheless, I add my powerless voice to those who have already spoken out against the attempt to silence Professor Peterson.
. . . . .
. . . . .
. . . . .
. . . . .
. . . . .
27 October 2016
When I was an adolescent I was quite taken with what was known at the time as “survivalism.” With the little money that I had I bought a copy of Life After Doomsday by Bruce Clayton, I subscribed to Survive magazine (at the same time I was reading Soldier of Fortune magazine), and my favorite science fiction novels were those that dealt with the end of the world. There is an entire sub-genre of science fiction that dwells on the end of the world — some of it concerns itself with the actual process of societal collapse, some considers the short term consequences of societal collapse, and some considers the far future consequences. The most famous novel in this genre is also perhaps the most famous novel in science fiction — Walter Miller’s A Canticle for Leibowitz — which lingers over a post-apocalyptic future at three distinct six-hundred-year intervals. My interest in the end of the world also led to my studying civil defense and eventually nuclear strategy, which fascinated me. This autodidactic process eventually led me to high culture sources of declension narratives, and hence to an intellectual engagement that ceased to be related to survivalism.
My Cold War childhood provided ample scope for my secular apocalypticism, but in reading about survivalism it was not long before I discovered that, ideologically, the survivalist movement was far to the right, though with some exceptions. There was a bit of overlap between the counter-culture back-to-the-land movement, which was typically on the political left, and the survivalists, who were typically on the right. Both camps read the Foxfire books and imagined themselves returning to a simpler and more self-sufficient life — an obvious response to the alienation produced by industrialized society and exponential urban growth. The exponential urban growth that especially blossomed in Europe and North America following the Second World War, and which effectively led to the depopulation of the rural countryside, continues in our time (cf. The Rural-Urban Divide). One of the most significant global demographic trends has been, is, and will continue to be the movement of rural populations into increasingly large megacities. This process means that the communities of the rural countryside are dismantled, while new communities are created in urban contexts, but the transition is by no means smooth, and some weather the change better than others.
Several changes occurred at or around the middle of the twentieth century that severally contributed to the rise of declension narratives: the exponential growth in urbanism mentioned above, atomic weapons and the Cold War, the dissolution of extended families, the Pill, and so on. Before this time, narratives of the future were largely expansionist and optimistic. During the Golden Age of science fiction, uncomplicated heroes traveled from planet to planet in a quixotic quest to right wrongs and to rescue damsels in distress. Now this seems very innocent, if not naïve, and we now prefer anti-heroes to heroes, as we identify more with their tortured struggles than with the uncomplicated heroes and their happily-ever-after.
Thus while our cities are larger than ever before in the history of civilization, and they are growing larger by the day, civilization is more integrated around the planet than ever before, and becoming more tightly integrated all the time (even as politicians today flee from the label of “globalization” because they know it is, at the moment, politically radioactive), and civilization is more robust than ever, with higher levels of redundancy in essential infrastructure and services than ever before, as well as possessing long-term, large-scale disaster planning and preparation, we are more pessimistic than ever before about the prospects of this vigorous civilization. Perhaps this is simply because it is not the civilization we expected to have.
In the social atmosphere of Cold War tension and the omnipresent threat of nuclear annihilation, which could materialize at any moment out of the clear blue sky, those initially disaffected by the emerging character of modern urbanized life sought to opt out, and this process of opting out of the emerging social order was often given intellectual justification in terms of a Weltanschauung of decline, which I call declension historiography. Declensionism varies in scope, from mainstream media columnists bemoaning the declining stature of the US in a multipolar world, to disaster preparedness, to societal collapse, to awareness of global catastrophic risk and existential risk, to a metaphysical doctrine of universal, inevitable, and unavoidable decline (which is today often expressed in scientific terms by reference to the second law of thermodynamics). Doomsday preparedness, then, comes in all varieties, from those hoping to survive “the big one,” where “the big one” is a massive earthquake, hurricane, or even an ephemeral political revolution, to those gearing up for the collapse of civilization and living in a world where there is no more electricity, no hospitals, schools, governments, or indeed any social institutions at all beyond the individual survivalist and his intimate circle.
The prehistory of doomsday preppers also included those preparing for a variety of different environmental, social, and political ills. Hippies founded communes and used their agricultural skills to grow better dope. Several apocalyptic churches have predicted the end of the world, and some explicitly urged members to build fallout shelters in order to survive nuclear war (such as the Church Universal and Triumphant). The communitarians on the right have often chosen to opt out of mainstream society under the umbrella of one of these apocalyptic churches, while the rugged individualists on the right became survivalists and they prepared to meet an apocalyptic future on their own terms, but, again, often justified in terms of a much larger conception of history. This declension narrative has become pervasive in contemporary society. While the end of the Cold War has meant the decline in the risk of nuclear war, the political left now favors scenarios of environmental collapse, while the political right favors scenarios of institutional collapse due to bank failure, currency collapse, the welfare state, or the decline of traditional social institutions (such as the church and the family).
The terms “survivalist” and “survivalism” are not used as widely today, but the same phenomenon is now known in terms of “preppers,” short for “doomsday preppers,” which indicates those who actively plan and prepare for apocalyptic scenarios. The political division and overlap is still evident. The left, focusing on environmental collapse, continues to look toward the “small is beautiful” ideal of the early environmental movement, inherited from the Club of Rome’s Limits to Growth study; they focus on community and sustainable organic farming and tend not to stress the necessarily violent social transition that would occur if the most shrill predictions of “peak oil” came to pass, and industrialized civilization ground to a halt (this sort of scenario approaches mainstream respectability in some popular books such as $20 Per Gallon: How the Inevitable Rise in the Price of Gasoline Will Change Our Lives for the Better, which I discussed in Are Happy Days Here Again?). These left-of-center declensionists are rarely called “preppers,” but their activities overlap with those usually called preppers.
The right, in contrast, does focus on the presumptively violent transitional period of social collapse, fetishizing armed resistance to marauding hordes, who will stream by the millions from overcrowded cities when the electricity stops and trucks stop bringing in food. While details are usually absent, the generic social collapse scenario has come to be called “SHTF,” which is an acronym for “shit hits the fan,” as in, “when the shit hits the fan, if you aren’t prepared, things are going to go badly for you.” Right-of-center declensionists, like the left, have an overarching vision of the collapse of civilization (as strange as that may sound), but drawing on different ideas and different causes than the left.
What are these declensionist ideas and the presumed causes of declension? Where does vernacular declensionism get its ideas? Why is declensionism so prevalent today? I have touched upon this issue previously, especially in Fear of the Future, where I made an argument specific to the nature of industrialized society and the reaction against it:
“…apocalyptic visions graphically illustrate the overthrow of the industrial city and the order over which it presided… While such images are threatening, they are also liberating. The end of the industrial city and of industrial civilization means the end of wage slavery, the end of the clocks and calendars that control our lives, and the end of lives so radically ordered and densely scheduled that they have ceased to resemble life and appear more like the pathetic delusions of the insane.”
This explains the motivation for entertaining declensionist ideas, but it does not explain the sources of these ideas. But in the same post I also cited a number of science fiction films that have prominently depicted apocalyptic visions. It is difficult to name a science fiction film that is not dystopian and apocalyptic, and these films have had a great impact on popular culture. Even those unsympathetic to the prepper mindset effortlessly recognize the familiar tropes of societal collapse portrayed in film. Presumably the writers of these films derive their declensionist ideas from a mixture of vernacular, social media, mass media, and high culture declensionism, as these ideas have percolated through society.
The mass media rarely recognizes preppers (although I see that there is a television program, Doomsday Preppers), and when it does, it does so in a spirit of condescension. The greatest friends of civilization today are those who never think about it and take for granted all of the comforts and advantages of civilization. For most of them, the end of civilization is simply unimaginable, and it is this perspective that is operative when the occasional article on preppers appears in the mass media, where it is presented with a mixture of bemused pity and incredulity. The target audience for these stories is precisely the people that preppers believe will not last very long when the shit hits the fan. I could easily write a separate blog post (or an entire book) about the relationship of the mainstream mass media to declension scenarios, but this is a distinct topic from that of vernacular declensionism. There is some overlap between mass media and social media, as every mainstream media outlet also has a social media presence, and the occasional social media post will “go viral” and be picked up by the mainstream media. In this way, some survivalist ideas find a wider audience than the core audience, already familiar with the message, and this can draw in the curious, who may eventually become converts to the message. Other than this, the contribution by mass media to declension historiography is very limited (except for supplying a steady stream of inflammatory news articles that are pointed out as sure signs that the end is near).
Social media is vast and amorphous, but is given shape by each and every one of us as we pick and choose the social media we consume. This filtering effect means that like-minded individuals share a common ideological space in social media, and they overlap very little with those of divergent ideologies. The prepper community is well represented in social media, which has taken over from the small private presses that formerly distributed survivalist literature to the small survivalist community. The social media presence of preppers is all over the map, with an array of diverse social collapse scenarios, but, like survivalists of the 70s and 80s, still primarily on the political right, and often inspired by Biblical visions of apocalypse. In 72 Items That Will Disappear First When The SHTF, preppers are urged to buy boxes of Bibles: “Bibles will be in demand and can be used to barter items. A box of 100 small Bibles cost about $20.” Perhaps the writer of this article has watched The Book of Eli too many times and imagines that the Bible may be hard to come by in post-apocalyptic America. It would be extraordinarily difficult for the Bible to become a rarity — as difficult as it would be for human beings to go extinct. Both are too widely distributed to be eradicated by anything short of terrestrial sterilization. If you want trade goods, you would be much better off stocking up on books that will be rare than books that will be common, but this doesn’t stoke the prepper narrative, so the logic of commerce gives way to the ideology of social cohesion through embattled belief.
High culture declensionism, as to be found, for example, in Oswald Spengler’s classic The Decline of the West (Der Untergang des Abendlandes), is scholarly, if not pedantic, and is essentially an exercise in the philosophy of history. (Interestingly, the most famous representatives of the Beat Generation, who foreshadowed the hippies’ back-to-the-land rejection of industrialized society, were avid readers of Spengler; cf. Sharin N. Elkholy, The Philosophy of the Beats, University Press of Kentucky, 2012, p. 208.) Spengler employs the old standby of a cyclical conception of history, and despite the intellectual and cultural distance we have come since cyclical history was the norm, vernacular cyclical history continues to be an influence. Vernacular cyclical history can appeal to intuitions about the life cycle of all things, and it is easy to conceive of civilization as participating in this coming to be and passing away of everything sublunary.
Saint Augustine, the father of the philosophy of history, may be cited as another high culture representative of declensionism, living as he did as the Roman world was unraveling. The sack of Rome by the Visigoths in 410 AD was the occasion of Saint Augustine writing his magnum opus, The City of God (De Civitate Dei). Rome had been a city untouched by any invading army for more than eight hundred years, and had functioned as the capital of the known world, and yet it had been laid low by unsophisticated barbarians. How was this to be explained? This is the task Augustine set himself, and Augustine had an answer. The ruination of the City of Man was, for Augustine, a mere detail of history, of no great importance, as long as the City of God was thriving, as he believed it to be. Indeed, the City of God would go on to thrive for more than a thousand years after Augustine as western Europe attempted to make itself over as the Earthly image of the City of God.
Augustine represents a sharp break with cyclical history. Throughout the City of God Augustine is explicit in his rejection of cyclical history, arguing against it both as a theory of history as well as on account of its heterodox consequences. Thus while we can construe Augustine as a representative of declension history, it is a linear declension history. Augustine’s vision of linear declension history was remarkably influential during the European middle ages, when the few educated members of society did not perceive any break in history from classical antiquity to medievalism. For them, they were still Romans, but degraded Romans, very late in the history of Rome. The miserable condition of life of the middle ages was put down to its having come at the tail end of history, waiting for the world to well and truly end.
Vernacular declension, with its intuitive retention of cyclical history, resides awkwardly side-by-side with the Whig historiography and progressivism (ultimately derived from Augustine’s linear conception of history) that is so common in the modern world — the idea that we are modern, and therefore different from the people of the past and their world, is axiomatic and unquestioned. Human periodization of time is as natural as the categories of folk biology — our modernism, then, is, in part, a function of folk historiography (on folk concepts cf. Folk Concepts and Scientific Progress and Folk Concepts of Scientific Civilization). What are the categories of folk historiography, what kind of historical understanding of the world is characteristic of folk historiography? This will have to be an inquiry for another time.
I will conclude only with the observation that vernacular declensionism might paradoxically be employed in the service of civilization, if an interest in responses to existential threats to societal stability could be used as a stepping stone to the study of and preparation for global catastrophic risks and existential risks. That is a big “if.” When I think back to my own frame of mind when I was an enthusiast of survivalism, I thought that civilization had little or nothing of interest to me, and that all the adventure that might be possible in the world would follow from the “struggle for subsistence” that Keynes took to be the “economic problem” of humanity, and which contemporary civilization has largely solved. I still have sympathy for those who find little to value in civilization, as I can remember that stage in my own development quite clearly. In a sense, I only became reconciled to civilization; I never belonged to those who never question civilization, and who can’t imagine its extirpation. Civilization was, for me, always open to question.
. . . . .
. . . . .
. . . . .
. . . . .
. . . . .
22 September 2014
Science has a problematic relationship to mythology, so that to speak in terms of the mythological function of science is to court misunderstanding, but this idea is so important that I am going to take the risk of being profoundly misunderstood in order to try to explicate the mythological function of science, both in its descriptive and normative aspects. One of the problems that science has with mythology is that a great many if not most prominent institutional representatives of science explicitly reject mythology, or, if they do not explicitly reject mythology, they invoke a quasi-NOMA doctrine in order to demonstrate their respect for and tolerance of traditional mythologies as long as these mythologies do not interfere in the practice of science.
Science today, however, cannot be neatly contained within any category or limited to any one aspect of life. Science is the driving force behind our industrial-technological civilization, and as such it penetrates into every aspect of life whether or not we recognize this penetration, and whether or not science is even wanted in every aspect of life. Science has become as comprehensive as the global civilization with which it is integral, and so we find ourselves, both as individuals and as part of a society, facing a comprehensive institution that shapes almost every aspect of life. We have a relationship to this institution whether we like it or not, and in some cases this relationship approximates a mythological relationship, although (as I argued in The Next Axial Age) we have not yet seen the axialization of industrial-technological civilization.
On my other blog I have recently written a series of posts on religious experience, across the broad expanse of civilization from the transition from our hunter-gatherer origins to the forms that religious experience may take in the future. These posts are as follows:
These posts were narrowly focused on religious experience, and not on other aspects of religious ideas and practices. However, I took as my guide Joseph Campbell’s delineation of the functions of mythology.
Campbell makes a fourfold distinction in the functions of mythology, including the mystical function, the cosmological function, the social function, and the psychological function, as follows:
● The Mystical Function is concerned with reconciling consciousness with the pre-conditions of its existence.
● The Cosmological Function is a unified and comprehensive conception of the cosmos consistent with the mystical function of mythology (above) and the social function of mythology (below).
● The Social Function is a conception of the social order that establishes a model and a form for social institutions, as well as a conception of the relation of the individual to the social order, and, through the social order, to the cosmos at large.
● The Psychological Function, which I would prefer to call the “personal function,” is the function of a myth to guide an individual through the stages of life and to act as a support and as comfort in the individual’s hour of need.
This is a somewhat schematic approach to how a mythological world-view functions in a social context, and not the only possible way to analyze religion. Recently I was skimming some of the work of Ninian Smart, who distinguished seven “dimensions” of religion: 1. Doctrinal, 2. Mythological, 3. Ethical, 4. Ritual, 5. Experiential, 6. Institutional, and 7. Material. Smart further decomposed these seven dimensions into the para-historical (1-3), which must be studied by “dialogue and participation,” and the historical (4-7), which can be studied empirically, like any branch of science. It would be an interesting intellectual undertaking to do a detailed comparison among taxonomies of religious study. Campbell’s master category of mythology is, in Smart, reduced to one among seven dimensions of religion, so some care would be required to sort through respective definitions.
For the moment, however, acknowledging that there are other theoretical frameworks for studying religion, I am going to remain within Joseph Campbell’s structure of the functions of mythology in taking up the central role of science in our civilization. Campbell’s four functions of mythology provide an agenda to approach how science functions in the society of industrial-technological civilization, which can in turn be compared to past instances of mythologies that have served the central role in earlier civilizations equivalent to the role of science in contemporary civilization.
Science, as we all know, has been a source of the dissolution of the cosmological function of traditional mythology. Wherever traditional mythology supplied a myth of origins explaining the structure of the world, this myth has been rudely confronted with the scientific account of the structure of the world. Where the mythological account could gracefully be accepted as a metaphor, this was not a problem, but when great value has been attached to literal interpretations, then it is a problem. Eventually, and slowly, science has supplanted any and all mythological accounts of the nature of the world. Science, then, is uniquely suited to serving the cosmological function of mythology, and does so today even if it is not understood to be a mythological account of the origins and the structure of the world.
In regard to the social function of mythology, I find the position of contemporary science to be very hopeful at the same time that it is very distressing. On the hopeful side, we have sciences of society that are becoming more sophisticated all the time. With an adequate social science, human beings are in a position, for the first time in the whole of human history, to say what kinds of cities function well and what kinds function poorly; what kinds of intervention work well, and what kinds fail; what kinds of societies are likely to provide health, wealth, and happiness, and what kinds of societies consistently fail to do so. On the distressing side, every utopian program derived from the most advanced social thought of every era of human history has been a disastrous failure that not only fails to provide for health, wealth, and happiness, but which more often than not is transformed in practice into a dystopian nightmare. Thus the ability of a social science to design and maintain even a mediocre society is in question, and we cannot yet count science as ready to fulfill the social function of mythology, even if we are optimistic about the hopeful progress of social science.
The psychological or personal function of mythology is, in some senses, the whole of the problem in miniature. If science can provide an adequate account of the individual, many of the other functions of mythology will fall into place; if science cannot provide an adequate account of the individual, nothing else will work. The best science of the human individual is to be found today in evolutionary psychology. While evolutionary psychology remains controversial, the growing body of work on evolutionary psychology is giving us insights into human nature as derived from our biology and our evolutionary history. We should distinguish two kinds of criticism of evolutionary psychology: the political rejections (it is hated by both left and right, in the same way that both the political left and the political right ultimately cannot countenance natural history) and the criticisms that rest on an is/ought conflation. The politicized rejection of evolutionary psychology is uninteresting, so I will ignore it, and discuss only the is/ought conflation in the criticism of evolutionary psychology.
Evolutionary psychology is a descriptive science with no normative content, but, sadly and inevitably, no matter how carefully one points out that evolutionary psychology only studies human history, and how we got to be the way that we are, and has nothing whatsoever to say about what we ought to do, nor does it contain any prescriptions, many are unconvinced and are profoundly disturbed by the unflattering evolutionary origins of behaviors that we think of as being typically human. The confusion over the word “natural” in contemporary popular culture embodies a similar problem, except that “natural” is not a scientific term. People use “natural” in ordinary language to describe the world apart from the intervention of human civilization, but they also use the word to express certain values, especially connecting nature with conservation values and environmental concerns. It is extremely difficult to talk about nature without others jumping to the conclusion that one is also going to advocate for a range of issues related to environmentalism. While advocacy may grow out of the growth of scientific knowledge (as was explicitly the case with Lori Marino), and scientists often grow to love their object of study no less than their personal contributions in terms of a theory of their object of study, there is no necessary connection between scientific knowledge and advocacy. It has been considered highly counter-intuitive that, for example, Michel Foucault has been called an “anti-humanist human scientist,” as it is simply assumed that if you study humanity by way of the human sciences, you will also be an advocate of humanity. Similarly with evolutionary psychology, it is often assumed that one is an advocate for behaviors conditioned by evolution, rather than merely explaining the evolutionary mechanism that brought them about.
If we can get past these simple-minded conflations, evolutionary psychology can teach us a great deal about ourselves and our relations with others while in no sense arguing that we are obligated to blindly follow those instincts engendered in us by our evolutionary development. It is a familiar theme that human instinctual life must be repressed in the context of civilized life; this was, of course, the theme of Freud’s Civilization and its Discontents. Another way to formulate this would be to observe that civilized life is incompatible with the instinctual life, so that evolutionary psychology would seem to provide no guide whatsoever to life in our industrial-technological civilization. But this is a deceptive claim to make. To understand the discontent of man in civilization, and especially the widespread anomie of alienated individuals, it is necessary to understand exactly the conflict between instinctual behavior and the behavior demanded by civilized society. Individuals who have studied evolutionary psychology have gained a unique measure of insight into these instinctual conflicts, and I think it is entirely reasonable to assert that such knowledge would likely be a help in guiding the individual through the stages of life experienced in civilized society — especially if evolutionary psychology were supplemented by an evolutionary account of the development of civilization — so that science could be said to be within reach of a robust ability to serve the psychological function of mythology.
This leaves us with the mystical or metaphysical function of mythology, and this will be the toughest task for science, because the science that has propelled industrial-technological civilization relentlessly forward has been a positivistically-conceived science that distances itself both from the mystical and the metaphysical, almost to the point of a cultivated ignorance of the tradition — what I have elsewhere called Fashionable Anti-Philosophy.
I see two possible sources for a mystical function that science could serve: 1. the eventual reconciliation of science with philosophy that allows science to draw from the resources of philosophy to produce a metaphysical conception consistent with modern science, or 2. a scientific theory of consciousness that is neither eliminativist nor reductionist, but which gives a definitive account of consciousness that individuals without scientific training will feel is adequate to the explanation of their experience of the world. While many scientists are working on consciousness, and several scientifically-minded philosophers have claimed to “explain” consciousness, we cannot regard any of these efforts or explanations as yet being adequate to the task that would be required of a scientific approach to the mystical function of mythology.
A definitive scientific account of consciousness coupled with an account of evolutionary psychology, including evolutionary social psychology, would give a thorough descriptive account that could serve the mystical, social, and psychological functions of mythology as well as science now serves the cosmological function of mythology. The same is/ought distinction, however, that prevents us from regarding a descriptive account of evolutionary psychology as a prescriptive account of how individuals and societies ought to conduct themselves, constitutes a limitation on the ability of science to function as a mythology, though even here science is not powerless. Sam Harris has recently written a book and given many lectures on the possibility of a scientific approach to morality, and while I disagree with his account, it demonstrates that scientific thought still has many resources that it can bring to the table. Here is where philosophy becomes indispensable. The kind of rapprochement between science and philosophy mentioned above as a possible source for a scientific metaphysics that could serve the mystical function of mythology is perhaps even more crucial in overcoming the limitations of science in being prescriptive without violating the is/ought dichotomy.
. . . . .
. . . . .
. . . . .
14 September 2014
From time to time we need to be reminded (that is to say, we need to remind both ourselves and others) what it means to live in a free and open society, as the discipline of liberty is a stern one, and it is easy to go slack and to find oneself becoming tolerant of all kinds of compromises to one’s freedom, not to mention the freedom of others, which is relatively easy to sacrifice. Every day a thousand details compete for our attention, and these practical exigencies of life are often sufficient to distract us from our true interests in the long term, and in the big picture.
History does not stand still. Those in possession of the apparatus of state power are always seeking new ways to get the public to go along with the fashionable governmental programs of the moment, while citizens are always seeking ways around the controls that government attempts to impose upon them. It is a cat-and-mouse game — à bon chat, bon rat. Descartes, who lived during the period of the consolidation of the nation-state (and who fought as a soldier in the Thirty Years’ War, the settlement of which was part of this process) adopted a motto from Ovid, bene qui latuit, bene vixit: He who hid well, lived well. This is a prudent maxim for any who are subject to state power.
The continual flux of one’s individual perspective, and the continual movement of history, together tend to obscure rather than to clarify where our true interests lie, and so we would do well to recur to classic formulations of liberal democracy (in the sense in which Fukuyama uses that term), and there is no more classic formulation of individual liberty in liberal democracy than is to be found in John Stuart Mill:
“…the sole end for which mankind are warranted, individually or collectively, in interfering with the liberty of action of any of their number, is self-protection… the only purpose for which power can be rightfully exercised over any member of a civilized community, against his will, is to prevent harm to others. His own good, either physical or moral, is not a sufficient warrant. He cannot rightfully be compelled to do or forbear because it will be better for him to do so, because it will make him happier, because, in the opinions of others, to do so would be wise, or even right. These are good reasons for remonstrating with him, or reasoning with him, or persuading him, or entreating him, but not for compelling him, or visiting him with any evil in case he do otherwise. To justify that, the conduct from which it is desired to deter him, must be calculated to produce evil to some one else. The only part of the conduct of any one, for which he is amenable to society, is that which concerns others. In the part which merely concerns himself, his independence is, of right, absolute. Over himself, over his own body and mind, the individual is sovereign.”
John Stuart Mill, On Liberty, Chapter I, “Introductory”
Mill’s uncompromising assertion of individual sovereignty is one of the high points of specifically western civilization, with its emphasis upon the individual and individualism. Uncompromising though it may be, it is not, however, absolute: the prevention of harm to others is not defined, and is therefore subject to interpretation. An individual or a group that is bound and determined to exercise control over some other individual or group will twist their interpretation of the world until they have proved to their own satisfaction that the actions of the other individual or group are invidious to the public good, and therefore, under classic principles of political liberalism, they are justified in bringing coercion to bear in forcing the individual or group to conform to social expectations.
Mill’s assertion of individual sovereignty is also, in a sense, unexpected. In this passage, what matters for Mill is how state power is exercised, and not merely the outcomes it produces. This way of thinking about Mill’s conception of liberty is really quite remarkable in view of the fact that Mill is probably also the most famous utilitarian, and therefore as a utilitarian is committed to a teleological (or consequentialist) ethic. But this passage is much more in the spirit of deontology than teleology. In my last post, Teleological and Deontological Conceptions of Civilization, I sought to show that teleological and deontological systems of ethical thought that have been applied to the individual can also be applied to social wholes, and here Mill, among the greatest of the representatives of utilitarian teleology, presents a case for a thoroughly deontological conception of the state and its power (i.e., mankind taken collectively).
What are we to make of individual sovereignty in an age of choice architecture? I can imagine the advocates of choice architecture making the argument that “nudging” rather than forcing citizens to adopt preferred behaviors in order to arrive at preferred outcomes is ultimately to recognize the sovereignty of the individual, and not to infringe upon that sovereignty any more than is necessary. But what is this necessity? What is the necessity of state power in industrial-technological civilization? State power in our time is primarily technical, so that its necessity is also understood as a technical requirement.
I have noted in several recent posts (Religious Experience in Industrial-Technological Civilization among them) that in industrial-technological civilization the organizing principle is technical; it is procedural rationality in its many forms that is the basis of social organization. (The term “procedural rationality” originates in the work of economist Herbert A. Simon — also known for his work on bounded rationality — though I am using the term in a wider signification than that employed by Simon, intended to include all decision making undertaken in complex contexts employing available empirical evidence in a theoretical framework that recognizes bounded rationality.)
Rationality is more constrained for some than for others; the technocrat of procedural rationality imagines that those in possession of state power have more and better information available to them than the subjects of state power, who suffer from a more tightly bounded rationality than their leaders. Therefore those with less bounded rationality and possessing greater horizons have a political responsibility to transfer their greater knowledge to the population at large through the power of the state. Choice architecture seems to be the least coercive way of doing so. (Choice architecture is not limited to state power: one could argue that the private enterprises that make it very easy to sign up to receive a good or service, but make it almost impossible to stop the delivery of said good or service, are practicing a kind of choice architecture, but these unsavory business practices are occasionally reviewed by the courts, and when found to be sufficiently coercive the courts may provide legal redress to aggrieved customers.)
The publication of Nudge: Improving Decisions About Health, Wealth, and Happiness in 2008 by Richard Thaler and Cass Sunstein was an event of some significance in Anglophone political circles, as it was immediately seized upon by policymakers as a legitimation and justification of their “expertise” in social organization — precisely the expertise in procedural rationality that is central to industrial-technological civilization. This was unintended intellectual flattery of the first order. The great unwashed require experts to shape the finer aspects of their lives, rough-hew them though the ignorant masses may. Delivered from their miserable choices to preferred outcomes to which they are nudged, the people should be grateful to their leaders for their enlightened intervention.
In the context of social organization through procedural rationality, the inevitable rise of expertise in technical matters comes to dominate society at large. The process begins with the mere details of how life is organized, but the nature of state power is to grow without bounds (see below on the slippery slope, here applied to state power), and as procedural rationality steadily expands its scope, the state approximates what Erving Goffman called a total institution:
“The common characteristics of total institutions derive from the coercion of inmates to conform to an internal regime. They are stripped of their former identities and obliged to accept an alternative selfhood, designed to fit the expectations of staff. This transformation is effected by procedures and practices including the breakdown of the divisions separating work, sleep and play. All activities are tightly scheduled and geared to serve institutionally set tasks. These can be carried out only by obeying rules and regulations that are sanctioned by privileges and punishments administered by staff whose authority is sustained through the maintenance of a distance from inmates.”
The Social Science Encyclopedia, Third edition, Edited by Adam Kuper and Jessica Kuper, VOLUME II L-Z, LONDON AND NEW YORK: Routledge, 2004, “Total Institutions,” pp. 1031-1032
The state as a total institution could be employed as a definition of totalitarianism. “Nudge” politics is a very long way from being totalitarianism, or even the thin edge of the wedge of totalitarianism, but there are dangers nevertheless of which we should be aware.
At what point does choice architecture become coercive? How narrow may an individual’s options be made before we are willing to acknowledge that that individual’s life has been compromised by the institutions with the power to shape the choices available to the individual? How much can the life of the individual be compromised before we recognize this as a form of coercion? If coercion is held below the threshold of violence, is it more morally acceptable than coercion that is openly violent?
Ultimately, state power is about violence; it is not always or inevitably manifested as violence, but as violence is the ultimate guarantor of state power, any politicized question is ultimately about violence. Everyone is familiar with Max Weber’s definition of sovereignty: “The state is the human community that, within a defined territory — and the key word here is ‘territory’ — (successfully) claims the monopoly of legitimate force for itself.” While such a state does not always employ force, it can employ force if necessary, and here the only necessity is political necessity, as defined by the sovereign state. As noted above, today this is a technical necessity governed by technical requirements, and in so far as the human condition is made rigorous, technical necessity leaves no aspect of life untouched.
A common but commonly unstated theme in such discussions is the doctrine of tacit consent. Everyone today, in virtue of being born on some particular scrap of geography, is the subject of some territorially-defined nation-state that seeks to enforce the territorial principle in law. Thus every human being alive today has been judged to have given their tacit consent to the state power of some nation-state or other. What is the basis for this claim? Along with John Stuart Mill, one of the godfathers of political liberalism is John Locke, whose Second Treatise of Government was an important influence on the American founding fathers, but Locke was willing to assert a sweeping doctrine of tacit consent that I find problematic at best, an invitation to inter-generational tyranny at worst:
“Nobody doubts but an express consent of any man entering into any society makes him a perfect member of that society, a subject of that government. The difficulty is, what ought to be looked upon as a tacit consent, and how far it binds — i. e., how far any one shall be looked upon to have consented and thereby submitted to any government, where he has made no expressions of it at all. And to this I say that every man that has any possessions or enjoyment of any part of the dominions of any government does thereby give his tacit consent and is as far forth obliged to obedience to the laws of that government, during such enjoyment, as anyone under it; whether this his possession be of land to him and his heirs for ever, or a lodging only for a week, or whether it be barely traveling freely on the highway; and, in effect, it reaches as far as the very being of anyone within the territories of that government.”
John Locke, Second Treatise of Government, section 119
Here Locke appears unambiguously as a theorist of the territorial principle in law (also assumed by Weber in the quote above), as no one who came from interpenetrating ethnic communities, each ruled by their own law (i.e., the personal principle in law), would ever assert that lodging for a week in some territory subjects the individual to the law of the government that claims sovereignty over that territory. In this way, we can see Locke as one of many political philosophers who contributed to the formulation of the theory of the nation-state at a time when the nation-state remained yet inchoate.
The slippery slope of political obligation in the context of tacit consent would imply that every citizen of a nation-state that engages in genocidal persecution and warfare is at least an accessory, if not a willing and active participant, in such moral outrages (cf. Genocide and the Nation-State). Throughout the twentieth century, this was in fact the conclusion that was drawn, in practice if not in theory. Thus the destruction of a wartime enemy’s population was justified because that population facilitated the prosecution of the war, even if their employment had not changed since the war in question began (i.e., even if they were not employed in war industries). They have, after all, given their tacit consent to the nation-state in which they reside. This kind of political reasoning brought humanity face-to-face with annihilation in the twentieth century, and we can be glad that, whatever the horrendous depredations of that century, it did not ultimately follow through to the bitter end the political logic of its time.
For a logician, a slippery slope is a fallacy, and the logician is right: there is no logical way to derive a transition from the thin edge of the wedge to the thick edge — but if there is a sledgehammer pounding down on the wedge, the likelihood of the thin edge leading to the thick edge is quite high. Life is not logical. Psychologically a slippery slope is very real, very treacherous, and every consummate manipulator (if you live long enough, you will meet many of them) knows how to exploit human frailty with a slippery slope. (This is why we say, “in for a penny, in for a pound.”) Indeed, any logical explication of the slippery slope fallacy ought to be presented with an explication of the cognitive biases of availability cascade, bandwagon effect, illusory correlation, and irrational escalation. We could, in fact, name a new cognitive bias — say, the slippery slope effect — which is the propensity of individuals to allow themselves to be led down a slippery slope despite this slope being a logical fallacy.
While logicians recognize the appeal to a slippery slope as a fallacy, the logicians have no answer to the paradoxes of the heap, also called sorites paradoxes, which consider the problems inherent in vagueness. Well, it would not exactly be right to say that logicians have “no answer” to sorites paradoxes, only that there are many logical theories for dealing with sorites paradoxes, but none of these theories are universally accepted, and the paradox appears so frequently in human experience that its paradoxicality cannot be wished away. Where exactly the transition from choice architecture to coercion occurs admits of no easy answers.
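The logical structure at issue here can be made explicit. As a minimal formal sketch (illustrative only, using a hypothetical predicate Heap(n) for “n grains make a heap”), a sorites argument runs:

```latex
% A minimal formal sketch of a sorites argument (illustrative only):
% a true base case plus a seemingly harmless tolerance principle
% together yield an absurd conclusion.
\begin{align*}
&\text{(1)}\quad \mathrm{Heap}(100{,}000)
  && \text{base case: 100,000 grains make a heap} \\
&\text{(2)}\quad \forall n\,\bigl(\mathrm{Heap}(n) \rightarrow \mathrm{Heap}(n-1)\bigr)
  && \text{tolerance: removing one grain never matters} \\
&\text{(3)}\quad \therefore\ \mathrm{Heap}(1)
  && \text{by repeated application of modus ponens}
\end{align*}
```

Each single application of (2) is unobjectionable, and no individual step can be identified as the point at which the heap ceases to be a heap. This is exactly the structure of the practical slippery slope: no single “nudge” can be identified as the point at which choice architecture becomes coercion.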
. . . . .
. . . . .
. . . . .
17 July 2014
In George Orwell’s dystopian classic Nineteen Eighty-Four there occurs a well known passage that presents a frightening totalitarian vision of history:
“And if all others accepted the lie which the Party imposed — if all records told the same tale — then the lie passed into history and became truth. ‘Who controls the past,’ ran the Party slogan, ‘controls the future: who controls the present controls the past.’ And yet the past, though of its nature alterable, never had been altered. Whatever was true now was true from everlasting to everlasting. It was quite simple. All that was needed was an unending series of victories over your own memory. ‘Reality control’, they called it: in Newspeak, ‘doublethink’.”
George Orwell, Nineteen Eighty-Four, Part One, Chapter 3
What Orwell called, “…an unending series of victories over your own memory,” is something anticipated by Nietzsche, who, however, placed it in the context of pride rather than dissimulation:
“I have done that,” says my memory. “I cannot have done that,” says my pride, and remains inexorable. Eventually — memory yields.
Friedrich Nietzsche, Beyond Good and Evil: Prelude to a Philosophy of the Future, section 68
The phrase above identified as the “party slogan” — Who controls the past, controls the future: who controls the present controls the past — is often quoted out of context to give the misleading impression that this was asserted by Orwell as his own position. This is, rather, the Orwellian formulation of the Stalinist position. (Stalin reportedly hated both Nineteen Eighty-Four and Animal Farm.) The protagonist of Nineteen Eighty-Four, Winston Smith, is himself part of the totalitarian machinery, rewriting past newspaper articles so that they conform to current party doctrine, and re-touching photographs to erase individuals who had fallen out of favor — both of which Stalin presided over in fact.
The idea that the control over history entails control over the future, and the control over history is a function of control in the present, constitutes a political dimension to history. Winston Churchill (who is said to have enjoyed Nineteen Eighty-Four as much as Stalin loathed it) himself came close to this when he said that, “History will be kind to me for I intend to write it.” This political dimension to history is one of which Orwell and other authors have repeatedly made us aware. There is another political dimension to history that is more difficult to fully appreciate, because it requires much more knowledge of the past to understand.
More than mere knowledge of the past, which seems empirically unproblematic, it also requires an understanding of the theoretical context of historiography in order to fully appreciate the political dimension of history. The name of Leopold von Ranke is not well known outside historiography, but Ranke has had an enormous influence in historiography and this influence continues today even among those who have never heard his name. Here is the passage that made Ranke’s historiographical orientation — the idea of objective and neutral history that we all recognize today — the definitive expression of a tradition of historiographical thought:
“History has had assigned to it the office of judging the past and of instructing the present for the benefit of future ages. To such high offices the present work does not presume; it seeks only to show what actually happened.”
Leopold von Ranke, History of the Latin and Teutonic Nations
The deceptively simple phrase, what actually happened (in German: wie es eigentlich gewesen), became a slogan if not a rallying cry among historians. The whole of the growth of scientific historiography, to which I have referred in many recent posts — Scientific Historiography and the Future of Science and Addendum on Big History as the Science of Time among them — is entirely predicated upon the idea of showing what actually happened.
Sometimes, however, there is a dispute about what actually happened, and the historical record is incomplete or ambiguous, so that to get the whole story we must attempt to fill in the ellipses employing what R. G. Collingwood called the historical a priori imagination (cf. The A Priori Futurist Imagination). Historical extrapolation, placed in this Collingwoodian context, makes it clear that the differing ways in which the historical record is filled in and filled out is due to the use of different a priori principles of extrapolation.
I have noted that diachronic extrapolation is a particular problem in futurism, since it develops historical trends in isolation and thereby marginalizes the synchrony of events. So, too, diachronic extrapolation is a problem in historiography, as it fills in the ellipses of history by a straight-forward parsimonious extrapolation — as though one could unproblematically apply Ockham’s razor to history. (The symmetry of diachronic extrapolation in history and futurism nicely reveals how futurism is the history of the future and history the futurism of the past.) The political dimension of history is one of the synchronic forces that represents interaction among contemporaneous events, and this is the dimension of history that is lost when we lose sight of contemporaneous events.
There were always contemporaneous socio-political conflicts that defined the terms and the parameters of past debates; in many cases, we have lost sight of these past political conflicts, and we read the record of the debate on a level of abstraction and generality that it did not have as it occurred. In a sense, we read a sanitized version of history — not purposefully sanitized (although this is sometimes the case), not sanitized for propagandistic effect, but sanitized only due to our limited knowledge, our ignorance, our forgetfulness (at times, a Nietzschean forgetfulness).
Many historical conflicts that come down to us, while formulated in the most abstract and formal terms, were at the time political “hot button” issues. We remember the principles today, and sometimes we continue to debate them, but the local (if not provincial) political pressures that created these conflicts have often all but disappeared, and considerable effort is required to return to these debates and to recover the motivating forces. I have noted in many posts that particular civilizations are associated with particular problem sets, and following the dissolution of a particular civilization, the problems, too, are not resolved but simply become irrelevant — as, for example, the Investiture Controversy, which was important to agrarian-ecclesiastical civilization, but which has no parallel in industrial-technological civilization.
Some of these debates (like that of the Investiture Controversy) are fairly well known, and extensive scholarly research has gone into elucidating the political conflicts of the time that contributed to these debates. However, the fact that many of these past ideas — defunct ideas — are no longer relevant to the civilization in which we live makes it difficult to fully appreciate them as visceral motives in the conduct of public policy.
Among the most well-known examples of politicized historiography is what came to be called the Black Legend, which characterized the Spanish in the worst possible light. In fact, the Spanish were cruel and harsh masters, but that does not mean that every horrible thing said about them was true. But it is all too easy to believe the worst about people of whom one has a reason to believe the worst, and to embroider stories with imagined details that become darker and more menacing over time. During the period in which the Black Legend originated, Spain was a world empire with no parallel, enforcing its writ in the New World, across Europe, and even in Asia (notably in the Philippines, named for the Spanish monarch Philip II). As the superpower of its day, Spain was inevitably going to be the target of smears, which only intensified as Spain became the leading Catholic power in the religious wars that so devastated Europe in the early modern period. Catholics called Protestants heretics, and Protestants called the Pope the Antichrist; in this context, political demonization was literal.
There are many Black Legends in history, often the result of conscious and purposeful propagandistic effort. There are also, it should be noted, white legends, also the work of intentional propaganda. White legends whitewash a chequered history — exactly the task that Stalin set for Soviet civilization and which Winston Smith undertook for Oceania.
. . . . .
. . . . .
. . . . .
. . . . .
18 May 2014
The Culture War is not over. Instead of being waged over textbooks for elementary and high schools, it has moved into universities, and the inflection point has become the symbolic role of the commencement address. Even while the most prestigious institutions seek the most prestigious speakers for their commencement celebrations, interest groups on campuses across the US have been agitating and campaigning to block the appearance of some of these prestigious speakers, which action constitutes a kind of symbolic victory over imagined enemies — the greater the prestige of the speaker prevented from speaking at commencement, the greater the symbolic victory.
There have been two recent articles in the Wall Street Journal, “The Closing of the Collegiate Mind” by Ruth R. Wisse (Monday 12 May 2014) and “Bonfire of the Humanities” by Daniel Henninger (Thursday 15 May 2014), that have taken particular aim at the withdrawn invitations for three high-profile speakers: Christine Lagarde of the IMF was to speak at Smith College, but withdrew after 480 students signed a petition against her appearance; Ayaan Hirsi Ali was to speak at Brandeis University but was uninvited due to claims that she had made anti-Islamic statements; Robert J. Birgeneau, former chancellor of UC Berkeley, was forced to withdraw from speaking at Haverford College’s commencement due to accusations that he had condoned the use of force by Berkeley police to clear away Occupy protesters. (It’s times like these that I have no regrets about not going into academia.)
It is easy — all too easy — to spin this latest round of battles over commencement speakers as political correctness gone out of control at major universities. The readers of the Wall Street Journal are likely to lap up this narrative like a cat laps up cream. And rightly so. Colleges today have become places of “ideological conformity” (as Ruth R. Wisse puts it) and “tendentious gibberish” (as Daniel Henninger puts it) — all in the name of “tolerance.” As Pascal said, “Men never do evil so completely and cheerfully as when they do it from religious conviction.” And here the “religious conviction” is the surrogate religion of diversity.
This could have been the opportunity for some new and innovative political thinking — only, it hasn’t been. It has, rather, been a pretext for the same old, same old in politics. It never ceases to amaze me how quickly the response to campus radicalism deteriorates into all the familiar tropes of the trad-con axis (“trad-con” is the term used by some today to designate “traditional conservatism”). A case in point is another news story that has energized trad-con types and become a minor cause célèbre among the chattering classes, reported in Tal Fortgang not sorry for being white and privileged and Harvard’s so-called ‘white privilege’ class both by Anthony Zurcher, Editor, BBC Echo Chambers.
The young man who wrote this now widely-circulated essay in which he refuses to apologize for his “privilege” could have used the opportunity to question the left/right dialectic, but instead (and in spite of his youth) settles comfortably into a trad-con rut from which he is unlikely ever to extricate himself. Having become a hero of the moribund right in his early years, he needs to do little more in order to assure himself a bright future in telling people what they want to hear. And if you can tell people what they want to hear while making their political opponents outraged, well, so much the better.
The left/right dialectic belongs to the past, and those today who seek to keep the flame alive — whether they come from the left or the right — are beating a dead horse. In my post Ideas that will Shape the Future I wrote of the decline of left/right politics:
The political landscape as we know it today continues to be shaped by the left/right dialectic that emerged in the wake of the French Revolution, as some sought to continue the revolution, others to reverse it, and others yet to expand it. But the traditional governing coalitions based on left/right politics have been increasingly confronted with new political problems that cannot be easily analyzed along a left/right axis. As the most advanced industrialized nation-states converge on political gridlock, innovative solutions are increasingly likely to emerge from non-traditional political sources, marginalizing the left/right dichotomy and possibly giving life to new political movements that cannot be reduced to a left/right division. Moreover, structural changes within society such as increasing urbanization, globalization, technological unemployment, exponentialism (albeit selective), and bitter conflicts over the life sciences that divide people across previously established coalitions expose mass populations to new forces that shape these populations and their opinions in new ways.
The decline of left/right politics will not transform the nation-states of today into apolitical communities; political communities will continue to be political communities, but their politics will change over time, old coalitions and marriages of convenience will fall apart while new coalitions will emerge. All of this will take time. It will be a slow and very gradual political evolution. In fact, it will be such a slow transition that it will be plausible to deny for many decades — perhaps for a century or more — that anything has fundamentally changed in the political structure of society.
The “debate” today — if we can call it that — cannot move forward because it is mired in the past, it is conducted in the terms of the past, and it cannot do more than reassert the values, meanings, and purposes of the past. And when I here write of “the past” I am not only speaking of trad-cons nostalgic for an imagined past, but also of their political opponents, who are equally deluded about the world as it is today. Moreover, it is a debate that is contextualized in other debates occurring simultaneously, and so it cannot move forward unless and until progress is made in these other debates.
All of this is happening at a time when college tuition is significantly outpacing inflation, students are taking on increasingly large debt burdens to pay for their education, potential employers are skeptical of the qualifications of graduates, and many see online courses as the future of higher education. No one knows in detail how these issues will play out, but the transformation of the university from an educational institution into an economic institution, and the transformation of education into a commodity, is one of those larger social forces to which the university can only respond, and it cuts across ideological lines. Some trad-cons like the idea of education as an industry, since it corresponds to their own economic preoccupations, while other conservatives are among the staunchest supporters of the traditional ideal of a liberal education.
There is a larger and older debate going on as well. Higher education in the US has always been subject to an underlying social tension, which is the desire on the one hand to fulfill the traditional ideal of a liberal education, and on the other hand to provide practical skills that are applicable in the workplace. Both imperatives have their representatives inside and their advocates outside academic institutions. The result is an ongoing compromise that shifts as the underlying social tension shifts, sometimes tending toward the traditional mission of the university and sometimes tending toward the “hands on” and “good ol’ American know-how” school of thought. Because the shift in educational institutions always follows after the shift in social attitudes, the university is never fully in harmony with American society, and always seems to be struggling to make itself relevant in the particular way that society believes higher education should be relevant at any given moment in history.
. . . . .
. . . . .
. . . . .
3 April 2014
Among the many theoretical innovations for which Michel Foucault is remembered is the idea of biopower. We can think of biopower as a reformulation of perennial Foucauldian themes of the exercise of power through institutions that do not explicitly present themselves as being about power. That is to say, the subjugation of populations is brought about not through the traditional institutions of state power, but by way of new institutions purposefully constituted to monitor and administer the unruly bodies of the individuals who collectively constitute the body politic.
Foucault introduced the idea of biopower in The History of Sexuality, Vol. 1, in the chapter, “Right of Death and Power over Life.” Like his predecessor in France, Descartes, Foucault writes in long sentences and long paragraphs, so that it is difficult to quote him accurately without quoting him at great length. His original exposition of biopower needs to be read in full in its context to appreciate it, but I will try to pick out a few manageable quotes to give a sense of Foucault’s exposition.
Here is something like a definition of biopower from Foucault:
“…a power that exerts a positive influence on life, that endeavors to administer, optimize, and multiply it, subjecting it to precise controls and comprehensive regulations.”
Michel Foucault, The History of Sexuality, Vol. 1, translated from the French by Robert Hurley, New York: Pantheon, 1978, p. 137
Later Foucault names specific institutions and practices implicated in the emergence of biopower:
“During the classical period, there was a rapid development of various disciplines — universities, secondary schools, barracks, workshops; there was also the emergence, in the field of political practices and economic observation, of the problems of birthrate, longevity, public health, housing, and migration. Hence there was an explosion of numerous and diverse techniques for achieving the subjugation of bodies and the control of populations, marking the beginning of an era of ‘biopower’.”
Michel Foucault, The History of Sexuality, Vol. 1, translated from the French by Robert Hurley, New York: Pantheon, 1978, p. 140
Prior to the above quotes, Foucault begins his exposition of biopower with an examination of the transition from the traditional “power of life and death” held by sovereigns, which Foucault says was in fact restricted to the power of death, i.e., the right of a sovereign to deprive subjects of their life, to a fundamental change in emphasis so that the “power of life and death” became the power of life, i.e., biopower. The shift from right of death to power over life is what marks the emergence of biopower. Foucault, however, explicitly acknowledged that,
“…wars were never as bloody as they have been since the nineteenth century, and all things being equal, never before did regimes visit such holocausts on their own populations.”
Michel Foucault, The History of Sexuality, Vol. 1, translated from the French by Robert Hurley, New York: Pantheon, 1978, pp. 136-137
This thanatogenous phenomenon is what Edith Wyschogrod called “The Death Event” (which I wrote about in Existential Risk and the Death Event), but if Foucault is right, it is not the Death Event that defines the social milieu of industrial-technological civilization, but rather a “Life Event” that we must postulate parallel to the Death Event.
What is the Life Event parallel to the Death Event? This is nothing other than the loss of belief in an otherworldly reward after death (which defined social institutions from the Axial Age to the Death of God, and which may be the source of the relation between agriculture and the macabre), and the response to this lost possibility of eternal bliss by the quest for health and felicity in this world and in this life.
A key idea in Foucault’s exposition of biopower hinges upon how the contemporary power over life that has replaced the arbitrary right of death on the part of the sovereign has been seamlessly integrated into state institutions, so that state institutions are the mechanism by which biopower is applied, enforced, expanded, and preserved over time. From this perspective, biopower becomes the unifying theme of Foucault’s series of earlier books on asylums for the insane, prisons for the criminal, and clinics for the diseased, all of which institutions had the character of the “subjugation of bodies and the control of populations” through “precise controls and comprehensive regulations.” (At this point Foucault could have profited from the work of Erving Goffman, who identified a particular subset of “total institutions” that completely regulated the life of the individual.)
What we are seeing today is that the “success” of the imperative of biopower has resulted in longer and healthier lives among docile populations, who dutifully report to their mind-numbing labor of choice and rarely riot. To step outside the confines of acceptable social behavior is to find oneself committed to a total institution such as an asylum or a prison, so that the individual self-censors and self-restrains in order to preempt state action that would bring his behavior into conformity with the norm. With the imperative of biopower largely established and largely uncontested, the next frontier is the imperative of extending biopower to the mind, and rendering the population intellectually docile in the way that bodies have been regulated and rendered docile.
The extension of biopower to the life of the mind might be called psychopower. This extension presumably involves parallel regimes of psychic hygiene that will give the individual mind a longer, healthier life, as biopower has bequeathed a longer, healthier life to the body, but the healthy and hygienic mind is also a mind that has been subjected to precise controls and comprehensive regulation. Cognitive pathology here becomes a pretext for state intervention into the private consciousness of the individual.
The proliferating regimes of therapy, counseling, psychiatric services, so-called “social” services that today almost invariably have a psychiatric component, not to mention the bewildering range of psychotropic medications available to the public — and apparently prescribed as widely as they are known and available — are formulated with an eye to regimenting the intellectual life of the body politic. And this “eye” is none other than the medical gaze now trained upon the individual’s introspection.
The mechanism by which psychopower is obtained has, to date, been the same state institutions that have overseen biopower, but this is already changing. The emergence of biopower in the period of European history that Foucault called “The Classical Age” (“l’âge classique”) was a product of agricultural civilization (specifically, agrarian-ecclesiastical civilization) at its most mature and sophisticated stage of development, shortly before all that agrarian-ecclesiastical civilization had built in terms of social institutions would be swept away by the unprecedented social change resulting from the industrial revolution, which would eventually begin to converge upon a new civilizational paradigm, that of industrial-technological civilization.
Thus biopower at its inception was the ultimate regulation of a biocentric civilization. As civilization makes a transition from being biocentric to technocentric, new instrumentalities of power will be required to implement a regime of docility under radically changed socioeconomic conditions, i.e., technocentric socioeconomic conditions, and this will require technopower, which will take up where biopower leaves off. Biopower conceived after the manner of biocentric civilization, of which agrarian-ecclesiastical civilization is an expression, cannot answer to the regulatory needs of a technocentric civilization, which thus will require a regime of technopower.
Already this process has begun, though the transition from biocentric civilization is likely to be as slow and as gradual as the transition from hunter-gatherer nomadism to the discipline of settled civilization, in which the institutions of biopower first begin to assume their inchoate forms. What we are beginning to see is the transition from state power being embodied in and exercised through social institutions to state power being embodied in and exercised through technological infrastructure. Central to this development is the emergence of the universal surveillance state, in which the structures of power are identical to the structures of electronic surveillance.
The individual participates in social media for the presumptive opportunities for self-expression and self-development, which are believed to have many of the positive social effects that the regulation of docile bodies has had upon longevity and physical comfort. The structure of these networks, however, serves only to reinforce the distribution of power within society. The more alternatives we have for media, the more we hear only of celebrities (in what is coming to be called a “winner take all” economic model). At the same time that the masses are encouraged to occlude their identity through the iteration of celebrity culture that renders the individual invisible and powerless, the individual self is relentlessly marginalized. In Is the decontextualized photograph the privileged semiotic marker of our time? I argued that the proliferating “selfies” that populate social media, as a self-objectification of the self, are nothing but the “death of the self” prognosticated by post-modernists.
It is unlikely in the extreme that most or even many individuals have any kind of ideological commitment to the emerging universal surveillance state or to the death of the subject, but the technological institutions that are increasingly the mediators of all expression and commerce are becoming inescapable, and as they converge upon totality they will effect a reconstruction of society that will consolidate technopower in the hands of the systems administrators of the technocentric state. These structures are already being constituted, and the channeling of power through apparently benign networks will be the triumph of technopower as it replaces biopower.
. . . . .
. . . . .
. . . . .
19 October 2013
The question of existential risk is intentionally formulated as a very large conception that is concerned with risks to humanity on the largest scale — the possible extinction, stagnation, flawed realization, or ruination of Earth-originating intelligence. An existential threat (as the term is commonly employed, and in contradistinction to an existential risk) may be considered a relative existential risk, that is to say, an existential threat that constitutes a risk to concerns less comprehensive than the whole of humanity and humanity’s future. Individual human beings face existential threats, as do particular business enterprises, cities, nation-states, and social movements, inter alia. In short, any existing object that faces a threat to its continued existence may be said to face an existential threat.
When nation-states (or, before the advent of nation-states, their predecessor political institutions) that view each other as existential threats become engaged in a war, these wars typically escalate to become wars of extermination. A war of extermination is a particular species of the genus of warfare, uniquely characterized by systematic effort to not merely defeat the enemy, but to annihilate the enemy. Thus wars of extermination are also called wars of annihilation.
Another way to formulate the idea of a war of extermination is to think of it as a genocidal war. Genocides can be carried out in the context of war or in isolation (presumably, in the context of “peace,” but any peace that provides the context for genocide is not a peace worthy of the name). In a sense, then, the ideas of war and of genocide can be understood in isolation from each other — war without genocide, and genocide without war — though there is another sense in which genocide is a war against a particular people, i.e., a war of extermination.
It is worthwhile, I think, to distinguish between the Clausewitzean conception of absolute war or the more recent conception of total war and wars of extermination, although this distinction is not always made. Absolute or total wars refer to means, whereas war of extermination refers to ends. Means and ends cannot be cleanly separated in the unkempt reality of the world, and the means of total war is one way to bring about the aim of a war of extermination, but a war of extermination can also be pursued by less than total means.
In several posts I have written about what Daniel Goldhagen calls “human eliminationism,” of which he distinguishes five varieties:
● transformation: “the destruction of a group’s essential and defining political, social, or cultural identities, in order to neuter its members’ alleged noxious qualities.” (this is very similar to what I have called The Stalin Doctrine)
● oppression: “keeping the hated, deprecated, or feared people within territorial reach and reducing, with violent domination, their ability to inflict real or imagined harm upon others.”
● expulsion: “Expulsion, often called deportation… removes unwanted people more thoroughly, by driving them beyond a country’s borders, or from one region of a country to another, or compelling them en masse into camps.” (I wrote about this in The Threshold of Atrocity)
● prevention of reproduction: “those wishing to eliminate a group in whole or in part can seek to diminish its numbers by interrupting normal biological reproduction.”
● extermination: for Goldhagen, extermination seems to be equivalent to genocide simpliciter, in the narrow and strict sense: “killing often logically follows beliefs deeming others to be a great, even mortal threat. It promises not an interim, not a piecemeal, not only a probable, but a ‘final solution’.”
If we take Daniel Goldhagen’s distinctions within this scheme of human eliminationism, we see that many means can be employed to the ultimate aim of genocide. Indeed, what are sometimes called “military operations other than war” (MOOTW) may in some cases be sufficient to bring about some level of human eliminationism, and therefore prosecute (an undeclared) war of extermination.
The above considerations give us six categories of war that overlap and intersect to present an horrific exemplification of Wittgensteinian family resemblances:
● war simpliciter
● war of extermination
● war of annihilation
● genocidal war
● absolute war
● total war
A recent book on wars of annihilation and wars of extermination, War of Extermination: The German Military in World War II, employs several of these concepts of war without trying to make fine distinctions of the sort one would like to see in a comprehensive theory of war:
“The war of annihilation is a cultural phenomenon. It does not exist merely because war exists. A war of annihilation — that is to say, a war which is waged, in the worse case, in order to exterminate or merely to decimate a population, but likewise a war aimed at exterminating the enemy population capable of bearing arms, the opposing armies, and indeed also a battle of annihilation in which the aim is not merely to defeat or beat back the opposing army but to kill the enemy in the greatest possible numbers — all these forms of the war of annihilation, however widespread they may be in geographical space and historical time, are not historical inevitabilities.”
Heer and Naumann, editors, War of Extermination: The German Military in World War II, Berghahn Books, 2004, “The Concept of the War of Annihilation: Clausewitz, Ludendorff, Hitler,” Jan Philipp Reemtsma, p. 13
“Clausewitz had been wrong: the war of extermination was structured not only by grammatical rule, but also by a particular kind of logic. Whereas the grammar — as Schenckendorff and Kluge both correctly assumed — could be controlled, the logic behind the war of extermination — as Hitler knew full well — was utterly dominant and tolerated no half-measures. At the end of the second year of the war in the East this principle was nowhere so clearly in evidence as on the partisan front.”
Heer and Naumann, editors, War of Extermination: The German Military in World War II, Berghahn Books, 2004, “The Logic of the War of Extermination: The Wehrmacht and the Anti-Partisan War,” Hannes Heer, p. 117
It is widely acknowledged by scholars of the Second World War that the Nazi-Soviet war on the Eastern Front came to constitute a war of extermination. Casualties were heavier on the eastern front than on the western front. Perhaps most tellingly, when the German war machine began collapsing, German soldiers made an effort not to be captured by Soviet troops, as they knew that they could expect the worst in this case.
When Hitler violated the Molotov-Ribbentrop Pact and invaded the Soviet Union in the massive Operation Barbarossa, it was the beginning of an existential struggle between ideological enemies — fascism and communism — in which each side explicitly framed the other as an existential threat. Thus the eastern front was, from the outset, expected to be a war of extermination, and the above-quoted book takes the Nazi-Soviet conflict as paradigmatic of a war of extermination.
It was partially in response to the experience of the eastern front and its war of extermination within the larger framework of the Second World War (which also included the Nazi war of extermination against the Jews) that the Nuremberg principles were formulated. The Nuremberg Principles include as principle VI(c) a list of crimes against humanity:
“These consist of murder, extermination, enslavement, deportation, and other inhuman acts done against any civilian population, or persecutions on political, racial, or religious grounds, when such acts are done or such persecutions are carried out in execution of or done in connection with any crimes against the peace or any war crime.”
All of these war crimes were realized in the course of the Second World War with shocking clarity — the kind of clarity that comes from an ideological war in which a war of extermination was expected to follow from explicitly stated positions of the combatants. History has not always been so clear in its demonstrations of philosophy teaching by examples, but even if earlier history was not as explicit in its prosecution of wars of extermination, less obvious forms have always been with us.
Wars of extermination did not begin in the twentieth century, and even Kant mentioned the possibility of such conflict in his Perpetual Peace. Kant states his sixth article as follows:
6. “No State Shall, during War, Permit Such Acts of Hostility Which Would Make Mutual Confidence in the Subsequent Peace Impossible: Such Are the Employment of Assassins (percussores), Poisoners (venefici), Breach of Capitulation, and Incitement to Treason (perduellio) in the Opposing State”
And in light of this he says of wars of extermination:
“These are dishonorable stratagems. For some confidence in the character of the enemy must remain even in the midst of war, as otherwise no peace could be concluded and the hostilities would degenerate into a war of extermination (bellum internecinum).”
The Latin tag employed by Kant points to the antiquity of the idea. Kant continues:
“…a war of extermination, in which the destruction of both parties and of all justice can result, would permit perpetual peace only in the vast burial ground of the human race. Therefore, such a war and the use of all means leading to it must be absolutely forbidden. But that the means cited do inevitably lead to it is clear from the fact that these infernal arts, vile in themselves, when once used would not long be confined to the sphere of war.”
One of the factors that made wars of extermination explicit in the twentieth century was the emergence of the nation-state as an actor on the international stage. The nation-state was conceived as a political representative of a particular people, and as representative of the aspirations and ambitions of a particular ethnic group, nation-states also frequently have exhibited the worst kind of ethnocentric politics.
I began my book Political Economy of Globalization with the assertion that we must begin with the fact of the nation-state as the central political institution of our time. Whatever we may think of the nation-state — whether one believes that it is a permanent feature of human political organization or that it will soon join empires and kingdoms on the ash-heap of history — it is the central fact of the international system as it exists in our time.
The existential viability of a nation-state is predicated upon the ability of that nation-state to meet existential threats and overcome them. Few would dispute the right of a nation-state to defend itself in an existential struggle, but many would dispute the legitimacy of the grounds for such a struggle. When a political entity claims to be threatened on ideological grounds — when the mere existence of another ethnic group or ideological movement is construed as an existential threat — then we move beyond a defensive struggle to continue to exist and into the realm of ideological conflict that is the natural ground from which wars of extermination grow.
I have already written about the role of nation-states in genocide in Genocide and the Nation-State. I see a strong connection between the two. No war of such magnitude can be waged without the implicit consent of the peoples from whom the troops are drawn, and who continue to make it possible for the state to prosecute such a war. This is one sense in which total war and war of extermination coincide.
. . . . .
. . . . .
. . . . .
In Industrial-Technological Disruption I tried to describe the systemic disruptions to the cycle that drives industrial-technological civilization — science inventing technologies that are engineered into industries that create new instruments for science, leading to further inventions. This cycle of escalation is impeded by counter-cyclical trends such as science experiencing model crisis, stalled technologies, and unintended consequences of engineering.
Among the unintended consequences of engineering I specifically cited industrial accidents. I explicitly discussed industrial accidents in Impossible Desires and Industrialized Civilization and its Accidents. I also discussed industrial accidents obliquely in Complex Systems and Complex Failure, which was concerned with the ways in which complex systems fail; it is a feature of industrial-technological civilization that as science and technology become more sophisticated, the systems that they produce become more complex and therefore exemplify complex failure when they fail. We like to think that we learn the lessons of our accidents and do better next time. And we do. We learn some hard lessons at the cost of lives, capital, and wasted time.
Learning our lessons, however, does not prevent future industrial accidents, because the cycle that drives industrial-technological civilization develops by continually revolutionizing production, and the continual revolutionizing of production means that there are always new scientific discoveries, new technologies, and new industrial processes. New and unfamiliar industrial processes mean new and unprecedented industrial accidents. And it is for this reason that industrial-technological civilization will always involve industrial accidents. One could say that industrial accidents are the natural disasters of industrial-technological civilization.
Thus while industrial accidents seem to be mere contingencies, ultimately irrelevant to the great project of industrialization, they in fact play a constitutive role in industrial-technological civilization, much as natural disasters play a decisive and constitutive role in agrarian-ecclesiastical civilization. It cannot be otherwise, living, as we do, in an accidental world, in which the importance of the uniqueness of the individual also attaches to the uniqueness of individual events, including accidental events.
There is another sense in which industrial accidents shape industrial-technological civilization that is perhaps even more radical than that outlined above, because of the way that it ties in with the maturation of industrial-technological civilization, and therefore with its potential axialization.
Many observers of the regime of contemporary industrial civilization have noted that regulation almost always comes after there has been a major accident that results in multiple deaths. This is one of the ways in which the representatives of the institutions of industrial-technological civilization attempt to demonstrate to their constituents that they have learned the lessons of industrial accidents and are taking measures to address the problem. But, as observed above, industrial-technological civilization will always produce industrial accidents. This means that as industrial-technological civilization develops, it will always produce more accidents, these accidents will usually result in legislation and regulation to address the causes of the accident (ex post facto), and the regulatory burden on industry will always increase even as new technologies are introduced — technologies which often make past dangers (and past regulations) irrelevant.
Thus the maturation of industrial-technological civilization becomes not an expression of the central idea of the civilization in mythological form — as with the axialization of the nomadic paradigm in the great cave art of paleolithic prehistory, or with the Axial Age religions delineated by Jaspers — but a legalistic compilation of regulations (and it could be argued that this formal legalism represents the essential idea of industrial-technological civilization). We have seen this before in civilization, as with the Corpus Iuris Civilis of the Byzantines, also known as Justinian’s Code.
The increasing legal formalism of mature industrial-technological civilization has significant consequences. In an early post, Exaptation of the Law, I argued that law has an intrinsic bias in favor of the past. In that post I wrote the following:
If we think of the common law tradition, in which there is no constitutional basis but only a history of case law, it is obvious that precedent plays a central role. A ruling in the past establishes a convention that, followed in later rulings, preserves the past into the present. And we may think of the establishment of a constitution or formal statutes as a “re-setting” of precedent. Laws and constitutions are not written in a vacuum, and the legal history that precedes such an effort must loom large in the minds of those so occupied.
Industrial-technological civilization develops by continually revolutionizing production, and yet it is being driven by its own institutions in the direction of legalistic regulation biased in favor of the past. This tension comes dangerously close to institutionalizing permanent stagnation, which suggests that the development of industrial-technological civilization carries within itself the seeds of its own existential risk.
And we must not fail to see the central role of procedural rationality in industrial-technological civilization. In Capitalism and Human Rights I argued that the rule of law essential to the emergence of industrial capitalism was subsequently exapted by human rights advocates, and since a rigorous conception of property rights, rigorously observed, is a necessary condition of the development of industrialized capitalism, once these rigorous legal institutions began to be applied to human rights such claims could not be readily denied without calling into question the same property rights that made that civilization possible.
Thus we already have a precedent in which industrial-technological civilization has been forced by its own institutions to accept principles that could be said to compromise the unconditioned pursuit of industrial capitalism. It is, then, not unprecedented to speculate that these same rigorous legal institutions of industrial-technological civilization may force that civilization into strangling itself with regulations and legislation that it feels compelled to observe even at the expense of its continued vitality. Indeed, in so far as the first signs of stagnation are social ossification and a de facto feudalism within industrial society, we can see that this growing legalism is perfectly consistent with the view that crony capitalism may be the mature form of industrial-technological civilization.
While this is not a happy prospect for me, the good news here is that, in so far as permanent stagnation is an existential risk of industrial-technological civilization, if we can understand the structures that generate this risk, we can employ our knowledge in the mitigation of that risk.
. . . . .
. . . . .
. . . . .
9 March 2013
A Psychodynamic Account of Contemporary
Islam and its Place in Civilizational Seriation
Some time ago in From Neurotic Misery to Ordinary Human Unhappiness I discussed a famous Freud quote. The quote runs as follows:
…much will be gained if we succeed in transforming your hysterical misery into common unhappiness. With a mental life that has been restored to health, you will be better armed against that unhappiness.
After this, in Miserable and Unhappy Civilizations, I suggested that Freud’s distinction between neurotic misery and ordinary human unhappiness can be applied not only to individuals but also to social wholes. Thus it makes sense to speak of neurotically miserable civilizations as compared to civilizations possessing merely ordinary levels of human unhappiness.
Then I went yet further afield in Agriculture and the Macabre, in which I tried to make the case that agricultural civilization is particularly vulnerable to neurotic misery. While industrial-technological civilization certainly has its problems and its limitations, whatever may be said of it, it is not macabre and retrospective in the way that agricultural civilization is.
I have been even more specific in identifying the religious wars of Early Modern Europe (also corresponding with the witch craze) as the nadir of Western civilization and as a paradigm case of a civilization in the grip of neurotic misery. Eventually Western civilization grew out of its neurotic misery, although not without an unprecedented level of carnage, and today Western civilization is a fine representative of ordinary human unhappiness as the basis for civilization. Not very exciting, but it’s better than the alternative.
Islam, as an historical phenomenon, is several hundred years behind Christianity in its development. I do not intend this statement to in any way imply that there is anything intrinsic to Islam that keeps its development behind that of Christendom, but there is the historical fact that, of these two religious traditions of the masses, Islam was promulgated six hundred years later than Christianity. Christianity had already been at its internecine squabbles for hundreds of years when Mohammad performed the Hijra to Medina to found the first Muslim community.
The strife we see today in Islam is the sign of a civilization — Islamic civilization — in the grip of neurotic misery. This situation did not come about suddenly, and it is not going to go away suddenly. It is a narrative that must unfold over a period of hundreds of years, and, as I recently wrote in Why tyranny always fails but democracy does not always prevail, Homo non facit saltum — Man makes no leaps. All development is evolutionary.
The trend toward the neurotic misery of Islamic civilization has been developing for quite some time. Charles Doughty, who traveled through Arab lands in the nineteenth century, frequently comments on the fanaticism of his hosts, as, for example, in this passage:
“The high sententious fantasy of ignorant Arabs, the same that will not trust the heart of man, is full of infantile credulity in all religious matter; and already the young religionist was rolling the sentiment of divine mission in his unquiet spirit.”
Charles Montagu Doughty, Travels in Arabia Deserta, Volume 1, Cambridge, 1888, p. 95
“I wondered with a secret horror at the fiend-like malice of these fanatical beduins, with whom no keeping touch nor truth of honourable life, no performance of good offices, might win the least favor from the dreary, inhuman, and for our sins, inveterate dotage of their bloodguilty religion. But I had eaten of their cheer, and might sleep among wolves.”
Op. cit., p. 502
Such passages are most unwelcome today, and many would regard them as an embarrassment better forgotten, but I suspect that Charles Doughty knew a great deal more about Arabia than many an Arabist today. Rather than taking such remarks as a sign of Doughty’s racism, we might take them in historical context as intimations of what was to come. And historical context is crucial here, since precisely the same thing would no doubt have been found in Christendom in a parallel historical context. I have no doubt that if a worldly and learned Muslim visited Europe one or two hundred years before Europe’s religious wars, he would have found much the same thing. In fact, Montesquieu depicted exactly this after Europe’s neurotic misery in his epistolary novel The Persian Letters.
A recent feature in Foreign Policy magazine, It’s Not About Us by Christian Caryl (20 February 2013), about intra-Islamic relations, and especially the split between Sunni and Shia branches of Islam, is an exposition of the extent to which Islam is as much at war with itself as with the infidel — exactly like Christendom during its period of neurotic misery. It is well known that militant Jihadis sympathetic to Al Qaeda tend to be Sunni, while the Persians and minority communities throughout the Arab world are Shia, and that there are radical elements on both sides of this divide who are vying to be recognized as the vanguard of militant Islam in the contemporary world. These sectarian divides within Islam frequently correspond to divisions in political power and economic influence, making the religious quarrel indistinguishable from broader social conflicts (again, like early modern Europe). And why should social groups contest with each other to be recognized as the vanguard of Islamic radicalism? Because there is a social consensus that radical Islamism is the telos of civilization.
Just as there were many sane and rational men who lived through Christendom’s neurotic misery (Michel de Montaigne comes to mind, for example), so too there are many sane and rational Muslims in our age of Islam’s neurotic misery — but it would be dishonest to pretend that the exceptions to the rule are anything other than exceptions. When almost everyone agreed that “spectral evidence” could be admitted in the trials of individuals accused of witchcraft, we must acknowledge that there existed at that time a social consensus that this is what constituted “justice.” And so, too, today, when polls reveal that a majority of Muslims will not condemn atrocities and acts of terrorism carried out in the name of Islam and Jihad, we must acknowledge that there is a social consensus that such acts are widely considered to be permissible, if not encouraged — no matter the reasonable few who are rightly horrified.
I have learned that when talking about the scales of history that apply to civilization and big history that one must go out of one’s way to emphasize that these are not events or movements that can be observed in a single human lifetime. Christianity’s buildup to its own neurotic misery required hundreds and hundreds of years of development; the actual period of neurotic misery lasted as much as two centuries, and the whole episode is still, hundreds of years later, being put behind us. It doesn’t matter how much you might want things to be tied up neatly in your lifetime — if you’re going to discuss these great forces that shape civilizations, you have to get used to the idea that it’s not like observing the life cycles of fruit flies.
Astronomers, who similarly work on very long time scales, have the same difficulty in explaining themselves and getting others to understand in a visceral sense the elapse of eons. The astronomer reconstructs the dynamic history of a universe that seems, to us, to be standing still, by looking in all different directions in the sky and observing different kinds of celestial bodies at different stages of development. The astronomer must then put all these fragments of cosmological history together on one large canvas that he will never himself see in a lifetime, but which he sees in his mind’s eye.
When archaeologists similarly survey different sites and find pottery in different stages of development in different places, they try to put it all together with the movements of ancient peoples. This assembly of a structure in time is called seriation. The astronomer engages in cosmological seriation. (The Hertzsprung–Russell diagram is the seriation of stellar evolution.) The student of civilization and of big history, engages in civilizational seriation.
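The archaeologist’s procedure described above can be made concrete. A minimal sketch follows, using reciprocal averaging, one standard technique for frequency seriation; the sites, pottery types, and counts are entirely hypothetical, chosen only to illustrate how a temporal ordering can be recovered from a single synchronic collection of data:

```python
# A minimal sketch of frequency seriation by reciprocal averaging.
# All site names, pottery types, and counts below are hypothetical.

counts = {
    "Site A": {"cord-marked": 9, "painted": 1, "glazed": 0},
    "Site B": {"cord-marked": 4, "painted": 5, "glazed": 1},
    "Site C": {"cord-marked": 1, "painted": 6, "glazed": 3},
    "Site D": {"cord-marked": 0, "painted": 2, "glazed": 8},
}
types = ["cord-marked", "painted", "glazed"]

def seriate(counts, iterations=10):
    """Sites and types repeatedly re-score each other until a stable
    one-dimensional (temporal) ordering emerges."""
    site_score = {s: float(i) for i, s in enumerate(counts)}  # arbitrary start
    for _ in range(iterations):
        # each type's score: the weighted mean position of the sites it occurs in
        type_score = {
            t: sum(c[t] * site_score[s] for s, c in counts.items())
               / sum(c[t] for c in counts.values())
            for t in types
        }
        # each site's score: the weighted mean of its types' scores
        site_score = {
            s: sum(c[t] * type_score[t] for t in types) / sum(c.values())
            for s, c in counts.items()
        }
        # rescale to [0, 1] so the scores do not collapse toward the mean
        lo, hi = min(site_score.values()), max(site_score.values())
        site_score = {s: (v - lo) / (hi - lo) for s, v in site_score.items()}
    return sorted(counts, key=site_score.get)

print(seriate(counts))  # → ['Site A', 'Site B', 'Site C', 'Site D']
```

The ordering that emerges is a structure in time inferred from data observed all at once — which is precisely what the astronomer does with stars and the student of big history does with civilizations. Note that seriation alone yields only the sequence, not its direction; which end of the ordering is earliest must be fixed by independent evidence.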
We observe but a single slice of time — the present — and from this single slice of time we attempt to reconstruct the whole of the continuum of time. Ultimately, this is a project of temporal seriation.
The limited temporal horizon of most contemporary commentators on political strife makes it impossible to see the larger patterns revealed by civilizational and temporal seriation, and so they make elementary errors of historiography. And not only in politics, but in every aspect of civilization. I have repeatedly tried to point out the misunderstandings in the media of China’s “peaceful rise,” which is really China’s industrial revolution.
Have I repeated myself a sufficient number of times to make my point? I doubt it. But I will keep at it, reminding the reader at every turn that the perspective of Big History cannot be assimilated to the personal experience of time, and that one must pursue a strategy of temporal seriation to see larger patterns that do not reveal themselves to the eye.
One of these larger patterns is the pattern of the development of religion as a mass social phenomenon, and among mass religions one pattern is that of passing through a stage of neurotic misery on the way to the mature expression of religion within a civilization that does not cripple that civilization.
Religion begins with something as small and as personal as a superstition or a ritual observance. Eventually it becomes a system of mythology, and once the system of mythology is systematically integrated with the state structures of agricultural civilization religion becomes a principle of social order and a locus of conflict. This conflict must play itself out until civilization gropes its way toward a social principle consistent with the change and diversity that makes a state successful in an age of industrialized economies. All of this takes time — much more time than any one individual can observe in a lifetime. (There, I’ve repeated myself again.)
The neurotic misery of Islam will persist for hundreds of years, as the neurotic misery of Christendom persisted for hundreds of years. There are perhaps ways to ease the transition and lessen the suffering, but we cannot simply leap over this unpleasantness. It must be worked on in real time, just as a patient on the psychiatrist’s couch must work his way through painful early memories before he can simply be unhappy instead of being neurotically or hysterically miserable.
. . . . .
. . . . .
. . . . .