25 March 2012
In what style should we think? It sounds like an odd question. I will attempt to make it sound like a reasonable one.
It would, of course, be preferable (or maybe I should say, “more natural”) to ask, “In what manner should we think?” or simply, “How should we think?” But I have formulated my question as I have in order to refer to Heinrich Hübsch’s essay, “In what style should we build?” (In welchem Style sollen wir bauen?, 1828).
Building and thinking are both human activities, and thus both can be assimilated to the formulation of Weyl that I quoted in The Clausewitzean Conception of Civilization:
“The ultimate foundations and the ultimate meaning of mathematics remain an open problem; we do not know in what direction it will find its solution, nor even whether a final objective answer can be expected at all. ‘Mathematizing’ may well be a creative activity of man, like music, the products of which not only in form but also in substance are conditioned by the decisions of history and therefore defy complete objective rationalization.”
Hermann Weyl, Philosophy of Mathematics and Natural Science, Appendix A, “The Structure of Mathematics”
What Weyl here refers to as “mathematizing” can be generalized to human cognition generally speaking, and, if we like, we can generalize all the way to a comprehensive Cartesian conception of thought:
By the word ‘thought’, I mean everything which happens in us while we are conscious, in so far as there is consciousness of it in us. So in this context, thinking includes sensing as well as understanding, willing, and imagining. If I say, ‘I see therefore I am,’ or ‘I walk therefore I am,’ and mean by that the seeing or walking which is performed by the body, the conclusion is not absolutely certain. After all, when I am asleep I can often think I am seeing or walking, but without opening my eyes or moving, — and perhaps even without my having any body at all. On the other hand, the conclusion is obviously certain if I mean the sensing itself, or the consciousness that I am seeing or walking, since the conclusion then refers to the mind. And it is only the mind which senses, or thinks about its seeing or walking.
Descartes, Principles of Philosophy, section 9
Do thinking and building have anything in common beyond both being human activities? Is there not something essentially constructive in both activities? (This question is surprisingly apt, because we need to understand what constructive thinking is, but I will return to that later.) Did not Kant refer to the “architectonic” of pure reason, and has it not become commonplace among contemporary cognitive scientists and philosophers of mind to speak of our “cognitive architecture”?
Just taking the term “constructive” in its naïve and intuitive signification, we know that thought is not always constructive. Indeed, it is often said that thought, and especially philosophical thought, must be analytical and critical. Critical thought is not always or invariably destructive, and most of us know the difference between constructive criticism and destructive criticism. Still, thought can be quite destructive. William of Ockham, for example, is often credited with bringing down the Scholastic philosophical synthesis that reached its apogee in Aquinas.
Similar observations can be made about the building trades. While we usually do not include demolition crews among the construction trades, there is a sense in which demolition and construction are both phases in the building process. Combat engineers must be equally trained in the building and demolition of bridges, for example, which demonstrates both the constructive and the destructive aspects of construction engineering.
Just as we have a choice not only of what to build but of the style in which to build, so too we have a choice not only of what we think but of how we think. As a matter of historical fact, I think you will find that the thinking of most individuals is not much more than a reaction, or a reflex. People think in the way that comes naturally to them, and they do not realize that they are thinking in a certain style unless they pause to think about their thinking. Well, this would be one way to characterize philosophy: thinking about thinking.
The unthinking way in which most of us think has the consequence of fostering what may be called cognitive monoculture. Individuals rarely step outside the parameters of thought with which they are comfortable, and so they allow their thoughts to follow in the ruts and the grooves left by their ancestors, much as architects, for many generations, reiterated classical building styles for lack of imagination of anything different.
It is probably very nearly impossible that I should write about building and thinking without citing Heidegger, so here is my nearly obligatory Heidegger citation, which, despite my general dislike of Heideggerian thought, suits my purposes quite perfectly:
“We come to know what it means to think when we ourselves try to think. If the attempt is to be successful, we must be ready to learn thinking.”
Martin Heidegger, What is called thinking? Lecture I
I agree with this: a serious attempt at thinking entails that we come to know what it means to think, and moreover we must be ready to learn thinking, and not merely take it for granted. But I find that I do not agree with the very next paragraph in Heidegger:
“As soon as we allow ourselves to become involved in such learning, we have admitted that we are not yet capable of thinking.”
Martin Heidegger, What is called thinking? Lecture I
In fact, we are capable of thinking, though the problem is that we do not really know whether we are thinking well or thinking poorly. When we think about thinking, when we reflect on what we are doing when we are thinking, we will discover that we have been thinking in a particular style, even if we were not aware that we were doing so — much like Molière’s Monsieur Jourdain, who did not know that he had been speaking prose his entire life.
If we pay attention to our thinking, and think critically about our thinking, we stumble across a number of distinctions that we realize can be used to classify the style of thought in which we have been engaged: formal or informal, constructive or non-constructive, abstract or concrete, objective or subjective, theoretical or practical, a priori or a posteriori, empirical or rational. These distinctions define styles of thought, and it is only in reflection that we realize that one or another of these terms has applied to our thought, and thus that we have been thinking in this particular style.
Ideally one would be aware of how one was thinking, and be able to shift gears in the middle of thinking and adopt a different mode of thought as the need or desire arose. The value of knowing how one has been thinking, and realizing the unconscious distinctions one has been making, is that one is now in a position to provide counter-examples to one’s own thought, and one is therefore no longer strictly reliant upon the objections of others who think otherwise than oneself.
The cognitive monoculture that we uncritically accept before we learn to reflect on our own thinking is more often than not borrowed from the world, and not the product of our own initiative. Are we living, intellectually, so to speak, in a structure built by others? If so, ought we to question or to accept that structure?
This is a theme to which Merleau-Ponty often returned:
“…it is by borrowing from the world structure that the universe of truth and of thought is constructed for us. When we want to express strongly the consciousness we have of a truth, we find nothing better than to invoke a topos noetos that would be common to minds or to men, as the sensible world is common to the sensible bodies. And this is not only an analogy: it is the same world that contains our bodies and our minds, provided that we understand by world not only the sum of things that fall or could fall under our eyes, but also the locus of their compossibility, the invariable style they observe, which connects our perspectives, permits transition from one to the other, and — whether in describing a detail of the landscape or in coming to agreement about an invisible truth — makes us feel we are two witnesses capable of hovering over the same true object, or at least of exchanging our situations relative to it, as we can exchange our standpoints in the visible world in the strict sense.”
Maurice Merleau-Ponty, The Visible and the Invisible
I trust Merleau-Ponty with this idea, but, to put it bluntly, there are many whom I would not trust with this idea, since the idea that our cognitive architecture is borrowed from the world that we inhabit can be employed as a strategy to dilute and perhaps even to deny the individual. One could make the case on this basis that we are owned by the past, and certainly there are those who believe that inter-generational moral duties flow in only one direction, from the present to the past; but merely to formulate it in these terms suggests the possibility of inter-generational moral duties that flow from the past to the present.
Certainly by being born into the world we are born into a linguistic and intellectual context at the same time as we are born into an existential context, and this fact has profound consequences. As in the passage from Marx that I have quoted many times:
“Men make their own history, but they do not make it as they please; they do not make it under self-selected circumstances, but under circumstances existing already, given and transmitted from the past.”
Karl Marx, The Eighteenth Brumaire of Louis Napoleon, first paragraph
Marx gives us a particular perspective on this idea, but we can turn it around and, by reformulating Marx, attain a different perspective on the same idea. Marx takes the making of history to be a unidirectional process, but it goes both ways: men make history, and history makes men:
“Men begin under circumstances existing already, given and transmitted from the past, and make their own history as they please from what they select of the past. The past has no reality but that which men give to it.”
The circumstances transmitted to us from the past are not arbitrary; these circumstances are the sum total of the efforts of previous generations to re-make the world during their lives according to their vision. We live with the consequences of this vision. Moreover, the circumstances we create are in turn transmitted to the future; this is our legacy, and future generations will do with it as they will.
The architect, too, begins with circumstances existing already, given and transmitted from the past. For Hübsch this is the problem. Hübsch begins his brief treatise with a ringing assertion that architectural thought is dominated by an archaic paradigm:
“Painting and sculpture have long since abandoned the lifeless imitation of antiquity. Architecture has yet to come of age and continues to imitate the antique style. Although nearly everyone recognizes the inadequacy of that style in meeting today’s needs and is dissatisfied with the buildings recently erected in it, almost all architects still adhere to it.”
Heinrich Hübsch, In what style should we build? 1828
In the twenty-first century this is no longer true. Building has been substantially liberated from classical forms. In fact, since Hübsch’s time, a new classicism — international modern — rose, dominated for a short time, and now has been displaced by a bewildering plethora of styles, from an ornately decorative post-modernism to outlandish structures that would have been impossible without contemporary materials technology. There are, to be sure, architectural conventions that remain to be challenged, and in the sphere of urban planning these conventions can be quite rigid because they become embodied in legal codes.
For our time, the most forceful way to understand Hübsch’s question would be, “In what style should we build our cities?” Another way in which Hübsch’s question retains its poignant appeal is in the form that I suggested above: in what style should we think?
Are we intellectually owned by the past? Is there a moral obligation for us to think in the style of our grandfathers? A semi-humorous definition attributed to Benjamin Disraeli has it that, “A realist is a man who insists on making the same mistakes his grandfather did.” Are we obliged to be realists?
Here we see the clear connection between building and thinking. Just as we might think like our grandfathers, so too we might build like our grandfathers. This latter was the concern of Hübsch. That is to say, we can inhabit (and restore, and reconstruct) the intellectual constructions of our forefathers just as we can the material constructions of our forefathers.
It would be entirely possible for us today to construct classical cities on the Greco-Roman model; it is even possible to imagine a traditional Roman house with hot and cold running water, electric kitchen appliances, and WiFi. That is to say, we could have our modern conveniences and still continue to build as the past built. We could choose to literally inhabit the structure of the past, as civilization did in fact choose to do for almost a thousand years when classical cities were built to essentially the same plan throughout the ancient world. (See my remarks on this in The Iterative Conception of Civilization.)
The Middle Ages can serve as the intellectual analogue of what the modernized Roman house is for living: the role of intellectual authority in medieval thinking was unprecedented and unparalleled. If experience contradicted authority, so much the worse for experience. If a classical text stated that something was the case, and the world seemed at variance with the text, the world was assumed to be in error. As classical antiquity lived with the same buildings for a thousand years, so the Middle Ages lived with the same thoughts for a thousand years. There is no reason that we could not take medieval scholarship, as we might update a Roman house, and add a few modern conveniences — like names for the chemical elements, and so on — and have this perfectly serviceable intellectual context as our own.
Thus the two previous macro-historical stages of Western civilization prior to modernism — namely, classicism and medievalism — represent, respectively, the attempt to build in the style of the past and the attempt to think in the style of the past. It has been the rude character of modernism to focus on the future and to be dismissive of the past. While this attitude can be nihilistic, we can now clearly see how it came about: the other alternatives were tried and found wanting.
. . . . .
. . . . .
. . . . .
31 December 2011
It seems appropriate on this, the last day of 2011, to reflect upon the year now almost expired, even as the new year is already being celebrated in time zones in advance of my own. As a night person who is always in better spirits and more energetic very late in the day than early in the morning, it also seems strangely appropriate that I should be near the end of the global “day,” since the date line lies west of me, out in the Pacific Ocean, and the next large landmass on the other side of the date line lies near the beginning of the global “day” — it is quite literally the Land of the Rising Sun.
It was recently reported that a couple of islands in the Pacific — Samoa and Tokelau — decided to switch to the other side of the international date line, skipping Friday altogether and advancing a day in order to align their calendars with those of their major trading partners, Australia and New Zealand. If I had been a Samoan or a Tokelauer I would have been rather irritated with the date switch, as I would have enjoyed being on the very tail end of the global day.
What is to be said of 2011? Did 2011 reveal any new truths to the world, or exhibit any coherent pattern or structure?
Just a few days ago in The Stratfor Hack I said that I had come to the realization that it is just as important to deny the existence of historical patterns that are not in fact exhibited by events as it is to bear witness to historical patterns that are in fact exhibited in events. The more I think of this, the more I think it is more important to resist the attribution of illusory and fallacious historical patterns and trends, since we as human beings are much more likely to find order where there is none than to deny apparent order where there is, in fact, order.
In Futurism without Predictions I argued for discerning patterns in history as the appropriate form of futurism, as against the attempt to make detailed predictions. This is like the difference between being a day trader in the stock market and buying stocks on the basis of research and value. In Confirmation Bias and Evolutionary Psychology I argued that the well-known phenomenon of confirmation bias has a basis in our evolutionary history, since believing viscerally in what one is doing is probably a condition for optimal exertion in the struggle of life.
If we put together the critique of prediction-based futurism, the need to discern patterns in history, and the need to transcend our evolutionary predisposition to find patterns where there are none, we come to see that refusing to find patterns where there are none is one of the most important intellectual exercises in the understanding of history. This strikes me as an application of Copernicanism to human history: the principle of mediocrity (or the cosmological principle, if you prefer) demands that we not assume that our perspective is special. Thus to claim for any particular year, such as the year just elapsed, that it was a watershed or an historical pivot or a time of great transition is probably to delude ourselves.
And this is exactly what I see in 2011. Certainly it was a year in which much changed, but there have been at least as many historical continuities as historical discontinuities, if not more continuities. 2011 was a year in which many people suffered horrible events and terrible calamities, but it was also a year in which many of the seven billion people on the planet lived a life largely undisturbed and not greatly differentiated from the previous year. If you were to run the numbers, I suspect that you would find that those who suffered a particularly terrible fate during the year (say, for example, the victims of the combined disasters of the Sendai earthquake and the Fukushima nuclear accident) would constitute a small minority of the world’s total population. This does not mean that their suffering was insignificant, only that it did not necessarily shape world events or constitute an historical pattern.
As I see it, then, 2011 was a mixed bag, and in the same spirit of historical Copernicanism, I suspect that 2012 will be a similarly mixed bag. Even as I say this I expect that numerous predictions are being made for great historical watersheds in the coming year, just as numerous retrospectives will be identifying 2011 as the year in which the world changed entirely. But one year is very much like another. Few stand out as anything especially shocking or surprising. There is nothing new under the sun.
My perspective is deflationary (in the best tradition of recent analytical philosophy) but sometimes deflationism is necessary. The alternative is to be deluded, and I prefer not to be deluded.
. . . . .
H a p p y N e w Y e a r !
. . . . .
. . . . .
. . . . .
. . . . .
24 November 2011
It is customary in many households across the US to employ Thanksgiving as a pretext for an explicit “counting of one’s blessings,” which may even take the ritualized form of going around the Thanksgiving table, one person at a time (whether before or after the meal — I’m not sure that this makes a difference, which may itself exemplify the liberty of indifference), and having each individual present give a recitation of the things that he or she is grateful to have received.
I have often quoted Joseph Campbell to the effect that a ritual is an opportunity to participate in a myth. In what myth are we participating when we engage in a ritualized recitation of the things for which we are thankful? This seems like an easy enough question, but I think it would actually be quite difficult to give an adequate answer to it.
At the risk of sounding nationalistic, here’s the short answer: Thanksgiving rituals are an opportunity to participate in the Myth of America.
Thanksgiving is among the most recent and the most “American” of our holidays. Unlike, say, All Souls Day, with its medieval roots, or Christmas, with its roots in early antiquity, or Easter, with its roots extending well into the prehistoric past of spring fertility rituals celebrated from time immemorial, Thanksgiving has particular roots in early American history — more especially, American history before America was America. Thanksgiving represents for us the prehistory of America, that is to say, the essential elements that constitute the sine qua non of a free, independent, and prosperous republic.
Nota Bene: If you prefer an ideologically tendentious version, you may choose to call American prehistory the conquest and exploitation of North America by white males of European descent, though I must point out that the perpetrators of said conquest and exploitation all ultimately became creoles, and therefore would not have been welcome at the Thanksgiving tables of their “family of source” in the Old World, if indeed this family of source had had a Thanksgiving table, which in fact they would not have had prior to this American innovation.
When we give an explicit account of the things for which we are thankful, we are participating in a re-enactment of the essential elements — presumably all present at the mythical Thanksgiving table shared alike by Pilgrim fathers and Native Americans — that made America possible, and which will sustain the myth of America into the future. In so far as a myth is a metaphor, we are, we become, these virtuous Pilgrim fathers in our act of thanksgiving, shared across time.
Given the prevalence today of apocalypticism and declension rhetoric, I don’t suppose that many people today are thinking in terms of the myth of America sustained over la longue durée, as in times of recession the myth of prosperity and plenty is more difficult to sustain.
The tendency of Americans to exemplify, more or less, Henry Ford’s view that history is bunk magnifies every crisis, so that each obstacle in the path of progress is seen as unprecedented and perhaps insuperable. Such things are forgivable in a young republic, but we are under no obligation to perpetuate them ourselves.
Fernand Braudel, who more than any historian exemplified the perspective of la longue durée, occasionally makes reference to contemporary events in his enormous three-volume work, Civilization and Capitalism. The book was published in 1979, when the economies of the most highly developed industrialized nation-states were, like today, stagnant and not particularly hopeful. All of Braudel’s references to the present reflect this then-current “crisis” of capitalism. Not long after, this “crisis” of capitalism passed, market economies grew dramatically not only in scale but also in productivity, and the whole computer and telecommunications revolution, which we now take for granted, came about.
That Braudel, the quintessential historian of la longue durée, would characterize economic crisis in terms of the stagnancy of the late 1970s points both to the limitations of anthropic bias and to the fact that tensions within the world are perennial: both the conflicts and the ideals (not to mention the attempts to put ideals into practice) repeatedly recur in novel iterations. The problems of the late 1970s look a lot like the problems of today; these problems can be expected to re-emerge and re-assert themselves throughout the history of industrialized civilization. However, events that submerge and de-emphasize these same problems will also recur throughout the history of industrialized civilization. Such forces, which create long-term cycles in economics and society, were thus of the greatest interest to structuralist historians. If there are few structuralist historians today, that is only because history, too, is subject to cycles, and the structuralist mode of thought can be expected to emerge and submerge repeatedly in intellectual history.
So much for history. What about today — Thanksgiving Day? For what am I thankful on this Thanksgiving Day? What is my Thanksgiving Latourian litany?
I am thankful to live in a world that is so astonishingly interesting that I never fail to be surprised and fascinated by whatever I find. Whether I am considering natural history or human history or narrowly conceived intellectual history, there is always something to pique my interest and to which I could, had I only several lifetimes, devote a lifetime of study. I am as intrigued by the ecology of predation as I am by medieval controversies about the beatific vision or contemporary research in the ontology of formal systems. In contemporary parlance, “it’s all good.”
While my gratitude for living in an endlessly interesting world may be merely an artifact of anthropic bias, such that I find the world interesting because I am a part of this world, and indeed a consequence of this world, the possibly paradoxical fact of the matter is that any event or discovery that would reveal the limitations of my perspective due to anthropic bias would be of the greatest interest to me — and would thereby make the world an even more interesting place.
All of the discoveries of science, all of the Copernican heritage that has heretofore shown up anthropic bias and revealed the world in all its counter-intuitive splendor — these things are to me among the most fascinating things about the world, for which I feel a certain epistemic gratitude. If further investigation of the world should reveal humanity as being even more marginal, and the world as ultimately far larger and more diverse than we expected, and perhaps more than we can comprehend, that would be very interesting indeed, and I can only envy the future for its knowledge of such a world.
. . . . .
. . . . .
. . . . .
. . . . .
5 November 2011
Today is the third anniversary for Grand Strategy: The View from Oregon! Happy Anniversary to me!
I began back in November 2008 with…
…celebrated my first anniversary in November 2009 with…
…and kept the celebration going in November 2010 with…
Thanks for your readership. I appreciate each and every hit I receive. It has become a great amusement to me to track my hits through StatCounter, and to see what exotic locales around the globe have chanced upon this forum.
In the past year I had a new high day of more than 2,000 hits, and I passed the half million hits mark — still far short of those who write about fashion or Kim Kardashian’s brief marriage or other “trending” topics, but not bad for a philosophical commentary on geopolitics, cosmology, and issues of strictly theoretical interest.
Come back soon!
. . . . .
. . . . .
. . . . .
9 October 2011
3 October 2011
I am headed back to Portland after having come to Florida for the 100 Year Starship Study public symposium in Orlando. I’ve chronicled my reactions to the symposium in three posts that I wrote on the evening of each day of the symposium while the events were still fresh in my mind: 100 Year Starship Study Symposium Day 1, 100 Year Starship Study Symposium Day 2, and 100 Year Starship Study Symposium Day 3.
I certainly learned some important lessons. If I ever get the chance to make another presentation, my first question will be, “How much time do I have?” My second question will be, “Do you have any limit on the number of slides that I can use with my presentation?” These are the parameters of public speaking. On a blog one can write as much or as little as one likes. The format is as flexible as one’s inspiration of the moment. When the personal time of others is involved, however, one’s degrees of freedom are constrained. That is a valuable lesson.
While I will not get a second chance to make a first impression, the ideas that I incorporated into my presentation will get a second chance, as one of the requirements for speaking at the 100YSS was to submit a paper, with the intention that the paper be published in some future number of the Journal of the British Interplanetary Society.
While I was working on my paper and my presentation I conceived a great many ideas that I could not include in my paper, and it would only take me a few months to write it all up in a book-length manuscript if I chose to do so. I may do this eventually, simply because of the intrinsic interest that I have in the ideas, but an earlier lesson learned is that no one buys and almost no one reads the books that I have self-published, so I hesitate to do any more self-publishing except for definitive manuscripts that express my point of view and which I wish to be preserved in some form, regardless of their being commercially non-viable.
I will continue to work on these ideas, since it was my intrinsic interest in the ideas that made me formulate the thoughts in the first place, and this ultimately led to my being present at the 100YSS symposium. As always happens with my philosophical projects, the ideas ultimately “leak” over into other projects, and I have already found important points of connection between these ideas about the moral value of a spacefaring civilization and more general concerns I have in metaphysics and ontology. While this intertextuality of my projects makes it extraordinarily difficult to finish any one project (which is a disadvantage), it also gives a robust philosophical context to any one idea, so that if I am able to give expression to a given idea, I also have a great deal of background material that gives consistent theoretical underpinnings to my work (which is an advantage).
Note added later
In so far as I filled fourteen pages of my notebook during my flights back from Tampa through Atlanta to Portland, the influence of having attended the symposium seems to be “jelling” in my mind and proving fruitful so far. We shall see if any really first class ideas come out of it.
. . . . .
. . . . .
. . . . .
Today is the birthday of Jorge Luis Borges, one of the greatest writers of the twentieth century, so I wish a very happy birthday to the shade of Borges. The kind of writing that Borges did is a rare treat. He was not only a good writer and an entertaining stylist who wrote fun stories; his stories were based on intriguing ideas. When you read a Borges story you are likely to encounter an idea that you would never have thought of on your own, but once you have the idea in mind, it insinuates itself into your thoughts.
This, at least, is what I experienced when I first read Borges, at the urging of a friend who would not let up until I read a Borges story. I am glad in retrospect for his persistence. The first Borges story I read was Funes, The Memorious. This is not among Borges’ better-known stories, and in fact I read a review (I think it was in Time magazine some years ago, so consider the source) of Borges’ collected works in which the author of the review singled out Funes as among the least interesting of Borges’ stories. But it was the first one I read, and therefore it has had an ongoing influence on my thought.
I found it curious that Borges specifically assigned Ireneo Funes an infallible perception of time prior to his acquisition (Is that what we should call it?) of an infallible memory. There is, of course, a relation between time and memory, but, in a sense, an inverse relation. I would think (making of it a thought experiment) that if one had an infallible memory, one would lose track of time, as so much would be noticed that the moments otherwise flying by would be laden with perceptions and associations, so much so that time would drag. Since no one I know of has actually had a perfect and infallible memory, this must remain speculation. I have often come back in my thoughts to this Borgesian connection between time and memory (which, I suppose, is also a Proustian connection), and I think that there is much more to be said about this.
At the same time that I read the Funes story, I was reading a book about Homer. The author’s theory was that Homer, traditionally blind (as was Borges), was among the last practitioners of a great tradition of reciting epic poetry from memory. The author suggested that Homer, in a last, great gesture, expended himself in an especially detailed recitation of the traditional epic, which was then taken down for posterity by an amanuensis. On this account, Homer was blind and illiterate, but through dictation the tradition was preserved at just the moment when oral culture was disappearing with the advent of literary culture, which would eventually have doomed the tradition that Homer represented.
In Book II of the Iliad, immediately before beginning the recitation of the assembled forces (often called the “catalogue of ships”) — a particularly difficult feat of memory that entails many proper names and fewer of the stock phrases (like “the wine-dark sea”) that made oral epic poetry possible — Homer invokes the Muses for assistance, as he does at the beginning of the epic. Only the immortals, it would seem, have the capacity for such feats of memory.
Robert Greenberg of the San Francisco Conservatory of Music (and serial lecturer for The Teaching Company) says that contemporary listeners don’t know how to understand or appreciate polyphonic music, that they have lost the talent through disuse, and that if one is going to comprehend Palestrina or Bach today, one must school oneself in hearing the intertwining melodic lines. Similarly, literary man has lost the talent of oral epic poetry after the tradition of Homer. But Funes has exceeded Homer. Funes has gone even farther than the Lockean thought experiment in substituting proper names for general concepts, farther than this unworkable conception of thought, which would make language impossible.
Funes, in a sense, then, is like a return to a lost, pre-literary, even pre-conceptual Eden, in which men actually noticed the things they saw because they did not reduce them to words or concepts. (Perhaps Ireneo Funes knew that to see is to forget the name of the thing one sees.) By means of language and then literature, we have been expelled by an angel with a flaming sword from this pre-predicative world, and now must earn our experience by the sweat of our brow. What Adam had for the taking, we struggle to grasp. The reduction of numbers to proper names is a return to a pre-conceptual mathematics, to a tallying of details, an Adamic mathematics. Whether the Fall from Adamic innocence and virtue to explicit and formal knowledge is indeed a Fall, or if it is instead a condition of progress, is in a sense the same question as whether Funes has fallen from our present state of sophistication, or whether he was restored to the mind’s Eden. The Borges story is silent on this point.
Unlike the Funes story, the Borges story about the universal library (La biblioteca de Babel), as entertaining as it is intriguing, is familiar to almost everyone. Having heard the idea, one can scarcely stop oneself from occasionally musing on it. Quine, in his Quiddities, calls the universal library a “melancholy fantasy” and goes on, in the finest tradition of reductive analytical philosophy, to argue that the idea amounts to nothing more than the fact that information can be encoded, and that everything we have written down could ultimately be expressed by appropriate concatenations of the dot and dash of Morse code.
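Quine’s reduction of the library to encoding can be made concrete with a small sketch. The code below is only an illustration, not actual Morse code (which is variable-length): it maps each character to the bits of its 8-bit ASCII value, written as dots and dashes, and so assumes plain ASCII text. The function names are my own invention.

```python
# Toy illustration of Quine's point that any text reduces to two symbols.
# Not actual Morse code: each character becomes the 8 bits of its ASCII
# value, with 0 written as "." and 1 written as "-".

def to_two_symbols(text: str) -> str:
    bits = "".join(format(ord(c), "08b") for c in text)
    return bits.translate(str.maketrans("01", ".-"))

def from_two_symbols(code: str) -> str:
    bits = code.translate(str.maketrans(".-", "01"))
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

encoded = to_two_symbols("Babel")
# the dot-and-dash string decodes back to the original text
assert from_two_symbols(encoded) == "Babel"
```

Any volume of the universal library could in principle be rewritten this way, which is Quine’s point: the library needs no more than two symbols, at the price of longer books.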
Quine is right, of course, but his “explanation” does not dissipate the mystique of the library described by Borges, and anyone who loves libraries will find his mind wandering back to those labyrinthine corridors even after having read Quine. And so it was, musing on this melancholy fantasy, that I hit upon a paradoxical notion. While the universal library is impossibly large, it is nevertheless finite. In a finite library it would seem that there could not be any books of infinite length. However, any long book can be broken into multiple volumes. What is to prevent a book from having an infinitude of continuing volumes, and thus being a book of infinite length?
At some point the volumes would have to repeat themselves, since the finite number of possible combinations of symbols would eventually be exhausted. But how would the volumes be identified? Each volume would end, “continued in volume such-and-such,” but eventually the number of the volume would be inexpressible within the finite dimensions of the books in the universal library. At this point, alternative conventions for naming numbers could be established, and this puts us in mind of the above-mentioned Funes, the Memorious. The project of Ireneo Funes, a “vernacular superman” (as Borges calls him) with a perfect memory, was to construct a number system with proper names for all numbers, which the narrator of the tale identifies as the opposite of a system of numeration.
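The point at which the combinations would be exhausted can be estimated from the specifications Borges gives in the story: an alphabet of twenty-five orthographic symbols, and volumes of 410 pages, with 40 lines to the page and about 80 characters to the line. A back-of-the-envelope sketch in Python (only the specifications are from the story; the arithmetic is mine):

```python
import math

# Specifications from Borges' story: 25 orthographic symbols; each
# volume has 410 pages, 40 lines per page, 80 characters per line.
SYMBOLS = 25
CHARS_PER_BOOK = 410 * 40 * 80                 # 1,312,000 character positions

distinct_books = SYMBOLS ** CHARS_PER_BOOK     # every possible volume
digits = math.floor(CHARS_PER_BOOK * math.log10(SYMBOLS)) + 1

print(f"character positions per volume: {CHARS_PER_BOOK:,}")
print(f"distinct volumes: a number of {digits:,} digits")  # about 1.8 million digits
```

The number of distinct volumes is finite, as the story insists, but it runs to roughly 1.8 million digits, which is why any endless sequence of continuing volumes would eventually be forced to repeat.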
On an exhaustive formulation, the vast majority of books in the universal library would consist of gibberish. What would make the library more interesting would be a librarian charged with the task of eliminating volumes of gibberish. However, this task would become controversial. The librarian could safely throw away volumes in which not a single coherent word appears, but one would hesitate to formulate more robust directives, which would force the removal of works by Lewis Carroll and Edward Lear.
Beyond the mere elimination of volumes containing too many (however many that may be) non-words, the task of eliminating volumes in which the arrangement of the words is incoherent would be even more controversial. We could safely exclude volumes containing “too many” failures of syntax (again, how many are “too many”?), but we couldn’t go as far in this regard as we could in eliminating nonsense words, as we would risk excluding stream of consciousness narratives. The work of the librarian would reach its apogee of controversy in any attempt to exclude works of semantic or pragmatic incoherence, for any claim in this arena would generate counterclaims ad infinitum. And so, although the universal library is finite in regard to the number of volumes it retains, we can safely predict that it would be infinite in regard to the correspondence it would generate.
¡Feliz cumpleaños Borges!
. . . . .
. . . . .
. . . . .
A couple of years ago I made it to my favorite beach (well, at least one of my favorite beaches) on the Oregon Coast on the very day of the summer solstice, and I called this my Oregon Coast Summer Solstice Celebratory Picnic. I even made a video of the day and put it on YouTube. Last year I wasn’t able to celebrate the solstice on the beach, and this year I’m a bit late, but not too late to enjoy a gloriously sunny day on the Oregon coast.
The beach at Cape Meares is a great place for a picnic because it is covered with the bleached bones of our ancient coastal forests. If you pass through the Coast Range on the way to Tillamook, on the way to the Three Capes Scenic Loop, on the way to Cape Meares, you pass through these forests.
During our heavy winter rains, some trees fall in the forest, and some of these trees fall to the bottom of canyons. Among those that fall to the bottom of canyons, some are caught in a torrent sufficient to drag them to a river. Among those that float down as far as the river, some are carried out to the ocean, and among those that make it to the ocean, some are flung back up onto the beach by the tides.
What this means for the aspiring picnicker is that the beach has a lot of wood debris, and much of that debris sits high enough up on the shore that it is quite dry and perfect for a campfire. The wood is also broken up into many different sizes, which is convenient for building and maintaining a campfire.
Cape Meares beach also has a great quantity of stones, out of which one can easily and readily build an enclosure to confine one’s campfire, as well as material to hold up whatever grill or food you attempt to balance over the fire. An additional virtue of Cape Meares is that, while it has a lot of rounded beach rocks, the rocks don’t extend all the way to the ocean (as at Ecola Park, where my parents often took my sisters and me when we were children). There is an enormous expanse of sand, especially when the tide is out, that is great for a long walk on the beach. For this reason, among others, I give five stars to Cape Meares.
. . . . .
. . . . .
. . . . .
21 June 2011
Happy summer solstice!
Today is the first day of summer in the northern hemisphere, when the axial tilt of the earth is most steeply inclined toward the sun.
In Oregon we have been having a wet, rainy, and cool spring. In the last part of spring, in June before the solstice, people expect to begin seeing summer weather, and sometimes they do, but mostly they don’t. Sometimes we even have a cool summer, too. I can remember summers from my childhood when it poured down rain on the 4th of July, spoiling the fireworks in the process, which is the sort of thing that children remember.
Since Oregon’s population has grown rapidly over the past decades, and much of that population growth has come from people moving to Oregon from warmer, sunnier states south of us, there is not a little grumbling to be heard in the late spring when it is still raining and with no sign of the sun. The local newspaper even makes sport of the weather, referring to the “strange orange disk in the sky” when the sun does make an unexpected appearance, but this kind of humor does encapsulate a certain feeling of the sun as an alien presence.
I often tell people that they shouldn’t expect good weather until August. September is often very nice here, and there have been sunny, bright days in early October when I have been on the beaches of the Oregon coast. So, if you should come to Oregon and you want to see the sun while you’re here, I would recommend August. Don’t expect to show up in June and see weather like southern California’s.
All the same, this year has been particularly wet. Just yesterday it was overcast most of the day, but for the first summer day itself we have a beautifully sunny day. It is, in fact, a summer day that looks like a summer day.
To celebrate the first day of summer I went canoeing.
. . . . .
. . . . .
. . . . .
. . . . .