29 March 2012
Science has become central to industrial-technological civilization. I would define at least one of the properties that distinguishes industrial-technological civilization from agriculturalism or nomadism as the conscious application of science to technology, and the conscious application in turn of technology to industrial production. Prior to industrial-technological civilization there were science and technology and industry, but the three were not systematically interrelated and consciously pursued with an eye toward steadily increasing productivity.
The role of science within industrial-technological civilization has given science and scientists a special place in society. This is not the glamorous role of film and music and athletic celebrities, and it is not the high-flying role of celebrity bankers and fund managers and executives, but it is nevertheless a powerful role. Just as Shelley said that poets were the unacknowledged legislators of the world, we can say that scientists are the unacknowledged legislators of industrial-technological civilization. Foucault came close to saying this when he said that doctors are the strategists of life and death.
I have previously discussed the ideological role of science in the contemporary world in The Political Uses of Science. Perhaps the predominant ideological function of science today is the role of “big science” — enormous research projects backed by government, industry, and universities that employ the talents of hundreds if not thousands of scientists. When Kuhnian normal science has this kind of backing, it is difficult for marginal scientific enterprises to compete. Big science moves markets and moves societies not because it is explicitly ideological in character, but because it is effective in meeting practical needs (though these needs are socially defined by the society in which science functions as a part).
Despite the fact that progress in scientific research is driven by the falsification and revision of theories through the expedient of experimentation, the scientific community has been surprisingly successful in closing ranks behind the most successful scientific theories of our time and presenting a united front that does not give an accurate impression of the profound differences that separate scientists. Often a scientist spends an entire career trying to get a hearing for his or her idea, and this effort is not always successful. There are very real and bitter differences between the advocates of distinct scientific theories. The scientist sacrifices a life to research in a way not unlike the soldier who sacrifices his life on the battlefield: each uses up a life for a cause.
I have some specific examples in mind when I say that scientists have been successful at closing ranks behind what Kuhn would have called “normal science.” I have written about big bang cosmology and quantum theory in this connection. In Conformal Cyclic Cosmology I noted at least one theory seeking empirical evidence for the world prior to the big bang, while in The limits of my language are the limits of my world I discussed some recent experiments that seem to give us more knowledge of the quantum world than traditional interpretations of quantum theory would suggest is possible.
No one of a truly curious disposition could ever be satisfied with the big bang theory, except in so far as it is but one step — and an admittedly very large step — toward a larger natural history of the universe. Given that the entire observable universe may be the result of a single big bang, any account of the world beyond or before the universe defined by the big bang presents possibly insuperable difficulties for observational cosmology. But the mind does not stop with observational cosmology; the mind does not stop even when presented with obstacles that initially seem insuperable. Slowly and surely the mind seeks the gradual way up what Dawkins called Mount Improbable.
Despite the united front that supports fundamental scientific theories (the sorts of science that Quine would have placed near the center of the web of belief), we know from the examples of Penrose’s conformal cyclic cosmology and the recent experiments attempting to simultaneously measure the position and velocity of quantum particles that scientists are continuing to think beyond the customary interpretations of theories.
The often-repeated claims that space and time were created simultaneously in the big bang and that it is pointless to ask what came before the big bang (as earlier generations were assured that it was illegitimate to ask “Who made God?”), and the claims of the impossibility of simultaneous measurements of a quantum particle’s position and velocity, have not stopped the curious from probing beyond these barriers to knowledge. One must, of course, be careful, for there is a danger of being seen as a crackpot, so such inquiries are kept quiet until some kind of empirical evidence can be produced. But before the evidence can be sought, there needs to be an idea of what to look for, and an idea of what to look for comes from a theory. That theory, in turn, must exceed the established interpretations of science if it is to look for anything new.
We know what happens when scientists not only say that something is impossible or unknowable, but also accept that certain things are impossible or unknowable, actually cease to engage in inquiry, and make no attempt to think beyond the limits of accepted theories: we get a dark age. A recent book has spoken of the European middle ages as The Closing of the Western Mind. (In the Islamic world a very similar phenomenon was called “Taqlid,” or “the closing of the gates of Ijtihad.”) When scientists not only say that nothing more can be known, but actually act as though nothing more can be known, and cease to question normal science, intellectual progress stops. This has happened several times in human history (although I know that this is a controversial position to argue; cf. my The Phenomenon of Civilization Revisited).
It is precisely the fact that science continues to be consciously and systematically pursued in the modern era despite many claims that everything knowable was known that sets industrial-technological civilization apart from all previous iterations of civilization.
Science goes on behind the scenes.
. . . . .
. . . . .
. . . . .
17 March 2012
One of the greatest contributions to science in the twentieth century was Jane Goodall’s study of chimpanzees in the wild at Gombe, Tanzania. Although Goodall’s work represents a major advance in ethology, it did not come without criticism. Here is how Adrian G. Weiss described some of this criticism:
Jane received her Ph.D. from Cambridge University in 1965. She is one of only eight other people to earn a Ph.D. without a bachelor’s (Montgomery 1991). Her adviser, Robert Hinde, said her methods were not professional, and that she was doing her research wrong. Jane’s major mistake was naming her “subjects”. The animals should be given numbers. Jane also used descriptive, narrative writing in her observations and calculations. She anthropomorphized her animals. Her colleagues and classmates thought she was “doing all wrong”. Robert Hinde did approve her thesis, even though she returned with all of his corrections with the original names and anthropomorphizing.
Most innovative science breaks the established rules of the time. If the innovative science is eventually accepted, it in turn becomes the basis of a new orthodoxy. Given time, that orthodoxy will be displaced as well, as more innovative work demonstrates new ways of acquiring knowledge. As the old orthodoxy passes out of fashion it often falls into neglect, or may become the target of criticism as vicious as that directed at new and innovative research.
I have to imagine that it was this latter phenomenon of formerly accepted scientific discourses falling out of favor and becoming the target of ridicule that inspired one of Foucault’s most famous quotes (which I have cited previously on numerous occasions): “A real science recognizes and accepts its own history without feeling attacked.” Here is the same quote with more context:
Each of my works is a part of my own biography. For one or another reason I had the occasion to feel and live those things. To take a simple example, I used to work in a psychiatric hospital in the 1950s. After having studied philosophy, I wanted to see what madness was: I had been mad enough to study reason; I was reasonable enough to study madness. I was free to move from the patients to the attendants, for I had no precise role. It was the time of the blooming of neurosurgery, the beginning of psychopharmacology, the reign of the traditional institution. At first I accepted things as necessary, but then after three months (I am slow-minded!), I asked, “What is the necessity of these things?” After three years I left the job and went to Sweden in great personal discomfort and started to write a history of these practices. Madness and Civilization was intended to be a first volume. I like to write first volumes, and I hate to write second ones. It was perceived as a psychiatricide, but it was a description from history. You know the difference between a real science and a pseudoscience? A real science recognizes and accepts its own history without feeling attacked. When you tell a psychiatrist his mental institution came from the lazar house, he becomes infuriated.
“Truth, Power, Self: An Interview with Michel Foucault,” 25 October 1982, in Martin, L. H., et al., Technologies of the Self: A Seminar with Michel Foucault, London: Tavistock, 1988, pp. 9–15
It remains true that many representatives of even the most sophisticated contemporary sciences react as though attacked when reminded of their discipline’s history. This is true not least because much of scientific history is, by contemporary standards, unsavory, which gives us reason to believe that many of our efforts today will, in the fullness of time, be consigned to the unsavory inquiries of the past, carrying with them norms, evaluations, and assumptions that are no longer considered acceptable in polite society. This is, of course, deeply ironic (I could say hypocritical if I wanted to be tendentious), since the standard of acceptability in polite society is one of the most stultifying norms imaginable.
It has long been debated within academia whether history is a science, or an art, or perhaps even a sui generis literary genre with a peculiar respect for evidence. There is no consensus on this question, and I suspect it will continue to be debated so long as the Western intellectual tradition persists. History, at least, is a recognized discipline. I know of no recognized discipline of the study of civilizations, which in part is why I recently wrote The Future Science of Civilizations.
There is, at present, no science of civilization, though there are many scientists who have written about civilization. I don’t know if there are any university departments of “Civilization Studies,” but if there aren’t, there should be. We can at least say that there is an established literary genre, partly scientific, that is concerned with the problems of civilization (including figures as diverse as Toynbee and Jared Diamond). Even among philosophers, who have a great love of writing “the philosophy of x,” there are very few works on “the philosophy of civilization” — some, yes, but not many — and, I suspect, few if any departments devoted to the philosophy of civilization. This is a regrettable ellipsis.
When, in the future, we do have a science of civilization, and perhaps also a philosophy of civilization (or, at very least, a philosophy of the science of civilization), this science will have to come to terms with its past as every science has had to (or eventually will have to). The prehistory of the science of civilization is already fairly well established, and there are several known classics of the genre. Many of these classics of the study of civilization are as thoroughly unsavory by contemporary standards as one could possibly hope. The history of pronouncements on civilization is filled with short-sighted, baldly prejudiced, privileged, ethnocentric, and thoroughly anthropocentric formulations. For all that, they still may have something of value to offer.
A technological typology of human societies that is no longer in favor is the tripartite distinction between savagery, barbarism, and civilization. This belongs to the prehistory of the science of civilization, since it establishes the natural history of civilization and its antecedents.
Edward Burnett Tylor proposed that human cultures developed through three basic stages consisting of savagery, barbarism, and civilization. The leading proponent of this savagery-barbarism-civilization scale came to be Lewis Henry Morgan, who gave a detailed exposition of it in his 1877 book Ancient Society (the entire book is conveniently available online for your reading pleasure). A quick sketch of the typology can be found at “Anthropological Theories: Cross-Cultural Analysis.”
One of the interesting features of Morgan’s elaboration of Tylor’s idea is his concern to define his stages in terms of technology. From the “lower status of savagery” with its initial use of fire, through a middle stage at which the bow and arrow is introduced, to the “upper status of savagery” which includes pottery, each stage of human development is marked by a definite technological achievement. Similarly with barbarism, which moves through the domestication of animals, irrigation, metal working, and a phonetic alphabet. This breakdown is, in its own way, more detailed than many contemporary decompositions of human social development, as well as being admirably tied to material culture and therefore amenable to confirmation and disconfirmation through archaeological research.
Today, of course, we are much too sophisticated to use terms like “savagery” or “barbarism.” These terms are now held in ill repute, as they are thought to suggest strongly negative evaluations. A friend of mine who studied anthropology told me that the word “primitive” is now referred to as “the P-word” within the discipline, so unacceptable has it become. To call a people (even an historical people now extinct) “savage” is similarly considered beyond the pale. We don’t call people “savage” or “primitive” any more. But the danger of these terminological obsessions is that we get hung up on the terms and no longer consider theories on their merits. Jane Goodall’s theoretical work was eventually accepted despite her use of proper names in ethology, and it is now not at all uncommon for researchers to name subjects that belong to other species.
Some theoreticians, moreover, have come to recognize that there are certain things that can be learned through sympathizing with one’s subject that simply cannot be learned in any other way (score one posthumously for Bergson’s conception of “intellectual sympathy”). Of course, science need not limit itself to a single paradigm of valid research. We can have a “big tent” of science with ample room for many methodologies, and hopefully also with plenty of room for disagreements.
It would be an interesting exercise to take a “dated” work like Lewis Henry Morgan’s book Ancient Society, leave the theoretical content intact, and change only the names. In fact, we could formalize Morgan’s gradations, using numbers instead of names just as Jane Goodall was urged to do. I suspect that Morgan’s work would be treated rather better in this case in comparison to the contemporary reception of its original terminology. We ought to ask ourselves why this is the case. Perhaps it is too much to hope for a “big tent” of science so capacious that it could hold Lewis Henry Morgan’s terminology alongside that of contemporary anthropology, but we have arrived at a big tent of science large enough to hold Jane Goodall’s proper names alongside tagged and numbered specimens.
. . . . .
. . . . .
. . . . .