Monday



There is a mode of fallacious argumentation, related to argumentum ad ignorantiam yet sufficiently distinct from it, that I am going to call the appeal to embargoed evidence (and which could also be called the appeal to sequestered evidence). The appeal to embargoed evidence occurs when someone makes the claim that some open question has been definitively answered, but that the evidence that settles the question is not available to public inspection. The evidence may be missing, or hidden, or suppressed — but whatever its status, it cannot be produced. One is supposed to take the speaker’s assurances on the existence and nature of the evidence.

I have personally experienced the appeal to embargoed evidence many times, as, for example, when a reader responded to my posts Of Distinctions, Weak and Strong and Of Distinctions, Principled and Otherwise with this comment:

May I recommend lunch with a scientist working in nano technology. The ‘mind body problem’ you speak of was “solved” in a lab I worked in years ago. Sadly it’s classified due to ESP Rsrch. Such musings with regard to mind, now seem like Claudius Ptolemy lecturing about his epicycles.

I responded in turn:

If it’s classified, don’t you suppose that I would have a difficult time getting anyone to talk? Also, I would insist on writing about it, which would endanger both myself and my source.

Is anyone persuaded or convinced by claims of evidence that cannot be produced? I can only conclude that the appeal to embargoed evidence must be at least occasionally effective, or I would not run across it as often as I do.

The appeal to embargoed evidence is encountered most frequently today in the form of claims of government suppression of UFOs and alien bodies, or private industry suppression of particular technologies that would adversely affect established business models (e.g., the idea of the 100 MPG car). While the appeal to embargoed evidence is most commonly encountered in discussions of conspiracy theories, it is also to be found in high culture in the form of suppressed books or manuscripts. It is not unusual to hear that a missing or hidden manuscript by some famous author has been glimpsed by an individual, who in virtue of this privileged access to otherwise inaccessible materials has a special insight into the author in question, or maintains that “everything we think we know about x is false” because the speaker is “in the know” about matters denied to the rest of us.

The conspiratorial dimension of the appeal to embargoed evidence appears when it is stated or implied that an omnipotent government entity, or even a non-governmental entity possessed of uncommon power and influence, is able to suppress all, or almost all, evidence relating to certain knowledge kept secret, whether for the good of the public (which is not ready for the knowledge, on which cf. below) or in order to more effectively act upon some comprehensive social engineering project that would presumably be derailed if only the public knew what was really going on.

A moral dimension is added to the appeal to embargoed evidence when it is stated or implied that evidence has been embargoed because you (the individual asking for evidence) are not worthy of seeing it, or, more comprehensively, that the world at large is not ready for some Earth-shattering secret to be revealed, with the implication being that only the elect are allowed to share in the evidence at present, but the world at large will eventually reach a level of maturity when the evidence can be made public without danger.

The appeal to embargoed evidence gives the appearance of respecting scientific canons of knowledge, because it recognizes that evidence is crucial to knowledge, but it represents a fundamental violation of the scientific method because evidence is invoked rather than produced. Scientific knowledge is in principle reproducible by anyone who has the time and cares to make the effort to confirm experimental results for their own satisfaction. Since embargoed evidence cannot be inspected, tested, or made the object of any scientific experimentation, no putative knowledge or proposed theory solely based on embargoed evidence can be considered scientific.

. . . . .

Fallacies and Cognitive Biases

An unnamed principle and an unnamed fallacy

The Truncation Principle

An Illustration of the Truncation Principle

The Gangster’s Fallacy

The Prescriptive Fallacy

The fallacy of state-like expectations

The Moral Horror Fallacy

The Finality Fallacy

Fallacies: Past, Present, Future

Confirmation Bias and Evolutionary Psychology

Metaphysical Fallacies

Metaphysical Biases

Pernicious Metaphysics

Metaphysical Fallacies Again

An Inquiry into Cognitive Bias by way of Memoir

. . . . .

Grand Strategy Annex

. . . . .

Thursday



Scientific civilization changes when scientific knowledge changes, and scientific knowledge changes continuously. Science is a process, and that means that scientific civilization is based on a process, a method. Science is not a set of truths to which one might assent, or from which one might withhold one’s assent. It is rather the scientific method that is central to science, and not any scientific doctrine. Theories will evolve and knowledge will change as the scientific method is pursued, and the method itself will be refined and improved, but method will remain at the heart of science.

Pre-scientific civilization was predicated on a profoundly different conception of knowledge: the idea that truth is to be found at the source of being, the fons et origo of the world (as I discussed in my last post, The Metaphysics of the Bureaucratic Nation-State). Knowledge here consists of delineating the truth of the world prior to its later historical accretions, which are to be stripped away to the extent possible. More experience of the world only further removes us from the original source of the world. The proper method of arriving at knowledge is either through the study of the original revelation of the original truth, or through direct communion with the source and origin of being, which remains unchanged to this day (according to the doctrine of divine impassibility).

The central conceit of agrarian-ecclesiastical civilization, that it rested upon revealed eternal verities, has been so completely overturned that its successor civilization, industrial-technological civilization, recognizes no eternal verities at all. Even the scientific method, which drives the progress of science, is continually being revised and refined. As Marx put it in the Communist Manifesto: “All fixed, fast-frozen relations, with their train of ancient and venerable prejudices and opinions, are swept away, all new-formed ones become antiquated before they can ossify. All that is solid melts into air…”

Scientific civilization always looks forward to the next development in science that will resolve our present perplexities, but this comes at the cost of posing new questions that further put off the definitive formulation of scientific truth, which remains perpetually incomplete even as it expands and becomes more comprehensive.

This has been recently expressed by Kevin Kelly in an interview:

“Every time we use science to try to answer a question, to give us some insight, invariably that insight or answer provokes two or three other new questions. Anybody who works in science knows that they’re constantly finding out new things that they don’t know. It increases their ignorance, and so in a certain sense, while science is certainly increasing knowledge, it’s actually increasing our ignorance even faster. So you could say that the chief effect of science is the expansion of ignorance.”

The Technium: A Conversation with Kevin Kelly [02.03.2014]

Scientific civilization, then, is not based on a naïve belief in progress, as is often alleged, but rather embodies an idea of progress that is securely founded in the very nature of scientific knowledge. There is nothing naïve in the scientific conception of knowledge; on the contrary, the scientific conception of knowledge had a long and painfully slow gestation in western civilization, and it is rather the paradigm that science supplants, the theological conception of knowledge (according to which all relevant truths are known from the outset, and are never subject to change), that is the naïve conception of knowledge, sustainable only in the infancy of civilization.

We are coming to understand that our own civilization, while not yet mature, is a civilization that has developed beyond its infancy to the degree that the ideas and institutions of infantile civilization are no longer viable, and if we attempt to preserve these ideas and institutions beyond their natural span, the result may be catastrophic for us. And so we have come to the point of conceptualizing our civilization in terms of existential risk, which is a thoroughly naturalistic way of thinking about the fate and future of humanity, and is amenable to scientific treatment.

It would be misleading to attribute our passing beyond the infancy of civilization to the advent of the particular civilization we have today, industrial-technological civilization. Even without the industrial revolution, scientific civilization would likely have gradually come to maturity, in some form or another, as the scientific revolution dates to that period of history that could be called modern civilization in the narrow sense — what I have called Modernism without Industrialism. And here by “maturity” I do not mean that science is exhausted and can produce no new scientific knowledge, but that we become reflexively aware of what we are doing when we do science. That is to say, scientific maturity is when we know ourselves to be engaged in science. In so far as “we” in this context means scientists, this was probably largely true by the time of the industrial revolution; in so far as “we” means mass man of industrial-technological civilization, it is not yet true today.

The way in which science enters into industrial-technological civilization — i.e., by way of spurring forward the open loop of industrial-technological civilization — means that science has been incorporated as an integral part of the civilization that immediately and disruptively followed the scientific civilization of modernism without industrialism (according to the Preemption Hypothesis). While the industrial revolution disrupted and preempted almost every aspect of the civilization that preceded it, it did not disrupt or preempt science, but rather gave a new urgency to science.

In several posts I have speculated on possible counterfactual civilizations (according to the counterfactuals implicit in naturalism), that is to say, forms of civilization that were possible but which were not actualized in history. One counterfactual civilization might have been agrarian-ecclesiastical civilization undisrupted by the scientific or industrial revolutions. Another counterfactual civilization might have been modern civilization in the narrow sense (i.e., Modernism without Industrialism) coming to maturity without being disrupted and preempted by the industrial revolution. It now occurs to me that yet another counterfactual form of civilization could have been that of industrialization without the scientific conception of knowledge or the systematic application of science to industry.

How could this work? Is it even possible? Perhaps not, and certainly not in the long term, or with high technology, which cannot exist without substantial scientific understanding. But the simple expedient of powered machinery might have come about by the effort of tinkerers, as did much of the industrial revolution as it happened. If we look at the halting and inconsistent efforts in the ancient world to produce large-scale industries, we get something of this idea, and this we could call industrialism without modernity. Science was not yet at the point at which it could be very helpful in the design of machinery; none of the sciences were yet mathematicized. And yet some large industrial enterprises were built, though few in number. It seems likely that it was not the lack of science that limited industrialization in classical antiquity, but the slave labor economy, which made labor-saving devices pointless.

There are, today, many possibilities for the future of civilization. Technically, these are future contingents (like Aristotle’s sea battle tomorrow), and as history unfolds one of these contingencies will be realized while the others become counterfactuals or are put off yet further. And in so far as there is a finite window of opportunity for a particular future contingent to come into being, beyond that window all unactualized contingents become counterfactuals.

. . . . .

I have written more on the nature of scientific civilization in…

David Hume and Scientific Civilization …and…

The Relevance of Philosophy of Science to Scientific Civilization

. . . . .

Grand Strategy Annex

. . . . .

The Retrodiction Wall

23 October 2013

Wednesday



Prediction in Science

One of the distinguishing features of science as a system of thought is that it makes testable predictions. The fact that scientific predictions are testable suggests a methodology of testing, and we call the scientific methodology of testing experiment. Hypothesis formation, prediction, experimentation, and resultant modification of the hypothesis (confirmation, disconfirmation, or revision) are all essential elements of the scientific method, which constitutes an escalating spiral of knowledge as the scientific method systematically exposes predictions to experiment and modifies its hypotheses in the light of experimental results, which leads in turn to new predictions.
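The escalating spiral just described can be put in schematic form. Here is a minimal toy sketch of the predict-test-revise cycle; the numerical "law," the idealized experiment, and the halfway revision rule are all hypothetical illustrations of my own, not anything drawn from the scientific literature:

```python
TRUE_VALUE = 9.8  # a toy "law of nature" that our experiments probe

def experiment(prediction):
    # an idealized, noise-free measurement of the prediction's error
    return TRUE_VALUE - prediction

def revise(hypothesis, error):
    # move the hypothesis halfway toward what the experiment showed
    return hypothesis + error / 2

def escalating_spiral(hypothesis, rounds):
    """Predict, expose the prediction to experiment, and revise the
    hypothesis in light of the result; each pass through the loop
    yields a new and better-confirmed hypothesis."""
    for _ in range(rounds):
        error = experiment(hypothesis)   # the prediction is tested
        hypothesis = revise(hypothesis, error)
    return hypothesis

print(round(escalating_spiral(0.0, 20), 3))  # converges toward 9.8
```

Each pass leaves the hypothesis closer to the toy law without ever reaching it exactly, which mirrors the way the method spirals toward a formulation of scientific truth that remains perpetually incomplete.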

The escalating spiral of knowledge that science cultivates naturally pushes that knowledge into the future. Sometimes scientific prediction is even formulated in reference to “new facts” or “temporal asymmetries” in order to emphasize that predictions refer to future events that have not yet occurred. In constructing an experiment, we create a new set of facts in the world, and then interpret these facts in the light of our hypothesis. It is this testing of hypotheses by experiment that establishes the concrete relationship of science to the world, and this is also a source of limitation, for experiments are typically designed in order to focus on a single variable and to that end an attempt is made to control for the other variables. (A system of thought that is not limited by the world is not science.)

Alfred North Whitehead captured this artificial feature of scientific experimentation in a clever line that points to the difference between scientific predictions and predictions of a more general character:

“…experiment is nothing else than a mode of cooking the facts for the sake of exemplifying the law. Unfortunately the facts of history, even those of private individual history, are on too large a scale. They surge forward beyond control.”

Alfred North Whitehead, Adventures of Ideas, New York: The Free Press, 1967, Chapter VI, “Foresight,” p. 88

There are limits to prediction, and not only those pointed out by Whitehead. The limits to prediction have been called the prediction wall. Beyond the prediction wall we cannot penetrate.


The Prediction Wall

John Smart has formulated the idea of a prediction wall in his essay, “Considering the Singularity,” as follows:

With increasing anxiety, many of our best thinkers have seen a looming “Prediction Wall” emerge in recent decades. There is a growing inability of human minds to credibly imagine our onrushing future, a future that must apparently include greater-than-human technological sophistication and intelligence. At the same time, we now admit to living in a present populated by growing numbers of interconnected technological systems that no one human being understands. We have awakened to find ourselves in a world of complex and yet amazingly stable technological systems, erected like vast beehives, systems tended to by large swarms of only partially aware human beings, each of which has only a very limited conceptualization of the new technological environment that we have constructed.

Business leaders face the prediction wall acutely in technologically dependent fields (and what enterprise isn’t technologically dependent these days?), where the ten-year business plans of the 1950’s have been replaced with ten-week (quarterly) plans of the 2000’s, and where planning beyond two years in some fields may often be unwise speculation. But perhaps most astonishingly, we are coming to realize that even our traditional seers, the authors of speculative fiction, have failed us in recent decades. In “Science Fiction Without the Future,” 2001, Judith Berman notes that the vast majority of current efforts in this genre have abandoned both foresighted technological critique and any realistic attempt to portray the hyper-accelerated technological world of fifty years hence. It’s as if many of our best minds are giving up and turning to nostalgia as they see the wall of their own conceptualizing limitations rising before them.

Considering the Singularity: A Coming World of Autonomous Intelligence (A.I.) © 2003 by John Smart (This article may be reproduced for noncommercial purposes if it is copied in its entirety, including this notice.)

I would suggest that there are at least two prediction walls: synchronic and diachronic. The prediction wall formulated above by John Smart is a diachronic prediction wall: it is the onward-rushing pace of events, one following the other, that eventually defeats our ability to see any recognizable order or structure of the future. The kind of prediction wall to which Whitehead alludes is a synchronic prediction wall, in which it is the outward eddies of events in the complexity of the world’s interactions that make it impossible for us to give a complete account of the consequences of any one action. (Cf. Axes of Historiography)


Retrodiction and the Historical Sciences

Science does not live by prediction alone. While some philosophers of science have questioned the scientificity of the historical sciences because they could not make testable (and therefore falsifiable) predictions about the future, it is now widely recognized that the historical sciences don’t make predictions, but they do make retrodictions. A retrodiction is a prediction about the past.

The Oxford Dictionary of Philosophy by Simon Blackburn (p. 330) defines retrodiction thus:

retrodiction The hypothesis that some event happened in the past, as opposed to the prediction that an event will happen in the future. A successful retrodiction could confirm a theory as much as a successful prediction.

I previously wrote about retrodiction in historical sciences, Of What Use is Philosophy of History in Our Time?, The Puppet Always Wins, and Futurism without predictions.

As with predictions, there is also a limit to retrodiction, and this is the retrodiction wall. Beyond the retrodiction wall we cannot penetrate.

I haven’t been thinking about this idea for long enough to fully understand the ramifications of a retrodiction wall, so I’m not yet clear about whether we can distinguish diachronic retrodiction from synchronic retrodiction. Or, rather, it would be better to say that the distinction can certainly be made, but that I cannot think of good contrasting examples of the two at the present time.


Effective History

We can define a span of accessible history that extends from the retrodiction wall in the past to the prediction wall in the future as what I will call effective history (by analogy with effective computability). Effective history can be defined in a way that is closely parallel to effectively computable functions, because all of effective history can be “reached” from the present by means of finite, recursive historical methods of inquiry.
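The analogy with effective computability can be made concrete in a small sketch. In the toy model below (the event names and "evidence links" are hypothetical illustrations of my own, not anything from the text), effective history is the set of events reachable from the present by finitely many recursive steps of inference over surviving evidence; an event from which no evidence trail reaches the present lies beyond a wall:

```python
from collections import deque

def effective_history(present, evidence_links):
    """Return the set of events reachable from `present` by finitely
    many applications of historical inference (a document, a stratum,
    an observable remnant). The recursion is finite, so the search
    terminates, just as an effectively computable function halts."""
    reached = {present}
    frontier = deque([present])
    while frontier:
        event = frontier.popleft()
        for prior in evidence_links.get(event, ()):
            if prior not in reached:
                reached.add(prior)
                frontier.append(prior)
    return reached

# Toy timeline: the Big Bang leaves no evidence trail to anything
# before it, so "pre_big_bang" is beyond the retrodiction wall.
links = {
    "present": ["industrial_era", "big_bang"],
    "industrial_era": ["agrarian_era"],
    "agrarian_era": ["prehistory"],
    "big_bang": [],       # retrodiction wall: no links further back
    "pre_big_bang": [],   # exists in the model, but unreachable
}

print(sorted(effective_history("present", links)))
# "pre_big_bang" is absent: it lies beyond the retrodiction wall
```

Note that effective history in this sketch is a function of the links, not of the events themselves: add or discover a new evidence trail and the reachable span expands, which is just the sense in which effective history expands and contracts with our knowledge.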

Effective history is not fixed for all time, but expands and contracts as a function of our knowledge. At present, the retrodiction wall is the Big Bang singularity. If anything preceded the Big Bang singularity we are unable to observe it, because the Big Bang itself effectively obliterates any observable signs of any events prior to itself. (Testable theories have been proposed that suggest the possibility of some observable remnant of events prior to the Big Bang, as in conformal cyclic cosmology, but this must at present be regarded as only an early attempt at such a theory.)

Prior to the advent of scientific historiography as we know it today, the retrodiction wall was fixed at the beginning of the historical period narrowly construed as written history, and at times the retrodiction wall has been quite close to the present. When history experiences one of its periodic dark ages that cuts it off from its historical past, little or nothing may be known of a past that was once familiar to everyone in a given society.

The emergence of agrarian-ecclesiastical civilization effectively obliterated human history before itself, in a manner parallel to the Big Bang. We know that there were caves that prehistorical peoples visited generation after generation for time out of mind, over tens of thousands of years — much longer than the entire history of agrarian-ecclesiastical civilization, and yet all of this was forgotten as though it had never happened. This long period of prehistory was entirely lost to human memory, and was not recovered again until scientific historiography discovered it through scientific method and empirical evidence, and not through the preservation of human memory, from which prehistory had been eradicated. And this did not occur until after agrarian-ecclesiastical civilization had lapsed and entirely given way to industrial-technological civilization.

We cannot define the limits of the prediction wall as readily as we can define the limits of the retrodiction wall. Predicting the future in terms of overall history has been more problematic than retrodicting the past, and equally subject to ideological and eschatological distortion. The advent of modern science compartmentalized scientific predictions and made them accurate and dependable — but at the cost of largely severing them from overall history, i.e., human history and the events that shape our lives in meaningful ways. We can make predictions about the carbon cycle and plate tectonics, and we are working hard to be able to make accurate predictions about weather and climate, but, for the most part, our accurate predictions about the future dispositions of the continents do not shape our lives in the near- to mid-term future.

I have previously quoted a famous line from Einstein: “As far as the laws of mathematics refer to reality, they are not certain; and as far as they are certain, they do not refer to reality.” We might paraphrase this Einstein line in regard to the relation of mathematics to the world, and say that as far as scientific laws of nature predict events, these events are irrelevant to human history, and in so far as predicted events are relevant to human beings, scientific laws of nature cannot predict them.


Singularities Past and Future

As the term “singularity” is presently employed — as in the technological singularity — the recognition of a retrodiction wall in the past complementary to the prediction wall in the future provides a literal connection between the historiographical use of “singularity” and the use of the term “singularity” in cosmology and astrophysics.

Theorists of the singularity hypothesis place a “singularity” in the future which constitutes an absolute prediction wall beyond which history is so transformed that nothing beyond it is recognizable to us. This future singularity is not the singularity of astrophysics.

If we recognize the actual Big Bang singularity in the past as the retrodiction wall for cosmology — and hence, by extension, for Big History — then an actual singularity of astrophysics is also at the same time an historical singularity.

. . . . .

I have continued my thoughts on the retrodiction wall in Addendum on the Retrodiction Wall.

. . . . .

Grand Strategy Annex

. . . . .

Saturday



One of the greatest contributions to science in the twentieth century was Jane Goodall’s study of chimpanzees in the wild at Gombe, Tanzania. Although Goodall’s work represents a major advance in ethology, it did not come without criticism. Here is how Adrian G. Weiss described some of this criticism:

Jane received her Ph.D. from Cambridge University in 1965. She is one of only eight other people to earn a Ph.D. without a bachelor’s (Montgomery 1991). Her adviser, Robert Hinde, said her methods were not professional, and that she was doing her research wrong. Jane’s major mistake was naming her “subjects”. The animals should be given numbers. Jane also used descriptive, narrative writing in her observations and calculations. She anthropomorphized her animals. Her colleagues and classmates thought she was “doing all wrong”. Robert Hinde did approve her thesis, even though she returned with all of his corrections with the original names and anthropomorphizing.

Most innovative science breaks the established rules of the time. If innovative science is accepted, it eventually becomes the basis of a new orthodoxy. Given time, that orthodoxy will be displaced as well, as more innovative work demonstrates new ways of acquiring knowledge. As the old orthodoxy passes out of fashion, it often either falls into neglect or becomes the target of criticism as vicious as that directed at new and innovative research.


I have to imagine that it was this latter phenomenon of formerly accepted scientific discourses falling out of favor and becoming the target of ridicule that inspired one of Foucault’s most famous quotes (which I have cited previously on numerous occasions): “A real science recognizes and accepts its own history without feeling attacked.” Here is the same quote with more context:

Each of my works is a part of my own biography. For one or another reason I had the occasion to feel and live those things. To take a simple example, I used to work in a psychiatric hospital in the 1950s. After having studied philosophy, I wanted to see what madness was: I had been mad enough to study reason; I was reasonable enough to study madness. I was free to move from the patients to the attendants, for I had no precise role. It was the time of the blooming of neurosurgery, the beginning of psychopharmacology, the reign of the traditional institution. At first I accepted things as necessary, but then after three months (I am slow-minded!), I asked, “What is the necessity of these things?” After three years I left the job and went to Sweden in great personal discomfort and started to write a history of these practices. Madness and Civilization was intended to be a first volume. I like to write first volumes, and I hate to write second ones. It was perceived as a psychiatricide, but it was a description from history. You know the difference between a real science and a pseudoscience? A real science recognizes and accepts its own history without feeling attacked. When you tell a psychiatrist his mental institution came from the lazar house, he becomes infuriated.

Truth, Power, Self: An Interview with Michel Foucault — October 25th, 1982, Martin, L. H. et al (1988) Technologies of the Self: A Seminar with Michel Foucault, London: Tavistock. pp.9-15

It remains true that many representatives of even the most sophisticated contemporary sciences react as though attacked when reminded of their discipline’s history. This is true not least because much of science has an unsavory history — at least, by contemporary standards, a lot of scientific history is unsavory, and this gives us reason to believe that many of our efforts today will, in the fullness of time, be consigned to the unsavory inquiries of the past which carry with them norms, evaluations, and assumptions that are no longer considered to be acceptable in polite society. This is, of course, deeply ironic (I could say hypocritical if I wanted to be tendentious) since the standard of acceptability in polite society is one of the most stultifying norms imaginable.


It has long been debated within academia whether history is a science, or an art, or perhaps even a sui generis literary genre with a peculiar respect for evidence. There is no consensus on this question, and I suspect it will continue to be debated so long as the Western intellectual tradition persists. History, at least, is a recognized discipline. I know of no recognized discipline of the study of civilizations, which in part is why I recently wrote The Future Science of Civilizations.

There is, at present, no science of civilization, though there are many scientists who have written about civilization. I don’t know if there are any university departments on “Civilization Studies,” but if there aren’t, there should be. We can at least say that there is an established literary genre, partly scientific, that is concerned with the problems of civilization (including figures as diverse as Toynbee and Jared Diamond). Even among philosophers, who have a great love of writing, “The philosophy of x,” there are very few works on “the philosophy of civilization” — some, yes, but not many — and, I suspect, few if any departments devoted to the philosophy of civilization. This is a regrettable ellipsis.

When, in the future, we do have a science of civilization, and perhaps also a philosophy of civilization (or, at very least, a philosophy of the science of civilization), this science will have to come to terms with its past as every science has had to (or eventually will have to). The prehistory of the science of civilization is already fairly well established, and there are several known classics of the genre. Many of these classics of the study of civilization are as thoroughly unsavory by contemporary standards as one could possibly hope. The history of pronouncements on civilization is filled with short-sighted, baldly prejudiced, privileged, ethnocentric, and thoroughly anthropocentric formulations. For all that, they still may have something of value to offer.

A technological typology of human societies that is no longer in favor is the tripartite distinction between savagery, barbarism, and civilization. This belongs to the prehistory of the prehistory of civilization, since it establishes the natural history of civilization and its antecedents.

Edward Burnett Tylor proposed that human cultures developed through three basic stages consisting of savagery, barbarism, and civilization. The leading proponent of this savagery-barbarism-civilization scale came to be Lewis Henry Morgan, who gave a detailed exposition of it in his 1877 book Ancient Society (the entire book is conveniently available online for your reading pleasure). A quick sketch of the typology can be found at ANTHROPOLOGICAL THEORIES: Cross-Cultural Analysis.

One of the interesting features of Morgan’s elaboration of Tylor’s idea is his concern to define his stages in terms of technology. From the “lower status of savagery” with its initial use of fire, through a middle stage at which the bow and arrow is introduced, to the “upper status of savagery” which includes pottery, each stage of human development is marked by a definite technological achievement. Similarly with barbarism, which moves through the domestication of animals, irrigation, metal working, and a phonetic alphabet. This breakdown is, in its own way, more detailed than many contemporary decompositions of human social development, as well as being admirably tied to material culture and therefore amenable to confirmation and disconfirmation through archaeological research.

Today, of course, we are much too sophisticated to use terms like “savagery” or “barbarism.” These terms are now held in ill repute, as they are thought to carry strongly negative evaluations. A friend of mine who studied anthropology told me that the word “primitive” is now referred to as “the P-word” within the discipline, so unacceptable has it become. To call a people (even an historical people now extinct) “savage” is similarly considered beyond the pale. We don’t call people “savage” or “primitive” any more. But the danger of these terminological obsessions is that we get hung up on the terms and no longer consider theories on their theoretical merits. Jane Goodall’s theoretical work was eventually accepted despite her use of proper names in ethology, and it is now not at all uncommon for researchers to give names to subjects belonging to other species.

Some theoreticians, moreover, have come to recognize that there are certain things that can be learned through sympathizing with one’s subject that simply cannot be learned in any other way (score one posthumously for Bergson’s conception of “intellectual sympathy”). Of course, science need not limit itself to a single paradigm of valid research. We can have a “big tent” of science with ample room for many methodologies, and hopefully also with plenty of room for disagreements.

It would be an interesting exercise to take a “dated” work like Lewis Henry Morgan’s book Ancient Society, leave the theoretical content intact, and change only the names. In fact, we could formalize Morgan’s gradations, using numbers instead of names just as Jane Goodall was urged to do. I suspect that Morgan’s work would be received rather better in this form than it has been under its original terminology. We ought to ask ourselves why this is the case. Perhaps it is too much to hope for a “big tent” of science so capacious that it could hold Lewis Henry Morgan’s terminology alongside that of contemporary anthropology, but we have at least arrived at a big tent of science large enough to hold Jane Goodall’s proper names alongside tagged and numbered specimens.
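As a minimal sketch of what this renaming exercise might look like: the stage names and technological markers below follow the summary of Morgan’s scheme given above, while the numeric coding itself is purely an illustrative assumption, not Morgan’s own.

```python
# Morgan's gradations recast as numbered stages, keeping the
# technological markers intact. Stage names and markers follow the
# summary of Ancient Society given in this post; the numeric coding
# is a hypothetical scheme for illustration only.

MORGAN_STAGES = {
    1: ("Lower Savagery", "use of fire"),
    2: ("Middle Savagery", "bow and arrow"),
    3: ("Upper Savagery", "pottery"),
    4: ("Lower Barbarism", "domestication of animals"),
    5: ("Middle Barbarism", "irrigation"),
    6: ("Upper Barbarism", "metal working"),
    7: ("Civilization", "phonetic alphabet"),
}

def stage_label(n: int, numeric: bool = True) -> str:
    """Label a stage either by a neutral number or by Morgan's name."""
    name, marker = MORGAN_STAGES[n]
    return f"Stage {n} ({marker})" if numeric else f"{name} ({marker})"

# The numeric labels carry the same theoretical content without the
# contested terminology:
print(stage_label(3))                 # Stage 3 (pottery)
print(stage_label(3, numeric=False))  # Upper Savagery (pottery)
```

The point of the exercise is that nothing theoretical is lost in the translation: the ordering and the material-culture markers, which do the explanatory work, survive the change of names unchanged.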

. . . . .


. . . . .

Grand Strategy Annex

. . . . .

Sunday


In several earlier posts I have made a trial of distinct definitions of naturalism. These posts include:

A Formulation of Naturalism
Two Thoughts on Naturalism
Naturalism: Yet Another Formulation, and
Naturalism and Object Oriented Ontology

I regard all of these formulations as tentative, but there may be something to be learned from them if we employ them as a kind of experiment in understanding methodological naturalism. That is to say, each of these attempts to formulate naturalism implies a formulation of methodological naturalism. Furthermore, in so far as methodological naturalism is definitive of contemporary science, each formulation of methodological naturalism implies a distinct conception of science.

In A Formulation of Naturalism I suggested that, “Naturalism is on a par with materialism, and philosophically is to be treated as far as possible like materialism.”

In Two Thoughts on Naturalism I suggested that “Naturalism is on a par with mechanism, and philosophically is to be treated as far as possible like mechanism.” I also suggested that, “Naturalism entails that all ideas will first be manifest in embodied form… there are no abstract ideas that are given to us as abstract ideas; all ideas are ultimately derived from experience.”

In Naturalism: Yet Another Formulation I noted that these earlier efforts at formulations of naturalism are implicitly parsimonious, tending toward conceptual minimalism, and further suggested that, “we can characterize naturalism in terms of a quantitative parsimony, following quantitative formulations as far as they will go, and only appealing to qualitative formulations when quantitative formulations break down.” There is a sense, then, in which we can speak of deflationary naturalism. In so far as these formulations of naturalism embody the principle of parsimony, we need not separately formulate the principle of parsimony as a regulative norm of science.

In Naturalism and Object Oriented Ontology I suggested that an approach to naturalism might be made by way of object oriented ontology, which I there compared to Colin McGinn’s transcendental naturalism thesis, i.e., that the world is “flatly natural” though we are unable to see this for what it is because of our perceptual and cognitive limitations.

When I first formulated naturalism such that, “Naturalism is on a par with materialism, and philosophically is to be treated as far as possible like materialism,” I intended naturalism as having a more comprehensive scope than materialism; applied to the scientific method, however, I see that it can be taken as a doctrine of limiting one’s scope to the problem at hand. This approach to science is as familiar as Newton’s aphorism, Hypotheses non fingo. Science often proceeds by providing a very limited explanation for a very limited range of phenomena. This leaves many explanatory gaps, but the iteration of the scientific method means that subsequent scientists return to the gaps time and again, and when they do so they do so from the perspective of the success of the earlier explanation of surrounding phenomena. Once a species of explanation becomes generally received as valid, the later extension of this species of explanation (perhaps already considered radical in its initial formulation) becomes more acceptable, and more explanatory power can be derived from the explanation.

Similar considerations to those above hold for the same formulation in terms of mechanism rather than materialism, or in terms of quantification rather than materialism. Initial formulations of mechanism (or quantification) can be crude, seeming to apply only to macroscopic features, and may appear impossibly awkward as explanations of the fine-grained features of the world. As mechanistic explanation becomes more refined and flexible, the idea of its application to more delicate matters appears less problematic.

An object-oriented ontological account of naturalism would be the most difficult to formulate, and would take us the farthest from methodological concerns and the deepest into ontological concerns, so I will not pursue it at present (as I write this I can feel that my mind is not up to the task at the moment); I mention it here only as a viable possibility.

In any case, our formulations of methodological naturalism based on these formulations of naturalism would run something like this:

Methodological materialism pursued as far as possible, leaving any non-material account aside

Methodological mechanism pursued as far as possible, leaving any non-mechanistic account aside

Methodological quantification pursued as far as possible, leaving any qualitative account aside

Methodological flat naturalism, or transcendental naturalism, pursued as far as possible, leaving any non-flat or non-transcendental account aside

I think that all of these approaches do, in fact, closely describe the methodology of the scientific method, especially when, as I mentioned above, it is considered from the perspective of the growth of knowledge through the iteration of the scientific method.

The growth of knowledge through the iteration of the scientific method is a formulation of the historicity of scientific knowledge in terms of the future of that knowledge. The formulation of the historicity of scientific knowledge in terms of the past is nothing other than that embodied in the Foucault quote that, “A real science recognizes and accepts its own history without feeling attacked.” (from “Truth, Power, Self: An Interview with Michel Foucault”)

All present scientific knowledge will eventually become past scientific knowledge, and it will become past knowledge through the continued pursuit of the scientific method, which is to say, methodological naturalism in some form or another.

The distant future of scientific knowledge, if only we had access to it, would seem as unlikely and as improbable as the distant past of scientific knowledge, but the past, present, and future of scientific knowledge are all connected in a continuum of iterated method.

It is ultimately the task of philosophy to see scientific knowledge whole, and to this end we must see the whole temporal continuum as the expression of science, and not any one, single point on the continuum as definitive of science. The unity of science, then, is the unity of the scientific method that is the connective tissue between these diverse epochs of science, past, present, and future.

. . . . .


. . . . .

Grand Strategy Annex

. . . . .
