The Computational Infrastructure of Civilization

9 April 2014

Wednesday



Technologies may be drivers of change or facilitators of change: the latter are employed by the former, serving as the tools that enable the development of the technologies in the vanguard of economic, social, and political change. A newly introduced technology can provide a competitive advantage when one business enterprise has mastered it while its competitors have not. Once a technology has been mastered by all elements of the economy, it ceases to provide a competitive advantage to any one firm and is equally possessed and employed by all. At that point of mature development, a technology also ceases to be a driver of change and becomes a facilitator of change.

Any technology that has become a part of the infrastructure may be considered a facilitator of change rather than a driver of change. Civilization requires an infrastructure; industrial-technological civilization requires an industrial-technological infrastructure. We are all familiar with infrastructure such as roads, bridges, ports, railroads, schools, and hospitals. There is also the infrastructure that we think of as “utilities” — water, sewer, electricity, telecommunications, and now computing — which we build into our built environment, retrofitting old buildings and sometimes entire older cities in order to bring them up to the standards of technology assumed by the industrialized world today.

All of the technologies that now constitute the infrastructure of industrial-technological civilization were once drivers of change. Before the industrial revolution, the building of ports and shipping united coastal communities in many regions of the world; the Romans built a network of roads and bridges; in medieval Europe, schools and hospitals became a routine part of the structure of cities; early in the industrial revolution, railroads became the first mechanized form of rapid overland transportation. Consider how the transcontinental railroad in North America and the Trans-Siberian Railway in Russia knitted together entire continents, and the role of railroads as transformative technologies becomes clear.

Similarly, the technologies we think of as utilities were once drivers of change. Hot and cold running water and indoor plumbing, still absent in much of the world, did not become common in the industrialized world until the past century; yet early agricultural and urban centers only came into being with the management of water resources, an art that reached a height in the most sophisticated cities of classical antiquity, with water supplied by aqueducts and sewage taken away by underground drainage systems superior to many in existence today. With the advent of natural gas and electricity as fuels for home and industry, industrial cities were retrofitted for these services; they have since been retrofitted again for telecommunications, and now for computers.

The most recent technology to have a transformative effect on socioeconomic life was computing. In the past several decades — since the end of the Second World War, when the first digital, programmable electronic computers were built for code breaking (the Colossus in the UK) — computer technology grew exponentially and eventually affected almost every aspect of life in industrialized nation-states. During this period, computing has been a driver of change across socioeconomic institutions. Building a faster and more sophisticated computer has been an end in itself for technologists and computer science researchers. While this will continue to be the case for some time, computing has begun to make the transition from being a driver of change in and of itself to being a facilitator of change in other areas of technological innovation. In other words, computers are becoming a part of the infrastructure of industrial-technological civilization.

The transformation of the transformative technology of computing from a driver of change into a facilitator of change for other technologies has been recognized for more than ten years. Nicholas G. Carr’s 2003 article “IT Doesn’t Matter” stirred up a significant controversy when it was published. More recently, Mark R. DeLong, in “Research computing as substrate,” calls computing a substrate rather than an infrastructure, though the idea is much the same. DeLong writes of computing: “It is a common base that supports and nurtures research work and scholarly endeavor all over the university.” Although computing is a focus of research work and scholarly endeavor in its own right, it also serves a larger supporting role, not only in the university but throughout society.

Although today we still fall far short of computational omniscience, the computer revolution has happened, as evidenced by the pervasive presence of computers in contemporary socioeconomic institutions. Computers have been rapidly integrated into the fabric of industrial-technological civilization, to the point that those of us born before the computer revolution, and who can remember a world in which computers were a negligible influence, can nevertheless only with difficulty remember what life was like without computers.

Despite, then, what technology enthusiasts tell us, computers are not going to revolutionize our world a second time. We can imagine faster computers, smaller computers, better computers, computers with more storage capacity, and computers running innovative applications that make them useful in unexpected ways, but the pervasive use of computers that has already been achieved gives us a baseline for predicting future computer capacities, and these capacities will differ in degree from those of earlier computers, but not in kind. We already know what it is like to see exponential growth in computing technology, and so we can account for it; computers have ceased to be a disruptive technology, and will not become a disruptive technology a second time.
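To make the idea of such a baseline concrete, here is a minimal sketch of this kind of extrapolation; the starting capacity and the two-year doubling period are assumptions chosen purely for illustration, not measured figures for any actual component:

```python
# Minimal sketch of baseline extrapolation under steady exponential growth.
# The base capacity (1 TB) and doubling period (2 years) are illustrative
# assumptions, not claims about any actual component or product line.

def extrapolate_capacity(base_capacity: float, doubling_years: float, years: float) -> float:
    """Project capacity forward, assuming it doubles every `doubling_years` years."""
    return base_capacity * 2 ** (years / doubling_years)

for year in (0, 2, 4, 10):
    projected = extrapolate_capacity(base_capacity=1.0, doubling_years=2.0, years=year)
    print(f"year {year:2d}: ~{projected:g} TB")
```

The point of the sketch is that a known growth curve, however steep, is a difference of degree: it can be anticipated and planned for, which is precisely what a disruptive technology cannot be.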

Recently quantum computing made the cover of TIME magazine, together with a number of hyperbolic predictions about how quantum computing will change everything (the quantum computer is called “the infinity machine”). There have likewise been countless articles about how “big data” is going to change everything. Similar claims are made for artificial intelligence, and especially for “superintelligence.” An entire worldview has been constructed — the technological singularity — in which computing remains an indefinitely disruptive technology, the development of which eventually brings about the advent of the Millennium, the latter suitably re-conceived for a technological age.

Predictions of this nature are made precisely because a technology has become widely familiar, which is almost a guarantee that the technology in question is now part of the infrastructure of the ordinary business of life. One can count on being understood when one makes predictions about the future of the computer, in the same way that one might have been understood in the late nineteenth or early twentieth century if making predictions about the future of railroads. But in so far as this familiarity marks the transition in the life of a technology from being a driver of change to being a facilitator of change, such predictions are misleading at best, and flat out wrong at worst. The technologies that are going to be drivers of change in the coming century are not those that have devolved to the level of infrastructure; they are (or will be) unfamiliar technologies that can only be understood with difficulty.

The distinction between technologies that are drivers of change and technologies that are facilitators of change (like almost all distinctions) admits of a certain ambiguity. In the present context, one of these ambiguities concerns what constitutes a computing technology. Are computing applications distinct from computing? What of technologies for which computing is indispensable, and which could not have come into being without computers? This line of thought can be pursued backward: computers could not exist without electricity, so should computers be considered anything new, or merely an extension of electrical power? And electrical power could not have come about without the steam- and fossil-fueled industry that preceded it. This regress can be pursued back to the first stone tools, and the argument can be made that nothing new has happened, in essence, since the first chipped flint blade.

Perhaps the most obvious point of dispute in this analysis is the possibility of machine consciousness. I will acknowledge without hesitation that the emergence of machine consciousness would be a potentially revolutionary development, and that it would constitute a disruptive technology. Machine consciousness, however, is frequently conflated with artificial intelligence and with superintelligence, and these must be distinguished. Artificial intelligence of a rudimentary form is already crucial to the automation of industry; machine consciousness would be the artificial production, in a machine substrate, of the kind of consciousness that we personally experience as our own identity, and which we infer to be at the basis of the actions of others (what philosophers call the problem of other minds).

What makes the possibility of machine consciousness interesting to me, and potentially revolutionary, is that it would constitute a qualitatively novel emergent from computing technology, and not merely another application of computing. Computers stand in the same relationship to electricity that machine consciousness would stand in relation to computing: a novel and transformational technology emergent from an infrastructural technology, that is to say, a driver of change that emerges from a facilitator of change.

The computational infrastructure of industrial-technological civilization is more or less in place at present, a familiar part of our world, like the early electrical grids that appeared in the industrialized world once electricity became sufficiently commonplace to be treated as a utility. Just as the electrical grid has been repeatedly upgraded, and will continue to be upgraded for the foreseeable future, so too the computational infrastructure of industrial-technological civilization will be continually upgraded. But the upgrades to our computational infrastructure will be incremental improvements, no longer major drivers of change in the economy or in sociopolitical institutions. Other technologies will emerge to take that role, and they will emerge from an infrastructure that is no longer driving socioeconomic change, but is rather the condition of the possibility of that change.
