Quantum Model Drift

7 September 2012

Friday


Some time ago in The Pleasures of Model Drift I discussed how contemporary cosmology is challenged by the accelerating expansion of the universe, and that there are as yet no really good explanations of this in terms of the received cosmological models. The resulting state of cosmological theory is called model drift; the term is Kuhnian. Almost exactly a year ago, when it was reported that some neutrinos may have traveled faster than light, it looked as though we might also have to face model drift in particle physics. Since those results have not been replicated, the standard model not only continues to stand, but has recently been fortified by the announcement of the discovery (sort of discovery) of the Higgs boson.

But theoretical physics isn’t over yet. Some time ago in The limits of my language are the limits of my world I took up Wittgenstein’s famous aphorism from the perspective of recent work in particle physics that had “bent” the rules of quantum theory. Further work by at least one member of the same scientific team, Aephraim M. Steinberg of the Centre for Quantum Information and Quantum Control and the Institute for Optical Sciences, Department of Physics, University of Toronto, has continued this line of research, and it has been reported by the same BBC science and technology reporter, Jason Palmer, who wrote up the earlier results (cf. Quantum test pricks uncertainty). This story covers research reported in Physical Review Letters, “Violation of Heisenberg’s Measurement-Disturbance Relationship by Weak Measurements.”

The abstract of this most recent research reads as follows:

While there is a rigorously proven relationship about uncertainties intrinsic to any quantum system, often referred to as “Heisenberg’s uncertainty principle,” Heisenberg originally formulated his ideas in terms of a relationship between the precision of a measurement and the disturbance it must create. Although this latter relationship is not rigorously proven, it is commonly believed (and taught) as an aspect of the broader uncertainty principle. Here, we experimentally observe a violation of Heisenberg’s “measurement-disturbance relationship”, using weak measurements to characterize a quantum system before and after it interacts with a measurement apparatus. Our experiment implements a 2010 proposal of Lund and Wiseman to confirm a revised measurement-disturbance relationship derived by Ozawa in 2003. Its results have broad implications for the foundations of quantum mechanics and for practical issues in quantum measurement.

Experimentalists are chipping away at Heisenberg’s Uncertainty Principle. They aren’t presenting their research as something especially radical — one might even think of this recent work as an instantiation of radical theories, modest formulations — but this is theoretically and even philosophically quite important.
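To make concrete what is being chipped away at, it may help to see the two relations at issue spelled out. In Ozawa’s notation (my gloss, not part of the original announcement), with ε(A) the error of a measurement of A and η(B) the disturbance that measurement imparts to B, the relation Heisenberg originally envisioned, and the revised relation the experiment confirms, can be sketched as:

```latex
% Heisenberg's original measurement-disturbance relationship (never rigorously proven):
\epsilon(A)\,\eta(B) \;\geq\; \tfrac{1}{2}\,\bigl|\langle [A,B] \rangle\bigr|

% Ozawa's 2003 revision, which the Toronto experiment supports
% (\sigma denotes the ordinary quantum standard deviation):
\epsilon(A)\,\eta(B) \;+\; \epsilon(A)\,\sigma(B) \;+\; \sigma(A)\,\eta(B)
  \;\geq\; \tfrac{1}{2}\,\bigl|\langle [A,B] \rangle\bigr|
```

The extra terms in Ozawa’s inequality allow the error-disturbance product ε(A)η(B) by itself to dip below the old Heisenberg bound, which is exactly the “violation” observed; the rigorously proven uncertainty relation for the intrinsic spreads, σ(A)σ(B) ≥ ½|⟨[A,B]⟩|, is untouched.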

We recall that despite himself making crucial early contributions to quantum theory, Einstein eventually came to reject quantum theory, offering searching and subtle critiques of it. In his time Einstein was isolated among physicists for rejecting quantum theory at the very moment of its greatest triumphs. Quantum theory has gone on to become one of the most verified theories — and verified to the most exacting standards — in the history of physics, notwithstanding Einstein’s criticisms. Einstein primarily fell out with quantum theory over the notion of quantum entanglement, though Einstein, himself a staunch determinist, was also greatly troubled by Heisenberg’s uncertainty principle. Many, perhaps including Einstein himself, conflated physical determinism with scientific realism, so that a denial of determinism came to be associated with a rejection of realism. Heisenberg’s uncertainty principle is Exhibit “A” when it comes to the denial of determinism. So I think that if Einstein had lived to see this most recent work, he would have been fascinated by it and intrigued by its implications for the uncertainty principle, and indeed by its philosophical implications for physics.

Einstein was a uniquely philosophical physicist — the very antithesis of what recent physics has become, and which I have called Fashionable Anti-Philosophy (and which I elaborated in Further Fashionable Anti-Philosophy). From his earliest years, Einstein carefully studied philosophical works. He is said to have read Kant’s three critiques in his early teens. And Einstein’s rejection of quantum theory, which he modestly and humorously characterized by saying that something in his little finger told him it couldn’t be right, was a philosophical rejection of quantum theory.

The recent research into Heisenberg’s uncertainty principle is not couched in philosophical terms, but it is philosophically significant. The very fact that this research is going on suggests that others, not only Einstein, have been dissatisfied with the uncertainty principle as it is usually interpreted, and that scientists have continued to think critically about it even as it has been taught for decades as orthodox physics. This is a perfect example of what I have called Science Behind the Scenes.

The uncertainty of quantum theory, given formal expression in Heisenberg’s uncertainty principle, came to be interpreted not only epistemically, as placing limits on what we can know, but it was also interpreted ontologically, as placing limits on the constituents of the world. In so far as Heisenberg encouraged an ontological interpretation of the uncertainty principle, which I believe to be the case, he was advancing an underdetermined theory, i.e., an ontological interpretation of the uncertainty principle goes beyond — I think it goes far beyond — the epistemic uncertainty that we must posit in order to do quantum theory.

It seems to me that it is pretty easy to interpret the recent research cited above as questioning the ontological interpretation of the uncertainty principle while leaving an epistemic interpretation untouched. The limits of human knowledge are often poignantly brought home to us in our daily lives in a thousand ways, but we need not make the unnecessary leap from limitations on human knowledge to limitations on the world. On the other hand, we also need not make any connection between realism and determinism. It is entirely consistent (even if it seems odd to some of us) that there should be an objective world existing apart from human experience of and knowledge of that world, and that that objective world should not be deterministic. It may well be essentially random, and only stochastically describable, when a given radioactive atom decays, but the radioactive substance and the event of decay are as real as Einstein’s little finger. If I could have a conversation with Einstein, I would try to convince him of precisely this.
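The point about radioactive decay can be illustrated with a toy simulation (my own sketch, not part of the original argument): each individual decay event is irreducibly random, yet the ensemble obeys a perfectly definite exponential law. Randomness at the level of events is compatible with lawfulness, and reality, at the level of the world.

```python
import random

def simulate_decay(n_atoms, decay_prob, n_steps, rng):
    """Each atom decays at an unpredictable moment; return survivor counts per step."""
    survivors = []
    remaining = n_atoms
    for _ in range(n_steps):
        # Every remaining atom independently decays with a fixed probability:
        # the individual event is random, but the rate is lawful.
        decayed = sum(1 for _ in range(remaining) if rng.random() < decay_prob)
        remaining -= decayed
        survivors.append(remaining)
    return survivors

rng = random.Random(42)  # fixed seed so the sketch is reproducible
counts = simulate_decay(n_atoms=100_000, decay_prob=0.05, n_steps=40, rng=rng)

# The ensemble tracks the deterministic decay law N(t) = N0 * (1 - p)^t
# even though no individual decay can be predicted.
expected = [100_000 * (1 - 0.05) ** (t + 1) for t in range(40)]
```

No single atom’s fate is determined, yet the aggregate curve is as definite, and as real, as anything in classical physics.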

Indeterminate realism is also an underdetermined theory, and it is to be expected that there are non-realistic theories that are empirically equivalent to indeterminate realism. It is for this reason that I believe there are other arguments, distinct from those above, that favor realism over anti-realism, or even realism over some of the more extravagant interpretations of quantum theory. But I won’t go into that now.

We aren’t about to return to classical theories and their assumptions of continuity such as we had prior to quantum theory, any more than we are about to give up relativistic physics and return to strictly Newtonian physics. That much is clear. Nevertheless, it is important to remember that we are not saddled with any one interpretation of relativity or quantum theory, and we are especially not limited to the philosophical theories formulated by those scientists who originally formulated these physical theories, even if the philosophical theories were part of the “original intent” (if you will forgive me) of the physical theory. Another way to put this is that we are not limited to the original model of a theory, hence model drift.

. . . . .

Grand Strategy Annex

. . . . .

Friday


One of the many famous aphorisms that have been plucked out of Wittgenstein’s Tractatus Logico-Philosophicus is, “The limits of my language are the limits of my world” (“Die Grenzen meiner Sprache bedeuten die Grenzen meiner Welt,” section 5.6). Like much in the Tractatus, this gnomic aphorism invites interpretation and can never be exhausted.

One way to construe this Wittgensteinism very broadly would be to think of it as the limits of my idiom are the limits of my world, with “idiom” construed broadly to include any way of talking about the world, and not merely a particular language. If you’re of a continental persuasion, you could say the limits of my discourse are the limits of my world. It amounts to pretty much the same thing.

Particular theories about the world are idioms for talking about the world, forms of discourse, if you will. Scientific theories are scientific idioms for talking about the world. Now, scientific theories often broaden our horizons and allow us to see and to understand things of which we were previously unaware. But a scientific theory, being a particular idiom as it is, may also limit us, and limit the way we see the world.

The limitations we take upon ourselves by thinking in terms of particular theories or speaking in particular ways are human limits that we have chosen for ourselves; they are not intrinsic limitations imposed upon us by the world, and this, of course, is something that Wittgenstein wanted to bring to our explicit attention.

We very frequently mistake the idioms we employ, and the particular ways in which we understand these idioms, to constitute the very fabric of the world. When in this frame of mind, we make claims for our theories that are not supported by the theories themselves, but that rather reflect our particular, limited understanding of very difficult matters. This has been the case with the general theory of relativity and quantum theory, both of which are very young sciences, but which now dominate physics. Because of the dominant position of these theories, and of particular interpretations of these theories, we forget how young they are, and how far we have to go in really coming to an adequate understanding of them.

Our inadequate understanding of quantum theory, in particular, has been glossed so many times by physicists seeking to give a popular account of quantum theory that one might be forgiven for supposing that quantum theory is a form of mysticism rather than of science. (For example: “For those who are not shocked when they first come across quantum theory cannot possibly have understood it.” Niels Bohr) It is inevitable that, as our understanding of the world gradually and incrementally improves, much in quantum theory that now seems inscrutable will eventually make sense to us, rather than the theory being a mere systematization of a mystery.

A recent paper in Science by Sacha Kocsis, Boris Braverman, Sylvain Ravets, Martin J. Stevens, Richard P. Mirin, L. Krister Shalm, and Aephraim M. Steinberg, Observing the Average Trajectories of Single Photons in a Two-Slit Interferometer, points to new ways of thinking and talking about quantum theory. Here is the abstract of the paper:

“A consequence of the quantum mechanical uncertainty principle is that one may not discuss the path or “trajectory” that a quantum particle takes, because any measurement of position irrevocably disturbs the momentum, and vice versa. Using weak measurements, however, it is possible to operationally define a set of trajectories for an ensemble of quantum particles. We sent single photons emitted by a quantum dot through a double-slit interferometer and reconstructed these trajectories by performing a weak measurement of the photon momentum, postselected according to the result of a strong measurement of photon position in a series of planes. The results provide an observationally grounded description of the propagation of subensembles of quantum particles in a two-slit interferometer.”

There is a good article by Jason Palmer of the BBC, Quantum mechanics rule ‘bent’ in classic experiment, about the paper and its ramifications. Palmer writes that researchers, “say the feat ‘pulls back the veil’ on quantum reality in a way that was thought to be prohibited by theory.” If one wanted to go seeking headlines, one could say something dramatic like “Scientists break the laws of quantum physics” — you get the idea.

But what has been thought to be prohibited is in large measure a limitation upon the current language of quantum theory and, to a certain extent, an artifact of particular experiments. As more sophisticated experiments are conceived and conducted, we may someday know quite a bit more about quantum theory than has been thought possible to date.
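For the curious, the mathematical core of the weak measurement technique used in the paper is the “weak value” of Aharonov, Albert, and Vaidman: for a system pre-selected in state |ψ⟩ and post-selected in state |φ⟩, the weak value of an observable A is A_w = ⟨φ|A|ψ⟩ / ⟨φ|ψ⟩, which, unlike an ordinary expectation value, can lie far outside the spectrum of A. A minimal numerical sketch (my own toy two-level example, not the photon experiment itself):

```python
import math

def inner(u, v):
    """<u|v> for two-component state vectors (conjugate-linear in u)."""
    return sum(a.conjugate() * b for a, b in zip(u, v))

def weak_value(pre, post, A):
    """A_w = <post|A|pre> / <post|pre> (the Aharonov-Albert-Vaidman weak value)."""
    A_pre = [A[0][0] * pre[0] + A[0][1] * pre[1],
             A[1][0] * pre[0] + A[1][1] * pre[1]]
    return inner(post, A_pre) / inner(post, pre)

# Pauli-z observable, with eigenvalues +1 and -1
sigma_z = [[1, 0], [0, -1]]

# Pre-selected state slightly rotated away from the state orthogonal
# to the post-selection, so <post|pre> is small but nonzero.
theta = 0.1
pre = [math.cos(math.pi / 4 + theta), math.sin(math.pi / 4 + theta)]
post = [math.cos(math.pi / 4), -math.sin(math.pi / 4)]

wv = weak_value(pre, post, sigma_z)  # analytically, -1/tan(theta)
```

With nearly orthogonal pre- and post-selection, the weak value of σz here comes out near −10, far outside its eigenvalue range of [−1, +1]. This amplification is what lets weak measurements probe a quantum system gently between preparation and post-selection without the full disturbance of a strong measurement.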

In Palmer’s BBC story there is an excellent quote from Marlan Scully of Texas A&M University:

“The trouble with quantum mechanics is that while we’ve learned to calculate the outcomes of all sorts of experiments, we’ve lost much of our ability to describe what is really happening in any natural language.”

“I think that this has really hampered our ability to make progress, to come up with new ideas and see intuitively how new systems ought to behave.”

Progress in understanding quantum theory will, as implied by Scully, ultimately take the form of being able to discuss it in natural language and to formulate the theory in an intuitively perspicuous manner. We do not yet have the language or the concepts to do this, but each advance, like the recent results reported in Science, brings us a little closer, chipping away at the limits of our language that currently constitute the limits of our world.

. . . . .

Since writing the above I have learned that the method used in the experiment described is called “weak measurement” (as mentioned in the abstract quoted above) and has been employed in other recent experiments (as well as having been criticized quite harshly). I have written further on weak measurement in some comments on the paper Observation of a quantum Cheshire Cat in a matter-wave interferometer experiment.

. . . . .

Saturday


When we look up into the night sky, what we see are stars. If it is very clear and very dark, we will see one of the spiral arms of the Milky Way, thereby glimpsing a small part of the large scale structure of our galaxy. The large scale structure of the universe is nearly beyond our comprehension, and beyond the immediate experience of all except a few astronomers. But we all have an immediate experience of the stars, and we have a personal sense of what the stars are like from our acquaintance with our own sun. This makes stars immediately comprehensible to us.

For all the scientific progress of recent decades, almost no real progress has been made in reconciling general relativity with quantum theory. The difference of approach between the two was evident to the founders of both theories right from the outset, and little has changed since the emergence of relativity and quantum theory in the early part of the twentieth century in terms of the different philosophical approach that each takes to the natural world.

Relativity is a “classical” theory in so far as it assumes an ideal continuity to space; it is concerned with the big picture of the world, with how gravity has shaped the large scale structure of the cosmos. Quantum theory begins from the other extreme, starting with the smallest possible constituents of the world and building up from there. It is also a “non-classical” theory because it assumes an ultimate graininess to the structure of the world, an atomicity that is nevertheless distinct from classical atomism.

While the best efforts of the best scientists have not yet bridged the gap between relativity and quantum theory, we can see with our own eyes, every time we look at the stars, the practical consequences of the actual unity of relativity and quantum theory in nature. For stars are the nexus of that unity. Stars stand in gravitational relationships to each other. Some stars get big enough that they collapse into black holes, and some black holes become so large that they drag vast quantities of matter around with them. These spinning agglomerations of matter are galaxies, and the largest scale structure of the cosmos is described by the gravitational interactions of galaxies, of clusters of galaxies, and of superclusters of galaxies.

All of this vast structure, so well described by relativity, is born of stars. But stars themselves burn by an astonishing complexity of quantum interactions in their superheated cores. When supernova SN 1987A exploded not far from the Milky Way, in the Large Magellanic Cloud, and its light finally reached Earth in 1987, scientists were able to check neutrino detectors buried deep within the earth. When they found that neutrinos had streamed out of the collapsing core of the doomed star before any visible evidence of the supernova appeared, they confirmed an important prediction of theories of stellar structure and evolution. Such theories can only be formulated, and can only be understood, in the context of quantum theory.

It is in the very cores of stars that general relativity and quantum theory are unified, and the light that the stars give off, the warmth that we can feel upon our faces when we turn to look up at the sun on a summer day, is an immediate and concrete experience of that unity.

. . . . .