Monday, August 07, 2006

Should We Have Been Able To Predict Quantum Effects In The 19th Century?

In numerous blog commentaries, I have argued that while parts of the world are causal and consistent, there's no guarantee that the universe is globally so, or will always be so. There might be distant galaxies in which there are uncaused events, or future times when the consistency we see in the world comes to an end.

However, until recently, I harbored a suspicion that our cosmological neighborhood was causal and consistent. If anything can and does happen at any time, it's hard to imagine the universe being intelligible at all. I was prompted to re-evaluate my thinking by my realization that quantum mechanics is effectively acausal. Since not everything about the final state is fixed by the initial state, that which is not fixed is random and acausal. The fact that quantum mechanics is compatible with Newtonian mechanics shows that acausality can be ever-present without disturbing the results of classical physics. The randomness of nuclear decay doesn't make the whole world of planes, trains and automobiles utterly unpredictable.

Thus, I was led back to the question of whether logical inconsistency might also be able to exist in our universe without rendering the whole universe unintelligible.

Interpretations of Quantum Physics
I am no expert on the history of quantum mechanics, but my sense is that quantum inventions such as the Schrödinger Equation were constructed in a holistic or organic fashion. Quantum pioneers looked at the strange results from quantum systems, and built a framework for predicting experimental results while ensuring that the theory reproduced classical physics in the appropriate limits. Quantum mechanics was not derived from any deep principle, but was thrown together ad hoc.

Now, acausality at quantum scales has made many physicists uncomfortable over the years (Einstein included), but we can wash away our discomfort with the knowledge that our funky trip through the quantum looking glass ends when we make a macroscopic measurement. For example, creepy faster-than-light effects like those in the EPR experiment turn out to be mere parlor tricks that don't permit non-local signaling in measured systems. In other words, we can put the bizarre nature of quantum physics out of our minds while we go and compute something we will actually see. As the physicist N. David Mermin put it, "Shut up and calculate!"

This is not to say that we don't try to interpret philosophically what quantum physics actually means. Over the last century, there have been many attempts to reconcile quantum mechanics with intuition. These interpretations aren't physical theories, and they generally make no experimental predictions. They're just ways to try to make intuitive sense out of the mathematics. The interesting point about these interpretations is that they seek to find a picture of the world in which causality and consistency are either restored or irrelevant. Let's look at the two poster children for this tendency.
  • In the Many Worlds Interpretation, every quantum event that can happen does happen, but the universe branches into multiple universes, one for each possible outcome. For some theorists, having a staggering number of universes come into existence every second is a small price to pay for causality and consistency.
  • In the Bohm Interpretation, the universe is actually causal and consistent underneath, but limitations of the measurement process result in apparent quantum behavior. The tradeoff here is that we have to accept non-local (faster-than-light) effects that render the underlying classical nature of the universe invisible to us.
Though there is no consensus on a best interpretation, there's a definite tendency to either make quantum mechanics causal and consistent, or else brush the problem under the rug. I don't want to disparage the rug brushing, as it really is quite pointless to assert propositions that will never have any experiential test.

Nonetheless, my perception of this issue has changed in the last few days. Quantum mechanics is a consistent formalism, but it can be seen as modeling a world that is acausal and inconsistent at a quantum level. In broad terms, quantum mechanics explains how to have a world be acausal and illogical at one level while preserving causality and logical consistency at classical scales. In short, it may be time to embrace acausality and logical inconsistency.

For economy, I want to refer to the combination of acausality and logical inconsistency using a single term. The most natural terms to choose are "anarchy" and "chaos." Chaos is a bit confusing because this chaos isn't the same as the deterministic chaos of nonlinear dynamics. However, if I use the term anarchy, I will be accused of endorsing political anarchy (by the same idiots who think evolutionary biology endorses social Darwinism). I'll go with chaos.

Two Ideas
What we have is a convergence of two ideas. First, there is the philosophical possibility that the world is consistent and causal only at classical scales, but chaotic at some deeper level. Second, there is the fact that quantum mechanics as a formalism shows us a way to account for small-scale acausality and logical contradiction while maintaining compatibility with classical physics.

The interesting question is whether partial acausality and partial inconsistency actually predict quantum mechanics. Should the pioneers of quantum theory (and 19th century scientists and philosophers before them) have been able to derive quantum mechanics from classical physics and philosophical first principles? They might not have known which systems would exhibit quantum effects, nor at what energies the effects would appear, but they might have suspected that quantum systems would exist.

I want to note the non-obvious nature of this connection, if it exists. To begin with, the connection is non-obvious for cultural reasons. For physicists, causality and logical consistency are taken for granted. Using acausality and inconsistency to good advantage doesn't come naturally.

It's also non-obvious mathematically because quantum mechanics seems to give us more than just a way to deal with statistically random effects. Quantum mechanics makes uniquely strange predictions about energy levels and interference patterns. If electrons behave acausally, why should that lead to discretely spaced atomic energy levels instead of a continuum? Why should fundamental logical inconsistency lead to interference patterns in electron beams?

Modeling Partial Acausality and Inconsistency
Partial acausality means that, given a prior state, the subsequent path of the system is not fully determined. Effectively, of all the possible paths the system can take, a random one is actually taken. Therefore, a partially acausal physics can only speak of the probability that a particular path is taken.

Partial inconsistency means that, where there is acausality, multiple mutually exclusive and contradictory paths are taken simultaneously. Classical measurement of the final state will reveal a unique end point of the system, but cannot meaningfully speak of which path the system took to get there.

A mathematical representation of such a system will involve orthogonal state vectors representing mutually exclusive paths the system can take to a particular endpoint. These vectors must be combined using some sort of scalar product so as to produce a scalar probability density. This leads us to an Inner Product Space, or something similar. Such spaces incorporate the possibility of interference between different paths the system might take.
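To illustrate the point, here is a minimal sketch (my own toy example, not anything from the historical record) of how representing two mutually exclusive paths as complex amplitudes and combining them through an inner product yields an interference term that a plain sum of path probabilities lacks. The amplitudes and phases below are illustrative numbers only.

```python
import numpy as np

def quantum_probability(phase_difference):
    """Detection probability when two mutually exclusive paths contribute amplitudes."""
    a1 = 0.5 * np.exp(0j)                     # amplitude for path 1
    a2 = 0.5 * np.exp(1j * phase_difference)  # amplitude for path 2
    return abs(a1 + a2) ** 2                  # |a1|^2 + |a2|^2 + interference term

# Classical picture: each path simply has a probability, and probabilities add.
classical_probability = 0.25 + 0.25  # always 0.5, with no phase dependence

for phase in (0.0, np.pi / 2, np.pi):
    print(f"phase {phase:4.2f}: quantum {quantum_probability(phase):.2f} "
          f"vs classical {classical_probability:.2f}")
# phase 0.00: quantum 1.00 vs classical 0.50
# phase 1.57: quantum 0.50 vs classical 0.50
# phase 3.14: quantum 0.00 vs classical 0.50
```

The cross term in |a1 + a2|^2 is exactly the interference referred to above; dropping it recovers the classical sum of probabilities.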

This mathematical structure resembles the Feynman path integral approach to quantum mechanics. Not only does it lead to quantum interference, but also to discrete energy levels in bound states.
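As a rough numerical illustration of the second half of that claim (my own sketch, using a simple finite-difference Hamiltonian rather than the path integral itself, with units chosen so that hbar and the mass are 1), diagonalizing the state-vector formalism for a particle confined to a box yields a discrete spectrum rather than a continuum:

```python
import numpy as np

n_points, box_length = 400, 1.0
dx = box_length / (n_points + 1)

# Finite-difference form of the kinetic operator -(1/2) d^2/dx^2 with hard walls.
main_diag = np.full(n_points, 1.0 / dx**2)
off_diag = np.full(n_points - 1, -0.5 / dx**2)
hamiltonian = np.diag(main_diag) + np.diag(off_diag, 1) + np.diag(off_diag, -1)

# The bound system has a discrete set of energy eigenvalues, not a continuum.
numerical_energies = np.linalg.eigvalsh(hamiltonian)[:4]
exact_energies = [(n * np.pi) ** 2 / (2 * box_length**2) for n in range(1, 5)]
for e_num, e_exact in zip(numerical_energies, exact_energies):
    print(f"numerical {e_num:8.2f}   exact n^2 pi^2 / 2 = {e_exact:8.2f}")
```

The lowest levels come out close to the textbook values n^2 π^2 / 2, spaced apart rather than continuous, which is the qualitative feature at issue.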

Still, do partial acausality and partial inconsistency predict quantum mechanics?

Well, quantum mechanics is normally second degree in the state vectors. That is, probabilities are computed from the simplest possible inner product, the ordinary scalar product of two vectors. In principle, there might be all sorts of inner products we could define involving, say, determinants of tensor products. Some such generalizations of quantum mechanics have been explored, but only as toy models (as far as I know).
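To make the "second degree" remark concrete: in standard quantum mechanics the probability of finding the system in state φ given state ψ is bilinear in each state vector via the ordinary inner product (the Born rule). A higher-degree generalization of the kind gestured at above would replace that inner product with some more elaborate multilinear form; the second line below is purely schematic, not an actual proposal.

```latex
% Standard (second-degree) rule: probability from the ordinary inner product.
P(\phi \mid \psi) = \bigl|\langle \phi \mid \psi \rangle\bigr|^{2}
                  = \langle \psi \mid \phi \rangle \, \langle \phi \mid \psi \rangle

% Schematic higher-degree alternative built from some multilinear form T
% (purely illustrative; no such theory is being proposed here).
P(\phi \mid \psi) \propto T(\psi, \psi, \phi, \phi, \ldots)
```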

I suppose the key point is that partial acausality and partial inconsistency do generally predict quantum effects such as interfering realities represented by interfering wavefunctions. That is, to avoid quantum effects, one has to fine-tune a chaotic theory. In contrast, a classical theory must be somewhat tuned in order to get quantum effects. Of course, this analysis gets nowhere near a probability assessment for chaos. Science is open-ended, so the task of "integrating over all possible theories" is almost inconceivable.

When I started writing this post, I had hoped to obtain a more startling result. I had hoped to show that our actual quantum theory is necessitated by chaos (and the constraints of classical physics). However, in light of my recent thinking on metatheories, I think that chaos is better regarded as a metatheory than as a scientific theory.

I am left with further confirmation that an invisible cause is indistinguishable from a non-existent one.

I am also left to ponder a new interpretation of quantum mechanics that embraces acausality and inconsistency.
