Saturday, September 28, 2013

Is the wavefunction ontological or epistemological? 

What is wrong with Copenhagen/standard interpretation?

Arguably, the standard interpretation is the best available interpretation, but it is not without blemishes. The big problem here is the usage of classical mechanics. Here is a nice exposition of why there cannot be a consistent mixed classical-quantum theory: quantum and classical mechanics belong to different composability classes: quantum mechanics is a realization of elliptic composability, while classical mechanics is the sole realization of parabolic composability. Physically, in a mixed classical-quantum system there is no back reaction of the quantum system on the classical one. In other words, classical devices cannot measure anything from the quantum world, and the idea of a classical measuring apparatus is a nice fantasy.

However, the old Bohr interpretation got a nice instrumentalist upgrade from the late Asher Peres. I highly recommend his book “Quantum Theory: Concepts and Methods”. In the Preface, Peres famously states: “quantum phenomena do not occur in a Hilbert space, they occur in a laboratory”.

On pages 172-173 of the book, Peres addresses the question of the universality of quantum mechanics:

“Even if quantum theory is universal, it is not closed. […] While quantum theory can in principle describe anything, a quantum description cannot include everything. In every physical situation something must remain unanalyzed. This is not a flaw of quantum theory, but a logical necessity in a theory which is self-referential and describes its own means of verification. This situation reminds of Gödel’s undecidability theorem: the consistency of a system of axioms cannot be verified because there are mathematical statements that can neither be proved nor disproved by the formal rules of the theory; but may nonetheless be verified by metamathematical reasoning.”

For all practical purposes, the Copenhagen interpretation works very well as long as we do not notice the logical inconsistency of using classical measurement devices. “Shut up and calculate,” said Mermin. But is Peres’ quote above an acceptable defense of why the “something which must remain unanalyzed” should be a classical device? Not at all.

What is needed is to obtain the emergence of the classical world from quantum mechanics. Then we can embrace Peres’ position (although there is a sizable amount of handwaving there).

The first step is to see why we don’t observe superposition. The answer is simple: decoherence. But decoherence is not enough. The unique experimental outcome needs an explanation too. Quantum Darwinism shows that what ends up being recognized as the measurement outcome is what succeeds in making copies of itself (and this in turn explains the preferred basis). This copying and amplification effect shows that the measurement device is in an unstable situation. The first outcome to be realized from a potentiality of choices wins the day and gives rise to “objective reality”, where all observers can agree on the outcome due to the zillions of information copies of the one and only outcome. But how can one explain the very first copying event, the very first ionized atom, the very first collapse of the wavefunction? And how can one do this while preserving unitarity? Here is where Peres hit the nail on the head: “quantum phenomena do not occur in a Hilbert space, they occur in a laboratory”. A Hilbert space is only a mathematical realization of some (operator) algebra (via the GNS construction), and if in that algebra one can naturally create two elements out of one (this is called a co-product), Hilbert spaces can pop up or vanish with no mathematical or conceptual problems.
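The decoherence step can be illustrated with a toy calculation (my own sketch, with a made-up overlap parameter, not from any specific paper): a system qubit in an equal superposition imperfectly imprints itself on n environment qubits, and tracing out the environment suppresses the off-diagonal coherence exponentially in n.

```python
import math

def kron(a, b):
    """Tensor (Kronecker) product of two state vectors given as lists."""
    return [x * y for x in a for y in b]

def reduced_off_diagonal(n_env, c=0.8):
    """Build |psi> = (|0>|E0> + |1>|E1>)/sqrt(2), where each environment
    qubit records the system as |e0> = (1, 0) or |e1> = (c, sqrt(1-c^2)),
    so <e0|e1> = c.  Tracing out the environment leaves a 2x2 reduced
    density matrix whose off-diagonal element is (1/2) <E1|E0> = c**n / 2."""
    e0 = [1.0, 0.0]
    e1 = [c, math.sqrt(1 - c * c)]
    E0, E1 = [1.0], [1.0]
    for _ in range(n_env):
        E0 = kron(E0, e0)
        E1 = kron(E1, e1)
    # rho_01 = (1/2) * sum_k E1[k]*E0[k]: the coherence decays like c**n_env
    return 0.5 * sum(a * b for a, b in zip(E0, E1))
```

With c = 0.8 the coherence falls from 0.5 at n = 0 to about 0.006 at n = 20: the global superposition is never destroyed, it just becomes unobservable for anyone who only has access to the system.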

I don’t think there are any conceptual problems with a Fock space and with particle creation and annihilation, and the very same mechanism happens in ordinary quantum mechanics during interaction/measurement. The mathematical structure behind this is called a Hopf algebra, and the current challenge is to show that a Hopf algebra arises naturally in the operator algebra for ordinary quantum mechanics too, and not only in field theory due to the existence of the Lorentz group. The math is highly nontrivial, but there are no interpretation issues. The math proof is work in progress, but it is advanced enough to be able to present a new interpretation of quantum mechanics in the next post. Since this interpretation stems from the program of deriving quantum mechanics from natural physical principles, this is it. After all, how many physics conferences do you know which are dedicated to the interpretation of special relativity?


  1. The irony is that with D-branes one regains an appreciation for Bohr’s Copenhagen interpretation. A D-brane is a quantum condensate of strings in a state or a set of states with some edge condition. They are then similar to a Fermi surface or to edge states in quantum Hall physics. On large scales a D-brane is then a very classical sort of structure, even though it is built up from strings on a small scale. One gains a rather Bohr-esque perspective on the quantum-classical dichotomy. Of course, as Peres says, Bohr’s perspective requires that we do an Admiral Nelson routine of turning a blind eye to nature and sweeping part of it away. It does work “in a way,” but it gives one a less than satisfied feeling.

    Cheers LC

  2. Considering the uncertainty at the beginning of quantum mechanics, Bohr's interpretation was an extraordinary accomplishment. Bohr got it wrong on only two aspects: contextuality (on his reply to EPR) and the usage of classical apparatus.

    On contextuality, Gleason's theorem clarified the matter. The measurement problem is still open, but the math involved is highly nontrivial; the framework to solve it came from contributions of Emch, Grgin, and Connes, among others.

  3. I was going to respond to your post on MWI with the matter of contextuality. I have thought that a problem with MWI is that the outcome an observer detects is determined by the context of the experiment the observer sets up. The experimenter is free to set the orientation of their Stern-Gerlach apparatus. Quantum mechanics does not indicate or determine this, for that orientation is just one of an infinite number related by a unitary transformation. Yet if the universe is ultimately quantum mechanical “all the way down”, then the manner in which an observer is “eigenbranched” along a world line by their choice of basis is ultimately determined by QM. This, however, is in violation of the Kochen-Specker theorem.

    I think the problem boils down to wave function collapse being taken as objective. MWI in some ways goes a step in that direction. With CI and GRW, wave function collapse is taken as an objective fact, but I think the problem is that this leads to an excess of entropy in the universe. A wave function that is constantly collapsed by some decoherent process will be thrown into a different basis by a nonunitary process that generates entropy. So a hot gas at constant temperature could not maintain constant entropy. That is nonsense. I think the wave function is set or defined in ways similar to a coordinate condition in relativity or a gauge condition. In relativity we have causal barriers or horizons that prevent the prediction of a system beyond r = 2GM/c^2. Wave function collapse is something similar, where the appearance of a state reduction is due to the subjective condition of the observer, rather than something that is strictly objective.

    Cheers LC

  4. >The problem boils down to the problem that wave function collapse is taken as objective.

    Indeed, this is one of the problems, and if collapse is thought of as objective one can run into relativity-of-simultaneity problems.

    >With CI and GRW wave function collapse is taken as an objective fact, but I think the problem is that this leads to an excess of entropy in the universe. A wave function that is constantly collapsed by some decoherent process will be thrown into a different basis by nonunitary process that generates entropy.

    The problem exists only for GRW, and this leads to experimentally verifiable predictions, but the parameters are tuned in such a way that the effect is currently undetectable. In CI, the reduction only happens during an actual measurement, and there is no issue there.

    >Wave function collapse is something similar, where the appearance of a state reduction is due to the subjective condition of the observer, rather than something that is strictly objective.

    One may take a partial trace to hide the additional information. We need to carefully define what we mean by objective. In QM there is no objective local realism, only non-contextuality of the wavefunction. Objectivity is an emergent concept given by the multiplicity of information which allows different observers to agree.

    I did not discuss the Kochen-Specker theorem, which is basically an exercise in the impossibility of coloring certain geometric shapes (assuming the independent existence of experimental outcomes), but KS is not violated by MWI, in the same way energy conservation is not violated by MWI.

  5. I do think that MWI has some issue with contextuality. It seems to me that MWI has some inconsistency with quantum physics.

    In CI we have to be careful about what we mean by an “actual measurement.” This may be related to your warning about the term objective. In the end, all there is, I think, is some relativity between ontology and epistemology, and we have, I think, some basic level of uncertainty about what is meant by ontology. So a wave function may then collapse under a range of conditions which lie outside our prescription for a measurement.

    I have been interested in your elliptic composability and the product of states. I think this may have some bearing on the hyperdeterminants for nilpotent groups that have G_3/H_3 and G_4/H_4 STU quotients:

    G_4/H_4 = SL(2,R)xSL(2,R)xSL(2,R)/U(1)^2 rank = 3

    G_3/H_3 = SO(4,4)/SO(2,2)xSO(2,2) rank = 4

    In the N = 2 SUGRA the SO(4,4) can be decomposed into a product of four SL(2,R)’s, which are the groups for conformal quantum field theory in two dimensions. The rank = 3 case is a three-product of states remarkably similar to your work in your paper arXiv:1305.3594v1 [quant-ph] 14 May 2013. These systems are tripartite and GHZ 4-qubit entanglement systems. The relationship between the two is with respect to the Hopf fibration S3 → S7 → S4. There is the next higher level in this with S7 → S15 → S8, where we have

    G_4/H_4 = E_{7(7)}/SU(8) rank = 7

    G_3/H_3 = E_{8(8)}/SO(16) rank = 8,

    where this is a 7 and 8 product form corresponding to 7 and 8 qubit systems.

    The hyperdeterminant for these higher rank systems defines the action of the system S = det(M), which for the rank 4 system is a quartic system of equations. In the Euclideanized form this is the entanglement entropy, which has a correspondence with the action. This also has a connection to the spacetime metric. The entanglement of a system has a weight which is formally equivalent to a distance metric, as shown by the work of Van Raamsdonk arXiv:1005.3035v1 [hep-th]. The decomposition of the gauge group into products of SL(2,R) groups is a reflection of the entanglement of states, such as a tripartite entanglement for SL(2,R)^3. This looks to be a way of building quantum gravity.

    For the higher system we may have two copies of the 4-qubit entangled systems. This seems to be a dead end, for with E8 we run out of “compositions.” However, we can use Bott periodicity to extend this to a much larger entanglement of states in massive Hilbert spaces.

    I am not as grounded in the foundations of QM as you are. I do though have an interest in taking QM as seriously as I can.

    Cheers LC

  6. Lawrence,

    In the composability/category theory approach, the key idea is that the laws of nature are invariant under tensor composition. All concepts should follow from the math naturally, and there should not be any superimposed interpretations. This is like the zeroth tenet, the cornerstone principle.

    I am not going to disclose too much, but I left a hint in the post when I said "classical mechanics is the sole realization of parabolic composability". In QM there are an infinite number of realizations, and they correspond to different universes where one has different gauge symmetry groups. Why our particular universe picked a particular algebra which gives rise to the Standard Model was probably a matter of chance. If there is no supersymmetry, the problem of which algebra/group describes nature was solved by Connes (including renormalizability). The full SM follows from it. But the math will change if supersymmetry is real.

    I don't quite get why you say that MWI has some inconsistency with quantum physics. To my knowledge, nobody has found any such inconsistency.

  7. I am aware that with MWI there is no duplication of energy. The evolution of an observable

    dO/dt = ∂O/∂t + i[H, O]

    is such that for O = H this is zero if H has no explicit t dependency. Suppose that we have a wave function ψ(x_1, x_2, …, x_N) that decomposes into

    ψ(x_1, x_2, …, x_N) = φ1(x_1, …, x_m) f(x_{m+1}, …, x_N)

    + φ2(x_1, …, x_m) g(x_{m+1}, …, x_N)

    with the condition that integration over all the variables from x_{m+1} to x_N, a product of integrations, gives

    ∫prod dx |f|^2 = ∫prod dx |g|^2 = 1

    ∫prod dx f*g = 0

    The 1 and 2 indices can indicate the “world” the system enters, corresponding to two world branches. The evaluation of an observable is then

    ⟨O⟩ = ∫prod dx ψ*(x_1, x_2, …, x_N) O ψ(x_1, x_2, …, x_N)

    where the product runs over 1 to N, and this is broken into

    ⟨O⟩ = ∫prod dx φ1*(x_1, x_2, …, x_m) O φ1(x_1, x_2, …, x_m)
    + ∫prod dx φ2*(x_1, x_2, …, x_m) O φ2(x_1, x_2, …, x_m)

    where the product is over 1 through m. The Born rule tells us this is given by the eigenvalues O_1 and O_2 with their probabilities P_1 and P_2 as

    ⟨O⟩ = P_1 O_1/(P_1 + P_2) + P_2 O_2/(P_1 + P_2).

    As a result the branching does not violate conservation of energy.
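    This bookkeeping can be checked numerically. A minimal sketch (my own toy example; the amplitudes and eigenvalues are made up): take branch amplitudes a, b with a^2 = P_1, b^2 = P_2, orthogonal branch states, and an observable O acting on the first factor with eigenvalues O_1, O_2. The full expectation value comes out as P_1 O_1 + P_2 O_2, so branching conserves ⟨O⟩, and in particular ⟨H⟩.

```python
import math

# Toy two-branch state: psi = a*phi1⊗f + b*phi2⊗g with <f|g> = 0.
# Basis ordering: |phi1 f>, |phi1 g>, |phi2 f>, |phi2 g>.
a, b = math.sqrt(0.2), math.sqrt(0.8)   # branch amplitudes: P1 = 0.2, P2 = 0.8
O1, O2 = 3.0, -1.0                      # made-up eigenvalues of O

psi = [a, 0.0, 0.0, b]
O_diag = [O1, O1, O2, O2]               # O ⊗ identity, diagonal in this basis

# <psi| O⊗I |psi> computed two ways
expect = sum(c * c * o for c, o in zip(psi, O_diag))
branch_sum = a * a * O1 + b * b * O2    # P1*O1 + P2*O2 = 0.2*3.0 + 0.8*(-1.0) = -0.2
print(expect, branch_sum)
```

    The two numbers agree (up to float rounding), which is the content of the argument above: the branches contribute their eigenvalues weighted by their Born probabilities, nothing more.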

    I will try to think of a form of this for the KS theorem. I do have this nagging question about whether there is a problem with contextuality. The orientation of the SG apparatus in a spin experiment is unitarily equivalent to all other possible orientations. Maybe MWI takes this into account by branching not just over all possible eigenvalues in a basis, but over all possible basis orientations. I have not heard of that, though.

    The parabolic case you mention is analogous to the extremal condition on black holes. The entropy or action I mention is null for nilpotent groups of qubits. For the extremal condition there are then 31 nilpotent orbital configurations. For timelike BHs the Lie groups are semi-simple and the conditions are larger in number. The null geodesics correspond to S = 0 (entropy or proper time = 0), and these can’t be arrived at by group actions ad_X(Y) = [X,Y] when these orbits are not nilpotent. The action is then S > 0 for the timelike case and S = 0 for the null or extremal case.

    The classical condition with respect to quantum physics is an approximation. There is still some tiny quantum spread in any system. This is analogous to the asymptotic convergence of a highly relativistic particle to the null case, or the convergence of a timelike black hole with angular momentum or charge to its extremal case. In this way I think the structure of spacetime is built up from quantum entanglements, and that in some way quantum mechanics and general relativity are categorically the same thing when looked at in the proper way.

    Cheers LC

  8. Since you mentioned the Born rule, there is a bit of a conundrum with it in MWI: there is no good way to define relative frequencies, because this is a meta-universe concept in this case. Suppose there are only 2 possible outcomes and QM predicts outcome A occurs 20% of the time and outcome B occurs 80% of the time. If the universe splits between Universe_A and Universe_B, there is only one split, and the frequentist position in statistics demands that the probabilities be 50% and 50%.

    Deriving Born's rule in MWI is not a universally accepted business.
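    The 20/80 conundrum can be made concrete by exact enumeration (my own illustration; the outcome labels and number of splits are made up): after n binary splits there are 2^n branches. Counting each branch as equally likely pins the relative frequency of A at 1/2 no matter what the amplitudes are, while weighting each branch by its product of Born probabilities recovers 0.2.

```python
from itertools import product

p_A = 0.2      # Born probability of outcome A at each split
n = 6          # number of splits: 2**n = 64 branches in total

count_freq = 0.0   # naive branch counting: every branch weighted 1/2**n
born_freq = 0.0    # each branch weighted by its product of Born probabilities

for branch in product('AB', repeat=n):
    weight = 1.0
    for outcome in branch:
        weight *= p_A if outcome == 'A' else (1 - p_A)
    frac_A = branch.count('A') / n          # relative frequency of A on this branch
    count_freq += frac_A / 2 ** n
    born_freq += frac_A * weight

print(count_freq, born_freq)   # 0.5 vs 0.2 (up to float rounding)
```

    Branch counting gives 1/2 regardless of p_A, which is exactly the frequentist complaint in the comment above; the 0.2 only reappears once the Born weights are put in by hand.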

    The KS theorem basically states that unperformed experiments have no pre-assigned outcomes. Or, one might say, God cannot know what you will do, and therefore there must be free will. Combining free will with KS resulted in the famous free will paper by Conway and Kochen and the big controversy after that.