## Is there a realistic interpretation of Quantum Mechanics?

### (a critical analysis of the Bohmian mechanics)

In the last two posts the merits of a classical (realistic) description of Nature were discussed. The major disagreement was on the burden of proof. I contend it is the responsibility of any alternative explanation to prove it is better than quantum mechanics. Given the overwhelming and irrefutable evidence for the applicability of quantum mechanics in describing nature, this is simply impossible. But is there a realistic interpretation of quantum mechanics?

When discussing physics, there are three possible mathematical frameworks to choose from:
• the Lagrangian formalism in the tangent bundle,
• the Hamiltonian formalism in the cotangent bundle,
• the formalism in configuration space.
For the Lagrangian formalism, I cannot do any better than Zee in Chapter I.2 of his "Quantum Field Theory in a Nutshell":

"Long ago, in a quantum mechanics class, the professor droned on and on about the double-slit experiment, giving the standard treatment.[...] The amplitude for detection is given by a fundamental postulate of quantum mechanics, the superposition principle, as the sum of the amplitudes for the particle to propagate from the source S through the hole A1 and then onward to the point O and the amplitude for the particle to propagate from the source S through the hole A2 and then onward to the point O.

Suddenly, a very bright student, let us call him Feynman, asked, "Professor, what if we drill a third hole in the screen?" The professor replied, "Clearly, the amplitude for the particle to be detected at the point O is now given by the sum of three amplitudes [...]."

The professor was just about ready to continue when Feynman interjected again, "What if I drill a fourth and a fifth hole in the screen?" Now the professor is visibly losing his patience: "All right wise guy, I think it is obvious to the whole class that we just sum over all the holes."

But Feynman persisted, "What if I now add another screen with some holes drilled into it?" The professor was really losing his patience: "Look, can't you see that you just take the amplitude to go from the source S to the hole Ai in the first screen, then to the hole Bj in the second screen, then to the detector at O, and then sum over all i and j?"

Feynman continued to pester, "What if I put in a third screen, a fourth screen, eh? What if I put in a screen and drill an infinite number of holes in it so that the screen is no longer there?" [...]

What Feynman showed is that even when there is just empty space between the source and the detector, the amplitude for the particle to propagate from the source to the detector is the sum of the amplitudes for the particle to go through each one of the holes in each one of the (nonexistent) screens. In other words, we have to sum over the amplitude for the particle to propagate from the source to the detector following all possible paths between the source and the detector."

So if we have to consider all possible paths, the notion of the classical trajectory is simply doomed and there is no realistic quantum interpretation in the Lagrangian formalism. One down, two to go.

We will make a quick pass over the phase-space formalism for quantum mechanics. This one is easy: Wigner functions can take negative values and therefore cannot be genuine probability densities, and hence there is no realistic quantum interpretation in the Hamiltonian formalism either. Two down, one to go.

However, the configuration-space formalism is the tricky one. Here one encounters the Hamilton-Jacobi and the Schrodinger equations.

Let us start from classical physics. Consider 1-d motion in a potential V. The Hamilton-Jacobi equation reads:

$$\frac{\partial S}{\partial t} + \frac{1}{2m}{(\frac{\partial S}{\partial x})}^2 + V(x) = 0$$
and
$$p = \frac{\partial S}{\partial x}$$

If $$V=0$$, we look for solutions of the form $$S = W(x)-Et$$, from which one trivially obtains:

$$p = \frac{\partial S}{\partial x}=\sqrt{2mE}$$

If the particle is at $$x_0$$ at the moment $$t_0$$, then since $$\dot{x}=p/m$$ the particle motion is unsurprisingly:
$$x-x_0 = \sqrt{\frac{2E}{m}}(t-t_0)$$

The key point is that we need the initial particle position to solve the equation of motion.

So what happens when we replace the Hamilton-Jacobi equation with its quantum counterpart, the Schrodinger equation?

$$-i\hbar\frac{\partial \psi}{\partial t} - \frac{\hbar^2}{2m}\frac{\partial^2 \psi}{\partial x^2} + V\psi = 0$$
If
$$\psi=\sqrt{\rho}\,\exp(i\frac{S}{\hbar})$$
then we get the Hamilton-Jacobi equation for S but with an additional term called the quantum potential, and a continuity equation for $$\rho$$. Welcome to the Bohmian formulation of quantum mechanics!
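For completeness, here is that decomposition spelled out (a standard computation, in the notation above; nothing here goes beyond the textbook result). The real part of the Schrodinger equation yields the Hamilton-Jacobi equation with the extra quantum potential $$Q$$, and the imaginary part yields the continuity equation:

$$\frac{\partial S}{\partial t} + \frac{1}{2m}{\left(\frac{\partial S}{\partial x}\right)}^2 + V + Q = 0, \qquad Q = -\frac{\hbar^2}{2m}\frac{1}{\sqrt{\rho}}\frac{\partial^2 \sqrt{\rho}}{\partial x^2}$$

$$\frac{\partial \rho}{\partial t} + \frac{\partial}{\partial x}\left(\frac{\rho}{m}\frac{\partial S}{\partial x}\right) = 0$$

In the limit $$\hbar \to 0$$ the quantum potential vanishes and we recover classical Hamilton-Jacobi theory.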

To solve the equation of motion we again need an initial condition, just like in the classical case. But this should raise big red flags!!! In school we were taught that quantum mechanics is probabilistic, not deterministic, and also that the wavefunction is all that is needed to make predictions. How is this possible? What is going on here?

To make quantum mechanics predictions one needs an additional ingredient: the Born rule. It turns out that in Bohmian quantum mechanics the Born rule constrains the allowed distribution of initial conditions, and this is the beginning of the end of the realism claim of this interpretation.

*Max Born*

But where does the Born rule come from? With the advantage of about 100 years of quantum mechanics, we now have two nice answers. The wavefunction lives in a Hilbert space, and there we have Gleason's theorem, which basically mandates the Born rule as the only logical way to make sense of the lattice of projection operators. But this is an abstract mathematical take on the problem. There is also an excellent physical explanation given by (surprise...) Born himself: http://www.ymambrini.com/My_World/History_files/Born_1.pdf I will not summarize Born's paper because it is very well written and very easy to understand even today, but the main point is that the Born rule is incompatible with an arbitrary initial probability density.

The supporters of Bohmian mechanics are well aware of this, and they call the consistent initial probability density "quantum equilibrium". Moreover, they point out that after some relaxation time an arbitrary probability density "reaches quantum equilibrium", and so any discrepancy between the predictions of Bohmian quantum mechanics and standard quantum mechanics could only have occurred a few seconds after the Big Bang. So problem solved, right? Wrong! It is rather ironic that the undoing of Bohmian mechanics comes from a most unexpected direction: Bell's theorem!!! Ironic because Bell was inspired to discover his theorem by viewing Bohmian mechanics as a counter-example to von Neumann's no-go theorem on hidden variables.
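To make the "quantum equilibrium" condition concrete, here is a minimal numerical sketch (my own illustration, not from Born's paper) for the one case where the Bohmian trajectories have a simple closed form: a free spreading Gaussian packet. Born-distributed initial positions stay Born-distributed under the deterministic Bohmian flow; any other initial distribution would fail to reproduce $$|\psi|^2$$ at later times.

```python
import numpy as np

# Equivariance sketch for a free Gaussian packet (units: hbar = m = 1).
# For this packet the Bohmian trajectories are known in closed form:
#     x(t) = x0 * s(t),   s(t) = sqrt(1 + (t / (2 sigma0^2))^2),
# where sigma0 is the initial position spread of |psi|^2.
# If the initial positions are Born-distributed (std sigma0), the
# deterministic flow keeps the ensemble Born-distributed at time t
# (std sigma0 * s(t)) -- the "quantum equilibrium" property.

rng = np.random.default_rng(0)
sigma0 = 1.0
t = 3.0
s = np.sqrt(1.0 + (t / (2.0 * sigma0**2)) ** 2)

x0 = rng.normal(0.0, sigma0, size=200_000)  # Born-rule initial conditions
xt = x0 * s                                 # deterministic Bohmian evolution

print(np.std(xt), sigma0 * s)  # the two spreads agree to sampling error
```

Replacing the Born-rule sampling (say, by a uniform distribution of the same width) breaks the agreement, which is exactly why the initial conditions must be constrained.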

In quantum mechanics it is very easy to see that position operators at different times do not commute. Why? Because the position operator at time $$t$$ involves the momentum operator through the term $$\frac{(t-t_0)}{m}P$$, and P and Q do not commute. This means there is no joint probability space for positions at different times. But in Bohmian mechanics the particle always has a "real" position, and so such a probability space does exist. Hence we can in principle detect statistical differences between the predictions of standard quantum mechanics and those of Bohmian mechanics.
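Concretely, for a free particle in the Heisenberg picture (standard textbook algebra, assuming only $$[Q,P]=i\hbar$$):

$$Q(t) = Q(t_0) + \frac{(t-t_0)}{m}P$$

$$[Q(t), Q(t_0)] = \frac{(t-t_0)}{m}[P, Q(t_0)] = -\frac{i\hbar (t-t_0)}{m} \neq 0$$

so positions at two different times admit no joint probability distribution in standard quantum mechanics.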

The proponents of Bohmian mechanics are very well aware of this problem and they have a solution: there is no comparison of measurements at different times! I have to agree this is very clever, but the trouble is only swept under the rug; it does not go away.

The prediction discrepancy goes away in Bohmian mechanics only if the theory is "contextual". Because of this, velocity in Bohmian mechanics is not measured the way we would normally expect, as $$v = (x_1-x_0)/(t_1-t_0)$$, and Bohmian mechanics is known for its "surreal trajectories".

Surreal or not, violations of the speed of light or not, non-locality or not, the main trouble is the sudden change of the probability density after measurement. In the Copenhagen formulation, the wavefunction collapses upon measurement and this is naturally explained as an update of information. After all, the wavefunction does not carry any energy or momentum and is just a tool to compute the statistical outcomes of any possible experiment. But one of the advertised virtues of Bohmian mechanics is its observer independence: the measurement simply reveals the pre-existing position of the particle. But is this really the case?

The trouble for Bohmian mechanics is that its predictions for two consecutive measurements differ from those of the standard quantum formalism. Why? Because the "quantum equilibrium" for the first measurement is not a "quantum equilibrium" for the second measurement: the wavefunction collapses during the first measurement. (By the way, this is the root cause of why no correct quantum field theory can ever be created for Bohmian mechanics.)

So how do the Bohmian supporters deal with this? The theory is simply declared "contextual" and valid only between preparation (with a pre-supposed quantum-equilibrium distribution) and measurement.

Without contextuality Bohmian mechanics is an inconsistent theory, as it predicts violations of the uncertainty principle. With contextuality Bohmian mechanics becomes a time-bound equivalent formulation of quantum mechanics. Think of it as a flat R^n local map of a curved manifold. A field theory in the Bohmian interpretation is impossible in the same way that a 2-d map of the Earth (which topologically is a sphere) cannot avoid distortions. (I see a cohomology no-go result for Bohmian quantum field theory in my future ;)).

Because of contextuality the Bohmian interpretation cannot be called realistic.

Now we have exhausted all three quantum mechanics formulations. Quantum mechanics is simply not a realist theory any way we look at it. It would have been really strange if one equivalent formulation of quantum mechanics admitted a realistic interpretation while the other two did not.

Basically it is either initial particle positions or the Born rule. But the Born rule is an inescapable self-consistency condition on the Hilbert space due to Gleason's theorem, so we must rule out unrestricted initial statistical distributions.

Quantum mechanics is complete and any addition of "hidden variables" spoils its consistency.

1. Bohm's approach to QM is fine so long as you do not take the idea of the "beable," which means "Be Able," literally. Take the Schrodinger equation, put the wave function in polar form and split it into the modified Hamilton-Jacobi equation and continuity equation for a fluid. Now perform a canonical transformation on the variables. You get a different "beable-path." This means the active channel of Bohm interpretation is just one case in a Usp(n) theory. A sum over all possible symplectic transformations gets you a bundle of paths in a path integral.

The problem arises when one thinks that by putting the wave function in polar form you get a unique path. The wave function however may be transformed by unitary transformations, and the path is then not unique. There is then no so-called active channel where a realistic particle is moving. All paths are just representatives of the set summed over in a path integral.

1. Lawrence,

Thanks for commenting. The major appeal of the dB-B approach is the beables, because they achieve "QM without observers". This interpretation is ontic. Bohm's theory is to QM what a Lie algebra is to its Lie group: locally they are the same, but not globally.

Copenhagen QM also predicts only between preparation and measurement, and then you have to reset the wavefunction for the next measurement (the so-called collapse), but dB-B has no collapse because the particle position is "real". Here the collapse is replaced by contextuality. dB-B has to explain this contextuality, which contradicts the spirit of realism. Alternatively, the dB-B theory has to explain why or how the quantum potential changes at the measurement time.

2. Dear Florin, dB-B doesn't work. All your comments that something works about dB-B are incorrect. For example, it is not true that dB-B may work without any collapse. The position may be "real" but there is also the equally "real" guiding wave and this guiding wave has to be "cleaned" when the particle is observed so that it doesn't cause any problems after the measurements. Any process to "clean" it will cause exactly the same problems as any other real "collapse" in any childish realist emulation of quantum mechanics.

If one adds a pilot wave and offers no mechanism for how it's cleaned, he offers ZERO advance towards any "better" picture of the measurement. The measurement still has to involve a process that goes outside the normal continuous equations of the theory - and in the case of dB-B, it is left obscure. In Copenhagen etc., the process is totally well understood - it's the perception of the observed quantity by the observer which evolves the wave function discontinuously. It's very clear that a counterpart of that has to exist in dB-B as well - otherwise the pilot wave isn't "cleaned" - but dB-B stays totally silent about it, just pretending that it has "solved" something about the measurement although it's totally obvious that it has solved nothing.

This problem has a mirror bonus problem in dB-B. A measurement in proper QM also creates the initial conditions. One needs the "concentrated pilot wave" at the beginning of the evolution. dB-B has no way to create the pilot waves in the right initial state, either. This seems like an independent problem because dB-B makes a mess of both the initial and the final state. But it's actually the same problem because the final state of one evolution is the initial one for another - and dB-B just completely fails to describe the phenomena during the measurement.

Everyone who fails to see that dB-B says nothing positive that makes sense about the measurement must have a totally dysfunctional brain.

3. The dB-B approach with the polar wave function splits the SE into real and imaginary parts. There seems to be a quibble of this and my use of language that makes this ambiguous.

The dB-B approach is ontic, and in some ways what I might call hyper-ψ-ontic, which gets into my statements about PBR; I will return to that later.

The problem with dB-B is that there is no Hilbert space; there is then no assignment of probabilities according to states, and as a result no Born rule, and so on. This then again gets into my issue with D = 1. It is also a reason dB-B fails in QFT; there is no ladder of states to predict the generation of particles.

The dB-B is not without its potential uses. It seems like a decent way to address quantum chaos, for one could embed Hamilton chaos in this perspective. The de Broglie-Bohm-Vigier approach to QM is simply very weak.

4. Lawrence, the lack of a Hilbert space is not necessarily a show stopper. In QFT there are examples where a Hilbert space does not exist (https://en.wikipedia.org/wiki/Haag%27s_theorem), and there are also examples of many inequivalent representations of the CCR. However, all those examples can be disregarded on physical grounds. In the case of Haag's theorem the IR difficulty arises because the photon is required to have undetectable, extremely large wavelengths. Use periodic boundary conditions and the problem is cured.

In QFT, dB-B theory has no natural place to define contextuality and the theory is over-extended into the invalid range.

Bohm's QM focuses on a particular classical-like channel. This emerges from a form of the wave function ψ = Re^{iS}, where the Schrodinger equation splits into a real Hamilton-Jacobi equation with a modification plus a continuity equation as the imaginary part. If I were to say what is wrong with this, it is that it fixes QM in a type of coordinate system that it does not obey. We could perform any transformation of the wave function ψ → O ψ, so that there exists an infinite number of possible Bohmian equations. In fact I worked out how this can lead to a type of path integral, where we ignore the need to consider one trajectory as real while all the other possible ones are “inactive.”

LC

6. That is very interesting. Do you have a paper on this?

7. I never published this. I am not sure if anyone else has done this. If I find the time maybe I can write it up.

LC

8. This could be something valuable or a dead end, it's hard to say now. However it is a new idea and is worth pursuing.

9. This comment has been removed by the author.

10. I will see if I have time to write this up. It is possible to think of the beable as a representation of a particular path, and there the modified Hamilton-Jacobi equation has this quantum potential that "tells us" how this path is influenced by the occurrence of these other paths. One can then think of there existing a whole class of these paths related to each other by unitary-symplectic transformations. One then arrives at a form of path integral.

I will have to rederive this. As I recall this form of the path integral did not have quite as bad a form of "bugger factor" that the standard path integral has. I am though not so sure this is a particularly major result.

The dB-B approach to QM is not as big a deal as Bohmians seem to think. On the other hand it is not a complete train wreck either. To be honest it seems like a possible approach to looking at quantum chaos. I am not that enamored with quantum interpretations and I think they are all failures in one form or the other.

11. "I am not that enamored with quantum interpretations and I think they are all failures in one form or the other."
Indeed, and that is why I am developing my own using my approach to reconstruct it from physical principles.

12. I have a question with this. I have done a lot of work in the last couple of months on how the theory of BMS supertranslations is equivalent to aspects of the qubit entanglement equivalency with black holes. If you read Duff, Borsten et al you see how they present black hole physics according to the BPS formalism and extremal black holes. Non-extremal black holes are timelike or elliptic, extremal black holes are null or parabolic, and if they were to exist (as is doubtful) naked singularities are hyperbolic. My question is then whether this is a spacetime version of your approach to QM with elliptic composability.

Spacetime appears to be an emergent bulk property of quantum fields or entangled states that define the boundary. It would seem plausible that the elliptic, parabolic and hyperbolic conditions on quantum states translate into the same for black holes in the bulk.

LC

13. Lawrence, most likely it has nothing to do with space-time, but I keep an open mind. The short answer is that I don't know at this point. See my next week's post where I go in depth into the math for the hyperbolic case and compare it with the elliptic case. Hyperbolic vs. elliptic boils down to functional analysis.

2. Dear Lubos, if you take the Schrodinger equation and decompose it into real and imaginary parts, bingo, you have the equations of dB-B. But the main problem is this cleanup of the guiding wave after measurement: without it you get incorrect predictions. Even worse, if you dig deeper you get a mathematical inconsistency. But this "cleanup" is simply impossible if the particle has a well defined position.

Supporters of this interpretation claim "contextuality" to prevent generating inconsistencies, but in field theory the contextuality trick does not work because there is no clear cut where it should take place (and I think I can prove dB-B inconsistent with QFT using algebraic topology arguments).

The Earth is round, but to us it looks more or less flat. Locally you can approximate a sphere by its tangent plane at a point, but a plane is not a sphere. Now suppose people on Earth say something like: as far as I can see (within my context) the Earth is flat (decompose the Schrodinger equation into real and imaginary parts and interpret it as dB-B). I travel to a different point on Earth (I perform a different experiment); the Earth still looks flat. Can I now conclude the Earth is flat (QM is realistic)? Nope.

On measurement, dB-B has this huge appeal: the particle has a well defined position and there is a well defined trajectory and so there is no mystery. But this is only a sleight of hand. The measurement problem in dB-B is to explain how the quantum potential changes when a person looks at the particle. dB-B is not at all observer independent contrary to the claims of its supporters.

1. Florin, first of all, a mathematical detail, you don't get dB-B by decomposing the wave function into the real and imaginary part. You have to decompose it to the radial and angular part, the absolute value and the argument.

But more importantly, it doesn't matter how you reparameterize the mathematical variables. All parameterizations are equivalent as long as they're equivalent. What matters is what is their interpretation and what equations they obey.

I agree that the "appeal" is a sleight of hand and hasn't done anything like "the decisive step" needed to eliminate the observer - assuming that I understood your comment with its hidden meanings, too. More generally, the whole project - the effort to eliminate the observer - is fundamentally flawed. The observer - the dependence of the claims of the laws of physics on an observer - cannot be eliminated.

2. Picky, picky, picky. Yes it is not a straight real-imaginary decomposition, first you use the polar form (see https://en.wikipedia.org/wiki/Quantum_potential or the text above) but the meaning of the analogy was clear: Bohmian interpretation is an incorrect flat-Earth point of view.

"The observer, the dependence of the claims of the laws of physics on an observer -cannot be eliminated." GRW is a counterexample (not that I believe in this either) but fortunately this does predict different things than QM and it can be put to the test. For now collapse theories keep tuning their parameters to avoid contradictions with all experimental evidence, but there are experiments in progress which in a few years will conclusively reject those kinds of QM modifications.

MWI is also incorrect as it suffers from two problems: the unique basis problem and the definition of probabilities.

Only Copenhagen-type explanations of QM have a chance of being correct, and this is the framework we should work in. There are several Copenhagen splits though: original Bohr, consistent histories, QBism, etc.

There is another issue: is QM unique? Are there any other possible consistent theories of nature besides QM and classical physics? I'll answer that in the next post.

3. "but fortunately this does predict different things than QM and it can be put to the test."

Fortunately? You just join the Popperazzi cult here. You present it as an advantage that a theory or idea makes some bold predictions. But you don't actually seem to be interested in the validity of the prediction.

The prediction is instantly falsified by observations. If the spontaneous collapse is sharp and frequent enough to be responsible for the well-defined positions of small objects, then it must violate the energy conservation law by too big amounts that would cause flashes that are observed not to exist.

So the theory is immediately dead which is a reason to use the word "unfortunately".

4. "The prediction is instantly falsified by observations". Not yet. I don't think you are up to date on GRW-type extensions. The original formulation was indeed proven incorrect, but not the recent variants (I think Angelo Bassi has review papers explaining what has been rejected so far by experimental evidence; he gave some talks on the issue). There are free parameters in the theory which can be tuned. There is an experiment in progress at Gran Sasso which (if all goes well) can reject the entire GRW class of QM extensions. My no-brainer prediction is that in the end QM will be proven "un-extensible" (almost all physicists hold this position).

I can rigorously (as a mathematical theorem) reject GRW based on the physics principle of the universality of QM, but ultimately the experiment is king in physics, and without experimental confirmation you cannot convince GRW supporters they are wrong.

On energy conservation, this is not a definite problem. The energy conservation issue can be traded for another problem, the existence of a "tail" due to incomplete collapse. If collapse is introduced in an ad hoc manner, who is to say the collapse has to be complete? A much more serious objection to me is the loss of unitarity to begin with due to the implications for quantum field theories.

5. I'm not sure that Lubos' point about dB-B completely destroys that interpretation, but it certainly makes it more complicated, and calls into question whether it really is a realistic interpretation. If you don't do the "cleanup" after detection, then that means that, even with dB-B, you don't have definite outcomes to experiments. So you end up with something more like the Many-Worlds Interpretation.

6. Daryl, I am not sure I am following what you say. There is no MWI in dB-B. Here you always get a definite outcome because everything is based on position (including the dials on measurement devices) and particles have a well defined position at every time. The "cleanup" is about resetting the quantum potential after measurement so that the guiding field continues to be in accordance with QM (preserving "quantum equilibrium").

7. It's a little complicated to spell out, but I'll try. The von Neumann interpretation of QM (which might be the same as Copenhagen, I'm not sure) has two kinds of evolution of the wave function. There is smooth evolution according to the Schrodinger equation. Then there is wave function "collapse" following an observation. I think that the "cleanup" for dB-B directly corresponds to "collapse" for von Neumann.

The insight that leads to MWI (whether you consider that a successful interpretation, or not) is that you can postpone collapse arbitrarily far into the future. (MWI is sort of the limit where you postpone it indefinitely). Instead of saying that the measuring act causes a collapse of the wave function, one may alternatively say that the measuring act causes the measuring device (which might be a human being) to become entangled with the system being measured. In this view, the use of the collapse can be seen as a pragmatic choice that avoids dealing with the wave function of a complex, macroscopic system such as a measuring device, and instead remains focused on small, simple systems such as electrons, photons, atoms, etc.

If one were to do dB-B, but instead of using a pilot wave for a single electron, one used a pilot wave for the ENTIRE composite system of electron + measuring device + human experimenter, then my claim is that the "cleanup" process could be postponed, in the same way that von Neumann's collapse can be postponed arbitrarily far into the future (the limiting case being MWI, in which there is no collapse at all).

8. I'm assuming that you are familiar with the general idea of postponing collapse. Suppose that you have an experiment in which an electron is produced with an unknown spin direction, and then its spin is measured by some device, and then the device results are checked by experimenter Bob, and then Bob reports the result to Alice, and then the pair of them make a presentation at a conference. Then, the von Neumann collapse interpretation gives you a number of points at which you can claim that a "collapse" takes place:

1. When the measuring device interacts with the electron.
2. When Bob looks at the result.
3. When Bob reports to Alice.
4. etc.

If you treat the electron as the system, then von Neumann says that you should have a collapse at stage 1. If you treat the electron and the measuring device as a system, then you can postpone a collapse until stage 2. If you treat the electron, the device, and Bob as the system, you can postpone the collapse until stage 3. Etc. These choices make essentially no difference after stage 1, because of the practical impossibility of observing macroscopic superpositions. So you might as well assume that the collapse happens at stage 1. But there is no necessity for doing that.
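The point that collapse can be placed at any stage without changing the statistics can be illustrated with a toy two-qubit calculation (my own sketch; the amplitudes are hypothetical examples, not from the discussion above):

```python
import numpy as np

# Toy von Neumann chain: electron spin a|0> + b|1>, device pointer starts in |0>.
# Illustrative amplitudes only; any normalized pair works.
a, b = 0.6, 0.8                      # a^2 + b^2 = 1

# Stage 1: collapse at the electron -- apply the Born rule directly.
p_stage1 = np.array([a**2, b**2])

# Stage 2: let the device record the spin unitarily first (CNOT-style),
#   (a|0> + b|1>)|0>  ->  a|00> + b|11>,
# then collapse on the device qubit and read its marginal statistics.
joint = np.array([a, 0.0, 0.0, b])   # basis order: |00>, |01>, |10>, |11>
p_stage2 = np.array([joint[0]**2 + joint[1]**2,   # device shows 0
                     joint[2]**2 + joint[3]**2])  # device shows 1

print(p_stage1, p_stage2)  # identical either way: collapse can be postponed
```

The same bookkeeping extends to Bob, Alice, etc.: each enlargement entangles one more subsystem unitarily, and the marginal statistics of the final readout never change.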

Similarly, there is no necessity in dB-B to consider only the wave function of the electron, as opposed to the wave function of the electron + device, or electron + device + Bob, etc. The larger the composite system you consider, the later you can postpone the need to do a "cleanup".

9. Daryl, sorry for the delay; it is only now that I have a bit of time to reply. Interesting approach, but in dB-B there is no collapse: the particle always exists and has a well defined position. And as I was saying, correlations of the position at different times differ from what standard QM demands because of that.

Now let's consider your postponement proposal. In each of your enlargements of the problem you are not changing the initial distribution of the particle, but you add additional distributions for the measurement device, Alice, Bob, etc. At some point you must stop this process; you cannot double down forever, because ultimately you cannot talk about the wavefunction of the universe, as that makes no sense. And when you do stop, you still have the same problem, because the original particle distribution is still not suitable for the second measurement.

3. Dear Florin,

“In the last two posts the merits of classical (realistic) description of Nature were discussed. The major disagreement was on the burden of proof. I contend it is the responsibility of any alternative explanation to prove it is better than quantum mechanics.”

Let’s take a look at what Wikipedia says about the burden of proof issue:

When two parties are in a discussion and one affirms a claim that the other disputes, the one who affirms has a burden of proof to justify or substantiate that claim - i.e., X is good/true/beautiful, etc. An argument from ignorance occurs when either a proposition is assumed to be true because it has not yet been proved false or a proposition is assumed to be false because it has not yet been proved true. This has the effect of shifting the burden of proof to the person criticizing the proposition, but is not valid reasoning.

Let’s see how this applies to our debate.

1. Bell’s theorem:

My claim: classical field theories imply determinism and correlations between all particles, therefore the measurement independence assumption (or free will if you want) fails. Therefore Bell’s theorem cannot rule out classical field theories.

Your claim: Bell’s theorem applies, because:

• classical field theories are not superdeterministic (did not get the Matrix analogy) – no evidence for that,
• classical field theories are not contextual (you have provided a link to probability axioms, which is irrelevant given that there are no probabilities in a deterministic theory).

As the Wikipedia quote describes, you have tried to shift the burden of proof by asking me to actually provide a classical theory predicting the EPR results.
You did not even try to answer my points arguing for the failure of measurement independence in classical field theories (superdeterminism was not part of my opening statement, and neither were Kolmogorov's theorems).

2. Double slit experiment:

“On Yves Couder's example, the position is completely classical obeying Bell's inequalities. As such Feynman's argument is still valid.”

1. Bell’s theorem is irrelevant for classical field theories (Yves Couder's experiments are about the interaction between the oil-drop particle with a classical field)

2. The experiments reproduce single-particle interference effects, proving Feynman wrong.

3. You have not addressed my original point, that in general, in a field theory you cannot sum up the probabilities as Feynman did for the “bullets” model.

Again, you tried to shift the burden of proof by observing that Yves Couder's experiments cannot reproduce nature. I made no such claim.

-To be continued, space exceeded

Andrei

4. Atomic stability and discrete energy levels

The evidence you provided against classical physics can only justify the claim that classical electromagnetism fails in the approximation that electrons do not have spin. You could not address the electron + spin example, and no argument is given that classical field theories in general cannot explain the atom.

The usual shifting the burden of proof follows:

“Based on the general information available it is immediately clear that that model has nothing to do with reality”

And again:

“create a computer simulation of say 1000 atoms and use Maxwell's and Newtonian equations of motion only to model the interaction. Then try to find an initial configuration which will be stable. I think there is none. Prove me wrong with such a model and I'll concede this point”

It is your job to justify your claims, not mine to prove them wrong.

5. Tunneling

“The point is that the argument needs to have more predictive power than a fuzzy non-committal: "A new theory could predict a much stronger force." Show me the money.”

Another textbook example of the same burden of proof shifting. It is not my job to explain tunneling with a classical theory. It is enough to point out that your examples do not justify your claims.

Now, let’s get to the new post:

“So if we have to consider all possible paths, the notion of the classical trajectory is simply doomed and there is no realistic quantum interpretation in the Lagrangian formalism. One down, two to go.”

You did not prove that “we have to consider all possible paths”, only that this could be one way to look at the problem. Such an explanation cannot be taken seriously because it implies either non-locality or an inconsistent many-worlds view. The proper way to look at the problem is that the particle’s trajectory is determined by the field associated with the barrier. The “all possible paths” are encoded in the field. You can even visualize how the particle moves by searching for Yves Couder's movies on YouTube.

In conclusion, given that the arguments against classical field theories fail, it makes little sense to force the “quantum mechanical way” on logic, or to assume QM to be fundamental and try to force gravity to deal with indeterminate positions of particles or with indeterminism in general. As far as we can see at this moment, QM is just a statistical theory standing upon a yet-to-be-discovered exact theory of nature. All experiments dealing with uncertain results should be relabeled from “non-explainable, fundamental uncertainty” to “in search of an explanation”.

Andrei

1. Dear Andrei,

Wikipedia also states:
"While certain kinds of arguments, such as logical syllogisms, require mathematical or strictly logical proofs, the standard for evidence to meet the burden of proof is usually determined by context and community standards"

There are two ways to establish truth: by logical arguments as a logical consequence from some axioms, or by agreement with experiment. After Galileo, physics is an experimental science and agreement with experiment trumps abstract logical arguments.

The community standard in physics is that the theory has to be tested against experiments. QM did pass all experimental tests to date. You simply cannot ask anything else from QM. So naturally the burden of proof is now on any competing paradigms to show they are at least just as good as QM.

Florin

2. "Such an explanation cannot be taken seriously because it implies either non-locality or an inconsistent many-worlds view."

I disagree with 4 things in this statement. First, this is taken very seriously in quantum field theory in the form of Feynman diagrams and because of it the most accurate predictions known to mankind were made and confirmed experimentally. Second, this does not imply nonlocality: you cannot send signals exceeding the speed of light. Third, it does not imply MWI. And fourth, MWI is not inconsistent. MWI has its own faults which reject it as unfit do describe what is going on in QM, but inconsistency is not one of them. A wavefunction can be decomposed in many basis. Why is MWI preferring only one base which arise out of decoherence? Why is MWI not splitting the world along a different basis before decoherence takes place?

3. Dear Florin,

"The community standard in physics is that the theory has to be tested against experiments."

Sure, but the debate was not about proposing new theories but to establish to what extent new theories are possible.

"QM did pass all experimental tests to date."

I have never claimed the contrary. What is the relevance of this in regards to the possibility of classical exact theories? It is perfectly possible to have a correct statistical description and an exact description of the same phenomenon. One description does not exclude the other.

"You simply cannot ask anything else from QM."

I cannot ask more because it is a statistical theory, just as I cannot ask thermodynamics to give the exact speed of a gas molecule. But QM is silent with regard to the exact experimental values. You just assume that probabilities are fundamental, but you have no proof of that. Science works by searching for explanations for the observed phenomena, not by postulating them to be unexplainable. This search for an explanation should only stop once a solid proof has been found against its existence.

If you make the claim that quantum probabilities cannot be explained it is your burden to justify it. All your arguments fall short of that.

"So naturally the burden of proof is now on any competing paradigms to show they are at least just as good as QM."

An exact theory is not a competing theory, but a theory with a different scope. And, again, the debate was not about the truth of such theories but about the possibility of their existence.

You have claimed:

1. Bell's theorem rules out classical field theories. You did not support that claim.
2. Classical field theories cannot explain atoms. You did not support that claim.
3. Classical field theories cannot explain single particle interference. You did not support that claim.
4. Classical field theories cannot explain tunneling. You did not support that claim.

Just like the Wikipedia quote says, you have appealed to an argument from ignorance. Let's see it again:

"An argument from ignorance occurs when either a proposition is assumed to be true because it has not yet been proved false or a proposition is assumed to be false because it has not yet been proved true."

You simply assume all the above claims to be true because I didn't prove them false (by presenting a well-defined theory). You continue to hold to those claims even though I provided counterexamples. So, please restrict your claims to what you can actually prove. What is the purpose of asserting that some theories are not possible when you really do not know that?

4. Dear Florin,

You replied to my claim:

"Such an explanation cannot be taken seriously because it implies either non-locality or an inconsistent many-worlds view."

"First, this is taken very seriously in quantum field theory in the form of Feynman diagrams and because of it the most accurate predictions known to mankind were made and confirmed experimentally."

As far as I can tell Feynman diagrams are not supposed to be an objective description of reality, only a calculation procedure.

"Second, this does not imply nonlocality: you cannot send signals exceeding the speed of light."

By this argument Bohm's theory is also local.

"Third, it does not imply MWI."

I did not say it implies it, I listed it as an alternative to the non-locality.

"And fourth, MWI is not inconsistent. MWI has its own faults which make it unfit to describe what is going on in QM, but inconsistency is not one of them. A wavefunction can be decomposed in many bases. Why does MWI prefer only the one basis which arises out of decoherence? Why does MWI not split the world along a different basis before decoherence takes place?"

It is inconsistent because one cannot make sense of the probabilities QM predicts when they are different from 0, 0.5 and 1. All possibilities are real, so how do you ascribe probabilities?
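As a toy illustration of this counting difficulty (my own sketch, not part of the original exchange), compare the Born-rule weights of a state with unequal amplitudes to naive equal-weight branch counting:

```python
import numpy as np

# A qubit state with unequal amplitudes: |psi> = sqrt(1/3)|0> + sqrt(2/3)|1>
amplitudes = np.array([np.sqrt(1 / 3), np.sqrt(2 / 3)])

# Born rule: the probability of each outcome is |amplitude|^2,
# giving weights 1/3 and 2/3 here.
born = np.abs(amplitudes) ** 2

# Naive branch counting: two branches, one per outcome, so each
# branch gets weight 1/2 regardless of its amplitude.
branch_counting = np.full(2, 1 / 2)

# The two assignments disagree whenever the amplitudes are unequal,
# which is where probabilities other than 0, 0.5 and 1 become a problem.
print(np.allclose(born, branch_counting))  # False
```

Equal amplitudes reproduce the 0.5 case where both rules agree; any other amplitudes force MWI to justify the Born weights by some extra argument.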

Andrei

5. "It is inconsistent because one cannot make sense of the probabilities" Indeed, and I found this to be a fault of MWI. However, this is not an inconsistency but an inability. An inconsistency is something like 0 = 1.

5. Dear Andrei,

You state: "Sure, but the debate was not about proposing new theories but to establish to what extent new theories are possible."

If it ain't broke, don't fix it. There is absolutely no point in replacing an extremely successful paradigm with something else. What is valuable, however, is to derive QM from physical principles. Then you do not argue with QM but with its principles. QM is derivable from the invariance of the laws of nature under composition. From this there are only three solutions, and experimental evidence singles out QM. Any non-quantum paradigm will necessarily violate this physical principle. QM may not be intuitive, but it is correct. Invariance under composition is in agreement with all experiments so far. Its violation would mean that simply adding new degrees of freedom to a physical system changes Planck's constant.

http://arxiv.org/abs/1508.01140

which is something I have been rather skeptical about.

LC

8. I just gave it a quick glance, and the original retrocausality idea is wrong. I know Ken though, and he does not produce bad papers, so I'll read it carefully. Maybe it is something along the lines of the transactional interpretation. I'll get back to you after I study the paper.

9. The paper is not completely correct. In Fig. 3, page 7, Erutan (nature spelled backwards) chooses an input and Ecila (Alice spelled backwards) chooses a polarization. The authors proceed to show that by controlling the polarization tau, Ecila (and hence Alice) cannot send signals to Bob. This is true, BUT Ecila can control more than the polarization. In particular, she can block Erutan's photon at pre-arranged intervals and thus send a signal to Bob. As such, the mirror image of Alice-Bob IS NOT EQUIVALENT to Alice-Bob.

There is a kernel of truth in the paper, though. See http://arxiv.org/pdf/quant-ph/0510032v1.pdf and take a look at the information flow, which sometimes goes backwards in time. The authors simply rediscovered a small piece of the pictorial formalism. But the way the authors describe it is wrong: "the explanation shows how entanglement might be a much simpler matter than the orthodox view assumes – not a puzzling feature of quantum reality itself, but an entirely unpuzzling feature of our knowledge of reality, once zigzags are in play." There are no zigzags (retrocausality) in nature, and the problems of retrocausality are not prevented the way the paper describes. All you have is a formal equivalence (similar to CPT = 1) provable in the pictorial formalism.

In conclusion, what is correct is not new, and what is new is not correct.

1. My argument against this, where I find being bothered with it actually annoying, is that information going back in time is really negative information going back in time. Since time is the parameter whose symmetry conserves energy, and E = ST with temperature T = ħ/(kt), where t is a Euclideanized time, then t → -t is the same as having negative temperature. So we can think of this as a sort of negative information (anti-information). An antiparticle thus transmits anti-information into the past, which is entirely the same as saying you have information propagating into the future.

LC

2. Florin, I think that you are making a criticism that goes beyond the scope of the article. They are not proposing a complete, time-symmetric view of all physics; they are only showing how a specific experiment, the EPR experiment with photons, can be understood in a time-symmetric way. In the EPR experiment, the only choice made by Alice is the measurement angle. You are violating the time symmetry by allowing Ecila to do other things, such as blocking Erutan's photons.

If you want to allow such a possibility for Ecila, you need to figure out what the corresponding time-reversed operation for Alice would be. I think that's very complicated (for me, anyway), because the absorption of a photon involves an irreversible process, and the time-reversed version of an irreversible process looks like a conspiracy (glass fragments coming together to make a perfect glass goblet, for instance). The whole point of Huw's choice of the polarization cubes is that understanding their behavior relies only on simple, time-reversible physics.

3. Daryl, I am not criticizing the paper on its result. That is correct and is known in the pictorial formalism (what is correct is not new). I am criticizing the interpretation of the result as it is presented in the abstract (what is new is not correct).

When you introduce a physical effect (the zigzag) to explain something, you should test its implications in new settings. That is what I was doing when I talked about blocking the photons. This is standard physics practice. For example, Einstein proposed general relativity and showed that it correctly predicts Mercury's orbit. Then he applied this new theory in a new setting and predicted the bending of light before it was observed.

If you insist on using the zigzag explanation only in the way it is used in the paper, then I can claim in similar fashion that electrons do not exhibit wave behavior because I am looking only at, say, the photoelectric effect and do not consider the interference experiments on crystals. Sure, the electrons behave like particles in the photoelectric effect, but I should not claim QM is not puzzling because of that. That would be a fake explanation for electrons, just as I do not agree with: "an entirely unpuzzling feature of our knowledge of reality, once zigzags are in play"

10. Florin, one comment about the claim that Born's rule is incompatible with an arbitrary initial probability density: I assume that's for pure states. Mixed states (which combine classical and quantum probability) allow for more possibilities.

1. Daryl, the initial statistical distribution of the particle's position can be anything you want if the initial conditions are completely free. However, to have agreement with QM, it needs to be |psi|^2. dB-B theory is about pure states.
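A minimal numerical sketch of this point (my own illustration, assuming a Gaussian ground-state-like wavefunction of my own choosing): positions distributed as |psi|^2 reproduce the Born-rule statistics, while any other "free" initial density, such as a uniform one, does not.

```python
import numpy as np

# Grid and a Gaussian wavefunction psi(x) ~ exp(-x^2/4), so |psi|^2
# is the standard normal density exp(-x^2/2)/sqrt(2*pi) after normalization.
x = np.linspace(-6, 6, 2001)
dx = x[1] - x[0]
psi = np.exp(-x**2 / 4)
psi /= np.sqrt(np.sum(psi**2) * dx)  # normalize: sum |psi|^2 dx = 1

# Quantum equilibrium: initial positions distributed as |psi|^2
rho_equilibrium = psi**2

# A "free" non-equilibrium choice: uniform density on the same interval
rho_uniform = np.full_like(x, 1.0 / (x[-1] - x[0]))

# Probability of finding the particle in [0, 1] under each density
mask = (x >= 0) & (x <= 1)
p_born = np.sum(rho_equilibrium[mask]) * dx     # ~0.34, the Born-rule value
p_uniform = np.sum(rho_uniform[mask]) * dx      # ~0.08, disagreeing with QM
print(p_born, p_uniform)
```

Only the |psi|^2 density matches the quantum prediction, which is why dB-B must postulate (or dynamically justify) quantum equilibrium rather than leave the initial distribution free.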