Friday, October 30, 2015

Why is there something rather than nothing?


I recently discovered a very nice talk by Lawrence Krauss entitled "A Universe from Nothing". 




The actual talk starts at 12:48, and discusses a topic which used to belong to theologians and philosophers; but as the gaps in our understanding shrink, science is now able to provide a plausible (but not yet definitive) explanation of how it all began. Even if you follow topics in astronomy, I think it is still awe-inspiring to contemplate our place in the universe and realize that if you magically removed all visible matter, this would hardly change anything in the evolution of our universe. Trillions of years in the future, the accelerated expansion of the universe will have made all other galaxies disappear from our view, and we will not have any experimental means to test the ideas of the Big Bang and inflation. All of this would only be hearsay from the distant past (our time now) if we somehow manage to survive that long, which is highly doubtful. 

So why is there something rather than nothing? Simply put, "nothing" is unstable because of the combination of relativity and quantum mechanics. In quantum mechanics we have the Heisenberg uncertainty principle, which shows that if we measure the position very precisely, the momentum must have a very large uncertainty. But this uncertainty would translate into very large speeds (even higher than the speed of light), and the only way out is to create particle pairs. Hence the vacuum is filled with virtual particles which pop in and out of existence briefly enough to obey the other Heisenberg uncertainty relation: the energy-time relation. But when you add gravity to the mix, this changes everything and the virtual particles become very real. And what would the total energy of such a system be? Precisely zero. Then a natural question to ask is: what is the total energy of our universe? It turns out that this can actually be measured (we live in a special time when this is possible), and surprise, it is zero!

I won't spoil the video with additional information from it; please watch it, it is very nice. Instead I will focus on a part of it with which I disagree: the role of the question "why?"

One common mistake people make is to think that correlation implies causation. This mistake is very easy to make because of our day-to-day experience. Krauss is not making this mistake, but another one which is also rooted in daily experience: that "why?" implies purpose. I will attempt to argue against this position and show that "why?" is a scientifically valid question which leads to genuine scientific answers.

Let me start with the concept of truth. In mathematics a statement is true if we can construct a proof starting from an axiomatic system. Now this concept of truth is not universal, as Gödel showed. If your axiomatic system is rich enough to encompass arithmetic, Gödel showed that any such system is incomplete and there are statements which can be neither proved nor disproved. If we augment the original axiomatic system with such a statement, we create an enlarged axiomatic system. However, we can also enlarge the axiomatic system with the negation of the statement and obtain another enlarged axiomatic system. Now the two enlarged axiomatic systems are incompatible with each other, and moreover, the process can be repeated. What this shows is that mathematics is infinitely rich and not axiomatizable. And so the concept of truth is parochial in mathematics.

But there is a second notion of truth which comes from nature: something is true if it is in agreement with reality. Physics uses this notion of truth because physics is an experimental science. Then a natural problem is to compare and contrast the two notions: the mathematical and the physical one.

When we answer "how?", we use the mathematical concept (and the "unreasonable effectiveness of mathematics"), but when we answer "why?" we use the physical approach. 

Let me illustrate with quantum mechanics. On the mathematical and how side, one starts with the usual axioms: a Hilbert space, observables are Hermitean operators, etc. However, on the physical and the why side, one can start from a physical principle: the invariance of the laws of nature under composition.

In the how, mathematical side, you build mathematical proofs, but on the why, physical principle side, you select distinguished mathematical structures from the infinite collection of the Platonic world of math which respect the physical principles. All mathematical structures are unique, but only a handful are "distinguished" and used by nature. "Why?" is a "distinguish-ability" question. Nature does not use hyperbolic composability for example because this violates another physical principle.

So why quantum mechanics, why special relativity, why general relativity? Because of the physical principles of invariance under composition, the physical principle of relativity, the physical principle of the equivalence between inertial and gravitational mass. 

And where do the physical principles come from? Here I have a suggestion: they essentially encode the difference between the real world and the abstract world of math. Take hyperbolic quantum mechanics. It almost had a chance to describe something real, but it is a mathematical impossibility to construct a state space, and hence to have an objective way to assign truth and make experimentally testable predictions. Positivity is a property of the physical world; the universal notion of truth it supports is forbidden in the mathematical world by Gödel's theorem.

So the question "why?" is actually very meaningful, and moreover it can have mathematical consequences which can be put to experimental tests. In physics "why?" does not imply purpose but "distinguish-ability" of the mathematical structures which play a key role in the physical world. The reason Krauss disagrees with it has to do with the abuse of the question by theologians who answer "why?" with "because God". Unlike Europe, the US is a very religious place where science is under constant attack by religious bigotry that enjoys significant political power (the video's references to Arkansas and Ohio were criticizing attempts to teach "intelligent design" in public schools as an alternative to evolution, dismissed as only "a theory": https://en.wikipedia.org/wiki/McLean_v._Arkansas https://en.wikipedia.org/wiki/Intelligent_design_in_politics).

I agree that theologians are "experts of nothing", but philosophy should not be lumped together with a field which presumes the answer before asking the question. However, philosophy does not create new knowledge; it only provides an interpretation of what science discovers. The physical principles behind special relativity and quantum mechanics were not uncovered by philosophical contemplation, but by solving concrete physical problems. It is also true that changing paradigms is not at all easy even when you have solved the concrete problem. Case in point: Lorentz discovered his transformations, but the proper paradigm was discovered by Einstein.

Friday, October 23, 2015

Hyperbolic Quantum Mechanics


In a recent post Lubos Motl stated:

     There only exist two types of theories in all of physics:
    
          1. Classical physics
          2. Quantum mechanics
          3. There is simply no third way.

     I added the third option in order to emphasize that it doesn't exist.

This is correct (gee, I am agreeing with Lubos, is everything all right?), but (as advertised last time) today I want to dig deeper into option 3 and prove why this is so. In the process we will better understand quantum mechanics and explore a brand new (and unfortunately sterile) landscape of functional analysis.  

Quantum mechanics shares with classical physics a key property: the laws of nature do not change when we consider additional degrees of freedom. For example, the tensor product of two Hilbert spaces is still a Hilbert space, and the composed system does not become classical, quantum-classical, or something else. This invariance of the laws of nature under composition is best expressed in the category formalism, and there we have three classes of solutions:

  1. Elliptic composition (quantum mechanics)
  2. Parabolic composition (classical physics)
  3. Hyperbolic composition (a hypothetical hyperbolic quantum mechanics)

For quantum mechanics, von Neumann unified Heisenberg's matrix formulation and Schrodinger's wave approach in the Hilbert space formalism, and we will see that this functional analysis has a categorical origin and that there is a mirror categorical formalism for hyperbolic quantum mechanics as well. We will see that the hyperbolic case is unphysical for obvious reasons, and that option 3, while nice as a mathematical curiosity, has no physical usefulness. But it is helpful for better understanding the Hilbert space formulation of quantum mechanics.
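The composition property mentioned above can be sanity-checked numerically. Here is a minimal sketch (my own illustration, not part of the original argument, using randomly generated states) showing that the Kronecker product of two density matrices is again a valid density matrix, so composing two quantum systems never leaves the quantum formalism:

```python
# Sanity check of "invariance under composition": the Kronecker product
# of two density matrices has unit trace and is positive semidefinite,
# i.e. it is again a valid quantum state.
import numpy as np

rng = np.random.default_rng(42)

def random_density_matrix(n):
    m = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    rho = m @ m.conj().T            # positive semidefinite by construction
    return rho / np.trace(rho)      # normalize to unit trace

rho_a, rho_b = random_density_matrix(2), random_density_matrix(3)
rho_ab = np.kron(rho_a, rho_b)      # state of the composed system

assert np.isclose(np.trace(rho_ab).real, 1.0)             # still unit trace
assert np.all(np.linalg.eigvalsh(rho_ab) > -1e-12)        # still positive
```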

Let me start with some elementary mathematical preliminaries. In quantum mechanics one uses complex numbers, and they originate from the fundamental composition relationships of the two quantum mechanical products: the commutator and the Jordan product. In hyperbolic quantum mechanics one uses split-complex numbers.



In split-complex numbers the imaginary unit is \(j\), with \(j^2 = +1\) but \(j \ne 1\). In the matrix representation, \(j\) is the 2x2 matrix with zeros on the diagonal and 1s off the diagonal. Split-complex numbers have a polar representation similar to that of the complex numbers, but the role of the sines and cosines is played by hyperbolic sines and hyperbolic cosines. Very importantly, the fundamental theorem of algebra does not hold for split-complex numbers, and this has very important physical consequences for hyperbolic quantum mechanics.
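Split-complex arithmetic is simple enough to sketch directly. The following toy class (my own illustration; the names and API are not from any standard library) shows \(j^2 = +1\) and the hyperbolic polar form:

```python
# A minimal sketch of split-complex (hyperbolic) number arithmetic.
import math

class SplitComplex:
    def __init__(self, a, b):
        self.a, self.b = a, b          # z = a + b*j, with j^2 = +1

    def __mul__(self, other):
        # (a + bj)(c + dj) = (ac + bd) + (ad + bc) j, since j^2 = +1
        return SplitComplex(self.a * other.a + self.b * other.b,
                            self.a * other.b + self.b * other.a)

    def modulus_squared(self):
        # the Minkowski-style quadratic form a^2 - b^2 (can be negative!)
        return self.a**2 - self.b**2

j = SplitComplex(0, 1)
assert (j * j).a == 1 and (j * j).b == 0   # j^2 = +1, and yet j != 1

# Hyperbolic polar form: for a > |b|, z = r (cosh t + j sinh t)
z = SplitComplex(math.cosh(0.7), math.sinh(0.7))
assert abs(z.modulus_squared() - 1.0) < 1e-12   # cosh^2 t - sinh^2 t = 1
```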

Formally, hyperbolic quantum mechanics is obtained by replacing \(\sqrt{-1}\) with \(j\) in the commutation relations. However, there are no Hilbert spaces in hyperbolic quantum mechanics!!!

So let us see the corresponding formulation of hyperbolic quantum mechanics.

Starting with de Broglie's ideas: in a hypothetical universe obeying hyperbolic quantum mechanics one would attach to a particle not a wave, but a scale transformation \(e^{jkr}\), and the scale transformation carries a given momentum in accordance with the usual de Broglie relation (in hyperbolic quantum mechanics one still has the same Planck constant): \(p = \hbar k\)

Continuing with matrix mechanics, Heisenberg's approach carries forward identically, but here we hit the first roadblock: the matrices cannot always be diagonalized, because the fundamental theorem of algebra does not hold in this case.
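To see the failure concretely: the polynomial \(z^2 + 1\) has no split-complex root, since \((a+bj)^2 = (a^2+b^2) + 2ab\,j\) and \(a^2+b^2 = -1\) is impossible over the reals. So a matrix whose characteristic polynomial is \(\lambda^2+1\) (e.g. a 90-degree rotation) has no split-complex eigenvalues at all. A brute-force sketch (my illustration, not from the post):

```python
# The residual |Re(z^2) + 1| + |Im_j(z^2)| can never drop below 1 for
# any split-complex z = a + bj, because Re(z^2) = a^2 + b^2 >= 0.
import itertools

def square(a, b):
    # (a + bj)^2 with j^2 = +1
    return (a * a + b * b, 2 * a * b)

grid = [x / 10 for x in range(-50, 51)]
best = min(abs(ra + 1) + abs(rb)
           for a, b in itertools.product(grid, repeat=2)
           for ra, rb in [square(a, b)])
assert abs(best - 1.0) < 1e-12   # no candidate comes close to z^2 = -1
```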

Continuing with the Schrodinger equation, its hyperbolic analog is:

\(+\frac{\hbar^2}{2m}\frac{d^2 \psi(x)}{d x^2} + V(x) \psi (x) = E \psi (x) \)

To really understand what is going on in the hyperbolic case, we need to investigate the functional analysis of split-complex numbers. The best starting point is metric spaces, not the more abstract setting of topology. Key in a metric space is the triangle inequality. If you follow any standard functional analysis book, you will see that all metric spaces over the complex numbers ultimately prove their triangle inequality starting from the triangle inequality of the complex numbers. Moreover, this arises out of the trigonometric identity:

\(\cos^2 x + \sin^2 x = 1\)

But in the hyperbolic case, one has this identity:

\(\cosh^2 x - \sinh^2 x = 1\)

and in each of the four quadrants of the split-complex plane, a reversed triangle inequality holds. If you cross the diagonals all bets are off, but it turns out that one can successfully introduce a vast functional analysis landscape for split-complex numbers, just as rich as the usual functional analysis. The reason for this is that ultimately the fundamental Hahn-Banach theorem holds in the split-complex case as well. To coin a name for this mirror analysis of the split-complex numbers, we will use the prefix "para". As such, in the hyperbolic case we will have a para-Hilbert space, which is a very different beast from the usual Hilbert space.
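The reversed triangle inequality inside one quadrant can be checked numerically. A sketch (my own illustration; it samples split-complex numbers in the quadrant \(a > |b|\) and uses the hyperbolic "norm" \(\rho(z) = \sqrt{a^2 - b^2}\)):

```python
# Reversed triangle inequality within one quadrant of the split-complex
# plane: rho(z + w) >= rho(z) + rho(w) when z and w both satisfy a > |b|.
import math, random

def rho(a, b):
    # hyperbolic "norm", well defined inside the quadrant a > |b|
    return math.sqrt(a * a - b * b)

random.seed(0)
for _ in range(1000):
    b1, b2 = random.uniform(-1, 1), random.uniform(-1, 1)
    a1 = abs(b1) + random.uniform(0.01, 2)   # force a1 > |b1|
    a2 = abs(b2) + random.uniform(0.01, 2)   # force a2 > |b2|
    assert rho(a1 + a2, b1 + b2) >= rho(a1, b1) + rho(a2, b2) - 1e-12
```

This is the same reversal familiar from Minkowski geometry, where the "twin paradox" inequality holds for timelike vectors in a common cone.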

There is a translation dictionary for all definitions and proofs in para-analysis:


  Elliptic               Hyperbolic
  Triangle inequality    Reversed triangle inequality
  Sup                    Inf
  Bounded                Unbounded
  Complete               Incomplete

A sequence \(x_n\) in a metric space \(X = (X,d)\) is said to be para-Cauchy if for every \(\epsilon \gt 0\) there is an \(N = N(\epsilon)\) such that \(d(x_m,x_n) \gt \epsilon \) for every \(m,n\gt N\). The space \(X\) is said to be para-incomplete if every para-Cauchy sequence in \(X\) diverges.

A para-Hilbert space is an indefinite para-inner product space which is para-incomplete.

The para-Hilbert space is non-Hausdorff, but the key showstopper is that given a point \(x\) and a convex set (which would correspond to a state space), there is no unique perpendicular from the point to the set. As such, a hyperbolic GNS construction is not possible! This means that we lack positivity, and hence we cannot define a physical state.

Invariance under composition admits three solutions: elliptic, parabolic, or hyperbolic. We can define a state only in the elliptic and parabolic cases (quantum and classical physics). By overwhelming experimental evidence, nature is quantum mechanical, no exceptions allowed.

Friday, October 16, 2015

Is quantum mechanics unique?


Quantum mechanics describes nature perfectly, and no experiment has ever detected violations of quantum mechanical predictions. Last time I showed that there is no realistic interpretation possible for quantum mechanics, and this fact flies in the face of our classical intuition. But this intuition was developed as part of the evolution of species on Earth (a lion chasing a gazelle need not solve Schrodinger's equation) and is simply irrelevant to modern science.

If quantum mechanics is not a realist theory, then perhaps it could one day be superseded by a realistic theory, just as special relativity replaced Newtonian ideas of absolute space and time, and electromagnetism turned out to be part of the larger electroweak theory, etc. When we look at answering uniqueness questions, there are two approaches possible.

First, you can prove no-go theorems. Bell famously said that no-go theorems only prove the author's lack of imagination. When you hold a contrarian paradigm dear to your heart, no-go theorems will never convince you to change your point of view. I know this first-hand from arguing with people who think they can beat Bell's theorem in a locally realistic computer simulation, although that is a mathematical impossibility.

Then there is a second approach: derive a physical theory from physical principles. The special theory of relativity has far fewer challengers today than quantum mechanics because it is much harder to argue with the principle of relativity. Can quantum mechanics be derived from physical principles? The answer is yes; the physical principle is the invariance of the laws of nature under composition:

If system A is described by quantum mechanics, and system B is described by quantum mechanics, then the composed system is described by quantum mechanics as well.

Physically this means that the Planck constant does not change when we add additional degrees of freedom. Mathematically, quantum mechanics then follows from category theory arguments. But what other theories obey this invariance-under-composition principle?

It turns out that there are 3 such solutions possible:
  • elliptic composition
  • parabolic composition
  • hyperbolic composition
Elliptic composition is quantum mechanics (and this is why this blog is called elliptic composability), parabolic composition is classical mechanics, but what is this hyperbolic case?

Formally, hyperbolic composability is nothing but quantum mechanics with \(\sqrt{-1}\) replaced by \(\sqrt{+1}\) and the resulting theory is known as hyperbolic quantum mechanics.

The first thing one notices is that in hyperbolic quantum mechanics one continues to add amplitudes just like in ordinary quantum mechanics, and therefore this is not a realistic theory either. As such, realism is dead for good. But can this theory describe anything in nature? Nope. Nevertheless, I'll explore the mathematics of this theory in the next post, because its most valuable aspect is to act as a comparison backdrop for quantum mechanics and illuminate various properties of it.

I will not dig into the mathematical aspects today; instead I want to discuss the meaning of lost realism in physics. We have seen that quantum mechanics is a probabilistic, not a deterministic, theory. The wavefunction cannot have an ontic interpretation for two main reasons: it does not carry energy or momentum, and it has several distinct representations. But perhaps realism is saved by an epistemic interpretation: what if the experiment simply reveals pre-existing values? This hope was kept alive by various classical toy models, but they were put to rest by the PBR theorem. So realism is not an option anymore, but is there a real world independent of us? Do we have to resort to solipsism, or even worse to some sort of quantum cargo-cult religious new-age babble based on the discredited ideas of Stapp and the crackpot new-age guru Deepak Chopra? Does the observer play an active role in quantum mechanics?

Let's see what the math shows. Quantum mechanics reconstruction uses category theory, and category theory is known as "objects with arrows". The nature of the objects is completely irrelevant, all that matters are the arrows which represent the relationships. Originally category theory was introduced to map the similarities between topological and algebraic objects, and the higher abstraction of categorical proofs allowed the extraction of common behaviors in very different mathematical domains. Because quantum mechanics reconstruction is categorical in nature, this derivation is blind to any hypothetical underlying quantum ontology. The "true meaning" of the wavefunction simply does not matter. The question: what does quantum mechanics describe? is not a testable, meaningful scientific question.

But what about the observer's role? The observer does not cause the outcome. If that were true, you could use quantum mechanics to send signals faster than the speed of light. It is not the observer who is important, but the configuration of the measurement device. This is because non-commuting observables cannot be measured simultaneously. The observer only plays an indirect role in deciding what and how to measure. Here Andrei can argue along the lines of "free will is an illusion": the observer is part of nature, subject to quantum (or hypothetical sub-quantum deterministic) laws as well. However, as an emergent phenomenon, human consciousness is fundamentally different. Why? Because in quantum mechanics information is conserved, but people are born and later on die, and there is no information conservation for the soul. Free will is a manifestation of this lack of information conservation.

To date, the best correct interpretation of quantum mechanics available is QBism. However, I am not completely satisfied with it. If QBism is Copenhagen done right, I am working on "QBism done right" :) but more on this in future posts.


Friday, October 9, 2015

Is there a realistic interpretation of Quantum Mechanics?

(a critical analysis of the Bohmian mechanics)

In the last two posts the merits of classical (realistic) description of Nature were discussed. The major disagreement was on the burden of proof. I contend it is the responsibility of any alternative explanation to prove it is better than quantum mechanics. Given the overwhelming and irrefutable evidence for the applicability of quantum mechanics in describing nature, this is simply impossible, but is there a realistic interpretation of quantum mechanics?

When discussing physics, there are three possible mathematical frameworks to choose from:
  • Lagrangian formalism in the tangent bundle,
  • Hamiltonian formalism in the cotangent bundle,
  • formalism in the configuration space.
For the Lagrangian formalism, I cannot do any better than Zee in Chapter I.2 of his "Quantum Field Theory in a nutshell":

"Long ago, in a quantum mechanics class, the professor droned on and on about the double-slit experiment, giving the standard treatment.[...] The amplitude for detection is given by a fundamental postulate of quantum mechanics, the superposition principle, as the sum of the amplitudes for the particle to propagate from the source S through the hole A1 and then onward to the point O and the amplitude for the particle to propagate from the source S through the hole A2 and then onward to the point O.

Suddenly, a very bright student, let us call him Feynman, asked, "Professor, what if we drill a third hole in the screen?" The professor replied, "Clearly, the amplitude for the particle to be detected at the point O is now given by the sum of three amplitudes [...]."

The professor was just about ready to continue when Feynman interjected again, "What if I drill a fourth and a fifth hole in the screen?" Now the professor is visibly losing his patience: "All right wise guy, I think it is obvious to the whole class that we just sum over all the holes."

But Feynman persisted, "What if I now add another screen with some holes drilled into it?" The professor was really losing his patience: "Look, can't you see that you just take the amplitude to go from the source S to the hole Ai in the first screen, then to the hole Bj in the second screen, then to the detector at O, and then sum over all i and j?"

Feynman continued to pester, "What if I put in a third screen, a fourth screen, eh? What if I put in a screen and drill an infinite number of holes in it so that the screen is no longer there?" [...]

What Feynman showed is even when there is just empty space between the source and the detector, the amplitude for the particle to propagate from the source to the detector is the sum of the amplitudes for the particle to go through each of one of the holes in each one of the (nonexistent) screens. In other words, we have to sum over the amplitude for the particle to propagate from the source to the detector following all possible paths between the source and the detector."

So if we have to consider all possible paths, the notion of the classical trajectory is simply doomed and there is no realistic quantum interpretation in the Lagrangian formalism. One down, two to go.

We will make a quick pass over the phase space formalism for quantum mechanics. This is very easy: Wigner functions contain negative probability parts, and hence there is no realistic quantum interpretation in the Hamiltonian formalism either. Two down, one to go.

However, the configuration space formalism is the tricky one. Here one encounters the Hamilton-Jacobi and the Schrodinger equations.

Let us start from classical physics. Consider 1-d motion in a potential V. The Hamilton-Jacobi equation reads:

\(\frac{\partial S}{\partial t} + \frac{1}{2m}{(\frac{\partial S}{\partial x})}^2 + V(x) = 0\)
and
\(p = \frac{\partial S}{\partial x}\)

If \(V=0\), we consider \(S = W-Et\), from which one trivially obtains:

\(p = \frac{\partial W}{\partial x}=\sqrt{2mE}\). If the particle is at \(x_0\) at the moment \(t_0\), then the particle motion is, unsurprisingly:
\(x-x_0 = \sqrt{\frac{2E}{m}}(t-t_0)\)
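This little derivation can be verified symbolically. A sketch using sympy (my own check, not part of the post):

```python
# Verify that S = W - E t with W = sqrt(2 m E) x solves the free (V = 0)
# Hamilton-Jacobi equation and yields a constant momentum p = sqrt(2 m E).
import sympy as sp

x, t, m, E = sp.symbols('x t m E', positive=True)
S = sp.sqrt(2 * m * E) * x - E * t            # S = W - E t

hj = sp.diff(S, t) + sp.diff(S, x)**2 / (2 * m)   # LHS of the HJ equation
assert sp.simplify(hj) == 0                   # it vanishes identically

p = sp.diff(S, x)                             # p = dS/dx
assert sp.simplify(p - sp.sqrt(2 * m * E)) == 0
# and v = p / m = sqrt(2E/m) reproduces x - x0 = sqrt(2E/m) (t - t0)
```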

The key point is that we need the initial particle position to solve the equation of motion.

So what would happen when we replace the Hamilton-Jacobi equation with its quantum counterpart, the Schrodinger equation:

\(-i\hbar\frac{\partial \psi}{\partial t} - \frac{\hbar^2}{2m}\frac{\partial^2 \psi}{\partial x^2} +V(x)\psi = 0\) ?
If
\(\psi=\sqrt{\rho}\, e^{iS/\hbar}\)
then we get the Hamilton-Jacobi equation for S, but with an additional term called the quantum potential, plus a continuity equation for \(\rho\). Welcome to the Bohmian formulation of quantum mechanics!
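The quantum potential is \(Q = -\frac{\hbar^2}{2m}\frac{\partial^2 \sqrt{\rho}/\partial x^2}{\sqrt{\rho}}\). As a concrete sketch (my own example; the Gaussian density is purely illustrative), we can compute it for \(\rho = e^{-x^2/\sigma^2}\):

```python
# Quantum potential Q = -(hbar^2 / 2m) (sqrt(rho))'' / sqrt(rho)
# evaluated for an illustrative unnormalized Gaussian density.
import sympy as sp

x, sigma, hbar, m = sp.symbols('x sigma hbar m', positive=True)
R = sp.exp(-x**2 / (2 * sigma**2))            # R = sqrt(rho)
Q = -hbar**2 / (2 * m) * sp.diff(R, x, 2) / R # quantum potential

# By hand: R''/R = x^2/sigma^4 - 1/sigma^2
Q_expected = -hbar**2 / (2 * m) * (x**2 / sigma**4 - 1 / sigma**2)
assert sp.simplify(Q - Q_expected) == 0
```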

To solve the equation of motion we again need an initial condition, just like in the classical case. But this should raise big red flags!!! In school we were taught that quantum mechanics is probabilistic, not deterministic, and also that the wavefunction is all that is needed to make predictions. How is this possible? What is going on here? 

To make quantum mechanical predictions one needs an additional ingredient: the Born rule. It turns out that in Bohmian quantum mechanics the Born rule constrains the allowed distribution of initial conditions, and this is the beginning of the end for the realism claim of this interpretation.


Max Born

But where does the Born rule come from? With the advantage of about 100 years of quantum mechanics, we now have two nice answers. The wavefunction lives in a Hilbert space, and there we have Gleason's theorem, which basically mandates the Born rule as the only logical possibility to make sense of the lattice of projection operators. But this is an abstract mathematical take on the problem. There is also an excellent physical explanation given by (surprise...) Born himself: http://www.ymambrini.com/My_World/History_files/Born_1.pdf I will not explain Born's paper because it is very well written and very easy to understand today, but the main point is that Born's rule is incompatible with an arbitrary initial probability density. The supporters of Bohmian mechanics are well aware of this, and they call the consistent initial probability density "quantum equilibrium". Moreover, they point out that after some relaxation time an arbitrary probability density "reaches quantum equilibrium", and so any discrepancy of predictions between Bohmian quantum mechanics and standard quantum mechanics could only have occurred a few seconds after the Big Bang. So problem solved, right? Wrong! It is rather ironic that the undoing of Bohmian mechanics comes from a most unexpected direction: Bell's theorem!!! Ironic because Bell was inspired to discover his theorem by viewing Bohmian mechanics as a counter-example to von Neumann's no-go theorem on hidden variables.

In quantum mechanics it is very easy to see that the position operator at different times does not commute with itself. Why? Because in the Heisenberg picture the free-particle position operator at time \(t\) involves the momentum operator, \(Q(t) = Q + \frac{t-t_0}{m}P\), and \(P\) and \(Q\) do not commute. However, this means that there is no joint probability space for positions at different times. But in Bohmian mechanics the particle always has a "real" position, and as such, there such a probability space does exist. Hence we can in principle detect statistical differences between the predictions of standard quantum mechanics and those of Bohmian mechanics.
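The non-commutativity of \(Q(t)\) with \(Q(0)\) can be seen numerically (my own sketch, in units \(\hbar = m = 1\), using a large truncated harmonic-oscillator representation of \(Q\) and \(P\); truncation spoils the canonical commutator only in the last matrix corner, so we inspect the (0, 0) entry):

```python
# [Q(t), Q(0)] = t [P, Q] = -i t  (with hbar = m = 1), checked on
# truncated harmonic-oscillator matrices for Q and P.
import numpy as np

N = 200
a = np.diag(np.sqrt(np.arange(1, N)), 1)      # annihilation operator
Q = (a + a.T) / np.sqrt(2)
P = 1j * (a.T - a) / np.sqrt(2)

t = 0.5
Qt = Q + t * P                                # free Heisenberg evolution
C = Qt @ Q - Q @ Qt                           # commutator [Q(t), Q(0)]

assert abs(C[0, 0] - (-1j * t)) < 1e-9        # equals -i t, not zero
```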

The proponents of Bohmian mechanics are very well aware of this problem and they have a solution: there is no comparison of measurements at different times! I have to agree this is very clever, but the trouble is only swept under the rug; it does not go away.

The prediction discrepancy goes away in Bohmian mechanics only if the theory is "contextual". Because of this, the way velocity is measured in Bohmian mechanics is not what we would normally expect, \(v = (x_1-x_0)/(t_1-t_0)\), and Bohmian mechanics is known for its "surreal trajectories".

Surreal or not, violations of the speed of light or not, non-locality or not, the main trouble lies in the sudden change of the probability density after measurement. In the Copenhagen formulation, the wavefunction collapses upon measurement, and this is naturally explained as updated information. After all, the wavefunction does not carry any energy or momentum and is just a tool to compute the statistical outcomes of any possible experiment. But one of the advertised virtues of Bohmian mechanics is its observer independence: the measurement simply reveals the pre-existing position of the particle. But is this really the case?

The trouble for Bohmian mechanics is that its predictions for two consecutive measurements differ from those of the standard quantum formalism. Why? Because the "quantum equilibrium" for the first measurement is not a "quantum equilibrium" for the second measurement, since the wavefunction collapses during the first measurement. (By the way, this is the root cause of why no correct quantum field theory can ever be created for Bohmian mechanics.)

So how do Bohmian supporters deal with this? The theory is simply declared "contextual" and valid only between preparation (with a presupposed quantum-equilibrium distribution) and measurement.

Without contextuality, Bohmian mechanics is an inconsistent theory, as it predicts violations of the uncertainty principle. With contextuality, Bohmian mechanics becomes a time-bound equivalent formulation of quantum mechanics. Think of it as a flat \(R^n\) local map of a curved manifold. A field theory in the Bohmian interpretation is impossible in the same way that a 2-d map of the Earth (which topologically is a sphere) cannot avoid distortions. (I see a cohomology no-go result for Bohmian quantum field theory in my future ;)).

Because of contextuality the Bohmian interpretation cannot be called realistic.

Now we have exhausted all three quantum mechanics formulations. Quantum mechanics is simply not a realist theory any way we look at it. It would have been really strange if one equivalent formalism of quantum mechanics admitted a realistic interpretation while the other two did not.

Basically, it is either initial particle positions or the Born rule. But the Born rule is an inescapable self-consistency condition on the Hilbert space due to Gleason's theorem, so we must rule out unrestricted initial statistical distributions.

Quantum mechanics is complete and any addition of "hidden variables" spoils its consistency.

Thursday, October 1, 2015


The Quantum-Classical Debate:

reply to Andrei



Today I will reply to the arguments Andrei made last week in his guest post. But let me first start by giving the answers to the questions from two posts ago. First, on why there is no uncertainty relationship for spin: for any two non-commuting observables A and B one obtains:

\(\sigma_A \sigma_B \geq \frac{1}{2} |\langle [A,B] \rangle|\)

The key point of the Heisenberg uncertainty relationship for position and momentum is to be pedantic and observe that the commutator is proportional to the identity operator \(I\):

\([x,p] = i \hbar I\)

which is not true in the case of spin. Because of this, in the spin case there exist states for which the right-hand-side value \(|\langle [A,B] \rangle|\) is zero. The position-momentum uncertainty, by contrast, remains bounded from below for any state.
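A quick numeric illustration of this point (my own sketch, in \(\hbar = 1\) units with Pauli matrices): for \(A = \sigma_x\), \(B = \sigma_y\) the bound involves \(|\langle \sigma_z \rangle|\), which vanishes in the \(\sigma_x\) eigenstate \(|+x\rangle\):

```python
# The spin analogue of the uncertainty bound can vanish: the lower bound
# (1/2)|<[A,B]>| is zero for A = sigma_x, B = sigma_y in the state |+x>.
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

comm = sx @ sy - sy @ sx
assert np.allclose(comm, 2j * sz)                 # [sx, sy] = 2i sz

plus_x = np.array([1, 1], dtype=complex) / np.sqrt(2)
bound = abs(plus_x.conj() @ comm @ plus_x) / 2    # (1/2)|<[A,B]>|
assert bound < 1e-12                              # the bound vanishes here
```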

For the other question, on the apparent violation of the uncertainty principle, here is what Heisenberg stated:

"If the velocity of the electron is at first known, and the position then exactly measured, the position of the electron for times previous to the position measurement may be calculated. For these past times, δpδq is smaller than the usual bound." and "the uncertainty relation does not hold for the past." I think this fact is not well known or appreciated by the majority of the physics community.

Then Heisenberg pointed out that these values can never be used as initial conditions in a prediction about the future behavior of the electron.

Now, back to answering Andrei's challenge to quantum mechanics. Andrei discussed three points:

Objection 1: Classical, local theories have been ruled out by Bell’s theorem.
Objection 2: Classical theories cannot explain single-particle interference (double slit experiment), quantum tunneling, the stability of atoms or energy quantification in atoms or molecules.
Objection 3: Even if one could elude the previous points, there is no reason to pursue classical theories because quantum mechanics perfectly predicts all observed phenomena.

Let's analyze them in turn.

On Objection 1, I agree that classical, local theories have been ruled out by Bell's theorem, with only one loophole left: the super-deterministic theories pursued by 't Hooft. Any statistical theory obeying Kolmogorov's axioms respects Bell's inequalities. It is interesting to see how different realistic quantum mechanics interpretations escape Kolmogorov's axioms: the Bohmian interpretation is contextual, while quantum mechanics in phase space uses negative probabilities.

On superdeterminism, one needs to deny free will, and this is a very tall order. While I (and anyone else) cannot give a rigorous definition of free will, I know that I have it. Andrei contends that classical theory is deterministic. While true, this is both an insufficient and an irrelevant argument. First, superdeterminism is only a prerequisite step: you still need to obtain quantum correlations from it, and so far I am not aware of any successful model. Second, determinism does not imply superdeterminism, because of the existence of chaotic evolution equations; weather prediction is the classic example. I do not think superdeterminism has any chance of success.
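To illustrate the chaos point with a toy example of my own (not part of Andrei's argument): the logistic map is completely deterministic, yet two initial conditions that agree to ten decimal places become macroscopically different within a few dozen iterations, so determinism alone buys no long-term predictive power.

```python
def diverge_step(x0, eps=1e-10, r=4.0, threshold=0.1, max_steps=100):
    """Iterate the logistic map x -> r*x*(1-x) from two starting points
    that differ by eps, and return the first step at which the orbits
    differ by more than `threshold` (None if it never happens)."""
    a, b = x0, x0 + eps
    for step in range(1, max_steps + 1):
        a = r * a * (1.0 - a)
        b = r * b * (1.0 - b)
        if abs(a - b) > threshold:
            return step
    return None

# The initial gap of 1e-10 roughly doubles each iteration at r = 4,
# so a macroscopic difference appears after a few dozen steps.
print(diverge_step(0.3))
```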

On Objection 2, I again agree with its statement. Quantum mechanics arose out of the inability of classical mechanics to explain atomic phenomena. But instead of expanding on this let's reply to the concrete arguments Andrei raised. Let's start with:

" This is all nice, but classical physics is not the same thing as Newtonian physics of the rigid body. Let’s consider a better classical approximation of the electron, a charged bullet. The slits are made of some material that will necessarily contain a large number of charged “bullets”. As the test bullet travels, its trajectory will be determined by the field generated by the slitted barrier. The field will be a function of position/momenta of the “bullets” in the barrier. But the field produced by a barrier with two slits will be different than the field produced by a barrier with only one slit, so the effect with both holes open is NOT the sum of the effects with each hole open alone."

This argument is wrong on two counts. First, one can perform an interference experiment with neutrons, where the neutrons not passing through the slits are simply absorbed. Using electrically neutral particles renders Andrei's objection irrelevant. Second, "the field produced by a barrier with two slits will be different than the field produced by a barrier with only one slit" is incorrect, as shown by a simple order-of-magnitude analysis. The electric fields near the slit are relevant on an atomic distance scale, while the distance between slits is macroscopic. You are looking at about seven orders of magnitude difference in the ratio of relevant distances, which for a \(1/r^2\) force translates into an effect fourteen orders of magnitude smaller. But the interference pattern is macroscopic, and the difference between two Gaussian distributions and an interference pattern cannot be explained away by a force fourteen orders of magnitude smaller than what is needed.
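The arithmetic behind that estimate, with illustrative round numbers of my own choosing:

```python
import math

# Illustrative round numbers (assumptions, not measured values):
atomic_scale = 1e-10      # m: range over which slit-edge fields matter
slit_separation = 1e-3    # m: assumed macroscopic separation of the slits

distance_ratio = slit_separation / atomic_scale   # ~1e7 (seven orders)
force_ratio = distance_ratio ** 2                 # 1/r^2 force => ~1e14

print(f"distance ratio:    {distance_ratio:.0e}")
print(f"force suppression: {force_ratio:.0e}")
```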

The next objection appeals to Yves Couder's results. Those are interesting experiments, but they are not a confirmation that quantum mechanics emerges from classical physics, and I know of no one who claims they are. As such, the argument is irrelevant to the current debate.

On the free-fall atomic model, I did not read those papers, so I do not know whether the claims are correct. The author may simply have proposed a model and analyzed its consequences, without ever claiming that the model describes nature. Based on the general information available, it is immediately clear that the model has nothing to do with reality. Also, peer review is no magic bullet for avoiding the publication of incorrect results. In my area of expertise, in the last 12 months I read 2 published papers which were pure unadulterated garbage: the errors were not subtle, but blatant, and packaged in a dishonest way to bamboozle the reader. Moreover, I have concrete proof that the authors were aware of their mistakes when they published. Physics is not immune to charlatans, crooks, and incompetent reviewers.

On the ionic-crystals argument: without quantum mechanics, the collection of electrons and nuclei would behave like a plasma and not like a crystal. This is moderately easy to test: create a computer simulation of, say, 1000 atoms and use only Maxwell's and Newton's equations of motion to model the interaction. Then try to find an initial configuration which is stable. I think there is none. Prove me wrong with such a model and I'll concede this point.
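A related textbook calculation (my addition, using standard constants) shows why classical matter cannot even hold a single atom together: a classically orbiting electron radiates energy according to the Larmor formula and spirals into the nucleus in about \(10^{-11}\) seconds.

```python
# Classical infall time of an electron spiraling into the proton under
# Larmor radiation, starting from the Bohr radius:
#   t = a0^3 / (4 * r_e^2 * c)   (standard textbook result)
a0 = 5.29e-11    # Bohr radius, m
r_e = 2.82e-15   # classical electron radius, m
c = 2.998e8      # speed of light, m/s

t_collapse = a0**3 / (4 * r_e**2 * c)
print(f"classical collapse time: {t_collapse:.1e} s")  # ~1.6e-11 s
```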

On quantum tunneling, the argument is pure handwaving. Let me make an analogy. I know how my microwave works. But an alternative explanation might be that little Oompa-Loompas inside it are heating the food, and I cannot see them because they move really fast. The point is that the argument needs more predictive power than a fuzzy, non-committal "A new theory could predict a much stronger force." Show me the money: propose such a theory and then we can discuss its merits. I am not asking for something impossible. Regarding tunneling, quantum mechanics provides testable predictions which were confirmed experimentally. I only hold any alternative theory to the same level of experimental confirmation.
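For concreteness, here is the kind of sharp, testable prediction tunneling gives (a sketch with illustrative numbers, using the standard wide-barrier approximation \(T \approx e^{-2\kappa L}\) with \(\kappa = \sqrt{2m(V-E)}/\hbar\)):

```python
import math

hbar = 1.055e-34   # reduced Planck constant, J*s
m_e = 9.11e-31     # electron mass, kg
eV = 1.602e-19     # 1 electron-volt in joules

def transmission(V_minus_E_eV, L):
    """Approximate tunneling probability through a wide rectangular
    barrier of height V - E (in eV) and width L (in m)."""
    kappa = math.sqrt(2 * m_e * V_minus_E_eV * eV) / hbar
    return math.exp(-2 * kappa * L)

# 1 eV barrier, 1 nm wide: small but measurable transmission
print(transmission(1.0, 1e-9))
# doubling the width squares the suppression -- an exponential
# sensitivity that experiments can and do check
print(transmission(1.0, 2e-9))
```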

On Objection 3, I somewhat disagree with it. It is worthwhile to pursue non-quantum toy theories to better understand quantum mechanics, but not to search for an alternative to quantum mechanics.

Let me answer the four sub-points raised by Andrei:

a. If nature is not probabilistic after all, there is much to be discovered. Detailed mechanism behind quantum phenomena should be revealed, bringing out a deeper understanding of our universe, and maybe new physical effects.

There is no "sub-quantum" or "hidden variable" explanation of quantum effects. Quantum mechanics is at the core of Nature, and my work is about proving this rigorously, not asserting it as a personal belief.


b. Quantum theories are not well equipped to describe the universe as a whole. There is no observer outside the universe, no measurement can be performed on it, not even in principle.

This is a sterile objection to quantum mechanics, and it cannot be used to justify a realistic alternative theory. First, even within quantum mechanics there are realistic interpretations. Second, epistemic interpretations like QBism avoid this objection because they only talk about Bayesian probabilities. The objection only applies to naively using quantum mechanics in cosmology; loop quantum gravity is unaffected by it.


c. Due to its inability to provide an objective description of reality, quantum mechanics may not be able to solve the cosmological constant problem. A theory that states clearly “what’s there” could provide a much better estimate of the vacuum energy. After all we are not interested in what energy someone could find by performing a measurement on the vacuum, but what the vacuum consists of, when no one is there to pump energy into it.

The statement underscores a deep misunderstanding of the vacuum. The vacuum is actually a very violent place, filled with virtual particles due to the interplay of relativity and quantum mechanics. See this poor-quality but brilliant video of a Sidney Coleman lecture to understand why merging the two is not trivial: one may naively expect that adding symmetries to quantum mechanics always results in simplifications, but it does not. See also the QCD "lava lamp" which was shown at the 2004 Nobel physics lecture. The cosmological constant problem is not a problem of quantum mechanics. I am not an expert in string theory, but I know it offers at least one solution to this problem (I don't know if it got rid of Susskind's "Rube Goldberg" label).

d. Quantum mechanics requires an infinitely large instrument to measure a variable with infinite precision. When gravity is taken into account, it follows that local, perfectly defined properties cannot exist, because, beyond a certain mass, the instrument would collapse into a black hole.

The same argument can be used in the case of classical deterministic physics.

As you can see, I disagree with the points Andrei was making, but nevertheless I want to thank him for participating in this debate, and I look forward to discussing his replies in the comment section of this post. I think such debates are useful, and I feel that the professional community of physicists is not doing a good job of engaging the public or explaining what it is doing. Physicists are very busy people trying to get ahead in a very competitive field, and as a result the outside world usually experiences an arrogant wall of silence.