Realist vs. surrealist

Today I want to talk about a famous paper: Surrealistic Bohm Trajectories by Berthold-Georg Englert, Marlan O. Scully, Georg Süssmann, and Herbert Walther. This paper is available online here.

The basic setup of the paper is simple: the double-slit experiment, which, when solved in the Bohmian formalism, produces the result below:

The first thing one notices is the symmetry of the picture, which shows that if the particle enters the top slit it will end up in the top portion of the screen, and similarly for the bottom slit. So what would happen with only one slit? Nothing interesting can be said in that case, but Englert, Scully, Süssmann, and Walther observed something funny: if both slits are open and we place which-way detectors in front of the slits that do not perturb the center-of-mass wavefunction, the symmetry of the picture in the Bohmian description is preserved.

This means that in the Bohmian interpretation a particle entering the top slit still ends up in the top portion of the screen, and similarly for the bottom slit. However, there is no interference in this case, and because of this there is a non-zero probability that a particle entering the top slit will be detected in the bottom part of the screen.

So they uncovered a serious interpretation problem for dBB theory: there are Bohmian trajectories with no correspondence in reality. The term the authors coined for them was surrealistic trajectories:

"Does the retrodicted Bohm trajectory always agree with the observed track? Our answer is: No."

Citing from the paper, here are some key sentences:

"In Bohmian mechanics, a particle has a position and nothing else,..."

"...the predictive power of Bohmian mechanics does not exceed that of ordinary quantum theory, and so the alleged superiority of Bohmian mechanics over ordinary quantum theory is of a purely philosophical nature."

In summary, the only advantage of dBB theory is the realism of the particle position, but there are cases where position measurements contradict the dBB trajectory. The name of the paper was a marketing hit, but a long-term disaster: it allowed the paper to be summarily dismissed by dBB supporters.

The original reply to the surrealistic paper can be found here. Here are key statements from the reply:

"The authors distinguish between the Bohm trajectory for the atom and the detected path of the atom."

"In this regard it would be well to bear in mind that before one can speak coherently about the path of a particle, detected or otherwise, one must have in mind a theoretical framework in terms of which this notion has some meaning."

"BM provides one such framework, but it should be clear that within this framework the atom can be detected passing only through the slit through which its trajectory in fact passes. "

"Thus BM, together with the authors of the paper on which we are commenting, does us the service of making it dramatically clear how very dependent upon theory is any talk of measurement or observation."

"the "utter meaningless"ness of the question as to which "slit" the atom went through can, within the framework of orthodox quantum theory, in no way be avoided through the use of "one-bit detectors" - however they are called!"

Basically the reply is an appeal to the contextuality of the formalism, and it makes three points:

1. you need a theoretical framework to discuss the concept of measurement
2. BM and standard QM make the same predictions
3. there are no trajectories in standard QM and you have no grounds to criticize BM on the basis of standard QM

Now I cannot speak for the authors of the surrealistic paper, but it is clear that Englert, Scully, Süssmann, and Walther did provide an analysis IN the formalism of Bohmian mechanics. As such, it deserves a reply in the Bohmian framework as well. [If one looks at this page for example (which is the most authoritative exposition of Bohmian theory), the topic is not even mentioned.]

A modern reply to the surrealistic paper in the framework of Bohmian mechanics can be found here. Basically the take of this paper is that, in their setup, the second photon says that Bohm trajectories are surreal; and, thanks to nonlocality, its report is not to be trusted. I will need a bit of time to study this paper carefully, and I will follow up after I reach a definitive conclusion. Right or wrong, this paper puts the spotlight back on a forgotten dBB topic IN the framework of dBB theory.

Was The Many Worlds Interpretation Proven?

My friend Cristi Stoica brought to my attention a provocative preprint by Daniela Frauchiger and Renato Renner, Single-world interpretations of quantum theory cannot be self-consistent, which he discussed on his blog.

Here is the abstract:

"According to quantum theory, a measurement may have multiple possible outcomes. Single-world interpretations assert that, nevertheless, only one of them "really" occurs. Here we propose a gedankenexperiment where quantum theory is applied to model an experimenter who herself uses quantum theory. We find that, in such a scenario, no single-world interpretation can be logically consistent. This conclusion extends to deterministic hidden-variable theories, such as Bohmian mechanics, for they impose a single-world interpretation."

Now it is not every day that someone makes a bold claim like this, and since I do not know either of the authors, I first checked their credibility. In the Acknowledgements section they wrote:

"We would like to thank Alexia Auffèves, Serguei Beloussov, Hans Briegel, Lídia del Rio, David Deutsch, Artur Ekert, Nicolas Gisin, Philippe Grangier, Thomas Müller, Sandu Popescu, Rüdiger Schack, and Vlatko Vedral for discussing ideas that led to this work."

This made me decide to spend the time to read and understand their argument.

The paper starts dry, with pedantic abstractions that take some effort to read and understand. On page 11 there is the first clue that something is fishy:

"We note that the purpose of the experiment is to prove Theorem 1. We therefore do not have to worry about its technological feasibility at this point. We only need to ensure that none of the steps of the experiment are forbidden by the basic laws of physics."

The basis of the argument is an extended Wigner's friend experiment:

and in the box on page 12, at n = 30, it states:

@ n:30 A measures F1 with respect to a basis {$$|ok\rangle_{F1}$$, $$|fail\rangle_{F1}$$} and records the outcome x ∈ {ok, fail}.

where for example

$$|ok\rangle_{F1} = \sqrt{1/2}|head\rangle_{F1} − \sqrt{1/2}|tail\rangle_{F1}$$

which is a superposition of macroscopic states! You do not need to read the preprint past this point.

This is the same as measuring Schrödinger's cat in a "dead + alive" basis. So here is the million-dollar question: is a dead + alive basis not feasible from a technological point of view, or is it forbidden by some basic law of physics?

A naive quantum purist would think that since everything is quantum, why deny a basis made of superpositions of macroscopic states? The answer: superselection rules.

Here are the rules in the quantum game with superselection rules:
• every physical quantity is represented by a self-adjoint operator, but not every self-adjoint operator represents a physical quantity,
• every pure state is represented by a one dimensional subspace, but not every one-dimensional subspace represents a pure state.
For example, nobody can prepare a nucleon in a superposition of proton and neutron states, due to the conservation of electric charge. But what is the root cause of the impossibility of quantum superposition for macroscopic states? Who cares? You cannot perform any experiment in which you measure in a basis of superpositions of macroscopic states.

Physics is an experimental science, and the experiment used in the argument cannot be performed. As such, the conclusion of the preprint is vacuous. It would be a completely different matter if such a superposition were allowed by nature; then we would be forced to accept the result of this paper.

Bohr was right in demanding that the measurement apparatus be treated as a classical object. Still, how can superselection arise out of a pure quantum formalism? This means that there are distinct relevant Hilbert spaces which, when embedded into a single Hilbert space, exhibit superselection. Hmmm... Where have I seen this before? Welcome to the measurement problem solution proposal using the Grothendieck group construction. The physical basis of this is quantum indeterminacy: outcome randomness in quantum mechanics is not some undesirable feature which needs to be explained away, but an essential part of nature.

I won a bet I did not want to win

There are times when events in society at large overshadow everything. In US politics, Donald Trump just cleared his last hurdle to win the Republican nomination. Last September I made a bet for a six-pack of beer with a colleague that Trump would win the nomination. This was not a crazy bet on my part; it came from witnessing the dirty politics in Romania after the collapse of communism and drawing the parallel. In other words, I have seen this movie played out before, and I also recognized the complete lack of antibodies in US society against the kind of evil represented by Trump and his allies. The level of outrageous, disgusting, and blatant lies I see on TV from the so-called "Trump supporters" like Jeffrey Lord matches or exceeds the lies from former communists right after the collapse of communism in 1989.

Both Republicans and Democrats in the US are now reaping what they sowed.

On the Republican side, you have an elite in bed with big business which polished the science of fear-mongering to whip up a base of losers left in the cold by globalization. It is this base that provided Trump with his victory. This mob does not practice reason and is aroused only by blind emotions. In the 20th century there were two evil utopias: fascism and communism. Fascism's slogan is: you are the best, while communism proclaims: we are all the same. When you are hurting economically it is hard not to fall prey to those utopias. Trump is basically a proto-fascist dictator in the making who vows to "make America great again" but so far has managed to "make America hate again". Trump cynically said that he could shoot someone on Fifth Avenue in New York and still not lose a single vote. I think he was actually right. The rational voices like Mitt Romney or John McCain were soundly defeated, while the only serious challenge to Trump came from the half-asleep charlatan Ben Carson or "Lucifer in the flesh" Ted Cruz. Carson stated, for example, that the pyramids in Egypt were built to store grain according to the biblical story of Joseph in Genesis (Ben Carson voters: people have known how to read hieroglyphs for quite some time, and we know the story of the ancient Egyptians and of the pyramids by reading them).

The situation on the Democratic side is not rosy either. Hillary Clinton is just as dishonest as her husband (remember "wiping the server with a cloth"?), and she cynically rose to her current position starting with her disgusting defense of Bill in the days of the Lewinsky affair. She planned all her actions step by step over the years. For example, in Virginia (a key state in the general election) she planted as governor a slimy ally who almost lost his election because of his untrustworthiness, when Democrats should have won in a landslide following the corruption scandal of the prior Republican governor.

So what can we choose in a general election? A narcissistic bully endorsed by the KKK who will make a prostitute the first lady (just google Melania Trump's "supermodel" photos in GQ; I will not post her naked photos here), or a person rotten to the core? Democrats are right now drinking the Kool-Aid of thinking that Hillary will win in a landslide. I predict that the general election will be very close and of low turnout. On the Democratic side Hillary has the appeal of broccoli to a kid who hates vegetables, and on the Republican side you have people who would not vote for Trump.

Trump only says stupid things because he is playing to his base, but he is very sharp and shrewd and should not be underestimated. Never Trump should be the motto of all sane people. To appreciate how bad the situation is, a recent former CIA director publicly threatened Trump with mutiny of the armed forces, and given who said this, it basically amounted to a threat of a military coup. Can you picture Trump with his finger on the nuclear button? I can't.

Friday, April 29, 2016

I was surprised by the interest in the last post, and I think a continuation of the topic using new material is in order.

As a side note, apparently I got upgraded on Lubos' scale to "confused", so I hope I have finally got past the misunderstanding that I am a closet admirer of classical physics, which was really absurd since I am deriving quantum mechanics from physical principles. Currently I am reading Jean Bricmont's new book Making Sense of Quantum Mechanics, which is of course written from the point of view of the Bohmian interpretation (Bricmont is a well-known Bohmian supporter), and I will report on it when I am done.

Now back to local realism. In an interferometer one cannot locate the path a particle is traveling, because the "which way" information destroys the interference. Moreover, by adjusting the lengths of the two paths, one can tune the interference in such a way that one output collects all the particles and the other output records a null result. So what would happen if we combined two such interferometers so that they touch at a point P, as in the picture below?

This is the setup of Lucien Hardy's thought experiment, and it has one more twist: in one interferometer we inject electrons, while in the other we inject their antiparticles: positrons.

If the two interferometers are not touching, they are tuned in such a way that all the particles arrive at the "c" detectors and none at the "d" detectors (c for constructive interference, d for destructive interference). Suppose now that the loops touch at point P. If the particle from the right loop goes through P, it blocks the w+/u+ path of the particle in the left interferometer, and the superposition in the left interferometer is prevented as well. As a result, the left particle can now be detected at the d+ detector.

So far this is nothing fancier than the interferometer discussion from "Where is Waldo?". But now comes the catch. If we use electrons and positrons, when they both arrive at P we will have an annihilation resulting in a photon, represented as $$\gamma$$ in the picture above. The key question for local realists is: can we detect the positron at d+ and the electron at d- at the same time?

When we operate the interferometers without touching, this can only happen in the case of a blocked path, which kills the interference phenomena. However, in the touching case, when both paths are blocked we get annihilation and a gamma photon. No local realistic description of the experiment can predict simultaneous detection of the electron and positron at d- and d+. Still, this is allowed to occur by quantum mechanics.

This thought experiment is independent of initial conditions and hence is free of the superdeterminism loophole. The challenge for the local realist is to explain why we sometimes get simultaneous detection at detectors d+/d- when the interferometer paths touch at P, and never when the interferometers do not touch.
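The joint probabilities can be checked with a short amplitude calculation. This is my own sketch, not a computation from the post: I assume a standard 50/50 beamsplitter convention ($$|s\rangle \to (|u\rangle + i|v\rangle)/\sqrt{2}$$ at the first beamsplitter, with the second beamsplitter phased so that everything exits at c when the loops do not touch), and I model the annihilation at P by removing the branch where both particles take the overlapping u arms:

```python
from itertools import product
from math import sqrt

# First beamsplitter: |s> -> (|u> + i|v>)/sqrt(2) for each particle.
bs1 = {'u': 1/sqrt(2), 'v': 1j/sqrt(2)}
# Second beamsplitter, phased so an untouched interferometer sends
# everything to c: |u> -> (i|c> + |d>)/sqrt(2), |v> -> (|c> + i|d>)/sqrt(2).
bs2 = {'u': {'c': 1j/sqrt(2), 'd': 1/sqrt(2)},
       'v': {'c': 1/sqrt(2),  'd': 1j/sqrt(2)}}

def amp_dd(touching):
    """Amplitude for joint detection: positron at d+, electron at d-."""
    total = 0
    for path_pos, path_ele in product('uv', repeat=2):
        # If the loops touch and both particles take the overlapping u arms,
        # they annihilate into a gamma photon and never reach the detectors.
        if touching and path_pos == 'u' and path_ele == 'u':
            continue
        total += (bs1[path_pos] * bs2[path_pos]['d']
                  * bs1[path_ele] * bs2[path_ele]['d'])
    return total

print(round(abs(amp_dd(touching=False))**2, 12))  # 0.0: never d+ and d- together
print(round(abs(amp_dd(touching=True))**2, 12))   # 0.0625 = 1/16
```

Removing the annihilated branch is precisely what breaks the destructive interference at the d detectors, turning a coincidence that is impossible for separated loops into one occurring 1/16 of the time.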

GHZ for die-hard local realists

Last time I discussed the impossibility of locating the particle along the path of an interferometer. However, there is an even stronger argument against local realism, due to Greenberger, Horne, and Zeilinger, which was popularized by Sidney Coleman in his famous "Quantum Mechanics in Your Face" talk.

The setting is as follows: from a central station, three electrons are sent every minute to three very distant laboratories, say on the Moon, Mars, and Neptune, where three experimentalists decide to measure either the spin on the x axis or the spin on the y axis, recording +1 or -1 based on the deflection of the electron in a Stern-Gerlach device. The decision to measure on the x or y axis is left to each experimentalist's own free will. The experiment is run for many years, collecting a huge amount of data. Then the lab logs are brought back to Earth and compared. The following correlation emerges:

whenever the three experimentalists measure one spin on the x axis and two on the y axis, the product of the answers is +1.

Now, is this consistent with the predictions of quantum mechanics? The initial GHZ state is:

$$|\psi\rangle = \frac{1}{\sqrt{2}}(|+++\rangle - |---\rangle)$$

and for example measuring x-y-y in laboratories 1-2-3 yields

$$\sigma_x^{(1)}\sigma_y^{(2)}\sigma_y^{(3)}|\psi\rangle = |\psi\rangle$$

because $$\sigma_x$$ flips a + into a -, while $$\sigma_y$$ does the same thing and adds a factor of $$\pm i$$: $$\sigma_y|\pm\rangle = \pm i|\mp\rangle$$.

The same holds for the other two combinations: y-x-y and y-y-x.

So far so good, but what happens when all three experimentalists decide to measure on the x axis? What would a die-hard local realist predict?

A local realist thinks the values of the measurements exist independent of measurement ("the Moon is there even when I am not looking at it"), and the experiment simply reveals those values. Since the three laboratories are far apart, the decision of what to measure in one laboratory cannot influence what is measured at the other two laboratories. After all, the measurements are done every minute as the electrons arrive, and light takes more than one minute to propagate between any two laboratories.

So if the spin value exists independent of measurement, we have three equations:

$$SpinX_1 SpinY_2 SpinY_3 = +1$$
$$SpinY_1 SpinX_2 SpinY_3 = +1$$
$$SpinY_1 SpinY_2 SpinX_3 = +1$$

and by multiplication we get

$$SpinX_1 SpinX_2 SpinX_3 SpinY_1 SpinY_1 SpinY_2 SpinY_2 SpinY_3 SpinY_3= +1$$

Since the spins are either +1 or -1, the square of any spin is +1 and we simplify the equation to:

$$SpinX_1 SpinX_2 SpinX_3 = +1$$
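The local realist's bookkeeping can also be checked by brute force. Here is a small sketch (my own illustration, not from Coleman's talk or the GHZ paper) that enumerates every possible assignment of pre-existing spin values:

```python
from itertools import product

# Each lab i holds pre-existing values SpinX_i and SpinY_i in {+1, -1}.
consistent = []
for x1, y1, x2, y2, x3, y3 in product([1, -1], repeat=6):
    # Impose the three observed correlations (one x, two y -> product +1).
    if x1*y2*y3 == 1 and y1*x2*y3 == 1 and y1*y2*x3 == 1:
        consistent.append(x1*x2*x3)

# Every assignment surviving the constraints forces the x-x-x product to +1.
print(set(consistent))  # {1}
```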

What does quantum mechanics predict?

Recall that $$\sigma_x$$ flips a + into a -, and so

$$\sigma_x^{(1)}\sigma_x^{(2)}\sigma_x^{(3)}|\psi\rangle = - |\psi\rangle$$

and the experimental results confirm that indeed

$$SpinX_1 SpinX_2 SpinX_3 = -1$$

in agreement with quantum mechanics and in blatant violation of local realism.
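Both eigenvalue equations can be verified numerically. Here is a minimal sketch of my own, using only the rules $$\sigma_x|\pm\rangle = |\mp\rangle$$ and $$\sigma_y|\pm\rangle = \pm i|\mp\rangle$$, with the three-electron state stored as a dictionary of amplitudes in the $$\sigma_z$$ basis:

```python
from math import sqrt

def apply(op, qubit, state):
    """Apply sigma_x or sigma_y to one qubit of a state stored as a dict
    mapping (s1, s2, s3), with s = +1 or -1, to a complex amplitude.
    sigma_x|s> = |-s>;  sigma_y|+> = i|->, sigma_y|-> = -i|+>."""
    new = {}
    for basis, amp in state.items():
        s = basis[qubit]
        flipped = basis[:qubit] + (-s,) + basis[qubit + 1:]
        factor = 1 if op == 'x' else 1j * s
        new[flipped] = new.get(flipped, 0) + factor * amp
    return new

# GHZ state |psi> = (|+++> - |--->)/sqrt(2)
ghz = {(1, 1, 1): 1/sqrt(2), (-1, -1, -1): -1/sqrt(2)}

# x-y-y leaves |psi> unchanged (eigenvalue +1) ...
xyy = apply('x', 0, apply('y', 1, apply('y', 2, ghz)))
# ... while x-x-x returns -|psi> (eigenvalue -1).
xxx = apply('x', 0, apply('x', 1, apply('x', 2, ghz)))

print(all(abs(xyy[b] - ghz[b]) < 1e-12 for b in ghz))  # True
print(all(abs(xxx[b] + ghz[b]) < 1e-12 for b in ghz))  # True
```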

Where is Waldo?

Two posts ago I discussed the bouncing droplets topic, but since that was part of an April Fools joke it may have left a wrong impression. Today I want to present how quantum superposition contradicts the classical idea of a particle trajectory. For fun we will call our particle Waldo, and try to locate him along the path.

What I will talk about today is based on an example given by Jean Bricmont in his book Making Sense of Quantum Mechanics. In turn, this example is inspired by David Albert's book Quantum Mechanics and Experience. As a disclaimer, Bohmian quantum mechanics does provide a location for "Waldo", but I will not discuss how in this post. However, as we will see, classical ideas are incompatible with the quantum effect of superposition.

For our discussion we want to measure the spin of an electron on the z and x axes. We know how to do that: simply pass the electron through a zone with an inhomogeneous magnetic field along that axis and observe the deflection. We can think of this as a box with one input slot and two output slots which sorts the incoming electrons. Given the "z" or "x" orientation of the magnet inside this imaginary box, we have two kinds of sorting boxes, and experimentally we establish two rules:

• once an electron is sorted one way by a box, passing the same electron through another box of the same kind sorts it the same way (repeated measurements yield the same result),
• if an electron is sorted one way by a box, passing the same electron through a box of the other kind results in 50%-50% outcomes from the new box.

Now let's have a source of electrons which were already sorted UP by a Z-box, and let's pass them through an X-box (no pun intended). By the rules above we get 50% of them sorted x-UP and 50% sorted x-DOWN. Let us further pass each of the two beams through Z-boxes; according to the rules above we have a four-way split, as in the picture below.

Now here comes the surprise: make the final two Z-boxes a single Z-box by bouncing the electron beams off mirrors and recombining them before the one and only final Z-box:

What percentages do we obtain? Remember that the boxes are imaginary boundaries for the inhomogeneous magnetic field; instead of using mirrors we can use the first setup and bring the boxes closer and closer, overlapping the magnetic fields inside, until the two boxes become one. So the first picture would make us predict that when we have only one box we would get 25+25=50% z-UP and 25+25=50% z-DOWN, while in fact we get 100% z-UP and 0% z-DOWN. This quantum surprise has a name: quantum superposition. It is superposition which distinguishes quantum from classical mechanics.

Now suppose we block one of the paths in the second picture (which illustrates a Mach-Zehnder interferometer); then, by applying the second rule from above, we get 25% z-UP, 25% z-DOWN, and 50% lost at the barrier blocking the path.
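These percentages can be reproduced with elementary two-component spinor algebra. A minimal sketch of my own, with z-UP represented as (1, 0) and the x eigenstates as $$(1, \pm 1)/\sqrt{2}$$:

```python
from math import sqrt

up_z, down_z = (1.0, 0.0), (0.0, 1.0)
up_x   = (1/sqrt(2),  1/sqrt(2))
down_x = (1/sqrt(2), -1/sqrt(2))

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1]

def branch(state, onto):
    """Unnormalized component of `state` in one arm of the interferometer."""
    c = dot(onto, state)
    return (c*onto[0], c*onto[1])

psi = up_z                   # z-UP electrons entering the X-box
upper = branch(psi, up_x)    # upper arm (x-UP)
lower = branch(psi, down_x)  # lower arm (x-DOWN)

# Coherent recombination before the final Z-box: add the branch amplitudes.
recombined = (upper[0] + lower[0], upper[1] + lower[1])
print(round(dot(up_z, recombined)**2, 6))    # 1.0 -> 100% z-UP
print(round(dot(down_z, recombined)**2, 6))  # 0.0 -> 0% z-DOWN

# Block the lower arm: only the upper-arm component reaches the Z-box.
print(round(dot(up_z, upper)**2, 6))    # 0.25 -> 25% z-UP
print(round(dot(down_z, upper)**2, 6))  # 0.25 -> 25% z-DOWN (50% absorbed)
```

The 50-50 classical prediction comes from adding probabilities of the two arms; quantum mechanics adds amplitudes first, and the z-DOWN components of the two arms cancel exactly.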

Why is superposition at odds with the notion of trajectory? Which path does Waldo take when going through the interferometer? A classical way of thinking would be as follows:
• Does Waldo take the upper branch path? No: to make sure this is the path we would block the lower branch, which results in a 50-50 split at the final Z-box, while with both paths open the final result is all z-UP.
• Does Waldo take the lower branch path? No, by the same argument with the branches swapped.
• Does Waldo take both paths? No: we always find Waldo in one path or the other if we try to see where he is.
• Does Waldo take no path? No: if both paths are blocked, nothing is detected at the end.
The trouble with this reasoning is counterfactual definiteness, which is obeyed by classical mechanics. For example, in the first two arguments above we assume that placing a barrier in one of the paths would leave the rest of the experiment unchanged. In fact, quantum mechanics is contextual, and the experimental setup plays a critical role in what is measured. In classical physics we assume the existence of properties of objects even when they have not been measured, and this is not the case in the quantum world.

How does RSA encryption work?

Since we have been discussing quantum computers, one of their main applications is breaking RSA encryption. But what is this scheme and how does it work? Let's start with how encryption evolved during the history of mankind. In ancient times one method was to peel the bark off a tree branch in a spiral fashion and hold onto the stick. With the bark still attached, you write the message vertically, and after you peel it the message appears garbled. To decode it you wrap the bark back around a branch, and the letters neatly align if you have a branch of the right diameter. This is a rather poor method of hiding the message, and better methods were required.

A big improvement in antiquity was Caesar's cipher, in which each letter is substituted by another letter. This method provides a large number of possibilities for encryption, but it has a simple weakness: in any language some words and some letter combinations are very common. For example, in English the word "the" is the most common, and from it you can easily guess the substitutions for the letters T, H, and E. Working in reverse order of word frequency, you can break this method of encryption relatively easily. The countermeasure was to change the substitution daily by some algorithm available to both the sender and the receiver. A classic example of this was the Enigma machine used by Germany in the Second World War.

Now, if you need to change the way the encryption is run after each message, is there a method which is unbreakable? Indeed there is, and it is based on XOR. In ASCII each letter is represented by a number, and each number has a binary representation in 0s and 1s. If you have a one-time cipher key (a sequence of random letters the same length as the message to be sent), you take the binary representation of the key and do a bit-wise XOR between the bits of the text and the bits of the key. The operation is reversible, and as long as you do not reuse the key (which would open the door to the frequency attack explained above), the encryption is unbreakable. So why not use this method all the time and call it a day? Because the management of the keys is horrendous in practice.
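Here is a minimal sketch of the one-time XOR scheme in Python (my own illustration; the standard library module `secrets` supplies the random key):

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Bit-wise XOR of the message bytes with an equally long key."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))  # random key, used exactly once

ciphertext = xor_bytes(message, key)
recovered = xor_bytes(ciphertext, key)   # XOR is its own inverse

print(recovered == message)  # True
```

Reusing the key for a second message is exactly what reopens a statistical attack: XORing two ciphertexts made with the same key yields the XOR of the two plaintexts, and letter frequencies leak through.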

The next improvement came from using asymmetric keys. The key idea is that of factorizing a number into primes. It is trivial to multiply numbers, but there is no good known algorithm for factorization. The trial-division factorization time for a number made out of the product of two primes is proportional to the square root of the number. Say we have a 400-digit number; its square root is a 200-digit number. The lifetime of the universe in seconds is an 18-digit number, so a computer testing a million candidate factors per second can only check about $$10^{24}$$ possibilities in the lifetime of the universe. This means that the computer would have to run for $$10^{176}$$ lifetimes of the universe to factor a 400-digit number!
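The back-of-the-envelope count can be redone in a couple of lines (a sketch of the arithmetic above, with the same round numbers):

```python
candidates = 10**200          # trial divisors up to the 200-digit square root
rate = 10**6                  # candidate factors tested per second
universe_age_s = 10**18       # lifetime of the universe in seconds (18 digits)

per_lifetime = rate * universe_age_s          # ~10**24 candidates per lifetime
print(candidates // per_lifetime == 10**176)  # True
```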

So what is the RSA algorithm?

Step 1: pick two large prime numbers p and q and multiply them.
Step 2: multiply (p-1) by (q-1) and find a small number e relatively prime to (p-1)(q-1). pq and e form the public key you broadcast to the world.

Recall that each letter has a corresponding ASCII code and can be represented as a number.

Step 3: the sender wants to encode a number M. Using the public key, he computes the encryption of M as $$C = M^e \pmod{pq}$$ and sends C to you.
Step 4: you first compute a number d such that $$ed \equiv 1 \pmod{(p-1)(q-1)}$$.
Step 5: finally you compute $$M = C^d \pmod{pq}$$.
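The five steps can be traced with deliberately tiny primes. A sketch (my own numbers, chosen only for illustration; real keys use primes hundreds of digits long, and Python's built-in `pow` handles both the modular exponentiation and the modular inverse):

```python
# Steps 1 and 2: key generation
p, q = 61, 53
n = p * q                # 3233: part of the public key
phi = (p - 1) * (q - 1)  # 3120
e = 17                   # small exponent, relatively prime to phi

# Step 4: private exponent d with e*d = 1 (mod phi)
d = pow(e, -1, phi)

# Steps 3 and 5: encrypt and decrypt the ASCII code of 'A'
M = 65
C = pow(M, e, n)         # sender uses the public key (n, e)
print(pow(C, d, n))      # 65: the receiver recovers M with d
```

Note that only the receiver, who knows p and q, can compute phi and hence d; an eavesdropper who sees only n and e would have to factor n first.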

The computations are in general time-consuming, and in practice the method is used only to encrypt the unbreakable one-time XOR keys.

To break RSA encryption one needs to be able to factorize the product pq. One way to do it is Shor's algorithm. In this approach all steps are fast except the key part, which is a quantum Fourier transform. In one of the prior posts I discussed a classical embedding of a quantum computer in what amounts to an analog computer. The speedup in this setting is due to the ability of analog signals to compute a Fourier transform quickly.

The key question to ask is whether the computational speedup in a quantum computer is due to quantum mechanics, or to the nature of the analog structure utilized in Shor's algorithm. Supporters of the Many Worlds Interpretation, like David Deutsch, contend it is quantum mechanics, but my take is that this assertion is proven false by several concrete realizations of analog computational devices. Besides the emulation device created by Brian R. La Cour and Granville E. Ott, there are other prototype analog devices using optical waves capable of quickly performing the key step in Shor's algorithm. The fathers of this line of research are David Ferry and Harris Akis from Arizona State University, who published the first paper on this in 2001: Quantum wave processing, in Superlattices and Microstructures (a journal unknown to the quantum foundations community).