Friday, June 24, 2016

Noncommutative Geometry


Before jumping into the topic for today, let me say a few words about Brexit: the UK put on a show of monstrous selfishness and hypocrisy: after colonizing half the globe, they now complain about immigrants?

Back to physics (and math). Last time I stated that geometry requires a generalization, so what does this all mean? There are many ways one can approach this, but let's do it in historical fashion and start with the duality:

Geometry - Algebra

It is informative to remember how the ancient Greeks did geometry. For them everything (including the proofs) was a geometric construction with straightedge and compass, and they had no concept of coordinates.


It was not until 1637 that geometry and algebra were married by Descartes with what we now call a Cartesian coordinate system. Subsequently mathematicians came to realize that geometry and algebra are nothing but distinct languages describing the very same thing. The first geometry theorem unknown to the ancient Greeks (Morley's trisector theorem) was discovered in 1899, and its proof was carried out by purely algebraic arguments.

Algebra is more powerful than geometry because it is easier to formalize abstractions in algebra. In algebra one readily encounters noncommutativity; one example is operator non-commutativity in quantum mechanics. But if algebra is dual to geometry, what kind of geometric space would correspond to a non-commutative algebra? What does it mean to say that "the algebra of coordinates is non-commutative"?

The simplest example is that of a torus. Recall the old-fashioned arcade games where your character exits through the right side of the screen and re-enters through the left side? Similarly, if you move past the top edge you re-emerge at the bottom. Topologically this is a torus. Now suppose you move in a straight line in such a way that the ratio of your horizontal and vertical speeds is an irrational number. Slice the torus into the trajectory lines respecting this ratio. What you get is a pathological foliation, because all "measurable functions" are almost everywhere constant and there are no non-constant continuous functions.
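A quick numerical sketch (my own illustration, not from the post) of why the irrational slope matters: with a rational slope the trajectory closes up after finitely many steps, while with an irrational slope the orbit never closes and spreads densely around the circle.

```python
from fractions import Fraction
import math

def orbit(theta, n):
    """First n points of the circle rotation x -> x + theta (mod 1)."""
    return [(k * theta) % 1 for k in range(n)]

# Rational slope 2/5: the trajectory revisits the same 5 points forever.
rational_points = set(orbit(Fraction(2, 5), 1000))
print(len(rational_points))  # 5

# Irrational slope sqrt(2) mod 1: after 1000 steps no gap on the
# circle is wider than 1% -- a numerical hint that the orbit is dense.
pts = sorted(orbit(math.sqrt(2) % 1, 1000))
max_gap = max(b - a for a, b in zip(pts, pts[1:]))
print(max_gap < 0.01)  # True
```

The leaves of the foliation are exactly these orbits, which is why any continuous function constant on a leaf is forced to be constant everywhere.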

Other pathological examples are: the space of Penrose tilings, deformations of Poisson manifolds, quantum groups, moduli spaces, etc.

From the mathematical side, one can explain away all those pathological cases one by one, but this is missing the forest because of the trees. The duality from above now becomes:

Quotient spaces - Noncommutative algebra

where we replace the commutative algebra of functions constant along the classes of an equivalence relation by the noncommutative convolution algebra of the equivalence relation.

Basically it all boils down to a generalization of measure theory. It is well known that the proper way to generalize measure theory is via von Neumann algebras, and this is how quantum mechanics enters the picture (although historically non-commutative geometry arose from quantum mechanics and from the work to classify von Neumann algebras).

Next time we are going to dive deeper into non-commutative geometry and we will encounter the Dirac operator.

Friday, June 17, 2016

Norm and correlations


Continuing the discussion from last time, today I want to talk about the norm of a linear operator and its implications for the maximum correlations which can be achieved in nature: Tsirelson's bound. The very same norm definition will later play a key role in what unexpectedly became a "geometrization" of the Standard Model coupled with (unquantized) gravity.

In a Hilbert space, the definition of the norm of a bounded linear operator is:

\(||A|| = \sup_{u \neq 0} \frac{||Au||}{||u||}\)
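As a concrete sketch (my own illustration with numpy, not from the post): in finite dimensions the operator norm is the largest singular value, and the supremum in the definition is attained at the corresponding right-singular vector.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# The operator norm ||A|| is the largest singular value of A.
norm_A = np.linalg.norm(A, ord=2)

# ||Au||/||u|| never exceeds ||A|| on a cloud of random vectors ...
us = rng.standard_normal((4, 1000))
ratios = np.linalg.norm(A @ us, axis=0) / np.linalg.norm(us, axis=0)
print(bool(ratios.max() <= norm_A + 1e-12))  # True

# ... and the sup is attained at the top right-singular vector.
u_top = np.linalg.svd(A)[2][0]
print(bool(np.isclose(np.linalg.norm(A @ u_top), norm_A)))  # True
```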

The most important properties of the norm for bounded operators are the triangle inequality:

\(||A+B|| \leq ||A|| + ||B||\)



and a submultiplicative inequality which guarantees the continuity of multiplication:

 \(||AB|| \leq ||A||  ||B||\)

(can we call this a triangle inequality for multiplication?)
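Both inequalities are easy to spot-check numerically (a sketch of mine, not from the post), using the spectral norm on random matrices:

```python
import numpy as np

rng = np.random.default_rng(1)
opnorm = lambda M: np.linalg.norm(M, ord=2)  # spectral (operator) norm

all_hold = True
for _ in range(200):
    A = rng.standard_normal((5, 5))
    B = rng.standard_normal((5, 5))
    all_hold &= opnorm(A + B) <= opnorm(A) + opnorm(B) + 1e-10  # triangle
    all_hold &= opnorm(A @ B) <= opnorm(A) * opnorm(B) + 1e-10  # submultiplicative
print(bool(all_hold))  # True
```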

On the basis of the triangle inequality, one may be tempted to associate the notion of physical distance with the norm of an operator in a Hilbert space, but this is a dead end. The triangle inequality for operators is essential for quantum mechanics because it ensures the usual notions of convergence in functional analysis (most of functional analysis follows from it). The name of this blog is elliptic composability, and the "elliptic" part follows from the triangle inequality above. If one imagines a quantum mechanics where the triangle inequality is reversed, one arrives at the unphysical hyperbolic quantum mechanics based on split-complex numbers. That theory violates positivity, which in turn prevents the usual definition of probability as a positive quantity.

There turns out, however, to be a deep and completely counterintuitive relationship between the "sup" in the norm definition and the notion of physical distance, but more on this in subsequent posts; I don't want to spoil the surprise, only to whet the (mathematical) appetite a bit.

Now back to correlations. Suppose we have four operators \(\sigma_\alpha, \sigma_\beta, \sigma_\gamma, \sigma_\delta\) such that:

\({\sigma_\alpha}^2 = {\sigma_\beta}^2 = {\sigma_\gamma}^2 = {\sigma_\delta}^2 = 1\)
and
\([\sigma_\alpha, \sigma_\beta] = [\sigma_\beta, \sigma_\gamma] = [\sigma_\gamma, \sigma_\delta] = [\sigma_\delta, \sigma_\alpha] = 0\)

If we define an operator \(C\) as follows:

\(C= \sigma_\alpha \sigma_\beta + \sigma_\beta \sigma_\gamma + \sigma_\gamma \sigma_\delta - \sigma_\delta \sigma_\alpha\)

Then it is not hard to show that:

\(C^2 = 4 + [\sigma_\alpha, \sigma_\gamma][\sigma_\beta, \sigma_\delta]\)
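Here is a sketch of that computation (my own filling-in; only the stated commutation relations are used). Write \(a, b, c, d\) for \(\sigma_\alpha, \sigma_\beta, \sigma_\gamma, \sigma_\delta\). Expanding \(C^2\) gives sixteen terms. Each of the four squares reduces to 1, e.g. \((ab)(ab) = a^2 b^2 = 1\) since \([a,b] = 0\). The cross terms between neighboring summands cancel in pairs, e.g. \((ab)(bc) + (bc)(ab) = ac + ca\) while \((cd)(-da) + (-da)(cd) = -ca - ac\). What survives is

\[ C^2 = 4 + abcd + cdab - bcda - dabc \]

and using the four vanishing commutators to reorder each word (\(abcd = acbd\), \(cdab = cadb\), \(bcda = cabd\), \(dabc = acdb\)), the surviving part is exactly

\[ acbd - acdb - cabd + cadb = (ac - ca)(bd - db) = [a, c][b, d] \]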

From the triangle inequalities we have in general that:

\(||[A, B]|| = ||AB - BA|| = ||AB + (-B)A|| \leq ||AB|| + ||-BA|| \)
\(= ||AB|| + ||BA|| \leq ||A||||B|| + ||A||||B|| = 2||A||||B||\)

and so 

\(|| [\sigma_\alpha, \sigma_\gamma]|| \leq 2 ||\sigma_\alpha|| ||\sigma_\gamma|| = 2\)
\(|| [\sigma_\beta, \sigma_\delta]|| \leq 2 ||\sigma_\beta|| ||\sigma_\delta|| = 2\)

Therefore:

\(||C^2|| \leq 4 + 4 = 8\)

and since \(C\) is self-adjoint (each summand is a product of commuting self-adjoint operators), we have \({||C||}^2 = ||C^2||\), so

\(||C|| \leq 2\sqrt{2}\)

And this is Tsirelson's bound, because \(C\) appears on the left-hand side of the CHSH inequality.
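A concrete check (my own sketch; the specific spin settings below are the standard CHSH-optimal choice, an assumption not spelled out in the post): put \(\sigma_\alpha, \sigma_\gamma\) on the first qubit and \(\sigma_\beta, \sigma_\delta\) on the second, so all four commutation relations above hold, then verify the operator identity and the bound numerically.

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0., 1.], [1., 0.]])
Z = np.array([[1., 0.], [0., -1.]])

# Alice's operators act on qubit 1, Bob's on qubit 2, so the four
# required commutators [sa,sb], [sb,sc], [sc,sd], [sd,sa] all vanish.
sa = np.kron(X, I2)
sc = np.kron(Z, I2)
sb = np.kron(I2, (X + Z) / np.sqrt(2))
sd = np.kron(I2, (X - Z) / np.sqrt(2))

comm = lambda A, B: A @ B - B @ A
C = sa @ sb + sb @ sc + sc @ sd - sd @ sa

# The identity C^2 = 4 + [sa, sc][sb, sd]:
lhs = C @ C
rhs = 4 * np.eye(4) + comm(sa, sc) @ comm(sb, sd)
print(bool(np.allclose(lhs, rhs)))  # True

# These settings saturate Tsirelson's bound ||C|| = 2*sqrt(2):
print(bool(np.isclose(np.linalg.norm(C, ord=2), 2 * np.sqrt(2))))  # True
```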

Now one can read textbook derivations of Tsirelson's bound in many places, but the key point is that quantum correlations have their origin in the notion of operator norms in a Hilbert space. Nowhere in the derivation have we used the notion of distance or causality. Quantum correlations are a mathematical consequence of the quantum formalism, which is in turn a consequence of considerations of composition and information. Quantum mechanics is nothing but composition and information, and correlations (both quantum and classical) are nothing but considerations of composition and information as well.

The usual way we understand classical correlations, as generated by a common cause, is a parochial view due to our classical intuition. Yes, a common cause can generate correlations, but correlation does not imply causation.

So what does all of this have to do with the notion of  distance and that of space-time? To uncover the link we would need first to generalize the notion of a space in geometry. Crazy talk? Not when it is based on von Neumann algebras research which led to a Fields medal. Please stay tuned...

Friday, June 10, 2016

Correlations vs. locality:
Can you hear the shape of a drum?


Is Nature local or non-local? Those are the battle lines between the epistemic and ontic camps. But can we approach the problem from a different angle? I don't remember the quote exactly, but Bell once stated something along the following lines: quantum correlations cry out for an explanation. Or do they? I will attempt to make the case for the contrary.

If you think quantum correlations (which go above the Bell limit) are in need of an explanation, then very likely you are in the ontic, beable, non-local camp. Personally I am not in that camp; I am not in anyone's camp. What I am trying to do is reconstruct quantum mechanics from physical principles and then mathematically arrive at the correct quantum mechanics interpretation.

So what do I know for now? Quantum mechanics is locality-independent. This means that considerations of locality have no role whatsoever in deriving quantum mechanics. Quantum correlations which go above Bell's limit are a mathematical consequence of the quantum formalism.  Do I find correlations above the Bell limit strange? Indeed I do, because as a living organism I am the result of a long natural selection process which favors classical intuition as a necessary tool for survival. However, as a physicist, it is not the correlations alone which are troublesome to me, but correlations over spatially separated experiments. And if quantum correlations are natural mathematical consequences of the quantum formalism, what is in deep need of an explanation is the very idea of distance.  This is a different paradigm from the one put forward by Bell.

We tend to take the idea of space, or space-time, or locality, or neighborhood for granted because this is how nature is and physics is an experimental science. But if everything is quantum mechanical at its core, and if locality plays no role in deriving quantum mechanics, where does the metric tensor come from? (I will attempt to show that this is the deep mystery, not the correlations.) The funny thing is that the answer is known, and it was arrived at by a completely unexpected route starting with a strange question: can you hear the shape of a drum? Moreover, although there are exceptions, the answer is well known neither to the quantum foundations community nor to the string theory/high energy physics community; paradoxically, it is well known to mathematicians, who uncovered an extremely rich mathematical domain: noncommutative geometry. I will start exploring this area in this post and continue the topic in subsequent ones.

So what is the shape-of-a-drum question about? When you go to a symphonic concert, you can clearly tell pianos from trumpets, trumpets from drums, etc. Why? Because they sound different even when they play the very same note.


But now let's make the problem harder and pick the same type of instrument. The shape of the instrument determines the spectral characteristics of the sound. Can we solve the inverse problem? Do the eigenfrequencies uniquely determine the shape of the instrument? The answer is negative, as counterexamples show. However, we are onto something interesting here. Eigenvalues and eigenvectors occur naturally in quantum mechanics. Also, although the answer is negative, we can still tell one instrument from another, and therefore we must be missing just a little bit of information to solve the inverse problem. And if we are able to solve the inverse problem, we have succeeded in recovering the metric tensor information in a very different language; moreover, this language is common to quantum mechanics as well.
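As a toy version of the inverse problem (my own one-dimensional sketch, not from the post): for a string with fixed ends the spectrum does determine the geometry, since the lowest Dirichlet eigenvalue of the Laplacian is \(\lambda_1 = (\pi/L)^2\), so the fundamental tone recovers the length \(L\). The isospectral counterexamples only appear in higher dimensions.

```python
import numpy as np

def dirichlet_spectrum(L, n=500, num_modes=3):
    """Lowest eigenvalues of -d^2/dx^2 on [0, L] with fixed ends,
    approximated by a finite-difference Laplacian on n interior points."""
    h = L / (n + 1)
    lap = (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
    return np.sort(np.linalg.eigvalsh(lap))[:num_modes]

# "Hearing" the length of the string from its lowest tone:
spec = dirichlet_spectrum(L=2.5)
L_heard = np.pi / np.sqrt(spec[0])
print(bool(abs(L_heard - 2.5) < 1e-3))  # True
```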

Please stay tuned for the continuation next week. 

Friday, June 3, 2016

Interaction-free measurement


In earlier posts I have described my proposal for the solution of the measurement problem, in which a pair of quantum systems (system, observer) selects a distinguished state by breaking an equivalence relationship. The physical reason for the equivalence relationship is the fundamental indeterminacy of quantum mechanics due to operator non-commutativity. The proposed mechanism preserves unitarity and does not require an interaction Hamiltonian. Of course, an interaction Hamiltonian can break the equivalence relationship as well, but today I want to focus on how interaction-free measurements can occur in nature.

There are several proposals for interaction-free measurements, but the most dramatic and well-known one is due to Avshalom Elitzur and Lev Vaidman, in the form of the Elitzur-Vaidman bomb tester. On a personal note, Avshalom is a very pleasant, down-to-earth, non-conformist person who speaks truth to power, is a genuine truth seeker, and a champion of the lowly persons toiling to push the boundary of knowledge. Last year I was having lunch with him at a conference when he surprised me by offering to swap our speaking time slots. This was like offering to swap a shiny new Mercedes-Benz for a beat-up Toyota. I was too stunned to accept, but I was deeply touched and I owe him a debt of gratitude for that random act of kindness.

Now back to the bomb tester. The setup is as follows: a bomb factory produces extremely sensitive bombs which explode when they interact with a single photon. However, the factory is not perfect: sometimes it manufactures duds, and the quality assurance department of the factory has to eliminate those defective bombs. The question is: how to do it? In a classical physics universe this would be an impossible task: the very act of testing would explode the good bombs and leave the bad ones intact. Now here comes quantum mechanics to the rescue. Suppose we have an interferometer like the one in the picture below, tuned in such a way that every incoming photon triggers a detection at D1 while D2 never detects anything.




I am too lazy to write the required LaTeX formulas, so I picked a picture which shows the actual quantum states along the arms of the interferometer. Now if a functional bomb is inserted in the top arm, it blocks the top photon path. If the photon takes the upper path, it will explode the bomb. If however it takes the lower path, the interference at the second beam splitter is prevented and the photon will be detected at either D1 or D2!

Detection at D2 would signal a working bomb was inserted and moreover we found out the information without exploding it!

But wait a minute: what would happen if a dud is inserted into the interferometer arm? The dud does not interact with the photon, and the detection would always occur at D1.
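The arithmetic behind this can be sketched with a two-mode amplitude vector (my own illustration; the symmetric beam-splitter convention and the assignment of output ports to D1/D2 are labeling assumptions):

```python
import numpy as np

BS = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)  # symmetric 50/50 beam splitter
photon = np.array([1, 0], dtype=complex)        # mode 0 = upper arm, mode 1 = lower arm

# No bomb (or a dud): the arms interfere and every photon reaches D1
# (here D1 is the detector on output mode 1).
out = BS @ (BS @ photon)
p_dud_D1, p_dud_D2 = abs(out[1])**2, abs(out[0])**2
print(round(p_dud_D1, 6), round(p_dud_D2, 6))  # 1.0 0.0

# Live bomb blocking the upper arm: it acts as a path measurement.
mid = BS @ photon
p_boom = abs(mid[0])**2                  # photon took the upper arm: boom
survivor = np.array([0, mid[1]])         # only the lower-arm amplitude survives
out = BS @ survivor
p_D1, p_D2 = abs(out[1])**2, abs(out[0])**2
print(round(p_boom, 2), round(p_D1, 2), round(p_D2, 2))  # 0.5 0.25 0.25
```

So with a single pass, a live bomb explodes half the time, and a quarter of the time we get the telltale click at D2 without any explosion.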

Now the next question for the quality assurance department is to figure out whether there is a way to increase the detection rate of the good bombs without exploding them. This is again possible using quantum mechanics, and the idea comes from the quantum Zeno effect, or in more popular terms: a watched pot never boils. In the quantum world this is an actual, real effect, and not a psychological phenomenon. In fact, using the quantum Zeno effect the efficiency of the bomb detector can be increased arbitrarily close to 100%.
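The textbook way this works (the Kwiat et al. scheme; the formula below is the standard result, sketched by me rather than taken from the post): split the interrogation into N gentle cycles, each rotating the photon's polarization by \(\pi/2N\). A live bomb re-projects the state on every cycle, and the probability of a safe, successful test is \(\cos^{2N}(\pi/2N)\), which tends to 1 as N grows.

```python
import math

def zeno_success(N):
    """P(certify a live bomb with no explosion) after N interrogation
    cycles, each rotating the photon polarization by pi/(2N)."""
    return math.cos(math.pi / (2 * N)) ** (2 * N)

# The success probability climbs toward 1 as N grows:
for N in (1, 10, 100, 1000):
    print(N, round(zeno_success(N), 4))
```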

Now if you think this is all nonsense theoretical musing, nature has confirmed the reality of the argument. For details see this experimental paper. So what do we learn from the existence of interaction-free measurements? First, nature is quantum mechanical: there is simply no way classical physics can achieve interaction-free measurements. Second, quantum measurement is not a naive process which can be explained away by finding some interaction Hamiltonian. True, there is an interaction taking place here, but it is between the photon and the detector, not between the photon and the bomb. Third, the role of the observer is essential: no observer means no collapse.

So how does my proposed solution to the measurement problem using the Grothendieck group construction fare against interaction-free measurement? It passes with flying colors. The Grothendieck group construction demands the construction of a Cartesian pair, and physically the pair can be identified with the quantum system and the observer. Also, the collapse is a change in representation, not a unitary evolution. Last, the unitary evolution is completely preserved. The only strange part is the existence of many Hilbert spaces, each corresponding to a potential outcome. Still, this proposal is not the many-worlds interpretation. Since the Hilbert spaces are only mathematical constructs devoid of physical reality ("Quantum phenomena do not occur in a Hilbert space. They occur in a laboratory." - Asher Peres), we can embed all those Hilbert spaces into a single one, and the quantum interpretation which arises out of the Grothendieck group construction is that of Copenhagen.

Friday, May 27, 2016

Realist vs. surrealist


Today I want to talk about a famous paper: Surrealistic Bohm Trajectories by Berthold-Georg Englert, Marlan O. Scully, Georg Süssmann, and Herbert Walther. This paper is available online here.

The basic setup of the paper is simple: the double slit experiment which when solved in Bohmian formalism produces the result below:


The first thing one notices is the symmetry of the picture, which shows that if the particle enters the top slit it will end up in the top portion of the screen, and similarly for the bottom slit. So what would happen if we had only one slit? We cannot say anything interesting in that case, but Englert, Scully, Süssmann, and Walther observed something funny: if we have both slits open and we place which-way detectors in front of the slits which do not perturb the center-of-mass wavefunction, the symmetry of the picture in the Bohmian description is preserved.

This means that in this case the Bohmian interpretation still says that if the particle enters the top slit it will end up in the top portion of the screen, and similarly for the bottom slit. However, there is no interference present in this case, and because of this there is a non-zero probability that the particle entering the top slit will be detected in the bottom part of the screen.

So they uncovered a serious interpretational problem for dBB theory: there are Bohmian trajectories without a correspondence in reality. The term coined by the authors of the paper was surrealistic trajectories:

"Does the retrodicted Böhm trajectory always agree with the observed track? Our answer is: No."

Citing from the paper, here are some key sentences:

"In Bohmian mechanics, a particle has a position and nothing else,..."

"...the predictive power of Bohmian mechanics does not exceed that of ordinary quantum theory, and so the alleged superiority of Bohmian mechanics over ordinary quantum theory is of a purely philosophical nature."

In summary, the only advantage of dBB theory is the realism of the particle position, but there are cases where position measurements for particles contradict the dBB trajectory. The name of the paper was a marketing hit, but a long-term disaster: it allowed the result to be summarily dismissed by dBB supporters.

The original reply to the surrealistic paper can be found here. Here are key statements from the reply:

"The authors distinguish between the Bohm trajectory for the atom and the detected path of the atom."

"In this regard it would be well to bear in mind that before one can speak coherently about the path of a particle, detected or otherwise, one must have in mind a theoretical framework in terms of which this notion has some meaning."

"BM provides one such framework, but it should be clear that within this framework the atom can be detected passing only through the slit through which its trajectory in fact passes. "

"Thus BM, together with the authors of the paper on which we are commenting, does us the service of making it dramatically clear how very dependent upon theory is any talk of measurement or observation."

"the "utter meaningless"ness of the question as to which "slit" the atom went through can, within the framework of orthodox quantum theory, in no way be avoided through the use of "one-bit detectors" - however they are called!"
  
Basically the reply is an appeal to the contextuality of the formalism, and it makes three points:

1. you need a theoretical framework to discuss the concept of measurement
2. BM and standard QM make the same predictions
3. there are no trajectories in standard QM and you have no grounds to criticize BM on the basis of standard QM 

Now I cannot speak for the authors of the surrealistic paper, but it is clear that Englert, Scully, Süssmann, and Walther did provide an analysis IN the formalism of Bohmian mechanics. As such, it deserves a reply in the Bohmian framework as well. [If one looks at this page for example (which is the most authoritative exposition of Bohm theory), the topic is not even mentioned.]

A modern reply to the surrealistic paper in the framework of Bohmian mechanics can be found here. Basically, the take from this paper is that in their setup the second photon says that Bohm trajectories are surreal, and that, thanks to nonlocality, its report is not to be trusted. I will need a bit of time to study this paper carefully, and I will follow up on this after I reach a definitive conclusion. Right or wrong, this paper puts the spotlight back on a forgotten dBB topic IN the framework of dBB theory.

Friday, May 13, 2016

Was The Many Worlds Interpretation Proven?



My friend Cristi Stoica brought to my attention a provocative preprint by Daniela Frauchiger and Renato Renner: Single-world interpretations of quantum theory cannot be self-consistent, which he discussed on his blog.

Here is the abstract:

"According to quantum theory, a measurement may have multiple possible outcomes. Single-world interpretations assert that, nevertheless, only one of them "really" occurs. Here we propose a gedankenexperiment where quantum theory is applied to model an experimenter who herself uses quantum theory. We find that, in such a scenario, no single-world interpretation can be logically consistent. This conclusion extends to deterministic hidden-variable theories, such as Bohmian mechanics, for they impose a single-world interpretation."

Now it is not every day that someone makes a bold claim like this, and since I do not know either of the authors, I first checked their credibility. In the Acknowledgements section they wrote:

"We would like to thank Alexia Auffeves, Serguei Beloussov, Hans Briegel, Lıdia del Rio, David Deutsch, Artur Ekert, Nicolas Gisin, Philippe Grangier, Thomas Muller, Sandu Popescu, Rudiger Schack, and Vlatko Vedral for discussing ideas that led to this work." which made me decide to spend the time to try to read and understand their argument.

The paper starts dry, with pedantic abstractions which take some effort to read and understand. On page 11 there is the first clue that something is fishy:

"We note that the purpose of the experiment is to prove Theorem 1. We therefore do not have to worry about its technological feasibility at this point. We only need to ensure that none of the steps of the experiment are forbidden by the basic laws of physics."

The basis of the argument is an extended Wigner's friend experiment:


and in the box on page 12 at n=30 it states:

@ n:30 A measures F1 with respect to a basis {\(|ok\rangle_{F1}\), \(|fail\rangle_{F1}\)} and records the outcome x ∈ {ok, fail}. 

where for example

\(|ok\rangle_{F1} = \sqrt{1/2}|head\rangle_{F1} − \sqrt{1/2}|tail\rangle_{F1}\)

which is a superposition of macroscopic states!!! You do not need to read the preprint past this point.

This is the same as measuring Schrödinger's cat in a basis dead + alive. So here is the million-dollar question: is a dead + alive basis not feasible from a technological point of view, or is it forbidden by some basic law of physics?

A naive quantum purist would think that since everything is quantum, why deny a basis of superpositions of macroscopic states? The answer: superselection rules.

Here are the rules in the quantum game with superselection rules:
  • every physical quantity is represented by a self-adjoint operator, but not every self-adjoint operator represents a physical quantity,
  • every pure state is represented by a one dimensional subspace, but not every one-dimensional subspace represents a pure state.
For example, nobody can prepare a nucleon in a superposition of proton and neutron states, due to the conservation of electric charge. But what is the root cause of the impossibility of quantum superposition for macroscopic states? Who cares? You cannot perform any experiment in which you measure in a basis of superpositions of macroscopic states.

Physics is an experimental science and the experiment used in the argument cannot be performed. As such the conclusion of the preprint is vacuous. It would be a completely different matter if such a superposition would be allowed by nature. Then we would be forced to accept the result of this paper. 

Bohr was right in demanding the treatment of the measurement apparatus as a classical object. Still, how can superselection arise out of a pure quantum formalism? This means that there are distinct relevant Hilbert spaces which, when embedded into a single Hilbert space, exhibit superselection. Hmmm... Where have I seen this before? Welcome to the measurement problem solution proposal using the Grothendieck group construction. The physical basis of this is quantum indeterminacy. Outcome randomness in quantum mechanics is not some undesirable feature which needs to be explained away, but an essential part of nature.

Friday, May 6, 2016

I won a bet I did not want to win


There are times when events in society at large overshadow everything. In US politics, Donald Trump just cleared his last hurdle to win the Republican nomination. Last September I made a bet for a six-pack of beer with a colleague that Trump would win the nomination. This was not a crazy bet on my part; it came from witnessing the dirty politics in Romania after the collapse of communism and drawing the parallel. In other words, I have seen this movie play out before, and I also recognized the complete lack of antibodies in US society against the kind of evil represented by Trump and his allies. The level of outrageous, disgusting, and blatant lies I see on TV from the so-called "Trump supporters" like Jeffrey Lord matches or exceeds the lies from former communists right after the collapse of communism in 1989.

Both Republicans and Democrats in the US are now reaping what they sowed.

On the Republican side, you have an elite in bed with big business who polished the science of fear-mongering to whip up a base of losers left in the cold by globalization. It is this base which provided Trump with his victory. This mob does not practice reason and is aroused only by blind emotions. In the 20th century there were two evil utopias: fascism and communism. Fascism's slogan is: you are the best, while communism proclaims: we are all the same. When you are hurting economically it is hard not to fall prey to those utopias. Trump is basically a proto-fascist dictator in the making who vows to "make America great again" but so far has managed only to "make America hate again". Trump cynically said that he could shoot someone on 5th Avenue in New York and still not lose a single vote. I think he was actually right. The rational voices like Mitt Romney or John McCain were soundly defeated, while the only serious challenge to Trump came from the half-asleep charlatan Ben Carson or "Lucifer in the flesh" Ted Cruz. Carson stated, for example, that the pyramids in Egypt were built to store grain according to the biblical story of Joseph in Genesis (Ben Carson voters: people have known how to read hieroglyphs for quite some time, and we know the story of the ancient Egyptians and of the pyramids by reading the hieroglyphs).


The situation on the Democratic side is not rosy either. Hillary Clinton is just as dishonest as her husband (remember "wiping the server with a cloth"?) and she cynically rose to her current position starting with her disgusting defense of Bill in the days of the Lewinsky affair. She planned all her actions step by step over the years. For example, in Virginia, a key state in the general election, she planted as governor a slimy ally who almost lost his election because of his untrustworthiness, when Democrats should have won in a landslide following the corruption scandal of the prior Republican governor.

So what can we choose in the general election? A narcissistic bully endorsed by the KKK who will make a prostitute the first lady (just google Melania Trump's "supermodel" photos in GQ; I will not post her naked photos here), or a person rotten to the core? Democrats are right now drinking the Kool-Aid of thinking that Hillary will win in a landslide. I predict that the general election will be very close and of low turnout. On the Democratic side Hillary has the appeal of broccoli to a kid who hates vegetables, and on the Republican side you have people who would not vote for Trump.

Trump only says stupid things because he is playing to his base, but he is very sharp and shrewd and should not be underestimated. Never Trump should be the motto of all sane people. To appreciate how bad the situation is, a recent former CIA director publicly threatened Trump with mutiny of the armed forces, and given who said this, it basically amounted to a threat of a military coup. Can you picture Trump with his finger on the nuclear button? I can't.