Sunday, March 26, 2017

Time as a continuous functor


To recall from prior posts, a functor maps objects to objects and arrows to arrows between two categories; in other words, it is structure preserving. In the case of a monoidal category, suppose there is a product \(*: C\times C \rightarrow C\). Then a functor T makes the diagram below commute, i.e. \(T(A)*T(B) = T(A*B)\):


This is all fancy abstract math, but it has a simple physical interpretation when T corresponds to time evolution: the laws of physics do not change in time. Moreover, it can be shown, with a bit of effort and some knowledge of C*-algebras, that time as a functor = unitarity.

But what can we derive from the commutative diagram above? With the additional help of two more very simple and natural ingredients we will be able to reconstruct the complete formalism of quantum mechanics! Today I will introduce the first one: time is a continuous parameter. Just as in group theory, where adding continuity results in the theory of Lie groups, we will consider continuous functors and investigate what happens in the neighborhood of the identity element.

In the limit of vanishing elapsed time, T becomes the identity functor. For an infinitesimal time evolution we can then write:

\(T = I + \epsilon D\)

We plug this back into the diagram commutativity condition \(T(A)*T(B) = T(A*B)\): the left hand side is \((A + \epsilon D(A))*(B + \epsilon D(B))\) and the right hand side is \(A*B + \epsilon D(A*B)\), so collecting the terms of first order in \(\epsilon\) yields the product (Leibniz) rule of differentiation:

\(D(A*B) = D(A)*B + A*D(B)\)
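
As a sanity check, here is a small numerical sketch of my own (not part of the derivation above): take \(*\) to be the matrix product, T to be conjugation by a unitary \(U = e^{-i\epsilon H}\), and assume for concreteness the commutator form \(D(X) = -i[H,X]\) for its generator. The commuting diagram then holds exactly, and the product rule above can be verified directly.

    # A finite-dimensional sketch: * is the matrix product, T(X) = U X U^dagger
    # with U = exp(-i*eps*H), and D(X) = -i[H, X] is assumed for concreteness.
    import numpy as np
    from scipy.linalg import expm

    rng = np.random.default_rng(0)
    n, eps = 4, 1e-6
    H = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    H = (H + H.conj().T) / 2                 # Hermitian generator (a "Hamiltonian")
    U = expm(-1j * eps * H)                  # unitary infinitesimal time evolution

    T = lambda X: U @ X @ U.conj().T         # time evolution acting on operators
    D = lambda X: -1j * (H @ X - X @ H)      # its first-order generator

    A = rng.normal(size=(n, n))
    B = rng.normal(size=(n, n))

    print(np.allclose(T(A @ B), T(A) @ T(B)))                 # the commuting diagram
    print(np.allclose((T(A) - A) / eps, D(A), atol=1e-4))     # T ~ I + eps*D
    print(np.allclose(D(A @ B), D(A) @ B + A @ D(B)))         # the product rule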

There is not a single kind of time evolution and \(D\) is not unique (think of the various possible Hamiltonians). There is a natural transformation between different time evolution functors, and we can label the generators by elements of the category and write \(D_A(B) = A\,\alpha\, B\), where \((\cdot\, \alpha\, \cdot)\) is a product

\(\alpha : C\times C \rightarrow C\)

Then we obtain the Leibniz identity:

\(A\alpha (B * C) = (A\alpha B) * C + B * (A \alpha C)\)
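
As a concrete (and here merely assumed) realization: in standard quantum mechanics one can take \(*\) to be the operator product and \(A\,\alpha\, B = -i[A,B]\), the commutator. The Leibniz identity above then reduces to the familiar commutator identity \([A, B C] = [A,B]\, C + B\, [A,C]\).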

This is extremely powerful, as it is unitarity in disguise.  Next time we'll use the tensor product and the second ingredient to obtain many more mathematical consequences. Please stay tuned.

Sunday, March 19, 2017

Monoidal categories and the tensor product



Last time we discussed the category theory product, which forms a new category from two given categories. Suppose now that we start with one category \(C\) and form the product with itself, \(C\times C\). It is natural to ask whether there is a functor from \(C\times C\) to \(C\). If such a functor exists and moreover respects associativity and unit elements, then the category \(C\) is called a monoidal category. By abuse of notation, the functor above is called the tensor product, but this is not the usual tensor product of vector spaces; the tensor product of vector spaces is only one concrete example of a monoidal product. To get to the ordinary tensor product we need to inject physics into the problem.

The category \(C\) we are interested in is that of physical systems: the objects are physical systems and the arrows are compositions of physical systems. The key physical concepts needed are those of time and of a dynamical degree of freedom in the Hamiltonian formalism.

Time plays a distinguished role in quantum mechanics, both in terms of the formalism (remember that there is no time operator) and in how quantum mechanics can be reconstructed.

The phase space of the Hamiltonian formalism is a Poisson manifold, which is not necessarily a vector space. However, because the Hilbert space \(L^2 (R^3\times R^3)\) is isomorphic to \(L^2 (R^3) \otimes L^2 (R^3)\), let us discuss monoidal categories for vector spaces obeying an equivalence relationship. Hilbert spaces form a category of their own, and there is a functor mapping physical systems into Hilbert spaces. This is usually presented as the first postulate of quantum mechanics: each physical system is associated with a complex Hilbert space \(H\).
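
To make the isomorphism above concrete in a hedged, finite-dimensional toy version (my own illustration): replace \(L^2(R^3)\) by functions on a finite grid, so that a two-particle wavefunction becomes an \(m\times n\) array. Flattening that array identifies it with a vector in the tensor product space, and separable wavefunctions \(f(x)g(y)\) become Kronecker products.

    # Finite-dimensional analogue of L^2(X x Y) ~ L^2(X) (x) L^2(Y):
    # a two-particle wavefunction sampled on an m x n grid is an m x n array,
    # and flattening it gives a vector of length m*n.
    import numpy as np

    rng = np.random.default_rng(1)
    m, n = 3, 5
    f = rng.normal(size=m) + 1j * rng.normal(size=m)   # "wavefunction" of system 1
    g = rng.normal(size=n) + 1j * rng.normal(size=n)   # "wavefunction" of system 2

    # a separable state psi(x, y) = f(x) g(y) flattens to the Kronecker product
    psi_separable = np.outer(f, g)
    print(np.allclose(psi_separable.reshape(m * n), np.kron(f, g)))

    # a generic array need not be separable, but it is always a finite sum of
    # separable terms (here obtained from the SVD / Schmidt decomposition)
    psi = rng.normal(size=(m, n))
    U, S, Vh = np.linalg.svd(psi)
    terms = [S[k] * np.outer(U[:, k], Vh[k]) for k in range(len(S))]
    print(np.allclose(sum(terms), psi))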

For complete generality of the definition of the tensor product we consider two distinct vector spaces \(V\) and \(W\). We first form the category theory product (in this case the Cartesian product) and then make the following identifications:
  • \((v_1, w)+(v_2, w) = (v_1 + v_2, w)\)
  • \((v, w_1)+(v, w_2) = (v, w_1 + w_2)\)
  • \(c(v,w) = (cv, w) = (v, cw)\)
For physical justification, think of \(V\) and \(W\) as one-dimensional vector spaces corresponding to distinct dynamical degrees of freedom. Linearity is a property of vector spaces, and we expect it to be preserved if vector spaces are to describe nature. Bilinearity in the equivalence relationship above arises because the degrees of freedom are independent.

Now a Cartesian product of vector spaces respecting the above relationships is a new mathematical object: a tensor product.
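
A quick numerical sketch of my own: when the quotient construction is realized concretely by the Kronecker product of coordinate vectors, the three identifications above become exact identities.

    # Checking the three identifications with the Kronecker product as a
    # concrete realization of the tensor product of coordinate vectors.
    import numpy as np

    rng = np.random.default_rng(2)
    v1, v2 = rng.normal(size=3), rng.normal(size=3)
    w1, w2 = rng.normal(size=4), rng.normal(size=4)
    c = 2.5

    print(np.allclose(np.kron(v1 + v2, w1), np.kron(v1, w1) + np.kron(v2, w1)))
    print(np.allclose(np.kron(v1, w1 + w2), np.kron(v1, w1) + np.kron(v1, w2)))
    print(np.allclose(np.kron(c * v1, w1), c * np.kron(v1, w1)))
    print(np.allclose(np.kron(v1, c * w1), c * np.kron(v1, w1)))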

The tensor product is unique up to isomorphism and satisfies the following universal property:

There is a bilinear map \(\phi : V\times W \rightarrow V\otimes W\) such that, given any other vector space \(Z\) and a bilinear map \(h: V\times W \rightarrow Z\), there is a unique linear map \(h': V\otimes W \rightarrow Z\) making the diagram below commute, i.e. \(h = h' \circ \phi\).


This universal property is very strong and several mathematical facts follow from it: the tensor product is unique up to isomorphism (instead of \(Z\) consider another candidate tensor product \(V\otimes' W\)), the tensor product is associative, and there is a natural isomorphism between \(V\otimes W\) and \(W\otimes V\), making the category of vector spaces with the tensor product an example of a symmetric monoidal category, just like the category of physical systems under composition.
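
Here is a hedged finite-dimensional illustration of the universal property (my own sketch, not a general proof): any bilinear map \(h: V\times W \rightarrow Z\) is encoded by an array of coefficients, and the unique linear map \(h'\) on \(V\otimes W\) is obtained simply by reshaping that array.

    # Universal property in finite dimensions: h(v, w) = h'(kron(v, w)),
    # where h' is the bilinear coefficient array reshaped into a matrix.
    import numpy as np

    rng = np.random.default_rng(3)
    dim_v, dim_w, dim_z = 3, 4, 5
    Hcoef = rng.normal(size=(dim_z, dim_v, dim_w))   # an arbitrary bilinear map h
    v = rng.normal(size=dim_v)
    w = rng.normal(size=dim_w)

    h = lambda v, w: np.einsum('kij,i,j->k', Hcoef, v, w)   # bilinear on V x W
    h_prime = Hcoef.reshape(dim_z, dim_v * dim_w)           # linear on V (x) W

    print(np.allclose(h(v, w), h_prime @ np.kron(v, w)))    # h = h' o phi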

This may look like a trivial observation, but it is extremely powerful and it is the starting point of the reconstruction of quantum mechanics. On one hand we have the composition of physical systems and theories of nature describing physical systems. On the other hand we have dynamical degrees of freedom and the rules of quantum mechanics. The two sides are actually equivalent and each can be derived from the other. To do this we need one additional ingredient: time viewed as a functor. Please stay tuned.

Monday, March 13, 2017

Category Theory Product


Before we discuss this week's topic, I want to make two remarks about the content of the prior posts. First, why do we need natural transformations in algebraic topology? Associating groups to topological spaces (groups which, incidentally, describe the hole structure of the space) is done by means of functors. Different (co)homology theories are basically different functors, and proving their equivalence amounts to exhibiting a natural isomorphism between them. Second, the logic used in category theory is intuitionistic logic, where truth is established constructively. Since this is mapped into computer science by the Curry-Howard isomorphism, the fact that some statements have no constructive proof corresponds to a computation running forever. In computation theory one encounters the halting problem; if the halting problem were decidable, category theory would have been mapped to ordinary (classical) logic instead of intuitionistic logic.

Now back to the topic of the day. We are still in the domain of pure math, looking at mathematical objects from 10,000 feet, disregarding their nature and observing only their existence and their relationships (objects and arrows). The first question one asks is: how do we construct new categories from existing ones? One way is to simply reverse the direction of all arrows; the resulting category is unsurprisingly called the opposite (or dual) category. Another way is to combine two categories into a new one. Enter the concept of a product of two categories: \(\mathbf{C}\times \mathbf{D}\). In set theory this corresponds to the Cartesian product of two sets. However, we need to give a definition which is independent of the nature of the elements. Moreover, we want to give it in a way which guarantees uniqueness up to isomorphism.

The basic idea is that of projections from \(\mathbf{C}\times \mathbf{D}\) back to \(\mathbf{C}\) and \(\mathbf{D}\). So how do we know that these projections and the product are unique up to isomorphism? Suppose that there is another category \(\mathbf{Y}\) with maps \(f_C\) and \(f_D\) into \(\mathbf{C}\) and \(\mathbf{D}\). Then there is a unique map \(f\) such that the diagram below commutes (that is, \(\pi_C \circ f = f_C\) and \(\pi_D \circ f = f_D\) for the projections \(\pi_C\), \(\pi_D\)).





This diagram has to commute for all categories \(\mathbf{Y}\) and their maps \(f_C\) and \(f_D\). From this definition, can you prove uniqueness of the product up to isomorphism? It is a simple matter of "diagram reasoning". Just pretend that \(Y\) is now the "true incarnation" of the product. You need to find a morphism \(f\) from \(Y\) to \(C\times D\) and a morphism \(g\) from \(C\times D\) to \(Y\) such that \(f\circ g = 1_{C\times D}\) and \(g\circ f = 1_Y\). See? Category theory is really easy and not harder than linear algebra.
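
To make the "diagram reasoning" tangible, here is a toy sketch of my own in the simpler category of sets (the names proj_C, proj_D, f_C, f_D are hypothetical labels for this illustration): the product is the Cartesian product with its two projections, and the commuting conditions force a unique mediating map from any competitor Y.

    # The universal property of the product, specialized to sets.
    from itertools import product

    C = {'a', 'b'}
    D = {1, 2, 3}
    CxD = set(product(C, D))                 # the product object

    proj_C = lambda pair: pair[0]            # projection onto C
    proj_D = lambda pair: pair[1]            # projection onto D

    # a competitor Y with its own maps into C and D
    Y = {'x', 'y'}
    f_C = {'x': 'a', 'y': 'b'}
    f_D = {'x': 3, 'y': 1}

    # the commuting conditions leave no choice: f(y) = (f_C(y), f_D(y))
    f = {y: (f_C[y], f_D[y]) for y in Y}

    print(all(f[y] in CxD and proj_C(f[y]) == f_C[y] and proj_D(f[y]) == f_D[y]
              for y in Y))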

Now what happens if we flip all arrows in the diagram above? We obtain a coproduct category \(\mathbf{C}\oplus \mathbf{D}\), and the projection maps become injection maps.

OK, time for concrete examples:

  • sets: product = Cartesian product, coproduct = disjoint union
  • partially ordered sets (posets): product = greatest lower bound (meet), coproduct = least upper bound (join); see the gcd/lcm sketch after this list
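
For the poset example, here is a minimal sketch of my own using the positive integers ordered by divisibility: the product of two numbers is their greatest common divisor and the coproduct is their least common multiple.

    # Products and coproducts in the divisibility poset.
    from math import gcd

    def lcm(a, b):
        return a * b // gcd(a, b)

    a, b = 12, 18
    print(gcd(a, b))   # 6  -- the product (meet): greatest common lower bound
    print(lcm(a, b))   # 36 -- the coproduct (join): least common upper bound
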
So where are we now? The concept of the product is very simple, but we need it as a stepping stone to the concepts of tensor product and (symmetric) monoidal category. Why? Because physical systems form a symmetric monoidal category. Using categorical arguments we can derive the complete mathematical properties of any theory of nature describing such a symmetric monoidal category. And the answer will turn out to be: quantum mechanics. Please stay tuned.

Saturday, March 4, 2017

The Curry–Howard isomorphism


Category theory may seem very abstract and intimidating, but in fact it is extremely easy to understand. In category theory we look at concrete objects from far away without any regard for their internal structure. This is similar to Bohr's position on physics: physics is about what we can say about nature, not about deciding what nature is. Surprisingly, a lot of information about the objects is derivable from their behavior alone, and this is where I am ultimately heading with this series on category theory.

Last time I mentioned the origin of category theory as the formalism used to clarify when two homology theories are equivalent. But category theory can be approached from two other directions as well, and those alternative viewpoints help provide the intuition needed to navigate its abstractions. One thread of discussion starts with the idea of computability and the work of Alonzo Church and Alan Turing. Turing was Church's student, and each started an essential line of research: lambda calculus and universal Turing machines. These later grew, on the one hand, into functional languages like JavaScript, and on the other hand into imperative, object oriented languages like C++. What one can do with lambda calculus can be achieved with universal Turing machines, and the other way around. The essential idea of computer programming is to build complex structures out of simpler building blocks. Object oriented programming starts from the idea of packaging together actions and states: an object is a "black box" containing actions (functions performing computations) and information (the internal state of the object). Functional programming, on the other hand, has no concept of an internal state; one deals only with functions which take an input, crunch the numbers, and produce an output, a spirit captured by the very name FORTRAN: FORmula TRANslation (from higher level, human-understandable syntax into the zeroes and ones understood by a machine).
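
A minimal contrast of the two styles, in a toy example of my own: the object packages an internal state together with the actions that modify it, while the pure function's output depends only on its input.

    # Object oriented style: actions packaged together with an internal state.
    class Counter:
        def __init__(self):
            self.count = 0              # internal state hidden inside the object
        def increment(self):
            self.count += 1
            return self.count

    # Functional style: no internal state, the output depends only on the input.
    def increment(count):
        return count + 1

    c = Counter()
    print(c.increment(), c.increment())   # 1 2 -- the result depends on history
    print(increment(0), increment(0))     # 1 1 -- same input, same output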

The second direction from which one can approach category theory is intuitionistic logic and the foundations of set theory. The problem with naive set theory is that one can create paradoxes like Russell's paradox: the set of all sets which are not members of themselves. The solution Russell proposed was type theory. Types introduce structure into set theory, preventing self-referential constructions. In computer programming, types are semantic rules which tell us how to interpret various sequences of zeros and ones in computer memory as integers, Boolean variables, etc.

In intuitionistic logic a statement is not established simply by disproving its negation; it is true only if an explicit construction is provided. Truth must be computed, and the parallel with computer programming is obvious. There is a name for this relationship: the Curry-Howard isomorphism. The mathematical formalism needed to rigorously spell out this correspondence is category theory. At a high level (a small typed sketch follows the list):
  • proofs are programs
  • propositions are types
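
As promised, here is a small, hedged sketch of my own of the correspondence using Python type hints: implication becomes the function type, conjunction becomes the pair type, and writing a (total) program of a given type amounts to giving a proof of the corresponding proposition.

    # Propositions as types, proofs as programs.
    from typing import Callable, Tuple, TypeVar

    A = TypeVar('A')
    B = TypeVar('B')
    C = TypeVar('C')

    # a proof of (A and B) -> A
    def proj(p: Tuple[A, B]) -> A:
        return p[0]

    # a proof of ((A -> B) and (B -> C)) -> (A -> C): implication is transitive
    def compose(fs: Tuple[Callable[[A], B], Callable[[B], C]]) -> Callable[[A], C]:
        f, g = fs
        return lambda a: g(f(a))
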
More importantly, we can attach logical and programming meaning to category theory constructions, which dramatically reduces the difficulty of category theory to that of elementary linear algebra.

There are two additional key points I want to make. First, category theory ignores the internal structure of the objects: they can be sets, topological spaces, posets, even physical systems. As such, uniqueness is relaxed in category theory and things are unique only up to isomorphism. Second, we strengthen this notion of uniqueness by seeking universal properties. This gives category theory its abstract flavour: the generalizations of standard mathematical concepts in category theory involve diagrams which must commute. The typical definition is something like: "if there is an 'impostor' which claims to have the same properties as the concept being defined, then there exists a so-and-so isomorphism such that a certain diagram commutes, which guarantees that the impostor is nothing but a restatement of the same concept up to isomorphism". Next time I will talk about the first key definition we need from category theory, that of a product, and, by flipping the arrows, that of a coproduct.