
Section 5.30 Session 26: Distributive categories and linear categories

This Session has been an interesting read. I feel like I was on the right track in looking at my graphs as matrices, but I haven’t quite been able to connect all the dots yet. This session also refers back to Exercise 20 of the Article, and I wasn’t quite satisfied with my previous work there.

Example 5.30.1. Exercise 1.

Using the above definitions...
Solution.
We’re asked to prove the following:
\begin{equation*} \begin{bmatrix} f_{AX} & f_{AY}\\ f_{BX} & f_{BY} \end{bmatrix} \cdot \begin{bmatrix} g_{XU} & g_{XV}\\ g_{YU} & g_{YV} \end{bmatrix} = \begin{bmatrix} g_{XU} \circ f_{AX} + g_{YU} \circ f_{AY} & g_{XV} \circ f_{AX} + g_{YV} \circ f_{AY}\\ g_{XU} \circ f_{BX} + g_{YU} \circ f_{BY} & g_{XV} \circ f_{BX} + g_{YV} \circ f_{BY} \end{bmatrix} \end{equation*}
Since that’s kind of a lengthy expression, I’m going to call it \(f \cdot g = h: A+B \longrightarrow U \times V\) such that
\begin{equation*} h= \begin{bmatrix} h_{AU} & h_{AV}\\ h_{BU} & h_{BV} \end{bmatrix} \end{equation*}
\begin{equation*} = \begin{bmatrix} f_{AX} & f_{AY}\\ f_{BX} & f_{BY} \end{bmatrix} \cdot \begin{bmatrix} g_{XU} & g_{XV}\\ g_{YU} & g_{YV} \end{bmatrix} \end{equation*}
\begin{equation*} = \begin{bmatrix} g_{XU} \circ f_{AX} + g_{YU} \circ f_{AY} & g_{XV} \circ f_{AX} + g_{YV} \circ f_{AY}\\ g_{XU} \circ f_{BX} + g_{YU} \circ f_{BY} & g_{XV} \circ f_{BX} + g_{YV} \circ f_{BY} \end{bmatrix} \end{equation*}
The matrix product \(f \cdot g\) is defined relative to a preferred map \(X+Y \xrightarrow{\alpha} X \times Y\text{,}\) the identity matrix \(\alpha = \begin{bmatrix} 1_X & 0_{XY} \\ 0_{YX} & 1_Y \end{bmatrix}\text{,}\) using the matrix form:
\begin{equation*} \begin{bmatrix} f_{AX} & f_{AY}\\ f_{BX} & f_{BY} \end{bmatrix} \cdot \begin{bmatrix} g_{XU} & g_{XV}\\ g_{YU} & g_{YV} \end{bmatrix} = \begin{bmatrix} g_{XU} & g_{XV}\\ g_{YU} & g_{YV} \end{bmatrix} \circ \begin{bmatrix} 1_X & 0_{XY} \\ 0_{YX} & 1_Y \end{bmatrix}^{-1} \circ \begin{bmatrix} f_{AX} & f_{AY}\\ f_{BX} & f_{BY} \end{bmatrix} \end{equation*}
We can write this simply as \(f \cdot g = g \circ \alpha^{-1} \circ f = h\text{.}\) I should note that the domains and codomains of these maps are \(A+B \xrightarrow{f} X \times Y\text{,}\) \(X+Y \xrightarrow{g} U \times V\text{,}\) and \(A+B \xrightarrow{h} U \times V\text{.}\) We can define the entries of those matrices using the injections and projections of the category. For example:
\begin{equation*} A \xrightarrow{f_{AX}} X = A \xrightarrow{j_1} A+B \xrightarrow{f} X \times Y \xrightarrow{p_1} X\text{.} \end{equation*}
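The remaining entries follow the same pattern with the other injection and projection; for instance:
\begin{equation*} A \xrightarrow{f_{AY}} Y = A \xrightarrow{j_1} A+B \xrightarrow{f} X \times Y \xrightarrow{p_2} Y\text{,} \qquad B \xrightarrow{f_{BX}} X = B \xrightarrow{j_2} A+B \xrightarrow{f} X \times Y \xrightarrow{p_1} X\text{.} \end{equation*}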
Let’s start by taking a look at the map \(h_{AU} = g_{XU} \circ f_{AX} + g_{YU} \circ f_{AY}\text{.}\) By the definition of “matrix addition”, we’ve defined \(A \xrightarrow{h_{AU}} U\) to be the unique map such that:
\begin{equation*} \begin{bmatrix} 1_{AA} & g_{XU} \circ f_{AX}\\ 0_{UA} & 1_{UU} \end{bmatrix} \cdot \begin{bmatrix} 1_{AA} & g_{YU} \circ f_{AY}\\ 0_{UA} & 1_{UU} \end{bmatrix} = \begin{bmatrix} 1_{AA} & h_{AU}\\ 0_{UA} & 1_{UU} \end{bmatrix} \end{equation*}
Applying the definition of “matrix multiplication” to this gives us:
\begin{equation*} \begin{bmatrix} 1_{AA} & h_{AU}\\ 0_{UA} & 1_{UU} \end{bmatrix} = \begin{bmatrix} 1_{AA} & g_{YU} \circ f_{AY}\\ 0_{UA} & 1_{UU} \end{bmatrix} \circ \begin{bmatrix} 1_{A} & 0_{AU} \\ 0_{UA} & 1_{U} \end{bmatrix}^{-1} \circ \begin{bmatrix} 1_{AA} & g_{XU} \circ f_{AX}\\ 0_{UA} & 1_{UU} \end{bmatrix} \end{equation*}
Let’s take a closer look at the domains and codomains of that composition:
\begin{equation*} A+U \longrightarrow A \times U \longrightarrow A+U \longrightarrow A \times U \end{equation*}
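Since matrix entries are recovered by injections and projections (just as with \(f_{AX}\) above), we should in particular be able to extract \(h_{AU}\) back out of this composite, if I have the indexing right:
\begin{equation*} A \xrightarrow{h_{AU}} U = A \xrightarrow{j_1} A+U \longrightarrow A \times U \xrightarrow{p_2} U\text{.} \end{equation*}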
By similar reasoning, we can also deduce the following:
\begin{equation*} \begin{bmatrix} 1_{AA} & h_{AV}\\ 0_{VA} & 1_{VV} \end{bmatrix} = \begin{bmatrix} 1_{AA} & g_{YV} \circ f_{AY}\\ 0_{VA} & 1_{VV} \end{bmatrix} \circ \begin{bmatrix} 1_{A} & 0_{AV} \\ 0_{VA} & 1_{V} \end{bmatrix}^{-1} \circ \begin{bmatrix} 1_{AA} & g_{XV} \circ f_{AX}\\ 0_{VA} & 1_{VV} \end{bmatrix} \end{equation*}
\begin{equation*} \begin{bmatrix} 1_{BB} & h_{BU}\\ 0_{UB} & 1_{UU} \end{bmatrix} = \begin{bmatrix} 1_{BB} & g_{YU} \circ f_{BY}\\ 0_{UB} & 1_{UU} \end{bmatrix} \circ \begin{bmatrix} 1_{B} & 0_{BU} \\ 0_{UB} & 1_{U} \end{bmatrix}^{-1} \circ \begin{bmatrix} 1_{BB} & g_{XU} \circ f_{BX}\\ 0_{UB} & 1_{UU} \end{bmatrix} \end{equation*}
\begin{equation*} \begin{bmatrix} 1_{BB} & h_{BV}\\ 0_{VB} & 1_{VV} \end{bmatrix} = \begin{bmatrix} 1_{BB} & g_{YV} \circ f_{BY}\\ 0_{VB} & 1_{VV} \end{bmatrix} \circ \begin{bmatrix} 1_{B} & 0_{BV} \\ 0_{VB} & 1_{V} \end{bmatrix}^{-1} \circ \begin{bmatrix} 1_{BB} & g_{XV} \circ f_{BX}\\ 0_{VB} & 1_{VV} \end{bmatrix} \end{equation*}
Maybe I can take these maps and turn them into a big “nested matrix”:
\begin{equation*} \begin{bmatrix} \begin{bmatrix} 1_{AA} & h_{AU}\\ 0_{UA} & 1_{UU} \end{bmatrix} & \begin{bmatrix} 1_{AA} & h_{AV}\\ 0_{VA} & 1_{VV} \end{bmatrix} \\ \begin{bmatrix} 1_{BB} & h_{BU}\\ 0_{UB} & 1_{UU} \end{bmatrix} & \begin{bmatrix} 1_{BB} & h_{BV}\\ 0_{VB} & 1_{VV} \end{bmatrix} \end{bmatrix} \end{equation*}
The components of that matrix should have the following domains and codomains:
\begin{equation*} \begin{bmatrix} A+U \longrightarrow A \times U & A+V \longrightarrow A \times V \\ B+U \longrightarrow B \times U & B+V \longrightarrow B \times V \end{bmatrix} \end{equation*}
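Before pushing further, a quick sanity check (my own aside, not from the book): in the category of finite-dimensional vector spaces, sums and products of objects coincide, maps are ordinary matrices, and the preferred map \(\alpha\) is literally an identity matrix, so \(f \cdot g = g \circ \alpha^{-1} \circ f\) should reduce to ordinary block-matrix multiplication. A small NumPy sketch, where all names and dimensions are my own invention:

import numpy as np

# Objects are dimensions; maps are matrices acting on column vectors.
# f : A+B -> X x Y is stored with columns indexed by (A, B) and rows by (X, Y),
# so its blocks are f_AX = f[:dX, :dA], f_AY = f[dX:, :dA], and so on.
rng = np.random.default_rng(0)
dA, dB, dX, dY, dU, dV = 2, 3, 2, 2, 3, 1

f = rng.standard_normal((dX + dY, dA + dB))  # f : A+B -> X x Y
g = rng.standard_normal((dU + dV, dX + dY))  # g : X+Y -> U x V

# alpha : X+Y -> X x Y is the identity matrix here, so h = g o alpha^{-1} o f.
h = g @ np.linalg.inv(np.eye(dX + dY)) @ f

# Entry check: h_AU should equal g_XU o f_AX + g_YU o f_AY.
f_AX, f_AY = f[:dX, :dA], f[dX:, :dA]
g_XU, g_YU = g[:dU, :dX], g[:dU, dX:]
assert np.allclose(h[:dU, :dA], g_XU @ f_AX + g_YU @ f_AY)

The assertion passes, which at least agrees with the entrywise formula above.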
Maybe I should be thinking of this identity matrix \(\alpha\) and its inverse \(\alpha^{-1}\) in terms of their properties? The idea being that \(\alpha^{-1} \circ \alpha = 1_{X+Y}\) and \(\alpha \circ \alpha^{-1} = 1_{X \times Y}\) for any \(X\) and \(Y\) in the category. What if I were to choose \(X = A + B\) and \(Y = U + V\text{?}\)
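With that choice, the preferred map would be the identity matrix
\begin{equation*} (A+B)+(U+V) \xrightarrow{\alpha} (A+B) \times (U+V)\text{,} \qquad \alpha = \begin{bmatrix} 1_{A+B} & 0_{(A+B)(U+V)} \\ 0_{(U+V)(A+B)} & 1_{U+V} \end{bmatrix}\text{.} \end{equation*}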
Our linear category should already contain isomorphisms \(A+B \leftrightarrows A \times B\) and \(U+V \leftrightarrows U \times V\text{.}\) If we also have an isomorphism \((A+B)+(U+V) \leftrightarrows (A + B) \times (U + V) \text{,}\) then we should be able to use the injections and projections to define an isomorphism \(A+B \leftrightarrows U \times V\text{.}\)
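Written as preferred maps, those first two isomorphisms should just be identity matrices again:
\begin{equation*} A+B \xrightarrow{\begin{bmatrix} 1_A & 0_{AB} \\ 0_{BA} & 1_B \end{bmatrix}} A \times B\text{,} \qquad U+V \xrightarrow{\begin{bmatrix} 1_U & 0_{UV} \\ 0_{VU} & 1_V \end{bmatrix}} U \times V\text{.} \end{equation*}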
Consider the following:
\begin{equation*} (A+B) \times (U + V) \xrightarrow{p_1} (A+B) \xrightarrow{\alpha} (A \times B) \xrightarrow{j_1} (A \times B) + (U \times V) \end{equation*}
\begin{equation*} (A+B) \times (U + V) \xrightarrow{p_2} (U+V) \xrightarrow{\alpha} (U \times V) \xrightarrow{j_2} (A \times B) + (U \times V) \end{equation*}
Our definitions of product and sum mean we have unique maps \(j,p\) satisfying \(j \circ \alpha \circ p = \begin{bmatrix} 1_{A+B} & 0_{(A+B)(U+V)} \\ 0_{(U+V)(A+B)} & 1_{U+V} \end{bmatrix}^{-1}\) defined from \((A+B) \times (U + V) \longrightarrow (A \times B) + (U \times V)\text{.}\) If we precompose by \(p_1\) and postcompose by \(j_1\text{,}\) then we get a map \(A + B \longrightarrow U \times V\) which has the same domain and codomain as \(h\text{.}\)
If I’m interpreting this right, this gives us a special matrix \(\begin{bmatrix} j_1 \circ p_1 & j_1 \circ p_2 \\ j_2 \circ p_1 & j_2 \circ p_2 \end{bmatrix} \) that works for any map \(X \times Y \rightarrow X+Y\) by interchanging the roles of product and coproduct. Maybe that’s the key to showing that \(\begin{bmatrix} \begin{bmatrix} 1_{AA} & h_{AU}\\ 0_{UA} & 1_{UU} \end{bmatrix} & \begin{bmatrix} 1_{AA} & h_{AV}\\ 0_{VA} & 1_{VV} \end{bmatrix} \\ \begin{bmatrix} 1_{BB} & h_{BU}\\ 0_{UB} & 1_{UU} \end{bmatrix} & \begin{bmatrix} 1_{BB} & h_{BV}\\ 0_{VB} & 1_{VV} \end{bmatrix} \end{bmatrix} = h\text{?}\)
I think I’m starting to see why the distributive property fails in a “linear category”. I want to spend a little more time exploring, so I’ll pick this up next week.