Continued from last week
Solution.
I’m starting to get the feeling that I’m overthinking this exercise and that its real purpose was to connect the matrix product with composition. I’ve been basing my Python code on the properties of the adjacency-matrix representation of a graph, but that idea was never made explicit anywhere in the text.
If I’m just thinking of a traditional “matrix product”, I would evaluate \(f \cdot g\) as follows:
\begin{equation*}
\begin{bmatrix}
f_{AX} & f_{AY}\\
f_{BX} & f_{BY}
\end{bmatrix}
\cdot
\begin{bmatrix}
g_{XU} & g_{XV}\\
g_{YU} & g_{YV}
\end{bmatrix}
=
\end{equation*}
\begin{equation*}
\begin{bmatrix}
f_{AX} \cdot g_{XU} + f_{AY} \cdot g_{YU} &
f_{BX} \cdot g_{XU} + f_{BY} \cdot g_{YU} \\
f_{AX} \cdot g_{XV} + f_{AY} \cdot g_{YV} &
f_{BX} \cdot g_{XV} + f_{BY} \cdot g_{YV}
\end{bmatrix}
\end{equation*}
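As a sanity check, here is a quick numeric sketch of my own (not from the text), with numbers standing in for the component maps: \(*\) plays the role of composition, \(+\) plays the sum of maps, and acting on row vectors makes the product read “first \(f\), then \(g\text{.}\)”

```python
# A numeric stand-in for the matrix product above (my own sketch, not from
# the text): entries are numbers rather than maps, "*" plays composition,
# and "+" plays the sum of maps.
def matmul(a, b):
    """Entry (i, j) of a.b sums a[i][k] * b[k][j] over the middle index k."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def apply(v, m):
    """Act on a row vector, so v -> f -> g reads "first f, then g"."""
    return [sum(v[i] * m[i][j] for i in range(2)) for j in range(2)]

f = [[1, 2], [3, 4]]   # stands in for f_AX, f_AY / f_BX, f_BY
g = [[5, 6], [7, 8]]   # stands in for g_XU, g_XV / g_YU, g_YV

v = [1, 1]
# the product of the matrices acts the same as the two maps in sequence
assert apply(apply(v, f), g) == apply(v, matmul(f, g))
```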
My work from last week has a number of errors. In particular, I mistakenly had a duplicate \(f_{XV}\) where my \(f_{YV}\) should have been, and the text defined \(\alpha\) opposite to how I did. Specifically, they use \(X \times Y \xrightarrow{\alpha} X+Y\) such that \(\alpha^{-1} \circ \alpha = 1_{X \times Y}\) and \(\alpha \circ \alpha^{-1} = 1_{X + Y}\) for \(X+Y \xrightarrow{\alpha^{-1}} X \times Y\text{.}\) Using that notation, we can write our matrix product \(f \cdot g\) as \(g \circ \alpha \circ f\text{.}\) This gives us the following:
\begin{equation*}
\begin{bmatrix}
g_{XU} \circ \alpha \circ f_{AX} + g_{YU}\circ \alpha \circ f_{AY} &
g_{XU} \circ \alpha \circ f_{BX} + g_{YU} \circ \alpha \circ f_{BY} \\
g_{XV} \circ \alpha \circ f_{AX} + g_{YV} \circ \alpha \circ f_{AY} &
g_{XV} \circ \alpha \circ f_{BX} + g_{YV}\circ \alpha \circ f_{BY}
\end{bmatrix}
\end{equation*}
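These entries can also be generated mechanically. Here is a small symbolic sketch of my own (the map names and the \(\circ\)/\(\alpha\) symbols are just strings) that reproduces the layout used above, with result rows indexed by \(U, V\) and columns by \(A, B\text{:}\)

```python
# Symbolic sketch (my own, just string manipulation): build the entries of
# g . alpha . f following the layout used above.
f_names = [["f_AX", "f_AY"], ["f_BX", "f_BY"]]   # rows A,B; cols X,Y
g_names = [["g_XU", "g_XV"], ["g_YU", "g_YV"]]   # rows X,Y; cols U,V

def product_entry(col, row):
    """Entry for target column `col` of g (U or V) and source row `row` of f (A or B)."""
    return " + ".join(f"{g_names[k][col]}∘α∘{f_names[row][k]}" for k in range(2))

# matrix[0][0] spells out "g_XU∘α∘f_AX + g_YU∘α∘f_AY", and so on
matrix = [[product_entry(col, row) for row in range(2)] for col in range(2)]
```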
So I think the trick to this is that our category must have identity maps and an associative composition. Given a product \(f \cdot g = g \circ \alpha \circ f\text{,}\) we could theoretically substitute \(f = \alpha^{-1}\) or \(g = \alpha^{-1}\text{.}\) That gives us \(\alpha^{-1} \cdot g = g \circ \alpha \circ \alpha^{-1} = g\) and \(f \cdot \alpha^{-1} = \alpha^{-1} \circ \alpha \circ f = f\text{.}\) This would give us the following two identities:
\begin{equation*}
\begin{bmatrix}
f_{AX} & f_{AY}\\
f_{BX} & f_{BY}
\end{bmatrix} =
\begin{bmatrix}
f_{AX} & f_{AY}\\
f_{BX} & f_{BY}
\end{bmatrix}
\cdot
\begin{bmatrix}
1_{X} & 0_{XY}\\
0_{YX} & 1_{Y}
\end{bmatrix}
\end{equation*}
\begin{equation*}
=
\begin{bmatrix}
1_{X} \circ \alpha \circ f_{AX} + 0_{YX}\circ \alpha \circ f_{AY} &
1_{X} \circ \alpha \circ f_{BX} + 0_{YX} \circ \alpha \circ f_{BY} \\
0_{XY} \circ \alpha \circ f_{AX} + 1_{Y} \circ \alpha \circ f_{AY} &
0_{XY} \circ \alpha \circ f_{BX} + 1_{Y}\circ \alpha \circ f_{BY}
\end{bmatrix}
\end{equation*}
and
\begin{equation*}
\begin{bmatrix}
g_{XU} & g_{XV}\\
g_{YU} & g_{YV}
\end{bmatrix} =
\begin{bmatrix}
1_{X} & 0_{XY}\\
0_{YX} &1_{Y}
\end{bmatrix} \cdot
\begin{bmatrix}
g_{XU} & g_{XV}\\
g_{YU} & g_{YV}
\end{bmatrix}
\end{equation*}
\begin{equation*}
=
\begin{bmatrix}
g_{XU} \circ \alpha \circ 1_{X} + g_{YU}\circ \alpha \circ 0_{XY} &
g_{XU} \circ \alpha \circ 0_{YX} + g_{YU} \circ \alpha \circ 1_{Y} \\
g_{XV} \circ \alpha \circ 1_{X} + g_{YV} \circ \alpha \circ 0_{XY} &
g_{XV} \circ \alpha \circ 0_{YX} + g_{YV}\circ \alpha \circ 1_{Y}
\end{bmatrix}
\end{equation*}
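Before simplifying, here is a numeric spot-check of my own (numbers standing in for maps again) that the identity-style matrix really does leave \(f\) and \(g\) unchanged under the product:

```python
# Numeric spot-check (my own sketch, numbers standing in for maps): the
# identity-style matrix [[1, 0], [0, 1]] is neutral for the product.
def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

I = [[1, 0], [0, 1]]   # entries play 1_X, 0_XY, 0_YX, 1_Y
f = [[1, 2], [3, 4]]
g = [[5, 6], [7, 8]]
assert matmul(f, I) == f     # f . alpha^{-1} = f
assert matmul(I, g) == g     # alpha^{-1} . g = g
```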
Using the properties of our identity maps and zero maps, we can simplify those to the following:
\begin{equation*}
\begin{bmatrix}
f_{AX} & f_{AY}\\
f_{BX} & f_{BY}
\end{bmatrix} =
\begin{bmatrix}
\alpha \circ f_{AX} + 0_{AX} &
\alpha \circ f_{BX} + 0_{BX} \\
0_{AY} + \alpha \circ f_{AY} &
0_{BY} + \alpha \circ f_{BY}
\end{bmatrix}
\end{equation*}
\begin{equation*}
\begin{bmatrix}
g_{XU} & g_{XV}\\
g_{YU} & g_{YV}
\end{bmatrix} =
\begin{bmatrix}
g_{XU} \circ \alpha + 0_{XU} &
0_{YU} + g_{YU} \circ \alpha \\
g_{XV} \circ \alpha + 0_{XV} &
0_{YV} + g_{YV}\circ \alpha
\end{bmatrix}
\end{equation*}
Considering how we’ve defined a “sum” of maps, for any \(A \xrightarrow{f} B\) and \(A \xrightarrow{g} B\) there should be uniquely defined maps satisfying
\begin{equation*}
\begin{bmatrix}
1_{AA} & f\\
0_{BA} & 1_{BB}
\end{bmatrix} \cdot
\begin{bmatrix}
1_{AA} & 0_{AB}\\
0_{BA} & 1_{BB}
\end{bmatrix}
=
\begin{bmatrix}
1_{AA} & f+0_{AB}\\
0_{BA} & 1_{BB}
\end{bmatrix}\text{,}
\end{equation*}
whose \((1,2)\) entry is \(f + 0_{AB} = f\text{,}\)
and
\begin{equation*}
\begin{bmatrix}
1_{AA} & 0_{AB}\\
0_{BA} & 1_{BB}
\end{bmatrix}
\cdot
\begin{bmatrix}
1_{AA} & g\\
0_{BA} & 1_{BB}
\end{bmatrix}
=
\begin{bmatrix}
1_{AA} & 0_{AB}+g\\
0_{BA} & 1_{BB}
\end{bmatrix}\text{,}
\end{equation*}
whose \((1,2)\) entry is \(0_{AB} + g = g\text{.}\)
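With numbers standing in for maps once more, these two “triangular” products are easy to spot-check: the \((1,2)\) entry of the product is the sum of the \((1,2)\) entries of the factors.

```python
# Numeric spot-check (my own sketch): multiplying the triangular matrices
# [[1, f], [0, 1]] and [[1, g], [0, 1]] adds the upper-right entries.
def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def tri(h):
    """The matrix [[1, h], [0, 1]], with identity and zero maps as numbers."""
    return [[1, h], [0, 1]]

f, g = 3, 4
assert matmul(tri(f), tri(g)) == tri(f + g)  # (1,2) entry is f + g
assert matmul(tri(f), tri(0)) == tri(f)      # adding a zero map changes nothing
```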
In other words, we can add a zero map to any map and get the same map back. It follows that \(\begin{bmatrix}
f_{AX} & f_{AY}\\
f_{BX} & f_{BY}
\end{bmatrix} = \begin{bmatrix}
\alpha \circ f_{AX} &
\alpha \circ f_{BX} \\
\alpha \circ f_{AY} &
\alpha \circ f_{BY}
\end{bmatrix} \) and \(\begin{bmatrix}
g_{XU} & g_{XV}\\
g_{YU} & g_{YV}
\end{bmatrix} =
\begin{bmatrix}
g_{XU} \circ \alpha &
g_{YU} \circ \alpha \\
g_{XV} \circ \alpha &
g_{YV}\circ \alpha
\end{bmatrix} \) are both “invariant” with respect to this map \(\alpha\text{.}\) Our expression for the product above can then be simplified one step further:
\begin{equation*}
\begin{bmatrix}
f_{AX} & f_{AY}\\
f_{BX} & f_{BY}
\end{bmatrix}
\cdot
\begin{bmatrix}
g_{XU} & g_{XV}\\
g_{YU} & g_{YV}
\end{bmatrix}
=
\end{equation*}
\begin{equation*}
\begin{bmatrix}
g_{XU} \circ \alpha \circ f_{AX} + g_{YU}\circ \alpha \circ f_{AY} &
g_{XU} \circ \alpha \circ f_{BX} + g_{YU} \circ \alpha \circ f_{BY} \\
g_{XV} \circ \alpha \circ f_{AX} + g_{YV} \circ \alpha \circ f_{AY} &
g_{XV} \circ \alpha \circ f_{BX} + g_{YV}\circ \alpha \circ f_{BY}
\end{bmatrix}
\end{equation*}
\begin{equation*}
=
\begin{bmatrix}
g_{XU} \circ f_{AX} + g_{YU}\circ f_{AY} &
g_{XU} \circ f_{BX} + g_{YU}\circ f_{BY} \\
g_{XV}\circ f_{AX} + g_{YV} \circ f_{AY} &
g_{XV} \circ f_{BX} + g_{YV}\circ f_{BY}
\end{bmatrix}
\end{equation*}
And that completes our proof.
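As a final sanity check, here is a concrete model of my own: maps between vector-space-like tuples, where the product \(A \times B\) stands in for the sum \(A + B\) (the two agree in this additive setting). The composite computed directly agrees with the entry-wise formula in the final matrix.

```python
# Concrete model (my own sketch): components are scalar multiplications,
# so composition is "apply in sequence" and the sum of maps is pointwise "+".
f_AX, f_AY = (lambda a: 2 * a), (lambda a: 3 * a)    # components out of A
f_BX, f_BY = (lambda b: 5 * b), (lambda b: 7 * b)    # components out of B
g_XU, g_XV = (lambda x: 11 * x), (lambda x: 13 * x)  # components out of X
g_YU, g_YV = (lambda y: 17 * y), (lambda y: 19 * y)  # components out of Y

def f(a, b):
    """f, with A x B standing in for A + B, landing in X x Y."""
    return (f_AX(a) + f_BX(b), f_AY(a) + f_BY(b))

def g(x, y):
    """g, with X x Y standing in for X + Y, landing in U x V."""
    return (g_XU(x) + g_YU(y), g_XV(x) + g_YV(y))

def gf_matrix(a, b):
    """The entry-wise formula from the final matrix above."""
    u = g_XU(f_AX(a)) + g_YU(f_AY(a)) + g_XU(f_BX(b)) + g_YU(f_BY(b))
    v = g_XV(f_AX(a)) + g_YV(f_AY(a)) + g_XV(f_BX(b)) + g_YV(f_BY(b))
    return (u, v)

# composing directly matches the matrix-entry formula
assert g(*f(2, 3)) == gf_matrix(2, 3)
```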