Lecture 70 - Tensoring Enriched Profunctors

I could move on to more realistic examples of collaborative design, but it would take a while to develop these, and I think our time will be better spent reaping the rewards of the mathematical work we've already done. So, for more on co-design, please read Chapter 4 of the book.

Right now I want to talk more about enriched profunctors. We've seen how to do lots of fun stuff with feasibility relations. We can 'compose' them, which describes running two processes in series:

We can 'tensor' them, which describes running two processes in parallel:

We can also create feedback loops using cups:

and caps:

These features are the key building blocks of co-design diagrams - but they also show up in diagrams of electrical circuits, and many other diagrams used by scientists and engineers! So, we should understand them more deeply.
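Over \(\mathcal{V} = \textbf{Bool}\) these operations are easy to compute concretely. Here is a minimal Python sketch (the toy data and names are mine, and the monotonicity conditions on feasibility relations are ignored) of series composition and parallel tensoring, treating a feasibility relation as a Boolean matrix:

```python
# A feasibility relation Phi : X -|-> Y between finite sets is modeled
# as a dict mapping (x, y) pairs to True/False.

def compose(Phi, Psi, Y):
    """Series composition: (Psi . Phi)(x, z) = OR_y [ Phi(x, y) AND Psi(y, z) ]."""
    X = {x for (x, _) in Phi}
    Z = {z for (_, z) in Psi}
    return {(x, z): any(Phi[(x, y)] and Psi[(y, z)] for y in Y)
            for x in X for z in Z}

def tensor(Phi, Psi):
    """Parallel composition: (Phi ⊗ Psi)((x, x2), (y, y2)) = Phi(x, y) AND Psi(x2, y2)."""
    return {((x, x2), (y, y2)): Phi[(x, y)] and Psi[(x2, y2)]
            for (x, y) in Phi for (x2, y2) in Psi}

# Example: X = {'x'}, Y = {'y1', 'y2'}, Z = {'z'}
Phi = {('x', 'y1'): True, ('x', 'y2'): False}
Psi = {('y1', 'z'): True, ('y2', 'z'): False}
print(compose(Phi, Psi, {'y1', 'y2'}))   # {('x', 'z'): True}
```

The `any(... and ...)` line is exactly the quantale formula \(\bigvee_y \Phi(x,y) \otimes \Psi(y,z)\) with \(\bigvee = \textrm{or}\) and \(\otimes = \textrm{and}\).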

For starters, feasibility relations are just a special case of \(\mathcal{V}\)-enriched profunctors, namely the case where \(\mathcal{V} = \textbf{Bool}\). So, it's natural to ask if we can do all the same things with \(\mathcal{V}\)-enriched profunctors. And the answer is yes!

This is nice, because there's a variant of co-design diagrams where we use \(\mathcal{V} = \textbf{Cost}\) instead of \(\mathcal{V} = \textbf{Bool}\). In fact they were developed by the U.S. Navy starting in 1957 to help build nuclear submarines! They're called PERT charts, and they've been subsequently used for many other things. Of course, none of the practical people using PERT charts know they are dealing with enriched profunctors, and if you told them they probably wouldn't care. But still, this is a great example of how concepts that arise naturally in category theory can arise naturally in 'real life'.

So, let's take what we've done for feasibility relations, and do it for \(\mathcal{V}\)-enriched profunctors. It all works the same way.

We'll start by tensoring them.

In Lecture 63 and Lecture 64 we saw that if \(\mathcal{V}\) is any commutative quantale there's a category \(\mathbf{Prof}_\mathcal{V}\) with:

+ \(\mathcal{V}\)-enriched categories as objects, and
+ \(\mathcal{V}\)-enriched profunctors as morphisms.

But \(\mathbf{Prof}_\mathcal{V}\) is better than a mere category: you can also tensor objects in this category, and tensor morphisms!

We saw how to tensor \(\mathcal{V}\)-enriched categories way back in Lecture 62. I wrote this tensor product with a times sign, but now it's better to use the symbol \(\otimes\). Here's how it works: for any \(\mathcal{V}\)-enriched categories \(\mathcal{X}\) and \(\mathcal{Y}\), there is a \(\mathcal{V}\)-enriched category \(\mathcal{X} \otimes \mathcal{Y}\) for which:

+ an object is a pair \( (x,y) \) where \(x\) is an object of \(\mathcal{X}\) and \(y\) is an object of \(\mathcal{Y}\), and
+ the hom-objects are given by \( (\mathcal{X} \otimes \mathcal{Y})((x,y),(x',y')) = \mathcal{X}(x,x') \otimes \mathcal{Y}(y,y') \).
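In \(\textbf{Cost}\), where \(\otimes\) is addition, tensoring two enriched categories means distances add: \((\mathcal{X} \otimes \mathcal{Y})((x,y),(x',y')) = \mathcal{X}(x,x') + \mathcal{Y}(y,y')\). A small Python sketch (the spaces here are toy data of my own):

```python
# Two tiny Cost-categories (Lawvere metric spaces), given as dicts of distances.
X = {('a', 'a'): 0.0, ('a', 'b'): 3.0, ('b', 'a'): float('inf'), ('b', 'b'): 0.0}
Y = {('u', 'u'): 0.0, ('u', 'v'): 2.0, ('v', 'u'): 5.0, ('v', 'v'): 0.0}

def tensor_cats(X, Y):
    """(X ⊗ Y)((x,y),(x',y')) = X(x,x') + Y(y,y'), since ⊗ = + in Cost."""
    return {((x, y), (x2, y2)): X[(x, x2)] + Y[(y, y2)]
            for (x, x2) in X for (y, y2) in Y}

XY = tensor_cats(X, Y)
print(XY[(('a', 'u'), ('b', 'v'))])   # 3.0 + 2.0 = 5.0
```

Note that the hom from an object to itself is \(0\), the unit for \(+\), as it should be.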

This ability to tensor things makes \(\mathbf{Prof}_{\mathcal{V}}\) into a so-called 'monoidal category'. Instead of defining that - you can look up the definition if you care - I'll just get you to check some of the things a monoidal category should have.

First, we want tensoring to be associative. More precisely:

**Puzzle 224.** Suppose \(\mathcal{X}, \mathcal{Y}\) and \(\mathcal{Z}\) are \(\mathcal{V}\)-enriched categories. Show there is a \(\mathcal{V}\)-enriched profunctor

$$ \alpha_{\mathcal{X}, \mathcal{Y},\mathcal{Z}} \colon (\mathcal{X} \otimes \mathcal{Y}) \otimes \mathcal{Z} \nrightarrow \mathcal{X} \otimes (\mathcal{Y} \otimes \mathcal{Z}) $$

which has an inverse.

There is just one answer to this puzzle, and this choice of \( \alpha_{\mathcal{X}, \mathcal{Y},\mathcal{Z}}\) is called the associator.

Remember, there's a \(\mathcal{V}\)-enriched category called \(\mathbf{1}\). This has one object \(0\) and \(\mathbf{1}(0,0) = I \), where \(I\) is the multiplicative identity in \(\mathcal{V}\). Not surprisingly, this guy \(\mathbf{1}\) is the unit for tensoring! More precisely:

**Puzzle 225.** Suppose \(\mathcal{X}\) is a \(\mathcal{V}\)-enriched category. Show there are \(\mathcal{V}\)-enriched profunctors

$$ \lambda_{\mathcal{X}} \colon \mathbf{1} \otimes \mathcal{X} \nrightarrow \mathcal{X} \qquad \textrm{and} \qquad \rho_{\mathcal{X}} \colon \mathcal{X} \otimes \mathbf{1} \nrightarrow \mathcal{X} $$

which have inverses.

There's just one choice of each of these, unless you're a lot more sneaky than me. We call \(\lambda_{\mathcal{X}}\) the left unitor and \(\rho_{\mathcal{X}}\) the right unitor.

That's most of what we need for a monoidal category! Not everything: there are some subtleties. We want tensoring to define a functor \( \otimes \colon \mathbf{Prof}_{\mathcal{V}} \times \mathbf{Prof}_{\mathcal{V}} \to \mathbf{Prof}_{\mathcal{V}}\). And we need the associator and unitors to obey some equations.

But instead of scaring you with these details now, let me pound home the basic idea:

Monoidal categories let us study processes that we can run in series and in parallel.

You already know dozens of monoidal categories, and now you know infinitely many more: all the categories \(\mathbf{Prof}_{\mathcal{V}}\).

Puzzle 226. List a bunch of monoidal categories. Don't be afraid to guess: if a category has a good way of combining two objects to get a new object, and it seems kind of associative, chances are you've got a monoidal category.

> **Puzzle 224.** Suppose \(\mathcal{X}, \mathcal{Y}\) and \(\mathcal{Z}\) are \(\mathcal{V}\)-enriched categories. Show there is a \(\mathcal{V}\)-enriched profunctor
>
> $$ \alpha_{\mathcal{X}, \mathcal{Y},\mathcal{Z}} \colon (\mathcal{X} \otimes \mathcal{Y}) \otimes \mathcal{Z} \nrightarrow \mathcal{X} \otimes (\mathcal{Y} \otimes \mathcal{Z}) $$
>
> which has an inverse.

Here's my go. It's hard to read functions of tuples, so I am going to use \(\langle\ldots\rangle\) for tuples rather than parentheses.

$$ \alpha_{\mathcal{X}, \mathcal{Y},\mathcal{Z}}(\langle\langle x,y\rangle,z\rangle,\langle x',\langle y',z'\rangle\rangle) = \mathcal{X}(x,x') \otimes \mathcal{Y}(y,y') \otimes \mathcal{Z}(z,z') $$

Its inverse is similar:

$$ \alpha^{-1}_{\mathcal{X}, \mathcal{Y},\mathcal{Z}}(\langle x, \langle y,z\rangle\rangle,\langle \langle x', y'\rangle,z'\rangle) = \mathcal{X}(x,x') \otimes \mathcal{Y}(y,y') \otimes \mathcal{Z}(z,z') $$

**Puzzle 226.** Cartesian closed categories are an example. In this case \(\times\) plays the role of \(\otimes\) and the terminal object plays the role of \(\mathbf{1}\). Some CCCs include [Heyting algebras](https://en.wikipedia.org/wiki/Heyting_algebra) (and Boolean algebras), the category of posets, and any topos such as \(\mathbf{Set}\).

The category of finite-dimensional vector spaces over a field is another example. In this case \(\otimes\) is the [tensor product of vector spaces](https://en.wikipedia.org/wiki/Tensor_product#Tensor_product_of_vector_spaces), and the unit \(\mathbf{1}\) is the one-dimensional space given by the ground field itself.

I mentioned them elsewhere, but [Action Algebras](https://en.wikipedia.org/wiki/Action_algebra) from computer science form a monoidal category; in fact they form a closed monoidal category that isn't symmetric.

I was reading a bit about John's [category of finite probabilities](http://math.ucr.edu/home/baez/networks_oxford/networks_entropy.pdf). I believe this category is monoidal as well because it has a coproduct and an initial object. Here's my idea:

1. Let \(\mu\) be a finite probability measure on \(X\) and \(\rho\) be a finite probability measure on \(Y\). In John's \(\mathbf{FinProb}\), these objects are \((X,\mu)\) and \((Y,\rho)\) respectively. Define a new measure \(\mu \sqcup \rho\) on the disjoint union \(X \sqcup Y\) where
$$ (\mu \sqcup \rho)(E) = \mu(E \cap X)\, \rho(E \cap Y) $$

2. We want the measure defined above to be a coproduct. So we must find two injective measure-preserving maps \(l \colon X \rightarrowtail X \sqcup Y\) and \(r \colon Y \rightarrowtail X \sqcup Y\) such that
$$ \mu(E) = (\mu \sqcup \rho)(l(E)) \quad \textrm{and} \quad \rho(E') = (\mu \sqcup \rho)(r(E')) $$
This works when \(l\) maps \(E \mapsto E \sqcup Y\) and \(r\) maps \(E' \mapsto X \sqcup E'\). For example, if \(X = \lbrace a, b, c \rbrace\) and \(Y = \lbrace 1,2,3 \rbrace\) then \(l(\lbrace a\rbrace) = \lbrace a,1,2,3\rbrace\) and \(r(\lbrace 2,3\rbrace) = \lbrace a,b,c,2,3\rbrace\).

3. We need an identity, so I pick the *initial* probability distribution \(\mathbf{1}\) on the one-element set \(\lbrace \ast \rbrace\). This probability distribution assigns \(\varnothing \mapsto 0\) and \(\lbrace \ast \rbrace \mapsto 1\).

Now I'm not 100% sure this is really a coproduct, since it diverges from the coproduct construction in \(\mathbf{Set}\). John said "Don't be afraid to guess" so I'm taking some liberty. :D If it's not nonsense, it might be fun to wonder what the *information loss* functor \(\mathbf{IL}\) from [John Baez et al. (2011)](https://arxiv.org/pdf/1106.1791.pdf) does here.
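The associator above and its proposed inverse can be spot-checked numerically over \(\textbf{Bool}\): composing the two profunctors should give back the identity profunctor on \((\mathcal{X}\otimes\mathcal{Y})\otimes\mathcal{Z}\), i.e. its hom. Here's a small Python verification on toy preorders of my own choosing:

```python
from itertools import product

# Three small preorders, given as hom dicts into {True, False}.
Xh = {('a', 'a'): True, ('a', 'b'): True, ('b', 'a'): False, ('b', 'b'): True}
Yh = {('u', 'u'): True, ('u', 'v'): False, ('v', 'u'): False, ('v', 'v'): True}
Zh = {('s', 's'): True, ('s', 't'): True, ('t', 's'): False, ('t', 't'): True}

Xs, Ys, Zs = ['a', 'b'], ['u', 'v'], ['s', 't']

def alpha(p, q):        # p = ((x, y), z),  q = (x2, (y2, z2))
    (x, y), z = p; x2, (y2, z2) = q
    return Xh[(x, x2)] and Yh[(y, y2)] and Zh[(z, z2)]

def alpha_inv(q, p):    # q = (x, (y, z)),  p = ((x2, y2), z2)
    x, (y, z) = q; (x2, y2), z2 = p
    return Xh[(x, x2)] and Yh[(y, y2)] and Zh[(z, z2)]

def hom_left(p, p2):    # the hom of (X ⊗ Y) ⊗ Z
    (x, y), z = p; (x2, y2), z2 = p2
    return Xh[(x, x2)] and Yh[(y, y2)] and Zh[(z, z2)]

left = [((x, y), z) for x in Xs for y in Ys for z in Zs]
right = [(x, (y, z)) for x in Xs for y in Ys for z in Zs]

# (alpha_inv ∘ alpha)(p, p2) = OR_q alpha(p, q) AND alpha_inv(q, p2)
for p, p2 in product(left, left):
    assert any(alpha(p, q) and alpha_inv(q, p2) for q in right) == hom_left(p, p2)
```

The assertion succeeds because each preorder is reflexive (gives the \(\geq\) direction, taking \(q\) to match \(p\)) and transitive (gives the \(\leq\) direction), which is exactly the shape of the general proof.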

Anindya wrote:

> The proof for the inverse is similar. Though it strikes me that we also need to prove that it *is* an inverse...

Yes. Here's a thought: you could try to define the associator and its inverse as companions of enriched functors that are clearly inverses of each other, and then prove

$$ \widehat{FG} = \widehat{F} \widehat{G} . $$

Do you see where I'm wanting to go with this?

This might require less calculation than a direct approach: for example, you wouldn't have to prove the associator and its inverse are profunctors. It would also develop some general tools that could be reused.

Puzzle 226:

Guess 1: Rel (the category where objects are sets and morphisms are relations): my guess is that this can be viewed as the subcategory of Feas where the objects are discrete posets (the only morphisms are identities). But one could also probably show it's a monoidal category more directly: the cartesian product would be a tensor; for example, \(\lbrace 1 \rbrace \times A\) is naturally isomorphic to \(A\), and the cartesian product is associative (up to isomorphism).

Guess 2: the category of finite-dimensional vector spaces, equipped with the tensor product (from which I suppose Guess 1 would follow). Reasons for guess: the tensor product is associative, maps to a vector space whose basis is formed from a cartesian product, and has the word "tensor" in it.

Guess 3: the category of sets with conditional probability distributions as morphisms, i.e. \(A \to B\) is the set of conditional distributions over \(B\) given \(A\). The one-element set is the unit, and the tensor would effectively be the cartesian product again.

*Digression*: an interesting Math StackExchange post from subsequent googling: https://math.stackexchange.com/questions/175169/direct-products-in-the-category-rel. The concise argument in the first answer shows that while the monoidal product in Rel is the cartesian product, the categorical product is the disjoint union. If I'm not mistaken, John has elsewhere remarked on the importance of this fact for more sophisticated categories than Rel. But I found the concision of the proof in the link illuminating.

*Puzzle 226*

My assumption is that stories form a monoidal category with the monoidal product given by parallel plots.

In fact, they could form another monoidal category with the monoidal product given by alternative plots, to give semantics to gamebooks/choose-your-own-adventure style books.

John wrote:

> Here's a thought: you could try to define the associator and its inverse as companions of enriched functors that are clearly inverses of each other, and then prove \(\widehat{FG} = \widehat{F}\circ\widehat{G}\)... This might require less calculation than a direct approach: for example, you wouldn't have to prove the associator and its inverse are profunctors. It would also develop some general tools that could be reused.

OK, let's try this. We had a look at composing companions back in [Lecture 66](https://forum.azimuthproject.org/discussion/comment/20385/#Comment_20385), so I'm adapting some stuff from there.

First, let's prove that \(\widehat{F} \circ \widehat{G} = \widehat{FG}\), where \(F \colon \mathcal{Y} \to \mathcal{Z}\) and \(G \colon \mathcal{X} \to \mathcal{Y}\):

$$ (\widehat{F} \circ \widehat{G})(x, z) = \bigvee_y \big(\widehat{G}(x, y) \otimes \widehat{F}(y, z)\big) = \bigvee_y \big(\mathcal{Y}(Gx, y) \otimes \mathcal{Z}(Fy, z)\big) $$

It turns out that this join equals \(\mathcal{Z}(FGx, z) = \widehat{FG}(x, z)\).

To prove this, first set \(y = Gx\) to see that our join \(\geq \mathcal{Y}(Gx, Gx) \otimes \mathcal{Z}(FGx, z) \geq I \otimes \mathcal{Z}(FGx, z) = \mathcal{Z}(FGx, z)\).

But given any \(y\) we have \(\mathcal{Z}(FGx, z) \geq \mathcal{Z}(FGx, Fy) \otimes \mathcal{Z}(Fy, z) \geq \mathcal{Y}(Gx, y) \otimes \mathcal{Z}(Fy, z)\).

So we also have \(\mathcal{Z}(FGx, z) \geq\) our join. QED.

Second, note that the companion of the identity \(\mathcal{V}\)-functor \(\mathcal{X}\to\mathcal{X}\) is the identity profunctor \(\mathcal{X}\nrightarrow\mathcal{X}\).

It follows that if \(F\) and \(G\) are inverses, then the profunctors \(\widehat{F}\) and \(\widehat{G}\) are inverses, i.e. isomorphisms in the category \(\mathbf{Prof}_{\mathcal{V}}\).

Third, let's define \(A \colon (\mathcal{X}\otimes\mathcal{Y})\otimes\mathcal{Z} \to \mathcal{X}\otimes(\mathcal{Y}\otimes\mathcal{Z})\) and \(B \colon \mathcal{X}\otimes(\mathcal{Y}\otimes\mathcal{Z}) \to (\mathcal{X}\otimes\mathcal{Y})\otimes\mathcal{Z}\) by

$$ A\langle\langle x, y\rangle, z\rangle = \langle x, \langle y, z\rangle\rangle \qquad B\langle x, \langle y, z\rangle\rangle = \langle\langle x, y\rangle, z\rangle $$

It's easy to check that \(A\) and \(B\) are indeed \(\mathcal{V}\)-functors, and they are obviously mutually inverse.

Finally, what is \(\widehat{A}\)? By the definition of a companion we have

$$ \widehat{A}(\langle\langle x, y\rangle, z\rangle, \langle x', \langle y', z'\rangle\rangle) = (\mathcal{X}\otimes(\mathcal{Y}\otimes\mathcal{Z}))(A\langle\langle x, y\rangle, z\rangle, \langle x', \langle y', z'\rangle\rangle) $$

$$ = (\mathcal{X}\otimes(\mathcal{Y}\otimes\mathcal{Z}))(\langle x, \langle y, z\rangle\rangle, \langle x', \langle y', z'\rangle\rangle) = \mathcal{X}(x, x') \otimes \mathcal{Y}(y, y') \otimes \mathcal{Z}(z, z') $$

which is our associator one way, and similarly \(\widehat{B}\) is the associator the other way. Combining this with the previous results, we have proved that the associators are profunctors and that they are inverse to each other.
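The identity \(\widehat{F} \circ \widehat{G} = \widehat{FG}\) proved above can be checked numerically over \(\textbf{Cost}\), where \(\otimes\) is \(+\) and the join \(\bigvee\) is \(\min\). The tiny Lawvere metric spaces and functors below are my own toy data:

```python
# Check  F̂ ∘ Ĝ = (FG)^  on toy Cost-categories. In Cost, profunctor
# composition is a "cheapest route through the middle category" formula.

Xd = {('x0', 'x0'): 0.0}
Yd = {('y0', 'y0'): 0.0, ('y0', 'y1'): 2.0, ('y1', 'y0'): 5.0, ('y1', 'y1'): 0.0}
Zd = {('z0', 'z0'): 0.0, ('z0', 'z1'): 1.0, ('z1', 'z0'): 3.0, ('z1', 'z1'): 0.0}

G = {'x0': 'y0'}               # a Cost-functor G : X -> Y
F = {'y0': 'z0', 'y1': 'z1'}   # a Cost-functor F : Y -> Z (distances shrink)

def companion(F, cod_hom):
    """F̂(x, z) = Z(Fx, z), using the codomain's hom cod_hom."""
    return lambda x, z: cod_hom[(F[x], z)]

def compose(Phi, Psi, mid_obs):
    """(Psi ∘ Phi)(x, z) = min_y [ Phi(x, y) + Psi(y, z) ]."""
    return lambda x, z: min(Phi(x, y) + Psi(y, z) for y in mid_obs)

Ghat = companion(G, Yd)
Fhat = companion(F, Zd)
FGhat = companion({x: F[G[x]] for x in G}, Zd)
FoG = compose(Ghat, Fhat, ['y0', 'y1'])

assert all(FoG('x0', z) == FGhat('x0', z) for z in ['z0', 'z1'])
```

The two inequalities in the proof show up concretely: the minimum is attained at \(y = Gx\), and no other \(y\) can beat it.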

Anindya - I like this argument. This could be the slickest possible solution to Puzzle 224.

Now that you mention it, I seem to recall you and Matthew had trouble proving the composite of conjoints is the conjoint of the composite:

$$ \check{FG} \stackrel{?}{=} \check{F} \check{G} .$$

I don't even know if this is true! It would seem strangely asymmetrical for this to be false.


(I'm wondering if we could get this result about conjoints directly from the previous one about companions by composing with cup and cap to turn a profunctor \(\mathcal{X}\nrightarrow\mathcal{Y}\) into a profunctor \(\mathcal{Y}^\text{op}\nrightarrow\mathcal{X}^\text{op}\).)

Addendum to the above:

Given a \(\mathcal{V}\)-functor \(F \colon \mathcal{Z} \to \mathcal{Y}\), define \(F^\text{op} \colon \mathcal{Z}^\text{op} \to \mathcal{Y}^\text{op}\) by \(F^\text{op}(z) = F(z)\). It's easy to prove \(F^\text{op}\) is indeed a \(\mathcal{V}\)-functor.

Note that \(\check{F}(y, z) = \mathcal{Y}(y, Fz) = \mathcal{Y}^\text{op}(F^\text{op} z, y) = \widehat{F^\text{op}}(z, y)\).

So we can use this construction to turn conjoints into companions and vice versa. In particular we have

$$ \check{GF}(x, z) = \widehat{(GF)^\text{op}}(z, x) = \widehat{G^\text{op} F^\text{op}}(z, x) = (\widehat{G^\text{op}}\circ \widehat{F^\text{op}})(z, x) = (\check{F}\circ\check{G})(x, z) $$

... which gives us the rule about composing conjoints directly from the one about composing companions.

[EDIT: this turned out to be a little more complex than I'd initially thought. I think it's right now, but the last step needs proving.]
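Here's a matching numerical check of the conjoint rule \(\check{GF} = \check{F} \circ \check{G}\) over \(\textbf{Cost}\), again with made-up toy data:

```python
# Check  (GF)ˇ = F̌ ∘ Ǧ  on toy Cost-categories, where the conjoint of
# F : Z -> Y is F̌(y, z) = Y(y, Fz), using the codomain's hom.

Xd = {('x0', 'x0'): 0.0, ('x0', 'x1'): 1.0, ('x1', 'x0'): 3.0, ('x1', 'x1'): 0.0}
Yd = {('y0', 'y0'): 0.0, ('y0', 'y1'): 2.0, ('y1', 'y0'): 4.0, ('y1', 'y1'): 0.0}

F = {'z0': 'y0'}               # F : Z -> Y, with Z a one-point space
G = {'y0': 'x0', 'y1': 'x1'}   # G : Y -> X (distances shrink, so a functor)

def conjoint(F, cod_hom):
    """F̌(y, z) = Y(y, Fz) for F : Z -> Y with codomain hom cod_hom."""
    return lambda y, z: cod_hom[(y, F[z])]

Fcheck = conjoint(F, Yd)                          # Y -|-> Z
Gcheck = conjoint(G, Xd)                          # X -|-> Y
GFcheck = conjoint({z: G[F[z]] for z in F}, Xd)   # X -|-> Z

def compose(Phi, Psi, mid_obs):
    """(Psi ∘ Phi)(a, c) = min_b [ Phi(a, b) + Psi(b, c) ]."""
    return lambda a, c: min(Phi(a, b) + Psi(b, c) for b in mid_obs)

FoG = compose(Gcheck, Fcheck, ['y0', 'y1'])       # F̌ ∘ Ǧ : X -|-> Z

assert all(FoG(x, 'z0') == GFcheck(x, 'z0') for x in ['x0', 'x1'])
```

Notice the reversal: composing conjoints gives the conjoint of the composite with the functors in the opposite order, just as the addendum's calculation predicts.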

**Puzzle 226.** Here are some answers:

+ The category of abelian groups with group homomorphisms, with the direct sum (the coproduct of abelian groups) as the monoidal product and the trivial group as the unit.
+ Monoidal posets: monoidal categories where every homset is either a singleton or empty.
+ The category of sets with the disjoint union as the monoidal product and singletons as the unit.

But I had this conjecture while thinking of examples: every example was basically using a coproduct as the monoidal product.

**Conjecture:** Any category can be turned into a monoidal category if it has all coproducts and a unit object \(I\) such that \(\hom(I,I) = \lbrace 1_I \rbrace\) (it has only one morphism from itself to itself, the identity map), where the monoidal product is given by the coproduct and the unit object is the monoidal unit.

I'll try and prove this later.

Scott - I believe your conjecture!

I think when you say 'has all coproducts' you mean what most category theorists mean when they say 'has binary coproducts': each pair of objects \(x\) and \(y\) has a coproduct, written \(x + y\). We can try to take the coproduct of any number of objects, even an infinite number, but binary coproducts are probably the most important case.

People often study categories with binary coproducts and an **initial object**, often written \(0\): that's an object that has exactly one morphism to *any* other object in the category.

An initial object can be seen as a 'nullary coproduct', and we can build up ternary, etc. coproducts from binary ones, so a **category with finite coproducts** is the same as a category with binary coproducts and an initial object.

Categories with finite coproducts are very nice: perhaps the most important is the category \(\textbf{FinSet}\), with finite sets as objects and functions as morphisms.

Speaking of companions and conjoints, I was wondering whether the cap and cup profunctors can be written as companions or conjoints of a functor.
I think that \(\text{hom} : \mathcal{X}^{\text{op}} \otimes \mathcal{X} \to \mathcal{V}\) is the companion of the identity \(\mathcal{V}\)-functor \(\mathbf{1} : \mathcal{X} \to \mathcal{X}\).
So I was kind of expecting to get something similar for cups and caps, but I didn't manage to.


Theorem: A category \(C\) with all binary coproducts (there exists a coproduct for every pair of objects) and an initial object \(\mathbf{0}\) is a monoidal category where the coproduct is the tensor product and the initial object is the unit. If for any pair of objects \(x,y\), there exists a pair of morphisms \(f : x \to y\) and \(f' : y \to x\), then \(C\) is also a symmetric monoidal category.

Proof: Here's what we need to show for \(C\) in order to prove it is a symmetric monoidal category.

There's a bifunctor \(\otimes : C \times C \to C\) which is a functor from the product category \(C \times C\) into \(C\). We must be able to tensor both objects and morphisms.

an identity object \(I\): this will be isomorphic to the initial object \(\mathbf{0}\)

the tensor product must be associative up to isomorphism, meaning there must be a natural isomorphism \(\alpha_{xyz} : (x \otimes y) \otimes z \cong x \otimes (y \otimes z)\)

tensoring any object \(x\) with the identity object must be isomorphic to \(x\): so we must have left and right unitor natural isomophisms \(\lambda_x : I \otimes x \cong x \) and \(\rho_x : x \otimes I \cong x\)

braiding and the associator obey the hexagon identities and we require \(B_{yx}B_{xy} = 1_{x\otimes y}\)

We are given that \(C\) has all binary coproducts (as John points out, if you have all binary coproducts, then you also can repeat the binary coproduct to get objects isomorphic to coproducts over any number of objects) and that \(C\) has an initial object \(\mathbf{0}\). Both of these objects are defined as colimits and have an universal property which we will use to show that we can form a symmetric monoidal category. I'll write the coproduct of \(A\) and \(B\) as \(A+ B\)

The universal property of the coproduct of two objects \(A,B\) is for any pair of maps \(f:A \to X\) and \(g:B\to X\), there is an unique map \([f,g]:A+B\to X\) such that \(f\) and \(g\) factor into \(f = [f,g]i_A\) and \(g = [f,g]i_B\) where \(i_A : A \to A+B\) and \(i_B : B \to A +B\) are inclusion maps.

The universal property of the initial object \(\mathbf{0}\) is for any object \(X\), there exists an unique map \(h: \mathbf{0} \to X\). Note that initial objects are always isomorphic to one another (proof is left to the reader), so even if there's more than one, we can treat them as one object.

For the coproduct \(+\) to be the tensor product \(\otimes\), we need to define what happens to objects and morphisms (or rather pairs of objects and morphisms). The coproduct already says what to do: let \(A\otimes B = A+B\). Since \(C\) has all binary coproducts, we can do this for all objects. But what do we do with morphisms?

Using this argument, we also get the braiding natural isomorphism. For any pair of objects \(x,y\), if there exists any map \(f: x\to y\) and \(f': y\to x\), then
$$x + y \cong y + x $$
since there is a unique map \([f,f'] : x + y \to y + x \).

Let's show that for any \(x\), \(\mathbf{0}+x \cong x \cong x+\mathbf{0}\) proving that the initial object serves as an identity object and that the left and right unitor natural isomorphisms exist.

By the universal property of the initial object, there's a unique morphism \(!_x : \mathbf{0} \to x\) for any \(x\). We always have the identity morphism \(1_x : x \to x\) and so by the universal property of the coproduct, we have the unique morphism \([!,1_x] : \mathbf{0} + x \to x \) and \([1_x, !]:x + \mathbf{0} \to x\). But for any object \(a\) we have \(\hom(\mathbf{0} + x,a)\cong \hom(x,a) \cong \hom(x+\mathbf{0},a)\) since given \(f:x\to a\) there's an unique choice of map for each homset: \([!_a ,f]\), \(f\), and \([f,!_a]\)

All that's left is to show that the coherence conditions hold.

Comment Source:**Theorem:** A category \\(C\\) with all binary coproducts (there exists a coproduct for every pair of objects) and an initial object \\(\mathbf{0}\\) is a monoidal category where the coproduct is the tensor product and the initial object is the unit. If for any pair of objects \\(x,y\\), there exists a pair of morphisms \\(f : x \to y\\) and \\(f' : y \to x\\), then \\(C\\) is also a symmetric monoidal category.
Proof: Here's what we need to show for \\(C\\) in order to prove it is a symmetric monoidal category.
+ There's a bifunctor \\(\otimes : C \times C \to C\\) which is a functor from the product category \\(C \times C\\) into \\(C\\). We must be able to tensor both objects and morphisms.
+ an identity object \\(I\\): this will be isomorphic to the initial object \\(\mathbf{0}\\)
+ the tensor product must be associative up to isomorphism, meaning there must be a natural isomorphism \\(\alpha_{xyz} : (x \otimes y) \otimes z \cong x \otimes (y \otimes z)\\)
+ tensoring any object \\(x\\) with the identity object must be isomorphic to \\(x\\): so we must have left and right unitor natural isomophisms \\(\lambda_x : I \otimes x \cong x \\) and \\(\rho_x : x \otimes I \cong x\\)
+ both the associator and unitors must satisfy coherence conditions: [triangle and pentagram diagrams](https://ncatlab.org/nlab/show/symmetric+monoidal+category)
+ a braiding natural isomorphism: \\(B_{xy} : x \otimes y \to y\otimes x\\)
+ braiding and the associator obey the [hexagon identities](https://ncatlab.org/nlab/show/symmetric+monoidal+category) and we require \\(B_{yx}B_{xy} = 1_{x\otimes y}\\)
We are given that \\(C\\) has all binary coproducts (as John points out, if you have all binary coproducts, then you also can repeat the binary coproduct to get objects isomorphic to coproducts over any number of objects) and that \\(C\\) has an initial object \\(\mathbf{0}\\). Both of these objects are defined as colimits and have an universal property which we will use to show that we can form a symmetric monoidal category. I'll write the coproduct of \\(A\\) and \\(B\\) as \\(A+ B\\)
+ The universal property of the coproduct of two objects \\(A,B\\) is for any pair of maps \\(f:A \to X\\) and \\(g:B\to X\\), there is an unique map \\([f,g]:A+B\to X\\) such that \\(f\\) and \\(g\\) factor into \\(f = [f,g]i_A\\) and \\(g = [f,g]i_B\\) where \\(i_A : A \to A+B\\) and \\(i_B : B \to A +B\\) are inclusion maps.
+ The universal property of the initial object \\(\mathbf{0}\\) is for any object \\(X\\), there exists an unique map \\(h: \mathbf{0} \to X\\). Note that initial objects are always isomorphic to one another (proof is left to the reader), so even if there's more than one, we can treat them as one object.
For the coproduct \\(+\\) to be the tensor product \\(\otimes\\), we need to define what happens to objects *and* morphisms (or rather pairs of objects and morphisms). The coproduct already says what to do: let \\(A\otimes B = A+B\\). Since \\(C\\) has all binary coproducts, we can do this for all objects. But what do we do with morphisms?
Consider \\(f:x \to y\\) and \\(f' : x' \to y'\\). Tensoring these objects and morphisms should give \\(f \otimes f' : x \otimes x' \to y\otimes y'\\). So we must send the pair \\((f,f')\\) to an unique map \\(x +x' : y +y'\\). But remember the universal property of coproducts, since there's a map \\(i_{y}f : x \to y + y' \\) and \\(i_{y'} f' : x' \to y +y'\\), there must be a unique map \\([f,f']: x + x' \to y + y'\\) such that the following [diagram commutes](https://tikzcd.yichuanshen.de/#eyJub2RlcyI6W3sicG9zaXRpb24iOlswLDBdLCJ2YWx1ZSI6IngifSx7InBvc2l0aW9uIjpbMiwwXSwidmFsdWUiOiJ5In0seyJwb3NpdGlvbiI6WzAsMl0sInZhbHVlIjoieCcifSx7InBvc2l0aW9uIjpbMiwyXSwidmFsdWUiOiJ5JyJ9LHsicG9zaXRpb24iOlswLDFdLCJ2YWx1ZSI6IngreCcifSx7InBvc2l0aW9uIjpbMiwxXSwidmFsdWUiOiJ5K3knIn1dLCJlZGdlcyI6W3siZnJvbSI6MiwidG8iOjQsInZhbHVlIjoiaV97eCd9In0seyJmcm9tIjowLCJ0byI6NCwibGFiZWxQb3NpdGlvbiI6InJpZ2h0IiwidmFsdWUiOiJpX3gifSx7ImZyb20iOjEsInRvIjo1LCJsYWJlbFBvc2l0aW9uIjoibGVmdCIsInZhbHVlIjoiaV95In0seyJmcm9tIjozLCJ0byI6NSwibGFiZWxQb3NpdGlvbiI6InJpZ2h0IiwidmFsdWUiOiJpX3t5J30ifSx7ImZyb20iOjAsInRvIjoxLCJ2YWx1ZSI6ImYifSx7ImZyb20iOjIsInRvIjozLCJ2YWx1ZSI6ImYnIn0seyJmcm9tIjo0LCJ0byI6NSwibGluZSI6ImRhc2hlZCIsInZhbHVlIjoiXFxleGlzdHMhIFxcIFtmLGYnXSJ9XX0=):
\[\begin{matrix} x & \overset{f}{\rightarrow} & y\\\ i_x \downarrow \quad & & \quad \downarrow i_y\\\ x+ x' & \overset{[f,f']}{\rightarrow} & y + y' \\\ i_{x'}\uparrow \quad & & \quad \uparrow i_{y'} \\\ x' & \overset{f'}{\rightarrow} & y' \end{matrix} \]
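This universal property is easy to play with concretely. Here's a minimal sketch in Python, modelling objects as collections of Python values and the coproduct \\(A+B\\) as tagged pairs; all the names (`inl`, `inr`, `copair`, `tensor`) are invented for this illustration:

```python
# A minimal model of coproducts in Set: objects are Python values,
# morphisms are functions, and A+B is encoded as tagged pairs.
# All names here (inl, inr, copair, tensor) are invented for this sketch.

def inl(a): return ("inl", a)   # inclusion i_A : A -> A+B
def inr(b): return ("inr", b)   # inclusion i_B : B -> A+B

def copair(f, g):
    """[f,g] : A+B -> X, the unique map with [f,g].i_A = f and [f,g].i_B = g."""
    def h(t):
        tag, v = t
        return f(v) if tag == "inl" else g(v)
    return h

def tensor(f, fp):
    """f + f' : x+x' -> y+y', the copairing [i_y . f, i_{y'} . f']."""
    return copair(lambda a: inl(f(a)), lambda b: inr(fp(b)))

# The commuting square in action: (f + f') . i_x = i_y . f, and likewise primed.
f  = lambda n: n + 1
fp = lambda s: s.upper()
assert tensor(f, fp)(inl(3)) == inl(4)
assert tensor(f, fp)(inr("a")) == inr("A")
```

Uniqueness is visible in the code: a map out of \\(A+B\\) is completely determined by what it does on `inl` and `inr` values.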
Using this argument, we get the associator isomorphism. Repeating the binary coproduct \\(n\\) times gives an object isomorphic to the coproduct of all \\(n\\) objects at once, so it suffices to treat three objects \\(x,y,z\\) and exhibit
\[\alpha_{xyz}: (x+ y)+ z \to x + (y+z) \]
We prove that \\((x +y) + z \cong x + (y+z)\\) by showing \\(\hom((x +y) + z,a) \cong \hom(x + (y+z),a)\\) naturally in \\(a\\) (this suffices by the Yoneda lemma). By the universal property of the coproduct, for \\(f:x\to a\\), \\(g:y \to a\\), and \\(h: z \to a\\), there's a unique map \\([f,g]: x +y \to a\\) and therefore a unique map \\([[f,g],h] \in \hom((x +y) + z,a) \\); by the same property there's a corresponding unique map \\([f,[g,h]] \in \hom(x + (y + z),a) \\). These assignments are inverse bijections, so \\( (x+ y)+ z \cong x + (y+z)\\) and the associator exists!
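In the same toy Python encoding (tagged pairs, with our own `inl`/`inr` helpers), the associator and its inverse are short case analyses, and round-tripping confirms the isomorphism on sample elements:

```python
# Associator for the tagged-pair coproduct encoding (inl/inr are our own helpers).

def inl(a): return ("inl", a)
def inr(b): return ("inr", b)

def assoc(t):
    """alpha_{xyz} : (x+y)+z -> x+(y+z), assembled from copairings as in the proof."""
    tag, v = t
    if tag == "inl":                      # v lives in x+y
        tag2, w = v
        return inl(w) if tag2 == "inl" else inr(inl(w))
    return inr(inr(v))                    # v lives in z

def assoc_inv(t):
    """The inverse map x+(y+z) -> (x+y)+z."""
    tag, v = t
    if tag == "inl":
        return inl(inl(v))
    tag2, w = v
    return inl(inr(w)) if tag2 == "inl" else inr(w)

# Round-tripping sample elements confirms alpha is an isomorphism.
for t in [inl(inl(1)), inl(inr(2)), inr(3)]:
    assert assoc_inv(assoc(t)) == t
```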
Using this argument, we also get the braiding natural isomorphism. For any pair of objects \\(x,y\\), the inclusions \\(i_x : x \to y + x\\) and \\(i_y : y \to y + x\\) copair to a unique map \\([i_x, i_y] : x + y \to y + x \\), and the analogous map \\(y + x \to x + y\\) is its inverse, so
\[x + y \cong y + x .\]
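In the toy Python encoding (our own `inl`/`inr` tagging helpers), the braiding is just a tag swap, and it is its own inverse:

```python
# Braiding for the tagged-pair coproduct encoding (inl/inr are our own helpers):
# sigma_{xy} : x+y -> y+x swaps the tags, and is its own inverse.

def inl(a): return ("inl", a)
def inr(b): return ("inr", b)

def braid(t):
    tag, v = t
    return inr(v) if tag == "inl" else inl(v)

for t in [inl(1), inr("a")]:
    assert braid(braid(t)) == t   # sigma . sigma = identity
```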
Let's show that for any \\(x\\), \\(\mathbf{0}+x \cong x \cong x+\mathbf{0}\\), proving that the initial object serves as the unit object and that the left and right unitor natural isomorphisms exist.
By the universal property of the initial object, there's a unique morphism \\(!_x : \mathbf{0} \to x\\) for any \\(x\\). We always have the identity morphism \\(1_x : x \to x\\), and so by the universal property of the coproduct we have the unique morphisms \\([!_x,1_x] : \mathbf{0} + x \to x \\) and \\([1_x, !_x]:x + \mathbf{0} \to x\\). But for any object \\(a\\) we have \\(\hom(\mathbf{0} + x,a)\cong \hom(x,a) \cong \hom(x+\mathbf{0},a)\\), since given \\(f:x\to a\\) there's a unique choice of map in each hom-set: \\([!_a ,f]\\), \\(f\\), and \\([f,!_a]\\).
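Continuing the toy Python encoding: the initial object becomes the empty type (it has no values), so \\(!_x\\) is a function that can never actually be called, and the unitors just strip the tag. All names here are ours, for illustration only:

```python
# Unitors in the tagged-pair encoding (all names ours). The initial object 0
# is the empty type: it has no values, so !_x can never actually be called.

def inl(a): return ("inl", a)
def inr(b): return ("inr", b)

def copair(f, g):
    def h(t):
        tag, v = t
        return f(v) if tag == "inl" else g(v)
    return h

def bang(v):
    raise RuntimeError("0 has no elements, so this is never reached")

lunitor = copair(bang, lambda v: v)   # [!_x, 1_x] : 0+x -> x
runitor = copair(lambda v: v, bang)   # [1_x, !_x] : x+0 -> x

# Every element of 0+x is an inr (and of x+0 an inl), so these are bijections.
assert lunitor(inr(42)) == 42
assert runitor(inl("hi")) == "hi"
```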
All that's left is to show that the coherence conditions hold.

I should also show that we actually have naturality for these isomorphisms (not just that the morphisms exist, but that they form natural transformations). The proof is straightforward; here it is for the left unitor. Naturality means that given \(f : x \to y\), this diagram commutes:
\[\begin{matrix} \mathbf{0}+x & \overset{1_{\mathbf{0}}+f}{\rightarrow} & \mathbf{0}+y\\\ \lambda_x \downarrow \quad & & \quad \downarrow \lambda_y\\\ x & \overset{f}{\rightarrow} & y \end{matrix} \]
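Left-unitor naturality, \(\lambda_y \circ (1_{\mathbf{0}} + f) = f \circ \lambda_x\), can be spot-checked in the same toy Python tagged-pair encoding (all names ours):

```python
# Spot-check of left-unitor naturality in the tagged-pair encoding (names ours):
# lambda_y . (1_0 + f) should equal f . lambda_x on every element of 0+x.

def inl(a): return ("inl", a)
def inr(b): return ("inr", b)

def copair(f, g):
    def h(t):
        tag, v = t
        return f(v) if tag == "inl" else g(v)
    return h

def bang(v):
    raise RuntimeError("0 has no elements")

def tensor(f, fp):   # f + f'
    return copair(lambda a: inl(f(a)), lambda b: inr(fp(b)))

lunitor = copair(bang, lambda v: v)   # lambda : 0+x -> x, same formula for every x
ident   = lambda v: v                 # 1_0 (vacuously, since 0 is empty)

f = lambda n: n * 10
for t in [inr(1), inr(2), inr(7)]:    # every element of 0+x is an inr
    assert lunitor(tensor(ident, f)(t)) == f(lunitor(t))
```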

I seem to recall Matthew had trouble proving the composite of conjoints is the conjoint of the composite:

$$ \check{FG} \stackrel{?}{=} \check{F} \check{G} .$$
I don't even know if this is true! It would seem strangely asymmetrical for this to be false.

Matthew wrote:

It is asymmetrical! I don't think this is true in general.

Instead, we have

$$ \check{G F} = \check{F} \check{G} $$

Oh, duh! Right!
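Unwinding the definitions shows why the order reverses; here's a quick check, assuming the convention that the conjoint of \(F : \mathcal{X} \to \mathcal{Y}\) is \(\check{F}(y,x) = \mathcal{Y}(y, F(x))\), and writing composition of profunctors right to left. For \(F : \mathcal{X} \to \mathcal{Y}\) and \(G : \mathcal{Y} \to \mathcal{Z}\),

$$ (\check{F} \check{G})(z,x) = \bigvee_{y \in \mathcal{Y}} \check{G}(z,y) \otimes \check{F}(y,x) = \bigvee_{y} \mathcal{Z}(z, G(y)) \otimes \mathcal{Y}(y, F(x)) = \mathcal{Z}(z, G(F(x))) = \check{GF}(z,x), $$

where the join collapses at \(y = F(x)\): that choice gives \(\mathcal{Z}(z, GF(x))\) as a lower bound, and every other term lies below it by functoriality of \(G\) and composition in \(\mathcal{Z}\).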

This is not asymmetrical in any 'bad' sense; it's just the natural thing. Let \(\mathbf{Cat}_{\mathcal{V}}\) be the category with

* \(\mathcal{V}\)-enriched categories as objects
* \(\mathcal{V}\)-enriched functors as morphisms

and let \(\mathbf{Prof}_{\mathcal{V}}\) be the category with

* \(\mathcal{V}\)-enriched categories as objects
* \(\mathcal{V}\)-enriched profunctors as morphisms.

Then taking companions gives an inclusion of \(\mathbf{Cat}_{\mathcal{V}}\) in \(\mathbf{Prof}_{\mathcal{V}}\) while taking conjoints gives an inclusion of \(\mathbf{Cat}_{\mathcal{V}}^{\text{op}}\) in \(\mathbf{Prof}_{\mathcal{V}}\).

In other words, less technically speaking, enriched profunctors are a generalization of enriched functors where an enriched functor going either way between enriched categories \(\mathcal{X}\) and \(\mathcal{Y}\) gives a profunctor going from \(\mathcal{X}\) to \(\mathcal{Y}\). The backwards way has got to be contravariant.


Anindya wrote:

\[ (\check{F})^\text{op} = \widehat{F^\text{op}} \]

Wow, that's cool! As you clearly realize, this reduces the study of conjoints to the study of companions together with the study of 'op' for both functors and profunctors!

I guess we also have

\[ (\widehat{F})^{\text{op}} = \check{F^\text{op}} .\]

This should follow from \( \text{op} \circ \text{op} = 1 \).
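Both 'op' identities are quick to verify by unwinding conventions; assuming \(\hat{F}(x,y) = \mathcal{Y}(F(x),y)\), \(\check{F}(y,x) = \mathcal{Y}(y,F(x))\), and \(\mathcal{Y}^{\text{op}}(y,y') = \mathcal{Y}(y',y)\):

$$ (\check{F})^{\text{op}}(x,y) = \check{F}(y,x) = \mathcal{Y}(y,F(x)) = \mathcal{Y}^{\text{op}}(F(x),y) = \widehat{F^{\text{op}}}(x,y) $$

and likewise

$$ (\widehat{F})^{\text{op}}(y,x) = \widehat{F}(x,y) = \mathcal{Y}(F(x),y) = \mathcal{Y}^{\text{op}}(y,F(x)) = \check{F^{\text{op}}}(y,x). $$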

**Theorem:** A category \(C\) with all binary coproducts (there exists a coproduct for every pair of objects) and an initial object \(\mathbf{0}\) is a monoidal category where the coproduct is the tensor product and the initial object is the unit. If for any pair of objects \(x,y\), there exists a pair of morphisms \(f : x \to y\) and \(f' : y \to x\), then \(C\) is also a symmetric monoidal category.

Nice!

One little nuance: we're assuming we have *a* coproduct for every pair of objects, and *an* initial object. But we need to pick a specific choice of these to make \(C\) into a symmetric monoidal category, because the definition of symmetric monoidal category talks about *the* tensor product of objects, and *the* unit for the tensor product.

Luckily, one can prove that different choices will make \(C\) into symmetric monoidal categories that are 'equivalent', using the precise definition of equivalence for symmetric monoidal categories (which is [Definition 13 here](http://math.ucr.edu/home/baez/qg-winter2001/definitions.pdf).)

So, in the end it doesn't matter much. It does, however, illustrate the subtle difference between specifying things via a universal property (and thus 'up to canonical isomorphism') and specifying them 'on the nose' (that is, up to equality).


> One little nuance: we're assuming we have *a* coproduct for every pair of objects, and *an* initial object. But we need to pick a specific choice of these to make \(C\) into a symmetric monoidal category, because the definition of symmetric monoidal category talks about *the* tensor product of objects, and *the* unit for the tensor product.
>
> Luckily, one can prove that different choices will make \(C\) into symmetric monoidal categories that are 'equivalent', using the precise definition of equivalence for symmetric monoidal categories (which is [Definition 13 here](http://math.ucr.edu/home/baez/qg-winter2001/definitions.pdf).)

That's a good point. I was treating the coproducts and initial objects as unique since, even though they need not be unique, they must be isomorphic to one another. But for that reason, the different symmetric monoidal categories must be equivalent.
