From many sources I hear that the symmetric polynomials form not only an algebra, but also a Hopf algebra.
Would someone be so kind as to explain where the coproduct (and the antipode) comes from?
And what is it useful for?

Let me pay some attention to details of the question.
Is it correct that there is some "natural" Hopf structure, or do different authors write about different structures, each interesting for its own specific task?

Is it important to pass to the limit $n\to\infty$ (which I have heard about many times, but do not really like/understand), or can we stay in the more familiar setup of $\mathbb C[x_1,\dots,x_n]^{S_n}$?

What puzzles me is that $\mathbb C^n$ can be seen as an additive group, so $\mathbb C[x_1,\dots,x_n]$ is a Hopf algebra, but this structure does not seem to survive on $\mathbb C[x_1,\dots,x_n]^{S_n}$.
E.g. $\Delta(xy)=xy\otimes 1+x\otimes y +y\otimes x + 1\otimes xy$, which is not in
$\mathbb C[x,y]^{S_2} \otimes \mathbb C[x,y]^{S_2}$.

7 Answers

Yes, there exists a natural Hopf algebra structure on the ring of symmetric functions (i. e., symmetric "polynomials" in infinitely many indeterminates). It is not related to the additive group of $k^n$ (for a good reason: as you said, the obvious coalgebra structure on $k\left[x_1,x_2,...,x_n\right]$ coming from addition on the affine space does not survive restriction to the symmetric polynomials), and it does require infinitely many indeterminates.

1. The algebra $\mathbf{Symm}$

Let $k$ be a commutative ring. (It will serve as a base ring. You can safely assume $k$ to be $\mathbb C$ if you want, but if you know some representation theory, you can actually get surplus value from working with $k=\mathbb Z$ even if all you care about are representations of finite groups over $\mathbb C$.)

So what is $\mathbf{Symm}$ (or $Sym$, as it is also known)? First of all, it is a $k$-Hopf algebra which, as a $k$-algebra, is the algebra of "symmetric functions" in countably many commuting indeterminates $X_1$, $X_2$, $X_3$, .... Here, "function" does not mean an actual function, but instead it means a power series $p$ in the indeterminates $X_1$, $X_2$, $X_3$, ... such that for some $d\in \mathbb N$, every monomial of total degree $\geq d$ occurs in $p$ with coefficient $0$. (Another word for this meaning of "function" is "degree-bounded power series".) "Symmetric" means that any two monomials which only differ in the order of the exponents (but the multisets of their exponents are the same) must have the same coefficient. (I guess I should say that for me, a monomial doesn't include a coefficient.) This definition of $\mathbf{Symm}$ is easily seen to be equivalent to the definition as a direct limit $\lim\limits_{n\to\infty} k\left[X_1,X_2,...,X_n\right]^{S_n}$, where the mapping $k\left[X_1,X_2,...,X_n\right]^{S_n} \to k\left[X_1,X_2,...,X_m\right]^{S_m}$ (for $n\leq m$) takes a symmetric polynomial in $n$ variables and clones its coefficients to get an $m$-variable symmetric polynomial (I hope this is understandable; anyway, this is not important for us).

2. Applying symmetric functions to multisets

Since we call the elements of $\mathbf{Symm}$ "functions", let us explain what they can be evaluated at:

For any $p\in\mathbf{Symm}$, any $u\in\mathbb N$ and any $u$-tuple $\left(x_1,x_2,...,x_u\right)$ of elements of a commutative $k$-algebra, we can define the "value" of $p$ at $\left(x_1,x_2,...,x_u\right)$ (denoted by $p\left(x_1,x_2,...,x_u\right)$) to be the result of

1) removing all monomials which contain at least one of the variables $X_{u+1}$, $X_{u+2}$, $X_{u+3}$, ... from the power series $p$;

2) then applying the resulting power series $q$ to $X_1=x_1$, $X_2=x_2$, ..., $X_u=x_u$ (this makes sense because the power series $q$ is a polynomial, due to our definition of "function").

Due to the symmetry of $p$, this result actually doesn't depend on the order of $\left(x_1,x_2,...,x_u\right)$; thus, we can think of $p\left(x_1,x_2,...,x_u\right)$ as the value of $p$ at the multiset (rather than $u$-tuple) $\left(x_1,x_2,...,x_u\right)$.
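This evaluation rule is easy to make concrete. Below is a small Python sketch (the function name `e` and the setup are mine, not from the answer) that evaluates the elementary symmetric function $e_j$ at a finite multiset and illustrates that the result does not depend on the order of the tuple:

```python
import math
from itertools import combinations

def e(j, vals):
    """Evaluate the elementary symmetric function e_j at the finite
    multiset vals.  After step 1) kills every monomial involving
    X_{u+1}, X_{u+2}, ..., only size-j sub-multisets of vals contribute."""
    return sum(math.prod(c) for c in combinations(vals, j))

# Symmetry: the value does not depend on the order of the tuple.
assert e(2, (1, 2, 3)) == e(2, (3, 1, 2)) == 1*2 + 1*3 + 2*3   # = 11
# If j exceeds the number of values, every monomial dies in step 1):
assert e(4, (1, 2, 3)) == 0
# The empty-product convention gives e_0 = 1 at any multiset:
assert e(0, ()) == 1
```

Step 1) above is what makes this well-defined: once the variables $X_{u+1}, X_{u+2}, \ldots$ are set to zero, only finitely many monomials survive, which is why iterating over `combinations` of the finite tuple suffices.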

3. The bialgebra $\mathbf{Symm}$

So we have a $k$-algebra $\mathbf{Symm}$. How do we make it a Hopf algebra? First, let me show the intuition behind the comultiplication. It should correspond to the union of multisets. So, if we have some $p\in\mathbf{Symm}$, and write $\Delta\left(p\right)=\sum\limits_{i \in I} q_i \otimes r_i$, then we should have

(1) $p\left(x_1,x_2,...,x_u,y_1,y_2,...,y_v\right) = \sum\limits_{i \in I} q_i\left(x_1,x_2,...,x_u\right) r_i\left(y_1,y_2,...,y_v\right)$ for any two multisets $\left(x_1,x_2,...,x_u\right)$ and $\left(y_1,y_2,...,y_v\right)$ of elements of any commutative $k$-algebra.

The counit should correspond to evaluation at the empty multiset:

(2) $p\left(\ \right) = \varepsilon\left(p\right)$.

(Sorry, dear Java friends, $p\left(\ \right)$ doesn't mean $p$ here.)

How do we actually get such $\Delta$ and $\varepsilon$? The easy answer is: by the fundamental theorem on symmetric polynomials, $\mathbf{Symm}$ is generated as a $k$-algebra by the elementary symmetric polynomials $e_1$, $e_2$, $e_3$, ..., and these generators are algebraically independent.

(This does not immediately follow from the fundamental theorem on symmetric polynomials, since the fundamental theorem is usually not formulated for infinitely many variables, but you can either apply the same argument (lexicographic induction) in the infinite-variables case, or use the direct-limit construction of $\mathbf{Symm}$ to conclude the infinite-variables case from the finite-variables one.)

Hence, in order to define a $k$-algebra homomorphism from $\mathbf{Symm}$ to another commutative $k$-algebra (be it $\mathbf{Symm}\otimes \mathbf{Symm}$ or $k$ or anything else), it is enough to specify its values at the $e_j$ for $j=1,2,3,...$, and we have total freedom in doing so. Since we want $\mathbf{Symm}$ to be a $k$-bialgebra, we must define $\Delta$ and $\varepsilon$ as $k$-algebra homomorphisms; so let us define $\Delta$ by requiring that

$\Delta\left(e_j\right) = \sum\limits_{m=0}^j e_m \otimes e_{j-m}$ for every $j \geq 1$ (with the convention $e_0 = 1$), and define $\varepsilon$ by requiring that $\varepsilon\left(e_j\right) = 0$ for every $j \geq 1$.

(Actually, $\varepsilon$ just maps every $p\in\mathbf{Symm}$ to the constant term of $p$. But $\Delta$ isn't that easily described.)

To check that this matches our intuition above at least on the $e_j$ (i. e., that it satisfies (1) and (2) for $p=e_j$), we must show that every $j\geq 1$ satisfies

$e_j\left(x_1,x_2,...,x_u,y_1,y_2,...,y_v\right) = \sum\limits_{m=0}^j e_m\left(x_1,x_2,...,x_u\right) e_{j-m}\left(y_1,y_2,...,y_v\right)$ for any two multisets $\left(x_1,x_2,...,x_u\right)$ and $\left(y_1,y_2,...,y_v\right)$ of elements of any commutative $k$-algebra,

and $e_j\left(\ \right) = 0$. These are very easy. It is more complicated to check (1) and (2) for general $p$.
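These two identities are easy to check numerically. Here is a minimal Python sketch (the helper name `e` is mine) verifying the convolution identity for $e_j$ on the union of two concrete multisets:

```python
import math
from itertools import combinations

def e(j, vals):
    """Elementary symmetric function e_j at a finite multiset (e_0 = 1)."""
    return sum(math.prod(c) for c in combinations(vals, j))

xs, ys = (2, 3, 5), (7, 11)
for j in range(6):
    lhs = e(j, xs + ys)  # e_j evaluated at the union of the two multisets
    rhs = sum(e(m, xs) * e(j - m, ys) for m in range(j + 1))
    assert lhs == rhs
assert e(3, ()) == 0  # e_j evaluated at the empty multiset is 0 for j >= 1
```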

4. The graded Hopf algebra $\mathbf{Symm}$

From here on, everything goes very fast: we have a $k$-bialgebra $\mathbf{Symm}$. It is graded (the $n$-th degree consists of the homogeneous power series of degree $n$ in $\mathbf{Symm}$) and connected (the $0$-th degree is the ground ring $k$, embedded in $\mathbf{Symm}$ as the constant power series), so it is a Hopf algebra. This is because every connected graded bialgebra is a Hopf algebra, the antipode being constructed by induction on the degree; for a more detailed proof, see, e. g., Corollary II.3.2 in Dominique Manchon's http://arxiv.org/abs/math/0408405 , which even proves this for connected filtered bialgebras.

For an overview of the properties of $\mathbf{Symm}$, see, e. g., Section 10 of Michiel Hazewinkel's http://arxiv.org/abs/0804.3888 (errata: http://mit.edu/~darij/www/algebra/typos1short.pdf ). The antipode, for example, switches the elementary symmetric functions with the complete homogeneous symmetric functions (up to sign), leaving the power sum functions invariant (again, up to sign). See also his Section 18 about the relation of $\mathbf{Symm}$ to the representation theory of the symmetric groups.
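The claim about the antipode can be sanity-checked with finitely many concrete values. The Python sketch below (helper names are mine) verifies the identity $\sum_{i=0}^n (-1)^i e_i h_{n-i} = 0$ for $n\geq 1$; this recursion is exactly the antipode axiom $\sum_i S(e_i)\,e_{n-i} = \varepsilon(e_n) = 0$ once one knows $S(e_n) = (-1)^n h_n$:

```python
import math
from itertools import combinations, combinations_with_replacement

xs = (2, 3, 5, 7)  # concrete values standing in for the first few variables

def e(j, vals):
    """Elementary symmetric function e_j (e_0 = 1)."""
    return sum(math.prod(c) for c in combinations(vals, j))

def h(j, vals):
    """Complete homogeneous symmetric function h_j (h_0 = 1)."""
    return sum(math.prod(c) for c in combinations_with_replacement(vals, j))

# sum_{i=0}^{n} (-1)^i e_i h_{n-i} = 0 for n >= 1 (it says E(-t) H(t) = 1
# for the generating functions); this is what pins the antipode down to
# S(e_n) = (-1)^n h_n.
for n in range(1, 5):
    assert sum((-1)**i * e(i, xs) * h(n - i, xs) for i in range(n + 1)) == 0
```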

Thank you very much! Great answer! But are you sure that it is impossible to avoid this limit $n\to\infty$? (I do not know it and so do not like it :) ) It seems you introduce the coproduct somewhat ad hoc. What is the motivation and use of it?
–
Alexander ChervovJan 18 '12 at 13:36

As I said, I do avoid this limit, so it's not impossible - it is just an alternative construction of $\mathbf{Symm}$. As for motivation, the underlying motivation in arxiv.org/abs/0804.3888 was to explain Witt vectors in a more algebraic (and less number-theoretic) context. But if you read Section 18 of arxiv.org/abs/0804.3888 you will learn of a second application probably more interesting to you: Equalities in $\mathbf{Symm}$ "encode" isomorphisms of representations of $S_n$. (See also Dan's reply for this.)
–
darij grinbergJan 18 '12 at 13:41

What does "correspond to the union of multisets" mean?
–
Alexander ChervovJan 18 '12 at 13:48

It's just some blabber that was supposed to describe the equality (1). By the union of two multisets $\left(p_1,p_2,...,p_u\right)$ and $\left(q_1,q_2,...,q_v\right)$, I mean the multiset $\left(p_1,p_2,...,p_u,q_1,q_2,...,q_v\right)$. The intuition behind (1) is that to evaluate $p$ at the union of two multisets, we write the tensor $\Delta\left(p\right)$ in the form $\sum\limits_{i\in I} q_i\otimes r_i$, and sum (over all $i$) the product $q_i\left(\text{first multiset}\right)r_i\left(\text{second multiset}\right)$.
–
darij grinbergJan 18 '12 at 13:54

Here are two different definitions of the Hopf algebra structure. One needs to work in infinitely many variables as you indicate.

From the point of view of the representation theory of the symmetric group, the product in $\Lambda$ can be defined as
$$ V \cdot W = \mathrm{Ind}_{S_n \times S_k}^{S_{n+k}}V \otimes W$$
for $V$ a representation of $S_n$ and $W$ a representation of $S_k$; this product is then extended bilinearly. The coproduct then has a natural dual definition:
$$ \Delta(V) = \sum_{i+j = n} \mathrm{Res}^{S_n}_{S_i \times S_j} V, $$
where a representation of $S_i \times S_j$ defines an element in $\Lambda \otimes \Lambda$ in the natural way.

The connection between symmetric functions and representations of $S_n$ is as follows. The graded piece $\Lambda^n$ is isomorphic to the ring of virtual representations of $S_n$ via the so called characteristic map. A virtual representation $V$ is mapped to the symmetric function
$$ \mathrm{ch}(V) = \frac 1 {n!} \sum_{\sigma \in S_n} \mathrm{Tr}\left(\sigma \mid V\right) \psi(\sigma) $$
where
$$ \psi(\sigma) = \prod_{(i_1\cdots i_k) \text{ a cycle in } \sigma} p_k. $$
This is in fact an isometry relative to the usual inner product on symmetric functions, and the natural inner product on representations for which irreducible representations form an orthonormal basis. The representation associated to the Young diagram $\lambda$ corresponds to the Schur function $s_\lambda$, so equivalently $$\langle s_\lambda, s_\mu \rangle = \delta_{\lambda \mu}.$$
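As a concrete sanity check of the characteristic map, one can verify numerically that the trivial representation of $S_n$ (all traces equal to $1$) is sent to the complete homogeneous function $h_n$, a standard instance of the map. The Python below (helper names are mine) evaluates everything at concrete values in place of the first few variables:

```python
import math
from itertools import permutations, combinations_with_replacement

xs = (2, 3, 5)  # concrete values standing in for the first few variables

def p(k, vals):
    """Power sum p_k evaluated at a finite multiset."""
    return sum(x**k for x in vals)

def h(j, vals):
    """Complete homogeneous symmetric function h_j."""
    return sum(math.prod(c) for c in combinations_with_replacement(vals, j))

def cycle_type(perm):
    """Cycle lengths of a permutation of {0,...,n-1} given as a tuple
    (perm[i] is the image of i)."""
    seen, lengths = set(), []
    for start in range(len(perm)):
        if start not in seen:
            length, i = 0, start
            while i not in seen:
                seen.add(i)
                i = perm[i]
                length += 1
            lengths.append(length)
    return lengths

# For the trivial representation every trace is 1, so ch(trivial) is just
# the average of psi(sigma) over S_n; it should equal h_n.
n = 3
total = sum(math.prod(p(k, xs) for k in cycle_type(sig))
            for sig in permutations(range(n)))
assert total == math.factorial(n) * h(n, xs)  # 2460 == 6 * 410
```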

A more direct definition of the coproduct is in terms of power sums. Define a coproduct via
$$ \Delta(p_{i_1}\cdots p_{i_n}) = \sum_{k=0}^n p_{i_1}\cdots p_{i_k} \otimes p_{i_{k+1}}\cdots p_{i_{n}}. $$
In particular the power sums $p_n$ are primitive elements for this coproduct and they span the module of primitive elements. The elementary and homogeneous symmetric functions are divided powers for this Hopf algebra structure.
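Under the evaluation interpretation (1) in Darij's answer, primitivity of $p_k$ says precisely that power sums are additive on unions of multisets; a quick Python check (names are mine):

```python
def p(k, vals):
    """Power sum p_k evaluated at a finite multiset."""
    return sum(x**k for x in vals)

xs, ys = (2, 3, 5), (7, 11)
# Delta(p_k) = p_k (x) 1 + 1 (x) p_k translates, under the evaluation
# interpretation, into additivity on unions of multisets:
for k in range(1, 5):
    assert p(k, xs + ys) == p(k, xs) + p(k, ys)
```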

The antipode is uniquely determined by the coproduct. You prove this by induction over the degree: when you expand $\Delta(x)$, you find terms of lower degree (where the antipode is already known) plus the two terms $x \otimes 1 + 1 \otimes x$, from which you can deduce how the antipode acts on $x$. This holds in any graded connected Hopf algebra.

Thank you very much! But would you be so kind as to explain how representations of $S_n$, $S_k$ are related to symmetric functions? The second definition is pretty explicit (so I like it), but unmotivated :( And I have added a question: "where is it useful?"
–
Alexander ChervovJan 18 '12 at 13:38


The reason I prefer not to define the operations of $\mathbf{Symm}$ in terms of power sums is that power sums don't generate $\mathbf{Symm}$ if $k=\mathbb Z$ (but only if $k$ is a $\mathbb Q$-algebra). This is probably not particularly relevant for Alexander, though.
–
darij grinbergJan 18 '12 at 13:39

Thanks for the correction, Darij. I added the definition of how you go from representations of the symmetric group to symmetric functions.
–
Dan PetersenJan 18 '12 at 13:51

What does "graded piece $\Lambda^n$" mean? Is the $p_k$ in the definition of $\psi$ also a power sum? If yes, how can one arrive at such a definition of the characteristic map, i.e. why is it natural? What is it used for?
–
Alexander ChervovJan 18 '12 at 14:09

I'm surprised nobody has mentioned the connection to the Littlewood-Richardson coefficients so far in response to "What is it useful for?". The coproduct
$$
\Delta(h_k) = \sum_{i+j = k} h_i \otimes h_j
$$
gives rise to the following formula
$$
\Delta( s_\lambda ) = \sum_{\mu,\nu} c_{\mu,\nu}^{\lambda} s_\mu \otimes s_\nu
$$
Here $c_{\mu,\nu}^{\lambda}$ is the Littlewood-Richardson coefficient and $s_\lambda$ is the Schur function of shape $\lambda$. Hopf algebra techniques have been used to derive "skew" Pieri rules recently in work of Lam, Lauve and Sottile.
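The displayed coproduct on $h_k$ can be checked concretely via the multiset interpretation from Darij's answer: evaluating $h_k$ on a union of two multisets convolves the separate values. A small Python sketch (the helper name `h` is mine):

```python
import math
from itertools import combinations_with_replacement

def h(j, vals):
    """Complete homogeneous symmetric function h_j (h_0 = 1)."""
    return sum(math.prod(c) for c in combinations_with_replacement(vals, j))

xs, ys = (2, 3), (5, 7, 11)
# Delta(h_k) = sum_{i+j=k} h_i (x) h_j, read through evaluation at unions:
for k in range(5):
    lhs = h(k, xs + ys)  # h_k at the union of the two multisets
    rhs = sum(h(i, xs) * h(k - i, ys) for i in range(k + 1))
    assert lhs == rhs
```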

True, but you don't really need comultiplication to define the Littlewood-Richardson coefficients. By the duality I mention in my answer, the coefficients also satisfy $s_\mu\cdot s_\nu=\sum_\lambda c_{\mu,\nu}^{\lambda}s_\lambda$, and that is how they are usually introduced.
–
Marc van LeeuwenJan 18 '12 at 22:57

The second formula is usually written for multiplication: when we multiply Schur functions, the product can be decomposed as a linear combination of Schur functions with Littlewood-Richardson coefficients (Schur functions = characters of $GL(n)$). So if I understand correctly, your formula says that the Schur functions are in a sense "self-dual", i.e. their multiplication and comultiplication are given by the same coefficients. What does it mean? There should be some kind of reason for this...
–
Alexander ChervovJan 19 '12 at 4:58

Alexander: if you use the definition of the product and coproduct in terms of induction and restriction that I gave, then this self-duality is Frobenius reciprocity.
–
Dan PetersenJan 19 '12 at 6:33

What does $h_k$ mean: is it the "elementary" or the "complete" symmetric function? And is it easy to derive the second formula from the first one? By "derive" I mean: take the standard definition of the Littlewood-Richardson coefficients as the definition. Dan Petersen suggests (thanks!) that it follows from Frobenius reciprocity, but is there a direct way?
–
Alexander ChervovJan 19 '12 at 6:45

$h_k$ is always complete. It is not really easy to derive the second fact from the first, I think; the Hopf algebra starts simplifying things only when you move on to skew Schur functions. But maybe Zelevinsky's approach gives a simpler argument.
–
darij grinbergJan 19 '12 at 15:24

In the great answers given so far, I didn't find a simple direct description of the natural comultiplication operation on symmetric functions. So I'll just copy and paste my comment from this question which tries to explain it in as few characters as are allowed in a comment. It shows why infinitely many variables are needed: we need the Hilbert hotel (twice the number of variables must equal the number of variables). OK, since I see copy-and-paste from comments doesn't work without manual repair anyway, I'll add a few words of clarification as well.

Symmetric functions are (degree-bounded power series) in infinitely many variables, and order doesn't matter (due to the "symmetric" part). Now rename the variables $x_0,y_0,x_1,y_1,x_2,\ldots$ and decompose a symmetric function $s=\sum_i u_iv_i$ as a sum of products of a symmetric function $u_i$ in the $x$'s and one $v_i$ in the $y$'s; then $\Delta(s)=\sum_i u_i\otimes v_i$. For instance for elementary symmetric function one has $\Delta(e_k)=\sum_{i+j=k}e_i\otimes e_j$ since the monomials can be arbitrarily spread across the $x$'s and $y$'s (and similarly for complete homogeneous symmetric functions), while for power sums $\Delta(p_k)=p_k\otimes 1+1\otimes p_k$, since the monomials involve a single $x_i$ or a single $y_i$, but the two cannot mix in a power sum.
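This variable-splitting recipe can be illustrated very concretely for $e_2$: renaming four variables as $x_1,x_2,y_1,y_2$ and collecting terms, one reads off $\Delta(e_2)=e_2\otimes 1+e_1\otimes e_1+1\otimes e_2$. A minimal Python check (names are mine):

```python
def e1(v):  # e_1 in two variables
    return v[0] + v[1]

def e2(v):  # e_2 in two variables
    return v[0] * v[1]

# Rename four variables as x1, x2, y1, y2 and check
#   e_2(x1, x2, y1, y2) = e_2(x)*1 + e_1(x)*e_1(y) + 1*e_2(y),
# i.e. Delta(e_2) = e_2 (x) 1 + e_1 (x) e_1 + 1 (x) e_2.
for xs, ys in [((2, 3), (5, 7)), ((1, 4), (6, 9))]:
    x1, x2, y1, y2 = *xs, *ys
    total = x1*x2 + x1*y1 + x1*y2 + x2*y1 + x2*y2 + y1*y2
    assert total == e2(xs) + e1(xs) * e1(ys) + e2(ys)
```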

I may add that multiplication and comultiplication are dual operations in this case; Zelevinsky calls it a (positive) self-adjoint Hopf algebra.

That seems to be a great point! But it seems very general, not making much use of the particular structure of "symmetric functions" (or am I missing something?). Are there some other natural examples where one can introduce a coproduct using the same idea?
–
Alexander ChervovJan 19 '12 at 4:54

Well there is the part "order doesn't matter" that depends on having symmetric power series (as I have now emphasised in the answer). In the Hilbert hotel one needs to be able to freely permute guests among rooms in order to operate efficiently. Due to this I do not see how to apply this simple idea in fundamentally different situations.
–
Marc van LeeuwenJan 19 '12 at 6:47

Do you mean that the setup of the "Hilbert hotel" (which I do not clearly understand) is equivalent to the symmetric function setup? Is it easy to see this (if it is true), or is it non-trivial?
–
Alexander ChervovJan 19 '12 at 6:52

In fact there are several Hopf algebra structures on this algebra, mainly because it is one of the many occurrences of the free commutative graded algebra with exactly one generator in each degree, $A=\mathbb C[h_1,\dots,h_n,\dots]$, where $h_i$ has degree $i$. Here the $h_i$'s can be either the elementary or the complete symmetric polynomials. Of course it is the grading which makes things interesting; otherwise it would just be the usual polynomial algebra on infinitely many variables.

Indeed, let $G$ be the group of formal power series with constant term equal to 1. Then the algebra $O(G)$ of polynomial functions on $G$ is generated by the linear maps $\lambda_k$ defined by
$$\langle \lambda_k,1+\sum a_nX^n \rangle=a_k/k!$$

Letting $\lambda_k$ have degree $k$, the map $h_k \mapsto \lambda_k$ is an isomorphism of graded algebras. But being an algebra of functions on a group, $O(G)$ has a natural Hopf algebra structure given by

$$\Delta(f)(a \otimes b)=f(ab)$$
and
$$S(f)(a)=f(a^{-1})$$

If you think of the $h_i$'s as being the elementary symmetric polynomials, then this coproduct is the same as the coproduct of Dan's answer (if I'm not mistaken). This is not just an abstract isomorphism, however; if I remember well, it reflects the fact that the coefficients of a polynomial are the elementary symmetric functions of its roots.

But if you take instead the group of formal power series of the form
$$X+\sum_{n\geq 1} a_nX^{n+1},$$
whose multiplication is given by composition of formal power series, then you get the same graded algebra again, but the above formula leads to a non-cocommutative coproduct (leading to the so-called Faa di Bruno Hopf algebra).
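The link between the first coproduct and multiplication in $G$ can be tested numerically: the coefficient of $t^k$ in $\prod_i(1+x_it)$ is $e_k(x)$, and multiplying two such series convolves coefficients, which is exactly $\Delta(e_k)=\sum_{i+j=k}e_i\otimes e_j$. A Python sketch (helper names are mine):

```python
import math
from itertools import combinations

def coeffs_from_roots(vals):
    """Coefficients [c_0, ..., c_u] of prod over x in vals of (1 + x*t);
    here c_k = e_k(vals)."""
    return [sum(math.prod(c) for c in combinations(vals, k))
            for k in range(len(vals) + 1)]

def poly_mul(a, b):
    """Multiply two coefficient lists (= multiply in the group G)."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

xs, ys = (2, 3), (5, 7, 11)
# Multiplying prod(1 + x_i t) by prod(1 + y_j t) convolves coefficients;
# since the t^k coefficient is e_k, this is Delta(e_k) = sum e_i (x) e_{k-i}.
assert poly_mul(coeffs_from_roots(xs), coeffs_from_roots(ys)) \
       == coeffs_from_roots(xs + ys)
```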

Edit: Let me add a few words about the motivations. Once you already know the fundamental theorem of symmetric functions, the above isomorphism may seem tautological and not very interesting. In fact, the existence of (actually several) very explicit isomorphism(s) from the algebra of symmetric functions to $A$ is nothing but a reformulation of this theorem. On the other hand, the above definition is arguably one of the most natural definitions of $A$, and you get the Hopf structure for free.

Somehow, the fundamental theorem tells you that the algebra structure of symmetric functions is not that interesting. But it turns out that many interesting combinatorial identities can be deduced from the Hopf structure, and especially from the fact that it is self-dual. Hence the pull-back of the coproduct and antipode to the algebra of symmetric functions itself has many interesting combinatorial applications. The same is true for the other Hopf algebra structure, since it relates combinatorial identities between symmetric functions to the computation of the compositional inverse of a formal power series.

@Adrien! Thank you very much; both things you point out are very interesting: that the coproduct is somehow related to multiplication of power series, and the second one about Faa di Bruno. Concerning the first one: is there any deeper relation than just the graded isomorphism between symmetric functions and coefficients of power series? It is unmotivated: just two things have the same size, so we consider SOME (ad hoc) isomorphism and transport the coproduct from one thing to the other. And it is also not clear what the USE of this point of view is (except the trivial one, that it guarantees coassociativity etc.).
–
Alexander ChervovJan 19 '12 at 8:13

@Alexander: I added a few words about that.
–
AdrienJan 20 '12 at 13:41

@Adrien Thank you very much! But I still do not see how, looking at symmetric polynomials, I should see the multiplicative group of power series?
–
Alexander ChervovJan 20 '12 at 14:43

Bruce, thank you for your answer. But I am afraid you underestimated the degree of my ignorance :) I do not know what the motivation to introduce a coproduct is, where it is useful, etc. So the antipode is of second priority...
–
Alexander ChervovJan 18 '12 at 13:26

There is also a motivation coming from the (co)homology of Grassmannians.

(Note I'm making this answer community wiki - I would like someone to expand on this, since I don't have the time right now to look up sources to do so and don't want to risk getting something wrong by going off the top of my head.)

Thank you for your answer. I have also seen many papers talking about this. But COhomology is an algebra: there is NO coproduct (unless we have a (semi)group structure). What people are saying here is that the coproduct also appears in the limit $n\to\infty$. Grassmannians in a certain limit give $BU$, which is at least an H-space as far as I remember, so there will be a coproduct... But if this is the way to see it, I would not say it is easy...
–
Alexander ChervovJan 19 '12 at 16:44