The title of my question mentions a field $\mathbb F$, but to make sure I'm not losing anything, I would like to introduce my question in full generality. Still, I will be happy with an answer for fields, which is why I chose this title. Please feel free to take $R$ to be a field and all modules to be vector spaces.

Notation

I'm assuming $\mathbb N=\{0,1,\ldots\}$ in this question.

Let $R$ be any unital ring. $R[x]$ denotes the ring of polynomials over $R$. $R[[x]]$ denotes the ring of formal power series over $R$. (The linked article requires that $R$ be commutative. I don't.) $M_n(R)$ will denote the ring of all square matrices over $R$ indexed by the set $\{0,1,\ldots,n-1\}\times\{0,1,\ldots,n-1\}.$

$M_{\infty}(R)$ denotes the set of all $\mathbb N\times\mathbb N$-matrices over $R$ whose columns have finitely many non-zero coefficients. This restriction allows the usual multiplication of such matrices, and so they form a ring.
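To illustrate why column-finiteness is exactly what makes the product well-defined, here is a small Python sketch of my own (the helper name `col_mul` is mine); a matrix is stored as a dictionary of finitely-supported columns:

```python
# Sketch: column-finiteness makes the product of two N x N matrices
# well-defined.  A matrix is stored as {column: {row: entry}}, each column
# having finite support, so every entry of the product
# (AB)[i][j] = sum_k A[i][k] * B[k][j] is a finite sum.

def col_mul(A, B):
    """Multiply column-finite matrices given as {col: {row: entry}} dicts."""
    C = {}
    for j, col_b in B.items():
        new_col = {}
        for k, b in col_b.items():              # finitely many k per column of B
            for i, a in A.get(k, {}).items():   # column k of A is finitely supported
                new_col[i] = new_col.get(i, 0) + a * b
        C[j] = {i: v for i, v in new_col.items() if v != 0}
    return C

# the shift matrix S (S e_j = e_{j-1}): column j has a single 1 in row j-1
S = {j: {j - 1: 1} for j in range(1, 10)}
S2 = col_mul(S, S)
assert S2[2] == {0: 1}   # S^2 e_2 = e_0
```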

I will also be using the ring of endomorphisms $E(R)$ of the free $R$-module $V(R)=\bigoplus_{i=0}^{\infty}R,$ with a fixed basis $e_0,e_1,\ldots$

Background

The rings $M_\infty (R)$ and $E(R)$ are isomorphic. Just as for finite matrices over fields, the $i$-th column of a matrix $A$ represents the element $y$ of $V(R)$ to which $e_i$ is mapped by the corresponding endomorphism $\alpha.$ To be precise, the column contains the coefficients of $y$ in the basis $e_0,e_1,\ldots$ This explains the requirement that there be only finitely many non-zero entries in each column: every $y\in V(R)$ can be written uniquely as a linear combination of the basis, and linear combinations are finite sums.

Define $\phi:R[[x]]\longrightarrow M_\infty(R)$ by $$\phi\left(\sum_{i=0}^\infty a_ix^i\right)=(a_{j-i})_{i,j\in\mathbb N},$$ where $a_k:=0$ for $k<0$; the image of $f$ is thus an upper-triangular Toeplitz matrix, so its columns are indeed finitely supported. Then $\phi$ is a ring isomorphism onto its image $M_\triangledown(R).$ I found it very interesting that formal power series (and therefore also polynomials) can be embedded in matrices. It turns out to be useful: I'm reading a 1974 paper by Jan Krempa which uses it, titled On the Jacobson radical of polynomial rings.
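As a sanity check that such an embedding is multiplicative, here is a quick Python sketch; it assumes the Toeplitz convention $\phi(f)_{ij}=a_{j-i}$ (forced by the column-finiteness of $M_\infty(R)$) and works with $N\times N$ truncations, i.e. modulo $x^N$, so it is an illustration rather than a proof:

```python
# Sketch of the embedding phi (assuming phi(f)[i][j] = a_{j-i}, an
# upper-triangular Toeplitz matrix); truncating to N x N amounts to
# working modulo x^N, which is enough to see multiplicativity.

def phi(coeffs, N):
    """N x N Toeplitz matrix of a power series from its first N coefficients."""
    a = list(coeffs) + [0] * N
    return [[a[j - i] if j >= i else 0 for j in range(N)] for i in range(N)]

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def cauchy_product(a, b, N):
    """First N coefficients of the product of two power series."""
    a = list(a) + [0] * N
    b = list(b) + [0] * N
    return [sum(a[k] * b[i - k] for k in range(i + 1)) for i in range(N)]

N = 5
f = [1, 2, 0, 3, 1]
g = [4, 1, 5, 0, 2]

# phi is multiplicative: phi(f) phi(g) = phi(f g)  (mod x^N)
lhs = mat_mul(phi(f, N), phi(g, N))
rhs = phi(cauchy_product(f, g, N), N)
assert lhs == rhs
```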

Very similarly, we can define the map $\psi:R[x]/\langle x^n\rangle\longrightarrow M_n(R)$ such that $$\psi\left(\sum_{i=0}^{n-1}a_ix^i+\langle x^n\rangle\right)=(a_{j-i})_{i,j=0}^{n-1},$$ again with $a_k:=0$ for $k<0.$

Let $f=\sum_{i=0}^\infty a_ix^i\in R[[x]].$ Now $\phi(f)$ is a matrix over $R$ and as such can be identified with an endomorphism $\alpha_f$ of $V(R).$ As an endomorphism, $\alpha_f$ takes arguments from $V(R)$ and sends them to other elements of $V(R).$ However, the original power series $f$ is not usually thought of as having such a capability. So we can define, for any $f\in R[[x]]$ and $y\in V(R),$ $$fy:=\alpha_f(y).$$
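Concretely (still assuming the convention $\phi(f)_{ij}=a_{j-i}$, which the column-finiteness of $M_\infty(R)$ forces), the action can be computed inside a finite block, since $y$ has finite support. A small Python sketch with a hypothetical helper `act`:

```python
# Sketch of the action f y := alpha_f(y).  Since y has finite support and
# phi(f) is upper triangular (convention phi(f)[i][j] = a_{j-i}), only an
# N x N block ever matters, where N bounds the support of y.

def act(f_coeffs, y, N):
    """(f y)_i = sum_{j >= i} a_{j-i} y_j = sum_k a_k y_{i+k}."""
    a = list(f_coeffs) + [0] * N
    yy = list(y) + [0] * (N - len(y))
    return [sum(a[j - i] * yy[j] for j in range(i, N)) for i in range(N)]

f = [1, 1, 1, 1, 1, 1]   # truncation of 1 + x + x^2 + ...
y = [2, 0, 5]            # y = 2 e_0 + 5 e_2
N = 6
fy = act(f, y, N)

# agrees with the direct formula (f y)_i = sum_k a_k y_{i+k}
yy = y + [0] * (N - len(y))
direct = [sum(f[k] * yy[i + k] for k in range(N - i)) for i in range(N)]
assert fy == direct == [7, 5, 5, 0, 0, 0]
```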

I would like to know if thinking of formal power series acting in this way on $V(R)$ is natural. I was trying to see some natural way in which a power series can be thought of as an endomorphism of a free module or vector space but I failed.

It's easy to write down what $fy$ actually is. Let $y\neq 0$ for simplicity. Let $\{y_i\}_{i=0}^\infty$ be the coefficients of $y$ in the basis $e_0,e_1,\ldots$, and let $n$ be the index such that $y_{n-1}\neq 0$ but $y_i=0$ for all $i\geq n$. Then calculating $fy$ is essentially the same as calculating $$fy=\sum_{i=0}^{n-1}\left(\sum_{k=0}^{n-1-i}a_ky_{i+k}\right)e_i,$$ that is, the $i$-th coefficient of $fy$ is $\sum_k a_ky_{i+k}.$

I've looked at these formulas for quite some time now and they still tell me nothing. Could you please tell me if you recognize them?

An analogous question can be asked about the finite-dimensional case of $R[x]/\langle x^n\rangle$ and the map $\psi$. I hereby ask it as well, since this post is already so long and LaTeX-laden that I have to wait a second or two for my typing to show up on the screen.

3 Answers

So I think it is more natural to think of these matrices as acting on the right on row vectors rather than acting on the left on column vectors. This action is actually the natural action by right multiplication of $R[[x]]$ on itself (resp. $R[x]/x^n$ on itself) with respect to the "basis" $\{ 1, x, x^2, ... \}$ (resp. the basis $\{ 1, x, ... x^{n-1} \}$).

The action you've chosen to look at is instead the dual action on $R$-linear functionals $R[[x]] \to R$ (resp. $R[x]/x^n \to R$), but for me to elaborate on this would require making some distracting distinctions between left and right modules because you didn't specify that $R$ was commutative...

Here's another way to think about the situation. The basic concepts here are, I think, obscured by working in full generality so I'll assume that $R = k$ is a field. Okay, so you know that if $V$ is a $k$-vector space and $T : V \to V$ is a $k$-linear operator, then we have a natural homomorphism $k[x] \to \text{End}(V)$ given by sending $x$ to $T$. A natural question to ask is when this homomorphism can be extended to a homomorphism from $k[x]/x^n$ or from $k[[x]]$.

Well, to get a homomorphism $k[x]/x^n \to \text{End}(V)$ we need $T$ to satisfy $T^n = 0$. If we want the homomorphism to be injective, then we also want $T^{n-1} \neq 0$. This means we can find some $e_n \in V$ such that $T^{n-1} e_n \neq 0$. But now $T$ preserves $\text{span}(e_n, T e_n, ... T^{n-1} e_n)$, so we might as well restrict our attention to this subspace, and naming $e_i = T^{n-i} e_n$ we might as well take $V = \text{span}(e_1, ... e_n)$. The action of $T$ on this basis is then given by an $n \times n$ upper-triangular matrix with $1$s above the diagonal and $0$s elsewhere.
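A minimal numeric sketch of this single Jordan block (written $0$-indexed, so $T e_0 = 0$ and $T e_i = e_{i-1}$, matching the $1$-indexed description above after a shift of labels):

```python
# Sketch of the single nilpotent Jordan block (0-indexed: T e_0 = 0,
# T e_i = e_{i-1}), which realizes an injective k[x]/x^n -> End(V).

n = 4

def T(v):
    # shift each coefficient of e_i down to e_{i-1}
    return v[1:] + [0]

def T_power(v, k):
    for _ in range(k):
        v = T(v)
    return v

e_top = [0] * (n - 1) + [1]                   # e_{n-1}
assert T_power(e_top, n - 1) == [1, 0, 0, 0]  # T^{n-1} e_{n-1} = e_0 != 0
assert T_power(e_top, n) == [0] * n           # T^n = 0
```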

To get a homomorphism $k[[x]] \to \text{End}(V)$ is more subtle. A sufficient but not necessary condition is that $T$ is locally nilpotent, which means that for every $v \in V$ there exists $n$ such that $T^n v = 0$. This means that $f(T) v$ is well-defined for every formal power series $f$.

The basic example is the shift $T e_1 = 0$, $T e_{i+1} = e_i$, which is locally nilpotent. Written in the basis $\{ e_1, e_2, e_3, ... \}$ this is the infinite upper-triangular matrix with $1$s above the diagonal and $0$s elsewhere. In characteristic zero, another way to think about this example is as the action of differentiation $\frac{d}{dx}$ on $k[x]$ with respect to the basis $\{ 1, x, \frac{x^2}{2!}, \frac{x^3}{3!}, ... \}$.
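Here is a small Python sketch of local nilpotence making $f(T)v$ well-defined, using exact rational arithmetic; the example checks the characteristic-zero fact that $\exp(d/dx)$ is the Taylor shift $p(x)\mapsto p(x+1)$:

```python
# Sketch: the shift T (e_i -> e_{i-1}) is locally nilpotent on finitely-
# supported vectors, so f(T)v is a finite sum for any power series f.
import math
from fractions import Fraction

def shift(v):
    # in the basis {1, x, x^2/2!, ...} this is differentiation d/dx
    return v[1:] + [0]

def apply_series(a, v):
    """f(T)v = sum_k a(k) T^k v; terminates since T^k v = 0 for k >= len(v)."""
    result = [0] * len(v)
    w = list(v)
    for k in range(len(v)):
        result = [r + a(k) * x for r, x in zip(result, w)]
        w = shift(w)
    return result

# example: exp(T) = exp(d/dx) is the Taylor shift p(x) -> p(x+1).
# Applied to e_3 = x^3/3! it gives (x+1)^3/3! = 1/6 + x/2 + x^2/2! + x^3/3!.
exp_coeffs = lambda k: Fraction(1, math.factorial(k))
assert apply_series(exp_coeffs, [0, 0, 0, 1]) == [Fraction(1, 6), Fraction(1, 2), 1, 1]
```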

In general, keep in mind "Cayley's theorem for rings": every ring $R$ embeds into the endomorphism ring $\text{End}(R)$ of its underlying abelian group by left multiplication.
– Qiaochu Yuan, May 2 '12 at 20:59

Thank you very much! I'd never have guessed I should be multiplying this the other way around to see something I'd recognize. I will think about this Cayley theorem thing and come back if I have questions.
– user23211, May 2 '12 at 21:10

First take the case of a field, $\mathbb F$, of characteristic zero, then let $p_n(y) = \frac{y^n}{n!}$.

Then the $p_n$ form a basis for $\mathbb F[y]$ as a vector space over $\mathbb F$.

What is the operation of $\mathbb F[[x]]$ on $\mathbb F[y]$? It turns out that $x$ acts by sending $p_n(y)$ to $p_{n-1}(y)$ (and $p_0(y)$ to $0$). So if an element of $\mathbb F[y]$ is written as $\sum {a_n p_n(y)}$, the result of having $x$ act on it is the derivative of the polynomial. In other words, $x$ acts as $D_y$. So we can think of:

$$\mathbb F[[x]] \cong \mathbb F[[D_y]]$$

Essentially, $(\sum a_i x^i)\cdot p(y) = \sum a_i p^{(i)}(y)$

So this action is differentiation in the case of a field of characteristic zero.

You can extend this idea to a ring $R$ if $\mathbb Q\subset R$.

Now if $R$ is commutative of arbitrary characteristic, then you can define a ring on $\oplus_{i\in\mathbb N} R$ by defining $(a_i)\cdot(b_i) = (c_i)$ where: $$c_i = \sum_{j=0}^{i}\binom{i}{j}a_jb_{i-j}.$$
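A quick Python sanity check of the divided-power multiplication $c_i=\sum_{j=0}^{i}\binom{i}{j}a_jb_{i-j}$, verifying that the basis elements multiply by $p_jp_k=\binom{j+k}{k}p_{j+k}$:

```python
# Quick check of the divided-power multiplication
# c_i = sum_{j=0}^{i} C(i, j) a_j b_{i-j}.
from math import comb

def dp_mul(a, b):
    n = len(a)
    return [sum(comb(i, j) * a[j] * b[i - j] for j in range(i + 1))
            for i in range(n)]

def basis(n, i):
    # the formal element p_i, as a coefficient vector of length n
    return [1 if t == i else 0 for t in range(n)]

# basis elements multiply by p_j p_k = C(j+k, k) p_{j+k}
n, j, k = 8, 2, 3
prod = dp_mul(basis(n, j), basis(n, k))
assert prod == [comb(j + k, k) if t == j + k else 0 for t in range(n)]
```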

There's no need to restrict to the case that $\mathbb{Q} \subset R$; the abstract construction you've written down in terms of formal entities $p_n$ works regardless. The resulting algebra (defined by $p_n p_m = {n+m \choose m} p_{n+m}$) just can't be embedded in the algebra of polynomials; instead it's the algebra of "divided powers."
– Qiaochu Yuan, May 2 '12 at 21:30

Yes, I was aware of that; I just wanted to make it clear that it is very much "like" a derivative. In particular, as I've added to the answer, the associated ring has a multiplication which, when acted on by $x$, gives the very familiar product rule of differentiation. @QiaochuYuan
– Thomas Andrews, May 2 '12 at 21:33