Are the roots of the entire family of Legendre polynomials dense in the interval $[0,1]$ (i.e., is it impossible to find a subinterval, no matter how small, that doesn't contain at least one root of one of the polynomials)?

If anyone knows of an article/text that proves any of the above, please let me know. The definition of these polynomials can be found on Wikipedia.

According to a Corollary (page 114) in the book linked below, the Legendre polynomial of degree $k$ has $k$ distinct roots in the interval $(-1,1)$. tinyurl.com/29c89tu
– Timothy Wagner Nov 28 '10 at 3:32

4

1. Yes. It is a deep theorem of the theory of orthogonal polynomials that all their roots within their support interval are simple. 2. There are no explicit closed forms for the general roots of a Legendre polynomial, but there are asymptotic expansions for the roots.
– Guess who it is. Nov 28 '10 at 3:39

3

3. Note that the roots of successive Legendre polynomials are interlacing (they form a Sturm sequence).
– Guess who it is. Nov 28 '10 at 3:40

1

For 2) I mean: are these polynomials solvable by radicals (can the roots be written in a finite amount of space using rational numbers and radicals)?
– user3180 Nov 28 '10 at 3:41

1

@user3971: that's a rather specific definition of "calculated exactly"... what makes a radical so easy to calculate compared to other functions? Even solving a cubic equation "by radicals" via the cubic formula requires one to compute cosines in the general case, and I see no reason this is essentially easier than computing any other function whose Taylor coefficients decay rapidly.
– Qiaochu Yuan Nov 28 '10 at 3:44

5 Answers
5

To resolve the second question, note first that the Legendre polynomials are odd functions for odd order (so $0$ is then a root of the polynomial) and even functions for even order. Thus, with regard to solubility in terms of radicals, you should be able to derive (possibly complicated!) radical expressions at least up to $P_9(x)$. To use that as an example, note that

$$\frac{P_9(\sqrt{x})}{\sqrt{x}}$$

is a quartic in $x$; thus, one can use the quartic formula to derive explicit radical expressions for its roots, and from those the roots of $P_9(x)$ are easily obtained ($0$ together with $\pm$ the square roots of the quartic's roots).
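As a sketch of this reduction (using sympy; the variable names are my own), one can carry out the substitution explicitly and ask for radical roots of the resulting quartic:

```python
from sympy import legendre, symbols, expand, Poly, roots

y, x = symbols('y x')

# P_9 has only odd powers of y, so P_9(y)/y has only even powers,
# and substituting y**2 -> x yields a quartic in x.
quartic = Poly(expand(legendre(9, y) / y).subs(y**2, x), x)
assert quartic.degree() == 4

# The quartic is solvable, so sympy returns explicit radical expressions.
radical_roots = roots(quartic)
assert len(radical_roots) == 4

# The roots of P_9 itself are 0 and +/- the square roots of these values;
# each quartic root should be real and lie in (0, 1).
for r in radical_roots:
    val = complex(r.evalf())
    assert abs(val.imag) < 1e-9 and 0 < val.real < 1
```

The resulting radical expressions are exact but quite unwieldy, which is part of why the numeric route below is preferred in practice.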

$P_{10}(x)$ is where your trouble starts. If we take a look at the polynomial

$$P_{10}(\sqrt{x})$$

we have a quintic to contend with. I'll skip the relatively tedious details, but you can verify that its Galois group is not solvable, and thus its roots cannot be expressed in terms of radicals (you can use theta or hypergeometric functions, though).
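One can at least watch a radical solver come up empty on this quintic (a consistency check, not a proof; sympy's `roots` with `quintics=True` only succeeds on solvable quintics):

```python
from sympy import legendre, symbols, expand, Poly, roots

y, x = symbols('y x')

# P_10 is even, so substituting y**2 -> x turns it into a quintic in x.
quintic = Poly(expand(legendre(10, y)).subs(y**2, x), x)
assert quintic.degree() == 5

# No radical expressions are found: consistent with the Galois group
# of this quintic not being a solvable group.
assert roots(quintic, quintics=True) == {}
```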

So, there is not much hope on the symbolic front. On the numeric front, things are much easier. The slickest way of getting accurate values of the roots of a Legendre polynomial is to use the Jacobi matrix from my other answer. Since there exist stable and efficient algorithms (e.g. the QR algorithm or divide-and-conquer) for the symmetric eigenproblem (in LAPACK, for instance), and they can be arranged to return only the eigenvalues, this gives a good way of generating accurate approximations to the roots of a Legendre polynomial. (In the context of Gaussian quadrature, where the roots of orthogonal polynomials play a pivotal role, the scheme is referred to as the Golub-Welsch algorithm.)
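A minimal sketch of this route (the helper name is mine; it assumes the Jacobi-matrix entries $k/\sqrt{4k^2-1}$ described in my other answer):

```python
import numpy as np
from scipy.linalg import eigh_tridiagonal

def legendre_roots(n):
    """Roots of P_n as eigenvalues of the symmetric tridiagonal Jacobi
    matrix: zero diagonal, off-diagonal entries k/sqrt(4k^2 - 1)."""
    k = np.arange(1, n)
    off = k / np.sqrt(4.0 * k**2 - 1.0)
    # eigvals_only=True: Golub-Welsch would also take the quadrature
    # weights from the eigenvectors' first components; we skip those.
    return eigh_tridiagonal(np.zeros(n), off, eigvals_only=True)

# Cross-check against numpy's Gauss-Legendre nodes.
nodes, _ = np.polynomial.legendre.leggauss(12)
assert np.allclose(np.sort(legendre_roots(12)), np.sort(nodes))
```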

Alternatively, as I mentioned in the comments, there exist asymptotic approximations for the roots, which can then be polished with a few applications of Newton-Raphson. One such asymptotic approximation is due to Francesco Tricomi. Letting $\xi_{n,k}$ be the $k$-th root of $P_n(x)$, ordered in decreasing order, we have

$$\xi_{n,k}\approx\left(1-\frac{1}{8n^2}+\frac{1}{8n^3}\right)\cos\left(\frac{4k-1}{4n+2}\pi\right)$$
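A sketch of that polishing step, assuming Tricomi's leading-order approximation and the standard derivative identity $(x^2-1)\,P_n'(x) = n\,(x\,P_n(x) - P_{n-1}(x))$; the function names are my own:

```python
import numpy as np

def legendre_p_and_dp(n, x):
    """Evaluate P_n(x) and P_n'(x) via the three-term recurrence."""
    p0, p1 = np.ones_like(x), x
    for k in range(1, n):
        p0, p1 = p1, ((2 * k + 1) * x * p1 - k * p0) / (k + 1)
    dp = n * (x * p1 - p0) / (x**2 - 1)   # derivative identity
    return p1, dp

def tricomi_newton(n, iters=3):
    """Tricomi's asymptotic guesses polished by Newton-Raphson."""
    k = np.arange(1, n + 1)
    x = (1 - 1/(8*n**2) + 1/(8*n**3)) * np.cos(np.pi * (4*k - 1) / (4*n + 2))
    for _ in range(iters):
        p, dp = legendre_p_and_dp(n, x)
        x -= p / dp
    return x

# Cross-check against numpy's Gauss-Legendre nodes.
nodes, _ = np.polynomial.legendre.leggauss(20)
assert np.allclose(np.sort(tricomi_newton(20)), np.sort(nodes))
```

Since the initial guesses are already good for moderate $n$, a handful of Newton iterations reaches machine precision.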

I'll answer question 1 only for now, but I might edit this to address the others later.

One should note that corresponding to any set of orthogonal polynomials, there exists a symmetric tridiagonal matrix, called a Jacobi matrix, whose characteristic polynomial is the monic (leading coefficient 1) version of the set of orthogonal polynomials considered. To use the Legendre polynomials as an explicit example, we first note that the monic Legendre polynomials satisfy the following three-term recurrence relation:

$$\hat{P}_{n+1}(x)=x\,\hat{P}_n(x)-\frac{n^2}{4n^2-1}\,\hat{P}_{n-1}(x),\qquad \hat{P}_0(x)=1,\quad \hat{P}_1(x)=x.$$

Symmetrizing this recurrence yields the Jacobi matrix

$$\mathbf{J}=\begin{pmatrix}0&\frac{1}{\sqrt{3}}&&\\\frac{1}{\sqrt{3}}&0&\frac{2}{\sqrt{15}}&\\&\frac{2}{\sqrt{15}}&0&\ddots\\&&\ddots&\ddots\end{pmatrix}$$

(the general pattern is that you have $\frac{n}{\sqrt{4 n^2-1}}$ in the $(n,n+1)$ and $(n+1,n)$ positions, and 0 elsewhere).
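One can verify the claim symbolically for a small case, say $n=4$ (a sketch in sympy; it checks that the characteristic polynomial of the $4\times 4$ Jacobi matrix equals the monic $P_4$, whose leading coefficient is $35/8$):

```python
from sympy import Matrix, Rational, legendre, sqrt, symbols, simplify

x = symbols('x')
n = 4

# Symmetric tridiagonal Jacobi matrix: zero diagonal,
# off-diagonal entries k/sqrt(4k^2 - 1) for k = 1..n-1.
J = Matrix.zeros(n, n)
for k in range(1, n):
    J[k - 1, k] = J[k, k - 1] = k / sqrt(4 * k**2 - 1)

char = J.charpoly(x).as_expr()

# Monic P_4: divide out the leading coefficient 35/8.
monic_p4 = (Rational(8, 35) * legendre(n, x)).expand()
assert simplify(char - monic_p4) == 0
```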

We now note that $\frac{n}{\sqrt{4 n^2-1}}$ can never be 0, and then use the fact that if a symmetric tridiagonal matrix has no zeroes in its sub- or superdiagonal, then all its eigenvalues have multiplicity 1. (A proof of this fact can be found in Beresford Parlett's The Symmetric Eigenvalue Problem.) Thus, all the roots of the Legendre polynomial are simple roots.

A more conventional proof of this fact is on page 27 of Theodore Chihara's An Introduction to Orthogonal Polynomials. Briefly, the argument is that $P_n(x)$ changes sign at least once within $(-1,1)$ (and thus has at least one zero of odd multiplicity within the support interval), since

$$\int_{-1}^1 P_n(u)\mathrm du=0$$

Now, the polynomial

$$P_n(x)\prod_{j=1}^k(x-\xi_j)$$

where the $\xi_j$ are the distinct zeroes of odd multiplicity of $P_n$ within $[-1,1]$, does not change sign within $[-1,1]$ (each sign change of $P_n$ is cancelled by the corresponding linear factor), so we may take it to be nonnegative there; its integral over $[-1,1]$ is therefore strictly positive. However, suppose toward a contradiction that $k<n$. Then $\prod_{j=1}^k(x-\xi_j)$ has degree less than $n$, and since

$$\int_{-1}^1 P_n(u)\, u^j\,\mathrm du=0\qquad\text{if}\qquad j < n,$$

the integral must vanish. This contradiction forces $k=n$, and thus all the roots of the Legendre polynomial are simple (and lie within the support interval $[-1,1]$).
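The two ingredients of the argument, orthogonality to all lower powers and the resulting $n$ simple roots in $(-1,1)$, are easy to spot-check (a sketch in sympy for $n=6$):

```python
from sympy import legendre, integrate, symbols, Poly

u = symbols('u')
n = 6

# Orthogonality to every lower power of u: the engine of the argument.
for j in range(n):
    assert integrate(legendre(n, u) * u**j, (u, -1, 1)) == 0

# Consequence: P_n has n distinct real roots, all inside (-1, 1).
rts = Poly(legendre(n, u), u).real_roots()
assert len(set(rts)) == n
assert all(-1 < float(r) < 1 for r in rts)
```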