Let $x_1 < x_2 < \ldots < x_n$ and $y_1 < y_2 < \ldots < y_n$ be two sequences
of $n$ real numbers. It is well known that there are polynomials that "interpolate"
in the sense that $f(x_i)=y_i$ for all $i$, and the Lagrange interpolating polynomial
even guarantees a solution of degree $< n$. Now, what happens if we also want the
polynomial $f$ to be nondecreasing on the interval $[x_1,x_n]$? Is there always
a solution, and is there a bound on the degree as well?
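(To see why the question is not vacuous: the Lagrange interpolant through increasing data need not itself be increasing. A small numerical sketch, using NumPy and data I made up for illustration:)

```python
import numpy as np

# Strictly increasing data; the unique degree-3 interpolant through it
# nevertheless fails to be monotone between the nodes.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.0, 0.01, 0.02, 1.0])

coeffs = np.polyfit(x, y, 3)   # degree 3 through 4 points: exact interpolation
deriv = np.polyder(coeffs)

grid = np.linspace(0, 3, 301)
print(np.polyval(deriv, grid).min())  # negative: the interpolant dips somewhere
```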

4 Answers

This problem has appeared before in the literature and is by now, I believe, well understood. The general version is the one where you place no order restriction on the $y_i$'s and ask for an interpolating polynomial that is monotone on each subinterval $[x_i,x_{i+1}]$. The first paper proving the existence of such a polynomial is:

but it is a non-constructive proof, as it uses the Weierstrass approximation theorem much like the answer given by Harald Hanche-Olsen above. Another proof for the case $0=y_0\le \cdots \le y_n=1$ is given in "Polynomial Approximations to Finitely Oscillating Functions" by W.J. Kammerer (Theorem 4.1); the non-constructive ingredient of his proof is the uniform convergence of appropriate Bernstein polynomials. In "Piecewise monotone polynomial interpolation", S.W. Young proves the same theorem and makes the closing remark that the existence of such a monotone interpolating polynomial is in fact equivalent to the Weierstrass theorem. On the other hand, Rubinstein has some papers devoted to proving the existence of interpolating polynomials which are increasing on all of $\mathbb R$.
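(The Bernstein-polynomial fact underlying Kammerer's proof is easy to check numerically: if $f$ is nondecreasing on $[0,1]$, then so is its Bernstein polynomial $B_n f(x)=\sum_k f(k/n)\binom{n}{k}x^k(1-x)^{n-k}$, since $B_nf'$ is a nonnegative combination of forward differences of $f$. A quick sketch, my own illustration in Python:)

```python
import numpy as np
from math import comb

def bernstein(f, n):
    """Return the degree-n Bernstein polynomial of f on [0,1] as a callable."""
    vals = [f(k / n) for k in range(n + 1)]
    def B(x):
        return sum(v * comb(n, k) * x**k * (1 - x)**(n - k)
                   for k, v in enumerate(vals))
    return B

f = lambda t: np.sqrt(t)            # nondecreasing on [0,1]
B = bernstein(f, 20)
xs = np.linspace(0, 1, 200)
ys = np.array([B(x) for x in xs])
print(np.all(np.diff(ys) >= 0))     # True: B_n f inherits the monotonicity
```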

The first paper which gives bounds on the degrees is, I think,

E. Passow and L. Raymon, "The degree of piecewise monotone interpolation",

and an improvement is made in "Exact estimates for monotone interpolation" by G.L. Iliev.
Note that the bounds are in terms of
$$A=\max_i \Delta y_i=\max_i (y_{i}-y_{i-1}), \qquad B=\min_i \Delta y_i, \qquad C=\min_i \Delta x_i.$$

To add to Gjergji Zaimi's informative answer: It is easy to see that the degree cannot be bounded in terms of $n$ alone, even when $n=3$.

Suppose that we want $f$ of degree $m$ such that $f(0)=0$, $f(1)=\epsilon$, and $f(2)=1$, and $f$ is increasing on $[0,1]$, where $\epsilon>0$ is small. Since $f$ increases from $0$ to $\epsilon$ on $[0,1]$, we have $|f(k/m)| \le \epsilon$ for $k=0,\ldots,m$, so the Lagrange interpolation formula on the nodes $k/m$ shows that for fixed $m$ the coefficients of $f$ are $O(\epsilon)$; hence $f(2)$ is $O(\epsilon)$ and cannot equal $1$ if $\epsilon$ is small enough. In other words, the degree of any solution $f$ must grow as $\epsilon$ shrinks.
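(The implicit constant can be made explicit: writing $f(2)=\sum_k f(k/m)\,\ell_k(2)$ with $\ell_k$ the Lagrange basis on the nodes $k/m$, we get $1=|f(2)|\le \epsilon \sum_k |\ell_k(2)|$, so any degree-$m$ solution forces $\epsilon \ge 1/\Lambda_m$ where $\Lambda_m=\sum_k |\ell_k(2)|$. A small exact computation, my own sketch using Python's `fractions`:)

```python
from fractions import Fraction

def lam(m):
    """Lambda_m = sum of |l_k(2)| for the Lagrange basis on nodes k/m, k=0..m.
    If |f(k/m)| <= eps for all k and deg f <= m, then |f(2)| <= eps * lam(m),
    so f(2) = 1 forces eps >= 1 / lam(m)."""
    total = Fraction(0)
    for k in range(m + 1):
        lk = Fraction(1)
        for j in range(m + 1):
            if j != k:
                lk *= Fraction(2 * m - j, k - j)   # (2 - j/m) / (k/m - j/m)
        total += abs(lk)
    return total

print(lam(1), lam(2))   # 3 17 : eps must be at least 1/3, resp. 1/17
```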

I don't know if this has been studied, but at least if you forget about a bound on the degree, a sledgehammer approach gives a positive answer. For simplicity, assume $x_i\in[0,1]$ and proceed by induction on $n$, with the induction hypothesis being the existence of an increasing interpolating polynomial $p_{n-1}$ through the first $n-1$ points. To get from $n-1$ to $n$, let $$p_n(x)=p_{n-1}(x)+(x-x_1)\cdots(x-x_{n-1})q_n(x)$$ where $q_n$ is a polynomial to be determined. Given $p_{n-1}'(x)>\epsilon>0$ we have a little wiggle room: we merely need $|q_n(x)|$ and $|q_n'(x)|$ to be very small for $0<x<x_{n-1}$ (exactly how small is left as an exercise) and $q_n'(x)>0$ for $x_{n-1}<x<1$. To achieve this, write $$q_n(x)=\int_0^x r_n(t)\,dt$$ and use the Weierstrass approximation theorem to let $r_n$ approximate a suitable continuous function. Adjust by a positive multiplicative constant to hit $p_n(x_n)=y_n$ exactly.

The astute reader will notice a problem with this: if $p_{n-1}(x_n)\ge y_n$, this prescription fails. So we have to make sure that $r_{n-1}$, after shooting up to a nice big value around $x_{n-1}$, comes quickly back down to a small value, so that this does not happen. This complicates the proof quite a bit, though, and I am not about to work through the details. I'd be interested in pointers to the literature.
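(The structural point of the construction is that the correction term $(x-x_1)\cdots(x-x_{n-1})q_n(x)$ vanishes at all earlier nodes, so the previous interpolation conditions survive; even a constant $q_n$ then lets you hit the new node exactly. Monotonicity, of course, needs the more careful choice of $q_n$ described above. A numerical sketch of just the interpolation-preserving step, with data of my own choosing:)

```python
import numpy as np

x = np.array([0.0, 0.3, 0.7, 1.0])
y = np.array([0.0, 0.2, 0.5, 1.0])

# p3: interpolant through the first three points (the induction hypothesis).
p3 = np.polyfit(x[:3], y[:3], 2)

# Correction term vanishes at x_1, x_2, x_3, so interpolation there survives;
# the constant c (a degenerate q_4) is chosen to hit the fourth point.
w = np.poly([x[0], x[1], x[2]])                          # (x-x_1)(x-x_2)(x-x_3)
c = (y[3] - np.polyval(p3, x[3])) / np.polyval(w, x[3])
p4 = np.polyadd(p3, c * w)

print(np.allclose(np.polyval(p4, x), y))                 # True: all four matched
```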