In trying to solve the problem $\sqrt D f(x)=g(x)$, I tried to expand the square root of the derivative operator as a Taylor series, and I have run into a lot of problems. Is there some reason this shouldn't be possible? The Taylor series for the $1/2$ derivative seems to be $\sum\limits_{n\geqslant 0} {1/2 \choose n} (D-1)^n$, with $D$ being $\frac{d}{dx}$. The series is centered at $1$ rather than $0$ because centering it at zero produces a lot of sums of infinities. Since $D$ is really a matrix, $\sqrt D$ is also a matrix, so I think the $1$ here should really be the identity operator. Any help would be appreciated.
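To make the series concrete: on eigenfunctions of $D$ it at least behaves sensibly. Since $De^{ax}=ae^{ax}$, we get $(D-1)^n e^{ax}=(a-1)^n e^{ax}$, so the series acts on $e^{ax}$ as multiplication by $\sum_n {1/2 \choose n}(a-1)^n=\sqrt a$, provided $|a-1|<1$. A quick numerical check in plain Python (`gen_binom` is just an ad-hoc helper for the generalized binomial coefficient):

```python
def gen_binom(r, n):
    """Generalized binomial coefficient C(r, n) for real r."""
    out = 1.0
    for k in range(n):
        out *= (r - k) / (k + 1)
    return out

a = 1.3  # any a with |a - 1| < 1, so the series converges
# (D - 1)^n e^{a x} = (a - 1)^n e^{a x}, so the operator series acts on
# e^{a x} as multiplication by sum_n C(1/2, n) (a - 1)^n = sqrt(a).
s = sum(gen_binom(0.5, n) * (a - 1) ** n for n in range(60))
print(s, a ** 0.5)  # both ≈ 1.1401754
```

So the series does recover $\sqrt a\,e^{ax}$ here, which matches what one would want from $\sqrt D$ on an eigenfunction; my trouble is with general $f$.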

2 Answers

We want $D_x^{1/2}$ to be an operator (setting aside uniqueness for a moment) such that $D_x^{1/2}\left(D_x^{1/2}f(x)\right)=f'(x).$ We also want it to have some reasonably nice properties, such as linearity and the product rule. Let's first work out $D_x^{1/2}(a_nx^n)$, so we can exploit linearity and study the more general case, $D_{x}^{1/2}\left[\sum_{k\geqslant 0}a_kx^k\right]$. First, show by induction that $$D^{r}_x(a_nx^n)=\frac{n!}{(n-r)!}a_nx^{n-r}\tag{$r\in\mathbb{N}$}$$ and assume it extends to fractional $r$ via the $\Gamma$-function, i.e. $\frac{n!}{(n-r)!}=\frac{\Gamma(n+1)}{\Gamma(n+1-r)}$. By linearity, $$D^{r}_x\left(\sum_{n\geqslant 0}a_nx^n\right)=\frac{r!}{x^r}\sum_{n\geqslant 0}{n\choose r}a_nx^{n}.$$ Since you are looking for a Taylor expansion, write it: $$D_{x}^{r}\left[f(x+a)\right]=\sum_{n\geqslant 0}\frac{x^{n-r}}{(n-r)!}f^{(n)}(a).$$ Set $r=1/2$ and we're there. However, this answer is only a sketch; there are other issues, such as convergence and the existence of these operators. Consider also negative orders (antiderivatives).
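As a numeric sanity check of the $\Gamma$-extended monomial rule (plain Python; `half_derivative_monomial` is just an illustrative helper, not a standard function): applying $D^{1/2}$ twice to $x$ should give $Dx=1$.

```python
from math import gamma

def half_derivative_monomial(a, n):
    """Apply D^{1/2} to a*x^n via the Gamma-extended rule
    D^r (a x^n) = Gamma(n+1)/Gamma(n+1-r) * a * x^(n-r).
    Returns (coefficient, exponent) of the resulting monomial."""
    r = 0.5
    return (gamma(n + 1) / gamma(n + 1 - r) * a, n - r)

# D^{1/2} x = (2/sqrt(pi)) * x^{1/2}
c1, e1 = half_derivative_monomial(1.0, 1.0)
# Apply D^{1/2} again: should recover D x = 1 (constant, exponent 0)
c2, e2 = half_derivative_monomial(c1, e1)
print(c2, e2)  # ≈ 1.0, 0.0
```

The two half-steps compose to the ordinary derivative on this monomial, which is exactly the defining property asked for above.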

1) You can integrate both sides of the equation $f(x)=D^{\frac{1}{2}}g(x)$. To carry this out, however, you need the resulting function to be of the form $D^{\frac{1}{2}}g(x)=x^{\lambda}\eta(x)$, where $\eta$ is an analytic function and $\lambda\leq\frac{3}{2}$.

It would take too long to give you all the details that justify this argument, but you can find them in "An Introduction to the Fractional Calculus and Fractional Differential Equations" by Miller and Ross, Theorem 3, page 105. Basically, it is a more complicated version of the Fundamental Theorem of Calculus that you already know.
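As a numerical sketch of what "integrating both sides" means under the Riemann-Liouville definition (the helper below is my own illustration, not from Miller and Ross): the half-integral is $I^{1/2}g(x)=\frac{1}{\Gamma(1/2)}\int_0^x (x-t)^{-1/2}g(t)\,dt$, and applying it to $D^{1/2}f=g$ recovers $f$ up to the usual initial terms. For $g\equiv 1$ this should give $f(x)=2\sqrt{x/\pi}$:

```python
from math import sqrt, pi

def rl_half_integral(g, x, steps=20000):
    """Riemann-Liouville half-integral:
    I^{1/2} g(x) = 1/Gamma(1/2) * ∫_0^x (x-t)^{-1/2} g(t) dt.
    Substituting t = x - u^2 removes the endpoint singularity:
    = 2/sqrt(pi) * ∫_0^{sqrt(x)} g(x - u^2) du   (Gamma(1/2) = sqrt(pi)),
    evaluated here with a simple midpoint rule."""
    b = sqrt(x)
    h = b / steps
    total = sum(g(x - (h * (k + 0.5)) ** 2) for k in range(steps)) * h
    return 2 / sqrt(pi) * total

# Solve D^{1/2} f = g with g = 1: applying I^{1/2} gives f(x) = 2*sqrt(x/pi)
x = 2.0
print(rl_half_integral(lambda t: 1.0, x), 2 * sqrt(x / pi))
```

The quadrature agrees with the closed form; for less trivial $g$ the same substitution keeps the integrand bounded, which is the practical payoff of this route.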

2) If the hypothesis of 1) fails, you can try applying the Laplace transform to both sides of the equation, solving algebraically (which will in general be easy for this equation, unless $g$ is very ugly), and going back with the inverse transform. But in most cases you are going to get a special function as an answer, no matter how simple $g$ is.
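A sketch of option 2) on the simplest right-hand side (my own worked example, assuming zero initial terms): for $D^{1/2}f=g$ with $g\equiv 1$, $\mathcal{L}[D^{1/2}f](s)=s^{1/2}F(s)$, so $F(s)=s^{-1/2}\cdot\frac{1}{s}=s^{-3/2}$, whose inverse transform is $f(x)=2\sqrt{x/\pi}$. Checking this candidate against the $\Gamma$-function rule for monomials:

```python
from math import gamma, sqrt, pi

# Laplace route for D^{1/2} f = 1 (zero initial terms assumed):
#   F(s) = (1/s) * s**(-1/2) = s**(-3/2)  ->  f(x) = 2*sqrt(x/pi)
# Verify D^{1/2} f = 1 using D^r x^n = Gamma(n+1)/Gamma(n+1-r) * x^(n-r):
coeff = 2 / sqrt(pi)                       # f(x) = coeff * x**0.5
result = coeff * gamma(1.5) / gamma(1.0)   # exponent drops to 0.5 - 0.5 = 0
print(result)  # ≈ 1.0
```

Even in this easy case the answer involves $\sqrt{x}$ rather than an elementary polynomial, which hints at why special functions show up for generic $g$.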

(All this assumes you are using the Riemann-Liouville definition of fractional calculus. I would forget about Taylor.)