Under what conditions on a and b is there a distribution $f_{a,b}$ such that the product $XY$ of two independent realizations $X$ and $Y$ from $f_{a,b}$ has a Beta(a,b) distribution?

A standard result on deriving the distribution of the product of two independent random variables indicates that the p.d.f. $f_{a,b}$ needs to satisfy:
$\int_0^1 f_{a,b}(x)\, f_{a,b}\!\left(\frac{v}{x}\right) \frac{1}{x}\, dx = \frac{v^{a-1} (1-v)^{b-1}}{B(a,b)}$ for $0<v<1$, where $B$ denotes the beta function (since $f_{a,b}$ must be supported on $(0,1)$, the integrand vanishes for $x<v$). But I have no idea how to find such an $f_{a,b}$.

This article ( http://www.jstor.org/stable/2045709 ) might be relevant for the case a=b=1, but my background in the relevant maths is not strong enough to understand it at all.

The problem might simplify if you look at the logs of $X$ and $Y$, which should sum to a log-beta, of course. Summing RVs means convolving distributions, and convolution becomes a product after taking a Fourier transform.
–
Jeff Schenker Jul 21 '10 at 15:18

3 Answers

A partial answer is that $f_{a,b}$ exists for every positive integer $b$.

To see this, first recall that for every positive $s$ and $a$ the distribution Gamma$(s,a)$ has density proportional to $z^{s-1}e^{-az}$ on $z\ge0$ and that the sum of independent Gamma$(s,a)$ and Gamma$(t,a)$ is Gamma$(s+t,a)$.

If $b=1$, choose $X=e^{-Z}$ and $Y=e^{-T}$ where $Z$ and $T$ are i.i.d. Gamma$(1/2,a)$. Then $X$ and $Y$ are i.i.d. and $XY=e^{-(Z+T)}$ where $Z+T$ is Gamma$(1,a)$, that is, exponential with parameter $a$, hence $XY$ is Beta$(a,1)$ and you are done if $b=1$.
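As a quick numerical sanity check of this construction (a sketch; the choice $a=2$ is arbitrary): the density of $X=e^{-Z}$ with $Z$ Gamma$(1/2,a)$ is $f(x) = \sqrt{a/\pi}\,(-\log x)^{-1/2} x^{a-1}$ on $(0,1)$, using $\Gamma(1/2)=\sqrt{\pi}$, and plugging it into the product-density equation from the question should return the Beta$(a,1)$ density $a v^{a-1}$:

```python
import numpy as np
from scipy.integrate import quad

a = 2.0  # arbitrary positive shape parameter

def f(x):
    # density of X = exp(-Z), Z ~ Gamma(shape 1/2, rate a), on (0, 1)
    return np.sqrt(a / np.pi) * (-np.log(x)) ** -0.5 * x ** (a - 1)

def product_density(v):
    # density of XY at v: integrate f(x) f(v/x) / x over x in (v, 1);
    # the endpoint singularities are integrable, so quad can handle them
    val, _ = quad(lambda x: f(x) * f(v / x) / x, v, 1, limit=200)
    return val

for v in (0.1, 0.5, 0.9):
    target = a * v ** (a - 1)  # Beta(a, 1) density at v
    assert abs(product_density(v) - target) < 1e-3 * target
```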

From there, recall that the product of two independent Beta$(a,c)$ and Beta$(a+c,b-c)$ is Beta$(a,b)$ for every $c$ in $(0,b)$. Assume that $b$ is a positive integer and choose $X=e^{-Z_1-Z_2-\cdots-Z_b}$ and $Y=e^{-T_1-T_2-\cdots-T_b}$ where all the $Z_k$ and $T_i$ are independent and, for each $k$, $Z_k$ and $T_k$ are both Gamma$(1/2,a+k-1)$. Then, by the $b=1$ case, each $e^{-(Z_k+T_k)}$ is Beta$(a+k-1,1)$ hence $XY$ is the product of some independent Beta$(a,1)$, Beta$(a+1,1)$, ..., Beta$(a+b-1,1)$, hence it is Beta$(a,b)$ and you are done for every positive integer $b$.
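A Monte Carlo check of this construction (a sketch; the parameter values and the Kolmogorov–Smirnov comparison are my own choices):

```python
import numpy as np
from scipy.stats import beta, gamma, kstest

rng = np.random.default_rng(0)
a, b = 1.5, 3            # b must be a positive integer; a > 0 is arbitrary
n = 100_000

def sample(n):
    # X = exp(-(Z_1 + ... + Z_b)), Z_k ~ Gamma(shape 1/2, rate a + k - 1)
    s = np.zeros(n)
    for k in range(b):   # k = 0, ..., b-1 gives rates a, ..., a + b - 1
        s += gamma.rvs(0.5, scale=1.0 / (a + k), size=n, random_state=rng)
    return np.exp(-s)

prod = sample(n) * sample(n)
res = kstest(prod, beta(a, b).cdf)
assert res.statistic < 0.01   # consistent with XY ~ Beta(a, b)
```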

If you're only interested in the existence of a solution Jeff's suggestion of looking at the logarithms seems to be the correct approach. Let $Z$ have Beta$(a,b)$ distribution. Suppose that there is a solution to the "square root" problem in this case, and let $X$ be a random variable with this distribution.
Then, if $\phi(u) = E[e^{i u \log(Z)}] = E[Z^{i u}]$ and $\psi(u) = E[X^{i u}]$ are the characteristic functions (or Fourier transforms) of $\log(Z)$ and $\log(X)$, respectively, it must be the case that $\psi(u)^2 = \phi(u)$ for all $u\in\mathbb{R}$.

Because we know a lot about Beta distributions, we can compute explicitly that $\phi(u) = \frac{\Gamma(a+b)}{\Gamma(a)} \frac{\Gamma(a+i u)}{\Gamma(a+b+i u)}$. Then we can try to prove the existence of a solution by defining $\psi(u) = \sqrt{\phi(u)}$ and letting $X$ be a random variable such that $\log(X)$ has characteristic function $\psi(u)$.
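A quick check of this formula for $\phi$ (a sketch; the parameter values are arbitrary), comparing it to a Monte Carlo estimate of $E[Z^{iu}]$:

```python
import numpy as np
from scipy.special import gamma as G   # accepts complex arguments

rng = np.random.default_rng(1)
a, b = 2.0, 3.5                        # arbitrary positive parameters
z = rng.beta(a, b, size=400_000)

def phi(u):
    # characteristic function of log(Z) for Z ~ Beta(a, b)
    return G(a + b) / G(a) * G(a + 1j * u) / G(a + b + 1j * u)

for u in (0.5, 2.0, 7.0):
    mc = np.mean(z ** (1j * u))        # Monte Carlo estimate of E[Z^{iu}]
    assert abs(mc - phi(u)) < 0.01
```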

There are a couple of issues that need to be smoothed over. First of all, $\phi(u)$ is a complex-valued function so we need to be careful with the square root. However, since $\phi(u)\neq 0$ for all $u\in\mathbb{R}$ we can take the square root in a continuous manner. The bigger issue is that we need to be able to prove that $\psi(u)$ is a characteristic function of some probability distribution. For this we need some Fourier inversion theorems.

Theorem 14 in A Modern Approach to Probability Theory by Fristedt and Gray gives sufficient conditions for a function to be the characteristic function of a real-valued random variable.
There are several conditions to check. First, that $\psi(0)=1$ and $\psi$ is continuous. These are obviously satisfied. Next, that $\psi(u)$ is "positive definite". That is, $\sum_{k=1}^n \sum_{j=1}^n \psi(u_k-u_j)z_j\bar{z}_k \geq 0$ for any complex $n$-tuple $(z_1,\ldots, z_n)$ and real $n$-tuple $(u_1,\ldots, u_n)$. I don't know how to check whether this is true. The final condition is that $\int |\psi(u)|\, du < \infty$. I'm not an expert on the $\Gamma$ function, so I plotted $\psi(u)$ with Mathematica, and it looks like $|\psi(u)|$ decays roughly like $|u|^{-b}$ as $|u|\rightarrow \infty$, so that $\int |\psi(u)|\, du < \infty$ if $b>1$.

I still can't think of how to check the "positive definite" condition, but if one could check it, the problem would be solved for $b>1$.

Didier informed me of a small mistake in my explanation above. I said that $\psi(u)$ decays like $|u|^{-b}$. However, what is true is that $\phi(u)$ decays roughly at that rate, so $\psi(u)=\sqrt{\phi(u)}$ decays like $|u|^{-b/2}$, and for $\psi$ to be in $L^1$ we would need $b>2$. Given Didier's proof above for the cases where $b$ is a positive integer, I believe that the "square root" distribution probably exists for any $a,b>0$. I still don't know what to do about proving the "positive definite" condition, but the integrability assumption on $\psi$ can probably be removed by someone who knows inverse Fourier transforms better than I do.
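The corrected decay rate can be checked numerically (a sketch, using `scipy.special.loggamma` to avoid overflow at large $|u|$): by Stirling's formula $|\Gamma(a+iu)/\Gamma(a+b+iu)| \sim |u|^{-b}$, so the log-log slope of $|\phi(u)|$ should approach $-b$, and hence $|\psi| = |\phi|^{1/2}$ decays like $|u|^{-b/2}$:

```python
import numpy as np
from scipy.special import loggamma     # handles complex arguments

a, b = 2.0, 3.5                        # arbitrary positive parameters

def log_abs_phi(u):
    # log|phi(u)|, phi(u) = Gamma(a+b)/Gamma(a) * Gamma(a+iu)/Gamma(a+b+iu)
    return (loggamma(a + b) - loggamma(a)
            + loggamma(a + 1j * u) - loggamma(a + b + 1j * u)).real

u1, u2 = 50.0, 500.0
slope = (log_abs_phi(u2) - log_abs_phi(u1)) / (np.log(u2) - np.log(u1))
assert abs(slope + b) < 0.05           # |phi(u)| ~ C |u|^{-b}
# so |psi| = |phi|^{1/2} ~ C |u|^{-b/2}, integrable only when b > 2
```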
–
Jon Peterson Jul 28 '10 at 17:47

Thank you for the feedback. I am also interested in existence, but I would love to be able to find the square root distribution, if it exists, given $a$ and $b$. Since you indicate that they likely exist, and Didier's work shows how to obtain them for integral $b$, the best way forward for me might be to attempt to extend that characterization to non-integral $b$.
–
Steve Kroon Aug 16 '10 at 13:55


Yes. Call $B$ the set of parameters $b$ such that $f_{a,b}$ exists for every positive $a$. We know that $B$ is closed under addition and that it contains $1$, hence $B$ contains every positive integer. And $B$ might also be a closed subset of the real line. But it is not clear (to me) that $B$ should contain every positive $b$. In particular, the positive definiteness condition which Jon alluded to might fail for small positive values of $b$. Who knows... :-)
–
Did Aug 17 '10 at 16:04

It turns out that I'm now looking for the square root of the Beta(0.5, 0.5) distribution, so non-integral b has suddenly become more important to me.
–
Steve Kroon Sep 7 '10 at 7:38

Hmm, Beta$(1/2,1/2)$ is also the distribution of the fraction of time spent on the positive half-line by Brownian motion started at the origin (Lévy's arcsine law). Have you pursued ideas of this sort, taking two independent copies of this random variable?
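A sketch of that simulation idea for one-dimensional Brownian motion, approximated by a simple random walk (step counts and tolerances are my own choices, and the discrete walk only approximates the arcsine law):

```python
import numpy as np

rng = np.random.default_rng(2)
walks, steps = 4000, 2000
# fraction of time a simple random walk is strictly positive; by Levy's
# arcsine law this converges in distribution to Beta(1/2, 1/2)
paths = rng.choice([-1, 1], size=(walks, steps)).cumsum(axis=1)
frac = (paths > 0).mean(axis=1)

# Beta(1/2, 1/2) has CDF F(t) = (2/pi) * arcsin(sqrt(t))
for t in (0.1, 0.5, 0.9):
    F = 2 / np.pi * np.arcsin(np.sqrt(t))
    assert abs((frac <= t).mean() - F) < 0.05
```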
–
Ivan Dornic Sep 27 '10 at 20:19

Denoting $Z = X Y$, take Mellin-transform expectations of both sides.
Since $E[X^{s-1}] = B(a+s,b)$, the condition you demand amounts to finding the solutions of
$B(a+s,b)/B(a,b)=1$, which can also be rewritten as:

$\Gamma(a) \Gamma(a+s) = \Gamma(a+b) \Gamma(a+b+s).$

EDIT: I apologize for not having read Jon Peterson's first answer in detail, which is far superior to my feeble (though unwanted) rephrasing...

I'm not clear on what you mean by "Mellin-transform expectation". However, there seems to be something wrong with your solution, since $\Gamma(a)\Gamma(a+s)=\Gamma(a+b)\Gamma(a+b+s)$ cannot hold for $b=1$ (the RHS will be strictly greater than the LHS), while the discussion in the previous comments has shown that $b=1$ and any $a$ satisfy my requirements.
–
Steve Kroon Sep 29 '10 at 10:39