I have two independent random variables each with a logistic distribution (not necessarily same mean and variance).

How is the difference (or sum) of the two variables distributed?

Dragan

04-23-2010, 01:43 PM

I have two independent random variables each with a logistic distribution (not necessarily same mean and variance).

How is the difference (or sum) of the two variables distributed?

See if Theorem 4 on page 7 helps you.

http://www.math.chalmers.se/~wastlund/dmtcsPath.pdf

koochak

04-25-2010, 05:47 PM

Thanks for the reply. If I'm reading that right, the result is limited to two random variables from the same logistic distribution with location = 0 and scale = 1. I was looking for a more general result. Even then, the expression for h(x) is a cdf that I don't recognise; that is to say, I cannot work out the corresponding pdf.

Dragan

04-25-2010, 07:18 PM

Thanks for the reply....that is to say I cannot work out the corresponding pdf.

It is a straightforward exercise to work out the pdf. It is right here:

Now just scale it to meet your needs. For example, suppose I wanted to standardize the pdf so that the variance is equal to 1. To accomplish this task, I would multiply not only the entire pdf by the square root of the variance but also multiply each value of x in the pdf by the square root of the variance as well. Get the idea...
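The expression referred to above was posted as an image and has not survived in this archive. As a sketch of what it plausibly was: convolving two independent standard logistic(0, 1) densities gives the closed form below (which is consistent with the f(0) = 1/6 limit mentioned later in the thread), and rescaling it as just described standardizes the variance to 1. The function names here are mine, not from the original post.

```python
import math

def sum_logistic_pdf(x):
    """Density of Z = X + Y for X, Y i.i.d. standard logistic(0, 1).

    Obtained by convolving f(t) = e^(-t) / (1 + e^(-t))^2 with itself.
    The formula is 0/0 at x = 0; the limiting value there is 1/6.
    """
    if x == 0:
        return 1.0 / 6.0
    e = math.exp(-x)
    return e * (x * (1.0 + e) - 2.0 * (1.0 - e)) / (1.0 - e) ** 3

# Sanity checks with a crude midpoint rule over [-30, 30]:
step = 0.01
xs = [-30.0 + (k + 0.5) * step for k in range(6000)]
total = sum(sum_logistic_pdf(x) for x in xs) * step             # ~1
variance = sum(x * x * sum_logistic_pdf(x) for x in xs) * step  # ~2*pi^2/3

# Standardizing as described above: if Z has variance v, then W = Z / sqrt(v)
# has pdf g(x) = sqrt(v) * f(sqrt(v) * x), which has variance 1.
sd = math.sqrt(2.0 * math.pi ** 2 / 3.0)
def standardized_pdf(x):
    return sd * sum_logistic_pdf(sd * x)

std_var = sum(x * x * standardized_pdf(x) for x in xs) * step   # ~1
print(total, variance, std_var)
```

Note that both the pdf's value and its argument get multiplied by the square root of the variance, exactly as described in the post above.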

koochak

04-26-2010, 04:08 AM

The problem (as posed originally) is for the case where the two r.v.s come from distributions with different means and variances.
However, assuming for now that they are from the same distribution with mu = 0, your expression for f(x) does not look right. This is because I would expect the sum of the two r.v.s to have a mean of 0 (and be well-behaved near the mean). However f(x) is undefined for x=0.

Dragan

04-26-2010, 08:35 AM

... However f(x) is undefined for x=0.

f(x) is not undefined at x = 0 - the height of the pdf there is f(0) = 1/6. Note, you have to take the limit of f(x) as x -> 0.
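For the record, the 0/0 form resolves by series expansion. Assuming the density in question is the convolution of two standard logistic densities,

```latex
f(x) = \frac{e^{-x}\,\bigl[x\,(1+e^{-x}) - 2\,(1-e^{-x})\bigr]}{(1-e^{-x})^{3}},
\qquad
e^{-x} = 1 - x + \tfrac{x^{2}}{2} - \tfrac{x^{3}}{6} + O(x^{4}),
```

so the bracketed numerator expands to x^3/6 + O(x^4) while the denominator expands to x^3 + O(x^4), giving lim_{x -> 0} f(x) = 1/6.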

Anyway, if the two independent r.v.s have different means, then all you do is sum the two means. Next, subtract this sum (the mean of the combined r.v.) from x in f(x) above before multiplying by the constant that will change the variance (as described above).

Obviously you obtain the "new" variance by taking the sum of the variances associated with the two independent r.v.s. (Variance Sum Law for independent r.v.s)
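Stated explicitly, for independent X and Y with Z = X + Y:

```latex
E[Z] = \mu_X + \mu_Y,
\qquad
\operatorname{Var}(Z) = \sigma_X^2 + \sigma_Y^2 .
```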

Seems to work fine for me....Just as you posed in the original question:)

Edit:

Here is an example for clarity.

Let X and Y be independent logistic random variables with means of 1/4 and 3/4, respectively. Further, let X and Y have variances of 1 (Scale of Sqrt[3]/Pi) and 4 (Scale of 2*Sqrt[3]/Pi), respectively.

Thus, X and Y have different means and variances as was asked above.

Now, let Z = X + Y.

As such, Z will have a mean of 1 and variance of 5.
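A quick simulation check of this example, sampling each logistic by inverting its cdf. (One caveat worth flagging: matching the mean and variance this way pins down the first two moments of Z; for unequal scales the exact density of Z is the convolution of the two logistic densities, not an exact rescaling of the equal-scale sum pdf.)

```python
import math
import random

random.seed(0)

def logistic_sample(mu, s):
    """Draw from a logistic(mu, s) distribution via inverse-cdf sampling."""
    u = random.random()
    return mu + s * math.log(u / (1.0 - u))

# Scales chosen so Var = s^2 * pi^2 / 3 gives variances 1 and 4, as in the example.
s_x = math.sqrt(3.0) / math.pi        # Var(X) = 1
s_y = 2.0 * math.sqrt(3.0) / math.pi  # Var(Y) = 4

n = 200_000
z = [logistic_sample(0.25, s_x) + logistic_sample(0.75, s_y) for _ in range(n)]

mean_z = sum(z) / n
var_z = sum((v - mean_z) ** 2 for v in z) / n
print(mean_z, var_z)   # approximately 1 and 5
```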

To obtain the pdf associated with Z you would take the standard pdf above and scale it by the constant