Suppose I have a random variable $Z = X + \alpha Y$ and that $F_Z$ is the cumulative distribution function of $Z$. If we think of the probability $p = F_Z(q) = \mathbb{P}(X+\alpha Y < q)$ as a function $p = p(q, \alpha)$, how can we write the derivative $\partial_{\alpha} p(q, \alpha)$, assuming as much regularity on the distributions of $X$ and $Y$ as we want?

I'm loosely thinking of a situation where you know the marginal distributions $F_X$ and $F_Y$ and perhaps, locally (around $(q, \alpha)$), some information about the dependence between $X$ and $Y$ (maybe correlation is enough in some cases to build an approximation?).

I guess I should say I'm looking for something that doesn't use the full joint distribution function.
– mathtick Jul 22 '12 at 1:24


Then you're looking for something that doesn't exist.
– Robert Israel Jul 22 '12 at 7:46


To see that correlation isn't enough, remember that adding outliers can change the correlation coefficient while having a negligible effect on probabilities like $P(X+\alpha Y \lt q)$.
– Douglas Zare Jul 22 '12 at 9:44

... and in case the internet changes by the time you read the previous comment, the reference is: "Sensitivity analysis of Values at Risk", C. Gourieroux, J.P. Laurent, O. Scaillet.
– mathtick Aug 7 '12 at 19:43
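For what it's worth, a derivative of this form can be checked numerically in the one case where everything is in closed form: $X, Y$ standard bivariate normal with correlation $\rho$. Then $Z = X + \alpha Y \sim N(0,\, 1 + 2\alpha\rho + \alpha^2)$, so $p(q,\alpha) = \Phi(q/\sigma_Z)$, and one can compare a finite difference of $p$ in $\alpha$ against the expression $-\,\mathbb{E}[Y \mid Z = q]\, f_Z(q)$ (both the conditional mean and the density of $Z$ are explicit here). This is a hedged sketch for the Gaussian case only; the specific values $q = 0.7$, $\alpha = 0.3$, $\rho = 0.5$ are just illustrative:

```python
import math

def Phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def phi(x):
    """Standard normal PDF."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def p(q, alpha, rho):
    """P(X + alpha*Y < q) for standard bivariate normal (X, Y), corr rho."""
    sigma = math.sqrt(1.0 + 2.0 * alpha * rho + alpha * alpha)  # sd of Z
    return Phi(q / sigma)

def dp_dalpha(q, alpha, rho):
    """Candidate formula -E[Y | Z = q] * f_Z(q), closed-form in the
    Gaussian case: E[Y | Z = q] = q * Cov(Y, Z) / Var(Z)."""
    sigma2 = 1.0 + 2.0 * alpha * rho + alpha * alpha
    sigma = math.sqrt(sigma2)
    cond_mean_Y = q * (rho + alpha) / sigma2  # E[Y | Z = q]
    f_Z = phi(q / sigma) / sigma              # density of Z at q
    return -cond_mean_Y * f_Z

# Central finite difference in alpha vs. the closed-form expression.
q, alpha, rho, h = 0.7, 0.3, 0.5, 1e-6
fd = (p(q, alpha + h, rho) - p(q, alpha - h, rho)) / (2.0 * h)
print(fd, dp_dalpha(q, alpha, rho))
```

The two printed numbers agree to within finite-difference error, consistent with a sensitivity of the form $-\,\mathbb{E}[Y \mid Z = q]\, f_Z(q)$ in this special case; note that the conditional expectation still encodes joint (not just marginal) information, in line with the comments above.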