By Lemma 3.2 of Cortissoz,
the infimum above is $d_P(\lambda,\mu)$:
the Lévy-Prohorov distance between the two laws.
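For reference, and assuming I have the standard definitions right (with $A^\epsilon$ denoting the open $\epsilon$-neighborhood of a Borel set $A$), the two metrics in play are

$$\alpha(X,Y) = \inf\{\epsilon > 0 : P(|X-Y| > \epsilon) \le \epsilon\},$$

$$d_P(\lambda,\mu) = \inf\{\epsilon > 0 : \lambda(A) \le \mu(A^\epsilon) + \epsilon \ \text{ and } \ \mu(A) \le \lambda(A^\epsilon) + \epsilon \ \text{ for all Borel } A\}.$$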

The infimum is achieved if we are allowed to choose both random variables.
That is, there exist $X_1$ and $Y_1$ on $(\Omega, {\cal B}, \lambda)$
with ${\cal L}(X_1) = \lambda$, ${\cal L}(Y_1) = \mu$, and
$\alpha(X_1,Y_1) = d_P(\lambda,\mu)$.
But in my problem, I want to fix the random variable $X$.

Why the result may be true: the
space $L^0(\Omega, {\cal B}, \lambda)$ is huge. There
are lots of random variables with law $\mu$. I can't think of any
obstruction to finding such a random variable.

Why the result may be false: the
space $L^0(\Omega, {\cal B}, \lambda)$ is huge. A compactness
argument seems hopeless to me. I can't think of any
construction for finding such a random variable.

How is the fact that you want to fix the rv $X$ relevant? Isn't it true that for any two rvs $X_1$ and $X_2$ with the same law, there is an isomorphism of $(0,1)$ taking $X_1$ to $X_2$?
–
Ori Gurel-Gurevich Oct 26 '10 at 20:20

@Ori: I'm not quite sure what kind of isomorphism you have in mind. The random variables $X_1(\omega)=2\omega-\lfloor 2\omega\rfloor$ and $X_2(\omega)=3\omega-\lfloor 3\omega\rfloor$ are both uniform (0,1) random variables, but neither can be written as a function of the other. It's certainly possible that an easy transformation or observation solves the problem. I'd be glad to hear about it!
–
Byron Schmuland Oct 27 '10 at 1:26
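A quick numerical sanity check of the example in the comment above (the sample size and grid are my choices; this is simulation only, not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
omega = rng.uniform(0.0, 1.0, size=200_000)

# The two maps from the comment: doubling and tripling mod 1.
x1 = (2 * omega) % 1.0
x2 = (3 * omega) % 1.0

# Both should be (approximately) uniform on (0, 1): compare each
# empirical CDF to the identity function on a grid of points.
grid = np.linspace(0.05, 0.95, 19)
for x in (x1, x2):
    ecdf = np.array([(x <= t).mean() for t in grid])
    assert np.abs(ecdf - grid).max() < 0.01
```

The point of the comment, of course, is the opposite direction: although both have the same (uniform) law, neither is a function of the other.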

You're right, I didn't understand the question at first. The way I understand it now I would almost say it is not a question in probability as it depends on the representation of the random variable in question.
–
Ori Gurel-Gurevich Oct 27 '10 at 2:47

In point 2), are you appealing to Strassen's theorem? Don't you need the laws of both $X$ and $Y$ to be tight?
–
Ngoc Mai Tran Feb 25 '11 at 8:13

2 Answers

Because what follows doesn't fit in a comment, I am writing it here as an answer, though it is really just a collection of comments. After computing the minimizer for several simple distributions, my impression is that the answer to this question is yes, and that there will be many minimizers.

Intuitively, it seems possible to build an optimizer as follows: we are given the law $\mu$ and we would like to find a function $f$ such that 1) the distribution of $f$ is $\mu$ and 2) $\alpha(f,X)$ is minimal. Let $\epsilon > 0 $ be this minimum. Let $F(x) = \mu( (-\infty, x])$, i.e., $F$ is the distribution function associated with $\mu$, and let $G$ be the generalized inverse of $F$: $G(x) \doteq \inf\{y: F(y) \ge x \}$. By construction, $G$, viewed as a random variable on $(0,1)$ with Lebesgue measure, has distribution $\mu$. Draw the graphs of the functions $u(x) = x + \epsilon$ and $l(x) = x -\epsilon$ around the graph of the function $X(x) = x$. To get the minimizer, cut the graph of $G$ into $n$ small pieces with lines parallel to the $x$ axis and shift the pieces along these lines so that they lie between the graphs of $l$ and $u$ as much as possible. As the number of pieces increases and their size decreases, you would expect this to converge to a function that is the desired minimizer. The result will depend on the particulars of this process.
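The inverse-function fact used above (that $G$ has law $\mu$) can be checked numerically; here is a sketch for one specific choice of $\mu$ (an exponential law, my choice of example):

```python
import numpy as np

# Target law mu: exponential(1), so F(y) = 1 - exp(-y) for y >= 0,
# and the generalized inverse is G(x) = inf{y : F(y) >= x} = -log(1 - x).
def G(x):
    return -np.log1p(-x)

rng = np.random.default_rng(1)
u = rng.uniform(0.0, 1.0, size=200_000)
y = G(u)  # G, evaluated at a uniform point of (0, 1)

# Check that the law of G is indeed mu: empirical CDF of y vs. F on a grid.
grid = np.linspace(0.1, 3.0, 30)
ecdf = np.array([(y <= t).mean() for t in grid])
F = 1.0 - np.exp(-grid)
assert np.abs(ecdf - F).max() < 0.01
```

This is just the usual inverse-CDF (quantile transform) construction, which is what makes $G$ a canonical representative of the law $\mu$ on $(0,1)$.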

As to non-uniqueness: suppose $f$ is a minimizer. Denote by $E$ the subset of $[0,1]$ on which $f$ differs from $X$ by at least $\epsilon$. The values that $f$ takes on $E$ can be freely permuted without affecting the distribution of $f$ or the distance between $f$ and $X$. So when one minimizer exists, there will be infinitely many.
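The first half of that claim, that permuting the values of $f$ over a subset leaves its law unchanged, is easy to check numerically (the distance part is the answer's heuristic and is not tested here; the grid, threshold, and candidate $f$ below are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
x = (np.arange(n) + 0.5) / n          # stand-in for X(omega) = omega on a grid
f = rng.uniform(0.0, 1.0, size=n)     # some candidate function on the same grid

eps = 0.1                             # stand-in for the threshold epsilon
E = np.abs(f - x) >= eps              # the set where f is already "far" from X

# Randomly permute the values of f over E only.
f2 = f.copy()
idx = np.flatnonzero(E)
f2[idx] = f[rng.permutation(idx)]

# The law of f is unchanged: same multiset of values.
assert np.array_equal(np.sort(f), np.sort(f2))
```

Permuting values on any measurable set is a measure-preserving rearrangement, which is why the distribution survives.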

Thanks for looking at my problem; I thought it was dead. I agree that the minimizer likely exists, and will try to pursue your strategy. The part where we take the limit has me a bit worried, though.
–
Byron Schmuland Sep 12 '10 at 16:45

This probably doesn't help at all, but I saw you were interested in the Ky Fan metric, and friends of mine have looked at it in a noncommutative setting in which there are some "extreme value" properties. Maybe there's something useful in there for you: http://arxiv.org/PS_cache/arxiv/pdf/0707/0707.4239v3.pdf