Let $\zeta \in \mathbb{R}^n$ be a fixed vector with $\zeta_i > 0$ for all $i$. I want a tight lower bound on $P(X \leq \zeta)$, that is,

$$ P(X_1 \leq \zeta_1, \ldots, X_n \leq \zeta_n) \geq ? $$

In particular, I'm interested in the case where the $\zeta_i$'s are i.i.d. $\operatorname{Exp}(1)$.

One rather loose bound when the $\zeta_i$'s are far apart is
$$ P(X_1 \leq \zeta_1, \ldots, X_n \leq \zeta_n) \geq P(\max_iX_i \leq \min_j\zeta_j) $$
and from there one can lower bound this further with Sudakov.
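As a sanity check, here is a small Monte Carlo sketch of how loose this bound can be. It assumes the correlation structure made explicit in the answer below, realizing $X_i=(Y_i-Y_{i+1})/\sqrt2$ from i.i.d. standard normals; the vector $\zeta$ is an arbitrary illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 5, 200_000

# Realize X_i = (Y_i - Y_{i+1}) / sqrt(2) from i.i.d. standard normals:
# each X_i is standard normal, and neighbors have correlation -1/2.
Y = rng.standard_normal((trials, n + 1))
X = (Y[:, :-1] - Y[:, 1:]) / np.sqrt(2)

# zeta is an arbitrary fixed positive vector, chosen only for illustration.
zeta = np.array([0.5, 1.0, 1.5, 2.0, 2.5])

joint = np.mean(np.all(X <= zeta, axis=1))    # P(X_1 <= zeta_1, ..., X_n <= zeta_n)
loose = np.mean(X.max(axis=1) <= zeta.min())  # P(max_i X_i <= min_j zeta_j)

print(joint, loose)
```

On the same samples the event $\{\max_i X_i\le\min_j\zeta_j\}$ is contained in $\{X\le\zeta\}$, so the estimate `loose` never exceeds `joint`; the gap widens as the $\zeta_i$ spread out.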

Due to the negative correlations we cannot apply Slepian's lemma directly (and the strong negative correlation would probably yield a poor bound anyway).

I've looked into the multivariate Mills ratio literature, but it seems to be concerned with bounding
$$ P(X_1 \geq \zeta_1, \ldots, X_n \geq \zeta_n) $$
for $\zeta_i > 0$.

I've also tried writing down the integral and the inverse covariance matrix explicitly. (Since the covariance matrix $\Sigma$ in this case is a triangular circulant matrix, its inverse can be found in closed form, but it is not sparse and is rather complicated.)

Any ideas would be highly appreciated. This has applications in ranking and statistics.

3 Answers

This is not a full answer, but first let me point out that things are probably simpler for random than for deterministic $\zeta_i$'s. If the $\zeta_i$ are i.i.d. exponential with parameter $1$, then for all real numbers $x_i$,
$$
P[\zeta_1\ge x_1,\ldots,\zeta_n\ge x_n]=\exp(-[x_1^++\cdots+x_n^+]),
$$
where $x^+=x$ if $x\ge0$ and $0$ otherwise. Hence,
$$
(*)=P[\zeta_1\ge X_1,\ldots,\zeta_n\ge X_n]=E[\exp(-[X_1^++\cdots+X_n^+])].
$$
A second step (certainly not optimal): since $t\mapsto\mathrm{e}^{-t}$ is convex, Jensen's inequality and the identity $E[X_1^+]=1/\sqrt{2\pi}$ for standard normal $X_1$ yield
$$
(*)\ge\exp(-E[X_1^++\cdots+X_n^+])=\exp(-nE[X_1^+])=\exp(-n/\sqrt{2\pi}).
$$
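A quick Monte Carlo sketch of this lower bound (assuming, as in the remark below, that the $X_i$ can be realized as $X_i=(Y_i-Y_{i+1})/\sqrt2$ from i.i.d. standard normals):

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 5, 200_000

# X_i = (Y_i - Y_{i+1}) / sqrt(2): standard normals with neighbor correlation -1/2.
Y = rng.standard_normal((trials, n + 1))
X = (Y[:, :-1] - Y[:, 1:]) / np.sqrt(2)

# (*) = E[exp(-(X_1^+ + ... + X_n^+))], estimated by Monte Carlo.
star = np.mean(np.exp(-np.clip(X, 0.0, None).sum(axis=1)))

# Jensen lower bound exp(-n / sqrt(2*pi)).
jensen = np.exp(-n / np.sqrt(2 * np.pi))

print(star, jensen)
```

The estimate `star` sits comfortably above `jensen`, confirming (and quantifying the slack of) the bound for small $n$.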
Note that the random variables $X_i$ are independent as soon as their indices are at distance at least $2$; hence, keeping only $\lfloor n/2\rfloor$ such pairwise independent terms $X_i^+$ in the sum,
$$
(*)\le E[\exp(-X_1^+)]^{\lfloor n/2\rfloor}.
$$
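The upper bound can be checked the same way; completing the square in $\int_0^\infty e^{-x}\varphi(x)\,dx$ gives the closed form $E[\exp(-X_1^+)]=\tfrac12+\mathrm{e}^{1/2}\,P(N(0,1)>1)$. A sketch under the same realization of the $X_i$:

```python
import math
import numpy as np

rng = np.random.default_rng(2)
n, trials = 6, 200_000

# X_i = (Y_i - Y_{i+1}) / sqrt(2): standard normals with neighbor correlation -1/2.
Y = rng.standard_normal((trials, n + 1))
X = (Y[:, :-1] - Y[:, 1:]) / np.sqrt(2)
star = np.mean(np.exp(-np.clip(X, 0.0, None).sum(axis=1)))

# E[exp(-X_1^+)] = 1/2 + e^{1/2} * P(N(0,1) > 1), and P(N(0,1) > 1) = erfc(1/sqrt(2))/2.
m = 0.5 + math.exp(0.5) * 0.5 * math.erfc(1 / math.sqrt(2))
upper = m ** (n // 2)

print(star, upper)
```

Numerically $m\approx0.762$, so for $n=6$ the upper bound is $m^3\approx0.44$, well above the Monte Carlo estimate of $(*)$.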
Another (nonconclusive) remark is that a Gaussian random vector $(X_i)_{1\le i\le n}$ with the correlation structure in the post can be realized from an i.i.d. standard Gaussian random vector $(Y_i)_{1\le i\le n+1}$ as
$$
X_i=(Y_i-Y_{i+1})/\sqrt2.
$$
Hence one also has
$$
(*)=E[\exp(-[(Y_1-Y_2)^++\cdots+(Y_n-Y_{n+1})^+]/\sqrt2)].
$$
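A minimal numerical check of this realization (unit variances, correlation $-1/2$ between neighbors, and independence at distance $\ge2$; the tolerances are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
n, trials = 4, 500_000

# X_i = (Y_i - Y_{i+1}) / sqrt(2) from i.i.d. standard normals Y_1, ..., Y_{n+1}.
Y = rng.standard_normal((trials, n + 1))
X = (Y[:, :-1] - Y[:, 1:]) / np.sqrt(2)

C = np.cov(X, rowvar=False)
assert abs(C[0, 0] - 1.0) < 0.02  # unit variance
assert abs(C[0, 1] + 0.5) < 0.02  # neighbors: correlation -1/2
assert abs(C[0, 2]) < 0.02        # distance >= 2: uncorrelated, hence independent
print(np.round(C, 2))
```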