As written, I'm not sure this question makes sense. I take it that by Jacobian you mean the determinant of the jacobian matrix. This matrix is the identity at the identity, whence your Jacobian is 1, but as soon as the exponential map fails to be bijective, it will be zero. So I would say that greatest lower bound is 0, but I doubt this is what you are asking.
– José Figueroa-O'Farrill, Jan 27 '10 at 15:31


Sorry for not being precise enough. With 'near the origin', I meant a region where the exponential is bijective. For example, the exponential is injective on the ball of radius ln(2) around the origin. A lower bound for an even smaller region that contains the origin would be fine.
– Skippy, Jan 27 '10 at 16:11

@some guy on the street: I'm not sure what else is not specific enough. Could you be more specific? :-)
– Skippy, Jan 27 '10 at 16:13

@Skippy, "on the ball of radius ln(2) around the origin" is good enough. I get the impression that it's just a tad unusual to say "Jacobian" when considering maps between spaces of different dimension (construing the orthogonals as embedded in all matrices), but that sort of generalization actually appeals to me; though I suppose you really want to compare the Haar measures on the two spaces, as related via the exponential map?
– some guy on the street, Jan 27 '10 at 22:38

1 Answer

For simplicity, I work with $2n \times 2n$ matrices; the odd-dimensional case is similar.

Summary: If $B$ is a skew symmetric matrix with eigenvalues $\pm i \theta_1$, $\pm i \theta_2$, ..., $\pm i \theta_n$ then the Jacobian matrix of the exponential near $B$ has eigenvalues
$$\frac{1-e^{-i(\pm \theta_j \pm \theta_k)}}{i(\pm \theta_j \pm \theta_k)}, \quad 1 \leq j < k \leq n,$$
as well as having the eigenvalue $1$ with multiplicity $n$. (This is $4 \binom{n}{2}+n=\binom{2n}{2}$ eigenvalues in total.)
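This eigenvalue list can be checked numerically; the following is a sketch (not from the original answer) for $n = 2$, assuming `numpy` and `scipy` are available. It uses `scipy.linalg.expm_frechet` to get the exact derivative of the matrix exponential, pulls it back by $e^{-B}$, and compares the eigenvalues of the resulting map on $\mathfrak{so}(4)$ with the formula above.

```python
import numpy as np
from scipy.linalg import expm, expm_frechet

# n = 2: B is in so(4), with eigenvalues ±i·theta_1, ±i·theta_2.
th = np.array([0.7, 0.3])
blk = lambda t: np.array([[0.0, -t], [t, 0.0]])
B = np.block([[blk(th[0]), np.zeros((2, 2))],
              [np.zeros((2, 2)), blk(th[1])]])

# Basis of so(4): E_ij = e_i e_j^T - e_j e_i^T for i < j; a skew matrix
# is determined by its strictly upper-triangular entries.
idx = [(i, j) for i in range(4) for j in range(i + 1, 4)]
def basis(i, j):
    E = np.zeros((4, 4)); E[i, j] = 1.0; E[j, i] = -1.0
    return E

eB_inv = expm(-B)
M = np.zeros((6, 6))
for k, (i, j) in enumerate(idx):
    _, D = expm_frechet(B, basis(i, j))   # derivative of expm at B in direction E
    out = eB_inv @ D                      # rotate back into so(4)
    M[:, k] = [out[a, b] for (a, b) in idx]

got = np.linalg.eigvals(M)

# Predicted: eigenvalue 1 with multiplicity n = 2, plus
# (1 - e^{-i psi}) / (i psi) for psi in {±theta_1 ± theta_2}.
phi = lambda p: (1 - np.exp(-1j * p)) / (1j * p)
psis = [s1 * th[0] + s2 * th[1] for s1 in (1, -1) for s2 in (1, -1)]
want = np.array([1.0, 1.0] + [phi(p) for p in psis], dtype=complex)

ok = (np.allclose(np.sort(got.real), np.sort(want.real)) and
      np.allclose(np.sort(got.imag), np.sort(want.imag)))
print(ok)
```

Sorting real and imaginary parts separately avoids ordering ambiguities among complex-conjugate eigenvalue pairs.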

Take whatever sort of bound you have on $B$ and turn it into a lower bound on the above quantity. Notice that, if $\theta_j + \theta_k$ gets as large as $2 \pi$, the above quantity is zero, so there is no nontrivial lower bound in that case.

A practical note on minimizing the above quantity: the log of the absolute value of the above is the sum of many terms of the form $f(\phi) := \log \sin(\phi) - \log \phi$, where $\phi$ is a linear function of the $\theta_j$ (since $|(1-e^{-i\psi})/(i\psi)| = \sin(\psi/2)/(\psi/2)$). By the second derivative test, $f$ is concave. So the sum of many terms of the form $f(\mbox{linear function})$ will be concave. This means that, on any convex region, the minimum will occur somewhere on the boundary.
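The concavity claim is a one-line computation:
$$f'(\phi) = \cot \phi - \frac{1}{\phi}, \qquad f''(\phi) = \frac{1}{\phi^2} - \frac{1}{\sin^2 \phi} < 0 \quad \text{for } 0 < \phi < \pi,$$
since $0 < \sin \phi < \phi$ on that interval.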

Notation: We write $\mathfrak{so}$ for the vector space of skew-symmetric matrices. We fix $B$ and $\theta_j$ as above.

Explanation: Let me first point out why the question makes sense. The orthogonal matrices are a manifold, not a vector space, so one might be tempted to wonder whether it even makes sense to speak of a Jacobian; let alone to speak of the eigenvalues of the Jacobian matrix.

There are two ways to fix this, a naive way and a sophisticated way, and they both give the same answer. The naive way is to point out that the orthogonal matrices are contained in the $2n \times 2n$ matrices. So we certainly have a $(2n)^2 \times \binom{2n}{2}$ matrix, giving the Jacobian matrix of the exponential map as a map from skew-symmetric matrices to all matrices. This matrix is not square; its image is the tangent plane at $e^B$ to the space of orthogonal matrices. Explicitly, that tangent plane is $e^B \mathfrak{so}$. We can rotate that tangent plane by the orthogonal matrix $e^{-B}$, giving us a map from $\mathfrak{so}$ to itself; it is now sensible to discuss the eigenvalues of that map.

The sophisticated way is to say that the Jacobian matrix is a map from $\mathfrak{so}$ to the tangent space of $SO$ at $e^B$. But that tangent space is canonically identified (by left translation) with the tangent space of $SO$ at the identity, and the latter tangent space is $\mathfrak{so}$.

Either way, we are being asked to consider the following map from $\mathfrak{so}$ to itself:
$$E \mapsto e^{-B} \lim_{t \to 0} (e^{B+tE} - e^B)/t. \quad (*)$$

Let $A$ be the map $X \mapsto [B,X]$ from $\mathfrak{so}$ to itself. By the Baker-Campbell-Hausdorff formula, $(*)$ is
$$\frac{1-e^{-A}}{A} E$$
where $(1-e^{-A})/A$ must be understood as the power series $1-A/2+A^2/6-\cdots$. If written out as matrices, $A$ would be a $\binom{2n}{2} \times \binom{2n}{2}$ matrix and $E$ would be a vector of length $\binom{2n}{2}$.

Now, if the eigenvalues of $B$ are $\pm i \theta_j$, as above, then the eigenvalues of $A$ are $0$ with multiplicity $n$ and $i(\pm \theta_j \pm \theta_k)$. (Because the root system of type $D_n$ is $\{ \pm e_j \pm e_k \}$, or because it is an easy computation.) If $\alpha_1$, $\alpha_2$, ..., $\alpha_N$ are the eigenvalues of $A$, then the eigenvalues of $(1-e^{-A})/A$ are $(1-e^{-\alpha_j})/\alpha_j$, where $(1-e^{-0})/0$ is interpreted as $1$.
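The "easy computation" can also be done numerically; here is a sketch (my addition, assuming `numpy`) for $n = 2$, writing $\operatorname{ad}_B : X \mapsto [B, X]$ as a $6 \times 6$ matrix in the standard basis of $\mathfrak{so}(4)$ and checking its spectrum is $\{0, 0\} \cup \{i(\pm\theta_1 \pm \theta_2)\}$.

```python
import numpy as np

th1, th2 = 0.7, 0.3
blk = lambda t: np.array([[0.0, -t], [t, 0.0]])
B = np.block([[blk(th1), np.zeros((2, 2))],
              [np.zeros((2, 2)), blk(th2)]])

# Matrix of ad_B : X -> [B, X] in the basis E_ij (i < j) of so(4).
idx = [(i, j) for i in range(4) for j in range(i + 1, 4)]
Amat = np.zeros((6, 6))
for k, (i, j) in enumerate(idx):
    E = np.zeros((4, 4)); E[i, j] = 1.0; E[j, i] = -1.0
    C = B @ E - E @ B                    # [B, E] is again skew-symmetric
    Amat[:, k] = [C[a, b] for (a, b) in idx]

got = np.linalg.eigvals(Amat)
want_imag = sorted([0.0, 0.0, th1 + th2, th1 - th2, -th1 + th2, -th1 - th2])

ok = (np.allclose(np.sort(got.imag), want_imag) and
      np.max(np.abs(got.real)) < 1e-10)   # spectrum is purely imaginary
print(ok)
```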