All iterative solvers I've been able to find are for a system $Ax = b$ where $b$ is a vector. Does anyone know of general iterative solvers for $AX = B$ where $X$ and $B$ are matrices, or more specifically for finding the inverse $A^{-1}$? (Assume $A$ is large and sparse.)

If $B$ has $r$ columns, you can solve $AX=B$ by calling your $Ax=b$ solver $r$ times. As Federico points out, it's almost invariably a mistake to compute $A^{-1}$. Finally, this question is not appropriate for this site.
–
Chris Godsil, May 30 '12 at 22:00
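A minimal sketch of that column-by-column approach, assuming SciPy and a made-up symmetric positive definite tridiagonal test system (not from the thread):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg

# Hypothetical SPD sparse test system: tridiagonal A, B with r = 3 columns.
n, r = 100, 3
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
B = np.random.default_rng(0).standard_normal((n, r))

# Solve AX = B one column at a time with an ordinary Ax = b iterative solver
# (conjugate gradients here, since this A is SPD).
X = np.column_stack([cg(A, B[:, j], atol=1e-10)[0] for j in range(r)])
```

Each column solve is independent, so the $r$ calls can also run in parallel.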

1 Answer
1

Unless your matrices have a special structure, both $A^{-1}$ and $A^{-1}B$ are dense (full), general matrices without any special properties. Even storing them in memory will be prohibitive.

The short answer is: don't do it. :)

The long answer is: look for structure in your matrices, and exploit it. If there is no structure and you really need to perform this computation, then your best bet is a direct (exact) solver, not an iterative one. It will be extremely slow ($O(N^3)$ time and $O(N^2)$ storage), but there are no shortcuts.
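If the direct route is taken, the standard trick is to factor $A$ once and reuse the factorization for all $r$ right-hand sides, rather than ever forming $A^{-1}$. A sketch assuming SciPy's sparse LU and a made-up tridiagonal test system:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import splu

# Hypothetical sparse test system: tridiagonal A, B with r = 4 columns.
n, r = 200, 4
A = sp.diags([1.0, 4.0, 1.0], [-1, 0, 1], shape=(n, n), format="csc")
B = np.random.default_rng(1).standard_normal((n, r))

# Factor A once (the expensive step), then back-substitute for every
# right-hand side at once; A^{-1} is never formed explicitly.
lu = splu(A)
X = lu.solve(B)
```

The factorization cost is paid once; each additional right-hand side then costs only a pair of triangular solves.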

In my case the entries of $A^{-1}$ have a probabilistic interpretation, so I need to at least approximate the inverse. $A$ is sparse because I'm making some approximations (it would otherwise be dense). I was hoping the approximations would lead to a block-diagonal structure where the largest block was tractable for exact solvers, but this was not the case. However, I believe the structure may be almost block diagonal, leading to a sparse inverse.
–
Nicholas Andrews, May 30 '12 at 22:04
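One cheap way to test the block-diagonal conjecture is to treat the sparsity pattern of $A$ as a graph: the connected components of that graph are exactly the diagonal blocks. A sketch assuming SciPy and a hypothetical $4 \times 4$ pattern:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import connected_components

# Hypothetical 4x4 pattern: two 2x2 blocks with no coupling between them.
A = sp.csr_matrix(np.array([
    [2., 1., 0., 0.],
    [1., 2., 0., 0.],
    [0., 0., 3., 1.],
    [0., 0., 1., 3.],
]))

# Connected components of the (symmetrized) sparsity graph are the diagonal
# blocks; each block could then be handed to a direct solver on its own.
n_blocks, labels = connected_components(A, directed=False)
```

If `n_blocks` comes back as 1 for the real matrix, the structure is not block diagonal in any exploitable way.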

Have you actually tried any canned software to see if your conjecture is correct?
–
Igor Rivin, May 30 '12 at 23:54

What is the probabilistic interpretation of the entries of $A^{-1}$? Are you looking for the entries of a covariance matrix of the form $(X^{T}X)^{-1}$, for example? "Almost block diagonal" really isn't going to be enough to help here: as soon as you've got one entry linking two blocks, you can get fill-in of the linking block in the inverse matrix.
–
Brian Borchers, May 31 '12 at 3:26
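The fill-in effect is easy to see on a toy example (my own, not from the thread): a truly block-diagonal matrix has a block-diagonal inverse, but a single symmetric entry coupling the blocks makes the formerly zero linking block of the inverse fully dense.

```python
import numpy as np

# Two decoupled 2x2 blocks: the inverse keeps the block structure,
# so its off-diagonal linking block is zero.
A_block = np.array([
    [2., 1., 0., 0.],
    [1., 2., 0., 0.],
    [0., 0., 3., 1.],
    [0., 0., 1., 3.],
])
inv_block = np.linalg.inv(A_block)

# Add one symmetric entry coupling the two blocks.
A_linked = A_block.copy()
A_linked[0, 3] = A_linked[3, 0] = 0.5
inv_linked = np.linalg.inv(A_linked)
# Every entry of the formerly zero linking block of the inverse is now nonzero.
```

This is why "almost block diagonal" does not, by itself, give a sparse inverse.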