Sorry if this has been asked before; I tried googling for it, but perhaps I could not find the right words to search for. My question is: what is the fastest way to compute A*inv(B) [edit] where A and B are matrices?

Do you know anything about your matrices? Are they "big", are they "sparse", are you working in a particular ring? The more information you provide, the more informed a response people can give.
–
Ryan Budney, Aug 1 '10 at 21:02

I do not think that I can answer your question anyway, but what do you mean by “fastest”? Are you interested in the asymptotic time (as in the Strassen algorithm), or in the real-world computation time? If it is the latter, I guess that stating the size of the matrices you are targeting might help.
–
Tsuyoshi Ito, Aug 1 '10 at 21:07

The general strategy is this: let $\mathbf{J} = \mathbf{A} \mathbf{B}^{-1}$; then $\mathbf{J}\mathbf{B} = \mathbf{A}$. You then rewrite this into $\mathbf{P}\mathbf{x} = \mathbf{Q}$ form (how exactly depends on the dimensions of your matrices).

For instance, for $\mathbf{A} \in \mathbb{R}^{m \times n}$, $\mathbf{B} \in \mathbb{R}^{n \times n}$ and $\mathbf{J} \in \mathbb{R}^{m \times n}$, you can write a system of equations:

$J_{i,*} \cdot B_{*,j} = A_{i,j}$

(Notation: for a matrix $\mathbf{X}$, $X_{i,*}$ denotes the $i$-th row vector, $X_{*,j}$ the $j$-th column vector, and $X_{i,j}$ the element in the $i$-th row and $j$-th column.)

Stacking these equations over all $i$ and $j$ gives $\mathbf{B}^T \mathbf{J}^T = \mathbf{A}^T$, which is exactly the $\mathbf{P}\mathbf{x} = \mathbf{Q}$ form above with $\mathbf{P} = \mathbf{B}^T$, $\mathbf{x} = \mathbf{J}^T$ and $\mathbf{Q} = \mathbf{A}^T$. From here, you can use a fast linear solver on this system to obtain the elements of $\mathbf{J}$.

By solving a linear system of equations rather than forming the inverse explicitly, you not only cut down on the number of operations required; you can also exploit properties like sparsity and inertia, and you have capabilities like preconditioning at your disposal.
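As a concrete illustration, here is a minimal NumPy sketch of this idea (the matrix sizes, the random test data, and the use of `np.linalg.solve` are my own choices for the example, not part of the recipe above):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 4, 6
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, n))

# Naive approach: form B^{-1} explicitly, then multiply.
J_naive = A @ np.linalg.inv(B)

# Better: J B = A is equivalent to B^T J^T = A^T, a standard P x = Q
# system; one factorization of B^T solves for all columns of J^T at once.
J = np.linalg.solve(B.T, A.T).T

assert np.allclose(J, J_naive)
```

For a large sparse $\mathbf{B}$ you would swap in a sparse or iterative solver (e.g. something from `scipy.sparse.linalg`) in place of the dense solve.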

More generally than exploiting sparsity in your matrices, you may have to exploit the structure inherent in A, B, or both if you want to do better than $O(n^3)$ flops. For instance, if B is of the form

$C + uv^T$

where $C$ is another (much simpler!) matrix and $u$ and $v$ are vectors or rectangular matrices of suitable dimension, you might benefit from using the Sherman-Morrison-Woodbury formula.
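In the rank-one case (where $u$ and $v$ are vectors), this reduces to the Sherman-Morrison formula: $(C + uv^T)^{-1} = C^{-1} - \frac{C^{-1}u\,v^T C^{-1}}{1 + v^T C^{-1} u}$. Here is a small sketch of applying it, taking $C$ diagonal as the "much simpler" matrix (the diagonal choice and the test data are my own assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
c = rng.uniform(1.0, 2.0, size=n)   # C = diag(c), the "much simpler" matrix
u = rng.standard_normal(n)
v = rng.standard_normal(n)
A = rng.standard_normal((3, n))

# Sherman-Morrison: (C + u v^T)^{-1} = C^{-1} - (C^{-1} u)(v^T C^{-1}) / (1 + v^T C^{-1} u)
Cinv_u = u / c                      # C^{-1} u costs only O(n) for diagonal C
vT_Cinv = v / c                     # v^T C^{-1}
denom = 1.0 + v @ Cinv_u            # nonzero exactly when C + u v^T is invertible

# A (C + u v^T)^{-1}, without ever forming an n-by-n inverse:
J = A / c - np.outer(A @ Cinv_u, vT_Cinv) / denom

# Sanity check against the direct computation.
B = np.diag(c) + np.outer(u, v)
assert np.allclose(J, A @ np.linalg.inv(B))
```

The same pattern extends to the full Woodbury identity when $u$ and $v$ are rectangular matrices, with the scalar denominator replaced by a small $k \times k$ solve.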