I have an $n \times n$ matrix over $GF(2)$, with $n \approx 10^4$, for which I need to calculate the characteristic polynomial. The matrix is very sparse, with around $n$ nonzero entries.

Is there a computer algebra system with sparse matrix structures capable of handling the calculation of the characteristic polynomial? What would be the most promising system to try?

5 Answers

I have been experimenting with Fermat to see how this goes. For a $5000 \times 5000$ test case I made up, it computed the minimal polynomial (Wiedemann algorithm) in about 30 seconds, then, using that, the characteristic polynomial in about 22 minutes. It used about 45 MB of RAM. This is on a three-year-old iMac.
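The Wiedemann idea mentioned above is easy to sketch: project the Krylov sequence $u^T A^i v$ down to a bit sequence and recover its shortest linear recurrence with Berlekamp–Massey. Here is a minimal Python illustration of that idea, not Fermat's implementation; the function names and the sparse row-list format are made up for this sketch:

```python
import random

def berlekamp_massey_gf2(s):
    """Shortest linear recurrence for the bit sequence s over GF(2).
    Returns the connection polynomial [c0=1, c1, ..., cL], i.e.
    s[n] = c1*s[n-1] + ... + cL*s[n-L] (mod 2)."""
    c, b, L, m = [1], [1], 0, 1
    for n in range(len(s)):
        d = s[n]                          # discrepancy
        for i in range(1, L + 1):
            d ^= c[i] & s[n - i]
        if d == 0:
            m += 1
            continue
        t = c[:]
        if len(b) + m > len(c):
            c += [0] * (len(b) + m - len(c))
        for i, bit in enumerate(b):
            c[i + m] ^= bit               # c(x) += x^m * b(x)
        if 2 * L <= n:
            L, b, m = n + 1 - L, t, 1
        else:
            m += 1
    return c

def wiedemann_minpoly_gf2(rows, n, trials=20):
    """Probabilistic minimal polynomial of a sparse matrix over GF(2).
    rows[i] lists the columns holding a 1 in row i.  Returns the
    coefficients lowest degree first; with small probability the
    result is only a proper divisor of the true minimal polynomial."""
    best = [1]
    for _ in range(trials):
        u = [random.getrandbits(1) for _ in range(n)]
        w = [random.getrandbits(1) for _ in range(n)]
        seq = []
        for _ in range(2 * n):            # 2n terms suffice for degree <= n
            seq.append(sum(ui & wi for ui, wi in zip(u, w)) & 1)
            w = [sum(w[j] for j in rows[i]) & 1 for i in range(n)]  # w = A*w
        c = berlekamp_massey_gf2(seq)
        if len(c) > len(best):
            best = c
    return best[::-1]
```

Note that this gives the minimal polynomial, which in general is only a divisor of the characteristic polynomial; getting from one to the other is the part that took the 22 minutes above.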

I am the author of Fermat. You can find my email address and the Fermat web site online.

@rhlewis As shown, I used a randomly generated matrix with density about 1/N, so there were in fact some zero rows, as well as some rows with one or two ones, and one row with three ones (based on a handful of random samples I looked at). If I change the density to 2/N, the runtime goes up to about 22 seconds. I just tried a random permutation matrix as well and got a runtime of about 5 minutes. That seems to make sense, as there are no zero rows to cut down the work. I used a 64-bit Linux machine with 8 GB of memory.
– James Oct 25 '11 at 5:02

for testing. The fact that the matrix is sparse is important: runtimes for dense examples were much, much longer.

EDIT: It seems that if you use a sparse matrix data structure and smaller (one-byte) integers, this can be done much more efficiently. If you have access to Maple 14 or 15, try this (and note that here $N = 10^5$):

On my machine, I get the answer in less than 0.5 seconds. (It took longer, about 0.75 seconds, to construct the matrix!) Memory used was about 45.3 MB, but going back to size $10^4 \times 10^4$ reduces the memory to 2.7 MB. I did examples with $N = 10^6$ in about 5 seconds and 220 MB of memory.

Anyway, it's clear from the various answers that there are computer algebra systems out there that can handle the computations you are interested in (at least Maple and Fermat, and likely others), so you should be in a position to choose whichever system is most convenient for you.

I have routinely used Sage (google "sage math") to compute the characteristic polynomial of adjacency matrices of graphs on around 800 vertices. I believe Sage calls LinBox for the actual computation. Note that you can read off the characteristic polynomial from the rational normal form, and this might be cheaper to compute.
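To illustrate that last point: the characteristic polynomial is the product of the invariant factors whose companion matrices make up the blocks of the rational normal form. A small SymPy check of this fact (over the rationals for simplicity; the `companion` helper is ad hoc for this sketch):

```python
from sympy import Matrix, Poly, symbols, zeros

x = symbols('x')

def companion(p):
    """Companion matrix of a monic polynomial p (last-column convention)."""
    c = p.all_coeffs()              # leading coefficient first
    n = p.degree()
    M = zeros(n)
    for i in range(1, n):
        M[i, i - 1] = 1
    for i in range(n):
        M[i, n - 1] = -c[n - i]     # negate the coefficient of x^i
    return M

# Invariant factors with p1 | p2: the rational normal form is the
# block diagonal of their companion matrices ...
p1 = Poly(x - 1, x)
p2 = Poly(x**2 - 3*x + 2, x)        # = (x - 1)(x - 2), divisible by p1
M = Matrix.diag(companion(p1), companion(p2))

# ... and the characteristic polynomial is just their product.
char = M.charpoly(x).as_expr().expand()
assert char == (p1 * p2).as_expr().expand()
```

So once the invariant factors are known, the characteristic polynomial is a polynomial multiplication, which is cheap.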

The versions of Fermat available for download were not able to use the best determinant algorithms for matrices with more than 5000 rows. I've changed that now. Google "Lewis Fermat".

I experimented some more with examples like this. I created a random $10000 \times 10000$ matrix with density 1.86. Using the best algorithm over $GF(2)$, which turns out to be just Gaussian elimination for this example, the determinant (characteristic polynomial) is computed in 18 seconds using 18 MB, on a three-year-old iMac. For larger densities, at some point the algorithm I mentioned above will be better.

Correction: Apologies for a mistake in the above: Gaussian elimination is effective only for densities < 1.01. However, the other method works well as far as I tried it, up to density 2.0, taking about 100 minutes and only about 68 MB of RAM.
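Gaussian elimination is especially cheap over $GF(2)$ because a whole row fits in a bit vector and row reduction is a single XOR. A minimal sketch of the determinant computation, not Fermat's actual code, using Python integers as bit rows:

```python
def det_gf2(rows):
    """Determinant over GF(2) of a square matrix whose i-th row is
    the Python int rows[i], with bit j giving the (i, j) entry."""
    rows = list(rows)
    n = len(rows)
    for col in range(n):
        # find a pivot row with a 1 in this column
        piv = next((r for r in range(col, n) if (rows[r] >> col) & 1), None)
        if piv is None:
            return 0                      # no pivot: matrix is singular
        # row swaps don't flip the sign, since -1 = 1 mod 2
        rows[col], rows[piv] = rows[piv], rows[col]
        for r in range(col + 1, n):
            if (rows[r] >> col) & 1:
                rows[r] ^= rows[col]      # XOR = row subtraction over GF(2)
    return 1                              # full rank: determinant is 1
```

The same elimination applied to $xI - A$ gives the characteristic polynomial, but there the entries are polynomials over $GF(2)$ rather than single bits, which is where fill-in starts to hurt at higher densities.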