Dear all, I am using ScaLAPACK to perform a singular value decomposition on a big matrix.

I compared the results from ScaLAPACK with the LAPACK code that I have: the PZGESVD routine gives the same singular values as the LAPACK routine ZGESVD, but I obtain different left and right singular vectors U and VT.

Has anybody experienced a similar problem? Here is the piece of code that I wrote.

1) Even for distinct singular values, the singular vectors are only determined up to multiplication by a scalar of the form e^(i.theta). While this is a simple +/- sign for real numbers, in the complex case it is hard to see by eye that two vectors are multiples of each other by e^(i.theta). So please check this.

2) If the singular values are equal (or very close), then only the invariant subspace is determined, and it is even harder to compare the answers of LAPACK and ScaLAPACK.

3) All in all, you can check that a) || A - U * S * V^H || / || A || is small, b) || I - U * U^H || is small, and c) || I - V * V^H || is small, for both LAPACK and ScaLAPACK. They both have A correct answer. There is no such thing as THE correct answer for this problem.

From the algorithm used, I think the results should be "identical" for two different pairs of block sizes ... Can you please try the checks: || A - U * S * V^H || / || A || and the orthogonality of U and V? --JL
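The three checks above can be sketched with NumPy standing in for the actual (Sca)LAPACK calls (the random complex matrix here is purely for illustration, not the poster's data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# Full SVD: A = U * diag(s) * V^H  (NumPy returns V^H directly as VH)
U, s, VH = np.linalg.svd(A)

# a) relative residual || A - U * S * V^H || / || A ||
res = np.linalg.norm(A - U @ np.diag(s) @ VH) / np.linalg.norm(A)

# b) orthogonality of U: || I - U * U^H ||
orth_U = np.linalg.norm(np.eye(n) - U @ U.conj().T)

# c) orthogonality of V: || I - V * V^H ||
orth_V = np.linalg.norm(np.eye(n) - VH.conj().T @ VH)

# All three quantities should be tiny (on the order of machine epsilon
# times a modest function of n, in double precision)
print(res, orth_U, orth_V)
```

If all three quantities are small for both the LAPACK and the ScaLAPACK output, both factorizations are correct, even when U and V look different element by element.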

Thanks for the help once again. I wrote the orthogonality check again, and these are the results I get on 8 processors. The result does not change when I change the number of processors or the block size.

I can assume now that my Singular Value Decomposition should be right. I compared the results anyway with my serial program written using LAPACK, and the right and left singular vectors are still different; maybe they are normalized in a different way?

In my next step of the algorithm I need to compute eigenvalues and eigenvectors of a non-symmetric matrix. I had a look, but a parallel version of the subroutine zgeev does not seem to be available.

I can assume now that my Singular Value Decomposition should be right.

Yes it is. Congratulations (not only for the successful use of the subroutines, but also for writing the check correctly).

I compared the results anyway with my serial program written using LAPACK, and the right and left singular vectors are still different; maybe they are normalized in a different way?

Please reread our initial answer. The singular vectors are not unique.

In exact arithmetic, if the singular value is unique, then the singular vectors may still differ by a factor e^(i.theta) (a complex scalar of modulus 1). In real arithmetic, this is a sign, so it is easy to spot. In complex arithmetic, it is much harder to check by eye that two vectors differ only by e^(i.theta). What is unique is the invariant subspace associated with this singular value: any vector of norm 1 in that subspace of dimension 1 is fine. Although the singular vectors are different, there is no wrong answer. LAPACK is correct, ScaLAPACK is correct.
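One way to check agreement up to a phase e^(i.theta): the inner product u1^H u2 of two unit vectors spanning the same 1-D subspace has modulus 1, and its argument recovers theta. A minimal sketch (the "second solver's" vector u2 is simulated here by applying an arbitrary phase to u1):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
U, s, VH = np.linalg.svd(A)

# Simulate a second solver's answer: the same singular vector times e^(i*theta)
u1 = U[:, 0]
u2 = np.exp(1j * 0.7) * u1          # hypothetical output of another solver

# Naive elementwise comparison reports a large difference ...
naive_diff = np.linalg.norm(u1 - u2)

# ... but |u1^H u2| = 1 reveals the two vectors span the same 1-D subspace
overlap = abs(np.vdot(u1, u2))       # np.vdot conjugates its first argument

# Recover the phase theta = arg(u1^H u2) and remove it
theta = np.angle(np.vdot(u1, u2))
aligned_diff = np.linalg.norm(u1 * np.exp(1j * theta) - u2)

print(naive_diff, overlap, aligned_diff)
```

After removing the phase, the difference drops to roundoff level, which is exactly the situation described above: two visually different vectors, both correct.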

In exact arithmetic, if the singular value is multiple, say of multiplicity 4: what is unique here is once more the invariant subspace associated with this singular value, now of dimension 4, so any orthogonal basis of this subspace is fine. This gives much more freedom to the singular value solvers ... Once more, although the singular vectors are different, there is no wrong answer. LAPACK is correct, ScaLAPACK is correct.
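In the multiple case, what can be compared is the subspace itself, not individual vectors: two orthonormal bases U1 and U2 of the same k-dimensional subspace give the same orthogonal projector, U1 * U1^H = U2 * U2^H. A small sketch (U2 is built from U1 by an arbitrary unitary mixing Q, mimicking a second solver's freedom):

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 12, 4

# U1: an orthonormal basis of some k-dimensional subspace of C^n
M = rng.standard_normal((n, k)) + 1j * rng.standard_normal((n, k))
U1, _ = np.linalg.qr(M)

# U2: a different orthonormal basis of the SAME subspace (U2 = U1 * Q, Q unitary)
Q, _ = np.linalg.qr(rng.standard_normal((k, k)) + 1j * rng.standard_normal((k, k)))
U2 = U1 @ Q

# The basis vectors differ column by column ...
col_diff = np.linalg.norm(U1 - U2)

# ... but the orthogonal projectors onto the subspace coincide
proj_diff = np.linalg.norm(U1 @ U1.conj().T - U2 @ U2.conj().T)

print(col_diff, proj_diff)
```

A small projector difference is the right notion of "same answer" when a singular value is multiple (or, in floating point, when several singular values are very close).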

In 32-bit (or 64-bit) arithmetic, if the singular values are close together ... well, you end up with the same problem as in exact arithmetic with a multiple singular value. I think this makes sense, but it is hard to describe precisely in this post what is going on.

In my next step of the algorithm I need to compute eigenvalues and eigenvectors of a non-symmetric matrix. I had a look, but a parallel version of the subroutine zgeev does not seem to be available.

Oops ... This is coming, this is coming ... maybe register yourself at lapack-announce@cs.utk.edu. We hope to have the feature available within the next 6 months. I do not recommend trying anything in ScaLAPACK for this at this point. Sorry for the bad news.

Thanks once again for the reply. I found some subroutines on the internet that should do the work for the eigenvalue/eigenvector problem, but I do not know whether they were thoroughly tested. On the other hand, since the SVD reduces the size of the matrix I have to work on, I simply use the LAPACK subroutine ZGEEV on a single processor and then send the results to the others.
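That serial fallback can be sanity-checked the same way as the SVD. A sketch with numpy.linalg.eig standing in for ZGEEV (both handle general non-symmetric complex matrices), using the residual || A V - V diag(w) || / || A || as the check:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 30
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# Eigendecomposition of a general (non-symmetric) complex matrix,
# analogous to what ZGEEV computes: A * V = V * diag(w)
w, V = np.linalg.eig(A)

# Residual check: should be tiny if the eigenpairs are correct
res = np.linalg.norm(A @ V - V @ np.diag(w)) / np.linalg.norm(A)
print(res)
```

The same caveat as for singular vectors applies: eigenvectors of a non-symmetric matrix are also only determined up to scaling, so compare via residuals rather than element by element.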

It would be nice anyway to also have the eigenvalue part in a parallel version, so I will be looking forward to the update.