On Sun, Nov 1, 2009 at 11:03 PM, Sturla Molden <sturla@molden.no> wrote:
>josef.pktd@gmail.com skrev:
>> In econometrics (including statsmodels) we have a lot of quadratic
>> forms that are usually calculated with a matrix inverse.
> That is a sign of numerical incompetence.
I agree, but that's the training. Last time I did principal components
with SVD, it took me a long time to figure out how to get it to work,
and I still don't understand it. The only matrix decomposition I'm
familiar with is eigenvalue decomposition.
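For the record, a minimal PCA-via-SVD sketch in numpy (the toy data and
variable names here are just illustrative), showing that the singular
values of the centered data matrix give the same component variances as
an eigendecomposition of the covariance matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))   # toy data: 100 observations, 5 variables

Xc = X - X.mean(axis=0)             # center each column
# economy-size SVD: Xc = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

components = Vt                     # rows are the principal axes (loadings)
scores = U * s                      # component scores, equal to Xc @ Vt.T
explained_var = s**2 / (X.shape[0] - 1)  # variance along each component

# same variances as the eigenvalues of the sample covariance matrix
evals = np.linalg.eigvalsh(np.cov(Xc, rowvar=False))[::-1]
assert np.allclose(explained_var, evals)
```

The SVD route avoids forming X'X at all, which is exactly why it is
better conditioned than the eigenvalue route.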
But we had this part of the discussion before: in applied
econometrics, if we have enough multicollinearity that numerical
precision matters, then we are screwed anyway and have to rethink the
data analysis or the model, or do a PCA.
>
> You see this often in statistics as well, people who think matrix
> inverse is the way to calculate mahalanobis distances, when you should
> really use a Cholesky.
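To make the Cholesky point concrete, a sketch (the names and toy
covariance are mine, not from the thread): since cov = L L', the
quadratic form d' inv(cov) d is just the squared norm of the
triangular solve L^{-1} d, so no inverse is ever formed:

```python
import numpy as np
from scipy import linalg

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))
cov = A @ A.T + 5 * np.eye(5)   # a symmetric positive definite covariance
x = rng.standard_normal(5)
mu = np.zeros(5)
d = x - mu

# naive: explicit inverse (slower and less accurate)
d2_inv = d @ np.linalg.inv(cov) @ d

# better: Cholesky factor cov = L @ L.T, then
# d' inv(cov) d = || L^{-1} d ||^2 via one triangular solve
L = linalg.cholesky(cov, lower=True)
z = linalg.solve_triangular(L, d, lower=True)
d2_chol = z @ z

assert np.allclose(d2_inv, d2_chol)
```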
>> As for LU, I'd rather use an SVD as it is numerically more stable.
> Using LU, you are betting on singular values not being tiny. With SVD
> you can solve an ill-conditioned system by zeroing tiny singular values.
> With LU you just get astronomic rounding errors.
How can you calculate the quadratic form or the product inv(A)*B with SVD?
Solving the equations is fine, since pinv and lstsq are based on SVD internally.
In Matlab there is also a QR-based version, but I haven't figured out
how to do this in scipy without an inverse.
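For what it's worth, here is one way to get inv(A) @ B and the
quadratic form B' inv(A) B in scipy/numpy without ever forming the
inverse, either via SVD (zeroing tiny singular values, as pinv does)
or via QR plus a triangular back-substitution; this is my sketch, not
something from the thread, and the tolerance choice is just the usual
pinv-style default:

```python
import numpy as np
from scipy import linalg

rng = np.random.default_rng(2)
A = rng.standard_normal((6, 6))
A = A @ A.T + np.eye(6)             # a well-conditioned symmetric example
B = rng.standard_normal((6, 3))

# SVD: A = U diag(s) Vt, so inv(A) @ B = V diag(1/s) U' B,
# zeroing tiny singular values (this is what pinv does internally)
U, s, Vt = np.linalg.svd(A)
tol = s.max() * max(A.shape) * np.finfo(s.dtype).eps
s_inv = np.where(s > tol, 1.0 / s, 0.0)
AinvB_svd = Vt.T @ (s_inv[:, None] * (U.T @ B))

# QR: A = Q R, so inv(A) @ B solves R X = Q' B by back-substitution
Q, R = np.linalg.qr(A)
AinvB_qr = linalg.solve_triangular(R, Q.T @ B)

# quadratic form B' inv(A) B, still with no explicit inverse
quad = B.T @ AinvB_qr

assert np.allclose(AinvB_svd, AinvB_qr)
assert np.allclose(AinvB_svd, np.linalg.solve(A, B))
```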
Josef
>> Sturla
> _______________________________________________
> SciPy-User mailing list
> SciPy-User@scipy.org
> http://mail.scipy.org/mailman/listinfo/scipy-user