mathjs: `eigs` throws on a basic case (apparently any defective matrix)

Describe the bug

Trying to find the eigenvalues and eigenvectors of the matrix [[2, 1], [0, 2]] throws an error. Expected result: an eigenvector of [1, 0] with eigenvalue 2.

To Reproduce

import { eigs } from "mathjs";

const A = [ [5, 2.3], [2.3, 1] ]; // symmetric, non-defective: eigs handles this fine
const B = [ [2.0, 1.0], [0.0, 2.0] ]; // defective: eigs throws on this one

console.log(eigs(A));
// console.log(eigs(B)); // uncomment to reproduce the error

Uncommenting the last line reproduces the error. (Matrix A is included to confirm that eigs at least runs fine on some input.)
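For reference, a quick check by hand shows why B is the problematic case: det(B - eI) = (2 - e)^2, so e = 2 is an eigenvalue with algebraic multiplicity 2, while B - 2I = [[0, 1], [0, 0]] has rank 1, so its null space is one-dimensional and [1, 0] is the only independent eigenvector (geometric multiplicity 1). That mismatch is exactly what makes B defective.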

About this issue

  • State: closed
  • Created a year ago
  • Comments: 31 (20 by maintainers)

Most upvoted comments

The first thing that would need to be done (to make things a little less confusing) is to normalise the vectors - basically divide each vector by its last element so that it is 1 (this seems to be done universally, and it makes sense, since we then get something like ax = y instead of ax = by)

I beg to differ: that is not the first thing that should be done in response to this bug. It is something we might want to do at some point, and there is a very reasonable argument for it, but it is a breaking change to working code. The PR that addresses this bug should definitely not change any working test cases for eigs, only add new test cases that failed before but now pass. Then we can open a new issue about eigs normalization. For that, my advice would be a two-step process: first allow normalization via a flag so it is non-breaking, and later remove the unnormalized version in a breaking change. Jos is generally in favor of a more direct, breaking way of moving forward, though, so we would probably do it that way. But that's a separate discussion to have at a separate time.
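For what it's worth, here is a minimal sketch of the normalization being discussed, written as a standalone helper rather than as a change to eigs itself (the function name is made up for illustration, and dividing by the last component only works when that component is nonzero, which is not guaranteed - e.g. [1, 0] for matrix B above):

import { divide } from "mathjs";

// Hypothetical helper, not part of the mathjs API: rescale an eigenvector so
// that its last component equals 1. Only valid when that component is nonzero.
function normalizeLastComponent(vector) {
  return divide(vector, vector[vector.length - 1]);
}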

As for inverseIterate throwing an error: it happens when we find one vector via usolve and it is passed further (because it looks like [[a], [b]] instead of [a, b]). It is very easy to fix, but it doesn't resolve any issue, sadly

As for defective cases: apparently if we calculate (A - eI) and compute its rank, we can determine whether or not we get linearly independent eigenvectors.

Agreed on both counts. And in fact we are already doing part of the rank computation (for the submatrices of the almost-triangular form of A that correspond to each eigenvalue in turn). We certainly find out when those submatrices are not full rank (well, by crashing at the moment, but still). So we know when any eigenvalue has lower geometric multiplicity than algebraic multiplicity. That's why I say that once we can finally settle on a desired eigs interface, it will not be too much work to fix the current implementation.
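As a rough illustration of the rank idea (a sketch only, not what eigs does internally): estimate the rank of (A - eI) with a naive Gaussian elimination and take the geometric multiplicity as n - rank. The helper names and the tolerance are arbitrary choices made for this example.

import { subtract, multiply, identity } from "mathjs";

// Naive rank estimate via Gaussian elimination with partial pivoting.
// eps is an illustrative tolerance, not a carefully chosen one.
function approxRank(matrix, eps = 1e-10) {
  const m = matrix.map(row => row.slice());
  const rows = m.length, cols = m[0].length;
  let rank = 0;
  for (let col = 0; col < cols && rank < rows; col++) {
    let pivot = rank;
    for (let r = rank + 1; r < rows; r++) {
      if (Math.abs(m[r][col]) > Math.abs(m[pivot][col])) pivot = r;
    }
    if (Math.abs(m[pivot][col]) < eps) continue; // no usable pivot in this column
    [m[rank], m[pivot]] = [m[pivot], m[rank]];
    for (let r = rank + 1; r < rows; r++) {
      const f = m[r][col] / m[rank][col];
      for (let c = col; c < cols; c++) m[r][c] -= f * m[rank][c];
    }
    rank++;
  }
  return rank;
}

// Geometric multiplicity of eigenvalue lambda: n - rank(A - lambda * I).
function geometricMultiplicity(A, lambda) {
  const n = A.length;
  const shifted = subtract(A, multiply(lambda, identity(n))).valueOf();
  return n - approxRank(shifted);
}

// For B = [[2, 1], [0, 2]] and lambda = 2 this gives 1, while the algebraic
// multiplicity is 2, which is the defective situation discussed above.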

Writing everything from scratch made me finally understand what is going on and what we are “sitting on”.

So we do support all non-defective cases, and we support some defective cases to some degree.

The first thing that would need to be done (to make things a little less confusing) is to normalise the vectors - basically divide each vector by its last element so that it is 1 (this seems to be done universally, and it makes sense, since we then get something like ax = y instead of ax = by)

As for inverseIterate throwing an error: it happens when we find one vector via usolve and it is passed further (because it looks like [[a], [b]] instead of [a, b]). It is very easy to fix, but it doesn't resolve any issue, sadly

As for defective cases: apparently if we calculate (A - eI) and compute its rank, we can determine whether or not we get linearly independent eigenvectors. But as for actually calculating them, I still don't know a universal algorithm

Side note: it seems that our eigs algorithm effectively does a Schur decomposition, so my suggestion is to move most of the code to schur.js. Although I would need to test it a lot, since with so many different cases it might have been a fluke…

I have almost finished fixing the current implementation:

  • fixed Schur decomposition
  • get eigenvalues correctly for ALL cases

I am still struggling with the eigenvectors, though. There are so many cases…

  • technically the Q matrix of the Schur decomposition (A = QUQ^T) should have the eigenvectors as columns, but it seems that is not always the case…
  • then there are the power method and the inverse power method, but they only give one eigenvalue/eigenvector pair
  • the inverse power method with shifting fails when the shifted matrix has determinant 0 (which is almost always the case here, since the shift is an exact eigenvalue…)

If anyone has some iterative methods or resources for getting eigenvectors when the eigenvalues are already known, I would be grateful.
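In case it helps the search: a minimal sketch of textbook shifted inverse iteration, under the assumption that the eigenvalue is already known. The usual workaround for the 'determinant is 0' problem is to perturb the shift slightly so that (A - shift*I) stays invertible; eps, maxIter and the function name here are illustrative choices, not anything in mathjs.

import { subtract, multiply, identity, lusolve, flatten, norm, divide } from "mathjs";

// Inverse iteration for an (approximately) known eigenvalue lambda.
// Shifting by lambda + eps keeps (A - shift * I) invertible even when lambda
// is an exact eigenvalue of A.
function inverseIteration(A, lambda, { eps = 1e-8, maxIter = 50 } = {}) {
  const n = A.length;
  const shifted = subtract(A, multiply(lambda + eps, identity(n)));
  let v = Array(n).fill(1); // arbitrary nonzero start vector
  for (let i = 0; i < maxIter; i++) {
    const w = flatten(lusolve(shifted, v)).valueOf(); // solve (A - shift*I) w = v
    v = divide(w, norm(w)); // renormalize each step
  }
  return v; // approximates an eigenvector for lambda
}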

Also @josdejong, it appears that I misunderstood what SVD does. It only helps with classifying matrices, but does not really help in finding eigenvalues/eigenvectors (you get the eigenvalues of AA* and A*A, which is something else entirely).

I am not experienced in the numerical aspects of eigenvalue/eigenvector computation. The current algorithm appears to be operable, so I am reluctant to change it given my lack of expertise. Therefore, my plan is to add detection of defective matrices to the current algorithm, and to improve the interface so that it is clear how to obtain the eigenvectors and which eigenvalue each eigenvector is associated with. Anyone who is experienced in such numerical matters is welcome to submit a PR providing another method of producing eigenvectors and eigenvalues, along with tests that demonstrate that it behaves better/faster/more accurately/etc. than the current algorithm. But that's beyond the scope of what I am able to do.

If you have comments here on what exactly the eigs function should return in the case of a defective matrix, please share them so that I can proceed with resolving this issue. Thanks.
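Purely as a strawman for that question (the property names below are made up for discussion, not an existing or promised mathjs interface): one option is to pair each eigenvector explicitly with its eigenvalue, so that a defective matrix simply yields fewer eigenvector entries than eigenvalues.

// Strawman only - not the actual mathjs return shape.
// For B = [[2, 1], [0, 2]] this could look like:
const result = {
  values: [2, 2], // eigenvalues, repeated according to algebraic multiplicity
  eigenvectors: [
    { value: 2, vector: [1, 0] } // one entry per independent eigenvector, so a
    // defective matrix has fewer entries here than it has values above
  ]
};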

Looks like the math for the symmetric real case is broken; the values are just wrong. A revised version of #3016 should probably fix most of the issues, since it uses the QR algorithm, which loves triangular matrices and works for symmetric matrices as well.
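For readers who have not seen it, a bare-bones sketch of the unshifted QR iteration being referred to; real implementations (presumably including #3016) add a Hessenberg reduction, shifts and deflation, which are omitted here, and the plain iteration does not converge for every matrix (e.g. ones with complex eigenvalue pairs).

import { qr, multiply } from "mathjs";

// Unshifted QR iteration: factor A_k = Q_k * R_k, then set A_{k+1} = R_k * Q_k.
// Each step is a similarity transform, so the eigenvalues are preserved, and for
// many real matrices A_k tends toward (quasi-)triangular form with the
// eigenvalues on the diagonal. maxIter is an illustrative choice.
function qrEigenvalues(A, maxIter = 200) {
  let Ak = A;
  for (let i = 0; i < maxIter; i++) {
    const { Q, R } = qr(Ak);
    Ak = multiply(R, Q);
  }
  return Ak.valueOf().map((row, i) => row[i]); // diagonal entries
}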