For any unit vector x there is a projection Ex such that Exx = x and Exy = 0 whenever y is orthogonal to x. This suggests that projections are useful in inner product spaces. If λi are the eigenvalues of M and Ei are the projections for the corresponding eigenvectors, then M = Σ λiEi. If xi are the components of a unit eigenvector then its projection has elements xixj. Much of this, I fear, is limited to Hermitian matrices.
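Here is a minimal sketch of the Hermitian case in Python with NumPy (my choice of tools, not part of the note), rebuilding a real symmetric M from its eigenvalues and the projections with elements xixj:

```python
import numpy as np

# Sketch (mine, not from the note): verify M = Σ λi Ei for a symmetric matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
M = A + A.T                          # symmetric, so the spectral theorem applies

lams, vecs = np.linalg.eigh(M)       # columns of vecs are orthonormal eigenvectors
total = np.zeros_like(M)
for lam, x in zip(lams, vecs.T):
    E = np.outer(x, x)               # projection with elements xi*xj
    assert np.allclose(E @ E, E)     # idempotent: E is a projection
    assert np.allclose(E @ x, x)     # Ex = x
    total += lam * E

assert np.allclose(total, M)         # M = Σ λi Ei
```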
One use of projections is as follows: use one of the methods that finds an eigenvector of a matrix M with a large eigenvalue λ. Subtract λ times the corresponding projection from M. The new M still has that eigenvector, but its eigenvalue is now 0, and you can iterate, finding the eigenvector with the largest eigenvalue of the new matrix.
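A minimal sketch of that deflation loop for a real symmetric matrix; the power-iteration routine, the Rayleigh-quotient step, and all the names are my own illustration, not the note's program:

```python
import numpy as np

def dominant_eigenpair(M, iters=1000):
    """Power iteration: return (eigenvalue, unit eigenvector) of largest |λ|."""
    x = np.random.default_rng(1).standard_normal(M.shape[0])
    for _ in range(iters):
        x = M @ x
        x /= np.linalg.norm(x)            # renormalize to avoid overflow
    return x @ M @ x, x                   # Rayleigh quotient gives λ

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 5))
M = A + A.T                               # symmetric test matrix
work = M.copy()
for _ in range(M.shape[0]):
    lam, x = dominant_eigenpair(work)
    print(lam)
    work = work - lam * np.outer(x, x)    # deflate: λ for this x becomes 0
```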
I hope soon to have some eigenvectors of general matrices and to test empirically how much of this logic works for non-Hermitian matrices.
In an inner product space (with the inner product denoted by juxtaposition of vector expressions), for every vector v ≠ 0 there is a projection E such that for every vector x, Ex = ((vx)/(vv))v.
If {bi} forms a basis for our space and bibj = δij we call this an orthonormal coördinate system. If {vi} are scalars and v = Σvibi then the matrix Eij = (vivj)/(vv) is the E that goes with the v above.
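A small sketch of that recipe in ordinary real coördinates (the particular numbers are arbitrary, chosen for illustration):

```python
import numpy as np

# Build Eij = vi*vj / (v·v) and check it agrees with Ex = ((vx)/(vv))v.
v = np.array([3.0, 1.0, 2.0])
E = np.outer(v, v) / (v @ v)

x = np.array([1.0, -4.0, 2.0])
assert np.allclose(E @ x, (v @ x) / (v @ v) * v)   # Ex = ((vx)/(vv))v
assert np.allclose(E @ E, E)                       # E is a projection
assert np.allclose(E @ v, v)                       # Ev = v
```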
To avoid exponent overflow, or underflow, I did a scalar normalization towards unity. This failed for many real matrices because they had complex eigenvalues and the high-power matrix was not nearly a projection. The complex eigenvalues of a real matrix come in conjugate pairs of equal modulus, so neither member of a pair can dominate and the normalized powers rotate rather than converge. The projection I sought was a matrix with complex coefficients and my program dealt only with real matrices; I needed symmetry breaking to get off of the real line.
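The following is my own reconstruction of that failure and of one way to break the symmetry; the rotation matrix, the trace scaling, and the iε perturbation are my choices, not the original program. A rotation by 1 radian has eigenvalues e^{+i} and e^{-i}, a conjugate pair of equal modulus, so its normalized high powers never settle down; adding iεI splits the pair in modulus and the high power approaches the complex projection.

```python
import numpy as np

def high_power(M, squarings=40):
    """Repeatedly square M, renormalizing toward unity against overflow."""
    P = M.astype(complex)
    for _ in range(squarings):
        P = P @ P
        P /= np.abs(P).max()             # scalar normalization toward unity
    return P

def projection_defect(P):
    """Distance of P from idempotency after scaling its trace to 1."""
    E = P / np.trace(P)
    return np.abs(E @ E - E).max()

# Rotation by 1 radian: real, with eigenvalues e^{+i} and e^{-i}.
M = np.array([[np.cos(1.0), -np.sin(1.0)],
              [np.sin(1.0),  np.cos(1.0)]])
print(projection_defect(high_power(M)))              # stays large: no projection

# Symmetry breaking: i*eps*I splits the conjugate pair in modulus,
# and the normalized high power now approaches the complex projection.
print(projection_defect(high_power(M + 1e-3j * np.eye(2))))   # near zero
```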