The fact that a real polynomial of degree 3 has a real root shows that for any real 3 by 3 matrix A there is a real λ such that the determinant |A − λI| = 0.
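As a quick numerical sketch of this claim (the random matrix and seed here are just an illustration): the characteristic polynomial of a real 3 by 3 matrix is a cubic with real coefficients, so at least one of its roots is real.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))        # an arbitrary real 3x3 matrix

# Coefficients of the degree-3 characteristic polynomial of A.
coeffs = np.poly(A)
roots = np.roots(coeffs)               # the eigenvalues of A

# A cubic with real coefficients always has at least one real root.
real_roots = roots[np.abs(roots.imag) < 1e-9].real
assert real_roots.size >= 1
lam = real_roots[0]
print("real eigenvalue:", lam)
print("det(A - lam*I) is approximately",
      np.linalg.det(A - lam * np.eye(3)))   # ~0, as claimed
```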
Using geometric intuition we argue here that there is a vector x such that (A − λI)x = 0, i.e. Ax = λx. In real 3D space, S2 is the unit sphere, the set of vectors x such that |x| = 1. The image of S2 under any linear transformation A is an ellipsoid centered at the origin. The volume within S2 is (4/3)π, and the volume of the image of the interior of S2 under A is (4/3)π|A|, since a linear map scales volumes by (the absolute value of) its determinant. If A is a rotation (as a matrix A is orthogonal and |A| = 1), then the image of S2 is S2 itself. When we choose λ so that |A − λI| = 0, the volume of the image of S2 under A − λI is 0. This image is thus a deflated ellipsoid, namely an elliptical disk in some 2D subspace. The image includes the origin: since A − λI is singular, its kernel is nontrivial, and a unit vector in that kernel lies on S2 and maps to the origin. That vector is the eigenvector we seek.
I think a more rigorous proof requires something like the Brouwer fixed point theorem.
Computationally, suppose the eigenvalue in question is 0 (otherwise shift A by the eigenvalue first). We might then invert the matrix A − λI for a sequence of λ's approaching 0. The inverses are matrices whose columns (or rows) consist of longer and longer vectors, and their normalized forms approach the eigenvector we seek. When there is more than one eigenvector for an eigenvalue, the columns will show them all.
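This shift-and-invert idea can be sketched as follows (the matrix A is the same assumed example as above; here the shift approaches the eigenvalue itself rather than 0, which amounts to the same thing after shifting). As the shift nears the eigenvalue, the columns of the inverse blow up, and a normalized column lines up with the eigenvector.

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 1.0]])

# Locate the real eigenvalue and its eigenvector, for comparison.
eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmin(np.abs(eigvals.imag)))
lam = eigvals[k].real
v = eigvecs[:, k].real
v /= np.linalg.norm(v)

errs = []
for eps in (1e-2, 1e-4, 1e-6):
    # Shift slightly off the eigenvalue so A - mu*I is still invertible.
    inv = np.linalg.inv(A - (lam + eps) * np.eye(3))
    col = inv[:, 0]                      # columns blow up as eps -> 0
    col = col / np.linalg.norm(col)
    # Compare with the true eigenvector, up to sign.
    errs.append(min(np.linalg.norm(col - v), np.linalg.norm(col + v)))
    print(f"eps={eps:.0e}  alignment error={errs[-1]:.2e}")
```

The alignment error shrinks roughly in proportion to the shift, so each smaller eps gives a column closer to the eigenvector direction.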
This construction works in any odd number of dimensions. It also works for arbitrary real matrices A; they need not be orthogonal.
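A quick empirical check of the odd-dimension claim (random 5 by 5 matrices with an arbitrary seed, purely as an illustration): the characteristic polynomial of a real 5 by 5 matrix has odd degree, so a real eigenvalue always exists, orthogonal or not.

```python
import numpy as np

rng = np.random.default_rng(1)
for _ in range(100):
    A = rng.standard_normal((5, 5))              # arbitrary real matrix
    eigvals = np.linalg.eigvals(A)
    real_ones = eigvals[np.abs(eigvals.imag) < 1e-9]
    assert real_ones.size >= 1                   # odd degree => real root
print("every trial had at least one real eigenvalue")
```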