Matrices: Eigenvectors and Eigenvalues
In this post, I would like to give the first elements that a newcomer must know in order to understand the notions of eigenvectors and eigenvalues of a matrix. In my opinion, before going any further, one should master this subject for a matrix of order 2.
A matrix is the memory of a linear transformation from one vector space (the departure space) into another (the arrival space). Its columns record the transforms of the canonical basis of the departure space. As you can see in the figure below, the first column of the matrix A is the transform of the first vector e_1 of the canonical basis, and the second column is the transform of the second vector e_2.
The transforms of the other vectors of the departure space are linear combinations of the columns of the matrix A.
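Here is a small numerical sketch of this fact (my own example matrix, assuming NumPy; it is not code from the original post):

```python
import numpy as np

# Hypothetical example matrix, chosen only for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

print(A @ e1)   # first column of A: [2. 1.]
print(A @ e2)   # second column of A: [1. 3.]

# Any other vector is sent to the corresponding linear combination of the columns.
x = np.array([4.0, -2.0])
print(A @ x)                              # [ 6. -2.]
print(x[0] * A[:, 0] + x[1] * A[:, 1])    # same result
```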
An eigenvector of a matrix is a vector that the matrix transforms proportionally to itself. The coefficient of proportionality is called the eigenvalue of the matrix associated with that eigenvector.
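As a concrete illustration (an example of my own choosing, not one of the post's figures):

$$A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}, \qquad v = \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \qquad Av = \begin{pmatrix} 3 \\ 3 \end{pmatrix} = 3v,$$

so v is an eigenvector of A with eigenvalue 3.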
If the vectors of the canonical basis are eigenvectors, the matrix is diagonal, i.e. it has the following form:
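For a matrix of order 2, this diagonal form is

$$A = \begin{pmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{pmatrix}, \qquad A e_1 = \lambda_1 e_1, \qquad A e_2 = \lambda_2 e_2.$$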
In diagonal form, matrices are very nice and easy to handle. The question for a given matrix is therefore: does there exist a basis of eigenvectors such that, in this basis, the matrix becomes diagonal? This is called the problem of diagonalization. In general, the answer is no!
Let’s start by showing an example of a matrix that doesn’t admit any real eigenvalue, and thus doesn’t transform any vector proportionally to itself. Such a matrix must rotate the vectors. For example, the 90° rotation:
The matrix A sends the first vector of the canonical basis, e_1, to e_2, and the second vector, e_2, to -e_1. The matrix A has no real eigenvector.
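In matrix form, this rotation is

$$A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix},$$

and a quick verification confirms the claim: if Av = \lambda v for a real \lambda and a nonzero v = (x, y), then -y = \lambda x and x = \lambda y, hence x = -\lambda^2 x and y = -\lambda^2 y, which forces \lambda^2 = -1; no real \lambda satisfies this.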
Let us now give a family of matrices that have one or two eigenvectors:
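As an illustration of each case (these two examples are my own and may differ from the family shown in the post's figure): the shear matrix

$$\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$$

has a single eigenvector direction, spanned by e_1 (eigenvalue 1), while the symmetric matrix

$$\begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$$

has two independent eigenvectors, (1, 1) and (1, -1), with eigenvalues 3 and 1.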
Let us now deal with the general case. It should be remembered that an eigenvector is transformed proportionally to itself (possibly with a reversal of direction, i.e. a rotation by 180°). Keeping this in mind, we propose below a new way to find eigenvectors and eigenvalues. The classical method consists in finding the eigenvalues as the roots of the characteristic polynomial. In this post, I will show another path based on proportionality and orthogonality. Unlike the classical method, it gives an equation that finds the eigenvectors first, instead of the eigenvalues.
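Here is a sketch of how this condition can be written for a matrix of order 2 (my reconstruction; the post's own figures may present it differently). Write

$$A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$$

and look for an eigenvector of the form v = (1, r), i.e. with a nonzero first component (the remaining candidate, e_2, is an eigenvector exactly when b = 0). Then Av = (a + br, c + dr), and Av is proportional to v exactly when Av is orthogonal to (r, -1):

$$r(a + br) - (c + dr) = 0 \quad \Longleftrightarrow \quad b r^2 + (a - d) r - c = 0,$$

and the corresponding eigenvalue is \lambda = a + br.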
It is well known that symmetric matrices are diagonalizable and that their eigenvectors are orthogonal. Below, we deduce this result and show something more general: a matrix whose anti-diagonal coefficients have the same sign (or are both zero) is diagonalizable:
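A sketch of the argument, using the equation above (my reconstruction): the quadratic b r^2 + (a - d) r - c = 0 has discriminant

$$\Delta = (a - d)^2 + 4bc.$$

If the anti-diagonal coefficients b and c are both zero, A is already diagonal. If they are nonzero and have the same sign, then bc > 0, so \Delta > 0 and the equation has two distinct real roots r, giving two eigenvectors with distinct eigenvalues \lambda = a + br: the matrix is diagonalizable. The symmetric case c = b gives bc = b^2 \ge 0, which recovers the classical result.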
In books and in classrooms, the search for eigenvalues and eigenvectors is based on the characteristic polynomial of matrix A. The eigenvalues lambda are the roots (solutions) of the following algebraic equation:
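For a matrix of order 2, this equation is

$$\lambda^2 - \mathrm{tr}(A)\,\lambda + \det(A) = 0,$$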
where tr(A) is the trace of A and det(A) is the determinant of A. There is a relationship between the equation for the eigenvalues and the equation for the eigenvectors:
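One way to see this relationship (my reconstruction, which may differ from the figure in the post): when b \neq 0, substitute \lambda = a + br, i.e. r = (\lambda - a)/b, into the eigenvector equation b r^2 + (a - d) r - c = 0 and multiply by b:

$$(\lambda - a)^2 + (a - d)(\lambda - a) - bc = 0 \quad \Longleftrightarrow \quad \lambda^2 - (a + d)\lambda + (ad - bc) = 0,$$

which is exactly \lambda^2 - \mathrm{tr}(A)\lambda + \det(A) = 0.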
I think we can use the proportionality and orthogonality method for matrices of higher dimension. For example, in dimension 3 we can work with the vector v = (1, r_1, r_2) and its orthogonal vectors (r_1, -1, 0) and (r_2, 0, -1). Maybe I will present this in a future post.
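In the meantime, here is a rough sketch of how the two orthogonality conditions could be set up symbolically in dimension 3 (my own code, assuming SymPy and a hypothetical example matrix; it only covers eigenvectors whose first component is nonzero):

```python
import sympy as sp

r1, r2 = sp.symbols('r1 r2')

# Hypothetical 3x3 example matrix, chosen only for illustration.
A = sp.Matrix([[2, 1, 0],
               [1, 3, 1],
               [0, 1, 2]])

v = sp.Matrix([1, r1, r2])   # candidate eigenvector with nonzero first component
Av = A * v

# Av is proportional to v exactly when it is orthogonal to (r1, -1, 0) and (r2, 0, -1).
eq1 = sp.Eq(r1 * Av[0] - Av[1], 0)
eq2 = sp.Eq(r2 * Av[0] - Av[2], 0)

for sol in sp.solve([eq1, eq2], [r1, r2], dict=True):
    lam = sp.simplify(Av[0].subs(sol))   # the eigenvalue is the first component of Av
    print('eigenvector (1, {}, {}) with eigenvalue {}'.format(sol[r1], sol[r2], lam))
```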