All \(n\) by \(n\) matrices have \(n\) eigenvalues, counted with multiplicity (some may be complex).

Assume:

  • \(\lambda\) is our eigenvalue (it is also a scalar)
  • \(\vec{x}\) is our eigenvector
  • \(A\) is an \(n\) by \(n\) matrix.

\[\begin{align*} A\vec{x} &= \lambda\vec{x}\\ A\vec{x} - \lambda\vec{x} &= 0\\ (A - \lambda I)\vec{x} &= 0 \end{align*}\]

Eigenvectors are defined as non-zero, so we are not interested in the case when \(\vec{x} = 0\).

\(\vec{x}\) is some non-zero vector in the nullspace of \((A - \lambda{}I)\). If \((A - \lambda{}I)\) has vectors other than \(0\) in its nullspace, it must be singular.

We can find the singular matrices with \[\det(A-\lambda I) = 0\]

Example of Finding the Eigenvectors

Start by finding the eigenvalues for \(A=\begin{bmatrix}3 & 1 \\1 & 3\end{bmatrix}\)

\(A\) is set up to be symmetric, so the eigenvalues will be real numbers.

\[ \begin{align*} 0 &= \det(A-\lambda{}I)\\ 0 &= \begin{vmatrix} 3-\lambda & 1 \\ 1 & 3 - \lambda \end{vmatrix}\\ 0 &=(3-\lambda{})^2-1\\ 0 &= 9 - 6\lambda{} + \lambda{}^2 - 1\\ 0 &= \lambda{}^2 - 6\lambda + 8\\ 0 &= (\lambda - 4)(\lambda - 2) \end{align*} \]

So \(\lambda{}_1=2\) and \(\lambda{}_2=4\). Now we can plug both values of \(\lambda\) into \((A-\lambda{}I)\):

\[\begin{align*} \begin{bmatrix} 3 & 1 \\ 1 & 3 \end{bmatrix} - \begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix} &= \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix} \\ \begin{bmatrix} 3 & 1 \\ 1 & 3 \end{bmatrix} - \begin{bmatrix} 4 & 0 \\ 0 & 4 \end{bmatrix} &= \begin{bmatrix} -1 & 1 \\ 1 & -1 \end{bmatrix} \\ \end{align*}\]

And solve \((A-\lambda{}I)\vec{x}=0\):

\[\begin{align*} \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix}\vec{x}_1=0 &\implies \vec{x}_1= \begin{bmatrix}1\\-1\end{bmatrix}\\ \begin{bmatrix} -1 & 1 \\ 1 & -1 \end{bmatrix}\vec{x}_2=0 &\implies \vec{x}_2= \begin{bmatrix}1\\1\end{bmatrix} \end{align*}\]
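We can check this example numerically. The sketch below assumes NumPy is available; note that `np.linalg.eig` returns unit-length eigenvectors, so they may be scaled and ordered differently from the hand computation.

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

# eig returns the eigenvalues and the eigenvectors as columns.
eigenvalues, eigenvectors = np.linalg.eig(A)
print(sorted(eigenvalues))   # approximately [2.0, 4.0]

# Verify A x = lambda x for each eigenpair.
for lam, x in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ x, lam * x)
```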

Example of a Degenerate Matrix

Notice that the eigenvectors in the first example are independent. Not all matrices have independent eigenvectors.

\[ A = \begin{bmatrix}3 & 1 \\ 0 & 3\end{bmatrix} \]

We can read the eigenvalues directly off a triangular matrix. To see why, try solving \(\det(A-\lambda{}I)=0\):

\[\begin{equation*} \begin{vmatrix} 3-\lambda & 1 \\ 0 & 3-\lambda \end{vmatrix}=0 \end{equation*}\]

Remember, the determinant of a triangular matrix is the product of the entries down the diagonal.

\((3-\lambda )(3-\lambda )=0 \implies \lambda{}_1 = 3, \lambda{}_2 = 3\)

Repeated eigenvalues are not a problem. The problem comes when we try to solve \((A-\lambda{}I)\vec{x}=0\)

\[\begin{equation*} \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix} \vec{x} = 0 \end{equation*}\]

There is only one independent solution: \(\vec{x}=\begin{bmatrix}1\\0\end{bmatrix}\). We cannot diagonalize \(A\).
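The degeneracy also shows up numerically: a NumPy sketch (again assuming `np.linalg.eig`) returns two parallel columns, so the would-be eigenvector matrix is singular.

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)   # [3. 3.] -- the repeated eigenvalue

# The two returned columns are (numerically) parallel, so a matrix S
# built from them is singular: its determinant is essentially zero.
print(np.linalg.det(eigenvectors))
```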

Diagonalization \(S^{-1}AS= \Lambda{}\)

Assume \(A\) has \(n\) linearly independent eigenvectors, and place them in the columns of a matrix \(S\):

\[\begin{equation*} S = \begin{bmatrix} & & \\ \vec{x}_1 & \cdots & \vec{x}_n \\ & & \end{bmatrix} \end{equation*}\]

What happens when we take \(AS\)?

\[\begin{equation*} AS = A\begin{bmatrix} & & \\ \vec{x}_1 & \cdots & \vec{x}_n \\ & & \end{bmatrix} = \begin{bmatrix} & & \\ \lambda{}_1\vec{x}_1 & \cdots & \lambda{}_n\vec{x}_n \\ & & \end{bmatrix} \end{equation*}\]

Remember that each column of \(AS\) is \(A\) times a column of \(S\), which is a linear combination of the columns of \(A\). Because \(\vec{x}_1\) is an eigenvector, the first column of \(AS\) is \(A\vec{x}_1 = \lambda{}_1\vec{x}_1\), and the subsequent columns follow the same pattern.

Now we want to factor the \(\lambda{}\)s out of \(AS\).

\[\begin{equation*} AS = \begin{bmatrix} & & \\ \lambda{}_1\vec{x}_1 & \cdots & \lambda{}_n\vec{x}_n \\ & & \end{bmatrix} = \begin{bmatrix} & & \\ \vec{x}_1 & \cdots & \vec{x}_n \\ & & \end{bmatrix} \begin{bmatrix} \lambda{}_1 & & \\ & \ddots & \\ & & \lambda{}_n \end{bmatrix} \end{equation*}\]

We will call this last diagonal matrix \(\Lambda\), and we can now say \(AS=S\Lambda\), which gives us the following two equations:

\[S^{-1}AS = \Lambda\] \[S\Lambda{}S^{-1}=A\]

Remember: we can only invert \(S\) if we have \(n\) independent eigenvectors

Which gives us the much sought-after equation for powers of \(A\):

\[ A^2= S\Lambda{}S^{-1}S\Lambda{}S^{-1}=S\Lambda{}^2S^{-1} \implies A^k=S\Lambda{}^kS^{-1} \]
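A quick numerical check of \(A^k = S\Lambda{}^kS^{-1}\), reusing the symmetric example from above (a NumPy sketch):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

lam, S = np.linalg.eig(A)        # eigenvectors in the columns of S
Lam = np.diag(lam)               # the diagonal matrix Lambda

# A^5 two ways: repeated multiplication vs. S Lambda^5 S^{-1}.
direct = np.linalg.matrix_power(A, 5)
via_diag = S @ np.linalg.matrix_power(Lam, 5) @ np.linalg.inv(S)
assert np.allclose(direct, via_diag)
```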

Theorem: \(A^k \rightarrow 0\) as \(k \rightarrow \infty\) if all \(|\lambda{}_i| < 1\)
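To see the theorem numerically, we can build an \(A\) with prescribed eigenvalues (the values 0.9 and 0.5 and the matrix \(S\) below are chosen purely for illustration):

```python
import numpy as np

# Construct A = S Lambda S^{-1} with eigenvalues 0.9 and 0.5.
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])
Lam = np.diag([0.9, 0.5])
A = S @ Lam @ np.linalg.inv(S)

# Because both |lambda| < 1, high powers of A shrink toward zero.
print(np.abs(np.linalg.matrix_power(A, 200)).max())
```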

Eigenvalues, Eigenvectors of \(A^2\)

If we have \(A\) with eigenvalues \(\lambda{}_1 \ldots \lambda{}_n\):

  • The eigenvalues of \(A^2\) are \((\lambda{}_1)^2 \ldots (\lambda{}_n)^2\)
  • The eigenvectors of \(A^2\) are the same as the eigenvectors of \(A\)

Said another way:

If \(A\vec{x} = \lambda \vec{x}\) then \(A^2\vec{x} = \lambda A\vec{x} = \lambda{}^2\vec{x}\)
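Both claims are easy to verify numerically (a NumPy sketch, using the earlier example matrix):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

lam, X = np.linalg.eig(A)
lam_sq, _ = np.linalg.eig(A @ A)

# The eigenvalues of A^2 are the squares of the eigenvalues of A...
assert np.allclose(sorted(lam ** 2), sorted(lam_sq))

# ...and the eigenvectors of A carry over: (A^2) x = lambda^2 x.
assert np.allclose(A @ A @ X, X * lam ** 2)
```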

Facts

  • The sum of the \(n\) eigenvalues is equal to the trace of the matrix (the sum down the diagonal).
  • The product of the \(n\) eigenvalues (counted with multiplicity) is equal to the determinant of the matrix.
  • A triangular matrix \(U\) has its eigenvalues along the diagonal - they are the pivots.
  • \(A\) has one or more eigenvalues \(\lambda =0\) exactly when \(A\) is singular
  • We can multiply eigenvectors by any non-zero constant, and \(A\vec{x} = \lambda \vec{x}\) will remain true
  • Symmetric matrices always have real eigenvalues
  • Elimination changes eigenvalues
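The first two facts are easy to check numerically (a NumPy sketch, using the example matrix from above):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

lam = np.linalg.eigvals(A)

# Sum of eigenvalues equals the trace: 4 + 2 == 3 + 3.
assert np.isclose(lam.sum(), np.trace(A))

# Product of eigenvalues equals the determinant: 4 * 2 == 9 - 1.
assert np.isclose(lam.prod(), np.linalg.det(A))
```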

Facts About Diagonalizing

  • A matrix with fewer than \(n\) independent eigenvectors cannot be diagonalized
  • \(A\) is sure to have \(n\) independent eigenvectors (and be diagonalizable) if all the \(\lambda{}\)s are different (no repeated \(\lambda{}\)s).
  • If \(A\) has repeated eigenvalues, it may or may not be diagonalizable.