An eigenvector \(x\) lies along the same line as \(Ax\): \[Ax = \lambda x\] The scalar \(\lambda\) is the eigenvalue.
Linear Transformation Properties: If \(Ax = \lambda x\), then:

* \(A^2 x = \lambda^2 x\), since \(A(Ax) = A(\lambda x) = \lambda Ax = \lambda^2 x\).
* \(A^{-1} x = \lambda^{-1} x\), provided \(A\) is invertible.
* \((A + cI)x = (\lambda + c)x\) for any scalar \(c\): shifting \(A\) by \(cI\) shifts every eigenvalue by \(c\).

(These are verified numerically in the sketch after this summary.)
Singularity: If \(Ax = \lambda x\) for a nonzero vector \(x\), then \((A - \lambda I)x = 0\). A nonzero vector in the nullspace means the matrix \((A - \lambda I)\) is singular: \[\det(A - \lambda I) = 0\]
The Two “Quick Checks”: the sum of the eigenvalues equals the trace of \(A\) (the sum of its diagonal entries), and the product of the eigenvalues equals \(\det A\).

Special Matrices: projection, reflection, and Markov matrices have eigenvalues that can be read off directly; worked examples appear below.
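Here is a quick R sanity check of the transformation properties above; the \(2 \times 2\) matrix is chosen arbitrarily for illustration:

```r
# Arbitrary symmetric 2x2 matrix, chosen only for illustration
A <- matrix(c(2, 1, 1, 2), nrow = 2)
e      <- eigen(A)
lambda <- e$values[1]     # largest eigenvalue (3)
x      <- e$vectors[, 1]  # its eigenvector

# A^2 x = lambda^2 x
all.equal(as.vector(A %*% A %*% x), lambda^2 * x)                # TRUE
# A^{-1} x = (1/lambda) x
all.equal(as.vector(solve(A) %*% x), (1 / lambda) * x)           # TRUE
# (A + cI) x = (lambda + c) x, with c = 3
all.equal(as.vector((A + 3 * diag(2)) %*% x), (lambda + 3) * x)  # TRUE
```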
Almost all vectors change direction when multiplied by a matrix \(A\). However, certain exceptional vectors \(x\) stay in the same direction. These are the eigenvectors.
\[Ax = \lambda x\]
The eigenvalue \(\lambda\) tells us whether the special vector \(x\) is stretched, shrunk, reversed, or left unchanged.
Consider the matrix: \[A = \begin{pmatrix} 0.8 & 0.3 \\ 0.2 & 0.7 \end{pmatrix}\]
To find the eigenvalues, we solve the characteristic equation \(\det(A - \lambda I) = 0\): \[\det \begin{pmatrix} 0.8 - \lambda & 0.3 \\ 0.2 & 0.7 - \lambda \end{pmatrix} = \lambda^2 - 1.5\lambda + 0.5 = 0\]
Factoring the quadratic: \((\lambda - 1)(\lambda - 0.5) = 0\). The eigenvalues are \(\lambda_1 = 1\) and \(\lambda_2 = 0.5\).
We can use R's built-in `eigen()` function to verify these results.
```r
# Define the matrix A (entries supplied column by column)
A <- matrix(c(0.8, 0.2, 0.3, 0.7), nrow = 2, ncol = 2)

# Calculate eigenvalues and eigenvectors
results <- eigen(A)

# Output results
cat("Eigenvalues:\n")
print(results$values)
cat("\nEigenvectors (as columns):\n")
print(results$vectors)
```

```
## Eigenvalues:
## [1] 1.0 0.5
##
## Eigenvectors (as columns):
##           [,1]       [,2]
## [1,] 0.8320503 -0.7071068
## [2,] 0.5547002  0.7071068
```
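As a final check, each eigenpair should satisfy \(Ax = \lambda x\). For the first pair, continuing from the block above:

```r
# Confirm A x = lambda x for the first eigenpair
lambda1 <- results$values[1]   # 1.0
x1      <- results$vectors[, 1]
A %*% x1        # (0.8320503, 0.5547002)
lambda1 * x1    # the same vector, since lambda1 = 1
```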
A projection matrix \(P\) projects any vector onto a subspace (e.g., a line or a plane).

* If \(x\) is in the column space, it doesn’t move: \(Px = x\) (\(\lambda = 1\)).
* If \(x\) is in the nullspace, it is crushed to zero: \(Px = 0\) (\(\lambda = 0\)).
A reflection matrix flips vectors across a line.

* Eigenvectors sitting on the reflection line are unchanged: \(\lambda = 1\).
* Eigenvectors perpendicular to the line are reversed: \(\lambda = -1\).
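Both claims are easy to verify numerically. The sketch below builds the projection onto the line through \((1, 1)\) and the corresponding reflection via the identity \(R = 2P - I\); the matrices are constructed here purely for illustration:

```r
a <- c(1, 1)
P <- outer(a, a) / sum(a^2)   # projection onto the line through (1, 1)
eigen(P)$values               # 1 and 0 (up to rounding)

R_refl <- 2 * P - diag(2)     # reflection across the same line
eigen(R_refl)$values          # 1 and -1
```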
These properties serve as vital checks for manual calculations:

1. Trace check: the sum of the eigenvalues equals the trace of \(A\) (the sum of its diagonal entries). For Example 1: \(\lambda_1 + \lambda_2 = 1 + 0.5 = 1.5 = 0.8 + 0.7\).
2. Determinant check: the product of the eigenvalues equals \(\det A\). For Example 1: \(\lambda_1 \lambda_2 = 1 \times 0.5 = 0.5 = \det A\).
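In R, both checks are one-liners (reusing Example 1's matrix):

```r
A <- matrix(c(0.8, 0.2, 0.3, 0.7), nrow = 2)
lambda <- eigen(A)$values

sum(lambda)    # 1.5: equals the trace
sum(diag(A))   # 1.5
prod(lambda)   # 0.5: equals the determinant
det(A)         # 0.5
```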
To find eigenvalues for any \(n \times n\) matrix:

1. Subtract \(\lambda\) from the diagonal: form the matrix \((A - \lambda I)\).
2. Compute the determinant: find the polynomial \(\det(A - \lambda I)\).
3. Find the roots: solve \(\det(A - \lambda I) = 0\) for \(\lambda\).
For \(A = \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}\), \(\det(A - \lambda I) = \lambda^2 - 5\lambda = 0\). Thus \(\lambda_1 = 0\) and \(\lambda_2 = 5\). The zero eigenvalue is no accident: the second column is twice the first, so \(A\) is singular, and a singular matrix always has \(\lambda = 0\).
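A quick verification in R:

```r
B <- matrix(c(1, 2, 2, 4), nrow = 2)  # singular: columns are dependent
eigen(B)$values                       # 5 and 0
```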
Eigenvalues are not always real. Consider a \(90^\circ\) rotation matrix \(Q\): \[Q = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}\]
The characteristic equation is \(\lambda^2 + 1 = 0\), giving eigenvalues \(\lambda = i\) and \(\lambda = -i\). Since no real vector stays in the same direction after a 90-degree rotation, the eigenvectors must be complex.
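R handles the complex case directly, returning complex eigenvalues:

```r
Q <- matrix(c(0, 1, -1, 0), nrow = 2)  # 90-degree rotation
eigen(Q)$values                        # 0+1i and 0-1i
```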
A Markov matrix has columns that sum to 1 (like our Example 1).

* Steady state: there is always an eigenvalue \(\lambda = 1\). This represents the “long-term” behavior where the system stabilizes.
* Decaying modes: when the entries are all positive, every other eigenvalue satisfies \(|\lambda| < 1\), so its contribution dies out under repeated multiplication.
* Application: Google’s PageRank algorithm uses the eigenvector of a massive Markov matrix to determine website importance.
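The steady state can be seen by power iteration: repeatedly multiplying any starting probability vector by the Markov matrix kills the decaying mode and leaves the \(\lambda = 1\) eigenvector. A minimal sketch using Example 1's matrix:

```r
A <- matrix(c(0.8, 0.2, 0.3, 0.7), nrow = 2)
u <- c(1, 0)                  # any starting probability vector
for (k in 1:50) u <- A %*% u  # power iteration
u                             # converges to the steady state (0.6, 0.4)
```

This is, in miniature, what PageRank computes at web scale.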