J.P.
5/13/2022
We will use the Power Method to approximate the dominant eigenvalue and eigenvector of the matrix \[ A = \begin{bmatrix} 2 & -12 \\ 1 & -5 \end{bmatrix}. \] We start with an initial guess for the dominant eigenvector, \[ \mathbf{v}_0 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}. \] Repeatedly multiplying by \( A \) and factoring out one entry of the result gives \[ \mathbf{v}_1 = A\mathbf{v}_0 = \begin{bmatrix} 2 & -12 \\ 1 & -5 \end{bmatrix} \begin{bmatrix} 1 \\ 1 \end{bmatrix} = \begin{bmatrix} -10 \\ -4 \end{bmatrix} \Rightarrow -4 \begin{bmatrix} 2.50 \\ 1.00 \end{bmatrix}, \] and after repeating 6 times we get \[ \mathbf{v}_6 = A\mathbf{v}_5 = \begin{bmatrix} 2 & -12 \\ 1 & -5 \end{bmatrix} \begin{bmatrix} -280 \\ -94 \end{bmatrix} = \begin{bmatrix} 568 \\ 190 \end{bmatrix} \Rightarrow 190 \begin{bmatrix} 2.99 \\ 1.00 \end{bmatrix}. \] Scaled so that the second entry is 1, the iterates approach the dominant eigenvector \( \begin{bmatrix} 3 \\ 1 \end{bmatrix} \).
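For reference, a minimal Octave/MATLAB sketch of these six iterations (the names B and w are only illustrative, not part of the worked example) is:

B = [2, -12; 1, -5]; # the 2x2 example matrix
w = [1; 1]; # initial guess v_0
for k = 1:6
  w = B*w; # v_k = A*v_(k-1)
  fprintf('v_%d = [%d; %d] = %d*[%.2f; %.2f]\n', k, w(1), w(2), w(2), w(1)/w(2), 1);
end

Each line of output shows the new iterate and the same iterate rescaled by its second entry; after six steps the scaled vector is approximately \( \begin{bmatrix} 3 \\ 1 \end{bmatrix} \). The script below applies the same idea to a \( 3 \times 3 \) matrix.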
A = [0.49, 0.02, 0.22; 0.02, 0.28, 0.20; 0.22, 0.20, 0.40]
# Defining matrix A
v = [1;1;1] # initial guess for the dominant eigenvector
v = A*v; # first iteration
[M, I] = max(v); # location of the largest entry after the first iteration
v_scaled = v/v(I); # scale so that entry equals 1
itr = 1;
while itr < 11 # 10 more iterations after the first one
  v = A*v; # calculating the new vector
  v_scaled = v/v(I); # scaling the vector by the entry at index I
  itr = itr+1;
end
# Rayleigh quotient: lambda = (v'*A*v)/(v'*v)
s1 = A*v_scaled; # A*v
s2 = v_scaled'*s1; # numerator v'*(A*v), a scalar
s3 = v_scaled'*v_scaled; # denominator v'*v
lambda = s2/s3; # estimated dominant eigenvalue
fprintf('The dominant eigenvalue is %g and the dominant eigenvector is \n', lambda)
v_scaled
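The last few lines estimate the eigenvalue with the Rayleigh quotient, which for an approximate eigenvector \( \mathbf{v} \) of \( A \) is \[ \lambda \approx \frac{\mathbf{v}^{T} A \mathbf{v}}{\mathbf{v}^{T}\mathbf{v}}. \] Running the script produces the following output: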
A =
0.490000 0.020000 0.220000
0.020000 0.280000 0.200000
0.220000 0.200000 0.400000
v =
1
1
1
The dominant eigenvalue is 0.72 and the dominant eigenvector is
v_scaled =
0.9999
0.5001
1.0000
After 10 iterations the Power Method gave an estimated dominant eigenvector of \[ \begin{bmatrix} 0.9999 \\ 0.5001 \\ 1.0000 \end{bmatrix} \]
and an estimated dominant eigenvalue of \( 0.72 \) for the matrix
\[ A = \begin{bmatrix} 0.49 & 0.02 & 0.22 \\ 0.02 & 0.28 & 0.20 \\ 0.22 & 0.20 & 0.40 \end{bmatrix}. \]
Using analytical methods from linear algebra, the dominant eigenvector is \[ \begin{bmatrix} 1.0 \\ 0.5 \\ 1.0 \end{bmatrix} \]
with a corresponding eigenvalue of \( 0.72 \).
This results in a 0% error for the dominant eigenvalue and a 0.0022% error for the dominant eigenvector.
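As a quick check, a minimal Octave/MATLAB sketch using the built-in eig function confirms the analytical eigenpair:

A = [0.49, 0.02, 0.22; 0.02, 0.28, 0.20; 0.22, 0.20, 0.40];
[V, D] = eig(A); # columns of V are eigenvectors, D is diagonal with eigenvalues
[m, k] = max(abs(diag(D))); # index of the largest-magnitude eigenvalue
lambda_true = D(k,k) # expected: 0.72
[m, j] = max(abs(V(:,k))); # entry of that eigenvector with the largest magnitude
v_true = V(:,k)/V(j,k) # rescaled so that entry is 1; expected: [1; 0.5; 1]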
The Power Method is a quick and efficient numerical method for approximating the dominant eigenvalue and eigenvector of a square matrix. With 10 iterations we had a 0.0022% error for a \( 3 \times 3 \) matrix, and with more iterations the percent error decreases further. The drawback is that it can only find the dominant eigenvector and eigenvalue.