The rank of a matrix \(A\) can be defined as the dimension of the vector space spanned by the columns of \(A\) (column rank) or, equivalently, the dimension of the vector space spanned by the rows of \(A\) (row rank). If \(A\) is a nonsingular \(n \times n\) matrix, its rank is \(n\).
We will show
\[ \det(A) = \det \left[ \begin{array}{rrrr} 1 & 2 & 3 & 4 \\ -1 & 0 & 1 & 3 \\ 0 & 1 & -2 & 1 \\ 5 & 4 & -2 & -3 \\ \end{array} \right] \neq 0 \] This would imply the rank of \(A\) is 4. The calculation below shows \(\det(A) = -9 \neq 0\), which proves the rank of \(A\) is 4.
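The R chunk that produced the output below is not shown in the text; a minimal sketch that would reproduce it (the name A simply mirrors the matrix above) is:

A = matrix(c( 1,  2,  3,  4,
             -1,  0,  1,  3,
              0,  1, -2,  1,
              5,  4, -2, -3), nrow = 4, byrow = TRUE)
A
det(A)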
## [,1] [,2] [,3] [,4]
## [1,] 1 2 3 4
## [2,] -1 0 1 3
## [3,] 0 1 -2 1
## [4,] 5 4 -2 -3
## [1] -9
If a matrix \(A\) has dimensions \(m \times n\) where \(m > n\), the maximum rank of the matrix is \(n\).
The reason is that the maximum dimension of the column space spanned by the \(n\) column vectors is \(n\). Since the row rank and the column rank of the matrix \(A\) are equal, both must satisfy the same upper bound. This proves the claim.
The minimum rank of a non-zero matrix is 1.
Because the matrix \(A\) is non-zero, at least one entry \(a_{ij}\) is non-zero. This implies the \(j\)-th column vector \(A[,j]\) is non-zero, so the space it spans has dimension 1. Therefore the dimension of the space spanned by all of the column vectors is at least 1. This proves the result.
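As a quick numerical illustration of both bounds (the \(4 \times 2\) matrix M below and the use of qr() are my own additions, not from the original text), R reports a rank between the minimum of 1 and the maximum of \(n = 2\):

M = matrix(c(1, 2,
             2, 4,
             0, 1,
             3, 0), nrow = 4, byrow = TRUE)
qr(M)$rank   # 2, which satisfies 1 <= rank <= min(4, 2)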
We will show the rank of \(B\) is 1. The proof is that \(B\) is row-equivalent to a matrix with exactly one non-zero row vector.
\[ B = \left[ \begin{array}{rrr} 1 & 2 & 1 \\ 3 & 6 & 3 \\ 2 & 4 & 2 \\ \end{array} \right] \rightarrow_{-3R_1 + R_2} \left[ \begin{array}{rrr} 1 & 2 & 1 \\ 0 & 0 & 0 \\ 2 & 4 & 2 \\ \end{array} \right] \rightarrow_{-2R_1+R_3} \left[ \begin{array}{rrr} 1 & 2 & 1 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \\ \end{array} \right] \] The rank of \(B\) is preserved under elementary row operations. The above elementary row operations show that the row space of \(B\) is spanned by the single non-zero vector \((1, 2, 1)\). This means the rank of \(B\) is 1.
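This can be checked numerically in R (the qr() call below is my own addition; its $rank component gives the numerical rank):

B = matrix(c(1, 2, 1,
             3, 6, 3,
             2, 4, 2), nrow = 3, byrow = TRUE)
qr(B)$rank   # 1: every row is a multiple of (1, 2, 1)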
Compute the eigenvectors and eigenvalues of \(A\) where \[ A = \left[ \begin{array}{rrr} 1 & 2 & 3 \\ 0 & 4 & 5 \\ 0 & 0 & 6 \\ \end{array} \right] \]
We first write out the characteristic polynomial to solve for the eigenvalues. \[ \det(A - u I) = \det \left[ \begin{array}{rrr} 1 - u & 2 & 3 \\ 0 & 4 - u & 5 \\ 0 & 0 & 6 - u \\ \end{array} \right] = (1-u) \det \left[ \begin{array}{rr} 4 - u & 5 \\ 0 & 6 - u \\ \end{array} \right] = (1-u)[(4-u)(6-u) - 5 \cdot 0 ] = (1-u)(4-u)(6-u) = 0 \]
Thus, we see the characteristic polynomial has roots \(u = 1, 4, 6\) in ascending order. These roots are the eigenvalues.
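As a sanity check (my own addition, not part of the original solution), the determinant of \(A - uI\) should vanish at each of these roots; in R:

A = matrix(c(1, 2, 3,
             0, 4, 5,
             0, 0, 6), nrow = 3, byrow = TRUE)
sapply(c(1, 4, 6), function(u) det(A - u * diag(3)))   # 0 0 0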
Next, we solve for the eigenvectors of each eigenvalue by back substitution, using the fact that \(A - uI\) is upper triangular.
When \(u = 1\), we get the matrix equation:
\[ \left[ \begin{array}{rrr} 1 - 1 & 2 & 3 \\ 0 & 4- 1 & 5 \\ 0 & 0 & 6 - 1 \\ \end{array} \right] \left( \begin{array}{r} x \\ y \\ z \\ \end{array} \right) = \left[ \begin{array}{rrr} 0 & 2 & 3 \\ 0 & 3 & 5 \\ 0 & 0 & 5 \\ \end{array} \right] \left( \begin{array}{r} x \\ y \\ z \\ \end{array} \right) = \left( \begin{array}{r} 0 \\ 0 \\ 0 \\ \end{array} \right) \] These give the linear equations by back substitution:
\[
\begin{align}
z &= 0 & \text{ by row 3} \\
3y + 5z &= 0 & \text{ by row 2} \\
y &= 0 & \text{ by solving above for } y \\
\end{align}
\] Since \(x\) is unconstrained by the equations, choosing \(x = 1\) gives an eigenvector \(v_1\) equal to
\[
v_1 =
\left(
\begin{array}{r}
1 \\
0 \\
0 \\
\end{array}
\right) \text{ for eigenvalue u = 1}
\] Next, solving the same problem for eigenvalue \(u = 4\) we get the system of equations:
\[ \left[ \begin{array}{rrr} 1-4 & 2 & 3 \\ 0 & 4- 4 & 5 \\ 0 & 0 & 6 - 4 \\ \end{array} \right] \left( \begin{array}{r} x \\ y \\ z \\ \end{array} \right) = \left[ \begin{array}{rrr} -3 & 2 & 3 \\ 0 & 0 & 5 \\ 0 & 0 & 2 \\ \end{array} \right] \left( \begin{array}{r} x \\ y \\ z \\ \end{array} \right) = \left( \begin{array}{r} 0 \\ 0 \\ 0 \\ \end{array} \right) \]
Back substitution reveals: \[ \begin{align} 2z & = 0 & \text{ by row 3} \\ 5z & = 0 & \text{ by row 2} \\ -3x + 2y + 3z & = 0 & \text{ by row 1} \\ \end{align} \]
This implies \(z = 0\) and \(-3x + 2y = 0\), which implies \(y = \frac{3}{2} x\).
Thus, choosing \(x = 1\), the eigenvector associated with eigenvalue \(u=4\) is \[ \left( \begin{array}{r} 1 \\ 3/2 \\ 0 \\ \end{array} \right) \text{ for eigenvalue u = 4} \] Following the same logic for \(u = 6\), we get the matrix equation:
\[ \left[ \begin{array}{rrr} 1-6 & 2 & 3 \\ 0 & 4- 6 & 5 \\ 0 & 0 & 6 - 6 \\ \end{array} \right] \left( \begin{array}{r} x \\ y \\ z \\ \end{array} \right) = \left[ \begin{array}{rrr} -5 & 2 & 3 \\ 0 & -2 & 5 \\ 0 & 0 & 0 \\ \end{array} \right] \left( \begin{array}{r} x \\ y \\ z \\ \end{array} \right) = \left( \begin{array}{r} 0 \\ 0 \\ 0 \\ \end{array} \right) \]
This implies:
\[ \begin{align} -2y + 5z & = 0 & \text{ by row 2} \\ -5x + 2y + 3z & = 0 & \text{ by row 1} \\ z & = \frac{2}{5} y & \text{ solving row 2 for } z \\ -5x + 2y + 3 \left( \frac{2}{5} y \right) & = 0 & \text{ substituting into row 1} \\ -5x + \frac{16}{5}y & = 0 & \\ y & = \frac{25}{16}x & \text{ solving for } y \end{align} \]
We conclude the eigenvector associated with \(u = 6\) is:
\[ \left( \begin{array}{r} x \\ \frac{25}{16}x \\ \frac{5}{8}x \end{array} \right) = x \left( \begin{array}{r} 1 \\ 25/16 \\ 5/8 \\ \end{array} \right) \text{ for eigenvalue u = 6 } \] This completes the solution.
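Before turning to R's built-in routine, a direct check (the names v1, v2, v3 below are my own, with \(x = 16\) chosen for the third vector) confirms each hand-derived vector satisfies \(Av = uv\):

A = matrix(c(1, 2, 3,
             0, 4, 5,
             0, 0, 6), nrow = 3, byrow = TRUE)
v1 = c(1, 0, 0)       # eigenvector for u = 1
v2 = c(1, 3/2, 0)     # eigenvector for u = 4
v3 = c(16, 25, 10)    # eigenvector for u = 6
A %*% v1 - 1 * v1     # all entries 0
A %*% v2 - 4 * v2     # all entries 0
A %*% v3 - 6 * v3     # all entries 0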
We can use the eigen() function in R to calculate the same result:
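The chunk that generated the output below is not shown; a minimal sketch that would reproduce it, reusing the names soln and vec2 that appear in the later code, is:

A = matrix(c(1, 2, 3,
             0, 4, 5,
             0, 0, 6), nrow = 3, byrow = TRUE)
A
soln = eigen(A)
soln
vec2 = soln$vectors[,2]
vec2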
## [,1] [,2] [,3]
## [1,] 1 2 3
## [2,] 0 4 5
## [3,] 0 0 6
## eigen() decomposition
## $values
## [1] 6 4 1
##
## $vectors
## [,1] [,2] [,3]
## [1,] 0.5108407 0.5547002 1
## [2,] 0.7981886 0.8320503 0
## [3,] 0.3192754 0.0000000 0
## [1] 0.5547002 0.8320503 0.0000000
# Use a scaling constant (entry 1) to show
# the below vector is equivalent to solution for eigenvalue 4
(vec2/vec2[1])
## [1] 1.0 1.5 0.0
vec1 = soln$vectors[,1]
#
# Use a scaling constant 16 to show
# the below vector is equivalent to solution for eigenvalue 6
(vec1/vec1[1]*16)
## [1] 16 25 10
Clearly, the eigenvalues and eigenvectors computed above agree, up to scaling, with my own solution.