In this problem, we'll verify in R that the SVD and eigendecomposition are related as worked out in the weekly module, given the 2 × 3 matrix A below.
Write code in R to compute \(X = AA^T\) and \(Y = A^TA\). Then compute the eigenvalues and eigenvectors of X and Y using the built-in commands in R. Next, compute the left-singular vectors, singular values, and right-singular vectors of A using the svd command. Examine the two sets of singular vectors and show that they are indeed eigenvectors of X and Y. In addition, show that the two non-zero eigenvalues of both X and Y (the third eigenvalue of Y will be zero or very close to zero) are the same and are the squares of the non-zero singular values of A. Your code should compute all of these vectors and scalars and store them in variables, with enough comments to show how to interpret each step.
\[ A = \begin{bmatrix} 1 & 2 & 3 \\ -1 & 0 & 4 \end{bmatrix} \]
1.1 Write code in R to compute \(X = AA^T\) and \(Y = A^TA\).
# Define A (2 x 3), filling by row so it matches the matrix shown above
A <- matrix(c(1, 2, 3, -1, 0, 4), nrow = 2, byrow = TRUE)
# X = A %*% t(A) is a 2 x 2 symmetric matrix
X <- A %*% t(A)
X
## [,1] [,2]
## [1,] 14 11
## [2,] 11 17
# Y = t(A) %*% A is a 3 x 3 symmetric matrix
Y <- t(A) %*% A
Y
## [,1] [,2] [,3]
## [1,] 2 2 -1
## [2,] 2 4 6
## [3,] -1 6 25
1.2 Compute the eigenvalues and eigenvectors of X and Y using the built-in commands in R
# Eigenvalues of X, returned in decreasing order
X_value <- eigen(X)$values
X_value
## [1] 26.601802 4.398198
# Eigenvectors of X, stored as the columns of the matrix
X_vector <- eigen(X)$vectors
X_vector
## [,1] [,2]
## [1,] 0.6576043 -0.7533635
## [2,] 0.7533635 0.6576043
# Eigenvalues of Y; the third is zero up to floating-point error
Y_value <- eigen(Y)$values
Y_value
## [1] 2.660180e+01 4.398198e+00 1.058982e-16
# Eigenvectors of Y, stored as the columns of the matrix
Y_vector <- eigen(Y)$vectors
Y_vector
## [,1] [,2] [,3]
## [1,] -0.01856629 -0.6727903 0.7396003
## [2,] 0.25499937 -0.7184510 -0.6471502
## [3,] 0.96676296 0.1765824 0.1849001
1.3 Compute the left-singular vectors, singular values, and right-singular vectors of A using the svd command.
# Left-singular vectors of A: the columns of U in A = U D V^T
left_sv <- svd(A)$u
left_sv
## [,1] [,2]
## [1,] -0.6576043 -0.7533635
## [2,] -0.7533635 0.6576043
# Singular values of A: the diagonal entries of D
sv <- svd(A)$d
sv
## [1] 5.157693 2.097188
# Right-singular vectors of A: the columns of V
right_sv <- svd(A)$v
right_sv
## [,1] [,2]
## [1,] 0.01856629 -0.6727903
## [2,] -0.25499937 -0.7184510
## [3,] -0.96676296 0.1765824
Examining the two sets of singular vectors shows that they are indeed eigenvectors of X and Y: the columns of left_sv match the columns of X_vector, and the columns of right_sv match the first two columns of Y_vector, in each case up to a sign flip (eigenvectors and singular vectors are only unique up to sign). The two non-zero eigenvalues of X and Y are the same, and they are the squares of the singular values of A: 5.157693^2 ≈ 26.601802 and 2.097188^2 ≈ 4.398198.
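These claims can also be checked numerically rather than by eye. A minimal sketch reusing the variables stored above; all.equal compares up to floating-point tolerance, and abs() removes the sign ambiguity of the vectors:
# The squared singular values should equal the non-zero eigenvalues
all.equal(sv^2, X_value)
all.equal(sv^2, Y_value[1:2])
# The singular vectors should equal the corresponding eigenvectors up to sign
all.equal(abs(left_sv), abs(X_vector))
all.equal(abs(right_sv), abs(Y_vector[, 1:2]))
Each call should return TRUE if the two decompositions agree.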
Using the procedure outlined in section 1 of the weekly handout, write a function to compute the inverse of a well-conditioned full-rank square matrix using co-factors.
\[ A^{-1} = \frac{C^T}{\det(A)} \]
myinverse <- function(mtrx){
  # Initialize the matrix of cofactors with the same dimensions as the input
  C <- matrix(0, nrow = nrow(mtrx), ncol = ncol(mtrx))
  for (i in 1:nrow(mtrx)){
    for (j in 1:ncol(mtrx)){
      # Cofactor C[i, j]: determinant of the minor formed by deleting row i
      # and column j, times the checkerboard sign (-1)^(i + j).
      # drop = FALSE keeps the minor a matrix so det() works for 2 x 2 inputs too.
      C[i, j] <- det(mtrx[-i, -j, drop = FALSE]) * (-1)^(i + j)
    }
  }
  # The inverse is the transposed cofactor matrix (the adjugate) over det(mtrx)
  t(C) / det(mtrx)
}
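Before applying the function to the 3 × 3 example below, a quick sanity check on a small hypothetical 2 × 2 matrix (M2 and its entries are made up for illustration) confirms that the cofactor construction reproduces the familiar two-by-two inverse formula:
# For M2 = [[4, 2], [7, 6]], det(M2) = 10, so the inverse is [[6, -2], [-7, 4]] / 10
M2 <- matrix(c(4, 7, 2, 6), nrow = 2)
myinverse(M2)  # expected: 0.6 and -0.7 in column 1, -0.2 and 0.4 in column 2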
# A well-conditioned, full-rank 3 x 3 test matrix (matrix() fills column-wise)
A <- matrix(c(5, 8, -1, 3, 2, 8, -5, 0, 10), nrow = 3, ncol = 3)
A
## [,1] [,2] [,3]
## [1,] 5 3 -5
## [2,] 8 2 0
## [3,] -1 8 10
# Invert A with our cofactor-based function
B <- myinverse(A)
B
## [,1] [,2] [,3]
## [1,] -0.04255319 0.14893617 -0.02127660
## [2,] 0.17021277 -0.09574468 0.08510638
## [3,] -0.14042553 0.09148936 0.02978723
# If B is really the inverse of A, then A %*% B should be the identity matrix
I <- A %*% B
I
## [,1] [,2] [,3]
## [1,] 1.000000e+00 -2.220446e-16 -1.110223e-16
## [2,] -5.551115e-17 1.000000e+00 -1.110223e-16
## [3,] 2.220446e-16 -2.220446e-16 1.000000e+00
As we can see, the off-diagonal elements of I are on the order of 1e-16, i.e. zero up to floating-point rounding, while the diagonal elements are 1. So A %*% B is the identity matrix and myinverse(A) indeed returns the inverse of A.
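As an additional check, we can compare the result against R's built-in inverse; a minimal sketch reusing A and B from above:
# solve(A) computes the inverse via LU factorization and should match our
# cofactor-based result up to floating-point tolerance
all.equal(B, solve(A))
Agreement here validates the cofactor construction, though for larger matrices solve() is far more efficient than evaluating n^2 determinants.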