Setup
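
The setup chunk itself is not shown in the rendered output; the following is a minimal sketch of what it presumably contains, based on the packages this document relies on (dplyr and kableExtra for data wrangling and table formatting, and pracma for the LU cross-check in Part 2):

```r
# Assumed setup chunk (not shown in the original rendered output).
suppressPackageStartupMessages({
  library(dplyr)      # data manipulation
  library(kableExtra) # table formatting
  library(pracma)     # lu() is used to check the LU factorization in Part 2
})
```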


Part 1 - (Non)Commutativity of Matrix Transposes

(1) Show that \({A^{T}A \neq AA^{T}}\) in general. (Proof and Demonstration)

##      [,1]      [,2]      [,3]      [,4]     
## [1,] "A_{1,1}" "A_{1,2}" "A_{1,j}" "A_{1,n}"
## [2,] "A_{2,1}" "A_{2,2}" "A_{2,j}" "A_{2,n}"
## [3,] "A_{i,1}" "A_{i,2}" "A_{i,j}" "A_{i,n}"
## [4,] "A_{n,1}" "A_{n,2}" "A_{n,j}" "A_{n,n}"
##      [,1]      [,2]      [,3]      [,4]     
## [1,] "A_{1,1}" "A_{2,1}" "A_{i,1}" "A_{n,1}"
## [2,] "A_{1,2}" "A_{2,2}" "A_{i,2}" "A_{n,2}"
## [3,] "A_{1,j}" "A_{2,j}" "A_{i,j}" "A_{n,j}"
## [4,] "A_{1,n}" "A_{2,n}" "A_{i,n}" "A_{n,n}"

\[ A = \begin{bmatrix}A_{1,1}&A_{1,2}&A_{1,j}&A_{1,n} \\A_{2,1}&A_{2,2}&A_{2,j}&A_{2,n} \\A_{i,1}&A_{i,2}&A_{i,j}&A_{i,n} \\A_{n,1}&A_{n,2}&A_{n,j}&A_{n,n} \\\end{bmatrix} \] \[ A^T = \begin{bmatrix}A_{1,1}&A_{2,1}&A_{i,1}&A_{n,1} \\A_{1,2}&A_{2,2}&A_{i,2}&A_{n,2} \\A_{1,j}&A_{2,j}&A_{i,j}&A_{n,j} \\A_{1,n}&A_{2,n}&A_{i,n}&A_{n,n} \\\end{bmatrix} \] \[{ AA^{T}= \begin{bmatrix}A_{1,1}&A_{1,2}&A_{1,j}&A_{1,n} \\A_{2,1}&A_{2,2}&A_{2,j}&A_{2,n} \\A_{i,1}&A_{i,2}&A_{i,j}&A_{i,n} \\A_{n,1}&A_{n,2}&A_{n,j}&A_{n,n} \\\end{bmatrix} \begin{bmatrix}A_{1,1}&A_{2,1}&A_{i,1}&A_{n,1} \\A_{1,2}&A_{2,2}&A_{i,2}&A_{n,2} \\A_{1,j}&A_{2,j}&A_{i,j}&A_{n,j} \\A_{1,n}&A_{2,n}&A_{i,n}&A_{n,n} \\\end{bmatrix} } = \left[ \begin{matrix} \sum\limits_{ j=1 }^{ n }{ { A }_{ 1,j }^{ 2 } } & \sum\limits_{ j=1 }^{ n }{ { A }_{ 1,j }{ A }_{ 2,j } } & ... & \sum\limits_{ j=1 }^{ n }{ { A }_{ 1,j }{ A }_{ n,j } } \\ \sum\limits_{ j=1 }^{ n }{ { A }_{ 1,j }{ A }_{ 2,j } } & \sum\limits_{ j=1 }^{ n }{ { A }_{ 2,j }^{ 2 } } & ... & \sum\limits_{ j=1 }^{ n }{ { A }_{ 2,j }{ A }_{ n,j } } \\ ... & ... & ... & ... \\ \sum\limits_{ j=1 }^{ n }{ { A }_{ 1,j }{ A }_{ n,j } } & \sum\limits_{ j=1 }^{ n }{ { A }_{ 2,j }{ A }_{ n,j } } & ... & \sum\limits_{ j=1 }^{ n }{ { A }_{ n,j }^{ 2 } } \end{matrix} \right] \]

\[{ A^TA = \begin{bmatrix}A_{1,1}&A_{2,1}&A_{i,1}&A_{n,1} \\A_{1,2}&A_{2,2}&A_{i,2}&A_{n,2} \\A_{1,j}&A_{2,j}&A_{i,j}&A_{n,j} \\A_{1,n}&A_{2,n}&A_{i,n}&A_{n,n} \\\end{bmatrix} \begin{bmatrix}A_{1,1}&A_{1,2}&A_{1,j}&A_{1,n} \\A_{2,1}&A_{2,2}&A_{2,j}&A_{2,n} \\A_{i,1}&A_{i,2}&A_{i,j}&A_{i,n} \\A_{n,1}&A_{n,2}&A_{n,j}&A_{n,n} \\\end{bmatrix} } = \left[ \begin{matrix} \sum\limits_{ i=1 }^{ n }{ { A }_{ i,1 }^{ 2 } } & \sum\limits_{ i=1 }^{ n }{ { A }_{ i,1 }{ A }_{ i,2 } } & ... & \sum\limits_{ i=1 }^{ n }{ { A }_{ i,1 }{ A }_{ i,n } } \\ \sum\limits_{ i=1 }^{ n }{ { A }_{ i,1 }{ A }_{ i,2 } } & \sum\limits_{ i=1 }^{ n }{ { A }_{ i,2 }^{ 2 } } & ... & \sum\limits_{ i=1 }^{ n }{ { A }_{ i,2 }{ A }_{ i,n } } \\ ... & ... & ... & ... \\ \sum\limits_{ i=1 }^{ n }{ { A }_{ i,1 }{ A }_{ i,n } } & \sum\limits_{ i=1 }^{ n }{ { A }_{ i,2 }{ A }_{ i,n } } & ... & \sum\limits_{ i=1 }^{ n }{ { A }_{ i,n }^{ 2 } } \end{matrix} \right] \]

For \({AA^{T}}\) to equal \({A^{T}A}\), the corresponding summations in the two matrices above would have to agree element by element, and this is generally not the case.
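
As a quick numerical check, a small helper like the following can be used to test whether a given square matrix commutes with its transpose. This helper is illustrative and not part of the original assignment:

```r
# Illustrative helper (assumption, not from the original document):
# returns TRUE when A %*% t(A) equals t(A) %*% A up to floating-point tolerance.
commutes_with_transpose <- function(A) {
  isTRUE(all.equal(A %*% t(A), t(A) %*% A))
}
```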

COUNTEREXAMPLE:

##      [,1] [,2] [,3]
## [1,]    1    2    3
## [2,]    6    4    5
## [3,]    7    8    9
##      [,1] [,2] [,3]
## [1,]    1    6    7
## [2,]    2    4    8
## [3,]    3    5    9
##      [,1] [,2] [,3]
## [1,]   14   29   50
## [2,]   29   77  119
## [3,]   50  119  194
##      [,1] [,2] [,3]
## [1,]   86   82   96
## [2,]   82   84   98
## [3,]   96   98  115
##       [,1]  [,2]  [,3]
## [1,] FALSE FALSE FALSE
## [2,] FALSE FALSE FALSE
## [3,] FALSE FALSE FALSE

As a simple counterexample, consider:

\[ Q = \begin{bmatrix}1&2&3 \\6&4&5 \\7&8&9 \\\end{bmatrix} ; Q^{T} = \begin{bmatrix}1&6&7 \\2&4&8 \\3&5&9 \\\end{bmatrix}\] \[ QQ^{T} = \begin{bmatrix}1&2&3 \\6&4&5 \\7&8&9 \\\end{bmatrix} \begin{bmatrix}1&6&7 \\2&4&8 \\3&5&9 \\\end{bmatrix} = \begin{bmatrix}14&29&50 \\29&77&119 \\50&119&194 \\\end{bmatrix}\] \[ Q^{T}Q = \begin{bmatrix}1&6&7 \\2&4&8 \\3&5&9 \\\end{bmatrix} \begin{bmatrix}1&2&3 \\6&4&5 \\7&8&9 \\\end{bmatrix} = \begin{bmatrix}86&82&96 \\82&84&98 \\96&98&115 \\\end{bmatrix}\] Clearly, \({QQ^T \neq Q^TQ}\) .
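
This counterexample can be reproduced directly in R; the chunk below is a sketch consistent with the output printed above, not necessarily the original code:

```r
# Q is not symmetric, so its products with its transpose differ.
Q <- matrix(c(1, 2, 3,
              6, 4, 5,
              7, 8, 9), nrow = 3, byrow = TRUE)
Q %*% t(Q)                     # 14 29 50 / 29 77 119 / 50 119 194
t(Q) %*% Q                     # 86 82 96 / 82 84  98 / 96  98 115
(Q %*% t(Q)) == (t(Q) %*% Q)   # all FALSE
```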


(2) For a special type of square matrix, \({ A^{ T }A = AA^{ T } }\)

Under what conditions could this be true? (Hint: The Identity matrix I is an example of such a matrix).

This is true for permutation matrices because

\[(PP^T)_{ij} = \sum_{k=1}^n P_{ik} P^T_{kj} = \sum_{k=1}^n P_{ik} P_{jk},\]

and, because each row of \(P\) contains a single 1 and no two rows have their 1 in the same column,

\[\sum_{k=1}^n P_{ik} P_{jk} = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{otherwise,} \end{cases}\]

so \(PP^T = I\). Applying the same argument to the columns of \(P\) gives \(P^TP = I\), hence \(PP^T = P^TP\).

PERMUTATION EXAMPLE:

##      [,1] [,2] [,3] [,4] [,5]
## [1,]    0    1    0    0    0
## [2,]    0    0    0    1    0
## [3,]    1    0    0    0    0
## [4,]    0    0    1    0    0
## [5,]    0    0    0    0    1
##      [,1] [,2] [,3] [,4] [,5]
## [1,]    0    0    1    0    0
## [2,]    1    0    0    0    0
## [3,]    0    0    0    1    0
## [4,]    0    1    0    0    0
## [5,]    0    0    0    0    1
##      [,1] [,2] [,3] [,4] [,5]
## [1,]    1    0    0    0    0
## [2,]    0    1    0    0    0
## [3,]    0    0    1    0    0
## [4,]    0    0    0    1    0
## [5,]    0    0    0    0    1
##      [,1] [,2] [,3] [,4] [,5]
## [1,]    1    0    0    0    0
## [2,]    0    1    0    0    0
## [3,]    0    0    1    0    0
## [4,]    0    0    0    1    0
## [5,]    0    0    0    0    1
##      [,1] [,2] [,3] [,4] [,5]
## [1,] TRUE TRUE TRUE TRUE TRUE
## [2,] TRUE TRUE TRUE TRUE TRUE
## [3,] TRUE TRUE TRUE TRUE TRUE
## [4,] TRUE TRUE TRUE TRUE TRUE
## [5,] TRUE TRUE TRUE TRUE TRUE

For example, consider:

\[ P = \begin{bmatrix}0&1&0&0&0 \\0&0&0&1&0 \\1&0&0&0&0 \\0&0&1&0&0 \\0&0&0&0&1 \\\end{bmatrix} ; P^{T} = \begin{bmatrix}0&0&1&0&0 \\1&0&0&0&0 \\0&0&0&1&0 \\0&1&0&0&0 \\0&0&0&0&1 \\\end{bmatrix}\] \[ PP^{T} = \begin{bmatrix}0&1&0&0&0 \\0&0&0&1&0 \\1&0&0&0&0 \\0&0&1&0&0 \\0&0&0&0&1 \\\end{bmatrix} \begin{bmatrix}0&0&1&0&0 \\1&0&0&0&0 \\0&0&0&1&0 \\0&1&0&0&0 \\0&0&0&0&1 \\\end{bmatrix} = \begin{bmatrix}1&0&0&0&0 \\0&1&0&0&0 \\0&0&1&0&0 \\0&0&0&1&0 \\0&0&0&0&1 \\\end{bmatrix} = I_5\] \[ P^{T}P = \begin{bmatrix}0&0&1&0&0 \\1&0&0&0&0 \\0&0&0&1&0 \\0&1&0&0&0 \\0&0&0&0&1 \\\end{bmatrix} \begin{bmatrix}0&1&0&0&0 \\0&0&0&1&0 \\1&0&0&0&0 \\0&0&1&0&0 \\0&0&0&0&1 \\\end{bmatrix} = \begin{bmatrix}1&0&0&0&0 \\0&1&0&0&0 \\0&0&1&0&0 \\0&0&0&1&0 \\0&0&0&0&1 \\\end{bmatrix} =I_5\]
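
A sketch of the corresponding check in R; constructing \(P\) by permuting the rows of the identity matrix is an assumption, but it reproduces the matrix printed above:

```r
# Permutation matrix: rows of the 5x5 identity in the order 2, 4, 1, 3, 5.
P <- diag(5)[c(2, 4, 1, 3, 5), ]
all(P %*% t(P) == diag(5))   # TRUE
all(t(P) %*% P == diag(5))   # TRUE
```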


Commutativity obviously holds for symmetric matrices because, by definition, \({A = A^{T}}\), so \({AA^T = AA = A^TA}\).

SYMMETRIC EXAMPLE:

##      [,1] [,2] [,3]
## [1,]    1    2    4
## [2,]    2    3    5
## [3,]    4    5    6
##      [,1] [,2] [,3]
## [1,]    1    2    4
## [2,]    2    3    5
## [3,]    4    5    6
##      [,1] [,2] [,3]
## [1,]   21   28   38
## [2,]   28   38   53
## [3,]   38   53   77
##      [,1] [,2] [,3]
## [1,]   21   28   38
## [2,]   28   38   53
## [3,]   38   53   77
##      [,1] [,2] [,3]
## [1,] TRUE TRUE TRUE
## [2,] TRUE TRUE TRUE
## [3,] TRUE TRUE TRUE

For example, consider:

\[ S = \begin{bmatrix}1&2&4 \\2&3&5 \\4&5&6 \\\end{bmatrix} ; S^{T} = \begin{bmatrix}1&2&4 \\2&3&5 \\4&5&6 \\\end{bmatrix}\] \[ SS^{T} = \begin{bmatrix}1&2&4 \\2&3&5 \\4&5&6 \\\end{bmatrix} \begin{bmatrix}1&2&4 \\2&3&5 \\4&5&6 \\\end{bmatrix} = \begin{bmatrix}21&28&38 \\28&38&53 \\38&53&77 \\\end{bmatrix}\] \[ S^{T}S = \begin{bmatrix}1&2&4 \\2&3&5 \\4&5&6 \\\end{bmatrix} \begin{bmatrix}1&2&4 \\2&3&5 \\4&5&6 \\\end{bmatrix} = \begin{bmatrix}21&28&38 \\28&38&53 \\38&53&77 \\\end{bmatrix}\]
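
A brief sketch of the symmetric check in R, mirroring the output above:

```r
# S equals its own transpose, so S %*% t(S) and t(S) %*% S are literally
# the same product S %*% S.
S <- matrix(c(1, 2, 4,
              2, 3, 5,
              4, 5, 6), nrow = 3, byrow = TRUE)
all(S == t(S))                  # TRUE: S is symmetric
all(S %*% t(S) == t(S) %*% S)   # TRUE
```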


This also holds for orthogonal matrices because, by definition, \({A^{T} = A^{-1}}\),
so \[{ AA^{T} = AA^{-1}=I=A^{-1}A=A^{T}A}.\] More generally, any nonzero scalar multiple of an orthogonal matrix commutes with its transpose: if \({A^{T} = cA^{-1}}\) for some scalar \(c\), then \({AA^{T} = cI = A^{T}A}\).

ORTHOGONAL EXAMPLE:

##      [,1] [,2]
## [1,]    1   -1
## [2,]    1    1
##      [,1] [,2]
## [1,]    1    1
## [2,]   -1    1
##      [,1] [,2]
## [1,]  0.5  0.5
## [2,] -0.5  0.5
## [1] 2
##      [,1] [,2]
## [1,]    2    0
## [2,]    0    2
##      [,1] [,2]
## [1,]    2    0
## [2,]    0    2
##      [,1] [,2]
## [1,] TRUE TRUE
## [2,] TRUE TRUE

For example, consider \({X = \begin{bmatrix}1&-1 \\1&1 \\\end{bmatrix}}\), whose columns are orthogonal and of equal length (so \({X^{T} = 2X^{-1}}\)), with transpose \({X^{T} = \begin{bmatrix}1&1 \\-1&1 \\\end{bmatrix}}\). Here,
\({XX^{T}= \begin{bmatrix}1&-1 \\1&1 \\\end{bmatrix} \begin{bmatrix}1&1 \\-1&1 \\\end{bmatrix} = \begin{bmatrix}2&0 \\0&2 \\\end{bmatrix} = 2I_2}\)
\({X^{T}X= \begin{bmatrix}1&1 \\-1&1 \\\end{bmatrix} \begin{bmatrix}1&-1 \\1&1 \\\end{bmatrix} = \begin{bmatrix}2&0 \\0&2 \\\end{bmatrix} = 2I_2}\)

Note that we can make the above matrix orthonormal by dividing it by \(\sqrt{2}\), the square root of its determinant (here \(\det(X) = 2\)).
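
A sketch of that normalization in R (object names are illustrative):

```r
# Rescale X by sqrt(det(X)) = sqrt(2) so its columns become unit length.
X <- matrix(c(1, -1,
              1,  1), nrow = 2, byrow = TRUE)
O <- X / sqrt(det(X))   # entries ~ +/- 0.707107
zapsmall(O %*% t(O))    # 2x2 identity
zapsmall(t(O) %*% O)    # 2x2 identity
```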

ORTHONORMAL EXAMPLE:

##          [,1]      [,2]
## [1,] 0.707107 -0.707107
## [2,] 0.707107  0.707107
##           [,1]     [,2]
## [1,]  0.707107 0.707107
## [2,] -0.707107 0.707107
##           [,1]     [,2]
## [1,]  0.707107 0.707107
## [2,] -0.707107 0.707107
##                          [,1]                     [,2]
## [1,] -0.000000000000000111022 -0.000000000000000111022
## [2,]  0.000000000000000111022 -0.000000000000000111022
##      [,1] [,2]
## [1,] TRUE TRUE
## [2,] TRUE TRUE
##      [,1] [,2]
## [1,]    1    0
## [2,]    0    1
##      [,1] [,2]
## [1,]    1    0
## [2,]    0    1
##      [,1] [,2]
## [1,] TRUE TRUE
## [2,] TRUE TRUE

For example, consider orthonormal matrix \({O = \begin{bmatrix} 0.707107&-0.707107 \\0.707107&0.707107 \\\end{bmatrix}}\) which has transpose \({O^{T} = \begin{bmatrix}0.707107&0.707107 \\-0.707107& 0.707107 \\\end{bmatrix}}\) .
Here,
\({OO^{T}= \begin{bmatrix} 0.707107&-0.707107 \\0.707107&0.707107 \\\end{bmatrix} \begin{bmatrix}0.707107&0.707107 \\-0.707107& 0.707107 \\\end{bmatrix} = \begin{bmatrix}1&0 \\0&1 \\\end{bmatrix}} = I_2\)

\({O^{T}O= \begin{bmatrix}0.707107&0.707107 \\-0.707107& 0.707107 \\\end{bmatrix} \begin{bmatrix} 0.707107&-0.707107 \\0.707107&0.707107 \\\end{bmatrix} = \begin{bmatrix}1&0 \\0&1 \\\end{bmatrix}} = I_2\)


However, these classes do not include all matrices whose transpose commutes with them under multiplication.

The complete set of matrices whose transpose commutes with them is the set of normal matrices (indeed, this is their defining property), and by the Spectral Theorem these are exactly the matrices that can be orthonormally diagonalized. Normal matrices are discussed in the textbook in Sections NM and OD (pages 574-580).

By Theorem OD (Orthonormal Diagonalization) on page 575, we have:

Suppose that \(A\) is a square matrix. Then there is a unitary matrix \(U\) and a diagonal matrix \(D\), with diagonal entries equal to the eigenvalues of \(A\), such that \(U^TAU = D\) if and only if \(A\) is a normal matrix.

(Note: for matrices with all real entries, unitary is synonymous with orthonormal, and above I have used \(U^T\) to denote the transpose, which for real matrices is the same as the adjoint. The distinction only comes into play for matrices with complex entries.)

NORMAL EXAMPLE (neither symmetric nor orthogonal):

##      [,1] [,2] [,3]
## [1,]    1    1    0
## [2,]    0    1    1
## [3,]    1    0    1
##      [,1] [,2] [,3]
## [1,]    1    0    1
## [2,]    1    1    0
## [3,]    0    1    1
##      [,1] [,2] [,3]
## [1,]    2    1    1
## [2,]    1    2    1
## [3,]    1    1    2
##      [,1] [,2] [,3]
## [1,]    2    1    1
## [2,]    1    2    1
## [3,]    1    1    2
##      [,1] [,2] [,3]
## [1,] TRUE TRUE TRUE
## [2,] TRUE TRUE TRUE
## [3,] TRUE TRUE TRUE

For example, consider a normal matrix which is neither symmetric nor orthogonal:

\[ Z = \begin{bmatrix}1&1&0 \\0&1&1 \\1&0&1 \\\end{bmatrix} ; Z^{T} = \begin{bmatrix}1&0&1 \\1&1&0 \\0&1&1 \\\end{bmatrix}\] \[ ZZ^{T} = \begin{bmatrix}1&1&0 \\0&1&1 \\1&0&1 \\\end{bmatrix} \begin{bmatrix}1&0&1 \\1&1&0 \\0&1&1 \\\end{bmatrix} = \begin{bmatrix}2&1&1 \\1&2&1 \\1&1&2 \\\end{bmatrix}\] \[ Z^{T}Z = \begin{bmatrix}1&0&1 \\1&1&0 \\0&1&1 \\\end{bmatrix} \begin{bmatrix}1&1&0 \\0&1&1 \\1&0&1 \\\end{bmatrix} = \begin{bmatrix}2&1&1 \\1&2&1 \\1&1&2 \\\end{bmatrix}\]
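
The sketch below verifies numerically that \(Z\) is normal and illustrates Theorem OD: the (complex) eigenvector matrix of \(Z\) can be taken unitary, and it diagonalizes \(Z\). This is an illustrative check under those assumptions, not code from the original document:

```r
Z <- matrix(c(1, 1, 0,
              0, 1, 1,
              1, 0, 1), nrow = 3, byrow = TRUE)
all(Z %*% t(Z) == t(Z) %*% Z)   # TRUE: Z is normal

e <- eigen(Z)                                    # eigenvalues are distinct here
U <- e$vectors
U <- sweep(U, 2, sqrt(colSums(Mod(U)^2)), "/")   # defensively normalize columns
zapsmall(Conj(t(U)) %*% U)                       # ~ identity, so U is unitary
zapsmall(Conj(t(U)) %*% Z %*% U)                 # ~ diagonal matrix of eigenvalues
```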


Part 2 - Matrix Decomposition - LU

Matrix factorization is a very important problem.

There are supercomputers built just to do matrix factorizations. Every second you are on an airplane, matrices are being factorized. Radars that track flights use a technique called Kalman filtering; at the heart of Kalman filtering is a matrix factorization operation. Kalman filters solve linear systems of equations when they track your flight using radar.

Write an R function to factorize a square matrix A into LU or LDU, whichever you prefer.

Please submit your response in an R Markdown document using our class naming convention, e.g. LFulton_Assignment2_PS2.Rmd. You don’t have to worry about permuting rows of A, and you can assume that A is less than 5x5 if you need to hard-code any variables in your code. If you are doing the entire assignment in R, then please submit only one markdown document for both problems.
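
The body of the LU routine is not shown in the rendered output. The following is a minimal Doolittle-style sketch without row pivoting (as the assignment permits), consistent with the $L and $U results printed below; it is an assumed reconstruction rather than the original code:

```r
# Doolittle-style LU factorization without pivoting (assumed reconstruction of
# the MYLU function referenced later).  Returns L (unit lower triangular) and
# U (upper triangular) with A = L %*% U, assuming no zero pivots are encountered.
MYLU <- function(A) {
  n <- nrow(A)
  L <- diag(n)
  U <- A
  for (j in seq_len(n - 1)) {
    for (i in (j + 1):n) {
      L[i, j] <- U[i, j] / U[j, j]            # elimination multiplier
      U[i, ]  <- U[i, ] - L[i, j] * U[j, ]    # zero out the entry below the pivot
    }
  }
  list(L = L, U = U)
}
```

Multiplying the returned factors back together, as in MYLU(A)$L %*% MYLU(A)$U, should reproduce A, which is the check performed in the tests below.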

Test a constructed 3x3 matrix

##      [,1] [,2] [,3]
## [1,]    1    1    1
## [2,]    0    1    1
## [3,]    0    0    1
## [1] 1
##      [,1] [,2] [,3]
## [1,]    1   -1    0
## [2,]    0    1   -1
## [3,]    0    0    1
##      [,1] [,2] [,3]
## [1,]    1    0    0
## [2,]    1    1    0
## [3,]    1    1    1
## [1] 1
##      [,1] [,2] [,3]
## [1,]    1    0    0
## [2,]   -1    1    0
## [3,]    0   -1    1
##      [,1] [,2] [,3]
## [1,]    1    1    1
## [2,]    1    2    2
## [3,]    1    2    3

Testing matrix \({\begin{bmatrix}1&1&1 \\1&2&2 \\1&2&3 \\\end{bmatrix}}\) :
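
One way this test matrix may have been constructed (a sketch based on the printed output, not necessarily the original chunk):

```r
# Build unit-triangular factors first, then form the test matrix A3 = L3 %*% U3
# so the expected LU factorization is known in advance.
U3 <- matrix(c(1, 1, 1,
               0, 1, 1,
               0, 0, 1), nrow = 3, byrow = TRUE)
L3 <- t(U3)        # unit lower triangular
A3 <- L3 %*% U3    # yields the matrix 1 1 1 / 1 2 2 / 1 2 3 shown above
```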

Get the result using my LU algorithm

## $L
##      [,1] [,2] [,3]
## [1,]    1    0    0
## [2,]    1    1    0
## [3,]    1    1    1
## 
## $U
##      [,1] [,2] [,3]
## [1,]    1    1    1
## [2,]    0    1    1
## [3,]    0    0    1

My result is \[L = \begin{bmatrix}1&0&0 \\1&1&0 \\1&1&1 \\\end{bmatrix} ; U = \begin{bmatrix}1&1&1 \\0&1&1 \\0&0&1 \\\end{bmatrix} ; LU = \begin{bmatrix}1&1&1 \\1&2&2 \\1&2&3 \\\end{bmatrix} \]

##      [,1] [,2] [,3]
## [1,] TRUE TRUE TRUE
## [2,] TRUE TRUE TRUE
## [3,] TRUE TRUE TRUE
##      [,1] [,2] [,3]
## [1,] TRUE TRUE TRUE
## [2,] TRUE TRUE TRUE
## [3,] TRUE TRUE TRUE

Check using the lu decomposition function from the pracma package
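
The pracma check presumably looks something like the sketch below; pracma's lu() returns a list with components L and U:

```r
library(pracma)
A3 <- matrix(c(1, 1, 1,
               1, 2, 2,
               1, 2, 3), nrow = 3, byrow = TRUE)
plu <- lu(A3)                     # no pivoting needed for this matrix
plu$L
plu$U
all.equal(plu$L %*% plu$U, A3)    # TRUE
```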

## $L
##      [,1] [,2] [,3]
## [1,]    1    0    0
## [2,]    1    1    0
## [3,]    1    1    1
## 
## $U
##      [,1] [,2] [,3]
## [1,]    1    1    1
## [2,]    0    1    1
## [3,]    0    0    1
## $L
##      [,1] [,2] [,3]
## [1,]    1    0    0
## [2,]    1    1    0
## [3,]    1    1    1
## 
## $U
##      [,1] [,2] [,3]
## [1,]    1    1    1
## [2,]    0    1    1
## [3,]    0    0    1

Pracma result is \[L = \begin{bmatrix}1&0&0 \\1&1&0 \\1&1&1 \\\end{bmatrix} ; U = \begin{bmatrix}1&1&1 \\0&1&1 \\0&0&1 \\\end{bmatrix}; LU = \begin{bmatrix}1&1&1 \\1&2&2 \\1&2&3 \\\end{bmatrix} \]

##      [,1] [,2] [,3]
## [1,] TRUE TRUE TRUE
## [2,] TRUE TRUE TRUE
## [3,] TRUE TRUE TRUE
##      [,1] [,2] [,3]
## [1,] TRUE TRUE TRUE
## [2,] TRUE TRUE TRUE
## [3,] TRUE TRUE TRUE

Test a constructed 5x5 matrix

##      [,1] [,2] [,3] [,4] [,5]
## [1,]    1    2    3    4    5
## [2,]    0    2    3    4    5
## [3,]    0    0    3    4    5
## [4,]    0    0    0    4    5
## [5,]    0    0    0    0    5
## [1] 120
##      [,1] [,2]      [,3]      [,4]  [,5]
## [1,]    1 -1.0  0.000000  0.000000  0.00
## [2,]    0  0.5 -0.500000  0.000000  0.00
## [3,]    0  0.0  0.333333 -0.333333  0.00
## [4,]    0  0.0  0.000000  0.250000 -0.25
## [5,]    0  0.0  0.000000  0.000000  0.20
##      [,1] [,2] [,3] [,4] [,5]
## [1,]    1    0    0    0    0
## [2,]    1    1    0    0    0
## [3,]    1    1    1    0    0
## [4,]    1    1    1    1    0
## [5,]    1    1    1    1    1
## [1] 1
##      [,1] [,2] [,3] [,4] [,5]
## [1,]    1    0    0    0    0
## [2,]   -1    1    0    0    0
## [3,]    0   -1    1    0    0
## [4,]    0    0   -1    1    0
## [5,]    0    0    0   -1    1
##      [,1] [,2] [,3] [,4] [,5]
## [1,]    1    2    3    4    5
## [2,]    1    4    6    8   10
## [3,]    1    4    9   12   15
## [4,]    1    4    9   16   20
## [5,]    1    4    9   16   25

Testing matrix \({\begin{bmatrix}1&2&3&4&5 \\1&4&6&8&10 \\1&4&9&12&15 \\1&4&9&16&20 \\1&4&9&16&25 \\\end{bmatrix}}\) :

Get the result using my LU algorithm

## $L
##      [,1] [,2] [,3] [,4] [,5]
## [1,]    1    0    0    0    0
## [2,]    1    1    0    0    0
## [3,]    1    1    1    0    0
## [4,]    1    1    1    1    0
## [5,]    1    1    1    1    1
## 
## $U
##      [,1] [,2] [,3] [,4] [,5]
## [1,]    1    2    3    4    5
## [2,]    0    2    3    4    5
## [3,]    0    0    3    4    5
## [4,]    0    0    0    4    5
## [5,]    0    0    0    0    5

My result is \[L = \begin{bmatrix}1&0&0&0&0 \\1&1&0&0&0 \\1&1&1&0&0 \\1&1&1&1&0 \\1&1&1&1&1 \\\end{bmatrix} ; U = \begin{bmatrix}1&2&3&4&5 \\0&2&3&4&5 \\0&0&3&4&5 \\0&0&0&4&5 \\0&0&0&0&5 \\\end{bmatrix}; LU = \begin{bmatrix}1&2&3&4&5 \\1&4&6&8&10 \\1&4&9&12&15 \\1&4&9&16&20 \\1&4&9&16&25 \\\end{bmatrix} \]

##      [,1] [,2] [,3] [,4] [,5]
## [1,] TRUE TRUE TRUE TRUE TRUE
## [2,] TRUE TRUE TRUE TRUE TRUE
## [3,] TRUE TRUE TRUE TRUE TRUE
## [4,] TRUE TRUE TRUE TRUE TRUE
## [5,] TRUE TRUE TRUE TRUE TRUE
##      [,1] [,2] [,3] [,4] [,5]
## [1,] TRUE TRUE TRUE TRUE TRUE
## [2,] TRUE TRUE TRUE TRUE TRUE
## [3,] TRUE TRUE TRUE TRUE TRUE
## [4,] TRUE TRUE TRUE TRUE TRUE
## [5,] TRUE TRUE TRUE TRUE TRUE
## $L
##      [,1] [,2] [,3] [,4] [,5]
## [1,]    1    0    0    0    0
## [2,]    1    1    0    0    0
## [3,]    1    1    1    0    0
## [4,]    1    1    1    1    0
## [5,]    1    1    1    1    1
## 
## $U
##      [,1] [,2] [,3] [,4] [,5]
## [1,]    1    2    3    4    5
## [2,]    0    2    3    4    5
## [3,]    0    0    3    4    5
## [4,]    0    0    0    4    5
## [5,]    0    0    0    0    5

Pracma result is \[L = \begin{bmatrix}1&0&0&0&0 \\1&1&0&0&0 \\1&1&1&0&0 \\1&1&1&1&0 \\1&1&1&1&1 \\\end{bmatrix} ; U = \begin{bmatrix}1&2&3&4&5 \\0&2&3&4&5 \\0&0&3&4&5 \\0&0&0&4&5 \\0&0&0&0&5 \\\end{bmatrix}; LU = \begin{bmatrix}1&2&3&4&5 \\1&4&6&8&10 \\1&4&9&12&15 \\1&4&9&16&20 \\1&4&9&16&25 \\\end{bmatrix} \]

##      [,1] [,2] [,3] [,4] [,5]
## [1,] TRUE TRUE TRUE TRUE TRUE
## [2,] TRUE TRUE TRUE TRUE TRUE
## [3,] TRUE TRUE TRUE TRUE TRUE
## [4,] TRUE TRUE TRUE TRUE TRUE
## [5,] TRUE TRUE TRUE TRUE TRUE
##      [,1] [,2] [,3] [,4] [,5]
## [1,] TRUE TRUE TRUE TRUE TRUE
## [2,] TRUE TRUE TRUE TRUE TRUE
## [3,] TRUE TRUE TRUE TRUE TRUE
## [4,] TRUE TRUE TRUE TRUE TRUE
## [5,] TRUE TRUE TRUE TRUE TRUE

Test a different 5x5 matrix

##      [,1] [,2] [,3] [,4] [,5]
## [1,]    9    3    2    4    1
## [2,]    0    6    9   -3    5
## [3,]    0    0    4    2   -1
## [4,]    0    0    0    7    5
## [5,]    0    0    0    0  -12
## [1] -18144
##          [,1]       [,2]       [,3]       [,4]       [,5]
## [1,] 0.111111 -0.0555556  0.0694444 -0.1071429 -0.0643188
## [2,] 0.000000  0.1666667 -0.3750000  0.1785714  0.1750992
## [3,] 0.000000  0.0000000  0.2500000 -0.0714286 -0.0505952
## [4,] 0.000000  0.0000000  0.0000000  0.1428571  0.0595238
## [5,] 0.000000  0.0000000  0.0000000  0.0000000 -0.0833333
##      [,1] [,2] [,3] [,4] [,5]
## [1,]    1    0    0    0    0
## [2,]   -1    1    0    0    0
## [3,]   99    5    1    0    0
## [4,]  101   22  -66    1    0
## [5,]   77  -55   33  -22    1
## [1] 1
##         [,1]                      [,2]                       [,3]                      [,4] [,5]
## [1,]       1    -0.0000000000000025678    0.000000000000000267952  0.0000000000000000102878    0
## [2,]       1     0.9999999999999962252   -0.000000000000003167987 -0.0000000000000000614500    0
## [3,]    -104    -4.9999999999999831246    0.999999999999995559108 -0.0000000000000000955764    0
## [4,]   -6987  -351.9999999999985220711   65.999999999999744204615  0.9999999999999940047957    0
## [5,] -150304 -7523.9999999999681676854 1418.999999999994315658114 21.9999999999998685495939    1
##      [,1] [,2] [,3] [,4] [,5]
## [1,]    9    3    2    4    1
## [2,]   -9    3    7   -7    4
## [3,]  891  327  247  383  123
## [4,]  909  435  136  213  282
## [5,]  693  -99 -209  385 -353

Testing matrix \({\begin{bmatrix}9&3&2&4&1 \\-9&3&7&-7&4 \\891&327&247&383&123 \\909&435&136&213&282 \\693&-99&-209&385&-353 \\\end{bmatrix}}\) :

Get the result using my LU algorithm

## $L
##      [,1] [,2] [,3] [,4] [,5]
## [1,]    1    0    0    0    0
## [2,]   -1    1    0    0    0
## [3,]   99    5    1    0    0
## [4,]  101   22  -66    1    0
## [5,]   77  -55   33  -22    1
## 
## $U
##      [,1] [,2] [,3] [,4] [,5]
## [1,]    9    3    2    4    1
## [2,]    0    6    9   -3    5
## [3,]    0    0    4    2   -1
## [4,]    0    0    0    7    5
## [5,]    0    0    0    0  -12

My result is \[L = \begin{bmatrix}1&0&0&0&0 \\-1&1&0&0&0 \\99&5&1&0&0 \\101&22&-66&1&0 \\77&-55&33&-22&1 \\\end{bmatrix} ; U = \begin{bmatrix}9&3&2&4&1 \\0&6&9&-3&5 \\0&0&4&2&-1 \\0&0&0&7&5 \\0&0&0&0&-12 \\\end{bmatrix} ; LU = \begin{bmatrix}9&3&2&4&1 \\-9&3&7&-7&4 \\891&327&247&383&123 \\909&435&136&213&282 \\693&-99&-209&385&-353 \\\end{bmatrix} \]

##      [,1] [,2] [,3] [,4] [,5]
## [1,] TRUE TRUE TRUE TRUE TRUE
## [2,] TRUE TRUE TRUE TRUE TRUE
## [3,] TRUE TRUE TRUE TRUE TRUE
## [4,] TRUE TRUE TRUE TRUE TRUE
## [5,] TRUE TRUE TRUE TRUE TRUE
##      [,1] [,2] [,3] [,4] [,5]
## [1,] TRUE TRUE TRUE TRUE TRUE
## [2,] TRUE TRUE TRUE TRUE TRUE
## [3,] TRUE TRUE TRUE TRUE TRUE
## [4,] TRUE TRUE TRUE TRUE TRUE
## [5,] TRUE TRUE TRUE TRUE TRUE
## $L
##      [,1] [,2] [,3] [,4] [,5]
## [1,]    1    0    0    0    0
## [2,]   -1    1    0    0    0
## [3,]   99    5    1    0    0
## [4,]  101   22  -66    1    0
## [5,]   77  -55   33  -22    1
## 
## $U
##      [,1] [,2] [,3] [,4] [,5]
## [1,]    9    3    2    4    1
## [2,]    0    6    9   -3    5
## [3,]    0    0    4    2   -1
## [4,]    0    0    0    7    5
## [5,]    0    0    0    0  -12

Pracma result is \[L = \begin{bmatrix}1&0&0&0&0 \\-1&1&0&0&0 \\99&5&1&0&0 \\101&22&-66&1&0 \\77&-55&33&-22&1 \\\end{bmatrix} ; U = \begin{bmatrix}9&3&2&4&1 \\0&6&9&-3&5 \\0&0&4&2&-1 \\0&0&0&7&5 \\0&0&0&0&-12 \\\end{bmatrix} ; LU = \begin{bmatrix}9&3&2&4&1 \\-9&3&7&-7&4 \\891&327&247&383&123 \\909&435&136&213&282 \\693&-99&-209&385&-353 \\\end{bmatrix} \]

##      [,1] [,2] [,3] [,4] [,5]
## [1,] TRUE TRUE TRUE TRUE TRUE
## [2,] TRUE TRUE TRUE TRUE TRUE
## [3,] TRUE TRUE TRUE TRUE TRUE
## [4,] TRUE TRUE TRUE TRUE TRUE
## [5,] TRUE TRUE TRUE TRUE TRUE
##      [,1] [,2] [,3] [,4] [,5]
## [1,] TRUE TRUE TRUE TRUE TRUE
## [2,] TRUE TRUE TRUE TRUE TRUE
## [3,] TRUE TRUE TRUE TRUE TRUE
## [4,] TRUE TRUE TRUE TRUE TRUE
## [5,] TRUE TRUE TRUE TRUE TRUE

In each of the examples tested, MYLU gives the same results as the lu function from the pracma package.

In every case the computed L and U match the factors from which the test matrix was constructed, and their product reproduces the original matrix.