Ronald WESONGA (PhD)
FALL 2024
Given \[n=10,~~ S=\begin{bmatrix}25 & 6 & 3\\ 6 & 16 & 10 \\3 & 10 & 9 \end{bmatrix}\]
Find the standard deviation matrix \(D^{\frac{1}{2}}\) and the sample correlation matrix \(R\).
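A possible worked computation, using the relation \(R=D^{-\frac{1}{2}}~S~D^{-\frac{1}{2}}\) given further below: \[D^{\frac{1}{2}}=\begin{bmatrix}\sqrt{25} & 0 & 0\\ 0 & \sqrt{16} & 0 \\ 0 & 0 & \sqrt{9}\end{bmatrix}=\begin{bmatrix}5 & 0 & 0\\ 0 & 4 & 0 \\ 0 & 0 & 3\end{bmatrix}\] \[R=D^{-\frac{1}{2}}~S~D^{-\frac{1}{2}}=\begin{bmatrix}1 & \frac{6}{5\cdot 4} & \frac{3}{5\cdot 3}\\ \frac{6}{5\cdot 4} & 1 & \frac{10}{4\cdot 3} \\ \frac{3}{5\cdot 3} & \frac{10}{4\cdot 3} & 1\end{bmatrix}=\begin{bmatrix}1 & 0.30 & 0.20\\ 0.30 & 1 & 0.83 \\ 0.20 & 0.83 & 1\end{bmatrix}\]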
Mean Vector
\[\bar{x}_i=\frac{x_{1i}\cdot 1+\cdots+x_{ni}\cdot 1}{n}=\frac{y_i^{'}1}{n}\] where \(y_i\) denotes the \(i\)th column of the data matrix \(X\) (the \(n\) observations on variable \(i\)). \[\bar{x}=\begin{bmatrix}\bar{x}_1\\ \vdots \\ \bar{x}_p\end{bmatrix}=\begin{bmatrix}\frac{y_1^{'}1}{n}\\ \vdots \\ \frac{y_p^{'}1}{n}\end{bmatrix}\]
\[\bar{x}=\frac{1}{n}\begin{bmatrix} x_{11} & \cdots & x_{1n} \\ \vdots & \ddots & \vdots \\ x_{p1} & \cdots & x_{pn} \end{bmatrix} \begin{bmatrix} 1 \\ \vdots \\ 1 \end{bmatrix}\]
\[\bar{x} = \frac{1}{n} X^{'} 1\]
Matrix of Means
\[1\bar{X}^{'}=\frac{1}{n}11^{'}X=\begin{bmatrix} \bar{X}_1 & \cdots & \bar{X}_p \\ \vdots & \ddots & \vdots\\ \bar{X}_1 & \cdots & \bar{X}_p \end{bmatrix}\]
Matrix of Deviations
\[X-1\bar{X}^{'}=X-\frac{1}{n}11^{'}X=\begin{bmatrix} X_{11}-\bar{X}_1 & \cdots & X_{1p}-\bar{X}_p \\ \vdots & \ddots & \vdots\\ X_{n1}-\bar{X}_1 & \cdots & X_{np}-\bar{X}_p \end{bmatrix}\]
Matrix of Sums of Squares and Cross-Products (SS and CP)
\[(n-1)S = (X-\frac{1}{n}11^{'}X)^{'}(X-\frac{1}{n}11^{'}X)\] \[S = \frac{1}{n-1}X^{'}(I-\frac{1}{n}11^{'})X\] \[S=D^{\frac{1}{2}}~~R~~ D^{\frac{1}{2}}\] \[R=D^{-\frac{1}{2}}~~S~~ D^{-\frac{1}{2}}\]
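A minimal numerical sketch of these formulas (the data matrix `X` below is hypothetical, chosen only for illustration):

```python
import numpy as np

# Hypothetical data matrix: n = 4 observations (rows) on p = 3 variables (columns).
X = np.array([[1.0, 2.0, 5.0],
              [4.0, 1.0, 6.0],
              [4.0, 0.0, 4.0],
              [3.0, 5.0, 2.0]])
n, p = X.shape
ones = np.ones((n, 1))

x_bar = (X.T @ ones) / n                                   # mean vector  x_bar = (1/n) X'1
S = X.T @ (np.eye(n) - ones @ ones.T / n) @ X / (n - 1)    # S = (1/(n-1)) X'(I - (1/n)11')X
D_half = np.diag(np.sqrt(np.diag(S)))                      # D^{1/2}: diagonal matrix of standard deviations
D_half_inv = np.diag(1.0 / np.sqrt(np.diag(S)))
R = D_half_inv @ S @ D_half_inv                            # R = D^{-1/2} S D^{-1/2}

assert np.allclose(S, np.cov(X, rowvar=False))   # agrees with numpy's sample covariance
assert np.allclose(S, D_half @ R @ D_half)       # recovers S = D^{1/2} R D^{1/2}
print(x_bar.ravel(), S, R, sep="\n\n")
```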
Let \[X=\begin{bmatrix}1 & 2 & 5 \\ 4 & 1 & 6 \\ 4 & 0 & 4 \end{bmatrix}\]
\[b^{'}X=2X_1+2X_2-X_3\] \[c^{'}X=X_1-X_2-3X_3\]
Given the data matrix \(X\) above, calculate the sample means and variances of \(b^{'}X\) and \(c^{'}X\), and the sample covariance between them.
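A worked solution (assuming the task is as stated above; the rows of \(X\) are the observations \(x_1^{'}, x_2^{'}, x_3^{'}\)): \[b^{'}x_1=2(1)+2(2)-5=1,~~ b^{'}x_2=2(4)+2(1)-6=4,~~ b^{'}x_3=2(4)+2(0)-4=4\] \[c^{'}x_1=1-2-3(5)=-16,~~ c^{'}x_2=4-1-3(6)=-15,~~ c^{'}x_3=4-0-3(4)=-8\] \[\text{sample mean of } b^{'}X=\frac{1+4+4}{3}=3,~~~ \text{sample variance}=\frac{(1-3)^2+(4-3)^2+(4-3)^2}{3-1}=3\] \[\text{sample mean of } c^{'}X=\frac{-16-15-8}{3}=-13,~~~ \text{sample variance}=\frac{(-3)^2+(-2)^2+(5)^2}{3-1}=19\] \[\text{sample covariance}=\frac{(-2)(-3)+(1)(-2)+(1)(5)}{3-1}=\frac{9}{2}\]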
An array \(x\) of \(n\) real numbers \(\{x_1, x_2, \cdots, x_n\}\) is called a vector: \[x=\begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}~~ or~~ x^{'}=\begin{bmatrix} x_1 & x_2 & \cdots & x_n \end{bmatrix}\]
The following operations are possible: the scalar multiple \(cx\), the sum \(x+y\), and the length \(L_x=\sqrt{x^{'}x}\), which satisfies \(L_{cx}=|c|L_x\). The angle \(\theta\) between \(x\) and \(y\) is given by \[cos(\theta)=\frac{x_1y_1+x_2y_2+\cdots+x_ny_n}{L_xL_y}=\frac{x^{'}y}{L_xL_y}=\frac{x^{'}y}{\sqrt{x^{'}x}\sqrt{y^{'}y}}\]
Example 3.1 (Calculating lengths of vectors and the angle between them) Given the vectors \(x^{'} = [1,3,2]\) and \(y^{'} = [-2,1, -1]\), find \(3x\) and \(x + y\). Next, determine the length of \(x\), the length of \(y\), and the angle between \(x\) and \(y\). Also, check that the length of \(3x\) is three times the length of \(x\).
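A worked solution: \[3x=\begin{bmatrix}3\\ 9\\ 6\end{bmatrix},~~~ x+y=\begin{bmatrix}-1\\ 4\\ 1\end{bmatrix}\] \[L_x=\sqrt{1^2+3^2+2^2}=\sqrt{14},~~ L_y=\sqrt{(-2)^2+1^2+(-1)^2}=\sqrt{6},~~ x^{'}y=-2+3-2=-1\] \[cos(\theta)=\frac{-1}{\sqrt{14}\sqrt{6}}\approx -0.109 ~~\Rightarrow~~ \theta\approx 96.3^{\circ}\] \[L_{3x}=\sqrt{3^2+9^2+6^2}=\sqrt{126}=3\sqrt{14}=3L_x\]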
A pair of vectors \(x\) and \(y\) of the same dimension is said to be linearly dependent if there exist constants \(c_1\) and \(c_2\), not both zero, such that \(c_1x+c_2y = 0\). A set of vectors \(x_1, x_2, \cdots , x_k\) is said to be linearly dependent if there exist constants \(c_1, c_2, \cdots, c_k\), not all zero, such that \(c_1x_1+c_2x_2+\cdots+c_kx_k = 0\).
Linear dependence implies that at least one vector in the set can be written as a linear combination of the other vectors. Vectors of the same dimension that are not linearly dependent are said to be linearly independent.
Projection (or shadow) of a vector \(x\) on a vector \(y\) is \[\frac{(x^{'}y)}{y^{'}y}y=\frac{(x^{'}y)}{L_y}\frac{1}{L_y}y\]
Length of projection \(\frac{|x^{'}y|}{L_y}=L_x\left|\frac{x^{'}y}{L_xL_y}\right|=L_x|cos(\theta)|\)
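For instance, with the vectors of Example 3.1, \(x^{'}=[1,3,2]\) and \(y^{'}=[-2,1,-1]\), so that \(x^{'}y=-1\) and \(y^{'}y=6\): \[\frac{(x^{'}y)}{y^{'}y}y=-\frac{1}{6}\begin{bmatrix}-2\\ 1\\ -1\end{bmatrix}=\begin{bmatrix}\frac{1}{3}\\ -\frac{1}{6}\\ \frac{1}{6}\end{bmatrix},~~~ \text{length of projection}=\frac{|x^{'}y|}{L_y}=\frac{1}{\sqrt{6}}\]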
Example 3.2 (Identifying linearly independent vectors) Are the following vectors linearly dependent? \[x_1=\begin{bmatrix} 1\\ 2\\ 1 \end{bmatrix} x_2=\begin{bmatrix} 1\\ 0\\ -1 \end{bmatrix} x_3 = \begin{bmatrix} 1\\ -2\\ 1 \end{bmatrix}\]
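A worked check: setting \(c_1x_1+c_2x_2+c_3x_3=0\) gives the system \[c_1+c_2+c_3=0,~~~ 2c_1-2c_3=0,~~~ c_1-c_2+c_3=0\] The second equation gives \(c_1=c_3\); subtracting the third equation from the first gives \(2c_2=0\), so \(c_2=0\); the first equation then gives \(c_1=-c_3\), hence \(c_1=c_3=0\). Since only the trivial solution exists, the vectors are linearly independent.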
An \(n\times p\) matrix is a rectangular array of numbers with \(n\) rows and \(p\) columns: \[ X_{np} = \begin{bmatrix} x_{11} & \cdots & x_{1p} \\ \vdots & \ddots & \vdots \\ x_{n1} & \cdots & x_{np} \end{bmatrix}\]
Example 3.3 (Transpose of a matrix) For \[A_{2\times 3}=\begin{bmatrix} 3& -1 & 2 \\ 1 & 5 & 4\end{bmatrix}\longrightarrow A^{'}_{3\times 2}=\begin{bmatrix} 3 & 1 \\-1 & 5 \\ 2 & 4\end{bmatrix}\]
The following operations are possible \[cA_{n\times p},~~ A+B,~~ A_{n\times k}B_{k\times p},~~ d^{'}Ad\]
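A minimal numerical sketch of these operations (`A` is taken from Example 3.3; `B`, the scalar `c`, the symmetric matrix `A2`, and the vector `d` are hypothetical, chosen only for illustration):

```python
import numpy as np

A = np.array([[3.0, -1.0, 2.0],      # the 2x3 matrix of Example 3.3
              [1.0,  5.0, 4.0]])
B = np.array([[1.0, 0.0, 2.0],       # hypothetical 2x3 matrix
              [2.0, 1.0, 0.0]])
c = 4.0

print(c * A)        # scalar multiple cA
print(A + B)        # sum of matrices of the same dimensions
print(A @ B.T)      # product of a 2x3 and a 3x2 matrix (conformable), result is 2x2
print(A.T)          # transpose A', the 3x2 matrix of Example 3.3

# The quadratic form d'Ad requires a square matrix; use a hypothetical symmetric 2x2 matrix.
A2 = np.array([[2.0, 1.0],
               [1.0, 3.0]])
d = np.array([1.0, -2.0])
print(d @ A2 @ d)   # d'A2d = 2(1)^2 + 2(1)(1)(-2) + 3(-2)^2 = 10
```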
Square matrices will be of special importance in our development of statistical methods. A square matrix is said to be symmetric if \(A = A^{'}\) or \(a_{ij} = a_{ji}\) for all \(i\) and \(j\).
The identity matrix \(I\) acts like \(1\) in ordinary multiplication: \(I A = A I = A_{k\times k}\). Further, if there exists a matrix \(B\) such that \(AB=BA=I\), then \(B\) is the inverse of \(A\), denoted \(A^{-1}\).
A square matrix \(Q\) is orthogonal if \(QQ^{'}=Q^{'}Q=I\), or equivalently \(Q^{'}=Q^{-1}\).
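For example (an illustrative \(2\times 2\) case): \[Q=\begin{bmatrix}\frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}}\\ \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}}\end{bmatrix},~~~ QQ^{'}=Q^{'}Q=\begin{bmatrix}\frac{1}{2}+\frac{1}{2} & \frac{1}{2}-\frac{1}{2}\\ \frac{1}{2}-\frac{1}{2} & \frac{1}{2}+\frac{1}{2}\end{bmatrix}=I\]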
A square matrix \(A\) has an eigenvalue \(\lambda\) with corresponding eigenvector \(x\ne 0\) if \(Ax = \lambda x\). Normalize \(x\) so that it has unit length, that is, \(x^{'}x=1\).
Let \(A_{k×k}\) be a square symmetric matrix. Then \(A\) has \(k\) pairs of eigenvalues and eigenvectors, namely \((\lambda_1, e_1), \cdots,(\lambda_k, e_k)\). The eigenvectors can be chosen to satisfy \(1 = e_1^{'}e_1 = \cdots = e_k^{'}e_k\) and be mutually perpendicular. The eigenvectors are unique unless two or more eigenvalues are equal.
Example 3.8 (Verifying eigenvalues and eigenvectors). Let \(A=\begin{bmatrix}1&-5\\-5&1\end{bmatrix}\). Show that \(\lambda_1=6\) and \(\lambda_2=-4\) are its eigenvalues and that the corresponding eigenvectors are \(e_1=[\frac{1}{\sqrt{2}}, \frac{-1}{\sqrt{2}}]^{'}\) and \(e_2=[\frac{1}{\sqrt{2}}, \frac{1}{\sqrt{2}}]^{'}\).
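A worked verification: the characteristic equation \(|A-\lambda I|=(1-\lambda)^2-25=0\) gives \(\lambda=6\) or \(\lambda=-4\), and \[Ae_1=\begin{bmatrix}1&-5\\-5&1\end{bmatrix}\begin{bmatrix}\frac{1}{\sqrt{2}}\\ -\frac{1}{\sqrt{2}}\end{bmatrix}=\begin{bmatrix}\frac{6}{\sqrt{2}}\\ -\frac{6}{\sqrt{2}}\end{bmatrix}=6e_1,~~~ Ae_2=\begin{bmatrix}1&-5\\-5&1\end{bmatrix}\begin{bmatrix}\frac{1}{\sqrt{2}}\\ \frac{1}{\sqrt{2}}\end{bmatrix}=\begin{bmatrix}\frac{-4}{\sqrt{2}}\\ \frac{-4}{\sqrt{2}}\end{bmatrix}=-4e_2\] with \(e_1^{'}e_1=e_2^{'}e_2=1\) and \(e_1^{'}e_2=0\).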
The study of variation and interrelationships in multivariate data is often based upon distances and the assumption that the data are multivariate normally distributed. Squared distance and the multivariate normal density can be expressed in terms of matrix products called quadratic forms. Consequently, it should not be surprising that quadratic forms play a central role in multivariate analysis. Here we consider quadratic forms that are always nonnegative and the associated positive definite matrices.
Spectral decomposition for symmetric matrices: \[A_{(k\times k)}=\lambda_1e_1e_1^{'}+\cdots+\lambda_ke_ke_k^{'},~~~ \text{where}~~ e_i^{'}e_i=1~~\text{and}~~ e_i^{'}e_j=0~~\text{for}~~ i\ne j,~~ i,j=1,\cdots, k\]
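As an illustration, for the matrix \(A\) of Example 3.8: \[\lambda_1e_1e_1^{'}+\lambda_2e_2e_2^{'}=6\begin{bmatrix}\frac{1}{2} & -\frac{1}{2}\\ -\frac{1}{2} & \frac{1}{2}\end{bmatrix}-4\begin{bmatrix}\frac{1}{2} & \frac{1}{2}\\ \frac{1}{2} & \frac{1}{2}\end{bmatrix}=\begin{bmatrix}1 & -5\\ -5 & 1\end{bmatrix}=A\]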
Because \(x^{'}Ax\) has only square terms \(x_i^{2}\) and cross-product terms \(x_ix_k\), it is called a quadratic form. A \(k\times k\) symmetric matrix \(A\), and its quadratic form, are said to be nonnegative definite when \[0\le x^{'}Ax~~\text{for all vectors}~~x.\] If the equality holds in the equation above only for the vector \(x^{'} = [0, \cdots, 0]\), then \(A\), or the quadratic form, is said to be positive definite: \[0 < x^{'}Ax~~\text{for all}~~x\ne 0.\]
Using the spectral decomposition, we can easily show that a \(k\times k\) symmetric matrix \(A\) is a positive definite matrix if and only if every eigenvalue of \(A\) is positive. \(A\) is a nonnegative definite matrix if and only if all of its eigenvalues are greater than or equal to zero.
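For example (an illustrative \(2\times 2\) case): \[A=\begin{bmatrix}2 & 1\\ 1 & 2\end{bmatrix},~~~ x^{'}Ax=2x_1^2+2x_1x_2+2x_2^2=x_1^2+x_2^2+(x_1+x_2)^2>0~~\text{for}~~x\ne 0,\] and its eigenvalues \(\lambda_1=3\) and \(\lambda_2=1\) are both positive, so \(A\) is positive definite. By contrast, the matrix of Example 3.8 has the eigenvalue \(-4<0\), so it is not nonnegative definite.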
See Example 3.9 and Example 3.10.
In matrix form, the spectral decomposition is \[A=\sum_{i=1}^k\lambda_ie_ie_i^{'}=P\Lambda P^{'},~~~ PP^{'}=P^{'}P=I,~~~ \Lambda=diag(\lambda_1,\cdots,\lambda_k),\] where the columns of \(P\) are the normalized eigenvectors \(e_1,\cdots,e_k\).
From this decomposition, the following operations are possible, as shown below: \[A^{-1},~~ A^{\frac{1}{2}}\]
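These are the standard constructions (\(A^{-1}\) requires all \(\lambda_i\ne 0\); \(A^{\frac{1}{2}}\) requires \(A\) nonnegative definite): \[A^{-1}=P\Lambda^{-1}P^{'}=\sum_{i=1}^k\frac{1}{\lambda_i}e_ie_i^{'},~~~ A^{\frac{1}{2}}=P\Lambda^{\frac{1}{2}}P^{'}=\sum_{i=1}^k\sqrt{\lambda_i}~e_ie_i^{'},~~~ A^{\frac{1}{2}}A^{\frac{1}{2}}=A\]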
Given a random vector \(X^{'}=[X_1,X_2,X_3,X_4,X_5]\) with mean vector \(\mu^{'}=[2,4,-1, 3,0]\) and variance-covariance matrix \[\Sigma_X= \begin{bmatrix}4& -1& \frac{1}{2}& -\frac{1}{2} & 0\\-1&3&1&-1&0 \\ \frac{1}{2} &1&6&1&-1 \\ -\frac{1}{2}&-1&1&4&0 \\ 0&0&-1&0&2\end{bmatrix}\] Partition \[X=\begin{bmatrix} X_1 \\ X_2 \\ X_3 \\ X_4 \\ X_5\end{bmatrix}=\begin{bmatrix}X^{(1)}\\ X^{(2)} \end{bmatrix},~~~ X^{(1)}=\begin{bmatrix}X_1\\ X_2\end{bmatrix},~~~ X^{(2)}=\begin{bmatrix}X_3\\ X_4\\ X_5\end{bmatrix}\]
Let \[A=\begin{bmatrix}1 & -1\\1&1 \end{bmatrix}~~~B=\begin{bmatrix}1 & 1 & 1\\1&1&-2 \end{bmatrix}\] and consider the linear combinations \(AX^{(1)}\) and \(BX^{(2)}\).
Find \(E(AX^{(1)})\), \(Cov(AX^{(1)})\), \(E(BX^{(2)})\), \(Cov(BX^{(2)})\), and \(Cov(AX^{(1)}, BX^{(2)})\).
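A minimal numerical sketch (assuming the task stated above; the blocks \(\Sigma_{11}, \Sigma_{12}, \Sigma_{22}\) and the subvectors \(\mu^{(1)}, \mu^{(2)}\) are read off the partition of \(\Sigma_X\) and \(\mu\)):

```python
import numpy as np

mu1 = np.array([2.0, 4.0])               # E[X^(1)]
mu2 = np.array([-1.0, 3.0, 0.0])         # E[X^(2)]
Sigma11 = np.array([[4.0, -1.0],
                    [-1.0, 3.0]])
Sigma12 = np.array([[0.5, -0.5, 0.0],
                    [1.0, -1.0, 0.0]])
Sigma22 = np.array([[6.0, 1.0, -1.0],
                    [1.0, 4.0,  0.0],
                    [-1.0, 0.0, 2.0]])
A = np.array([[1.0, -1.0],
              [1.0,  1.0]])
B = np.array([[1.0, 1.0,  1.0],
              [1.0, 1.0, -2.0]])

print(A @ mu1)                # E[A X^(1)]            = A mu^(1)      -> [-2, 6]
print(A @ Sigma11 @ A.T)      # Cov(A X^(1))          = A Sigma11 A'  -> [[9, 1], [1, 5]]
print(B @ mu2)                # E[B X^(2)]            = B mu^(2)
print(B @ Sigma22 @ B.T)      # Cov(B X^(2))          = B Sigma22 B'
print(A @ Sigma12 @ B.T)      # Cov(A X^(1), B X^(2)) = A Sigma12 B'
```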