10. Conclusion

10.1 Matrix Factorizations

  1. A = LU = (Lower triangular L with 1's on the diagonal)(Upper triangular U with pivots on the diagonal)

    requirements: No row exchanges as Gaussian elimination reduces square A to U.
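
    A minimal NumPy sketch of this factorization; the matrix and the no-pivot elimination helper are illustrative choices, not from the text:

    ```python
    import numpy as np

    # Plain Gaussian elimination with no row exchanges -- valid only
    # when every pivot turns out to be nonzero.
    def lu_no_pivot(A):
        U = A.astype(float).copy()
        n = U.shape[0]
        L = np.eye(n)
        for k in range(n):
            for i in range(k + 1, n):
                L[i, k] = U[i, k] / U[k, k]   # multiplier goes into L
                U[i, :] -= L[i, k] * U[k, :]  # eliminate below the pivot
        return L, U

    A = np.array([[2.0, 1.0], [6.0, 8.0]])
    L, U = lu_no_pivot(A)
    assert np.allclose(L @ U, A)              # pivots 2 and 5 on diag(U)
    ```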

  2. A = LDU = (Lower triangular L with 1's on the diagonal)(pivot matrix D is diagonal)(Upper triangular U with 1's on the diagonal)

    requirements: No row exchanges. The pivots in D are divided out to leave 1's on the diagonal of U. If A is symmetric then U is \(L^T\) and \(A=LDL^T\).
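
    A short check with SciPy (the symmetric matrix is chosen so that partial pivoting does no row exchanges):

    ```python
    import numpy as np
    from scipy.linalg import lu

    S = np.array([[4.0, 2.0], [2.0, 3.0]])  # symmetric, pivots 4 and 2
    P, L, U = lu(S)                          # P is the identity here
    D = np.diag(np.diag(U))                  # pivot matrix D
    U1 = np.linalg.inv(D) @ U                # divide pivots out: 1's on diagonal
    assert np.allclose(L @ D @ U1, S)        # S = LDU
    assert np.allclose(U1, L.T)              # symmetric case: S = L D L^T
    ```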

  3. PA=LU

    requirements: A permutation matrix P avoids zeros in the pivot positions and does all of the row exchanges on A in advance. A is invertible. Then P, L, U are invertible.
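
    A sketch with SciPy; note that scipy.linalg.lu returns factors with A = PLU, so the book's P is SciPy's P transposed:

    ```python
    import numpy as np
    from scipy.linalg import lu

    A = np.array([[0.0, 2.0], [3.0, 4.0]])  # zero in the first pivot position
    P, L, U = lu(A)                          # SciPy convention: A = P @ L @ U
    assert np.allclose(P.T @ A, L @ U)       # the book's PA = LU
    ```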

  4. EA=R (m by m invertible E)(any m by n matrix A) = rref(A)

    requirements: None! The reduced row echelon form R has r pivot rows and pivot columns, containing the identity matrix. The last m-r rows of E are a basis for the left nullspace of A; they multiply A to give m-r zero rows in R. The first r columns of \(E^{-1}\) are a basis for the column space of A.
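
    A sketch with SymPy; row-reducing the augmented block [A I] recovers E (the rank-2 example matrix is an illustrative choice):

    ```python
    from sympy import Matrix, eye, zeros

    A = Matrix([[1, 2, 3], [2, 4, 6], [1, 1, 1]])    # 3 x 3 with rank r = 2
    R, pivot_cols = A.rref()
    # Reduce [A | I]; the right block E records the row operations, so EA = R.
    E = A.row_join(eye(3)).rref()[0][:, 3:]
    assert E * A == R
    assert E.row(2) * A == zeros(1, 3)    # last m - r rows: left nullspace
    ```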

  5. S = \(C^TC\) = (Lower triangular)(Upper triangular) with \(\sqrt{D}\) on both diagonals

    requirements: S is symmetric and positive definite (all n pivots in D are positive). This Cholesky factorization C=chol(S) has \(C^T=L\sqrt{D}\) , so \(S=C^TC=LDL^T\).
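
    A quick NumPy check; np.linalg.cholesky returns the lower factor \(L\sqrt{D}\), whose transpose is the upper C = chol(S):

    ```python
    import numpy as np

    S = np.array([[4.0, 2.0], [2.0, 3.0]])   # symmetric positive definite
    Lc = np.linalg.cholesky(S)                # lower triangular, Lc = L sqrt(D)
    C = Lc.T                                  # upper factor C
    assert np.allclose(C.T @ C, S)            # S = C^T C
    assert np.allclose(np.diag(Lc), np.sqrt([4.0, 2.0]))  # sqrt of the pivots
    ```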

  6. \(A=QR\) = (orthonormal columns in Q) (upper triangular R)

    requirements: A has independent columns. Those are orthogonalized in Q by the Gram-Schmidt or Householder process. If A is square then \(Q^{-1}=Q^{T}\).
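
    A minimal NumPy sketch (the 3 x 2 matrix with independent columns is an illustrative choice):

    ```python
    import numpy as np

    A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])  # independent columns
    Q, R = np.linalg.qr(A)                   # reduced QR: Q is 3x2, R is 2x2
    assert np.allclose(Q.T @ Q, np.eye(2))   # orthonormal columns in Q
    assert np.allclose(Q @ R, A)             # R is upper triangular
    ```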

  7. \(A=X\Lambda X^{-1}\) = (eigenvectors in X) (eigenvalues in \(\Lambda\))(left eigenvectors in \(X^{-1}\))

    requirements: A must have n linearly independent eigenvectors.
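
    A quick NumPy check on a diagonalizable example:

    ```python
    import numpy as np

    A = np.array([[4.0, 1.0], [2.0, 3.0]])   # eigenvalues 5 and 2
    lam, X = np.linalg.eig(A)                 # eigenvectors in the columns of X
    assert np.allclose(X @ np.diag(lam) @ np.linalg.inv(X), A)
    ```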

  8. S = \(Q\Lambda Q^{-1}\) = \(Q\Lambda Q^T\) = (orthogonal matrix Q)(real eigenvalue matrix \(\Lambda\))(\(Q^T\) is \(Q^{-1}\))

    requirements: S is real and symmetric: \(S^T=S\). This is the Spectral Theorem.
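
    A quick check with np.linalg.eigh, which is built for symmetric matrices:

    ```python
    import numpy as np

    S = np.array([[2.0, 1.0], [1.0, 2.0]])    # real symmetric
    lam, Q = np.linalg.eigh(S)                 # real eigenvalues 1 and 3
    assert np.allclose(Q.T @ Q, np.eye(2))     # orthonormal eigenvectors
    assert np.allclose(Q @ np.diag(lam) @ Q.T, S)
    ```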

  9. A = \(B J B^{-1}\) = (generalized eigenvectors in B)(Jordan blocks in J)(\(B^{-1}\))

    requirements: A is any square matrix. This Jordan form J has a block for each independent eigenvector of A. Every block has only one eigenvalue.
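
    A sketch with SymPy (the defective 2 x 2 example is an illustrative choice):

    ```python
    from sympy import Matrix

    A = Matrix([[5, 1], [0, 5]])      # eigenvalue 5, one independent eigenvector
    B, J = A.jordan_form()            # SymPy returns (B, J) with A = B J B^{-1}
    assert J == Matrix([[5, 1], [0, 5]])      # a single 2 x 2 Jordan block
    assert B * J * B.inv() == A
    ```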

  10. A = \(U\Sigma V^T\) = (orthogonal U is \(m \times m\))(\(m \times n\) singular value matrix \(\Sigma\) with \(\sigma_1, \ldots, \sigma_r\) on its diagonal)(orthogonal V is \(n \times n\))

    requirements: None. This Singular Value Decomposition (SVD) has the eigenvectors of \(AA^T\) in U and the eigenvectors of \(A^TA\) in V; \(\sigma_i=\sqrt{\lambda_i(A^TA)}=\sqrt{\lambda_i(AA^T)}\). The singular values are \(\sigma_1 \geq \sigma_2 \geq \cdots \geq \sigma_r > 0\). By column-row multiplication:

    \(A=U_{r}\Sigma_{r} V_{r}^T=\sigma_1 u_1 v_1^{T} + \cdots + \sigma_r u_r v_r^{T}\). If S is symmetric positive definite then \(U=V=Q\), \(\Sigma = \Lambda\), and \(S=Q\Lambda Q^T\).
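
    A quick NumPy check of the full SVD and the column-row expansion:

    ```python
    import numpy as np

    A = np.array([[3.0, 0.0], [4.0, 5.0], [0.0, 0.0]])
    U, s, Vt = np.linalg.svd(A)               # full SVD: U is 3x3, Vt is 2x2
    Sigma = np.zeros((3, 2))
    np.fill_diagonal(Sigma, s)                # singular values on the diagonal
    assert np.allclose(U @ Sigma @ Vt, A)
    # rank-one pieces: sigma_1 u_1 v_1^T + sigma_2 u_2 v_2^T
    A2 = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in range(len(s)))
    assert np.allclose(A2, A)
    ```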

  11. \(A^{+}=V\Sigma^{+} U^T\) = (orthogonal V is \(n \times n\))(\(n \times m\) pseudoinverse of \(\Sigma\) with \(1/\sigma_1,\cdots,1/\sigma_r\) on its diagonal)(orthogonal \(U^T\) is \(m \times m\))

    requirements: None. The pseudoinverse \(A^{+}\) has \(A^{+}A\)= projection onto row space of A and \(AA^{+}\)=projection onto column space. \(A^{+}=A^{-1}\) if A is invertible. The shortest least-squares solution to \(Ax=b\) is \(x^{+}=A^{+}b\). This solves \(A^{T}Ax^{+}=A^{T}b\).
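
    A minimal NumPy sketch (the tall rank-2 matrix and right-hand side are illustrative choices):

    ```python
    import numpy as np

    A = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0]])  # independent columns
    b = np.array([1.0, 0.0, 2.0])
    x_plus = np.linalg.pinv(A) @ b            # shortest least-squares solution
    assert np.allclose(A.T @ A @ x_plus, A.T @ b)        # solves A^T A x = A^T b
    assert np.allclose(x_plus, np.linalg.lstsq(A, b, rcond=None)[0])
    ```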

  12. A = \(QS\) = (orthogonal matrix Q)(symmetric positive definite matrix S)

    requirements: A is invertible. This polar decomposition has \(S^2=A^TA\). The factor S is semidefinite if A is singular. The reverse polar decomposition A=KQ has \(K^2=AA^T\). Both have \(Q=UV^T\) from SVD.
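
    A sketch with scipy.linalg.polar, which returns the unitary factor first for both sides:

    ```python
    import numpy as np
    from scipy.linalg import polar

    A = np.array([[2.0, 1.0], [0.0, 3.0]])    # invertible
    Q, S = polar(A)                            # right polar: A = Q S
    assert np.allclose(Q @ S, A)
    assert np.allclose(S @ S, A.T @ A)         # S^2 = A^T A
    Q2, K = polar(A, side="left")              # reverse polar: A = K Q2
    assert np.allclose(K @ K, A @ A.T)         # K^2 = A A^T
    assert np.allclose(Q, Q2)                  # same Q = U V^T from the SVD
    ```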

  13. A = \(U\Lambda U^{-1}\) = (unitary U)(eigenvalue matrix \(\Lambda\))(\(U^{-1}\) which is \(U^{H}=\overline{U}^T\))

    requirements: A is normal: \(AA^H=A^HA\). Its orthonormal (and possibly complex) eigenvectors are the columns of U. The \(\lambda\)'s are complex unless \(A=A^H\) (the Hermitian case).
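
    A quick NumPy check on a normal (but not symmetric) rotation matrix:

    ```python
    import numpy as np

    A = np.array([[0.0, -1.0], [1.0, 0.0]])    # rotation: A A^H = A^H A
    assert np.allclose(A @ A.T, A.T @ A)
    lam, U = np.linalg.eig(A)                   # complex eigenvalues +i and -i
    assert np.allclose(U @ np.conj(U.T), np.eye(2))   # unitary U
    assert np.allclose(U @ np.diag(lam) @ np.conj(U.T), A)
    ```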

  14. A = \(QTQ^{-1}\) = (unitary Q)(triangular T with \(\lambda's\) on diagonal)(\(Q^{-1}=Q^H\))

    requirements: Schur triangularization of any square A. There is a matrix Q with orthonormal columns that makes \(Q^{-1}AQ\) triangular.
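
    A sketch with scipy.linalg.schur in complex mode:

    ```python
    import numpy as np
    from scipy.linalg import schur

    A = np.array([[1.0, 2.0], [3.0, 4.0]])
    T, Q = schur(A, output="complex")          # A = Q T Q^H, T upper triangular
    assert np.allclose(Q @ T @ np.conj(Q.T), A)
    assert np.allclose(np.tril(T, -1), 0)      # eigenvalues on the diagonal of T
    ```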

  15. \(F_n = \left [ \begin{matrix} I&D \\ I&-D \end{matrix}\right] \left [ \begin{matrix} F_{n/2}& \\ &F_{n/2} \end{matrix}\right] \left [ \begin{matrix} \text{even-odd} \\ \text{permutation} \end{matrix}\right]\) = one step of the recursive FFT.

    requirements: \(F_n\) = Fourier matrix with entries \(w^{jk}\) where \(w^n=1\): \(F_n\overline{F_n}=nI\). D has \(1, w, \ldots, w^{n/2 - 1}\) on its diagonal. For \(n=2^l\) the Fast Fourier Transform computes \(F_nx\) with only \(\frac{1}{2}nl=\frac{1}{2}n\log_2 n\) multiplications from \(l\) stages of D's.
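
    A NumPy sketch verifying one butterfly step for n = 4 with \(w=e^{2\pi i/4}\) (the construction below is an illustrative assumption):

    ```python
    import numpy as np

    def fourier(n):
        j = np.arange(n)
        return np.exp(2j * np.pi / n) ** np.outer(j, j)   # entries w^{jk}

    n = 4
    w = np.exp(2j * np.pi / n)
    I = np.eye(n // 2)
    D = np.diag(w ** np.arange(n // 2))        # diag(1, w, ..., w^{n/2-1})
    left = np.block([[I, D], [I, -D]])
    mid = np.kron(np.eye(2), fourier(n // 2))  # two copies of F_{n/2}
    P = np.eye(n)[[0, 2, 1, 3]]                # even-odd permutation
    assert np.allclose(left @ mid @ P, fourier(n))
    assert np.allclose(fourier(n) @ np.conj(fourier(n)), n * np.eye(n))
    ```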

10.2 Six Great Theorems of Linear Algebra

Dimension Theorem: All bases for a vector space have the same number of vectors.

Counting Theorem: Dimension of column space + dimension of nullspace = number of columns.

Rank Theorem: Dimension of column space = dimension of row space = rank.

Fundamental Theorem: The row space and nullspace of A are orthogonal complements in \(R^n\); the column space and left nullspace of A are orthogonal complements in \(R^m\).

SVD: There are orthonormal bases (\(v's\) and \(u's\) for the row and column spaces) so that \(Av_i=\sigma_iu_i\).

Spectral Theorem: If \(A^T=A\) there are orthonormal \(q's\) so that \(Aq_i=\lambda_iq_i\) and \(A=Q\Lambda Q^T\).
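
A small numerical illustration of the Counting, Rank, and Fundamental Theorems, assuming SciPy's null_space helper (the 2 x 4 example matrix is an illustrative choice):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 0.0, 1.0],
              [2.0, 4.0, 1.0, 3.0]])      # 2 x 4 with rank r = 2
r = np.linalg.matrix_rank(A)
N = null_space(A)                          # orthonormal basis of the nullspace
assert N.shape[1] == A.shape[1] - r        # Counting: dim N(A) = n - r
assert np.linalg.matrix_rank(A.T) == r     # Rank: row rank = column rank
assert np.allclose(A @ N, 0)               # Fundamental: N(A) orthogonal to rows
```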

10.3 Nonsingular vs. Singular

| Nonsingular | Singular |
| --- | --- |
| A is invertible | A is not invertible |
| The columns are independent | The columns are dependent |
| The rows are independent | The rows are dependent |
| The determinant is not zero | The determinant is zero |
| Ax = 0 has one solution x = 0 | Ax = 0 has infinitely many solutions |
| Ax = b has one solution \(x=A^{-1}b\) | Ax = b has no solution or infinitely many |
| A has n (nonzero) pivots | A has r < n pivots |
| A has full rank r = n | A has rank r < n |
| The reduced row echelon form is R = I | R has at least one zero row |
| The column space is all of \(R^n\) | The column space has dimension r < n |
| The row space is all of \(R^n\) | The row space has dimension r < n |
| All eigenvalues are nonzero | Zero is an eigenvalue of A |
| \(A^TA\) is symmetric positive definite | \(A^TA\) is only semidefinite |
| A has n (positive) singular values | A has r < n singular values |
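
A small NumPy contrast between the two columns (both 2 x 2 matrices are illustrative choices):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])    # nonsingular: det = -2
B = np.array([[1.0, 2.0], [2.0, 4.0]])    # singular: dependent rows
assert np.linalg.matrix_rank(A) == 2 and np.linalg.matrix_rank(B) == 1
assert np.isclose(np.linalg.det(B), 0.0)
assert np.linalg.eigvalsh(A.T @ A).min() > 0               # positive definite
assert np.isclose(np.linalg.eigvalsh(B.T @ B).min(), 0.0)  # only semidefinite
```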
