Deep Learning Notes: Chapter 2, Linear Algebra

1. Multiplying Matrices and Vectors

The matrix product \(C=AB\) is defined by:

\[\begin{equation} C_{i,j} = \sum_k A_{i,k}B_{k,j} \end{equation} \]
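A quick numerical check of this definition, as a minimal NumPy sketch (the matrices are made-up examples):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])             # 2x2
B = np.array([[5.0, 6.0, 7.0], [8.0, 9.0, 10.0]])  # 2x3

# C_{i,j} = sum_k A_{i,k} B_{k,j}, computed element by element.
C = np.zeros((A.shape[0], B.shape[1]))
for i in range(A.shape[0]):
    for j in range(B.shape[1]):
        C[i, j] = sum(A[i, k] * B[k, j] for k in range(A.shape[1]))

assert np.allclose(C, A @ B)  # matches NumPy's built-in matrix product
```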

The dot product of vectors \(x\) and \(y\):

\[\begin{equation} x^Ty \end{equation} \]
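In the same sketch style, the dot product is just the sum of elementwise products:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])

# x^T y as a sum of elementwise products.
assert np.isclose(np.dot(x, y), sum(a * b for a, b in zip(x, y)))  # 32.0
```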

2. Linear Dependence and Span

It is not possible to have more than one but fewer than infinitely many solutions for a particular \(b\): if both \(x\) and \(y\) are solutions, then \(z = \alpha x+(1-\alpha)y\) is also a solution for any real \(\alpha\).
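To see why, substitute \(z\) into the system and use \(Ax = Ay = b\):

\[\begin{align} Az = A(\alpha x+(1-\alpha)y) = \alpha Ax + (1-\alpha)Ay = \alpha b + (1-\alpha)b = b \end{align} \]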

We can rewrite \(Ax\) as a linear combination of the columns of \(A\):

\[\begin{equation} Ax = \sum_{i}x_iA_{:,i} \end{equation} \]
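The column view is easy to verify numerically (a small sketch with made-up values):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])  # 3x2
x = np.array([10.0, -1.0])

# Ax as a weighted sum of the columns A[:, i].
combo = sum(x[i] * A[:, i] for i in range(A.shape[1]))
assert np.allclose(A @ x, combo)
```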

\(\textbf{Span}\): the span of a set of vectors is the set of all points obtainable by linear combination of the original vectors.

\(\textbf{Linear combination}\):

\[\begin{equation} \sum_i c_iv_i \end{equation} \]

Determining whether \(Ax=b\) has a solution thus amounts to testing whether \(b\) is in the span of the columns of \(A\), where \(A\in \mathbb{R}^{m\times n}\) and \(b\in\mathbb{R}^m\). For a solution to exist for every \(b\in\mathbb{R}^m\), the column span must be all of \(\mathbb{R}^m\), so \(A\) must have at least \(m\) columns, that is, \(n\geq m\).
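One way to run this test numerically is to compare the rank of \(A\) with the rank of the augmented matrix \([A\ \ b]\): appending \(b\) leaves the rank unchanged exactly when \(b\) is already in the column span. A sketch with made-up matrices:

```python
import numpy as np

A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # 3x2: columns span a plane in R^3
b_in = np.array([2.0, 3.0, 5.0])   # 2*col1 + 3*col2, so it lies in the span
b_out = np.array([2.0, 3.0, 0.0])  # off the plane

def in_span(A, b):
    # b is in the column span iff appending it does not increase the rank.
    return np.linalg.matrix_rank(np.column_stack([A, b])) == np.linalg.matrix_rank(A)

print(in_span(A, b_in))   # True
print(in_span(A, b_out))  # False
```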

3. Norms

The \(L^p\) norm, defined for \(p\geq 1\):

\[\begin{equation} \|x\|_p = \left(\sum_i|x_i|^p\right)^{1/p} \end{equation} \]

The Frobenius norm for matrices:

\[\begin{equation} \|A\|_F = \sqrt{\sum_{i,j}A_{i,j}^2} \end{equation} \]
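Both norms are available in NumPy; a sketch checking them against the definitions above (illustrative values):

```python
import numpy as np

x = np.array([3.0, -4.0])
A = np.array([[1.0, 2.0], [3.0, 4.0]])

# L^p norm from the definition, here with p = 2.
p = 2
assert np.isclose((np.abs(x) ** p).sum() ** (1 / p), np.linalg.norm(x, p))  # 5.0

# Frobenius norm from the definition.
assert np.isclose(np.sqrt((A ** 2).sum()), np.linalg.norm(A, "fro"))
```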

4. Eigendecomposition

\[\begin{equation} Av=\lambda v \end{equation} \]

where \(v\) is an eigenvector of \(A\) and \(\lambda\) is the corresponding eigenvalue.

\(\textbf{Eigendecomposition}\):

\[\begin{equation} A = V\,\text{diag}(\lambda)\,V^{-1} \end{equation} \]

where \(V = [v^{(1)},v^{(2)},\dots,v^{(n)}]\) concatenates the eigenvectors as columns and \(\lambda = [\lambda_1,\lambda_2,\dots,\lambda_n]^T\) collects the eigenvalues.

Every real symmetric matrix can be decomposed into an expression using only real-valued eigenvectors and eigenvalues:

\[\begin{equation} A = Q\Lambda Q^T \end{equation} \]

where \(Q\) is an orthogonal matrix whose columns are eigenvectors of \(A\) and \(\Lambda\) is the diagonal matrix of eigenvalues.
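A numerical check of both decompositions (a sketch; `np.linalg.eigh` handles the symmetric case and returns real eigenvalues with orthonormal eigenvectors):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 3.0]])  # symmetric, so both forms apply

# General eigendecomposition A = V diag(lambda) V^{-1}.
lam, V = np.linalg.eig(A)
assert np.allclose(A, V @ np.diag(lam) @ np.linalg.inv(V))

# Symmetric case A = Q Lambda Q^T with orthogonal Q.
lam_s, Q = np.linalg.eigh(A)
assert np.allclose(A, Q @ np.diag(lam_s) @ Q.T)
assert np.allclose(Q @ Q.T, np.eye(2))  # Q is orthogonal
```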

5. SVD

\[\begin{equation} A = UDV^T \end{equation} \]

where \(A\in\mathbb{R}^{m\times n}\), \(U\in\mathbb{R}^{m\times m}\), \(D\in\mathbb{R}^{m\times n}\), and \(V\in\mathbb{R}^{n\times n}\). \(U\) and \(V\) are orthogonal, and \(D\) is diagonal. Using \(U^TU=I\):

\[\begin{align} A^TA &= VD^TDV^T\\ &= V\Lambda V^T \end{align} \]

This means that the right singular vectors of \(A\) are the eigenvectors of \(A^TA\), and the eigenvalues of \(A^TA\) are the squared singular values.
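This is easy to verify numerically (a sketch; note that NumPy's `svd` returns \(V^T\) directly and the singular values as a vector rather than the full \(D\)):

```python
import numpy as np

A = np.random.default_rng(0).normal(size=(4, 3))

U, s, Vt = np.linalg.svd(A)        # U: 4x4, s: singular values, Vt: 3x3
D = np.zeros(A.shape)
D[:len(s), :len(s)] = np.diag(s)   # embed the singular values in an m x n matrix
assert np.allclose(A, U @ D @ Vt)

# The right singular vectors diagonalize A^T A with eigenvalues s^2.
assert np.allclose(Vt @ (A.T @ A) @ Vt.T, np.diag(s ** 2))
```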

6. Moore-Penrose Pseudoinverse

To solve a linear equation:

\[\begin{equation} Ax=y \end{equation} \]

the pseudoinverse of \(A\) is defined as:

\[\begin{equation} A^+ = \lim_{\alpha\rightarrow 0}(A^TA+\alpha I)^{-1}A^T \end{equation} \]
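In practice `np.linalg.pinv` computes \(A^+\) via the SVD; the limit above can also be approximated directly with a small \(\alpha\) (a sketch with a made-up overdetermined system):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])  # 3 equations, 2 unknowns
y = np.array([1.0, 2.0, 2.0])

# Approximate the limit definition with a small regularizer alpha.
alpha = 1e-10
A_plus = np.linalg.inv(A.T @ A + alpha * np.eye(A.shape[1])) @ A.T

assert np.allclose(A_plus, np.linalg.pinv(A))
x = A_plus @ y  # minimizes ||Ax - y||_2 when no exact solution exists
```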

7. Trace Operation

\[\begin{align} \text{Tr}(A) &= \sum_i A_{i,i}\\ \|A\|_F &= \sqrt{\text{Tr}(AA^T)} \end{align} \]

The trace is also invariant to cyclic permutation of a matrix product, provided each product is defined:

\[\begin{align} \text{Tr}(ABC) = \text{Tr}(CAB) = \text{Tr}(BCA) \end{align} \]
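Both identities can be checked numerically; the cyclic property holds as long as each product is square (random matrices with compatible shapes as an illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
A, B, C = rng.normal(size=(2, 3)), rng.normal(size=(3, 4)), rng.normal(size=(4, 2))

# Cyclic property: ABC (2x2), CAB (4x4), BCA (3x3) share the same trace.
t = np.trace(A @ B @ C)
assert np.isclose(t, np.trace(C @ A @ B))
assert np.isclose(t, np.trace(B @ C @ A))

# Frobenius norm via the trace.
M = rng.normal(size=(3, 3))
assert np.isclose(np.linalg.norm(M, "fro"), np.sqrt(np.trace(M @ M.T)))
```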
