Eigenvectors (Eigenvalues_and_eigenvectors#Graphs) and linear transformations

Summary:

1. Linear transformations are closed under the operations of addition and scalar multiplication.

2. An eigenvector does not change its direction when the linear transformation is applied.

 

https://en.wikipedia.org/wiki/Linear_map

Examples of linear transformation matrices

In two-dimensional space R², linear maps are described by 2 × 2 real matrices. These are some examples:
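The example matrices themselves did not survive the copy from the page; as an illustrative sketch (these particular matrices are standard textbook examples, not recovered from the original), a few 2 × 2 transformation matrices applied with NumPy:

```python
import numpy as np

# Rotation by 90 degrees counterclockwise
rotation = np.array([[0.0, -1.0],
                     [1.0,  0.0]])
# Reflection across the x-axis
reflection = np.array([[1.0,  0.0],
                       [0.0, -1.0]])
# Horizontal shear
shear = np.array([[1.0, 1.0],
                  [0.0, 1.0]])

v = np.array([1.0, 1.0])
assert np.allclose(rotation @ v, [-1.0, 1.0])   # (1, 1) rotated to (-1, 1)
assert np.allclose(reflection @ v, [1.0, -1.0]) # (1, 1) reflected to (1, -1)
assert np.allclose(shear @ v, [2.0, 1.0])       # (1, 1) sheared to (2, 1)
```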

 

In mathematics, a linear map (also called a linear mapping, linear transformation or, in some contexts, linear function) is a mapping V → W between two modules (including vector spaces) that preserves (in the sense defined below) the operations of addition and scalar multiplication.

An important special case is when V = W, in which case the map is called a linear operator,[1] or an endomorphism of V. Sometimes the term linear function has the same meaning as linear map, while in analytic geometry it does not.

A linear map always maps linear subspaces onto linear subspaces (possibly of a lower dimension);[2] for instance it maps a plane through the origin to a plane, straight line or point. Linear maps can often be represented as matrices, and simple examples include rotation and reflection linear transformations.

In the language of abstract algebra, a linear map is a module homomorphism. In the language of category theory it is a morphism in the category of modules over a given ring.

 

 

Definition and first consequences

Let V and W be vector spaces over the same field K. A function f : V → W is said to be a linear map if for any two vectors u, v ∈ V and any scalar c ∈ K the following two conditions are satisfied:

f(u + v) = f(u) + f(v)   (additivity / operation of addition)
f(cu) = cf(u)   (homogeneity of degree 1 / operation of scalar multiplication)

Thus, a linear map is said to be operation preserving. In other words, it does not matter whether you apply the linear map before or after the operations of addition and scalar multiplication.

This is equivalent to requiring the same for any linear combination of vectors, i.e. that for any vectors u₁, …, uₙ ∈ V and scalars c₁, …, cₙ ∈ K, the following equality holds:[3][4]

f(c₁u₁ + ⋯ + cₙuₙ) = c₁f(u₁) + ⋯ + cₙf(uₙ).
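As a quick numerical sanity check (a minimal sketch; the matrix and vectors here are arbitrary), any matrix map f(x) = Ax satisfies the linear-combination identity above:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

def f(x):
    # The linear map f(x) = Ax
    return A @ x

u1 = np.array([1.0, 2.0])
u2 = np.array([-3.0, 0.5])
c1, c2 = 4.0, -2.0

# f(c1*u1 + c2*u2) == c1*f(u1) + c2*f(u2)
lhs = f(c1 * u1 + c2 * u2)
rhs = c1 * f(u1) + c2 * f(u2)
assert np.allclose(lhs, rhs)
```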

Denoting the zero elements of the vector spaces V and W by 0_V and 0_W respectively, it follows that f(0_V) = 0_W. Let c = 0 and v ∈ V in the equation for homogeneity of degree 1:

f(0_V) = f(0v) = 0f(v) = 0_W.

Occasionally, V and W can be considered to be vector spaces over different fields. It is then necessary to specify which of these ground fields is being used in the definition of "linear". If V and W are considered as spaces over the field K as above, we talk about K-linear maps. For example, the conjugation of complex numbers is an R-linear map C → C, but it is not C-linear.
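The conjugation example is easy to verify directly (a minimal sketch using Python's built-in complex numbers; the particular values are arbitrary):

```python
# Complex conjugation conj: C -> C is R-linear but not C-linear.
z = 2 + 3j

# R-linearity: conj(r*z) == r*conj(z) holds for any real scalar r.
r = 5.0
assert (r * z).conjugate() == r * z.conjugate()

# C-linearity fails: for a non-real scalar c, conj(c*z) != c*conj(z).
c = 1j
assert (c * z).conjugate() != c * z.conjugate()
```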

A linear map V → K with K viewed as a vector space over itself is called a linear functional.[5]

These statements generalize to any left-module M over a ring R without modification, and to any right-module upon reversing the scalar multiplication.

https://en.wikipedia.org/wiki/Eigenvalues_and_eigenvectors#Graphs

In linear algebra, an eigenvector or characteristic vector of a linear transformation is a non-zero vector that does not change its direction when that linear transformation is applied to it. More formally, if T is a linear transformation from a vector space V over a field F into itself and v is a vector in V that is not the zero vector, then v is an eigenvector of T if T(v) is a scalar multiple of v. This condition can be written as the equation

T(v) = λv,

where λ is a scalar in the field F, known as the eigenvalue, characteristic value, or characteristic root associated with the eigenvector v.

If the vector space V is finite-dimensional, then the linear transformation T can be represented as a square matrix A, and the vector v by a column vector, rendering the above mapping as a matrix multiplication on the left hand side and a scaling of the column vector on the right hand side in the equation

Av = λv.

There is a correspondence between n by n square matrices and linear transformations from an n-dimensional vector space to itself. For this reason, it is equivalent to define eigenvalues and eigenvectors using either the language of matrices or the language of linear transformations.[1][2]

Geometrically, an eigenvector corresponding to a real nonzero eigenvalue points in a direction that is stretched by the transformation, and the eigenvalue is the factor by which it is stretched. If the eigenvalue is negative, the direction is reversed.[3]
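A quick numerical illustration of Av = λv (a sketch; the symmetric matrix is an arbitrary example) using NumPy's `np.linalg.eig`:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

for i, lam in enumerate(eigenvalues):
    v = eigenvectors[:, i]       # eigenvectors are returned as columns
    # A v = lambda v: applying A only rescales v, never turns it
    assert np.allclose(A @ v, lam * v)

# For this symmetric matrix the eigenvalues are 3 and 1
assert np.allclose(sorted(eigenvalues), [1.0, 3.0])
```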

math.mit.edu/~gs/linearalgebra/ila0601.pdf

A^100 was found by using the eigenvalues of A, not by multiplying 100 matrices.
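The idea behind that remark can be sketched as follows (an illustrative example; the matrix here is arbitrary, not the one from the linked PDF): if A = XΛX⁻¹ is diagonalizable, then A^100 = XΛ^100X⁻¹, so only the eigenvalues need to be raised to the 100th power.

```python
import numpy as np

# An arbitrary diagonalizable matrix
A = np.array([[0.8, 0.3],
              [0.2, 0.7]])

lam, X = np.linalg.eig(A)

# A = X diag(lam) X^{-1}  =>  A^100 = X diag(lam**100) X^{-1}
A100_eig = X @ np.diag(lam ** 100) @ np.linalg.inv(X)

# Compare against direct repeated multiplication
A100_direct = np.linalg.matrix_power(A, 100)
assert np.allclose(A100_eig, A100_direct)
```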

 

 

 

posted @ 2016-10-04 21:25  papering