Basic LaTeX Syntax for Math Formulas

Sources:
[1]https://blog.csdn.net/qq_33532713/article/details/108602463
[2]https://www.cnblogs.com/Sinte-Beuve/p/6160905.html
[3]https://blog.csdn.net/happyday_d/article/details/83715440

LaTeX has two math modes: inline and display. An inline formula is embedded in the running text, while a display formula is set apart on its own line and may be numbered or left unnumbered.

Inline formulas: wrap the formula in $...$
Display formulas: wrap the formula in $$...$$; by default it is centered on its own line
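
A minimal sketch combining the two modes: writing "Einstein's relation $E=mc^2$" keeps the formula inside the sentence, while

$$
E=mc^2
$$

sets the same formula centered on its own line.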

Greek letters

Symbol Command  Symbol Command  Symbol Command  Symbol Command
\(\alpha\) \alpha  \(\theta\) \theta  \(o\) o  \(\tau\) \tau
\(\beta\) \beta  \(\vartheta\) \vartheta  \(\pi\) \pi  \(\upsilon\) \upsilon
\(\gamma\) \gamma  \(\iota\) \iota  \(\varpi\) \varpi  \(\phi\) \phi
\(\delta\) \delta  \(\kappa\) \kappa  \(\rho\) \rho  \(\varphi\) \varphi
\(\epsilon\) \epsilon  \(\lambda\) \lambda  \(\varrho\) \varrho  \(\chi\) \chi
\(\varepsilon\) \varepsilon  \(\mu\) \mu  \(\sigma\) \sigma  \(\psi\) \psi
\(\zeta\) \zeta  \(\nu\) \nu  \(\varsigma\) \varsigma  \(\omega\) \omega
\(\eta\) \eta  \(\xi\) \xi
\(\Gamma\) \Gamma  \(\Lambda\) \Lambda  \(\Sigma\) \Sigma  \(\Psi\) \Psi
\(\Delta\) \Delta  \(\Xi\) \Xi  \(\Upsilon\) \Upsilon  \(\Omega\) \Omega
\(\Theta\) \Theta  \(\Pi\) \Pi  \(\Phi\) \Phi
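
As a quick illustration, Greek letters combine with ordinary symbols like any other character, e.g. $2\pi r$ ==> \(2\pi r\) and $\Delta x$ ==> \(\Delta x\).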

Subscripts, superscripts, roots, and ellipses

  • Subscript: $x_i$ ==> \(x_i\)

  • Superscript: $x^2$ ==> \(x^2\)

Note: if a subscript or superscript contains more than one character or symbol, wrap it in a pair of {}:

  • $x_{i1}$ ==> \(x_{i1}\)

  • $x^{\alpha t}$ ==> \(x^{\alpha t}\)

  • Root: \sqrt, e.g. $\sqrt[n]{5}$ ==> \(\sqrt[n]{5}\)

  • Ellipses: \dots and \cdots give \(\dots\) and \(\cdots\) respectively (a combined sketch follows this list)
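
A combined sketch using the commands above:

$\sqrt{x_1^2 + x_2^2 + \cdots + x_n^2}$ ==> \(\sqrt{x_1^2 + x_2^2 + \cdots + x_n^2}\)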

Operators

  • Basic operators such as + - * / can be typed directly; other special ones include:
\pm \times \div \cdot \cap \cup \geq \leq \neq \approx \equiv
\(\pm\) \(\times\) \(\div\) \(\cdot\) \(\cap\) \(\cup\) \(\geq\) \(\leq\) \(\neq\) \(\approx\) \(\equiv\)
  • Sum: $\sum_1^n$ ==> \(\sum_1^n\)

  • Product: $\prod_{n=1}^{99}x_n$ ==> \(\prod_{n=1}^{99}x_n\)

  • Integral: $\int_1^n$ ==> \(\int_1^n\)

  • Limit: $\lim\limits_{x \to \infty}$ ==> \(\lim\limits_{x \to \infty}\) (a combined sketch follows this list)
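
A combined sketch of the operators above (both identities are standard):

$\sum_{k=1}^{n} 2^k = 2^{n+1} - 2$ ==> \(\sum_{k=1}^{n} 2^k = 2^{n+1} - 2\)

$\lim\limits_{n \to \infty} (1 + 1/n)^n = e$ ==> \(\lim\limits_{n \to \infty} (1 + 1/n)^n = e\)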

Fractions

Fractions are written with \frac{numerator}{denominator}, e.g. $\frac{3}{8}$ ==> \(\frac{3}{8}\)
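
\frac can be nested, for example a simple continued fraction:

$\frac{1}{1+\frac{1}{x}}$ ==> \(\frac{1}{1+\frac{1}{x}}\)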

Matrices

Use & to separate the elements of a row and \\ to end a row:

Example:

$$
\begin{matrix}
1&x&x^2\\
1&y&y^2\\
1&z&z^2\\
\end{matrix}
$$

Result:

\[\begin{matrix} 1&x&x^2\\ 1&y&y^2\\ 1&z&z^2\\ \end{matrix} \]
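
The matrix environment itself draws no surrounding brackets; wrapping it in \left( ... \right) adds them (the determinant example below uses the same technique with vertical bars). A minimal sketch:

$$
\left(
\begin{matrix}
1 & x \\
1 & y \\
\end{matrix}
\right)
$$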

Determinants

Example:

$$
X=\left|
	\begin{matrix}
	x_{11} & x_{12} & \cdots & x_{1d}\\
	x_{21} & x_{22} & \cdots & x_{2d}\\
	\vdots & \vdots & \ddots & \vdots\\
	x_{m1} & x_{m2} & \cdots & x_{md} \\
    \end{matrix}
    \right|
$$

Result:

\[X=\left| \begin{matrix} x_{11} & x_{12} & \cdots & x_{1d}\\ x_{21} & x_{22} & \cdots & x_{2d}\\ \vdots & \vdots & \ddots & \vdots\\ x_{m1} & x_{m2} & \cdots & x_{md} \\ \end{matrix} \right| \]

Arrows

Symbol Command  Symbol Command
\(\leftarrow\) \leftarrow  \(\longleftarrow\) \longleftarrow
\(\rightarrow\) \rightarrow  \(\longrightarrow\) \longrightarrow
\(\leftrightarrow\) \leftrightarrow  \(\longleftrightarrow\) \longleftrightarrow
\(\Leftarrow\) \Leftarrow  \(\Longleftarrow\) \Longleftarrow
\(\Rightarrow\) \Rightarrow  \(\Longrightarrow\) \Longrightarrow
\(\Leftrightarrow\) \Leftrightarrow  \(\Longleftrightarrow\) \Longleftrightarrow
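
Arrows are used inside formulas like any other symbol, e.g. $f: X \longrightarrow Y$ ==> \(f: X \longrightarrow Y\) and $P \Leftrightarrow Q$ ==> \(P \Leftrightarrow Q\).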

Equations

$$
\begin{equation}
E=mc^2
\end{equation}
$$

\[\begin{equation} E=mc^2 \end{equation} \]
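
In standalone LaTeX the equation environment is used on its own rather than wrapped in $$ ... $$, and it numbers the formula automatically (MathJax-style renderers typically accept the wrapped form shown above). A minimal sketch for a .tex document:

\begin{equation}
E=mc^2
\end{equation}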

Delimiters

Brackets and other delimiters are produced with ( ) [ ] { } \langle \rangle and similar commands. Note that braces are normally used to hold command and environment arguments, so inside a math formula they must be escaped as \{ and \}. Any of these delimiters can be prefixed with \big, \Big, \bigg, or \Bigg to adjust its size.

$$
\max \limits_{a<x<b} \Bigg\{f(x)\Bigg\}
$$

\[\max \limits_{a<x<b} \Bigg\{f(x)\Bigg\} \]
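
A quick sketch of the size progression:

$$
\Bigg( \bigg( \Big( \big( ( x ) \big) \Big) \bigg) \Bigg)
$$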

Piecewise functions

$$
f(n) = 
\begin{cases}
n/2, & \text {if $n$ is even}\\
3n+1, & \text {if $n$ is odd}
\end{cases}
$$

\[f(n) = \begin{cases} n/2, & \text {if $n$ is even}\\ 3n+1, & \text {if $n$ is odd} \end{cases} \]

Systems of equations

When there is no visible right delimiter, \left\{ must still be closed with \right. (note the trailing period, which acts as an invisible empty delimiter):

$$
\left\{
\begin{array}{c}
	a_1x+b_1y+c_1z=d_1\\
	a_2x+b_2y+c_2z=d_2\\
	a_3x+b_3y+c_3z=d_3
\end{array}
\right.
$$

\[\left\{ \begin{array}{c} a_1x+b_1y+c_1z=d_1\\ a_2x+b_2y+c_2z=d_2\\ a_3x+b_3y+c_3z=d_3 \end{array} \right. \]
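
Since every \left must be paired with a \right, the period acts as the invisible partner. To place the brace on the right instead, swap the roles. A minimal sketch:

$$
\left.
\begin{array}{c}
	a_1x+b_1y+c_1z=d_1\\
	a_2x+b_2y+c_2z=d_2
\end{array}
\right\}
$$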

Examples

Linear model

$$h(\theta)=\sum_{j=0}^n \theta_j x_j$$

\[h(\theta)=\sum_{j=0}^n \theta_j x_j \]

Mean squared error

$$J(\theta) = \frac{1}{2m} \sum_{i=0}^m (y^i-h_\theta (x^i))^2$$

\[J(\theta) = \frac{1}{2m} \sum_{i=0}^m (y^i-h_\theta (x^i))^2 \]

Batch gradient descent

$$
\frac{\partial J(\theta)}{\partial\theta_j} = -\frac{1}{m}\sum_{i=0}^m (y^i - h_\theta (x^i))x^i_j
$$

\[\frac{\partial J(\theta)}{\partial\theta_j} = -\frac{1}{m}\sum_{i=0}^m (y^i - h_\theta (x^i))x^i_j \]

Derivation

$$
\begin{align}
\frac{\partial J(\theta)}{\partial \theta_j}
& = - \frac{1}{m} \sum_{i=0}^m (y^i-h_\theta(x^i))\frac{\partial}{\partial\theta_j}(y^i-h_\theta(x^i))\\
& = -\frac{1}{m}\sum_{i=0}^m(y^i-h_\theta(x^i))\frac{\partial}{\partial\theta_j}(\sum_{j=0}^n\theta_j x^i_j - y^i)\\
& = - \frac{1}{m}\sum_{i=0}^m(y^i-h_\theta (x^i))x^i_j
\end{align}
$$

\[\begin{align} \frac{\partial J(\theta)}{\partial \theta_j} & = - \frac{1}{m} \sum_{i=0}^m (y^i-h_\theta(x^i))\frac{\partial}{\partial\theta_j}(y^i-h_\theta(x^i))\\ & = -\frac{1}{m}\sum_{i=0}^m(y^i-h_\theta(x^i))\frac{\partial}{\partial\theta_j}(\sum_{j=0}^n\theta_j x^i_j - y^i)\\ & = - \frac{1}{m}\sum_{i=0}^m(y^i-h_\theta (x^i))x^i_j \end{align} \]
