A subgradient direction is an ascent direction of the function value

💡 Definition of subgradient

We say a vector \(g\in \mathbb{R}^n\) is a subgradient of \(f:\mathbb{R}^n\to \mathbb{R}\) at \(x\in \operatorname{\textbf{dom}} f\) if for all \(y\in \operatorname{\textbf{dom}} f\),

\[ f(y)\ge f(x) + g^T(y-x). \]

The set \(\partial f(x) = \{g \mid f(y)\ge f(x) + g^T(y-x)\ \text{for all}\ y\in \operatorname{\textbf{dom}} f\}\) is called the subdifferential of \(f\) at \(x\).
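
For example, for the absolute value function \(f(x)=|x|\) on \(\mathbb{R}\), the subdifferential is

\[ \partial f(x) = \begin{cases} \{\operatorname{sign}(x)\}, & x\neq 0,\\ [-1,1], & x = 0, \end{cases} \]

so at a point where \(f\) is not differentiable the subdifferential may contain infinitely many subgradients.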


📌 Proof

Let \(g\neq 0\) be a subgradient of \(f\) at \(x\) and let \(\varepsilon>0\) be such that \(x+\varepsilon g\in \operatorname{\textbf{dom}} f\). Taking \(y = x+\varepsilon g\) in the definition gives

\[ f(x+\varepsilon g) \ge f(x) + g^T(\varepsilon g) = f(x) + \varepsilon \|g\|_2^2 > f(x), \]

where the last inequality uses \(\|g\|_2^2>0\). Hence moving from \(x\) along \(g\) strictly increases the function value, i.e., \(g\) is an ascent direction of \(f\) at \(x\).
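
Below is a minimal numerical sketch of this inequality. The concrete choices \(f(x)=|x|\), the point \(x=0\), and the subgradient \(g=0.5\in\partial f(0)\) are mine for illustration only; any \(g\in[-1,1]\) would work at this point.

```python
# Numerical check that f(x + eps*g) >= f(x) + eps*||g||^2 > f(x)
# for a subgradient g != 0 of a convex function f.

f = lambda x: abs(x)   # convex, nondifferentiable at 0

x = 0.0                # point where f is not differentiable
g = 0.5                # any g in [-1, 1] is a subgradient of |x| at x = 0

for eps in [1e-3, 1e-2, 1e-1, 1.0]:
    lhs = f(x + eps * g)
    lower_bound = f(x) + eps * g * g   # f(x) + eps * ||g||_2^2 in 1-D
    print(f"eps={eps:g}: f(x+eps*g)={lhs:.4f}, "
          f"bound={lower_bound:.4f}, ascent={lhs > f(x)}")
```

Each step size reports `ascent=True`, matching the proof: along \(g\) the function value never drops back to \(f(x)\).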


posted @ 2024-11-15 10:08  MathClown