Sup and inf convolutions of convex functions

Let $\Omega$ be a bounded convex domain in $\mathbb{R}^n$ and let $u:\Omega\rightarrow\mathbb{R}$. If $u$ is a convex function in $\Omega$, then
$u$ is locally bounded and locally Lipschitz continuous. If all the partial derivatives $\partial_{x_i}u(x_0)$ exist at a point $x_0$, then $u$ is differentiable at $x_0$. By standard convex analysis, $u$ admits a supporting hyperplane $L_{x_0}(x)$ at every $x_0\in\Omega$. This gives a clear picture of the differentiability of $u$: it is differentiable exactly at those $x_0\in\Omega$ where the supporting hyperplane is unique.
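
As a concrete one-dimensional illustration (a standard example, not part of the original text): take $u(x)=|x|$ on $\Omega=(-1,1)$. Then
\begin{align*}
L_{x_0}(x)=\operatorname{sgn}(x_0)\,x \quad (x_0\neq 0),\qquad L_{0}(x)=p\,x \ \text{ for any } |p|\leq 1,
\end{align*}
are supporting lines. At $x_0\neq 0$ the supporting line is unique and $u$ is differentiable there, while at $x_0=0$ there are infinitely many supporting lines and $u$ is not differentiable.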

Suppose $u$ is a convex function in $\Omega$ and $u\in C(\overline{\Omega})$. Show that
\begin{align}
u^\epsilon(x)=\max_{y\in\bar{\Omega}}(u(y)-\frac{1}{\epsilon}|x-y|^2)
\end{align}
is also convex in $\Omega^\epsilon$.
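
Before the proof, a small worked example may help (my own computation, with the boundary constraint dropped, i.e. the maximum taken over all $y\in\mathbb{R}$): for $u(x)=|x|$,
\begin{align*}
\max_{y\in\mathbb{R}}\Big(|y|-\frac{1}{\epsilon}|x-y|^2\Big)=|x|+\frac{\epsilon}{4},
\end{align*}
the maximum being attained at $y_0=x+\frac{\epsilon}{2}$ if $x\geq 0$ and at $y_0=x-\frac{\epsilon}{2}$ if $x\leq 0$. The sup-convolution is again convex, lies above $u$, and decreases to $u$ as $\epsilon\to 0^+$.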

Since we could not find a direct reference for this proof, we give one here.

Fix $x_0\in\Omega^\epsilon$ and let $y_0\in\overline{\Omega}$ be a point at which the maximum is attained, so that

\begin{align}
u^\epsilon(x_0)=u(y_0)-\frac{1}{\epsilon}|x_0-y_0|^2.
\end{align}
Let $L_{y_0}(y)=u(y_0)+p\cdot(y-y_0)$ be a supporting plane of $u$ at $y_0$. Since $u(y)\geq L_{y_0}(y)$ for every $y\in\overline{\Omega}$, we have, for every $x\in\Omega^\epsilon$ and every $y\in\overline{\Omega}$,
\begin{align}
u^\epsilon(x)&\geq u(y)-\frac{1}{\epsilon}|x-y|^2\\
&\geq u(y_0)+p\cdot(y-y_0)-\frac{1}{\epsilon}|x-y|^2\\
&= L_{y_0}(y)-\frac{1}{\epsilon}|x-y|^2.
\end{align}

Therefore,
\begin{align}
u^\epsilon(x_0)&=L_{y_0}(y_0)-\frac{1}{\epsilon}|x_0-y_0|^2,\\
u^\epsilon(x)&\geq L_{y_0}(y)-\frac{1}{\epsilon}|x-y|^2\quad\text{for all } y\in\overline{\Omega}.
\end{align}
Choosing $y=x-x_0+y_0$ in the last inequality, so that $|x-y|=|x_0-y_0|$ (this choice of $y$ lies in $\overline{\Omega}$ at least for $x$ close to $x_0$, which is all we need, since a local affine support at every point already yields convexity), we obtain
\begin{align}
u^\epsilon(x)\geq L_{y_0}(x-x_0+y_0)-\frac{1}{\epsilon}|x_0-y_0|^2.
\end{align}

Let
\begin{align}
l_{x_0}(x)&=L_{y_0}(x-x_0+y_0)-\frac{1}{\epsilon}|x_0-y_0|^2\\
&=u(y_0)-\frac{1}{\epsilon}|x_0-y_0|^2+p\cdot(x-x_0).
\end{align}
Then
\begin{align}
u^\epsilon(x_0)&=l_{x_0}(x_0),\\
u^\epsilon(x)&\geq l_{x_0}(x).
\end{align}

Thus $u^\epsilon$ has an affine support function $l_{x_0}$ at every $x_0\in\Omega^\epsilon$, and hence $u^\epsilon$ is convex in $\Omega^\epsilon$.
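
In the one-dimensional example above (again with the maximum taken over all of $\mathbb{R}$), the construction is easy to trace: for $x_0>0$ the maximizer is $y_0=x_0+\frac{\epsilon}{2}$, the supporting line of $u(x)=|x|$ at $y_0$ has slope $p=1$, and
\begin{align*}
l_{x_0}(x)=u(y_0)-\frac{1}{\epsilon}|x_0-y_0|^2+p(x-x_0)=x+\frac{\epsilon}{4},
\end{align*}
which indeed touches $u^\epsilon(x)=|x|+\frac{\epsilon}{4}$ from below at $x_0$.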

The inf-convolution $u_\epsilon$ is also convex. Curiously, the proof below is of a different nature: it uses the definition of convexity directly instead of support planes.


Suppose $u$ is a convex function in $\Omega$ and $u\in C(\overline{\Omega})$. Show that
\begin{align}
u_\epsilon(x)=\min_{y\in\bar{\Omega}}(u(y)+\frac{1}{\epsilon}|x-y|^2)
\end{align}
is also convex in $\Omega_\epsilon$.
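
Again a quick sanity check with $u(x)=|x|$ and the minimum taken over all $y\in\mathbb{R}$ (my own computation, not from a reference): minimizing $|y|+\frac{1}{\epsilon}|x-y|^2$ gives
\begin{align*}
u_\epsilon(x)=
\begin{cases}
\dfrac{x^2}{\epsilon}, & |x|\leq \dfrac{\epsilon}{2},\\[1mm]
|x|-\dfrac{\epsilon}{4}, & |x|>\dfrac{\epsilon}{2},
\end{cases}
\end{align*}
a Huber-type function: again convex, lying below $u$, and increasing to $u$ as $\epsilon\to 0^+$.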


For any $x_1,x_2\in\Omega_\epsilon$, let $y_1,y_2\in\bar{\Omega}$ be points at which the minima are attained, so that
\begin{align}
u_\epsilon(x_1)&=u(y_1)+\frac{1}{\epsilon}|x_1-y_1|^2,\\
u_\epsilon(x_2)&=u(y_2)+\frac{1}{\epsilon}|x_2-y_2|^2.
\end{align}

By the convexity of $u$ and of the map $z\mapsto|z|^2$, for any $\lambda\in(0,1)$ we have
\begin{align*}
\lambda u_\epsilon(x_1)+(1-\lambda)u_\epsilon(x_2)&=\lambda u(y_1)+(1-\lambda)u(y_2)\\
&~~~~+\lambda\frac{1}{\epsilon}|x_1-y_1|^2
+(1-\lambda)\frac{1}{\epsilon}|x_2-y_2|^2\\
&\geq u(\lambda y_1+(1-\lambda)y_2)+\frac{1}{\epsilon}|\lambda x_1+(1-\lambda)x_2-(\lambda y_1+(1-\lambda)y_2)|^2\\
&\geq \min_{y\in\bar{\Omega}}(u(y)+\frac{1}{\epsilon}|\lambda x_1+(1-\lambda)x_2-y|^2)\\
&=u_\epsilon(\lambda x_1+(1-\lambda)x_2),
\end{align*}
where the second inequality uses that $\lambda y_1+(1-\lambda)y_2\in\bar{\Omega}$, since $\Omega$ is convex.
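
For the first inequality, the convexity of $z\mapsto|z|^2$ can be checked directly from the elementary identity
\begin{align*}
\lambda|a|^2+(1-\lambda)|b|^2-|\lambda a+(1-\lambda)b|^2=\lambda(1-\lambda)|a-b|^2\geq 0,
\end{align*}
applied with $a=x_1-y_1$ and $b=x_2-y_2$.
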
Hence, $u_\epsilon$ is convex in $\Omega_\epsilon$.
