Why constrained regression and regularized regression are equivalent
Problem 1 (regularized):
$\min_{\beta} ~f_\alpha(\beta):=\frac{1}{2}\Vert y-X\beta\Vert^2 +\alpha\Vert \beta\Vert$
Problem 2 (constrained):
$\min_{\beta} ~\frac{1}{2}\Vert y-X\beta\Vert^2 \quad \text{s.t.}~\Vert \beta\Vert-c\leq 0$
Lagrangian of Problem 2 (multiplier $\lambda\geq 0$):
$\mathcal{L}(\beta,\lambda)=\frac{1}{2}\Vert y-X\beta\Vert^2+\lambda (\Vert \beta\Vert-c)$
The KKT conditions for Problem 2 at an optimal pair $(\beta^*,\lambda^*)$ include, besides primal feasibility $\Vert \beta^*\Vert\leq c$ and dual feasibility $\lambda^*\geq 0$:
stationarity: $\beta^*=\arg\min_{\beta}~\mathcal{L}(\beta,\lambda^*)=\arg\min_{\beta}~\frac{1}{2}\Vert y-X\beta\Vert^2+\lambda^* (\Vert \beta\Vert-c)$
complementary slackness: $\lambda^*(\Vert \beta^*\Vert-c)=0$
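Since the $-\lambda^* c$ term is constant in $\beta$, stationarity can also be written as a subgradient inclusion (using $\partial\Vert\cdot\Vert$ for the subdifferential of the norm, which need not be differentiable at $0$):
$0\in -X^{\top}(y-X\beta^*)+\lambda^*\,\partial\Vert \beta^*\Vert$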
For Problem 1, let
$\beta^*=\arg\min_{\beta} ~f_\alpha(\beta)=\arg\min_{\beta}\left(\frac{1}{2}\Vert y-X\beta\Vert^2 +\alpha\Vert \beta\Vert\right)$
Set $\lambda^* = \alpha$ and $c=\Vert \beta^*\Vert$. Minimizing $\mathcal{L}(\beta,\lambda^*)$ over $\beta$ is the same as minimizing $f_\alpha(\beta)$, because the two differ only by the constant $-\lambda^* c$, so $\beta^*$ satisfies stationarity; and complementary slackness $\lambda^*(\Vert \beta^*\Vert-c)=0$ holds because $c=\Vert \beta^*\Vert$. Both KKT conditions are met, so the solution of the regularized problem also solves the constrained problem with $c=\Vert \beta^*\Vert$.
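As a sanity check, here is a minimal numerical sketch of this correspondence (assuming numpy and cvxpy are installed, and reading $\Vert\cdot\Vert$ as the Euclidean norm): solve Problem 1 for some $\alpha$, set $c$ to the norm of its solution, solve Problem 2 with that $c$, and verify that the two solutions coincide.

```python
import numpy as np
import cvxpy as cp

# Random regression data (50 samples, 10 features)
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
y = rng.standard_normal(50)
alpha = 1.0  # regularization weight, plays the role of the multiplier lambda

# Problem 1: regularized form
b1 = cp.Variable(10)
cp.Problem(cp.Minimize(0.5 * cp.sum_squares(y - X @ b1)
                       + alpha * cp.norm(b1, 2))).solve()

# Problem 2: constrained form, with c taken from the Problem 1 solution
c = np.linalg.norm(b1.value, 2)
b2 = cp.Variable(10)
cp.Problem(cp.Minimize(0.5 * cp.sum_squares(y - X @ b2)),
           [cp.norm(b2, 2) <= c]).solve()

# The two solutions should agree up to solver tolerance
print(np.max(np.abs(b1.value - b2.value)))
```

The argument above never uses a specific norm, so swapping `cp.norm(b, 2)` for `cp.norm(b, 1)` gives the lasso version of the same equivalence.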