Gradient Descent

1. USAGE

When we want to find min[f(x)], the minimum of a function f.


2. ESSENCE

Moving against the direction of the gradient, we go downhill (f(x) decreases) fastest.

We define a step size. After each step forward, we recompute the gradient and compare the height (f(x)) with the previous one. When the height barely changes, we have reached the bottom (min[f(x)]).
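In symbols (using the names defined in the sample below), one step updates the position as

x_new = x - step_weight * gradient

where gradient = df/dx at the current x; the minus sign is what makes each step go downhill.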


3. SAMPLE


#define
f(x) = x^2

rarely_changed = 0.00001

initial position:
x = 1, y = x^2 = 1

gradient_func = df/dx = 2x

step_weight = 0.001


#optimization

gradient = 2x = 2

step = step_weight * gradient = 0.002

new position:
x = x - step = 1 - 0.002 = 0.998
y = x^2 = 0.996004

height_changed = 1 - 0.996004 = 0.003996

Since height_changed > rarely_changed, repeat #optimization. Stop when height_changed <= rarely_changed.
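
For reference, here is the same procedure as a short, runnable Python loop (a minimal sketch; the variable names step_weight and rarely_changed follow the definitions above):

    def gradient_descent():
        x = 1.0                     # initial position
        step_weight = 0.001         # step size
        rarely_changed = 0.00001    # stop when the height changes less than this

        y = x ** 2                  # f(x) = x^2
        while True:
            gradient = 2 * x                    # df/dx = 2x
            x = x - step_weight * gradient      # step against the gradient
            new_y = x ** 2
            height_changed = abs(y - new_y)
            y = new_y
            if height_changed <= rarely_changed:
                break
        return x, y

    x_min, y_min = gradient_descent()
    print(x_min, y_min)             # x approaches 0, the minimizer of x^2

Note that with these constants the loop stops while x is still around 0.05, not exactly 0: the stopping rule tests the change in height, not the height itself, so a smaller rarely_changed gets x closer to the true minimum at x = 0.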
