MIT Introduction to Computer Science and Programming, Lecture 6
Regression test
Regression testing: test all the cases, i.e. re-run every existing test whenever the code changes.
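A minimal sketch of the idea in Python (the squareRoot stand-in and the test cases are illustrative, not the lecture's code): keep every test case and re-run all of them whenever the code changes, so a fix for one case cannot silently break another.

def squareRoot(x, epsilon):
    # Stand-in for the function under test.
    return x ** 0.5

testCases = [(4, 0.01), (16, 0.01), (2, 0.0001), (0.25, 0.0001)]

for x, epsilon in testCases:
    result = squareRoot(x, epsilon)
    assert abs(result * result - x) < epsilon, 'regression: failed on x = %s' % x
print('all regression tests passed')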
Speed of convergence
Newton's method
The basic idea is, you take a guess and you find the tangent of the curve at that guess.
f(guess) = guess² - x   (we want the guess that makes f(guess) = 0, i.e. the square root of x)
So let's say I guessed 3, I look for the tangent of the curve at 3.
And then my next guess is going to be where the tangent crosses the x axis.
So instead of dividing it in half, I'm using a different method to find the next guess.
The utility of this relies upon the observation that, most of the time, the tangent line is
a good approximation to the curve for values near the solution.
And therefore, the x intercept of the tangent will be closer to the right answer.
So how do we find the intercept of the tangent, the x intercept?
Well, this is where derivatives come in.
What we know is that the slope of the tangent is given by the first derivative of
the function f at the point of the guess. So the slope at the guess is the first derivative,
which is dy over dx, the change in y divided by the change in x.
f'(guess_i) = 2 * guess_i
guess_{i+1} = guess_i - f(guess_i) / (2 * guess_i)
Example: finding the square root of 16, starting with guess = 3:
f(3) = 3*3 - 16 = -7
guess_{i+1} = 3 - (-7 / (2*3)) = 4.1666
This overshoots the true answer 4, but each subsequent guess gets closer to it.
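A minimal Python sketch of this update rule (the name newtonSqrt and the default starting guess are mine, not the lecture's code):

def newtonSqrt(x, epsilon, guess=3.0):
    # Newton's method for sqrt(x): f(guess) = guess**2 - x, f'(guess) = 2*guess,
    # so each step replaces guess with guess - f(guess) / f'(guess).
    while abs(guess * guess - x) >= epsilon:
        guess = guess - (guess * guess - x) / (2.0 * guess)
    return guess

print(newtonSqrt(16, 0.01))   # successive guesses: 3, 4.1666..., 4.0033..., ~4.0000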
Newton's method really shines on harder problems, for example:
squareRoot(2, 0.01): Newton's method takes 3 iterations, bisection takes 8.
squareRoot(2, 0.0001): Newton's method takes 4 iterations, bisection takes 14.
squareRoot(2, 0.000001): Newton's method takes 5 iterations, bisection takes 22.
As the problem gets harder, the gap between a good method and a poor one keeps widening.
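A rough sketch comparing iteration counts for the two methods (my own illustrative code; the exact counts depend on the starting interval and the initial guess, so they need not match the numbers above exactly):

def bisectionSqrt(x, epsilon):
    # Bisection search for sqrt(x); returns (answer, number of iterations).
    low, high = 0.0, max(x, 1.0)
    guess = (low + high) / 2.0
    count = 0
    while abs(guess * guess - x) >= epsilon:
        if guess * guess < x:
            low = guess
        else:
            high = guess
        guess = (low + high) / 2.0
        count += 1
    return guess, count

def newtonSqrtCount(x, epsilon, guess=1.0):
    # Newton's method for sqrt(x); returns (answer, number of iterations).
    count = 0
    while abs(guess * guess - x) >= epsilon:
        guess = guess - (guess * guess - x) / (2.0 * guess)
        count += 1
    return guess, count

for epsilon in (0.01, 0.0001, 0.000001):
    print(epsilon, 'Newton:', newtonSqrtCount(2, epsilon)[1],
          'bisection:', bisectionSqrt(2, epsilon)[1])

The gap in iteration counts grows as epsilon shrinks, which is the point of the comparison above.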
Non-scalar types
Immutable: Tuples, Strings
Mutable: Lists
Techs = [ 'MIT', 'Cal Tech' ]
Ivys = [ 'Harvard', 'Yale', 'Brown' ]
The append operation produces a list of lists:
Univs = []
Univs.append(Techs)
Univs.append(Ivys)
[ [ 'MIT', 'Cal Tech' ], [ 'Harvard', 'Yale', 'Brown' ] ]
Concatenation:
Univs = Techs + Ivys
[ 'MIT', 'Cal Tech', 'Harvard', 'Yale', 'Brown' ]
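A small check of the difference (illustrative code): append keeps the inner lists intact, while + builds one flat list.

Techs = ['MIT', 'Cal Tech']
Ivys = ['Harvard', 'Yale', 'Brown']

Univs = []
Univs.append(Techs)
Univs.append(Ivys)
Flat = Techs + Ivys

print(len(Univs))   # 2  -- a list of two lists
print(Univs[0])     # ['MIT', 'Cal Tech']
print(len(Flat))    # 5  -- one flat list of five strings
print(Flat[0])      # MIT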
A list can hold elements of many different types:
L = [ 'L', 'MIT', 3.3, ['a'] ]
Mutability: removing an element mutates the list in place, which is quite different from slicing (slicing builds a new list).
L.remove('MIT')
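A short illustration of the point above (my own example): remove mutates the list object in place, while slicing builds a new list and leaves the original untouched.

L = ['L', 'MIT', 3.3, ['a']]

M = L[1:3]        # slicing copies: M is a new list, L is unchanged
print(M)          # ['MIT', 3.3]
print(L)          # ['L', 'MIT', 3.3, ['a']]

L.remove('MIT')   # remove mutates L in place and returns None
print(L)          # ['L', 3.3, ['a']]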