Local Search Algorithms in Artificial Intelligence
In local search algorithms, we no longer care about the path from the initial node to the goal node. Instead, we start from the current node, move to one of its neighboring states, and repeat until we reach a reasonable goal state. Compared with the uninformed and informed search algorithms discussed earlier, local search algorithms can often find reasonable solutions in very large or even infinite state spaces while using only constant space, since no path needs to be stored.
Hill Climbing
Hill climbing keeps moving in the direction of increasing value until it reaches a peak.
function HillClimbing(problem) returns a local maximum state
    current_state = initial_state
    loop do
        next_state = the highest-valued neighbor of current_state
        if next_state is higher than current_state then
            current_state = next_state
        else
            return current_state
The problem with hill climbing is that it is only guaranteed to reach a local maximum, not the global maximum.
For example, if we start from point C, we will stop at the local maximum A and therefore never reach the global maximum B.
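As a concrete illustration, here is a minimal runnable sketch of the pseudocode above in Python. The one-dimensional objective value() and the fixed step size are assumptions made for this example, not part of the original algorithm.

def value(x):
    # Toy objective (an assumption for this sketch):
    # local maximum near x = -1, global maximum near x = 2.5.
    return -(x ** 4) + 2 * (x ** 3) + 5 * (x ** 2)

def hill_climbing(initial_state, step=0.01):
    current_state = initial_state
    while True:
        # Look at the two neighbors and pick the higher-valued one.
        next_state = max(current_state - step, current_state + step, key=value)
        if value(next_state) > value(current_state):
            current_state = next_state      # keep climbing
        else:
            return current_state            # no better neighbor: a (local) maximum

if __name__ == "__main__":
    print(hill_climbing(0.5))    # climbs to the nearest peak, around x = 2.5
    print(hill_climbing(-0.5))   # gets stuck at the local maximum, around x = -1

Different starting points end up on different peaks, which is exactly the local-maximum problem described above.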
Simulated Annealing
Simulated annealing is similar to hill climbing, except that instead of always moving in the direction of increasing value, we allow moves toward lower values with a certain probability. This makes it possible to escape from the local maximum A and reach the global maximum B.
It is called simulated annealing because this probability starts out relatively high and then, like the temperature during annealing, slowly decreases over time.
function SimulatedAnnealing(problem, schedule) returns a solution state
    current_state = initial_state
    for t = 1 to infinity do
        T = schedule(t)
        if T = 0 then
            return current_state
        next_state = a randomly selected neighbor of current_state
        ΔE = next_state.height - current_state.height
        if ΔE > 0 then
            current_state = next_state
        else
            current_state = next_state with probability e^(ΔE/T)
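The same loop can be sketched in runnable Python. The exponential cooling schedule, the step size, and the reuse of the toy value() function from the hill-climbing example are all assumptions made for illustration.

import math
import random

def simulated_annealing(initial_state, value, step=0.1,
                        t0=10.0, decay=0.001, t_min=1e-6):
    current_state = initial_state
    t = 1
    while True:
        T = t0 * math.exp(-decay * t)          # schedule(t): temperature decays over time
        if T < t_min:                          # treat a tiny temperature as T = 0
            return current_state
        next_state = current_state + random.uniform(-step, step)   # random neighbor
        delta_e = value(next_state) - value(current_state)
        if delta_e > 0:
            current_state = next_state         # always accept uphill moves
        elif random.random() < math.exp(delta_e / T):
            current_state = next_state         # accept a downhill move with probability e^(ΔE/T)
        t += 1

if __name__ == "__main__":
    def value(x):
        return -(x ** 4) + 2 * (x ** 3) + 5 * (x ** 2)
    # Starting near the local maximum at x = -1, the occasional downhill moves
    # usually let the search escape and finish near the global maximum at x = 2.5.
    print(simulated_annealing(-1.0, value))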
Genetic Algorithm
The genetic algorithm mimics the biological process of inheritance: starting from an initial population, it iteratively performs a series of crossovers and mutations until a suitable population is obtained, and then picks the best individual from that population.
function GeneticAlgorithm(population, fitness) returns a solution state
    inputs: population, a set of individuals
            fitness, a function that measures the fitness of an individual
    repeat
        new_population = empty_set
        for i = 1 to sizeof(population) do
            x = RandomSelect(population, fitness)
            y = RandomSelect(population, fitness)
            new_individual = Reproduce(x, y)
            if (with some small probability) then
                new_individual = Mutate(new_individual)
            add new_individual to new_population
        population = new_population
    until some individual is fit enough or enough time has elapsed
    return the best individual in the population
----------------------------------------------------------------
function Reproduce(x, y) returns a new individual
    inputs: x, y, the parent individuals
    length = Length(x)
    crossover_point = RandomSelectIn(1, length)
    new_individual = Sub(x, 1, crossover_point)
                     + Sub(y, crossover_point + 1, length)
    return new_individual
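Below is a minimal runnable sketch of this genetic algorithm in Python, applied to the purely illustrative task of evolving a fixed target bit string. The target string, population size, mutation rate, and fitness-proportionate selection are all assumptions made for the example.

import random

TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]    # illustrative goal: evolve this bit string

def fitness(individual):
    # Number of positions that match the target.
    return sum(a == b for a, b in zip(individual, TARGET))

def random_select(population):
    # Fitness-proportionate ("roulette wheel") selection; +1 avoids all-zero weights.
    weights = [fitness(p) + 1 for p in population]
    return random.choices(population, weights=weights)[0]

def reproduce(x, y):
    # Single-point crossover, as in the Reproduce pseudocode above.
    crossover_point = random.randint(1, len(x) - 1)
    return x[:crossover_point] + y[crossover_point:]

def mutate(individual):
    # Flip one randomly chosen bit.
    i = random.randrange(len(individual))
    mutated = individual[:]
    mutated[i] = 1 - mutated[i]
    return mutated

def genetic_algorithm(population, mutation_rate=0.1, max_generations=1000):
    for _ in range(max_generations):
        new_population = []
        for _ in range(len(population)):
            x = random_select(population)
            y = random_select(population)
            child = reproduce(x, y)
            if random.random() < mutation_rate:
                child = mutate(child)
            new_population.append(child)
        population = new_population
        if max(fitness(p) for p in population) == len(TARGET):
            break                              # some individual is fit enough
    return max(population, key=fitness)

if __name__ == "__main__":
    population = [[random.randint(0, 1) for _ in TARGET] for _ in range(20)]
    print(genetic_algorithm(population))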