Notes on Andrew Ng's Machine Learning Course (Week 5): Neural Network Learning


This semester I have been following the Machine Learning open course on Coursera. The instructor, Andrew Ng, is one of Coursera's co-founders and a leading figure in machine learning. For anyone who wants to understand and get an initial grasp of machine learning, this course is an excellent choice: it covers the field's basic concepts and methods, and the programming assignments are enormously helpful for actually mastering them.

Course URL: https://www.coursera.org/learn/machine-learning

These notes briefly record the course content along with the MATLAB programming assignments.

Neural Networks

Core code for the Week 5 programming assignment (ex4)

nnCostFunction.m

% Convert the label vector y into one-hot output vectors
y_vect = zeros(m, num_labels);   % 5000x10
for i = 1:m
    y_vect(i, y(i)) = 1;
end

% Forward propagation
a1 = [ones(m, 1) X];
z2 = a1 * Theta1';
a2 = sigmoid(z2);     % 5000x25
a2 = [ones(m, 1) a2]; % 5000x26
z3 = a2 * Theta2';    % 5000x10
a3 = sigmoid(z3);     % 5000x10

% Unregularized cost, summed over all examples
for i = 1:m
    J = J + sum(-y_vect(i,:) .* log(a3(i,:)) - (1 - y_vect(i,:)) .* log(1 - a3(i,:)));
end
J = J / m;
% Add regularization; the bias columns (first column of each Theta) are excluded
J = J + lambda * (sum(sum(Theta1(:,2:end).^2)) + sum(sum(Theta2(:,2:end).^2))) / (2*m);

% Backward propagation
Delta1 = zeros(size(Theta1));   % 25x401
Delta2 = zeros(size(Theta2));   % 10x26
for i = 1:m
    delta3 = a3(i,:)' - y_vect(i,:)';                         % 10x1
    tempTheta2 = Theta2' * delta3;                            % (26x10)*(10x1) = 26x1
    delta2 = tempTheta2(2:end) .* sigmoidGradient(z2(i,:)');  % drop bias term -> 25x1
    Delta2 = Delta2 + delta3 * a2(i,:);                       % (10x1)*(1x26) = 10x26
    Delta1 = Delta1 + delta2 * a1(i,:);                       % (25x1)*(1x401) = 25x401
end

Theta2_grad = Delta2 / m;
Theta1_grad = Delta1 / m;

% Regularized gradient; bias columns are not regularized
Theta2_grad(:,2:end) = Theta2_grad(:,2:end) + lambda * Theta2(:,2:end) / m;
Theta1_grad(:,2:end) = Theta1_grad(:,2:end) + lambda * Theta1(:,2:end) / m;

sigmoidGradient.m

g = sigmoid(z) .* (1 - sigmoid(z));
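The identity used here is that the sigmoid's derivative can be written in terms of the sigmoid itself: σ'(z) = σ(z)(1 − σ(z)). A quick way to convince yourself is to compare it against a central finite difference; here is a small Python check (not course code):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_gradient(z):
    # analytic derivative: sigma'(z) = sigma(z) * (1 - sigma(z))
    s = sigmoid(z)
    return s * (1 - s)

# central finite difference as an independent reference
z = np.linspace(-5.0, 5.0, 11)
eps = 1e-6
numeric = (sigmoid(z + eps) - sigmoid(z - eps)) / (2 * eps)
print(np.max(np.abs(numeric - sigmoid_gradient(z))))  # tiny, limited only by finite-difference error
```

At z = 0 the gradient peaks at 0.25, which is why sigmoid networks need the random weight initialization discussed in this week's lectures: gradients shrink quickly once activations saturate away from 0.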

 

posted @ 2015-11-24 13:13 奔跑的小子