Abstract:

    function J = computeCostMulti(X, y, theta)
      m = length(y);  % number of training examples
      J = 0;
      for i = 1:m
        % accumulate the squared error of the i-th training example
        J = J + (X(i,:) * theta - y(i,1)) ^ 2;
      end
      J = J / (2 * m);
    end

Read full post
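The loop above sums the squared prediction errors and divides by 2m. The same cost can be computed without a loop; a hypothetical NumPy equivalent (not from the original post) might look like this:

```python
import numpy as np

def compute_cost_multi(X, y, theta):
    """Linear-regression cost J = (1 / (2m)) * sum((X @ theta - y)^2)."""
    m = len(y)
    residuals = X @ theta - y           # per-example prediction errors
    return float(residuals @ residuals) / (2 * m)

# Example: two training examples, theta = 0 gives J = (2^2 + 3^2) / (2*2)
X = np.array([[1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 3.0])
print(compute_cost_multi(X, y, np.zeros(2)))  # → 3.25
```

Replacing the loop with a single matrix product keeps the code identical for one feature or many, which is why the multivariate version is usually written vectorized.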
posted @ 2017-03-08 22:18 by KennyRom
Abstract:

    function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
      m = length(y);  % number of training examples
      J_history = zeros(num_iters, 1);
      for iter = 1:num_iters
        theta = the...

Read full post
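The snippet is cut off before the update step, but the standard batch gradient-descent update for linear regression is theta := theta - (alpha / m) * X' * (X * theta - y), with the cost recorded each iteration in J_history. A hypothetical NumPy sketch of that pattern (the function name and variables mirror the post's MATLAB signature, but the body here is an assumption, not the original code):

```python
import numpy as np

def gradient_descent_multi(X, y, theta, alpha, num_iters):
    """Batch gradient descent: theta := theta - (alpha / m) * X' (X theta - y)."""
    m = len(y)
    J_history = np.zeros(num_iters)
    for it in range(num_iters):
        grad = X.T @ (X @ theta - y) / m      # gradient of the cost w.r.t. theta
        theta = theta - alpha * grad          # simultaneous update of all parameters
        residuals = X @ theta - y
        J_history[it] = residuals @ residuals / (2 * m)  # cost after this step
    return theta, J_history

# Example: fit y = 2x with a bias column; the cost should shrink toward 0.
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 4.0, 6.0])
theta, J_history = gradient_descent_multi(X, y, np.zeros(2), 0.1, 2000)
```

Storing J_history is what lets you plot cost versus iteration and check that alpha is small enough for the cost to decrease monotonically.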
posted @ 2017-03-08 22:17 by KennyRom