Summary: ndarray.shape — a tuple of array dimensions.
posted @ 2017-03-23 22:20 KennyRom · views (798) · comments (0) · likes (0)
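A minimal sketch of what the post covers: `shape` describes the size of every dimension of a NumPy array and can also be assigned to reshape in place.

```python
import numpy as np

a = np.zeros((2, 3, 4))   # 2 blocks, 3 rows, 4 columns
print(a.shape)            # (2, 3, 4)
print(a.shape[0])         # size of the first dimension: 2

# shape can be assigned directly to reshape the array in place
b = np.arange(6)
b.shape = (2, 3)
print(b.shape)            # (2, 3)
```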
Summary: Compute the value of all elements. Example:
posted @ 2017-03-23 22:07 KennyRom · views (329) · comments (0) · likes (0)
Summary: import numpy as np; both np.matrix('1, 2; 3, 4') and np.matrix([[1, 2], [3, 4]]) build the 2×2 matrix [[1, 2], [3, 4]].
posted @ 2017-03-23 21:56 KennyRom · views (209) · comments (0) · likes (0)
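The two constructions above are equivalent — one uses a MATLAB-style string, the other a nested list. A runnable sketch (note that modern NumPy recommends plain ndarrays over np.matrix):

```python
import numpy as np

m1 = np.matrix('1, 2; 3, 4')      # MATLAB-style string syntax
m2 = np.matrix([[1, 2], [3, 4]])  # nested-list syntax
print(m1)
print(np.array_equal(m1, m2))     # True — both build the same 2x2 matrix
```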
Summary: map applies a function to every item in an input list. Blueprint: most of the time we want to pass the list elements to a function one by one an…
posted @ 2017-03-23 20:57 KennyRom · views (231) · comments (0) · likes (0)
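A short illustration of the blueprint: in Python 3, map returns a lazy iterator, so it is usually wrapped in list() when the materialized result is needed.

```python
# map applies a function to each item of an iterable
nums = [1, 2, 3, 4]
squares = list(map(lambda x: x * x, nums))
print(squares)  # [1, 4, 9, 16]

# an equivalent list comprehension, often considered more idiomatic
squares2 = [x * x for x in nums]
```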
Summary: Description — the strip() method returns a copy of the string with leading and trailing characters removed (by default, whitespace).
posted @ 2017-03-23 20:48 KennyRom · views (260) · comments (0) · likes (0)
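A quick demonstration of the behavior described: the optional argument is treated as a *set* of characters to strip from both ends, not as a prefix or suffix.

```python
s = "  hello  "
print(s.strip())                         # 'hello' — whitespace removed by default
print("xxhixx".strip("x"))               # 'hi'
print("www.example.com".strip("wcom."))  # 'example' — any char in the set is stripped
```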
Summary: fminunc(FCN, X0); fminunc(FCN, X0, OPTIONS); [X, FVAL, INFO, OUTPUT, GRAD, HESS] = fminunc(FCN, ...); % Solve an unconstrained optimization problem defined by the function FCN. % X0 determines a…
posted @ 2017-03-09 22:19 KennyRom · views (525) · comments (0) · likes (0)
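fminunc is Octave; a rough Python analogue — a sketch, not the post's own code — is scipy.optimize.minimize, which likewise takes an objective function and an initial guess X0 and performs unconstrained quasi-Newton minimization (assuming SciPy is installed):

```python
import numpy as np
from scipy.optimize import minimize

# toy objective: f(x) = (x0 - 1)^2 + (x1 + 2)^2, minimized at (1, -2)
def f(x):
    return (x[0] - 1) ** 2 + (x[1] + 2) ** 2

x0 = np.zeros(2)                      # initial guess, like fminunc's X0
res = minimize(f, x0, method="BFGS")  # unconstrained quasi-Newton, like fminunc
print(res.x)                          # close to [1, -2]
print(res.success)
```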
Summary: Create an options struct for the optimization functions. GradObj: when set to "on", the function to be minimized must return a second argument, which is the gradient.
posted @ 2017-03-09 22:06 KennyRom · views (381) · comments (0) · likes (0)
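The GradObj="on" pattern — the objective returns both the value and its gradient — has a direct SciPy counterpart, jac=True, shown here as a hedged sketch rather than the post's Octave code:

```python
import numpy as np
from scipy.optimize import minimize

# with jac=True, the objective returns (value, gradient),
# like Octave's optimset('GradObj', 'on')
def f_and_grad(x):
    val = (x[0] - 3) ** 2
    grad = np.array([2 * (x[0] - 3)])
    return val, grad

res = minimize(f_and_grad, np.array([0.0]), jac=True, method="BFGS")
print(res.x)  # close to [3]
```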
Summary: function J = computeCostMulti(X, y, theta)  m = length(y); % number of training examples  J = 0;  for i = 1:m  J = J + (X(i,:) * theta - y(i,1)) ^ 2;  end;  J = J / (2 * m);  end
posted @ 2017-03-08 22:18 KennyRom · views (224) · comments (0) · likes (0)
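The per-example loop above can be vectorized; a NumPy sketch of the same cost J(theta) = (1/2m) * sum((X*theta - y)^2), with an illustrative toy dataset of my own:

```python
import numpy as np

def compute_cost_multi(X, y, theta):
    """Linear-regression cost: (1/2m) * sum of squared errors."""
    m = len(y)
    errors = X @ theta - y
    return (errors @ errors) / (2 * m)

X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])  # first column: intercept term
y = np.array([1.0, 2.0, 3.0])
theta = np.array([0.0, 1.0])
print(compute_cost_multi(X, y, theta))  # 0.0 — this theta fits the data exactly
```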
Summary: function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)  m = length(y); % number of training examples  J_history = zeros(num_iters, 1);  for iter = 1:num_iters  theta = the…
posted @ 2017-03-08 22:17 KennyRom · views (226) · comments (0) · likes (0)
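The snippet is truncated, but the standard multivariate update it implements is theta := theta - (alpha/m) * X'(X*theta - y). A NumPy sketch under that assumption, with a toy dataset of my own:

```python
import numpy as np

def gradient_descent_multi(X, y, theta, alpha, num_iters):
    """Batch gradient descent for linear regression; records the cost per iteration."""
    m = len(y)
    J_history = np.zeros(num_iters)
    for it in range(num_iters):
        theta = theta - (alpha / m) * (X.T @ (X @ theta - y))
        J_history[it] = ((X @ theta - y) ** 2).sum() / (2 * m)
    return theta, J_history

X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 3.0, 4.0])          # exactly y = 1 + x
theta, J_hist = gradient_descent_multi(X, y, np.zeros(2), 0.1, 2000)
print(theta)                            # converges toward [1, 1]
```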
Summary: Financial Management — Time Limit: 1000MS, Memory Limit: 10000K, Total Submissions: 182193, Accepted: 68783. Description: Larry graduated this year and final…
posted @ 2017-03-06 19:50 KennyRom · views (275) · comments (0) · likes (0)
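This classic problem (POJ 1004) asks for the average of twelve monthly balances, printed with a dollar sign and two decimals. A minimal sketch using the judge's sample input:

```python
# Financial Management: average twelve monthly balances
def average_balance(balances):
    return sum(balances) / len(balances)

months = [100.00, 489.12, 12454.12, 1234.10, 823.05, 109.20,
          5.27, 1542.25, 839.18, 83.99, 1295.01, 1.75]
avg = average_balance(months)
print(f"${avg:.2f}")  # $1581.42 for the sample input
```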