Published on October 20, 2020 by Sukuna
Problem 1: WarmUp
Let's write a function that returns an identity matrix. There's really not much to say about this one.
function A = warmUpExercise()
%WARMUPEXERCISE Example function in octave
% A = WARMUPEXERCISE() is an example function that returns the 5x5 identity matrix
A = [];
% ============= YOUR CODE HERE ==============
% Instructions: Return the 5x5 identity matrix
% In octave, we return values by defining which variables
% represent the return values (at the top of the file)
% and then set them accordingly.
A = eye(5);
% ===========================================
end
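A one-line check at the Octave prompt confirms the behavior; eye(5) builds the 5x5 identity directly:
A = warmUpExercise();  % A is now the 5x5 identity matrix
disp(A)                % ones on the diagonal, zeros everywhere else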
Problems 2 & 5: Computing the Cost Function
function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
% J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
% parameter for linear regression to fit the data points in X and y
% Initialize some useful values
m = length(y); % number of training examples
% You need to return the following variables correctly
J = 0;
% =================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
% You should set J to the cost.
f = X * theta;            % predictions h_theta(x^(i)) for every training example
f = f - y;                % residuals: prediction minus true value
J = 1/(2*m) * (f' * f);   % inner product sums the squared residuals; scale by 1/(2m)
% ============================================================
end
Recall the formula. The cost function is

$$J(\theta) = \frac{1}{2m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)^2$$

Since $h_\theta(x) = \theta^\top x$ (vectorized over the whole training set: $X\theta$), we can use this definition to compute the predicted values and then subtract $y$. Note that the hypothesis here is homogeneous (the intercept is absorbed into $\theta$ through the column of ones in $X$); we won't discuss the non-homogeneous case for now. How do we perform the summation? Using the inner-product property of vectors, $f^\top f$ gives the sum of squared differences over every training example in one step. What if the model were non-homogeneous? Then we would simply subtract a bias vector as well.
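As a quick sanity check, computeCost should return exactly zero when the hypothesis passes through every point. The toy dataset below is made up for illustration and is not part of the assignment:
X = [1 1; 1 2; 1 3];          % design matrix: column of ones plus one feature
y = [1; 2; 3];                % targets lying exactly on the line y = x
theta = [0; 1];               % intercept 0, slope 1
J = computeCost(X, y, theta)  % prints J = 0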
Problems 3 & 6: Gradient Descent
function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
%GRADIENTDESCENTMULTI Performs gradient descent to learn theta
% theta = GRADIENTDESCENTMULTI(x, y, theta, alpha, num_iters) updates theta by
% taking num_iters gradient steps with learning rate alpha
% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);
for iter = 1:num_iters
% ================ YOUR CODE HERE ======================
% Instructions: Perform a single gradient step on the parameter vector
% theta.
%
% Hint: While debugging, it can be useful to print out the values
% of the cost function (computeCostMulti) and gradient here.
%
hyp = X*theta;
dJ = X'*(hyp-y); % the matrix product multiplies each residual by its features and sums over all examples at once
theta = theta - (alpha/m)*dJ;
% ============================================================
% Save the cost J in every iteration
J_history(iter) = computeCost(X, y, theta);
end
end
The code Andrew Ng provides contains an iteration loop, and J_history is there to save the value of J at every iteration. In practice you must record how J changes, since that is how you judge when gradient descent should stop. At each step we first compute the hypothesis $h_\theta(x) = X\theta$ (stored in hyp), then build the gradient from it; the update being implemented is

$$\theta := \theta - \frac{\alpha}{m} X^\top (X\theta - y),$$

the vectorized form of $\theta_j := \theta_j - \frac{\alpha}{m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)x_j^{(i)}$.
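Here is a minimal sketch of calling it and watching J fall; the dataset, learning rate, and iteration count are made-up values for illustration:
X = [ones(3,1), (1:3)'];      % toy design matrix with an intercept column
y = [2; 3; 4];                % targets on the line y = x + 1
theta = zeros(2, 1);          % start from the zero vector
alpha = 0.1;                  % learning rate (assumed; tune per dataset)
num_iters = 400;
[theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters);
plot(1:num_iters, J_history); % J should decrease monotonically when alpha is small enough
xlabel('iteration'); ylabel('J');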
Problem 4: Feature Normalization
mu = mean(X);    % column-wise mean of every feature
sigma = std(X);  % column-wise standard deviation of every feature
X_norm = (X - ones(size(X,1),1)*mu) ./ (ones(size(X,1),1)*sigma); % replicate mu and sigma across rows, then shift and scale
mean: computes the mean of every feature (column) of X; mean(X, dim) averages along a chosen dimension. std: computes the standard deviation (not the range); note that its dimension argument comes third, as in std(X, 0, dim).
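One point worth remembering: a new example has to be normalized with the training set's mu and sigma before you predict on it. A minimal sketch, with the example values chosen purely for illustration:
x_new = [1650, 3];                 % a raw, un-normalized example
x_scaled = (x_new - mu) ./ sigma;  % reuse the statistics computed on the training set
price = [1, x_scaled] * theta;     % prepend the intercept term, then predict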
Problem 7: The Normal Equation
function [theta] = normalEqn(X, y)
%NORMALEQN Computes the closed-form solution to linear regression
% NORMALEQN(X,y) computes the closed-form solution to linear
% regression using the normal equations.
theta = zeros(size(X, 2), 1);
% ==================== YOUR CODE HERE ======================
% Instructions: Complete the code to compute the closed form solution
% to linear regression and put the result in theta.
%
theta = pinv(X' * X) * X' * y;
% ============================================================
end
I believe the instructor mentioned this in lecture: the closed-form solution is $\theta = (X^\top X)^{-1}X^\top y$, and pinv (the pseudo-inverse) is used instead of inv so the code still works even when $X^\top X$ is singular.
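For completeness, here is where the normal equation comes from; this is the standard least-squares derivation rather than anything specific to the assignment. Setting the gradient of the cost to zero:

$$\nabla_\theta J(\theta) = \frac{1}{m}X^\top(X\theta - y) = 0 \;\Rightarrow\; X^\top X\,\theta = X^\top y \;\Rightarrow\; \theta = (X^\top X)^{-1}X^\top y.$$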