function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.
    %

    predictions = X * theta;                     % hypothesis for every example
    difference = predictions - y;                % prediction errors
    partialDerivative = X' * difference;         % gradient summed over examples
    step = alpha * ((1/m) * partialDerivative);  % descent step scaled by 1/m
    theta = theta - step;                        % simultaneous parameter update

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);

end

end
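
In vectorized form, each iteration applies the update theta := theta - (alpha/m) * X' * (X*theta - y). Below is a minimal usage sketch on made-up toy data; it assumes computeCost(X, y, theta) from the same exercise is on the path, since gradientDescent calls it to record J_history.

% Toy data: y = 1 + 2x, so gradient descent should recover theta ~ [1; 2].
x = (1:10)';                  % single raw feature (made-up data)
y = 1 + 2 * x;                % targets generated from theta = [1; 2]
X = [ones(length(x), 1), x];  % prepend a column of ones for the intercept term
theta = zeros(2, 1);          % start from the zero vector
alpha = 0.02;                 % learning rate (small enough to converge here)
num_iters = 5000;             % number of gradient steps

[theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters);
% theta should end up close to [1; 2], and J_history should be non-increasing;
% a divergent (growing) J_history usually means alpha is too large.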