@alexgmcm
Created August 1, 2012 15:34
Stanford Machine Learning Exercise 2 code
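For reference, the function below implements the regularized logistic-regression cost and gradient from the exercise; the intercept parameter theta(1) is excluded from the penalty:

J(\theta) = \frac{1}{m} \sum_{i=1}^{m} \left[ -y^{(i)} \log h_\theta(x^{(i)}) - \big(1 - y^{(i)}\big) \log\big(1 - h_\theta(x^{(i)})\big) \right] + \frac{\lambda}{2m} \sum_{j=2}^{n} \theta_j^2

\frac{\partial J}{\partial \theta_j} = \frac{1}{m} \sum_{i=1}^{m} \big( h_\theta(x^{(i)}) - y^{(i)} \big) x_j^{(i)} + \frac{\lambda}{m} \theta_j \qquad (j \ge 2)

where h_\theta(x) = \mathrm{sigmoid}(\theta^T x), and the \lambda/m term is dropped for j = 1 (the intercept).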
function [J, grad] = costFunctionReg(theta, X, y, lambda)
%COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
% J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
% theta as the parameter for regularized logistic regression and the
% gradient of the cost w.r.t. the parameters.
% Initialize some useful values
m = length(y); % number of training examples
% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));
% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
% You should set J to the cost.
% Compute the partial derivatives and set grad to the partial
% derivatives of the cost w.r.t. each parameter in theta
hypo = sigmoid(X * theta); % m x 1 vector of predicted probabilities
% Equivalent vectorized form: -y'*log(hypo) - (1-y)'*log(1-hypo), which drops the sum
J = (1/m) * sum(-y .* log(hypo) - (1 - y) .* log(1 - hypo)) + (lambda/(2*m)) * sum(theta(2:end).^2);
% Vectorized gradient: X'*(hypo - y) yields a column vector the same shape as theta
grad = (1/m) * (X' * (hypo - y));
% Regularize every parameter except the intercept theta(1)
grad(2:end) = grad(2:end) + (lambda/m) * theta(2:end);
% =============================================================
end
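
A minimal sketch of calling the function, assuming the exercise's sigmoid helper (g = 1 ./ (1 + exp(-z))) is on the path; the dataset below is made up for illustration:

% Tiny made-up dataset: 3 examples, intercept column plus 2 features
X = [ones(3,1), [1 2; 2 3; 3 4]];
y = [0; 0; 1];
theta = zeros(3, 1);
lambda = 1;

[J, grad] = costFunctionReg(theta, X, y, lambda);
fprintf('Cost at initial theta: %f\n', J); % log(2) ~ 0.693 when theta is all zeros
disp(grad);                                % column vector, same shape as theta

With theta all zeros the regularization term vanishes and every prediction is 0.5, so the cost reduces to log(2).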