
@dmichael
Created November 1, 2011 03:14
logistic-regression-cost.m
function [J, grad] = costFunction(theta, X, y)
% Logistic regression cost function and gradient (partial derivatives).
% theta: parameter vector (n x 1), X: design matrix (m x n), y: labels (m x 1).
m = length(y);            % number of training examples
h = sigmoid(X * theta);   % hypothesis h_theta(x); sigmoid must be defined elsewhere
cost = (y' * log(h)) + ((1 - y') * log(1 - h));
J = -1/m * cost;
grad = 1/m * (X' * (h - y));
end
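
For context, a minimal usage sketch in Octave. The `sigmoid` helper and the toy data below are assumptions for illustration, not part of the original gist:

```matlab
% Sketch only: sigmoid and the toy data are assumed, not from the gist.
function g = sigmoid(z)
  g = 1 ./ (1 + exp(-z));   % elementwise logistic function
end

X = [ones(4,1), [1; 2; 3; 4]];  % design matrix with intercept column (m = 4)
y = [0; 0; 1; 1];               % binary labels
theta = zeros(2, 1);            % initial parameters

[J, grad] = costFunction(theta, X, y);
% With theta = 0, h = 0.5 everywhere, so J = log(2) ~= 0.6931
```

In practice this function would be passed to an optimizer such as `fminunc`, which uses the returned `J` and `grad` to minimize the cost over `theta`.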