@marcelcaraciolo
Created October 28, 2011 03:39
Linear regression
#Evaluate the linear regression
import numpy as np


def compute_cost(X, y, theta):
    '''
    Compute the cost for linear regression: the mean squared error
    between the predictions X.dot(theta) and the targets y, halved so
    the gradient comes out clean. theta is the parameter vector
    (intercept and slope) being learned.
    '''
    # Number of training samples
    m = y.size

    # Predicted values for the current theta
    predictions = X.dot(theta).flatten()

    # Squared residuals
    sqErrors = (predictions - y) ** 2

    J = (1.0 / (2 * m)) * sqErrors.sum()
    return J
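
# A quick sanity check of compute_cost (the toy values here are
# illustrative, not part of the original gist): with theta at zero every
# prediction is zero, so J = (1 / (2m)) * sum(y ** 2).
# For example:
#   X_demo = np.array([[1.0, 0.0], [1.0, 1.0]])
#   y_demo = np.array([1.0, 2.0])
#   theta_demo = np.zeros(shape=(2, 1))
#   compute_cost(X_demo, y_demo, theta_demo)  # (1 + 4) / (2 * 2) = 1.25
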
def gradient_descent(X, y, theta, alpha, num_iters):
    '''
    Performs gradient descent to learn theta by taking
    num_iters gradient steps with learning rate alpha.
    '''
    m = y.size
    J_history = np.zeros(shape=(num_iters, 1))

    for i in range(num_iters):
        predictions = X.dot(theta).flatten()

        # Gradient components: residuals weighted by each feature column
        errors_x1 = (predictions - y) * X[:, 0]
        errors_x2 = (predictions - y) * X[:, 1]

        # Simultaneous update (both error terms are computed above,
        # before either parameter changes)
        theta[0][0] = theta[0][0] - alpha * (1.0 / m) * errors_x1.sum()
        theta[1][0] = theta[1][0] - alpha * (1.0 / m) * errors_x2.sum()

        # Record the cost at every iteration to monitor convergence
        J_history[i, 0] = compute_cost(X, y, theta)

    return theta, J_history
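

# A minimal end-to-end sketch (the toy data and hyperparameters below are
# illustrative assumptions, not part of the original gist). X needs a
# leading column of ones so theta[0] acts as the intercept.
if __name__ == '__main__':
    # Toy data: y is roughly 1 + 2*x, so we expect theta near [[1], [2]]
    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

    # Design matrix: a column of ones (intercept) plus the feature
    X = np.column_stack((np.ones(x.size), x))

    theta = np.zeros(shape=(2, 1))
    theta, J_history = gradient_descent(X, y, theta, alpha=0.05,
                                        num_iters=1500)

    print(theta.flatten())   # learned parameters, close to [1, 2]
    print(J_history[-1, 0])  # final cost; should be near zero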

nam157 commented Oct 3, 2020

Let me ask: what is 'theta'?
