@FisherKK
Last active August 15, 2018 23:58
def train(X, y, model_parameters, learning_rate=0.1, iterations=100):
    # Make a prediction for every data sample
    predictions = [predict(x, model_parameters) for x in X]

    # Calculate the initial cost for the model - MSE
    lowest_error = mse(predictions, y)

    for i in range(iterations):
        # Sum up partial gradients over every data sample, for every parameter in the model
        accumulated_grad_w0 = 0.0
        accumulated_grad_b = 0.0
        for x, y_target in zip(X, y):
            error = predict(x, model_parameters) - y_target
            accumulated_grad_w0 += error * x[0]
            accumulated_grad_b += error

        # Calculate the mean of the accumulated gradients
        w_grad = (1.0 / len(X)) * accumulated_grad_w0
        b_grad = (1.0 / len(X)) * accumulated_grad_b

        # Update parameters by a small step along the negative gradient
        model_parameters["w"][0] = model_parameters["w"][0] - learning_rate * w_grad
        model_parameters["b"] = model_parameters["b"] - learning_rate * b_grad

    return model_parameters