@mh-github
Created December 4, 2015 19:47
require 'csv'

# Linear model: y = m*x + b, where m is the slope and b the y-intercept.

# Mean squared error of the line (b, m) over the data points.
def compute_error_for_line_given_points(b, m, points)
  total_error = 0.0 # start from a Float so the final division is not integer division
  points.each do |x, y|
    total_error += (y - (m * x + b)) ** 2
  end
  total_error / points.length
end
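A quick standalone sanity check of the error function (the definition is repeated here so the snippet runs on its own; the data points are hypothetical): for points lying exactly on the line y = 2x + 1, the mean squared error at b = 1, m = 2 should be zero.

```ruby
# Mean squared error of the line (b, m) over the data points.
def compute_error_for_line_given_points(b, m, points)
  total_error = 0.0
  points.each { |x, y| total_error += (y - (m * x + b)) ** 2 }
  total_error / points.length
end

# Hypothetical data: three points on y = 2x + 1.
points = [[0, 1], [1, 3], [2, 5]]
puts compute_error_for_line_given_points(1, 2, points) # prints 0.0
```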
# One step of gradient descent: move (b, m) against the gradient of the
# mean squared error, where
#   dE/dm = -(2/n) * sum(x * (y - (m*x + b)))
#   dE/db = -(2/n) * sum(y - (m*x + b))
def step_gradient(b_current, m_current, points, learning_rate)
  b_gradient = 0.0
  m_gradient = 0.0
  n = points.length.to_f
  points.each do |x, y|
    error = y - (m_current * x + b_current)
    m_gradient += -(2 / n) * x * error
    b_gradient += -(2 / n) * error
  end
  new_b = b_current - (learning_rate * b_gradient)
  new_m = m_current - (learning_rate * m_gradient)
  [new_b, new_m]
end
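A standalone check of the update rule (again self-contained, with hypothetical data): at the true line both partial derivatives vanish, so a gradient step from b = 1, m = 2 on points drawn from y = 2x + 1 should leave the parameters unchanged.

```ruby
# One gradient-descent step on the mean squared error.
def step_gradient(b_current, m_current, points, learning_rate)
  b_gradient = 0.0
  m_gradient = 0.0
  n = points.length.to_f
  points.each do |x, y|
    error = y - (m_current * x + b_current)
    m_gradient += -(2 / n) * x * error
    b_gradient += -(2 / n) * error
  end
  [b_current - learning_rate * b_gradient,
   m_current - learning_rate * m_gradient]
end

# Hypothetical data on y = 2x + 1: the gradient at (b = 1, m = 2) is zero,
# so the step returns the same parameters.
points = [[0, 1], [1, 3], [2, 5]]
p step_gradient(1, 2, points, 0.0001) # => [1.0, 2.0]
```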
# Repeatedly apply step_gradient, starting from the given b and m.
def gradient_descent_runner(points, starting_b, starting_m, learning_rate, num_iterations)
  b = starting_b
  m = starting_m
  num_iterations.times do
    b, m = step_gradient(b, m, points, learning_rate)
  end
  [b, m]
end
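An end-to-end sketch of the runner on synthetic, noiseless data (all definitions repeated so the snippet runs without data.csv): starting from b = 0, m = 0, the error after 1000 iterations should be well below the starting error.

```ruby
# Mean squared error of the line (b, m) over the data points.
def compute_error_for_line_given_points(b, m, points)
  points.sum { |x, y| (y - (m * x + b)) ** 2 } / points.length.to_f
end

# One gradient-descent step on the mean squared error.
def step_gradient(b_current, m_current, points, learning_rate)
  b_gradient = 0.0
  m_gradient = 0.0
  n = points.length.to_f
  points.each do |x, y|
    error = y - (m_current * x + b_current)
    m_gradient += -(2 / n) * x * error
    b_gradient += -(2 / n) * error
  end
  [b_current - learning_rate * b_gradient,
   m_current - learning_rate * m_gradient]
end

# Repeatedly apply step_gradient from the given start.
def gradient_descent_runner(points, b, m, learning_rate, num_iterations)
  num_iterations.times { b, m = step_gradient(b, m, points, learning_rate) }
  [b, m]
end

# Hypothetical noiseless data on y = 2x + 1.
points = (0..20).map { |x| [x, 2 * x + 1] }
before = compute_error_for_line_given_points(0, 0, points)
b, m = gradient_descent_runner(points, 0, 0, 0.0001, 1000)
after = compute_error_for_line_given_points(b, m, points)
puts "error: #{before} -> #{after}"
```

With this small learning rate the descent is stable on this data and the fitted (b, m) drifts toward (1, 2); exact convergence would need more iterations or a larger step.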
def run
  # Expects data.csv with one "x,y" row per point; :numeric converts fields to numbers.
  points = CSV.read('data.csv', converters: :numeric)
  learning_rate = 0.0001
  initial_b = 0 # initial y-intercept guess
  initial_m = 0 # initial slope guess
  num_iterations = 1000
  puts "Starting gradient descent at b = #{initial_b}, m = #{initial_m}, " \
       "error = #{compute_error_for_line_given_points(initial_b, initial_m, points)}"
  puts "Running..."
  b, m = gradient_descent_runner(points, initial_b, initial_m, learning_rate, num_iterations)
  puts "After #{num_iterations} iterations b = #{b}, m = #{m}, " \
       "error = #{compute_error_for_line_given_points(b, m, points)}"
end

run