@jeovazero
Created June 6, 2021 03:09
# linear_regression
# 2021-06-06
# author: jeovazero
# Σ is an alias for Julia's built-in sum
Σ = sum

# Hypothesis: the dot product θ · x
h(θ, x) = Σ(θ .* x)

# Cost function J(θ) = (1/2) Σᵢ (h(θ, xᵢ) - yᵢ)²
j(θ, x, y) = 1 / 2 * Σ([(h(θ, x[i]) - y[i])^2 for i in 1:length(y)])

# Negative gradient of J: -∂J/∂θⱼ = Σᵢ (yᵢ - h(θ, xᵢ)) xᵢⱼ.
# Because the sign is already flipped, the update below *adds* the
# learning_rate * grad_θ term, which is equivalent to θ := θ - α ∇J.
grad_j(θ, x, y) =
    [Σ([(y[i] - h(θ, x[i])) * x[i][j] for i in 1:length(y)]) for j in 1:length(θ)]

# Batch gradient descent: iterate until the cost falls below `threshold`
function linear_regression(initial_theta, x, y, learning_rate, threshold)
    local θ = initial_theta
    local grad_θ = grad_j(θ, x, y)
    while j(θ, x, y) > threshold
        θ = [θ[i] + learning_rate * grad_θ[i] for i in 1:length(θ)]
        grad_θ = grad_j(θ, x, y)
    end
    return θ
end
# Training data: each x[i] is a feature vector, and y[i] = x[i] · [1, 2, 3]
t = [1, 1, 1]   # initial guess for θ
x = [[1, 2, 3], [2, 3, 1], [5, 6, 2]]
y = [14, 11, 23]
e = 0.00001     # cost threshold
a = 0.0001      # learning rate
# optimal_answer = [1, 2, 3]
println("theta: ", linear_regression(t, x, y, a, e))
# output:
# theta: [1.009970699169594, 1.9910891910904216, 3.0027040755191465]
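# As a sanity check (a minimal sketch, not part of the original gist): with
# three equations and three unknowns, the same θ can be recovered exactly by
# solving the linear system X θ = y with Julia's left-division operator, and
# it agrees with the gradient-descent result above up to the chosen threshold.
#
# ```julia
# # Exact cross-check, independent of the gradient-descent code:
# # X collects the feature vectors as rows; X \ y solves X θ = y directly
# X = [1 2 3; 2 3 1; 5 6 2]
# y = [14, 11, 23]
# θ_exact = X \ y
# println("exact theta: ", θ_exact)   # ≈ [1.0, 2.0, 3.0]
# ```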