Last active October 1, 2019 16:24
Gradient Descent Example Python
# Given f(x) = x**4 - 3*x**3 + 2, the derivative is f'(x) = 4*x**3 - 9*x**2
# let's start at x = 6
curr_x = 6
gamma = 0.001          # step size (learning rate)
precision = 0.0000001  # stop once steps are smaller than this
step_size = 1
max_iterations = 1000
i = 0
df = lambda x: (4 * x**3) - (9 * x**2)

while step_size > precision and i < max_iterations:
    prev_x = curr_x
    curr_x -= gamma * df(prev_x)          # move against the gradient
    step_size = abs(curr_x - prev_x)
    i += 1

print("The local minimum occurs at:", curr_x)
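As a sanity check (assuming the same f(x) = x**4 - 3x**3 + 2 from the comment above), the minimum can be found analytically: f'(x) = 4x**3 - 9x**2 = 0 at x = 0 (a flat point) and at x = 9/4 = 2.25 (the local minimum). The loop should converge close to 2.25:

```python
# Sanity check of the descent loop against the analytic answer x = 9/4.
f = lambda x: x**4 - 3 * x**3 + 2
df = lambda x: 4 * x**3 - 9 * x**2

curr_x = 6.0
for _ in range(1000):
    curr_x -= 0.001 * df(curr_x)  # same gamma and iteration budget as above

print(curr_x)  # approaches 2.25
print(f(2.25) <= f(curr_x))  # the analytic minimiser is at least as low
```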
current        // Initial guess at where the optimum lies
gamma          // A step size determined by the user
precision      // How precise the answer must be
step_size = 1  // The difference between the current and previous steps
max_iterations // How long we are willing to let the algorithm run
i = 0          // Iteration counter
def df         // Function that returns our derivative / partial derivatives

while step_size > precision and i < max_iterations:
    previous = current                        // retain the old position for comparison
    current = previous - gamma * df(previous) // the next step is in the negative gradient direction
    step_size = abs(current - previous)       // determine how far we have moved
    i++
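The pseudocode above can be packaged as a reusable Python function. This is a minimal sketch: the function name and parameter defaults are illustrative, not part of the original gist, and the defaults are not tuned for any particular problem.

```python
def gradient_descent(df, current, gamma=0.001,
                     precision=1e-7, max_iterations=1000):
    """Minimise a function via gradient descent, given its derivative df
    and an initial guess current. Returns the final position."""
    step_size = 1.0
    i = 0
    while step_size > precision and i < max_iterations:
        previous = current                         # retain the old position
        current = previous - gamma * df(previous)  # step against the gradient
        step_size = abs(current - previous)        # how far we moved
        i += 1
    return current

# Reproduces the example above: f'(x) = 4x**3 - 9x**2, starting at x = 6.
x_min = gradient_descent(lambda x: 4 * x**3 - 9 * x**2, current=6)
print(x_min)  # close to the analytic minimum at x = 2.25
```

Passing the derivative in as a callable keeps the loop generic: the same function works for any one-dimensional objective whose derivative you can supply.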