Created December 23, 2019 04:55
def update_parameters_with_gd(parameters, grads, learning_rate):
    """
    Update parameters using one step of gradient descent.

    Arguments:
    parameters -- python dictionary containing your parameters to be updated:
                    parameters['W' + str(l)] = Wl
                    parameters['b' + str(l)] = bl
    grads -- python dictionary containing your gradients to update each parameter:
                    grads['dW' + str(l)] = dWl
                    grads['db' + str(l)] = dbl
    learning_rate -- the learning rate, scalar.

    Returns:
    parameters -- python dictionary containing your updated parameters
    """
    L = len(parameters) // 2  # number of layers in the neural network

    # Update rule for each parameter: theta = theta - learning_rate * d(theta)
    for l in range(L):
        parameters["W" + str(l + 1)] = parameters["W" + str(l + 1)] - learning_rate * grads["dW" + str(l + 1)]
        parameters["b" + str(l + 1)] = parameters["b" + str(l + 1)] - learning_rate * grads["db" + str(l + 1)]

    return parameters
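As a minimal usage sketch, here is the same update applied to a hypothetical one-layer network (the parameter shapes and values below are illustrative, not from the original gist):

```python
import numpy as np

def update_parameters_with_gd(parameters, grads, learning_rate):
    # One gradient-descent step: theta = theta - learning_rate * d(theta)
    L = len(parameters) // 2  # number of layers
    for l in range(L):
        parameters["W" + str(l + 1)] -= learning_rate * grads["dW" + str(l + 1)]
        parameters["b" + str(l + 1)] -= learning_rate * grads["db" + str(l + 1)]
    return parameters

# Hypothetical one-layer network: W1 is 1x2, b1 is 1x1
parameters = {"W1": np.array([[1.0, 2.0]]), "b1": np.array([[0.0]])}
grads = {"dW1": np.array([[0.5, 0.5]]), "db1": np.array([[1.0]])}

updated = update_parameters_with_gd(parameters, grads, learning_rate=0.1)
print(updated["W1"])  # [[0.95 1.95]]
print(updated["b1"])  # [[-0.1]]
```

Note that the update mutates the dictionary in place and also returns it, matching the behavior of the function above.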