@SuvroBaner
Created December 31, 2019 06:52
import numpy as np
import tensorflow as tf
print(tf.__version__) # 1.15.0
w = tf.Variable(0, dtype = tf.float32) # defining the parameter "w"
#cost = tf.add(tf.add(w**2, tf.multiply(-10., w)), 25) # defining the cost function "J" with TF ops
cost = w**2 - 10*w + 25 # same cost J(w) = (w - 5)**2, written with overloaded operators
train = tf.train.GradientDescentOptimizer(0.01).minimize(cost) # learning rate is 0.01 to minimize the cost
init = tf.global_variables_initializer()
session = tf.Session()
session.run(init)
# Now let's run 1000 iteration of Gradient Descent to minimize this cost and see the optimal value of "w"
for i in range(1000):
    session.run(train) # one gradient-descent step per iteration
print(session.run(w)) # w converges toward 5, the minimizer of (w - 5)**2
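For comparison, here is a plain-Python sketch of the same optimization without TensorFlow (illustrative, not part of the original gist). Because J(w) = (w - 5)**2 is a simple quadratic, its gradient dJ/dw = 2w - 10 can be written by hand, and applying the same update rule with the same learning rate should drive w toward 5:

```python
# Hand-rolled gradient descent on J(w) = w**2 - 10*w + 25 = (w - 5)**2.
w = 0.0                # same initial value as the tf.Variable above
learning_rate = 0.01   # same learning rate as GradientDescentOptimizer(0.01)
for _ in range(1000):
    grad = 2 * w - 10            # analytic gradient of (w - 5)**2
    w -= learning_rate * grad    # the update the optimizer performs internally
print(w)  # converges to ~5.0, matching the TensorFlow result
```

Each step shrinks the distance to the minimum by a factor of 0.98, so after 1000 iterations w is within about 1e-8 of 5.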