@suriyadeepan
Created January 19, 2017 15:55
Piece of a model for sentiment classification
import tensorflow as tf

# params
seqlen = metadata['max_words']
state_size = 128
vocab_size = len(metadata['idx2w'])
batch_size = 128
num_classes = 2
tf.reset_default_graph()
x_ = tf.placeholder(tf.int32, [None, seqlen], name='x')
y_ = tf.placeholder(tf.int32, [None], name='y')  # one sentiment label per sequence
# embeddings
embs = tf.get_variable('emb', [vocab_size, state_size])
rnn_inputs = tf.nn.embedding_lookup(embs, x_)
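As a side note, `tf.nn.embedding_lookup` on a 2-D index tensor is essentially row indexing into the embedding matrix. A minimal NumPy sketch with toy sizes (the values here are illustrative, not from the gist):

```python
import numpy as np

# toy embedding matrix: vocab of 5 tokens, state_size of 3
embs = np.arange(15, dtype=np.float32).reshape(5, 3)

# a batch of 2 sequences, each 4 token ids long (like x_ above)
x = np.array([[0, 2, 4, 1],
              [3, 3, 0, 2]])

# embedding lookup == fancy indexing: one embedding row per token id
rnn_inputs = embs[x]
print(rnn_inputs.shape)  # (2, 4, 3): [batch, seqlen, state_size]
```

This is why the result has shape `[batch, seqlen, state_size]`, which is exactly what `tf.nn.dynamic_rnn` expects with its default `time_major=False`.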
# rnn cell
cell = tf.nn.rnn_cell.LSTMCell(state_size, state_is_tuple=True)
# uncomment the lines below to increase depth (assumes num_layers is defined);
# note each layer needs its own cell instance so weights are not shared
#cell = tf.nn.rnn_cell.MultiRNNCell(
#    [tf.nn.rnn_cell.LSTMCell(state_size, state_is_tuple=True) for _ in range(num_layers)],
#    state_is_tuple=True)
init_state = cell.zero_state(batch_size, tf.float32)
rnn_outputs, final_state = tf.nn.dynamic_rnn(cell=cell, inputs=rnn_inputs, initial_state=init_state)
# parameters for softmax layer
W = tf.get_variable('W', [state_size, num_classes])
b = tf.get_variable('b', [num_classes],
                    initializer=tf.constant_initializer(0.0))
# classify from the output at the last time step
# (rnn_outputs has shape [batch, seqlen, state_size], so rnn_outputs[-1]
# would index the batch axis; slice the time axis instead)
last_output = rnn_outputs[:, -1]
logits = tf.matmul(last_output, W) + b
predictions = tf.nn.softmax(logits)
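To sanity-check the final softmax outside the graph, here is a NumPy equivalent of the last op, with toy logits (illustrative numbers, not the gist's actual values):

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over the last axis
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# toy logits for a batch of 2 examples, num_classes = 2
logits = np.array([[2.0, 0.5],
                   [-1.0, 1.0]])
predictions = softmax(logits)
print(predictions.sum(axis=-1))  # each row sums to 1.0
```

Subtracting the row-wise max before exponentiating leaves the result unchanged but avoids overflow for large logits.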