@rohit-gupta
Created September 15, 2017 06:41
Implementing Seq2Seq in Keras with State Transfer and Teacher Forcing
from keras.models import Model
from keras.layers import Input, LSTM

# Fixed batch of 8 sequences, each 16 timesteps of 1024 features
inputs = Input(batch_shape=(8, 16, 1024))

# Encoder LSTM: return the full output sequence plus the final [h, c] states
encoder = LSTM(256, return_state=True, return_sequences=True)
outputs = encoder(inputs)

# TODO Add Teacher Forcing (see the sketch below)
output, states = outputs[0], outputs[1:]

# Decoder LSTM: state transfer via initial_state from the encoder's final states
output = LSTM(256)(output, initial_state=states)

model = Model(inputs, output)
model.summary()
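The TODO above is the missing half of the title. Below is a minimal sketch of how teacher forcing is commonly wired into a Keras seq2seq model, assuming one-hot target sequences and a hypothetical vocabulary size NUM_TOKENS (not part of the original gist). During training the decoder receives the ground-truth previous token as input, while its state is still initialised from the encoder's final states.

from keras.models import Model
from keras.layers import Input, LSTM, Dense, TimeDistributed

NUM_TOKENS = 1024  # hypothetical vocabulary size, an assumption for illustration

# Encoder: only the final [h, c] states are needed for state transfer
encoder_inputs = Input(batch_shape=(8, 16, 1024))
_, state_h, state_c = LSTM(256, return_state=True)(encoder_inputs)

# Decoder inputs are the ground-truth target sequence shifted one step back,
# so at each timestep the decoder sees the true previous token (teacher forcing)
decoder_inputs = Input(batch_shape=(8, 16, NUM_TOKENS))
decoder_outputs = LSTM(256, return_sequences=True)(
    decoder_inputs, initial_state=[state_h, state_c])
decoder_outputs = TimeDistributed(
    Dense(NUM_TOKENS, activation='softmax'))(decoder_outputs)

model = Model([encoder_inputs, decoder_inputs], decoder_outputs)
model.compile(optimizer='rmsprop', loss='categorical_crossentropy')
# model.fit([encoder_data, decoder_input_data], decoder_target_data, ...)

Note that at inference time there is no ground truth to feed, so the decoder would instead be run one step at a time on its own previous prediction; the training graph above only covers the teacher-forced case.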