Gist by @WillKoehrsen, created November 5, 2018.
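The model below assumes three variables are already defined: num_words (the vocabulary size), training_length (the input sequence length), and embedding_matrix (pre-trained embedding weights of shape (num_words, 100)). As a minimal sketch of how such a matrix is often built from pre-trained GloVe vectors, assuming a fitted Keras Tokenizer named tokenizer and a word-to-vector dict glove_vectors (both names hypothetical here):

import numpy as np

embedding_dim = 100
embedding_matrix = np.zeros((num_words, embedding_dim))
for word, idx in tokenizer.word_index.items():
    if idx < num_words:
        vector = glove_vectors.get(word)
        if vector is not None:
            # Copy the pre-trained vector; words without one stay all-zero
            embedding_matrix[idx] = vector

Note that row 0 (the padding index) is left as all zeros, which is exactly what the Masking layer below keys on with mask_value=0.0.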
from keras.models import Sequential
from keras.layers import LSTM, Dense, Dropout, Masking, Embedding

model = Sequential()

# Embedding layer
model.add(
    Embedding(input_dim=num_words,
              input_length=training_length,
              output_dim=100,
              weights=[embedding_matrix],
              trainable=False,
              mask_zero=True))

# Masking layer for pre-trained embeddings
model.add(Masking(mask_value=0.0))

# Recurrent layer
model.add(LSTM(64, return_sequences=False,
               dropout=0.1, recurrent_dropout=0.1))

# Fully connected layer
model.add(Dense(64, activation='relu'))

# Dropout for regularization
model.add(Dropout(0.5))

# Output layer
model.add(Dense(num_words, activation='softmax'))

# Compile the model
model.compile(
    optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
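Once compiled, training follows the usual Keras pattern. A minimal sketch, assuming features is an integer array of shape (n_samples, training_length) and labels holds the next-word index for each sequence (both names hypothetical, and the batch size and epoch count are illustrative):

from keras.utils import to_categorical

# One-hot encode the targets over the full vocabulary, to match
# the softmax output layer and categorical_crossentropy loss
y = to_categorical(labels, num_classes=num_words)

history = model.fit(features, y,
                    batch_size=128,
                    epochs=50,
                    validation_split=0.1)

The one-hot encoding is what ties the Dense(num_words, activation='softmax') output to the categorical_crossentropy loss: each training example is a distribution with all mass on the true next word.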