
@MLWhiz · Created January 7, 2019 07:03
from keras.models import Model
from keras.layers import (Input, Embedding, SpatialDropout1D, Bidirectional,
                          LSTM, GRU, GlobalAveragePooling1D,
                          GlobalMaxPooling1D, concatenate, Dense, Dropout)
from keras import optimizers


def get_model(features, clipvalue=1., num_filters=40, dropout=0.1, embed_size=501):
    # Relies on globals prepared earlier in the kernel: `maxlen`,
    # `max_features` and the pre-trained `embedding_matrix`.
    features_input = Input(shape=(features.shape[1],))
    inp = Input(shape=(maxlen,))
    # Layer 1: pre-trained (Word2Vec) embeddings, frozen during training.
    x = Embedding(max_features, embed_size, weights=[embedding_matrix], trainable=False)(inp)
    # Layer 2: spatial dropout, which drops entire embedding channels.
    x = SpatialDropout1D(dropout)(x)
    # Layer 3: bidirectional LSTM (CuDNNLSTM in the original GPU version).
    x = Bidirectional(LSTM(num_filters, return_sequences=True))(x)
    # Layer 4: bidirectional GRU (CuDNNGRU in the original GPU version).
    # With return_state=True, Bidirectional(GRU) also returns the final
    # hidden state of each direction: x_h (forward) and x_c (backward).
    x, x_h, x_c = Bidirectional(GRU(num_filters, return_sequences=True, return_state=True))(x)
    # Layer 5: pooling over the time dimension.
    avg_pool = GlobalAveragePooling1D()(x)
    max_pool = GlobalMaxPooling1D()(x)
    # Layer 6: concatenate the last forward state, average pool, max pool and
    # the additional hand-crafted features.
    x = concatenate([avg_pool, x_h, max_pool, features_input])
    # Layer 7: a dense layer.
    x = Dense(16, activation="relu")(x)
    # Layer 8: a dropout layer.
    x = Dropout(0.1)(x)
    # Layer 9: output dense layer with one sigmoid unit for our binary classification problem.
    outp = Dense(1, activation="sigmoid")(x)
    # Build and compile the model, clipping gradients to stabilize training.
    model = Model(inputs=[inp, features_input], outputs=outp)
    adam = optimizers.Adam(clipvalue=clipvalue)
    model.compile(loss='binary_crossentropy',
                  optimizer=adam,
                  metrics=['accuracy'])
    return model
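
A minimal usage sketch follows, assuming the kernel-style globals the function relies on (`maxlen`, `max_features`, `embedding_matrix`) and dummy data in their place; every name below other than `get_model` is an illustrative assumption, not part of the original gist.

# --- Usage sketch (illustrative, not from the original gist) ---
import numpy as np

# Hypothetical globals that get_model reads; shapes are placeholders.
maxlen = 70
max_features = 50000
embedding_matrix = np.random.normal(size=(max_features, 501))

# Dummy padded token sequences, hand-crafted features and binary labels.
X_train = np.random.randint(1, max_features, size=(256, maxlen))
train_features = np.random.rand(256, 4)
y_train = np.random.randint(0, 2, size=(256, 1))

model = get_model(train_features)
model.summary()
# Inputs are passed in the same order as Model(inputs=[inp, features_input]).
model.fit([X_train, train_features], y_train, batch_size=64, epochs=1)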