
@shubham0204
Created June 11, 2021 11:36
# Multilayer Perceptron (MLP) block with GELU ( Gaussian Error Linear Units ) activation
import tensorflow as tf

def mlp( x , hidden_dims ):
    # Project the input up to the hidden dimension
    y = tf.keras.layers.Dense( hidden_dims )( x )
    # Apply the GELU non-linearity
    y = tf.nn.gelu( y )
    # Project back down to the input's last dimension
    y = tf.keras.layers.Dense( x.shape[ -1 ] )( y )
    y = tf.keras.layers.Dropout( 0.4 )( y )
    return y
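
A minimal usage sketch of the block above, assuming the Keras functional API; the variable names (`inputs`, `outputs`, `model`) and the shapes 196, 768, and 3072 are illustrative assumptions, not part of the original gist:

# Minimal usage sketch; shapes and hidden size are illustrative assumptions
inputs = tf.keras.Input( shape=( 196 , 768 ) )   # e.g. a sequence of 196 tokens, each 768-dim
outputs = mlp( inputs , hidden_dims=3072 )       # expand to 3072, then project back to 768
model = tf.keras.Model( inputs , outputs )
model.summary()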