@pythonlessons
Created August 16, 2023 12:45
transformer_attention
import numpy as np

# PositionalEmbedding and FeedForward are defined in earlier parts of this tutorial
encoder_vocab_size = 1000
d_model = 512

# Embed a random batch of token ids and add positional encodings
encoder_embedding_layer = PositionalEmbedding(encoder_vocab_size, d_model)
random_encoder_input = np.random.randint(0, encoder_vocab_size, size=(1, 100))
encoder_embeddings = encoder_embedding_layer(random_encoder_input)
print("encoder_embeddings shape", encoder_embeddings.shape)  # (1, 100, 512)

# Position-wise feed-forward network with inner dimension dff=2048
feed_forward_layer = FeedForward(d_model, dff=2048)
feed_forward_output = feed_forward_layer(encoder_embeddings)
print("feed_forward_output shape", feed_forward_output.shape)  # (1, 100, 512)