
@pythonlessons
Created August 16, 2023 12:45
transformer_attention
import numpy as np

# PositionalEmbedding and GlobalSelfAttention are assumed to be defined
# earlier in the tutorial series.
encoder_vocab_size = 1000
d_model = 512

# Embed a random batch of token ids and add positional information
encoder_embedding_layer = PositionalEmbedding(encoder_vocab_size, d_model)
random_encoder_input = np.random.randint(0, encoder_vocab_size, size=(1, 100))
encoder_embeddings = encoder_embedding_layer(random_encoder_input)
print("encoder_embeddings shape", encoder_embeddings.shape)
# Apply global self-attention over the encoder embeddings
global_self_attention_layer = GlobalSelfAttention(num_heads=2, key_dim=512)
global_self_attention_output = global_self_attention_layer(encoder_embeddings)
print("global_self_attention_output shape", global_self_attention_output.shape)