@Cheneng · Created March 13, 2018 02:29
A tutorial on pack_padded_sequence and pad_packed_sequence in PyTorch.
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pad_packed_sequence, pack_padded_sequence

# Create some fake data, padded with 0 and sorted by length in descending
# order (pack_padded_sequence expects this ordering by default)
data = torch.tensor([[3, 4, 5, 1, 9],
                     [3, 1, 3, 4, 0],
                     [5, 3, 6, 0, 0]], dtype=torch.long)
# Record the true (unpadded) length of each sequence
seq_len = [5, 4, 3]
# Embedding layer: vocabulary of 10, 5-dim vectors; index 0 is the pad token
embed = nn.Embedding(10, 5, padding_idx=0)
rnn = nn.RNN(5, 5, batch_first=True)
embed_data = embed(data)
# Pack the padded batch so the RNN skips the pad positions
packed_sequence = pack_padded_sequence(embed_data, seq_len, batch_first=True)
out, hidden = rnn(packed_sequence)
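# For intuition: a PackedSequence stores the non-pad timesteps of the whole
# batch concatenated in `.data`, plus `.batch_sizes`, the number of sequences
# still active at each timestep (a quick, optional inspection):
print(packed_sequence.batch_sizes)  # tensor([3, 3, 3, 2, 1])
print(packed_sequence.data.shape)   # torch.Size([12, 5]): 5 + 4 + 3 real steps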
# Unpack the output back into a regular zero-padded tensor
output, seq_len = pad_packed_sequence(out, batch_first=True)
print(output)
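# The padded positions of `output` are zero-filled, and the lengths come back:
print(output.shape)  # torch.Size([3, 5, 5]): (batch, max_len, hidden_size)
print(seq_len)       # tensor([5, 4, 3])

# If the batch is not pre-sorted by length, PyTorch 1.1+ accepts
# enforce_sorted=False and sorts/unsorts internally. A minimal sketch,
# assuming a recent PyTorch version (the *_u names are illustrative):
unsorted_data = torch.tensor([[3, 1, 0, 0, 0],
                              [5, 3, 6, 2, 0],
                              [3, 4, 5, 1, 9]], dtype=torch.long)
packed_u = pack_padded_sequence(embed(unsorted_data), [2, 4, 5],
                                batch_first=True, enforce_sorted=False)
out_u, hidden_u = rnn(packed_u)
output_u, len_u = pad_packed_sequence(out_u, batch_first=True)
print(len_u)  # tensor([2, 4, 5]): rows come back in their original order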