
@alonstern
Last active April 13, 2020 12:35
train the model
import tqdm
import torch.nn as nn
import torch.optim as optim
import torch.utils.data as data


def train_model(model, train_dataset):
    loss_function = nn.NLLLoss()
    optimizer = optim.Adam(model.parameters(), lr=0.001)
    train_loader = data.DataLoader(train_dataset, shuffle=True)

    model.train()
    for sample, tags in tqdm.tqdm(train_loader):
        # The loader returns each item as "[our sample]" rather than "our sample",
        # i.e. with a leading batch dimension, in case multiple samples are trained
        # on at once. No batching is used here, so unwrap the single sample.
        sample = sample[0]
        tags = tags[0]

        model.zero_grad()
        tag_scores = model(sample)
        loss = loss_function(tag_scores, tags)
        loss.backward()
        optimizer.step()
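As a sanity check, train_model can be exercised on a toy tagging dataset. The TinyTagger model and the random data below are hypothetical stand-ins, not part of the original gist; train_model is repeated here so the snippet runs on its own, with the tqdm progress bar left out for brevity:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim
import torch.utils.data as data


def train_model(model, train_dataset):
    # Same loop as above, minus the tqdm wrapper.
    loss_function = nn.NLLLoss()
    optimizer = optim.Adam(model.parameters(), lr=0.001)
    train_loader = data.DataLoader(train_dataset, shuffle=True)
    model.train()
    for sample, tags in train_loader:
        sample, tags = sample[0], tags[0]  # unwrap the batch dimension of 1
        model.zero_grad()
        loss = loss_function(model(sample), tags)
        loss.backward()
        optimizer.step()


class TinyTagger(nn.Module):
    """Hypothetical minimal tagger: embedding + linear layer, log-prob output."""

    def __init__(self, vocab_size=10, embed_dim=8, num_tags=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.linear = nn.Linear(embed_dim, num_tags)

    def forward(self, sample):
        # sample: (seq_len,) tensor of token ids -> (seq_len, num_tags) log-probs,
        # matching what NLLLoss expects as input.
        return F.log_softmax(self.linear(self.embed(sample)), dim=1)


torch.manual_seed(0)
# A plain list of (token_ids, tag_ids) pairs works as a map-style dataset.
dataset = [(torch.randint(0, 10, (5,)), torch.randint(0, 3, (5,)))
           for _ in range(4)]

model = TinyTagger()
train_model(model, dataset)  # one pass over the four toy samples
```

Since DataLoader is constructed without a batch_size, it defaults to batches of one, which is why the loop unwraps sample[0] and tags[0].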