@MLWhiz
Created September 7, 2020 15:11
num_epochs = 5
for epoch in range(num_epochs):
    # Set model to train mode
    model.train()
    for x_batch, y_batch in train_dataloader:
        # Clear gradients accumulated from the previous step
        optimizer.zero_grad()
        # Forward pass - predicted outputs
        pred = model(x_batch)
        # Compute loss and backpropagate gradients
        loss = loss_criterion(pred, y_batch)
        loss.backward()
        # Update the parameters
        optimizer.step()
    # Set model to evaluation mode for validation
    model.eval()
    with torch.no_grad():  # no gradient tracking needed during validation
        for x_batch, y_batch in valid_dataloader:
            pred = model(x_batch)
            val_loss = loss_criterion(pred, y_batch)
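The loop assumes `model`, `optimizer`, `loss_criterion`, `train_dataloader`, and `valid_dataloader` already exist. A minimal setup sketch that would let the loop run on synthetic data (the layer sizes, loss, and hyperparameters here are illustrative choices, not from the gist):

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Synthetic regression data; shapes are assumptions for illustration
X_train, y_train = torch.randn(64, 10), torch.randn(64, 1)
X_valid, y_valid = torch.randn(16, 10), torch.randn(16, 1)
train_dataloader = DataLoader(TensorDataset(X_train, y_train), batch_size=8)
valid_dataloader = DataLoader(TensorDataset(X_valid, y_valid), batch_size=8)

# A small example model, loss, and optimizer for the loop to use
model = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 1))
loss_criterion = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
```

With these definitions in place, the training loop above runs as written; any real use would swap in an actual dataset and model.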