@omarsar
Created August 19, 2018 03:14
for epoch in range(N_EPHOCS):  # loop over the dataset multiple times
    train_running_loss = 0.0
    train_acc = 0.0
    model.train()

    # TRAINING ROUND
    for i, data in enumerate(trainloader):
        # zero the parameter gradients
        optimizer.zero_grad()

        # reset hidden states
        model.hidden = model.init_hidden()

        # get the inputs and reshape them to (batch, seq_len, input_size)
        inputs, labels = data
        inputs = inputs.view(-1, 28, 28)

        # forward + backward + optimize
        outputs = model(inputs)
        loss = criterion(outputs, labels)
        loss.backward()
        optimizer.step()

        # accumulate loss and accuracy for this epoch
        train_running_loss += loss.detach().item()
        train_acc += get_accuracy(outputs, labels, BATCH_SIZE)

    model.eval()
    print('Epoch: %d | Loss: %.4f | Train Accuracy: %.2f'
          % (epoch, train_running_loss / i, train_acc / i))
### output
'''
Epoch: 0 | Loss: 0.7149 | Train Accuracy: 75.81
Epoch: 1 | Loss: 0.2770 | Train Accuracy: 91.46
Epoch: 2 | Loss: 0.2099 | Train Accuracy: 93.58
Epoch: 3 | Loss: 0.1766 | Train Accuracy: 94.50
Epoch: 4 | Loss: 0.1638 | Train Accuracy: 94.94
Epoch: 5 | Loss: 0.1457 | Train Accuracy: 95.44
Epoch: 6 | Loss: 0.1347 | Train Accuracy: 95.81
Epoch: 7 | Loss: 0.1299 | Train Accuracy: 95.87
Epoch: 8 | Loss: 0.1228 | Train Accuracy: 96.06
Epoch: 9 | Loss: 0.1140 | Train Accuracy: 96.45
'''
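
### get_accuracy helper (referenced above but not defined in this gist)
# A minimal sketch under assumed shapes: `outputs` is a (BATCH_SIZE, n_classes)
# tensor of logits and `labels` holds integer class indices. This is an
# illustrative guess at the helper, not necessarily the author's original code.
import torch

def get_accuracy(outputs, labels, batch_size):
    # arg-max over the class dimension gives the predicted class per example
    preds = torch.argmax(outputs, dim=1)
    correct = (preds == labels).sum().item()
    # return accuracy as a percentage so the per-batch values accumulate
    # into the epoch averages printed above
    return 100.0 * correct / batch_size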