@thunderInfy
Created August 6, 2020 22:15
# reset gradients accumulated from the previous iteration
# (assumed not already done elsewhere in the training loop)
optimizer.zero_grad()
# compute the contrastive loss between the query, key and queue representations
loss = loss_function(q, k, queue)
# record this batch's scalar loss value in the epoch losses list
epoch_losses_train.append(loss.item())
# backpropagate to compute gradients of the loss w.r.t. the model parameters
loss.backward()
# update the model parameters with the optimizer
optimizer.step()
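
For context, here is a minimal sketch of what loss_function(q, k, queue) could look like if it implements an InfoNCE-style contrastive objective as in MoCo; this is an assumption based on the argument names in the snippet, not the gist author's actual implementation, and the temperature value t is hypothetical.

import torch
import torch.nn.functional as F

def loss_function(q, k, queue, t=0.07):
    # Hypothetical InfoNCE loss (assumption, not the original implementation).
    # q: (N, C) query features, k: (N, C) positive key features,
    # queue: (C, K) memory bank of negative keys, t: temperature.
    q = F.normalize(q, dim=1)
    k = F.normalize(k, dim=1)
    # positive logits: dot product of each query with its matching key -> (N, 1)
    l_pos = torch.einsum('nc,nc->n', q, k).unsqueeze(-1)
    # negative logits: dot products of each query with all queue entries -> (N, K)
    l_neg = torch.einsum('nc,ck->nk', q, queue)
    # the positive is placed at index 0, so the target class is 0 for every sample
    logits = torch.cat([l_pos, l_neg], dim=1) / t
    labels = torch.zeros(logits.shape[0], dtype=torch.long, device=logits.device)
    # cross-entropy over (1 + K) candidates is the InfoNCE objective
    return F.cross_entropy(logits, labels)

With such a loss, the loop above would also need the usual MoCo bookkeeping elsewhere (momentum update of the key encoder and enqueue/dequeue of k into the queue), which is outside the scope of this snippet.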