import torch

# OPTIMIZER: LARS (layer-wise adaptive rate scaling; the LARS class is
# defined elsewhere in this gist). As in the SimCLR paper, weight decay is
# skipped for batch-norm and bias parameters.
optimizer = LARS(
    [params for params in model.parameters() if params.requires_grad],
    lr=0.2,
    weight_decay=1e-6,
    exclude_from_weight_decay=["batch_normalization", "bias"],
)
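# For reference, a minimal sketch of the layer-wise trust ratio that LARS
# applies (after You et al., 2017, "Large Batch Training of Convolutional
# Networks"). This toy class is illustrative only: it omits the momentum and
# weight-decay exclusions of the actual LARS implementation used above.
class MinimalLARS(torch.optim.Optimizer):
    def __init__(self, params, lr=0.2, weight_decay=1e-6, trust_coef=1e-3):
        defaults = dict(lr=lr, weight_decay=weight_decay, trust_coef=trust_coef)
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self, closure=None):
        for group in self.param_groups:
            for p in group["params"]:
                if p.grad is None:
                    continue
                g = p.grad.add(p, alpha=group["weight_decay"])  # grad + wd * w
                w_norm, g_norm = p.norm(), g.norm()
                # layer-wise trust ratio: eta * ||w|| / ||grad + wd * w||
                if w_norm > 0 and g_norm > 0:
                    trust = group["trust_coef"] * w_norm / g_norm
                else:
                    trust = 1.0
                p.add_(g, alpha=-group["lr"] * float(trust))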
# "decay the learning rate with the cosine decay schedule without restarts"
#SCHEDULER OR LINEAR EWARMUP
warmupscheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lambda epoch : (epoch+1)/10.0, verbose = True)
# SCHEDULER FOR COSINE DECAY: with T_0 = 500, the first restart would only
# occur after 500 epochs, so for shorter runs this behaves as plain cosine
# decay, matching the quote above
mainscheduler = torch.optim.lr_scheduler.CosineAnnealingWarmRestarts(optimizer, T_0=500, eta_min=0.05, last_epoch=-1, verbose=True)
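# A minimal sketch of how the two schedulers might be handed off, called once
# per epoch after training; the 10-epoch warmup boundary is an assumption
# inferred from the (epoch + 1) / 10.0 lambda above, not stated in this gist.
def step_schedulers(epoch):
    if epoch < 10:
        warmupscheduler.step()  # linear warmup for the first 10 epochs
    else:
        mainscheduler.step()    # cosine decay afterwards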
# LOSS FUNCTION: SimCLR's NT-Xent contrastive loss (the SimCLR_Loss class is
# defined elsewhere in this gist)
criterion = SimCLR_Loss(batch_size=128, temperature=0.5)
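# For reference, a minimal sketch of the NT-Xent objective that a SimCLR loss
# computes; this standalone function is illustrative and is not the actual
# SimCLR_Loss class used above.
import torch.nn.functional as F

def nt_xent(z_i, z_j, temperature=0.5):
    """z_i, z_j: (N, d) projections of two augmented views of the same batch."""
    n = z_i.size(0)
    z = F.normalize(torch.cat([z_i, z_j], dim=0), dim=1)  # (2N, d) unit vectors
    sim = z @ z.t() / temperature                         # scaled cosine similarity
    sim.fill_diagonal_(float("-inf"))                     # exclude self-pairs
    # the positive for row i is the other view of the same image: i <-> i + N
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(n)]).to(z.device)
    return F.cross_entropy(sim, targets)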