# ask lightning to use 4 GPUs for training
trainer = Trainer(gpus=[0, 1, 2, 3])
trainer.fit(model)
In the current version of PTL, specifying distributed_backend is mandatory even for dp. This snippet will raise an error, so I'd update line 2 to
trainer = Trainer(gpus=[0, 1, 2, 3], distributed_backend='dp')
for newcomers reading this tutorial, which is currently referenced in the docs.
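For anyone who wants to see the corrected call in context, here is a minimal sketch assuming an older PyTorch Lightning release in which Trainer still accepts the distributed_backend argument (it was later renamed in newer releases). The LitModel class and its random dataset are illustrative placeholders, not part of the gist.

# Minimal sketch: a tiny LightningModule plus the corrected Trainer call.
# Assumes an older PyTorch Lightning version where distributed_backend
# is still a Trainer argument. LitModel and the random data are placeholders.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(32, 2)

    def forward(self, x):
        return self.layer(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.cross_entropy(self(x), y)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

    def train_dataloader(self):
        # random data, just so the example runs end to end
        x = torch.randn(256, 32)
        y = torch.randint(0, 2, (256,))
        return DataLoader(TensorDataset(x, y), batch_size=32)


model = LitModel()

# ask lightning to use 4 GPUs with the DataParallel backend
trainer = pl.Trainer(gpus=[0, 1, 2, 3], distributed_backend='dp')
trainer.fit(model)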