@williamFalcon
Created July 29, 2019 17:50
# ask Lightning to use 4 GPUs for training
from pytorch_lightning import Trainer

trainer = Trainer(gpus=[0, 1, 2, 3])
trainer.fit(model)  # model: a LightningModule defined elsewhere
@lmartak

lmartak commented Jan 15, 2020

In the current version of PTL, specifying distributed_backend is mandatory even for dp. This snippet will raise

MisconfigurationException: When using multiple GPUs set Trainer(distributed_backend=dp) (or ddp)

so I'd update the Trainer(...) line to trainer = Trainer(gpus=[0, 1, 2, 3], distributed_backend='dp') for newcomers reading this tutorial, which is currently referenced in the docs. The corrected snippet is sketched below.
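With that fix applied, the full snippet would read as follows (a minimal sketch; model is assumed to be a LightningModule defined elsewhere, and distributed_backend is the Trainer argument name in the PTL version discussed here):

from pytorch_lightning import Trainer

# explicitly pick the data-parallel backend when training on multiple GPUs
trainer = Trainer(gpus=[0, 1, 2, 3], distributed_backend='dp')  # or 'ddp'
trainer.fit(model)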
