@alapini
Forked from gmarkall/gpumem.py
Created April 1, 2019 15:57
Get memory info for all GPUs in Numba
from numba import cuda

# Iterate over every GPU Numba can see, activating each device's context
# in turn and querying its free/total memory.
gpus = cuda.gpus.lst
for gpu in gpus:
    with gpu:
        meminfo = cuda.current_context().get_memory_info()
        print("%s, free: %s bytes, total: %s bytes" % (gpu, meminfo[0], meminfo[1]))
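The raw byte counts returned by `get_memory_info()` can be hard to read at a glance. A small helper like the following (a hypothetical addition, not part of the original gist) converts them into human-readable units:

```python
# Hypothetical helper (not in the original gist): convert a raw byte count,
# such as the free/total values from get_memory_info(), into a readable string.
def format_bytes(n):
    for unit in ("bytes", "KiB", "MiB", "GiB", "TiB"):
        if n < 1024 or unit == "TiB":
            # Whole bytes need no decimals; larger units get two.
            return "%d bytes" % n if unit == "bytes" else "%.2f %s" % (n, unit)
        n /= 1024.0

print(format_bytes(512))         # 512 bytes
print(format_bytes(1073741824))  # 1.00 GiB
```

In the loop above, the print line could then use `format_bytes(meminfo[0])` and `format_bytes(meminfo[1])` instead of the raw values.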