I'm on Windows 10, and also in possession of an Nvidia GTX 1080, which, I lied to myself, was for getting to grips with deep learning and not for playing video games. I was having trouble installing the keras package with GPU support by following the instructions. The CPU version installed fine, but when I tried to install the GPU-aware version, I got the dreaded "Error: Python module was not found" error. Oh, and I also repeatedly got an error about the wrong version of NumPy being found.
Somehow I managed to fix this. I think the main thing was to create a conda environment manually, rather than leaving it to the install_keras command in R. I don't have much advice on installing CUDA and cuDNN. I definitely installed CUDA 10 (and was using tensorflow 1.13, which is supposed to use it), but I may well have put a previous version on my path during my attempts to get this to work, or maybe the conda steps below took care of all that.
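If you want to confirm after the fact which TensorFlow the conda steps actually gave you, a quick check from Python helps. This is a minimal sketch; as far as I know, tf.test.is_built_with_cuda() is available on both the 1.x and 2.x lines:
import tensorflow as tf
# Which TensorFlow did conda actually install?
print(tf.__version__)
# True only if this build was compiled against CUDA,
# i.e. you got the GPU package rather than the CPU one
print(tf.test.is_built_with_cuda())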
First, I followed Harveen Singh's instructions on installing tensorflow with GPU support:
conda create --name tf_gpu tensorflow-gpu
conda activate tf_gpu
python
Check if tensorflow is using your GPU:
import tensorflow as tf
sess = tf.Session(config=tf.ConfigProto(log_device_placement=True))
It should be pretty obvious if tensorflow has found a GPU, because the word "gpu" will show up as well as "cpu", and it should mention the model of the card.
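One caveat: tf.Session and ConfigProto were removed in TensorFlow 2.x, so the snippet above only works on the 1.x line. If conda gives you a 2.x build instead, the equivalent check is something like this (I believe tf.config.list_physical_devices landed in the stable API around 2.1):
import tensorflow as tf
# Returns a list of PhysicalDevice objects; an empty list means
# tensorflow did not find a GPU
print(tf.config.list_physical_devices('GPU'))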
For some reason, after exiting python, I had to close and reopen my Anaconda window to get it to recognize any commands, but you might not have that problem. At any rate, the next step was to install keras with GPU support:
conda activate tf_gpu
conda install keras-gpu
python
and test that keras could see the GPU (similarly, it should mention seeing a GPU as well as a CPU):
from keras import backend as K
K.get_session().list_devices()
Over in R:
library(keras)
use_condaenv("tf_gpu")
backend()$get_session()$list_devices()
You should see the same message as in Python. I was able to get through the tutorial on MNIST without errors. When calling the fit
function, I saw a message:
I tensorflow/stream_executor/dso_loader.cc:152] successfully opened CUDA library cublas64_100.dll locally
which seems like a good sign.
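Incidentally, another way to list devices, which I believe works on both the 1.x and 2.x lines, goes through a semi-private module, so treat it as a convenience rather than a stable API:
from tensorflow.python.client import device_lib
# Prints every device tensorflow can see; GPU entries include
# a description mentioning the model of the card
print(device_lib.list_local_devices())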
To use keras from Jupyter Lab and to do useful visualizations, install:
conda install -c conda-forge jupyterlab
conda install nb_conda_kernels
conda install pillow
conda install matplotlib
conda install -c conda-forge opencv
and run jupyter lab from inside the activated tf_gpu environment. nb_conda_kernels seems to be needed so Jupyter can use a different environment as the kernel. Pillow is the maintained fork of PIL, needed to use ImageDataGenerators (which pass image files from disk directly to a model during training). Installing opencv lets you import cv2 to generate class activation maps.
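For what it's worth, the ImageDataGenerator route looks roughly like this. A sketch only: the data/train directory and all the numbers are placeholder assumptions, not anything from my setup:
from keras.preprocessing.image import ImageDataGenerator
# Pillow is what actually decodes the image files here
datagen = ImageDataGenerator(rescale=1.0 / 255)
# 'data/train' is hypothetical: one subdirectory per class,
# each containing that class's images
train_generator = datagen.flow_from_directory(
    'data/train',
    target_size=(150, 150),
    batch_size=32,
    class_mode='binary')
# then, e.g., model.fit_generator(train_generator, steps_per_epoch=100, epochs=5)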
If you forget to install some packages before running Jupyter Lab, according to jakevdp, it's safe to install them like this:
import sys
!conda install --yes --prefix {sys.prefix} matplotlib
Thank you very much for this entry:
I am trying to install keras in R from scratch after a full recovery of my Windows 10 system because of a general system problem, and I am trying to verify that everything is properly installed; when I run the commands (from R) that you described, I get the following error message:
Error in py_get_attr_impl(x, name, silent) :
AttributeError: module 'tensorflow.keras.backend' has no attribute 'get_session'
I have created the conda environment using the install_keras(tensorflow="gpu") function, but it does not work! In the past, I also created the conda environments from conda, but now this does not work either... I think the problem is related to the current versions of keras, tensorflow, reticulate, cuDNN and so on now in 2021; please, could you help me identify the correct versions of all these software packages and applications? I am using:
R 4.0.3; CUDA 10.1 (Aug 2019); cudnn64_8; keras 2.3.0.0 (in R); reticulate 1.18 (in R); and tensorflow 2.2.0 (also in R). Do you know if all these versions are compatible?
Best Regards, and thank you very much for your patience!