Also take time to check what image variants the repository hosts under its Tags tab. For example, the nvidia/cuda tags cover various cuDNN and Ubuntu versions. The Overview tab of a Docker Hub repository usually explains what the different tags mean.

If you choose a machine learning framework base image such as tensorflow/tensorflow, make sure the variant includes GPU support if you plan on using GPUs, e.g. tensorflow/tensorflow:1.12.0-gpu-py3, where the gpu part tells you it has been built on top of nvidia/cuda, enabling GPU access.
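One quick way to sanity-check a GPU variant is to run it and ask the framework whether it can see a GPU. This is a sketch that assumes Docker 19.03+ and the NVIDIA Container Toolkit on a GPU host; on older setups you would invoke nvidia-docker instead of using the `--gpus` flag:

```shell
# Start the image and ask TensorFlow whether a GPU is visible
# (requires a GPU host with the NVIDIA Container Toolkit installed)
docker run --rm --gpus all tensorflow/tensorflow:1.12.0-gpu-py3 \
  python -c "import tensorflow as tf; print(tf.test.is_gpu_available())"
```

If this prints True, the container can reach the GPU; a CPU-only variant would print False or fail to load the CUDA libraries.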

By the end of this step, you should have a base Docker image that most closely resembles your project stack, e.g. nvidia/cuda:9.0-cudnn7-runtime-ubuntu16.04 or tensorflow/tensorflow:1.12.0-gpu-py3.
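Your project's Dockerfile then starts from that base image. A minimal sketch, where requirements.txt and train.py are hypothetical stand-ins for your own dependency file and entry point:

```dockerfile
# Base image with CUDA/cuDNN and GPU-enabled TensorFlow preinstalled
FROM tensorflow/tensorflow:1.12.0-gpu-py3

WORKDIR /app

# Install project-specific Python dependencies (hypothetical file)
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the rest of the project and set the default command
COPY . .
CMD ["python", "train.py"]
```

Starting from the framework image rather than a bare nvidia/cuda image saves you from installing the framework and its CUDA bindings yourself, at the cost of a larger base image.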