---
stage: Verify
group: Runner
info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://handbook.gitlab.com/handbook/product/ux/technical-writing/#assignments
title: Using Graphical Processing Units (GPUs)
---
{{< details >}}
- Tier: Free, Premium, Ultimate
- Offering: GitLab.com, GitLab Self-Managed, GitLab Dedicated
{{< /details >}}
{{< history >}}
- Introduced in GitLab Runner 13.9.
{{< /history >}}
GitLab Runner supports the use of Graphical Processing Units (GPUs). The following sections describe the required configuration to enable GPUs for various executors.
## Shell executor

No runner configuration is needed.
## Docker executor

Prerequisites:

- Install the NVIDIA driver.
- Install the NVIDIA Container Toolkit.
Use the `gpus` or `service_gpus` configuration option in the `runners.docker` section:
```toml
[runners.docker]
    gpus = "all"
    service_gpus = "all"
```
## Docker Machine executor

See the documentation for the GitLab fork of Docker Machine.
## Kubernetes executor

No runner configuration should be needed. Be sure to check that the node selector chooses a node with GPU support.

GitLab Runner has been tested on Amazon Elastic Kubernetes Service with GPU-enabled instances.
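One way to steer job pods onto GPU nodes is the Kubernetes executor's `node_selector` setting in `config.toml`. The following is a minimal sketch; the default image and the node label are hypothetical, so substitute whatever label your cluster applies to its GPU nodes:

```toml
[[runners]]
  executor = "kubernetes"
  [runners.kubernetes]
    image = "nvidia/cuda:12.2.0-base-ubuntu22.04"  # hypothetical default job image
    # Schedule job pods only on nodes carrying this (hypothetical) GPU label.
    [runners.kubernetes.node_selector]
      "accelerator" = "nvidia-tesla-p4"
```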
## Validate that GPUs are enabled

You can use runners with NVIDIA GPUs. One way to ensure that a GPU is enabled for a CI job is to run `nvidia-smi` at the beginning of the script. For example:
```yaml
train:
  script:
    - nvidia-smi
```
If GPUs are enabled, the output of `nvidia-smi` displays the available devices. In the following example, a single NVIDIA Tesla P4 is enabled:
```plaintext
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 450.51.06    Driver Version: 450.51.06    CUDA Version: 11.0     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  Tesla P4            Off  | 00000000:00:04.0 Off |                    0 |
| N/A   43C    P0    22W /  75W |      0MiB /  7611MiB |      3%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|  No running processes found                                                 |
+-----------------------------------------------------------------------------+
```
If the hardware does not support a GPU, `nvidia-smi` should fail, either because it's missing or because it can't communicate with the driver:
```plaintext
modprobe: ERROR: could not insert 'nvidia': No such device
NVIDIA-SMI has failed because it couldn't communicate with the NVIDIA driver. Make sure that the latest NVIDIA driver is installed and running.
```