# `nvidia-smi` Input Plugin
This plugin uses a query on the [`nvidia-smi`](https://developer.nvidia.com/nvidia-system-management-interface) binary to pull GPU stats including memory and GPU usage, temperature and other metrics.
### Configuration
```toml
# Pulls statistics from nvidia GPUs attached to the host
[[inputs.nvidia_smi]]
  ## Optional: path to nvidia-smi binary, defaults to $PATH via exec.LookPath
  # bin_path = "/usr/bin/nvidia-smi"

  ## Optional: timeout for GPU polling
  # timeout = "5s"
```
#### Windows
On Windows, `nvidia-smi` is generally located at `C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe`.
On Windows 10, you may also find it at `C:\Windows\System32\nvidia-smi.exe`.
You'll need to escape the `\` within the `telegraf.conf` like this: `C:\\Program Files\\NVIDIA Corporation\\NVSMI\\nvidia-smi.exe`.
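A minimal Windows configuration might therefore look like the following sketch (assuming the default NVSMI install location shown above):
```toml
[[inputs.nvidia_smi]]
  ## Windows path with escaped backslashes
  bin_path = "C:\\Program Files\\NVIDIA Corporation\\NVSMI\\nvidia-smi.exe"
```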
### Metrics
- measurement: `nvidia_smi`
- tags
  - `name` (type of GPU e.g. `GeForce GTX 1070 Ti`)
  - `compute_mode` (The compute mode of the GPU e.g. `Default`)
  - `index` (The port index where the GPU is connected to the motherboard e.g. `1`)
  - `pstate` (Overclocking state for the GPU e.g. `P0`)
  - `uuid` (A unique identifier for the GPU e.g. `GPU-f9ba66fc-a7f5-94c5-da19-019ef2f9c665`)
- fields
  - `fan_speed` (integer, percentage)
  - `memory_free` (integer, MiB)
  - `memory_used` (integer, MiB)
  - `memory_total` (integer, MiB)
  - `power_draw` (float, W)
  - `temperature_gpu` (integer, degrees C)
  - `utilization_gpu` (integer, percentage)
  - `utilization_memory` (integer, percentage)
  - `pcie_link_gen_current` (integer)
  - `pcie_link_width_current` (integer)
  - `encoder_stats_session_count` (integer)
  - `encoder_stats_average_fps` (integer)
  - `encoder_stats_average_latency` (integer)
  - `clocks_current_graphics` (integer, MHz)
  - `clocks_current_sm` (integer, MHz)
  - `clocks_current_memory` (integer, MHz)
  - `clocks_current_video` (integer, MHz)
### Sample Query
The query below could be used to alert on the average temperature of your GPUs, computed in one-minute intervals over the last five minutes:
```
SELECT mean("temperature_gpu") FROM "nvidia_smi" WHERE time > now() - 5m GROUP BY time(1m), "index", "name", "host"
```
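A similar query (a sketch; substitute any other reported field) could track GPU utilization:
```
SELECT mean("utilization_gpu") FROM "nvidia_smi" WHERE time > now() - 5m GROUP BY time(1m), "index", "name", "host"
```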
### Troubleshooting
Check the full output by running the `nvidia-smi` binary manually.
Linux:
```
sudo -u telegraf -- /usr/bin/nvidia-smi -q -x
```
Windows:
```
"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -x
```
Please include the output of this command if opening a GitHub issue.
### Example Output
```
nvidia_smi,compute_mode=Default,host=8218cf,index=0,name=GeForce\ GTX\ 1070,pstate=P2,uuid=GPU-823bc202-6279-6f2c-d729-868a30f14d96 fan_speed=100i,memory_free=7563i,memory_total=8112i,memory_used=549i,temperature_gpu=53i,utilization_gpu=100i,utilization_memory=90i 1523991122000000000
nvidia_smi,compute_mode=Default,host=8218cf,index=1,name=GeForce\ GTX\ 1080,pstate=P2,uuid=GPU-f9ba66fc-a7f5-94c5-da19-019ef2f9c665 fan_speed=100i,memory_free=7557i,memory_total=8114i,memory_used=557i,temperature_gpu=50i,utilization_gpu=100i,utilization_memory=85i 1523991122000000000
nvidia_smi,compute_mode=Default,host=8218cf,index=2,name=GeForce\ GTX\ 1080,pstate=P2,uuid=GPU-d4cfc28d-0481-8d07-b81a-ddfc63d74adf fan_speed=100i,memory_free=7557i,memory_total=8114i,memory_used=557i,temperature_gpu=58i,utilization_gpu=100i,utilization_memory=86i 1523991122000000000
```
### Limitations
Note that there seems to be an issue with getting current memory clock values when the memory is overclocked.
This may or may not apply to everyone, but it's confirmed to be an issue on an EVGA 2080 Ti.