2.2 [Install IPEX-LLM](#for-pytorch-and-huggingface-1) (for PyTorch and HuggingFace)

2.3 [Install IPEX-LLM](#for-llamacpp-and-ollama-1) (for llama.cpp and Ollama)

3. [Use Cases](#3-use-cases)

3.1 [PyTorch](#31-pytorch)

3.2 [Ollama](#32-ollama)

3.3 [llama.cpp](#33-llamacpp)

3.4 [vLLM](#34-vllm)

4. [Troubleshooting](#4-troubleshooting)

4.1 [RuntimeError: could not create an engine](#41-runtimeerror-could-not-create-an-engine)
---
## 1. Linux
To set up and run **vLLM**, follow the [vLLM Quickstart guide](https://github.com/intel-analytics/ipex-llm/blob/main/docs/mddocs/Quickstart/vLLM_quickstart.md).
If you encounter a `RuntimeError` like the one shown above on Linux after running `conda deactivate` and then reactivating your environment with `conda activate env`, the issue is likely caused by the `OCL_ICD_VENDORS` environment variable.
To fix this on Linux, run the following command:
```bash
unset OCL_ICD_VENDORS
```
This will remove the conflicting environment variable and allow your program to function correctly.
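If you are unsure whether the variable is actually set in your current shell, a quick check before unsetting it can be done with standard shell parameter expansion (a generic POSIX-shell sketch, not an IPEX-LLM-specific command):

```shell
# Report whether OCL_ICD_VENDORS is set, and unset it only if it is.
# "${VAR+x}" expands to "x" when VAR is set (even to the empty string).
if [ -n "${OCL_ICD_VENDORS+x}" ]; then
  echo "OCL_ICD_VENDORS is set to: $OCL_ICD_VENDORS"
  unset OCL_ICD_VENDORS
else
  echo "OCL_ICD_VENDORS is not set"
fi
```

Either way, after this snippet runs the variable is guaranteed to be unset in the current shell.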
**Note:** This issue only occurs on Linux systems. It does not affect Windows environments.
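If the variable reappears every time you activate the environment, one option is a conda activation hook, which conda sources automatically from `$CONDA_PREFIX/etc/conda/activate.d/` on each `conda activate`. The sketch below assumes a standard conda layout; the hook filename and the `/tmp/demo-env` fallback (used only when no environment is active) are illustrative, not part of IPEX-LLM:

```shell
# Create an activation hook that unsets OCL_ICD_VENDORS on every activation.
# Run this once while the target environment is active (CONDA_PREFIX is set).
# /tmp/demo-env is a hypothetical fallback path for demonstration only.
HOOK_DIR="${CONDA_PREFIX:-/tmp/demo-env}/etc/conda/activate.d"
mkdir -p "$HOOK_DIR"
printf 'unset OCL_ICD_VENDORS\n' > "$HOOK_DIR/unset_ocl_icd_vendors.sh"
```

With the hook in place, the manual `unset` is no longer needed after each reactivation.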