1 parent 0481a3a commit fccff80
README.md
@@ -189,16 +189,6 @@ CMAKE_ARGS="-DGGML_VULKAN=on" pip install llama-cpp-python

</details>

-<details>
-<summary>Kompute</summary>
-
-To install with Kompute support, set the `GGML_KOMPUTE=on` environment variable before installing:
-
-```bash
-CMAKE_ARGS="-DGGML_KOMPUTE=on" pip install llama-cpp-python
-```
-</details>
-
<details>
<summary>SYCL</summary>
0 commit comments