1 parent f417cce commit c8cd8c1
README.md
@@ -121,7 +121,7 @@ CMAKE_ARGS="-DLLAMA_CUDA=on" pip install llama-cpp-python

It is also possible to install a pre-built wheel with CUDA support. As long as your system meets some requirements:

-- CUDA Version is 12.1, 12.2 or 12.3
+- CUDA Version is 12.1, 12.2, 12.3, or 12.4
- Python Version is 3.10, 3.11 or 3.12

```bash
@@ -133,6 +133,7 @@ Where `<cuda-version>` is one of the following:
- `cu121`: CUDA 12.1
- `cu122`: CUDA 12.2
- `cu123`: CUDA 12.3
+- `cu124`: CUDA 12.4

For example, to install the CUDA 12.1 wheel:
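The wheel-install command itself sits outside these hunks, so the diff only shows the updated `<cuda-version>` list. As a minimal sketch of what the new `cu124` entry enables, assuming the extra-index-url wheel repository layout described elsewhere in the README (the exact URL is not confirmed by this diff):

```bash
# Sketch: install a pre-built llama-cpp-python wheel for CUDA 12.4.
# The index URL below is an assumption based on the project's documented
# wheel repository; substitute cu121/cu122/cu123 for other CUDA versions.
pip install llama-cpp-python \
  --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cu124
```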