Commit 36041c8

Merge branch 'main' of github.com:abetlen/llama_cpp_python into main

2 parents: d015bdb + dcc26f7
4 files changed (+25 -22 lines)

README.md (+1 -1)

@@ -169,7 +169,7 @@ docker run --rm -it -p 8000:8000 -v /path/to/models:/models -e MODEL=/models/ggm
 ## Low-level API
 
 The low-level API is a direct [`ctypes`](https://docs.python.org/3/library/ctypes.html) binding to the C API provided by `llama.cpp`.
-The entire lowe-level API can be found in [llama_cpp/llama_cpp.py](https://github.com/abetlen/llama-cpp-python/blob/master/llama_cpp/llama_cpp.py) and directly mirrors the C API in [llama.h](https://github.com/ggerganov/llama.cpp/blob/master/llama.h).
+The entire low-level API can be found in [llama_cpp/llama_cpp.py](https://github.com/abetlen/llama-cpp-python/blob/master/llama_cpp/llama_cpp.py) and directly mirrors the C API in [llama.h](https://github.com/ggerganov/llama.cpp/blob/master/llama.h).
 
 Below is a short example demonstrating how to use the low-level API to tokenize a prompt:
 

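The tokenization example this README section introduces lies below the hunk and is unchanged by the commit. For orientation, a minimal sketch of driving the low-level bindings might look like the following; the model path is a placeholder, and the calls (`llama_context_default_params`, `llama_load_model_from_file`, `llama_new_context_with_model`, `llama_tokenize`, `llama_free`) assume the `llama.h` API of this period, so exact names and signatures depend on the llama.cpp revision the package pins:

```python
import ctypes
import llama_cpp

# char * parameters take bytes; the model path is a placeholder
params = llama_cpp.llama_context_default_params()
model = llama_cpp.llama_load_model_from_file(b"./models/7b/ggml-model.bin", params)
ctx = llama_cpp.llama_new_context_with_model(model, params)

# array parameters take pre-allocated ctypes arrays;
# reserve room for up to n_ctx tokens
max_tokens = params.n_ctx
tokens = (llama_cpp.llama_token * int(max_tokens))()
n_tokens = llama_cpp.llama_tokenize(
    ctx,
    b"Q: Name the planets in the solar system? A: ",
    tokens,
    max_tokens,
    ctypes.c_bool(True),  # prepend BOS token
)
print(tokens[:n_tokens])  # the return value is how many slots were filled

llama_cpp.llama_free(ctx)
llama_cpp.llama_free_model(model)
```

The ctypes conventions mirror the C API directly: `char *` arguments are passed as `bytes`, array arguments as ctypes arrays, and the caller is responsible for freeing the context and model.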
docker/openblas_simple/Dockerfile (+1 -1)

@@ -9,7 +9,7 @@ COPY . .
 RUN apt update && apt install -y libopenblas-dev ninja-build build-essential
 RUN python -m pip install --upgrade pip pytest cmake scikit-build setuptools fastapi uvicorn sse-starlette pydantic-settings
 
-RUN LLAMA_OPENBLAS=1 pip install llama_cpp_python --verbose
+RUN CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" pip install llama_cpp_python --verbose
 
 # Run the server
 CMD python3 -m llama_cpp.server
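This change tracks upstream llama.cpp, which retired the vendor-specific `LLAMA_OPENBLAS` toggle in favor of the generic `LLAMA_BLAS` / `LLAMA_BLAS_VENDOR` CMake options. Because `llama-cpp-python` builds through scikit-build, such options are forwarded via the `CMAKE_ARGS` environment variable, so the same invocation should work for a local (non-Docker) install as well.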

poetry.lock (+20 -17)

Generated lockfile; the diff is not rendered.

pyproject.toml (+3 -3)

@@ -17,17 +17,17 @@ python = "^3.8.1"
 typing-extensions = "^4.7.1"
 numpy = "^1.24.4"
 diskcache = "^5.6.1"
-uvicorn = { version = "^0.23.1", optional = true }
+uvicorn = { version = "^0.23.2", optional = true }
 fastapi = { version = ">=0.100.0", optional = true }
 sse-starlette = { version = ">=1.6.1", optional = true }
 pydantic-settings = { version = ">=2.0.1", optional = true }
 
 [tool.poetry.group.dev.dependencies]
 black = "^23.7.0"
 twine = "^4.0.2"
-mkdocs = "^1.4.3"
+mkdocs = "^1.5.2"
 mkdocstrings = {extras = ["python"], version = "^0.22.0"}
-mkdocs-material = "^9.1.19"
+mkdocs-material = "^9.1.21"
 pytest = "^7.4.0"
 httpx = "^0.24.1"
 scikit-build = "0.17.6"
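All three bumps stay within the ranges Poetry's caret constraints already permitted (`^1.4.3` means `>=1.4.3,<2.0.0`; `^0.23.1` means `>=0.23.1,<0.24.0`), so the change raises minimum versions without widening any constraint.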
