Commit 43e006a

docs: Remove divider
1 parent 2cc6c9a commit 43e006a

1 file changed: +0, -10 lines changed

README.md

Lines changed: 0 additions & 10 deletions
@@ -1,5 +1,4 @@
 # 🦙 Python Bindings for [`llama.cpp`](https://github.com/ggerganov/llama.cpp)
----
 
 [![Documentation Status](https://readthedocs.org/projects/llama-cpp-python/badge/?version=latest)](https://llama-cpp-python.readthedocs.io/en/latest/?badge=latest)
 [![Tests](https://github.com/abetlen/llama-cpp-python/actions/workflows/test.yaml/badge.svg?branch=main)](https://github.com/abetlen/llama-cpp-python/actions/workflows/test.yaml)
@@ -25,7 +24,6 @@ Documentation is available at [https://llama-cpp-python.readthedocs.io/en/latest
 
 
 ## Installation
----
 
 Install from PyPI (requires a c compiler):
 
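For context on the `## Installation` section touched above: installing from PyPI builds the package from source, which is why a C compiler is required. A minimal sketch is below; the `CMAKE_ARGS` backend flag is illustrative only and depends on your platform and package version.

```bash
# CPU-only build from PyPI (requires a C compiler)
pip install llama-cpp-python

# Optional: select a hardware backend at build time
# (flag name is illustrative and version-dependent)
CMAKE_ARGS="-DLLAMA_METAL=on" pip install llama-cpp-python
```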
@@ -109,7 +107,6 @@ See the above instructions and set `CMAKE_ARGS` to the BLAS backend you want to
 Detailed MacOS Metal GPU install documentation is available at [docs/install/macos.md](https://llama-cpp-python.readthedocs.io/en/latest/install/macos/)
 
 ## High-level API
----
 
 [API Reference](https://llama-cpp-python.readthedocs.io/en/latest/api-reference/#high-level-api)
 
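As a quick reference for the `## High-level API` section above, usage follows the `Llama` constructor call visible in the next hunk's header; the prompt and sampling parameters below are illustrative, not taken from the diff.

```python
from llama_cpp import Llama

# Model path mirrors the one used in the README's own examples
llm = Llama(model_path="./models/7B/llama-model.gguf", n_ctx=2048)

# Simple text completion; max_tokens and stop sequences are illustrative
output = llm(
    "Q: Name the planets in the solar system? A: ",
    max_tokens=32,
    stop=["Q:", "\n"],
    echo=True,
)
print(output["choices"][0]["text"])
```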
@@ -273,7 +270,6 @@ llm = Llama(model_path="./models/7B/llama-model.gguf", n_ctx=2048)
 
 
 ## OpenAI Compatible Web Server
----
 
 `llama-cpp-python` offers a web server which aims to act as a drop-in replacement for the OpenAI API.
 This allows you to use llama.cpp compatible models with any OpenAI compatible client (language libraries, services, etc).
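A usage sketch for the OpenAI Compatible Web Server described above: the server is started with the project's `llama_cpp.server` module, and any OpenAI-compatible client can then point at it. The client snippet assumes the `openai` Python package (v1+) and a server listening on localhost:8000; the `api_key` and `model` values are placeholders.

```python
# Start the server first, e.g.:
#   pip install 'llama-cpp-python[server]'
#   python -m llama_cpp.server --model models/7B/llama-model.gguf
from openai import OpenAI

# Point an OpenAI-compatible client at the local server
client = OpenAI(
    base_url="http://localhost:8000/v1",
    api_key="sk-no-key-required",  # key value is arbitrary (assumption)
)

resp = client.chat.completions.create(
    model="local-model",  # placeholder; a single-model server ignores the name (assumption)
    messages=[{"role": "user", "content": "Name the planets in the solar system."}],
)
print(resp.choices[0].message.content)
```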
@@ -313,7 +309,6 @@ For possible options, see [llama_cpp/llama_chat_format.py](llama_cpp/llama_chat_
 - [Vision API support](https://llama-cpp-python.readthedocs.io/en/latest/server/#multimodal-models)
 
 ## Docker image
----
 
 A Docker image is available on [GHCR](https://ghcr.io/abetlen/llama-cpp-python). To run the server:
 
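The full `docker run` command is truncated in the next hunk's header; a hedged reconstruction is sketched below. The image tag and the GGUF model file name are assumptions, not taken from the diff.

```bash
# Serve a GGUF model from a mounted volume; model file name and image tag are illustrative
docker run --rm -it -p 8000:8000 \
  -v /path/to/models:/models \
  -e MODEL=/models/llama-model.gguf \
  ghcr.io/abetlen/llama-cpp-python:latest
```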
@@ -323,7 +318,6 @@ docker run --rm -it -p 8000:8000 -v /path/to/models:/models -e MODEL=/models/lla
 [Docker on termux (requires root)](https://gist.github.com/FreddieOliveira/efe850df7ff3951cb62d74bd770dce27) is currently the only known way to run this on phones, see [termux support issue](https://github.com/abetlen/llama-cpp-python/issues/389)
 
 ## Low-level API
----
 
 [API Reference](https://llama-cpp-python.readthedocs.io/en/latest/api-reference/#low-level-api)
 
@@ -351,13 +345,11 @@ Check out the [examples folder](examples/low_level_api) for more examples of usi
 
 
 ## Documentation
----
 
 Documentation is available via [https://llama-cpp-python.readthedocs.io/](https://llama-cpp-python.readthedocs.io/).
 If you find any issues with the documentation, please open an issue or submit a PR.
 
 ## Development
----
 
 This package is under active development and I welcome any contributions.
 
@@ -384,7 +376,6 @@ make clean
 ```
 
 ## FAQ
----
 
 ### Are there pre-built binaries / binary wheels available?
 
@@ -407,6 +398,5 @@ I originally wrote this package for my own use with two goals in mind:
 Any contributions and changes to this package will be made with these goals in mind.
 
 ## License
----
 
 This project is licensed under the terms of the MIT license.

0 commit comments
