Simple Python bindings for the [`ggllm.cpp`](https://github.com/cmp-nct/ggllm.cpp) library.

This package provides:
- Low-level access to C API via `ctypes` interface.
- High-level Python API for text completion (see the example below)
- OpenAI-like API
- LangChain compatibility

This project is currently in alpha development and is not yet completely functional. Any contributions are warmly welcomed.
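
As a quick illustration of the high-level API, here is a minimal sketch. The names are assumptions: it presumes this fork mirrors llama-cpp-python's `Llama` class as `falcon_cpp.Falcon` and takes a GGML Falcon model path; check the package source for the actual class name and parameters.

```python
# Hypothetical high-level usage, assuming a `Falcon` class analogous to
# llama-cpp-python's `Llama` class (class name and model path are assumptions).
from falcon_cpp import Falcon

llm = Falcon(model_path="./models/falcon-7b/ggml-model.bin")
output = llm("Q: Name the planets in the solar system? A: ", max_tokens=48, stop=["Q:", "\n"])
print(output["choices"][0]["text"])
```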
## Installation from PyPI (recommended)
Install from PyPI (requires a C compiler):

```bash
pip install falcon-cpp-python  # package name assumed from this repository's name
```
The above command will attempt to install the package and build `ggllm.cpp` from source.
This is the recommended installation method as it ensures that `ggllm.cpp` is built with the available optimizations for your system.

If you have previously installed this package through pip and want to upgrade your version or rebuild it with different compiler options, please add the following flags to ensure that the package is rebuilt correctly:
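
In upstream llama-cpp-python those flags are `--upgrade`, `--force-reinstall`, and `--no-cache-dir`; assuming this fork behaves the same way, the rebuild command would look like this (the package name is likewise assumed):

```bash
# Force pip to rebuild the package from source rather than reuse a cached wheel.
# Flags are taken from upstream llama-cpp-python; package name is assumed.
pip install falcon-cpp-python --upgrade --force-reinstall --no-cache-dir
```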
## Low-level API

The low-level API is a direct [`ctypes`](https://docs.python.org/3/library/ctypes.html) binding to the C API provided by `ggllm.cpp`.
The entire low-level API can be found in [falcon_cpp/falcon_cpp.py](https://github.com/sirajperson/falcon-cpp-python/blob/master/falcon_cpp/falcon_cpp.py) and directly mirrors the C API in [libfalcon.h](https://github.com/cmp-nct/ggllm.cpp/blob/master/libfalcon.h).

Below is a short example demonstrating how to use the low-level API to tokenize a prompt:
```python
>>> n_tokens = falcon_cpp.falcon_tokenize(ctx, b"Q: Name the planets in the solar system? A: ", tokens, max_tokens, add_bos=falcon_cpp.c_bool(True))
>>> falcon_cpp.falcon_free(ctx)
```
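
The snippet above assumes that `ctx`, `tokens`, and `max_tokens` already exist. A minimal sketch of that setup follows; the function and type names (`falcon_context_default_params`, `falcon_init_from_file`, `falcon_token`) and the model path are assumptions based on the API mirroring `llama.h` with `llama_` renamed to `falcon_`, so verify them against `falcon_cpp/falcon_cpp.py`.

```python
>>> import falcon_cpp
>>> params = falcon_cpp.falcon_context_default_params()  # assumed name
>>> ctx = falcon_cpp.falcon_init_from_file(b"./models/falcon-7b/ggml-model.bin", params)  # model path is a placeholder
>>> max_tokens = params.n_ctx
>>> tokens = (falcon_cpp.falcon_token * int(max_tokens))()  # ctypes array to receive token ids; falcon_token assumed
```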
Check out the [examples folder](examples/low_level_api) for more examples of using the low-level API.
# Documentation
Coming soon...
# Development
Again, this package is under active development and I welcome any contributions.

To get started, clone the repository and install the package in development mode:
```bash
git clone --recurse-submodules https://github.com/sirajperson/falcon-cpp-python.git
cd falcon-cpp-python
poetry install --all-extras
python3 setup.py develop
```
# This project is a fork of llama-cpp-python

This project was originally llama-cpp-python and owes an immense thanks to @abetlen.

This project's goals are to:
- Provide a simple process to install `ggllm.cpp` and access the full C API in `libfalcon.h` from Python
- Provide a high-level Python API that can be used as a drop-in replacement for the OpenAI API so existing apps can be easily ported to use `ggllm.cpp`

Any contributions and changes to this package will be made with these goals in mind.