Commit 1ea6154

fix(docs): Update development instructions (abetlen#1833)
1 parent 9bd0c95 commit 1ea6154


‎README.md

16 additions & 2 deletions
@@ -752,15 +752,29 @@ pip install --upgrade pip
 pip install -e .
 
 # if you want to use the fastapi / openapi server
-pip install -e .[server]
+pip install -e '.[server]'
 
 # to install all optional dependencies
-pip install -e .[all]
+pip install -e '.[all]'
 
 # to clear the local build cache
 make clean
 ```
 
+Now try running the tests
+
+```bash
+pytest
+```
+
+There's a `Makefile` available with useful targets.
+A typical workflow would look like this:
+
+```bash
+make build
+make test
+```
+
 You can also test out specific commits of `llama.cpp` by checking out the desired commit in the `vendor/llama.cpp` submodule and then running `make clean` and `pip install -e .` again. Any changes in the `llama.h` API will require
 changes to the `llama_cpp/llama_cpp.py` file to match the new API (additional changes may be required elsewhere).
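The likely motivation for quoting the extras is shell globbing (an assumption; the commit message does not state it): in zsh, unquoted square brackets are treated as a glob pattern, so `.[server]` can abort with "no matches found" before pip ever runs, while bash passes the unmatched pattern through literally. A minimal illustration:

```bash
# Unquoted: bash passes ".[server]" through literally, but zsh treats the
# brackets as a character-class glob and typically fails with
# "zsh: no matches found: .[server]".
pip install -e .[server]

# Quoted: the extras spec stays literal and works the same in bash and zsh.
pip install -e '.[server]'
```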

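As a quick sanity check after the editable install, before running the full test suite, something like the following can confirm the bindings import (a sketch; it assumes the package exposes `__version__`, which is not shown in this diff):

```bash
# Confirm the editable install resolved and the bindings import, without loading a model.
python -c "import llama_cpp; print(llama_cpp.__version__)"
```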
0 commit comments
