Commit 8d76fc1 (parent 6d17f51)

Make Mac dist slimmer (?)

Don't require installing all dev packages; only pyinstaller is needed for `deploy.pyinstaller.mac`. Also drop the temporary variable for the lockfile, which was never referenced again.

3 files changed: +5 -4 lines

Makefile (+1 -1)

```diff
@@ -95,7 +95,7 @@ deploy.pyinstaller.mac:
 	CMAKE_BUILD_TYPE="Release" \
 	CMAKE_ARGS="-DGGML_METAL=OFF -DGGML_LLAMAFILE=OFF -DGGML_BLAS=OFF \
 		-DGGML_NATIVE=ON -DGGML_CPU_AARCH64=ON" \
-	python3 -m pip install -v -e .[server,dev]
+	python3 -m pip install -v -e .[server,pyinstaller]
 	@server_path=$$(python -c 'import llama_cpp.server; print(llama_cpp.server.__file__)' | sed s/init/main/) ; \
 	echo "Server path: $$server_path" ; \
 	base_path=$$(python -c 'from llama_cpp._ggml import libggml_base_path; print(str(libggml_base_path))') ; \
```
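The `$$server_path` line above locates a package's `__init__.py` and rewrites it to the runnable `__main__.py` via `sed s/init/main/`. A minimal Python sketch of that computation, using the stdlib `json` package as a stand-in since `llama_cpp` may not be installed here:

```python
import importlib
import pathlib

# Import a package and find its __init__.py on disk, as the Makefile's
# `python -c 'import llama_cpp.server; print(llama_cpp.server.__file__)'`
# does. The stdlib `json` package stands in for llama_cpp.server.
mod = importlib.import_module("json")
init_path = pathlib.Path(mod.__file__)

# Equivalent of the `sed s/init/main/` step: __init__.py -> __main__.py,
# i.e. the module path that PyInstaller is pointed at as the entry script.
main_path = init_path.with_name(init_path.name.replace("init", "main"))
```

The same rewrite works for any package whose entry point follows the `__main__.py` convention.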

llama_cpp/llama_chat_format.py (+1 -2)

```diff
@@ -622,8 +622,7 @@ def chat_completion_handler(
 
     # We ensure that output path ends with .ndjson in pydantic validation.
     lockfile_path = output_path.with_suffix(".lock")
-    lock = filelock.FileLock(str(lockfile_path))
-    with lock:
+    with filelock.FileLock(str(lockfile_path)):
         with output_path.open("a", encoding="utf-8") as f:
             json.dump({"prompt": result.prompt, "prompt_tokens": prompt}, f)
             f.write("\n")
```
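The hunk above appends one JSON object per line (NDJSON) to the output file. A self-contained stdlib-only sketch of that append-and-read-back pattern; the real code additionally wraps the append in `filelock.FileLock(output_path.with_suffix(".lock"))` so concurrent writers can't interleave partial lines, a guard elided here:

```python
import json
import tempfile
from pathlib import Path

def append_record(output_path: Path, record: dict) -> None:
    # Append one JSON object followed by a newline: one record per line.
    with output_path.open("a", encoding="utf-8") as f:
        json.dump(record, f)
        f.write("\n")

out = Path(tempfile.mkdtemp()) / "results.ndjson"
append_record(out, {"prompt": "hello", "prompt_tokens": [1, 2]})
append_record(out, {"prompt": "world", "prompt_tokens": [3]})

# NDJSON reads back as one json.loads() per line.
rows = [json.loads(line) for line in out.read_text(encoding="utf-8").splitlines()]
```

Inlining the `FileLock` directly into the `with` statement, as the commit does, removes the throwaway `lock` variable without changing behavior.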

pyproject.toml (+3 -1)

```diff
@@ -56,9 +56,11 @@ dev = [
     "httpx>=0.24.1",
     "pandas>=2.2.1",
     "tqdm>=4.66.2",
+]
+pyinstaller = [
     "pyinstaller>=6.11.1",
 ]
-all = ["llama_cpp_python[server,test,dev]"]
+all = ["llama_cpp_python[server,test,dev,pyinstaller]"]
 
 [tool.scikit-build]
 wheel.packages = ["llama_cpp"]
```

0 commit comments