Commit be2c961

Author: Mug (committed)
Merge: 2 parents, c4a8491 + cbd26fd
File tree

8 files changed: +204 −114 lines

CMakeLists.txt

+5 −1: 5 additions, 1 deletion
@@ -2,7 +2,11 @@ cmake_minimum_required(VERSION 3.4...3.22)
 
 project(llama_cpp)
 
-if (UNIX)
+option(FORCE_CMAKE "Force CMake build of Python bindings" OFF)
+
+set(FORCE_CMAKE $ENV{FORCE_CMAKE})
+
+if (UNIX AND NOT FORCE_CMAKE)
     add_custom_command(
         OUTPUT ${CMAKE_CURRENT_SOURCE_DIR}/vendor/llama.cpp/libllama.so
         COMMAND make libllama.so
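The effect of the new guard can be emulated in plain shell (a hedged sketch, not the actual CMake logic): CMake copies $ENV{FORCE_CMAKE} into a variable, so a non-empty value in the environment skips the UNIX Makefile shortcut.

```shell
# Hedged emulation of the new `if (UNIX AND NOT FORCE_CMAKE)` guard:
# a non-empty FORCE_CMAKE disables the `make libllama.so` shortcut
# and forces the CMake build path for the Python bindings.
FORCE_CMAKE=1
if [ -n "$FORCE_CMAKE" ]; then
    echo "cmake build of Python bindings"
else
    echo "make libllama.so"
fi
```

In practice the variable would be exported before the build step shown in docs/index.md (`python3 setup.py develop`).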

docs/index.md

+4 −0: 4 additions, 0 deletions
@@ -105,12 +105,16 @@ python3 setup.py develop
         - __call__
         - create_chat_completion
         - set_cache
+        - save_state
+        - load_state
         - token_bos
         - token_eos
       show_root_heading: true
 
 ::: llama_cpp.LlamaCache
 
+::: llama_cpp.LlamaState
+
 ::: llama_cpp.llama_cpp
     options:
         show_if_no_docstring: true
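The newly documented `save_state`/`load_state` pair and the `LlamaState` class can be sketched as follows. This is a hedged illustration only: the model path is a placeholder, and the import is guarded so the snippet degrades gracefully when llama-cpp-python or a model file is absent.

```python
# Hedged sketch of the save_state/load_state API documented above.
# The model path is a placeholder, not a file shipped with this commit.
try:
    from llama_cpp import Llama
except ImportError:
    Llama = None

def snapshot_roundtrip(model_path="./models/ggml-model.bin"):
    """Evaluate a prompt, snapshot the context, then restore it."""
    if Llama is None:
        return "llama_cpp not installed"
    llm = Llama(model_path=model_path)
    llm("Q: What is 2 + 2? A:", max_tokens=4)
    state = llm.save_state()   # returns a LlamaState snapshot
    llm.load_state(state)      # restore instead of re-evaluating
    return "state restored"
```

Restoring a saved state avoids re-evaluating the prompt, which is the point of exposing `LlamaState` in the API docs.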
