Commit 228949c

feat: Update llama.cpp

1 parent 903b28a · commit 228949c
File tree

2 files changed: +3 −1 lines changed

llama_cpp/llama_cpp.py
2 additions & 0 deletions

@@ -296,6 +296,7 @@ def byref(obj: CtypesCData, offset: Optional[int] = None) -> CtypesRef[CtypesCData]:
 # LLAMA_VOCAB_PRE_TYPE_GPT2 = 7,
 # LLAMA_VOCAB_PRE_TYPE_REFACT = 8,
 # LLAMA_VOCAB_PRE_TYPE_COMMAND_R = 9,
+# LLAMA_VOCAB_PRE_TYPE_OLMO = 10,
 # };
 LLAMA_VOCAB_PRE_TYPE_DEFAULT = 0
 LLAMA_VOCAB_PRE_TYPE_LLAMA3 = 1
@@ -307,6 +308,7 @@ def byref(obj: CtypesCData, offset: Optional[int] = None) -> CtypesRef[CtypesCData]:
 LLAMA_VOCAB_PRE_TYPE_GPT2 = 7
 LLAMA_VOCAB_PRE_TYPE_REFACT = 8
 LLAMA_VOCAB_PRE_TYPE_COMMAND_R = 9
+LLAMA_VOCAB_PRE_TYPE_OLMO = 10


 # // note: these values should be synchronized with ggml_rope

vendor/llama.cpp
1 addition & 1 deletion (submodule commit updated)
0 commit comments

Comments
0 (0)
Morty Proxy This is a proxified and sanitized view of the page, visit original site.