Commit 3fe045f

Re-apply grammar parser memory bug fix 22aba95
This important change was accidentally removed in 94d0940. Credit for discovering (and most importantly, reporting) this issue goes to Eclypsium Security Researcher Richard Johnson. Bug fix sent upstream in ggml-org/llama.cpp#7194
1 parent b5c6df6 commit 3fe045f

File tree

2 files changed: +12 −3 lines

llama.cpp/grammar-parser.cpp (9 additions, 0 deletions)

```diff
@@ -145,6 +145,9 @@ namespace grammar_parser {
             pos++;
             last_sym_start = out_elements.size();
             while (*pos != '"') {
+                if (!*pos) { // [jart] don't sync until upstream fixes bug
+                    throw std::runtime_error("unexpected end of input");
+                }
                 auto char_pair = parse_char(pos);
                 pos = char_pair.second;
                 out_elements.push_back({LLAMA_GRETYPE_CHAR, char_pair.first});
@@ -159,6 +162,9 @@ namespace grammar_parser {
             }
             last_sym_start = out_elements.size();
             while (*pos != ']') {
+                if (!*pos) { // [jart] don't sync until upstream fixes bug
+                    throw std::runtime_error("unexpected end of input");
+                }
                 auto char_pair = parse_char(pos);
                 pos = char_pair.second;
                 enum llama_gretype type = last_sym_start < out_elements.size()
@@ -167,6 +173,9 @@ namespace grammar_parser {

                 out_elements.push_back({type, char_pair.first});
                 if (pos[0] == '-' && pos[1] != ']') {
+                    if (!pos[1]) { // [jart] don't sync until upstream fixes bug
+                        throw std::runtime_error("unexpected end of input");
+                    }
                     auto endchar_pair = parse_char(pos + 1);
                     pos = endchar_pair.second;
                     out_elements.push_back({LLAMA_GRETYPE_CHAR_RNG_UPPER, endchar_pair.first});
```

llama.cpp/server/server.cpp (3 additions, 3 deletions)

```diff
@@ -866,12 +866,12 @@ struct llama_server_context
                 }
             }

-            if (slot->ctx_sampling != nullptr)
-            {
+            if (slot->ctx_sampling != nullptr) {
                 llama_sampling_free(slot->ctx_sampling);
             }
             slot->ctx_sampling = llama_sampling_init(slot->sparams);
-            if (!slot->ctx_sampling) { // [jart] fixes crash
+            if (slot->ctx_sampling == nullptr) {
+                // for now, the only error that may happen here is invalid grammar
                 LOG_TEE("%s: failed to initialize sampling subsystem\n", __func__);
                 return false;
             }
```
