Commit 9e61661

fix indexing token_logprobs after sorting

1 parent ca11673
1 file changed: +3 −3 lines
llama_cpp/llama.py (3 additions, 3 deletions)
```diff
@@ -958,7 +958,7 @@ def _create_completion(
                         )
                     ],
                     "text_offset": [text_offset],
-                    "token_logprobs": [sorted_logprobs[int(token)][0]],
+                    "token_logprobs": [current_logprobs[int(token)]],
                     "top_logprobs": [top_logprob],
                 }
                 returned_tokens += 1
```
```diff
@@ -1033,7 +1033,7 @@ def _create_completion(
                         self.detokenize([token]).decode("utf-8", errors="ignore")
                     ],
                     "text_offset": [text_offset],
-                    "token_logprobs": [sorted_logprobs[int(token)][0]],
+                    "token_logprobs": [current_logprobs[int(token)]],
                     "top_logprobs": [top_logprob],
                 }
```

```diff
@@ -1131,7 +1131,7 @@ def _create_completion(
                     zip(logprobs_token, range(len(logprobs_token))), reverse=True
                 )
             )
-            token_logprobs.append(sorted_logprobs[int(token)][0])
+            token_logprobs.append(logprobs_token[int(token)])
             top_logprob: Optional[Dict[str, float]] = {
                 self.detokenize([i]).decode("utf-8", errors="ignore"): logprob
                 for logprob, i in sorted_logprobs[:logprobs]
```
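The bug this commit fixes can be reproduced in isolation: `sorted_logprobs` is ordered by log-probability (descending), so indexing it with a token id returns the logprob at that *rank*, not the logprob of that token. Indexing the unsorted per-token list directly gives the right value. A minimal sketch, using hypothetical toy values for a 4-token vocabulary:

```python
# Hypothetical per-token log-probabilities, indexed by token id.
logprobs_token = [-2.0, -0.5, -3.0, -1.0]
token = 3  # sampled token id

# Same sort the code uses to build top_logprobs:
# (logprob, token_id) pairs ordered by logprob, best first.
sorted_logprobs = sorted(
    zip(logprobs_token, range(len(logprobs_token))), reverse=True
)

# Buggy: indexes the *sorted* list by token id, so this picks the
# 4th-best logprob (rank 3), which belongs to a different token.
buggy = sorted_logprobs[int(token)][0]

# Fixed: indexes the original list by token id.
fixed = logprobs_token[int(token)]

print(buggy, fixed)  # -3.0 -1.0
```

The two values agree only by coincidence, when the sampled token's rank happens to equal its id, which is why the sort for `top_logprobs` can stay while `token_logprobs` must read from the unsorted list.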
