Commit dd9ad1c

Formatting

1 parent: 9d60ae5

File tree: 1 file changed

llama_cpp/llama.py (+2 -5: 2 additions, 5 deletions)
@@ -306,7 +306,7 @@ def _sample_top_p_top_k(
         llama_cpp.llama_sample_typical(
             ctx=self.ctx,
             candidates=llama_cpp.ctypes.pointer(candidates),
-            p=llama_cpp.c_float(1.0)
+            p=llama_cpp.c_float(1.0),
         )
         llama_cpp.llama_sample_top_p(
             ctx=self.ctx,
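
Note on this hunk: the only change is the trailing comma after p=llama_cpp.c_float(1.0); behavior is identical. As background, a typical-sampling filter with p=1.0 keeps every candidate, so this call is effectively a pass-through before llama_sample_top_p runs. Below is a minimal NumPy sketch of locally typical sampling that illustrates why p=1.0 keeps everything; it is an illustration of the technique, not the library's C implementation.

import numpy as np

def typical_filter(logits, p=1.0):
    # Locally typical sampling: keep the tokens whose surprisal is closest
    # to the distribution's entropy until their cumulative probability
    # reaches p. With p=1.0 every token survives, so nothing is filtered.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    entropy = -(probs * np.log(probs)).sum()
    deviation = np.abs(-np.log(probs) - entropy)
    order = np.argsort(deviation)                  # most "typical" first
    cutoff = np.searchsorted(np.cumsum(probs[order]), p) + 1
    return order[:cutoff]                          # surviving candidate indices

logits = np.log(np.array([0.5, 0.3, 0.15, 0.05]))
assert len(typical_filter(logits, p=1.0)) == 4     # p=1.0 keeps all 4 tokens
assert len(typical_filter(logits, p=0.5)) == 2     # smaller p prunes the tail
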
@@ -637,10 +637,7 @@ def _create_completion(
             self.detokenize([token]).decode("utf-8", errors="ignore")
             for token in all_tokens
         ]
-        all_logprobs = [
-            Llama._logits_to_logprobs(row)
-            for row in self.eval_logits
-        ]
+        all_logprobs = [Llama._logits_to_logprobs(row) for row in self.eval_logits]
         for token, token_str, logprobs_token in zip(
             all_tokens, all_token_strs, all_logprobs
         ):
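
Note on the second hunk: it only collapses the all_logprobs list comprehension onto one line; Llama._logits_to_logprobs still converts each row of raw logits into log-probabilities. A minimal sketch of that conversion, assuming it amounts to a numerically stable log-softmax (an illustration, not necessarily the project's exact implementation):

import numpy as np

def logits_to_logprobs(logits):
    # Numerically stable log-softmax: shift by the max before
    # exponentiating, then normalize in log space.
    logits = np.asarray(logits, dtype=np.float64)
    shifted = logits - logits.max()
    return shifted - np.log(np.exp(shifted).sum())

row = [2.0, 1.0, 0.1]
logprobs = logits_to_logprobs(row)
assert np.isclose(np.exp(logprobs).sum(), 1.0)     # probabilities sum to 1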
