Commit 6cf5876

Deprecate generate method
1 parent: b3805bb · commit: 6cf5876

File tree: 1 file changed, +6 −0 lines

llama_cpp/llama.py (6 additions, 0 deletions)
@@ -3,6 +3,7 @@
 import uuid
 import time
 import math
+import warnings
 import multiprocessing
 from typing import List, Optional, Union, Generator, Sequence, Iterator
 from collections import deque
@@ -239,6 +240,11 @@ def generate(
         Yields:
             The generated tokens.
         """
+        warnings.warn(
+            "Llama.generate is deprecated and will be removed in v0.2.0",
+            DeprecationWarning,
+            stacklevel=2,
+        )
         assert self.ctx is not None
         self.reset()
         while True:
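
For context, here is a minimal sketch of how the new warning surfaces for a caller of Llama.generate. The model path, prompt, and sampling arguments below are illustrative placeholders, not taken from this commit. Because generate is a generator, the warnings.warn call only runs once iteration starts, and stacklevel=2 attributes the warning to the caller's code rather than to llama.py. Note that DeprecationWarning is silenced by default outside __main__, so a caller or test suite typically opts in with a warnings filter.

# Sketch: observing the new DeprecationWarning from Llama.generate.
# Paths and sampling values are placeholders for illustration only.
import warnings

from llama_cpp import Llama

# DeprecationWarning is hidden by default outside __main__; enable it explicitly.
warnings.simplefilter("default", DeprecationWarning)

llm = Llama(model_path="./models/ggml-model.bin")  # hypothetical model file
tokens = llm.tokenize(b"Hello, world!")

# generate() is a generator, so the warning fires on the first iteration;
# stacklevel=2 makes the report point at this loop instead of inside llama.py.
for token in llm.generate(tokens, top_k=40, top_p=0.95, temp=0.8, repeat_penalty=1.1):
    print(llm.detokenize([token]))
    break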
