Commit 3757328

fix: free last image embed in llava chat handler
1 parent 7712263 commit 3757328

File tree

1 file changed: +5 lines, −0 lines

llama_cpp/llama_chat_format.py

@@ -2615,6 +2615,11 @@ def embed_image_bytes(image_bytes: bytes):
             if self._last_image_embed is not None and self._last_image_hash is not None and hash(image_bytes) == self._last_image_hash:
                 return self._last_image_embed
             with suppress_stdout_stderr(disable=self.verbose):
+                # Free the previous image embed
+                if self._last_image_embed is not None:
+                    self._llava_cpp.llava_image_embed_free(self._last_image_embed)
+                    self._last_image_embed = None
+                    self._last_image_hash = None
                 embed = (
                     self._llava_cpp.llava_image_embed_make_with_bytes(
                         self.clip_ctx,
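
For context, the fix follows the usual pattern for caching a handle to a natively allocated object: keep at most one cached image embed, and free the previous one before allocating a replacement so that repeated calls with different images do not leak native memory. Below is a minimal, self-contained sketch of that pattern; make_embed and free_embed are hypothetical callables standing in for the native allocate/free routines (the role played by llava_image_embed_make_with_bytes and llava_image_embed_free in the diff above) and are not part of llama_cpp's API.

class SingleEmbedCache:
    """Cache the most recent embed and free the old one before replacing it.

    make_embed/free_embed are hypothetical callables standing in for the
    native allocation and free routines.
    """

    def __init__(self, make_embed, free_embed):
        self._make_embed = make_embed
        self._free_embed = free_embed
        self._last_embed = None
        self._last_hash = None

    def get(self, image_bytes: bytes):
        key = hash(image_bytes)
        # Reuse the cached embed when the same image is requested again.
        if self._last_embed is not None and key == self._last_hash:
            return self._last_embed
        # Free the previous embed before allocating a new one; skipping this
        # step is the leak the commit fixes.
        if self._last_embed is not None:
            self._free_embed(self._last_embed)
            self._last_embed = None
            self._last_hash = None
        self._last_embed = self._make_embed(image_bytes)
        self._last_hash = key
        return self._last_embed

    def close(self):
        # Release the cached embed when the owner is torn down.
        if self._last_embed is not None:
            self._free_embed(self._last_embed)
            self._last_embed = None
            self._last_hash = None

Note that the embed and its hash are cleared together: the cached hash is only meaningful while the embed it describes is still alive, which is why the patch resets both _last_image_embed and _last_image_hash after freeing.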
