Closed
Description
The latest version, 0.1.66, does not unload the model from VRAM.
I initially thought the problem was in oobabooga/text-generation-webui (oobabooga/text-generation-webui#2920), but after digging into the code of both projects, I believe the problem is in this library.
I think the problem is around the `llama_free_model` function call in llama-cpp-python/llama_cpp/llama.py, line 1439 (commit 442213b).
Sorry if I'm wrong; I don't usually work with Python.
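For context on why the model might stay in VRAM: if the wrapper relies on the Python object's destructor (`__del__`) to call `llama_free_model`, then VRAM is only released once the interpreter actually collects the object. A lingering reference anywhere keeps the native memory alive. Below is a minimal sketch of that failure mode, using a hypothetical `FakeModel` stand-in rather than the library's real API:

```python
import gc


class FakeModel:
    """Stand-in for a native model handle (hypothetical, for illustration)."""
    freed = False

    def free(self):
        # In llama.cpp this would be llama_free_model() releasing VRAM.
        FakeModel.freed = True


class Wrapper:
    """Mimics a wrapper that frees its native handle in __del__."""
    def __init__(self):
        self.model = FakeModel()

    def __del__(self):
        if self.model is not None:
            self.model.free()
            self.model = None


w = Wrapper()
extra_ref = w   # a lingering reference keeps the model alive
del w           # __del__ does NOT run yet; "VRAM" is still held
assert FakeModel.freed is False

del extra_ref   # last reference dropped: destructor runs
gc.collect()    # also collect any reference cycles
assert FakeModel.freed is True
```

So even if the destructor path is correct, callers may need to drop every reference (and possibly call `gc.collect()`) before the model leaves VRAM; an explicit free method sidesteps that uncertainty.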