Description
Prerequisites
Please answer the following questions for yourself before submitting an issue.
- [x] I am running the latest code. Development is very rapid so there are no tagged versions as of now.
- [x] I carefully followed the README.md.
- [x] I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
- [x] I reviewed the Discussions, and have a new bug or useful enhancement to share.
Expected Behavior
Memory usage should remain stable across repeated MiniCPM-V 2.6 inferences (no memory leak).
Current Behavior
I am testing MiniCPM-V 2.6; here is part of my test code:
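A simplified sketch of the loop (the GGUF paths and the test image below are placeholders, and the model is loaded through the MiniCPMv26ChatHandler chat handler):

```python
import base64

from llama_cpp import Llama
from llama_cpp.llama_chat_format import MiniCPMv26ChatHandler

# Placeholder paths: substitute your own GGUF files and test image.
chat_handler = MiniCPMv26ChatHandler(clip_model_path="./mmproj-model-f16.gguf")
llm = Llama(
    model_path="./MiniCPM-V-2_6-Q4_K_M.gguf",
    chat_handler=chat_handler,
    n_ctx=4096,
)


def image_to_data_uri(path: str) -> str:
    # Pass the image as a base64 data URI in the chat message.
    with open(path, "rb") as f:
        return "data:image/jpeg;base64," + base64.b64encode(f.read()).decode()


messages = [
    {
        "role": "user",
        "content": [
            {"type": "image_url", "image_url": {"url": image_to_data_uri("./test.jpg")}},
            {"type": "text", "text": "Describe this image."},
        ],
    }
]

# Repeated inference: resident memory grows by roughly 10 MB per call.
for _ in range(100):
    llm.create_chat_completion(messages=messages, max_tokens=64)
```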
When I run the code, memory usage grows by about 10 MB per inference. Using memory_profiler, I traced the growth to:
llama-cpp-python/llama_cpp/llama_chat_format.py, line 2839 (commit 2bc1d97)
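To locate the growth I wrapped the handler call with memory_profiler, roughly like this (a sketch; wrapping `__call__` with the `@profile` decorator is just one way to get per-line increments):

```python
from memory_profiler import profile

from llama_cpp.llama_chat_format import Llava15ChatHandler

# MiniCPMv26ChatHandler appears to inherit __call__ from Llava15ChatHandler,
# so wrapping it here prints per-line memory increments for each inference.
Llava15ChatHandler.__call__ = profile(Llava15ChatHandler.__call__)
```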
At first I thought the variable `embed` was not being released, so I added code to manually free it at line 2856, but the program then reported an error. I then replaced `embed` with `self._last_image_embed`, and the memory leak still occurs.
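For reference, the manual release attempt looked roughly like this (a sketch, not the upstream code; `llava_image_embed_free` is the `llava_cpp` binding for freeing an image embed):

```python
# Inside the chat handler, around line 2856 of commit 2bc1d97, after the image
# embed has been evaluated into the context (sketch of my attempt):
import llama_cpp.llava_cpp as llava_cpp

llava_cpp.llava_image_embed_free(embed)
# This raised an error at runtime -- possibly because the handler caches the
# same pointer as self._last_image_embed and touches it again later (unverified).
```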
Environment and Context
Please provide detailed information about your computer setup. This is important in case the issue is not reproducible except under certain specific conditions.
Example environment info:
- Python 3.10.10
- Ubuntu 18.04
- CUDA Toolkit 12.1
- llama-cpp-python 0.2.90