Prerequisites
Please answer the following questions for yourself before submitting an issue.
- I am running the latest code. Development is very rapid so there are no tagged versions as of now.
- I carefully followed the README.md.
- I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
- I reviewed the Discussions and have a new bug or useful enhancement to share.
Expected Behavior and Current Behavior
The latest version of llama-cpp-python kills the Python kernel when LlamaGrammar is used.
I ran the following code:
```python
from llama_cpp import Llama, LlamaGrammar

model = Llama(model_path="ggufs/Meta-Llama-3-8B-Instruct.Q5_K_M.gguf", verbose=False)  # The model doesn't matter.
grammar = LlamaGrammar.from_string('root ::= "a"+')
model("hello", max_tokens=10, grammar=grammar)
```
When it ran, the Python kernel died immediately for an unknown reason. The kernel does not die when LlamaGrammar is not used.
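To get more information on where the interpreter dies, Python's standard `faulthandler` module can be enabled before running the reproduction; on a fatal signal it prints a traceback to stderr instead of letting the kernel die silently. A minimal sketch (the repro lines are repeated as comments because they require the model file):

```python
import faulthandler

# Dump a Python traceback to stderr if the interpreter receives a
# fatal signal (SIGSEGV, SIGABRT, ...) instead of dying silently.
faulthandler.enable()

# Then run the reproduction from above, e.g.:
# from llama_cpp import Llama, LlamaGrammar
# model = Llama(model_path="ggufs/Meta-Llama-3-8B-Instruct.Q5_K_M.gguf", verbose=False)
# grammar = LlamaGrammar.from_string('root ::= "a"+')
# model("hello", max_tokens=10, grammar=grammar)
```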
Because this behavior did not occur until recently (a few days ago), I suspect that my recent update of the llama-cpp-python module introduced the problem.
What I tried:
- Built the latest code myself -> kernel died
- Built the code of the latest release myself -> kernel died
- Re-installed the wheel of the latest release -> no problem
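For reference, the three variants above roughly correspond to the following pip invocations (a sketch; the exact commands may differ, and the git URL assumes the upstream abetlen/llama-cpp-python repository). They are commented out here because building from source takes a while:

```shell
# 1) Build the latest code (current main) from source:
# python3 -m pip install --force-reinstall --no-cache-dir \
#     'llama-cpp-python @ git+https://github.com/abetlen/llama-cpp-python.git'

# 2) Build the latest release from source (disable prebuilt wheels):
# python3 -m pip install --force-reinstall --no-cache-dir --no-binary :all: llama-cpp-python

# 3) Reinstall the prebuilt wheel of the latest release:
# python3 -m pip install --force-reinstall --no-cache-dir --only-binary :all: llama-cpp-python
```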
These experiments suggest that the problem comes from the llama.cpp backend rather than from llama-cpp-python itself. In any case, I would like to know whether other people are experiencing this bug.
Environment
OS: macOS Sonoma
Processor: M2 Max, 64 GB
Python version: 11
Labels: Something isn't working