Models that use the 'qwen' chat format don't use custom system messages; the hard-coded default is always substituted. This also seems to be the case for snoozy. A few others (vicuna, openbuddy, phind, intel) seem to have a hard-coded system message, but since they don't have system_templates that might be the desired setup.
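For reference, a minimal reproduction sketch (the model path and prompts are placeholders; it assumes any local Qwen GGUF file):

```python
from llama_cpp import Llama

# Placeholder path; point it at any local Qwen GGUF model.
llm = Llama(model_path="./qwen-7b-chat.Q4_K_M.gguf", chat_format="qwen")

response = llm.create_chat_completion(
    messages=[
        # This custom system message is ignored; the qwen formatter
        # always uses "You are a helpful assistant." instead.
        {"role": "system", "content": "You are a pirate. Answer in pirate speak."},
        {"role": "user", "content": "Who are you?"},
    ]
)
print(response["choices"][0]["message"]["content"])
```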
I think the only change required is in `format_qwen` in
https://github.com/abetlen/llama-cpp-python/tree/main/llama_cpp/llama_chat_format.py
around line 1012:
```diff
- system_message = "You are a helpful assistant."
+ system_message = _get_system_message(messages)
```
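For context, this is roughly the behavior the change would give. This is a standalone sketch, not the library's actual code: the helper, the ChatML template, and the fallback to the old default string are assumptions about how the formatter works.

```python
from typing import Dict, List, Optional


def get_system_message(messages: List[Dict[str, str]]) -> Optional[str]:
    # Assumed behavior of _get_system_message: return the content of the
    # first system-role message, or None if there isn't one.
    for message in messages:
        if message.get("role") == "system":
            return message.get("content")
    return None


def build_qwen_prompt(messages: List[Dict[str, str]]) -> str:
    # With the proposed change, a user-supplied system message is used;
    # keeping the old string as a fallback when none is supplied is an
    # assumption, the one-line diff above would just use the helper's result.
    system_message = get_system_message(messages) or "You are a helpful assistant."
    parts = [f"<|im_start|>system\n{system_message}<|im_end|>"]
    for message in messages:
        if message["role"] in ("user", "assistant"):
            parts.append(f"<|im_start|>{message['role']}\n{message['content']}<|im_end|>")
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)
```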