Qwen format not taking system messages #1697

Closed

@mahobley

Description
Models that use the 'qwen' chat format don't use custom system messages. This also seems to be the case for snoozy. A few others (vicuna, openbuddy, phind, intel) seem to have a hard-coded system message, but since they don't have system_templates, that might be the desired setup.

I think the only change required is in `format_qwen` in
https://github.com/abetlen/llama-cpp-python/tree/main/llama_cpp/llama_chat_format.py

Before (line 1012):
system_message = "You are a helpful assistant."

After (line 1012):
system_message = _get_system_message(messages)
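To illustrate the proposed fix, here is a minimal sketch of how `format_qwen` could honor a caller-supplied system message while keeping the current default as a fallback. The helper `_get_system_message` below is a simplified stand-in for the library's internal function of the same name; the ChatML-style `<|im_start|>`/`<|im_end|>` markers follow Qwen's prompt format, but this is not the library's exact code.

```python
def _get_system_message(messages):
    """Return the content of the first system message, or "" if none.

    Simplified stand-in for the helper in llama_chat_format.py.
    """
    for message in messages:
        if message["role"] == "system":
            return message["content"]
    return ""


def format_qwen(messages):
    # Before the fix the system message was hard-coded:
    #     system_message = "You are a helpful assistant."
    # After the fix, a custom system message is used when present,
    # falling back to the old default otherwise.
    system_message = _get_system_message(messages) or "You are a helpful assistant."

    prompt = f"<|im_start|>system\n{system_message}<|im_end|>\n"
    for message in messages:
        if message["role"] in ("user", "assistant"):
            prompt += f"<|im_start|>{message['role']}\n{message['content']}<|im_end|>\n"
    prompt += "<|im_start|>assistant\n"
    return prompt
```

With this change, `format_qwen([{"role": "system", "content": "You are a pirate."}, {"role": "user", "content": "hi"}])` would embed "You are a pirate." in the system block, while message lists without a system entry still get the default.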


Metadata

Assignees: No one assigned
Labels: bug (Something isn't working)
Projects: No projects
Milestone: No milestone
Relationships: None yet
Development: No branches or pull requests
