Commit d68fc07

Add Zephyr format (abetlen#937)
1 parent 4184835 commit d68fc07

1 file changed: +15 -0 lines changed

‎llama_cpp/llama_chat_format.py

15 lines changed: 15 additions & 0 deletions
@@ -621,6 +621,21 @@ def format_mistrallite(
     _prompt = _format_no_colon_single(system_message, _messages, _sep)
     return ChatFormatterResponse(prompt=_prompt)
 
+@register_chat_format("zephyr")
+def format_zephyr(
+    messages: List[llama_types.ChatCompletionRequestMessage],
+    **kwargs: Any,
+) -> ChatFormatterResponse:
+    system_template = """<|system|>
+{system_message}"""
+    system_message = _get_system_message(messages)
+    system_message = system_template.format(system_message=system_message)
+    _roles = dict(user="<|user|>\n", assistant="<|assistant|>\n")
+    _sep = "</s>"
+    _messages = _map_roles(messages, _roles)
+    _messages.append((_roles["assistant"], None))
+    _prompt = _format_chatml(system_message, _messages, _sep)
+    return ChatFormatterResponse(prompt=_prompt, stop=_sep)
 
 @register_chat_format("chatml")
 def format_chatml(

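For context, here is a minimal usage sketch showing how the newly registered format can be selected through the high-level API. The model path is a placeholder, and the rendered prompt shown afterwards is an approximation based on the template, roles, and "</s>" separator defined in the diff.

    from llama_cpp import Llama

    # Placeholder path; any Zephyr-tuned GGUF model file would go here.
    llm = Llama(model_path="./zephyr-7b-beta.Q4_K_M.gguf", chat_format="zephyr")

    # Messages use the OpenAI-style chat schema; format_zephyr turns them
    # into a single prompt string before completion.
    response = llm.create_chat_completion(
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Hello!"},
        ],
    )
    print(response["choices"][0]["message"]["content"])

Given the roles and separator above, format_zephyr should render those messages to a prompt of roughly the following shape, with "</s>" also returned as the stop sequence via ChatFormatterResponse(prompt=_prompt, stop=_sep):

    <|system|>
    You are a helpful assistant.</s>
    <|user|>
    Hello!</s>
    <|assistant|>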