Functionary is not compatible with Python 3.8 #1361

Closed
@squaringTheCircle

Description


Prerequisites

Please answer the following questions for yourself before submitting an issue.

  • I am running the latest code. Development is very rapid so there are no tagged versions as of now.
  • I carefully followed the README.md.
  • I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
  • I reviewed the Discussions, and have a new bug or useful enhancement to share.

Expected Behavior

Python 3.8 is listed as a supported version, yet the functionary chat format is incompatible with it.

Running llama-cpp-python on Python 3.8 with any functionary model and a prompt that doesn't call a tool should not raise an exception.

Current Behavior

Raises an AttributeError:
AttributeError: 'str' object has no attribute 'removesuffix'

Environment and Context

Any environment using Python 3.8.

Failure Information (for bugs)

Compatibility with Python 3.8 was lost in pull request #1282, as the built-in string methods removesuffix and removeprefix were introduced in Python 3.9 (PEP 616).
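For reference, the two missing methods are straightforward to emulate on Python 3.8 with `str.endswith`/`str.startswith` and slicing. This is just a sketch of the PEP 616 semantics, not the patch that landed upstream:

```python
def removesuffix(s: str, suffix: str) -> str:
    # PEP 616 semantics: strip the suffix only if it is non-empty
    # and actually present at the end of the string.
    if suffix and s.endswith(suffix):
        return s[: -len(suffix)]
    return s


def removeprefix(s: str, prefix: str) -> str:
    # PEP 616 semantics: strip the prefix only if present.
    if s.startswith(prefix):
        return s[len(prefix):]
    return s
```

Note the `if suffix` guard in `removesuffix`: `s[:-0]` would otherwise return an empty string for an empty suffix, which is why naive slicing alone is not enough.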

Steps to Reproduce

Minimal example

from llama_cpp import Llama
from llama_cpp.llama_tokenizer import LlamaHFTokenizer

llm = Llama.from_pretrained(
    repo_id="meetkai/functionary-small-v2.4-GGUF",
    filename="functionary-small-v2.4.Q4_0.gguf",
    chat_format="functionary-v2",
    tokenizer=LlamaHFTokenizer.from_pretrained("meetkai/functionary-small-v2.4-GGUF"),
    n_gpu_layers=-1
)

messages = [
    {"role": "user", "content": "Hello"}
]
tools = [ 
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g., San Francisco, CA"
                    }
                },
                "required": ["location"]
            }
        }
    }
]

result = llm.create_chat_completion(
    messages=messages,
    tools=tools,
    tool_choice="auto",
)

Failure Logs

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/llama_cpp/llama.py", line 1654, in create_chat_completion
    return handler(
  File "/usr/local/lib/python3.8/dist-packages/llama_cpp/llama_chat_format.py", line 2035, in functionary_v1_v2_chat_handler
    content += completion_text.removesuffix("\n<|from|>assistant\n").removesuffix("\n<|from|> assistant\n")
AttributeError: 'str' object has no attribute 'removesuffix'
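The failing line chains two `removesuffix` calls. A hypothetical 3.8-compatible rewrite (not the actual upstream fix) would replace each call with an `endswith` check plus a slice:

```python
# Illustration only: completion_text here is a made-up example value,
# standing in for the model output in functionary_v1_v2_chat_handler.
completion_text = "I can help with that.\n<|from|>assistant\n"

# Equivalent of:
#   completion_text.removesuffix("\n<|from|>assistant\n").removesuffix("\n<|from|> assistant\n")
text = completion_text
for suffix in ("\n<|from|>assistant\n", "\n<|from|> assistant\n"):
    if text.endswith(suffix):
        text = text[: -len(suffix)]

content = text  # "I can help with that."
```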
