Thread bug in server code #62

Closed
@hengjiUSTC

Description


For https://github.com/abetlen/llama-cpp-python/blob/main/llama_cpp/server/__main__.py#L202
This line blocks https://github.com/abetlen/llama-cpp-python/blob/main/llama_cpp/server/__main__.py#L226 from sending the heartbeat signal. Although the endpoint is declared async, the event loop is blocked while create_chat_completion executes. If the model is large and the first message takes a long time to produce, the network connection will drop.
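A minimal sketch of one possible workaround, assuming the handler calls the blocking `create_chat_completion` inside an `async def` endpoint: move the blocking work onto a worker thread (for example with Starlette's `run_in_threadpool` / `iterate_in_threadpool`) so the event loop stays free to send the EventSourceResponse ping. The function and parameter names (`chat_completion_endpoint`, `body`, `llama`) are illustrative, not the actual code at the linked lines.

```python
# Illustrative sketch: keep the asyncio event loop free so the SSE
# heartbeat ping can be sent while llama.cpp is computing tokens.
import json

from starlette.concurrency import run_in_threadpool, iterate_in_threadpool
from sse_starlette.sse import EventSourceResponse


async def chat_completion_endpoint(body, llama):
    if body.stream:
        # create_chat_completion(stream=True) returns a blocking generator;
        # both creating it and pulling chunks from it happen off the loop.
        chunks = await run_in_threadpool(
            llama.create_chat_completion, messages=body.messages, stream=True
        )

        async def event_stream():
            async for chunk in iterate_in_threadpool(chunks):
                yield dict(data=json.dumps(chunk))

        # EventSourceResponse can now emit its periodic ping, because the
        # event loop is never blocked for the duration of generation.
        return EventSourceResponse(event_stream())

    # Non-streaming case: run the whole completion in a worker thread.
    return await run_in_threadpool(
        llama.create_chat_completion, messages=body.messages
    )
```

The same effect could be achieved with `anyio.to_thread.run_sync` or `loop.run_in_executor`; the key point is that the blocking llama.cpp call never runs directly on the event loop thread.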

Metadata

Assignees

No one assigned

Labels

bug (Something isn't working)

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
