Tags: joseph16388/llama-cpp-python

v0.2.78

chore: Bump version

v0.2.78-metal

chore: Bump version

v0.2.77

Merge branch 'main' of https://github.com/abetlen/llama-cpp-python into main

v0.2.77-metal

Merge branch 'main' of https://github.com/abetlen/llama-cpp-python into main

v0.2.77-cu124

feat: adding `rpc_servers` parameter to `Llama` class (abetlen#1477)

* passthru rpc_servers params

wip

* enable llama rpc by default

* convert string to byte

* add rpc package

* Revert "enable llama rpc by default"

This reverts commit 832c6dd.

* update readme

* Only set rpc_servers when provided

* Add rpc servers to server options

---------

Co-authored-by: Andrei Betlen <abetlen@gmail.com>
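
For context on the change tagged here: the commit above exposes llama.cpp's RPC back-end through the Python API via a new `rpc_servers` parameter on the `Llama` class. Below is a minimal sketch of how that parameter might be used; the host addresses and model path are placeholders, and it assumes `rpc_servers` accepts a comma-separated "host:port" string and that the installed wheel was built with RPC support enabled.

```python
# Minimal sketch (not from this repo): offloading work to remote RPC hosts
# through the new `rpc_servers` parameter added in abetlen#1477.
# Assumptions: llama-cpp-python built with RPC support; `rpc_servers` takes a
# comma-separated "host:port" string; hostnames and model path are placeholders.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/example.Q4_K_M.gguf",            # placeholder model file
    n_gpu_layers=-1,                                       # offload all layers
    rpc_servers="192.168.1.10:50052,192.168.1.11:50052",   # hypothetical RPC hosts
)

out = llm("Q: What is the capital of France? A:", max_tokens=16)
print(out["choices"][0]["text"])
```

Per the commit notes above ("Revert 'enable llama rpc by default'", "Only set rpc_servers when provided"), RPC is not enabled by default; the setting is only forwarded to llama.cpp when the caller supplies it, both in the `Llama` constructor and in the server options.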

v0.2.77-cu123

feat: adding `rpc_servers` parameter to `Llama` class (abetlen#1477)

v0.2.77-cu122

feat: adding `rpc_servers` parameter to `Llama` class (abetlen#1477)

v0.2.77-cu121

feat: adding `rpc_servers` parameter to `Llama` class (abetlen#1477)

v0.2.76

chore: Bump version

v0.2.76-metal

chore: Bump version
