# KoboldCpp Vulkan Flake

Run `nix run github:jim3692/koboldcpp-flake`

If `Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf` is not already present in `~/.cache/huggingface`, it downloads the model with `huggingface-cli`, then starts KoboldCpp in Vulkan mode.
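Roughly, the flake's wrapper behaves like the sketch below. This is an illustration only, not the flake's actual script: the Hugging Face repo id (`bartowski/...`) and the exact `huggingface-cli` and `koboldcpp` arguments are assumptions, so check the flake source before relying on them.

```shell
#!/usr/bin/env sh
set -eu

# Assumed cache location, matching the ~/.cache/huggingface path mentioned above.
MODEL_DIR="$HOME/.cache/huggingface"
MODEL_FILE="$MODEL_DIR/Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf"

# Download only if the quantized model is not already cached.
if [ ! -f "$MODEL_FILE" ]; then
  # Repo id is hypothetical; the flake may pull the GGUF from a different source.
  huggingface-cli download bartowski/Meta-Llama-3.1-8B-Instruct-GGUF \
    Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf --local-dir "$MODEL_DIR"
fi

# Start KoboldCpp with the Vulkan backend.
koboldcpp --usevulkan --model "$MODEL_FILE"
```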
