# How to Use

You can run Kontext using stable-diffusion.cpp on a GPU with 6 GB or even 4 GB of VRAM, without offloading to RAM.

## Download weights

## Convert Kontext weights

Preconverted GGUF weights are available from FLUX.1-Kontext-dev-GGUF; if you use those, you can skip the conversion step below.

```
.\bin\Release\sd-cli.exe -M convert -m ..\..\ComfyUI\models\unet\flux1-kontext-dev.safetensors -o ..\models\flux1-kontext-dev-q8_0.gguf -v --type q8_0
```
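For lower-VRAM GPUs (e.g. 4 GB), a smaller quantization type can be used at the cost of some quality. The following is a sketch, assuming the same input/output paths as the q8_0 example above, with `q4_0` substituted as the target type:

```shell
# Sketch: convert to q4_0 instead of q8_0 for tighter VRAM budgets.
# Paths here are assumptions carried over from the q8_0 example above.
.\bin\Release\sd-cli.exe -M convert -m ..\..\ComfyUI\models\unet\flux1-kontext-dev.safetensors -o ..\models\flux1-kontext-dev-q4_0.gguf -v --type q4_0
```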

## Run

- It is recommended to set `--cfg-scale` to 1.

## Example

For example:

```
.\bin\Release\sd-cli.exe -r .\flux1-dev-q8_0.png --diffusion-model ..\models\flux1-kontext-dev-q8_0.gguf --vae ..\models\ae.sft --clip_l ..\models\clip_l.safetensors --t5xxl ..\models\t5xxl_fp16.safetensors -p "change 'flux.cpp' to 'kontext.cpp'" --cfg-scale 1.0 --sampling-method euler -v --clip-on-cpu
```
(Result table: reference image, the prompt `change 'flux.cpp' to 'kontext.cpp'`, and the output image.)
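If the run above still exceeds your VRAM, stable-diffusion.cpp provides flags for moving parts of the pipeline off the GPU. This is a sketch, assuming a q4_0-quantized diffusion model (see the conversion section) and the same encoder/VAE paths as the example above; the exact savings depend on your build and hardware:

```shell
# Sketch: lower-VRAM variant of the run above.
# --vae-tiling  : decode the latent in tiles to reduce peak VRAM
# --clip-on-cpu : keep the text encoders on the CPU
# The q4_0 model path is an assumption (produced by a q4_0 conversion).
.\bin\Release\sd-cli.exe -r .\flux1-dev-q8_0.png --diffusion-model ..\models\flux1-kontext-dev-q4_0.gguf --vae ..\models\ae.sft --clip_l ..\models\clip_l.safetensors --t5xxl ..\models\t5xxl_fp16.safetensors -p "change 'flux.cpp' to 'kontext.cpp'" --cfg-scale 1.0 --sampling-method euler -v --clip-on-cpu --vae-tiling
```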