Issues
is:issue state:open
Search results (open issues in ModelCloud/GPTQModel)
- #2115
- #2114: [BUG] AWQ quantization fail for GLM-4.5-Air (label: bug)
- #1998
- #1841
- #1700
- #1678
- #1659
- #1655: [BUG] Gemma 3 4b quantization outputs multi-modal config despite it being text only (label: bug)
- #1654
- #1653: [BUG] GPTQ-quantized OVIS 1B model yields poor performance & misaligned outputs in vLLM-0.9.1 (label: bug)
- #1648
- #1639