Issues
Search query: is:issue state:open
Search results (all in ServerlessLLM/ServerlessLLM):
- #242. Status: Open.
- #237. Status: Open.
- #220. [BUG] Qwen/Qwen2.5-7B-Instruct generate nothing when using sllm-store. Label: bug (Something isn't working). Status: Open.
- #217. Status: Open.
- #210. Status: Open.
- #209. Status: Open.
- #176. Status: Open.
- #170. [Feature Request] [Enhancement] Merge sllm-cli and sllm-serve into a Unified Interface. Label: Priority 1 (Medium priority). Status: Open.
- #157. Status: Open.
- #156. Status: Open.
- #151. [Feature Request] AI-based spell check. Labels: Good first issue (Good for newcomers), Priority 2 (Low priority). Status: Open.
- #133. [Feature Request] Does sllm support multi-node inference? Label: Question (Further information is requested). Status: Open.