Insights: ServerlessLLM/ServerlessLLM
Overview
- 6 Merged pull requests
- 5 Open pull requests
- 0 Closed issues
- 1 New issue
6 Pull requests merged by 3 people
- fix: ROCm upgrade for vLLM update (#264, merged Jun 5, 2025)
- fix: broken markdown links in examples (#256, merged May 30, 2025)
- docs: move vllm patch to store quick start (#255, merged May 29, 2025)
- ci(docs): add documentation build validation workflow (#253, merged May 29, 2025)
- ci(docs): add workflow to sync documentation site (#254, merged May 29, 2025)
- Fy/quick start (#247, merged May 29, 2025)
5 Pull requests opened by 4 people
- updated inference tests to use github actions (#259, opened Jun 3, 2025)
- chore(ci): migrate from ubuntu-20.04 and clean up deprecated dependencies (#260, opened Jun 4, 2025)
- fix(workflows): enhance documentation sync reliability and update actions (#261, opened Jun 4, 2025)
- docs: Refine PEFT LoRA Serving Example Documentation (#262, opened Jun 4, 2025)
- feat: Update vllm version to 0.9.0.1 and support v1 engine (#263, opened Jun 4, 2025)
1 Issue opened by 1 person
- [Feature Request] Upgrade to vLLM v0.8.5.post1, engine v1 (#257, opened May 30, 2025)
4 Unresolved conversations
Conversations sometimes continue on older items that are not yet closed. Below is a list of all Issues and Pull Requests with unresolved conversations.
- feat(cli): unify sllm-cli and sllm-serve into one CLI entrypoint (#245, commented on Jun 5, 2025; 4 new comments)
- [BUG] Failed to initialize model (#249, commented on Jun 4, 2025; 0 new comments)
- feat: all quantization methods (#246, commented on Jun 2, 2025; 0 new comments)
- feat: Separate fine-tuning backend from the transformers backend (#251, commented on Jun 1, 2025; 0 new comments)