Pull requests: triton-inference-server/vllm_backend

ci: Move from V0 to V1
#100 opened Oct 17, 2025 by yinggeh

feat: Report more vllm metrics [enhancement]
#92 opened May 13, 2025 by Pavloveuge (3 of 10 tasks)

add multimodal support for qwen2.5
#90 opened May 6, 2025 by abdulazizab2

Add support for priority in vllm backend
#88 opened Apr 24, 2025 by TheCodeWrangler (2 of 5 tasks)

Followup with some fixes
#77 opened Dec 20, 2024 by oandreeva-nv (Draft)

docs: Update README.md [documentation]
#63 opened Sep 6, 2024 by yinggeh (Draft)

Add input and output tokens to response
#41 opened May 16, 2024 by kebe7jun