
BentoML

The easiest way to build fast and reliable AI serving systems

Welcome to BentoML 👋


What's cooking? 👩‍🍳

🍱 BentoML: The Unified Serving Framework for AI/ML Systems

BentoML is a Python library for building online serving systems optimized for AI apps and model inference. It supports serving any model format/runtime and custom Python code, offering the key primitives for serving optimizations, task queues, batching, multi-model chains, distributed orchestration, and multi-GPU serving.
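
To make that concrete, here is a minimal sketch of a BentoML service using the current Python API; the class name, endpoint name, and echo logic are illustrative placeholders rather than anything taken from this page:

    import bentoml

    # Decorating a class with @bentoml.service marks it as a deployable
    # serving unit that "bentoml serve" can discover and run.
    @bentoml.service
    class Echo:
        # Each @bentoml.api method is exposed as an HTTP endpoint, with
        # request and response handling derived from the type annotations.
        @bentoml.api
        def echo(self, text: str) -> str:
            return text

Running "bentoml serve" against the file containing this class starts a local HTTP server (port 3000 by default) that exposes the endpoint.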

🎨 Examples: Learn by doing!

A collection of examples for BentoML, from deploying an OpenAI-compatible LLM service to building voice phone-calling agents and RAG applications. Use these examples to learn BentoML and build your own solutions.

🦾 OpenLLM: Self-hosting Large Language Models Made Easy

Run any open-source LLM (Llama, Mistral, Qwen, Phi, and more) or custom fine-tuned model as an OpenAI-compatible API with a single command. It features a built-in chat UI, state-of-the-art inference performance, and a simplified workflow for production-grade cloud deployment.
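
Because the served endpoint is OpenAI-compatible, an existing OpenAI client can point straight at it. A rough sketch, assuming a locally running server; the base URL, port, and model name below are illustrative assumptions, not values from this page:

    from openai import OpenAI

    # Point the standard OpenAI Python client at a locally running
    # OpenLLM server; the URL, port, and api_key value are placeholders.
    client = OpenAI(base_url="http://localhost:3000/v1", api_key="na")

    # The model name is a hypothetical identifier; use whichever model
    # the server was started with.
    response = client.chat.completions.create(
        model="my-llama-model",
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(response.choices[0].message.content)

Pointing base_url at a cloud deployment instead works the same way.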

☁️ BentoCloud: Unified Inference Platform for any model, on any cloud

BentoCloud is the easiest way to build and deploy with BentoML, in our cloud or yours. It brings fast, scalable inference infrastructure to any cloud, allowing AI teams to move 10x faster in building AI applications while reducing compute costs through maximized utilization, fast GPU autoscaling, minimal cold starts, and full observability. Sign up today!

Get in touch 💬

👉 Join our Slack community!

👀 Follow us on X @bentomlai and LinkedIn

📖 Read our blog

Pinned

  1. BentoML Public

    The easiest way to serve AI apps and models - Build Model Inference APIs, Job queues, LLM apps, Multi-model pipelines, and more!

    Python · 7.7k stars · 836 forks

  2. OpenLLM Public

    Run any open-source LLM, such as DeepSeek and Llama, as an OpenAI-compatible API endpoint in the cloud.

    Python · 11.3k stars · 719 forks

  3. Yatai Public

    Model Deployment at Scale on Kubernetes 🦄️

    TypeScript · 811 stars · 72 forks

  4. BentoVLLM Public

    Self-host LLMs with vLLM and BentoML

    Python · 109 stars · 14 forks

  5. BentoDiffusion Public

    BentoDiffusion: A collection of diffusion models served with BentoML

    Python · 363 stars · 29 forks

  6. comfy-pack Public

    A comprehensive toolkit for reliably locking, packing and deploying environments for ComfyUI workflows.

    Python · 146 stars · 14 forks

Repositories

Showing 10 of 106 repositories
  • comfy-pack Public

    A comprehensive toolkit for reliably locking, packing and deploying environments for ComfyUI workflows.

    Python · 146 stars · Apache-2.0 · 14 forks · 9 open issues · 1 open PR · Updated May 16, 2025
  • BentoML Public

    The easiest way to serve AI apps and models - Build Model Inference APIs, Job queues, LLM apps, Multi-model pipelines, and more!

    Python · 7,704 stars · Apache-2.0 · 836 forks · 125 open issues · 3 open PRs · Updated May 16, 2025
  • helm-charts Public
    1 star · 0 forks · 0 open issues · 0 open PRs · Updated May 15, 2025
  • yatai-image-builder Public

    🐳 Build OCI images for Bentos in k8s

    Go · 18 stars · 10 forks · 6 open issues · 10 open PRs · Updated May 15, 2025
  • BentoTriton Public
    Python · 1 star · Apache-2.0 · 1 fork · 0 open issues · 0 open PRs · Updated May 15, 2025
  • BentoTwilioConversationRelay Public

    Python · 9 stars · 2 forks · 1 open issue · 1 open PR · Updated May 14, 2025
  • BentoVLLM Public

    Self-host LLMs with vLLM and BentoML

    Python · 109 stars · Apache-2.0 · 14 forks · 0 open issues · 1 open PR · Updated May 13, 2025
  • OpenLLM Public

Run any open-source LLM, such as DeepSeek and Llama, as an OpenAI-compatible API endpoint in the cloud.

    Python · 11,264 stars · Apache-2.0 · 719 forks · 2 open issues · 0 open PRs · Updated May 12, 2025
  • BentoLlamaCpp Public

    BentoML + llama.cpp

    Python · 0 stars · 0 forks · 0 open issues · 0 open PRs · Updated May 10, 2025
  • setup-bentoml-action Public

GitHub Action to bootstrap a BentoML setup

    Shell · 0 stars · Apache-2.0 · 1 fork · 0 open issues · 0 open PRs · Updated May 7, 2025

Sponsoring

  • @pdm-project

