
Pinned

  1. vllm Public

    A high-throughput and memory-efficient inference and serving engine for LLMs

    Python · 39.8k stars · 6k forks

  2. llm-compressor Public

    Transformers-compatible library for applying various compression algorithms to LLMs for optimized deployment with vLLM

    Python · 1k stars · 91 forks

Repositories

Showing 10 of 15 repositories
  • vllm Public

    A high-throughput and memory-efficient inference and serving engine for LLMs (see the usage sketch after this list)

    Python · 39,802 stars · Apache-2.0 license · 5,963 forks · 1,332 open issues (11 need help) · 448 open pull requests · Updated Mar 1, 2025
  • aibrix Public

    Cost-efficient and pluggable infrastructure components for GenAI inference

    Jupyter Notebook · 2,701 stars · Apache-2.0 license · 235 forks · 116 open issues (11 need help) · 13 open pull requests · Updated Mar 1, 2025
  • llm-compressor Public

    Transformers-compatible library for applying various compression algorithms to LLMs for optimized deployment with vLLM (see the quantization sketch after this list)

    Python · 1,026 stars · Apache-2.0 license · 91 forks · 23 open issues · 46 open pull requests · Updated Mar 1, 2025
  • production-stack Public

    Scale from a single vLLM instance to a distributed vLLM deployment without changing any application code (see the client sketch after this list).

    Python · 535 stars · Apache-2.0 license · 75 forks · 26 open issues (4 need help) · 9 open pull requests · Updated Feb 28, 2025
  • vllm-ascend Public

    Community-maintained hardware plugin for vLLM on Ascend

    Python · 245 stars · Apache-2.0 license · 40 forks · 37 open issues (1 needs help) · 14 open pull requests · Updated Feb 28, 2025
  • buildkite-ci Public
    HCL · 8 stars · 18 forks · 0 open issues · 3 open pull requests · Updated Feb 27, 2025
  • flash-attention Public Forked from Dao-AILab/flash-attention

    Fast and memory-efficient exact attention

    Python · 47 stars · BSD-3-Clause license · 1,513 forks · 0 open issues · 11 open pull requests · Updated Feb 27, 2025
  • vllm-spyre Public

    Community-maintained hardware plugin for vLLM on Spyre

    Python · 12 stars · Apache-2.0 license · 4 forks · 0 open issues · 5 open pull requests · Updated Feb 26, 2025
  • FlashMLA Public Forked from deepseek-ai/FlashMLA
    C++ · 2 stars · MIT license · 702 forks · 0 open issues · 0 open pull requests · Updated Feb 25, 2025
  • vllm-project.github.io Public

    HTML · 4 stars · 11 forks · 0 open issues · 0 open pull requests · Updated Feb 25, 2025
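
For the vllm repository above, here is a minimal offline-inference sketch using vLLM's Python API; the model ID and sampling settings are placeholders:

```python
from vllm import LLM, SamplingParams

# Any Hugging Face-compatible model ID works here; opt-125m is only a small placeholder.
llm = LLM(model="facebook/opt-125m")
sampling_params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=64)

outputs = llm.generate(["Hello, my name is"], sampling_params)
for output in outputs:
    print(output.prompt, output.outputs[0].text)
```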
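
For llm-compressor, the sketch below follows the one-shot quantization pattern shown in the project's examples; the specific modifier, scheme name, and entry point are assumptions that may differ across versions:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from llmcompressor.modifiers.quantization import QuantizationModifier
from llmcompressor.transformers import oneshot  # assumed entry point; check your installed version

MODEL_ID = "meta-llama/Meta-Llama-3-8B-Instruct"  # placeholder model ID

model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

# Quantize all Linear layers to FP8 (dynamic activation scales), skipping the LM head.
recipe = QuantizationModifier(targets="Linear", scheme="FP8_DYNAMIC", ignore=["lm_head"])
oneshot(model=model, recipe=recipe)

# The saved checkpoint can then be served directly by vLLM.
model.save_pretrained("Meta-Llama-3-8B-Instruct-FP8-Dynamic")
tokenizer.save_pretrained("Meta-Llama-3-8B-Instruct-FP8-Dynamic")
```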
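
For production-stack, "without changing any application code" presumably means the router exposes the same OpenAI-compatible API that a single vLLM server does; a hedged client-side sketch, where the router URL and model name are made-up examples:

```python
from openai import OpenAI

# Point the standard OpenAI client at the production-stack router instead of a single
# vLLM server; the base_url below is a hypothetical in-cluster address.
client = OpenAI(
    base_url="http://vllm-router.vllm-system.svc.cluster.local/v1",
    api_key="EMPTY",
)

resp = client.chat.completions.create(
    model="meta-llama/Llama-3.1-8B-Instruct",  # example model name
    messages=[{"role": "user", "content": "Hello!"}],
)
print(resp.choices[0].message.content)
```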

Sponsors

  • @LEE5J
  • @terrytangyuan
  • @mhupfauer
  • @AlpinDale
  • @HiddenPeak
  • @dvlpjrs
  • @vincentkoc
  • @mgoin
  • @robertgshaw2-redhat
  • Private Sponsor
