Pinned repositories

  1. vllm (Public)

    A high-throughput and memory-efficient inference and serving engine for LLMs

    Python · 58.5k stars · 10.2k forks

  2. llm-compressor (Public)

    Transformers-compatible library for applying various compression algorithms to LLMs for optimized deployment with vLLM

    Python · 2k stars · 233 forks

  3. recipes (Public)

    Common recipes to run vLLM

    132 stars · 42 forks

Repositories

23 repositories in total

Sponsors

  • @yankay
  • @Mega4alik
  • @brickfrog
  • @GabrielBianconi
  • @imkero
  • @comet-ml
  • @thomas-hiddenpeak
  • @terrytangyuan
  • @dvlpjrs
  • @vincentkoc
  • @robertgshaw2-redhat
  • Private Sponsor
