# Installation

vLLM supports the following hardware platforms:
## Hardware Plugins

The following backends live outside the main vllm repository and follow the hardware-pluggable RFC.
| Accelerator | PyPI / Package | Repository |
|---|---|---|
| Google TPU | tpu-inference | https://github.com/vllm-project/tpu-inference |
| Ascend NPU | vllm-ascend | https://github.com/vllm-project/vllm-ascend |
| Intel Gaudi (HPU) | N/A, install from source | https://github.com/vllm-project/vllm-gaudi |
| MetaX MACA GPU | N/A, install from source | https://github.com/MetaX-MACA/vLLM-metax |
| Rebellions ATOM / REBEL NPU | vllm-rbln | https://github.com/rebellions-sw/vllm-rbln |
| IBM Spyre AIU | vllm-spyre | https://github.com/vllm-project/vllm-spyre |
| Cambricon MLU | vllm-mlu | https://github.com/Cambricon/vllm-mlu |
| Baidu Kunlun XPU | N/A, install from source | https://github.com/baidu/vLLM-Kunlun |
| Sophgo TPU | N/A, install from source | https://github.com/sophgo/vllm-tpu |
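
For backends that publish a PyPI package (see the table above), a typical installation is a plain `pip install` of that package; this is only a rough sketch, and version constraints or extra system dependencies vary per backend, so consult the linked repository for authoritative instructions.

```bash
# Example sketch: install the Ascend NPU plugin from PyPI.
# The package name comes from the table above; pinning and prerequisites
# (driver/toolkit versions) differ per backend — see its repository's docs.
pip install vllm-ascend
```

Backends listed as "N/A, install from source" must instead be built by cloning the linked repository and following its build instructions.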