
dstack

vLLM + dstack

You can run vLLM on a cloud-based GPU machine with dstack, an open-source framework for running LLMs on any cloud. This tutorial assumes you have already configured credentials, a gateway, and GPU quotas in your cloud environment.

To install the dstack client, run:

pip install "dstack[all]"
dstack server

Next, to configure your dstack project, run:

mkdir -p vllm-dstack
cd vllm-dstack
dstack init

Next, to provision a VM instance for serving the LLM (NousResearch/Llama-2-7b-chat-hf in this example), create the following serve.dstack.yml file for a dstack Service:

Config
type: service

python: "3.11"
env:
    - MODEL=NousResearch/Llama-2-7b-chat-hf
port: 8000
resources:
    gpu: 24GB
commands:
    - pip install vllm
    - vllm serve $MODEL --port 8000
model:
    format: openai
    type: chat
    name: NousResearch/Llama-2-7b-chat-hf

Then, run the following CLI to provision it:

Command
$ dstack run . -f serve.dstack.yml

⠸ Getting run plan...
Configuration  serve.dstack.yml
Project        deep-diver-main
User           deep-diver
Min resources  2..xCPU, 8GB.., 1xGPU (24GB)
Max price      -
Max duration   -
Spot policy    auto
Retry policy   no

 #  BACKEND  REGION       INSTANCE       RESOURCES                               SPOT  PRICE
 1  gcp      us-central1  g2-standard-4  4xCPU, 16GB, 1xL4 (24GB), 100GB (disk)  yes   $0.223804
 2  gcp      us-east1     g2-standard-4  4xCPU, 16GB, 1xL4 (24GB), 100GB (disk)  yes   $0.223804
 3  gcp      us-west1     g2-standard-4  4xCPU, 16GB, 1xL4 (24GB), 100GB (disk)  yes   $0.223804
    ...
Shown 3 of 193 offers, $5.876 max

Continue? [y/n]: y
⠙ Submitting run...
⠏ Launching spicy-treefrog-1 (pulling)
spicy-treefrog-1 provisioning completed (running)
Service is published at ...

After provisioning finishes, you can interact with the model using the OpenAI SDK:

Code
from openai import OpenAI

client = OpenAI(
    base_url="https://gateway.<gateway domain>",
    api_key="<YOUR-DSTACK-SERVER-ACCESS-TOKEN>",
)

completion = client.chat.completions.create(
    model="NousResearch/Llama-2-7b-chat-hf",
    messages=[
        {
            "role": "user",
            "content": "Compose a poem that explains the concept of recursion in programming.",
        }
    ],
)

print(completion.choices[0].message.content)
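Under the hood, the SDK call above simply POSTs a JSON body to the gateway's OpenAI-compatible chat completions route, with the access token in an Authorization header. A minimal sketch of that request body (the field names follow the standard OpenAI chat API, nothing dstack-specific):

```python
import json

# JSON body equivalent to the client.chat.completions.create(...) call above;
# the SDK sends it to <base_url>/chat/completions with the API key as a
# Bearer token in the Authorization header.
payload = {
    "model": "NousResearch/Llama-2-7b-chat-hf",
    "messages": [
        {
            "role": "user",
            "content": "Compose a poem that explains the concept of recursion in programming.",
        }
    ],
}

print(json.dumps(payload, indent=2))
```

This can be useful for debugging the gateway with curl or another HTTP client when the SDK is not available.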

Note

dstack automatically handles authentication on the gateway using dstack's tokens. Meanwhile, if you don't want to configure a gateway, you can provision a dstack Task instead of a Service; a Task is intended for development purposes only. If you want more hands-on material on how to serve vLLM with dstack, check out this repository.
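If you go the Task route for development, the configuration stays close to the Service one above. The sketch below assumes the Task schema mirrors the fields used earlier (with `ports` replacing the Service's `port` and `model` entries); check the dstack documentation for the exact Task spec:

```yaml
type: task

python: "3.11"
env:
    - MODEL=NousResearch/Llama-2-7b-chat-hf
ports:
    - 8000
resources:
    gpu: 24GB
commands:
    - pip install vllm
    - vllm serve $MODEL --port 8000
```

Because no gateway is involved, you would reach the server through the port forwarded by `dstack run` rather than a published endpoint.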