vllm-operator

AI Inference Operator for Kubernetes. The easiest way to serve ML models in production. Supports VLMs, LLMs, embeddings, and speech-to-text.

Go · 893 stars · updated 15 hours ago

This repository contains Terraform configuration for deploying vllm production-stack on cloud-managed Kubernetes.

0 stars · updated 1 month ago