vllm-operator
AI Inference Operator for Kubernetes. The easiest way to serve ML models in production. Supports VLMs, LLMs, embeddings, and speech-to-text.
Go · 1043 · Updated 8 hours ago
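
The workflow the description implies ends in an OpenAI-compatible HTTP endpoint, which is vLLM's standard serving interface. Below is a minimal sketch of a client call against such an endpoint; the in-cluster service URL and the model name are assumptions for illustration, not values taken from the repository.

```go
// Minimal sketch: querying an OpenAI-compatible chat completions endpoint
// exposed by a vLLM-backed deployment. The service URL and model name are
// placeholders; substitute the values from your own cluster.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"log"
	"net/http"
)

func main() {
	// Hypothetical in-cluster address of the inference service.
	url := "http://vllm-service.default.svc.cluster.local:8000/v1/chat/completions"

	// Build a standard OpenAI-style chat completions request body.
	reqBody, err := json.Marshal(map[string]any{
		"model": "meta-llama/Llama-3.1-8B-Instruct", // assumed model name
		"messages": []map[string]string{
			{"role": "user", "content": "Say hello in one sentence."},
		},
	})
	if err != nil {
		log.Fatal(err)
	}

	resp, err := http.Post(url, "application/json", bytes.NewReader(reqBody))
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	// Print the raw JSON response from the server.
	body, err := io.ReadAll(resp.Body)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(string(body))
}
```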
This repository contains Terraform configuration for the vllm production-stack on cloud-managed Kubernetes (K8s).
HCL · 1 · Updated 5 days ago