Repository navigation
kserve
Standardized Distributed Generative and Predictive AI Inference Platform for Scalable, Multi-Framework Deployment on Kubernetes
Hopsworks - Data-Intensive AI platform with a Feature Store
🪐 1-click Kubeflow using ArgoCD
Carbon Limiting Auto Tuning for Kubernetes
AWS SageMaker, SeldonCore, KServe, Kubeflow & MLflow, VectorDB
My repo for the Machine Learning Engineering bootcamp 2022 by DataTalks.Club
The Machine Learning Zoomcamp teaches foundational and advanced ML concepts using tools like NumPy, Pandas, Scikit-Learn, TensorFlow, XGBoost, Flask, Docker, AWS, Kubernetes, and KServe. It covers regression, classification, evaluation metrics, neural networks, deployment strategies, and end-to-end projects to bridge theory and practice.
Collection of best practices, reference architectures, examples, and utilities for foundation model development and deployment on AWS.
Deploying a machine learning model using 10+ different deployment tools
This repository demonstrates how to deploy, scale, and monitor machine learning models on Kubernetes using KServe and Kubeflow Lite components.
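Since several entries here revolve around KServe's InferenceService resource, a minimal sketch of creating one with the kserve Python SDK follows. The model name, namespace, and storage URI are placeholders, and exact SDK usage may vary across KServe releases; treat this as orientation, not the repository's own code.

```python
# Minimal sketch: deploy a scikit-learn model as a KServe InferenceService.
# The name, namespace, and storage URI below are placeholders for illustration.
from kubernetes import client
from kserve import (
    KServeClient,
    V1beta1InferenceService,
    V1beta1InferenceServiceSpec,
    V1beta1PredictorSpec,
    V1beta1SKLearnSpec,
)

isvc = V1beta1InferenceService(
    api_version="serving.kserve.io/v1beta1",
    kind="InferenceService",
    metadata=client.V1ObjectMeta(name="sklearn-iris", namespace="models"),
    spec=V1beta1InferenceServiceSpec(
        predictor=V1beta1PredictorSpec(
            sklearn=V1beta1SKLearnSpec(
                storage_uri="gs://example-bucket/models/sklearn/iris"  # placeholder URI
            )
        )
    ),
)

KServeClient().create(isvc)  # submits the InferenceService to the cluster
```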
Client/server system to perform distributed inference on high-load systems.
Hands-on labs on deploying machine learning models with tf-serving and KServe
A demo to accompany our blog post "Scalable Machine Learning with Kafka Streams and KServe"
Everything to get industrial Kubeflow applications running in production
TeiaCareInferenceClient is a C++ inference client library that implements the KServe protocol
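TeiaCareInferenceClient speaks the KServe (Open Inference) v2 protocol on the wire. For orientation, here is a hedged Python sketch of the same protocol's REST form; the host, model name, tensor name, and data values are assumptions for illustration, while the payload layout follows the v2 REST specification.

```python
# Sketch: call a deployed model over the KServe v2 REST (Open Inference) protocol.
# Host, model name, tensor name, and values are placeholders for illustration.
import requests

host = "http://localhost:8080"   # assumed predictor endpoint
model = "sklearn-iris"           # assumed model name

payload = {
    "inputs": [
        {
            "name": "input-0",   # tensor name expected by the model
            "shape": [1, 4],
            "datatype": "FP32",
            "data": [6.8, 2.8, 4.8, 1.4],
        }
    ]
}

resp = requests.post(f"{host}/v2/models/{model}/infer", json=payload, timeout=10)
resp.raise_for_status()
print(resp.json()["outputs"])    # v2 responses return an "outputs" list
```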
Kubeflow examples - Notebooks, Pipelines, Models, Model tuning and more
In this video, we walk you through building a machine learning model with Kubeflow and deploying it to KServe with an InferenceService.
KServe TrustyAI explainer
A scalable RAG-based Wikipedia chat assistant that leverages the Llama-2-7b-chat LLM, with inference served through KServe