llm-ops
🦍 The Cloud-Native API Gateway and AI Gateway.
Run any open-source LLM, such as DeepSeek or Llama, as an OpenAI-compatible API endpoint in the cloud (see the client sketch after this list).
Deploy serverless AI workflows at scale. Firebase for AI agents
AutoRAG: An Open-Source Framework for Retrieval-Augmented Generation (RAG) Evaluation & Optimization with AutoML-Style Automation
RAG (Retrieval-Augmented Generation) framework for building modular, open-source applications for production, by TrueFoundry
AIConfig is a config-based framework to build generative AI applications.
The collaborative spreadsheet for AI. Chain cells into powerful pipelines, experiment with prompts and models, and evaluate LLM responses in real-time. Work together seamlessly to build and iterate on AI applications.
Python SDK for running evaluations on LLM generated responses
An end-to-end LLM reference implementation providing a Q&A interface for Airflow and Astronomer
[⛔️ DEPRECATED] Friendli: the fastest serving engine for generative AI
Cluster/scheduler health monitoring for GPU jobs on Kubernetes (k8s)
Quality Control for AI Artifact Management
Miscellaneous code and writings for MLOps
Streamlit-based chatbot leveraging Ollama via LangChain, with PostHog-LLM for advanced logging and monitoring (a minimal sketch follows this list)
The prompt engineering, prompt management, and prompt evaluation tool for TypeScript, JavaScript, and NodeJS.
Lightweight Agent Framework for building AI apps with any LLM
A Python package for tracking and analyzing LLM usage across different models and applications. It is primarily designed as a library for integration into the development process of LLM-based agentic workflow tooling, providing robust tracking capabilities.
LLM/ML Observability Toolkits and Services
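
For the OpenAI-compatible endpoint pattern mentioned in the DeepSeek/Llama serving entry above, here is a minimal client sketch using the official openai Python SDK. The base URL, API key, and model name are placeholder assumptions, not values from any project listed here.

```python
# Minimal sketch: calling a self-hosted, OpenAI-compatible endpoint with the
# official openai Python SDK. The base_url, api_key, and model name are
# placeholders -- substitute whatever your gateway or serving layer exposes.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # hypothetical self-hosted endpoint
    api_key="not-needed-for-local",       # many local servers ignore the key
)

response = client.chat.completions.create(
    model="llama-3-8b-instruct",  # placeholder model name
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize what an AI gateway does."},
    ],
)
print(response.choices[0].message.content)
```

Because the endpoint follows the OpenAI wire format, the same client code works whether the model is served locally or behind a cloud gateway; only `base_url` and the model name change.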
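For the Streamlit/LangChain/Ollama chatbot entry above, a minimal sketch of the chat loop, assuming the streamlit and langchain-ollama packages and a locally running Ollama server; the model tag is a placeholder and the PostHog-LLM logging layer mentioned in the description is omitted.

```python
# Minimal sketch of a Streamlit chat loop backed by Ollama through LangChain.
# Assumes `streamlit` and `langchain-ollama` are installed and an Ollama server
# is running locally with the named model pulled; PostHog-LLM logging is omitted.
import streamlit as st
from langchain_ollama import ChatOllama

llm = ChatOllama(model="llama3")  # placeholder model tag

st.title("Ollama chatbot (sketch)")
st.session_state.setdefault("history", [])  # list of (role, content) tuples

# Replay previous turns so the conversation persists across Streamlit reruns.
for role, content in st.session_state["history"]:
    with st.chat_message(role):
        st.markdown(content)

if prompt := st.chat_input("Ask something"):
    st.session_state["history"].append(("user", prompt))
    with st.chat_message("user"):
        st.markdown(prompt)

    # LangChain chat models accept (role, content) tuples as messages.
    reply = llm.invoke(
        [("human" if r == "user" else "ai", c) for r, c in st.session_state["history"]]
    )
    st.session_state["history"].append(("assistant", reply.content))
    with st.chat_message("assistant"):
        st.markdown(reply.content)
```

Saved as `app.py`, a script like this would be launched with `streamlit run app.py`.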