structured-output
The AI framework that adds the engineering to prompt engineering (Python/TS/Ruby/Java/C#/Rust/Go compatible)
Unified Go interface for Large Language Model (LLM) providers. Simplifies LLM integration with flexible prompt management and common task functions.
A versatile workflow automation platform for creating, organizing, and executing AI workflows, from a single LLM call to complex AI-driven pipelines.
MLX Omni Server is a local inference server powered by Apple's MLX framework, specifically designed for Apple Silicon (M-series) chips. It implements OpenAI-compatible API endpoints, enabling seamless integration with existing OpenAI SDK clients while leveraging the power of local ML inference.
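Because the server exposes OpenAI-compatible endpoints, an existing OpenAI SDK client can be pointed at it simply by overriding the base URL. A minimal Python sketch, assuming the server is already running locally; the port, path, and model name below are placeholders rather than values from the project's documentation:

```python
from openai import OpenAI

# Point the standard OpenAI client at the local server instead of api.openai.com.
client = OpenAI(
    base_url="http://localhost:8000/v1",  # hypothetical local endpoint; check the project's docs
    api_key="not-needed",                 # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="local-model",  # placeholder model identifier
    messages=[{"role": "user", "content": "Say hello from Apple Silicon."}],
)
print(response.choices[0].message.content)
```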
Simplifies the retrieval, extraction, and training of structured data from various unstructured sources.
OpenAPI definitions, converters and LLM function calling schema composer.
🚬 cigs are chainable AI functions for TypeScript. Call functions with natural language and get a response back in a specified structure. Uses OpenAI's latest Structured Outputs.
[ti]ny [li]ttle machine learning [tool]box - Machine learning, anomaly detection, one-class classification, and structured output prediction
Making LLM Tool-Calling Simpler.
Non-Pydantic, Non-JSON Schema, efficient AutoPrompting and Structured Output Library
This repository demonstrates how to leverage OpenAI's GPT-4 models with JSON Strict Mode to extract structured data from web pages. It combines web scraping capabilities from Firecrawl with OpenAI's advanced language models to create a powerful data extraction pipeline.
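A sketch of the extraction step only, assuming the page has already been scraped to text (the repository uses Firecrawl for that part); the schema, field names, and model are illustrative:

```python
import json
from openai import OpenAI

client = OpenAI()

page_markdown = "..."  # scraped page content would go here

# JSON Strict Mode requires additionalProperties: false and every property listed as required.
schema = {
    "type": "object",
    "properties": {
        "title": {"type": "string"},
        "price": {"type": "string"},
    },
    "required": ["title", "price"],
    "additionalProperties": False,
}

response = client.chat.completions.create(
    model="gpt-4o-2024-08-06",
    messages=[
        {"role": "system", "content": "Extract the requested fields from the page."},
        {"role": "user", "content": page_markdown},
    ],
    response_format={
        "type": "json_schema",
        "json_schema": {"name": "page_data", "schema": schema, "strict": True},
    },
)
print(json.loads(response.choices[0].message.content))
```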
Learn how to build effective LLM-based applications with Semantic Kernel in C#
Python decorator to define GPT-powered functions on top of OpenAI's structured output
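The general shape of such a decorator might look like the following; this is a hypothetical sketch of the idea, not this library's actual API (the decorator name, model, and return-type handling are all assumptions):

```python
from openai import OpenAI
from pydantic import BaseModel

client = OpenAI()

def llm_function(func):
    """Hypothetical decorator: replace the function body with a structured-output call."""
    return_model = func.__annotations__["return"]  # expected to be a Pydantic model

    def wrapper(prompt: str):
        completion = client.beta.chat.completions.parse(
            model="gpt-4o-mini",  # illustrative model name
            messages=[
                {"role": "system", "content": func.__doc__ or ""},
                {"role": "user", "content": prompt},
            ],
            response_format=return_model,
        )
        return completion.choices[0].message.parsed

    return wrapper

class Sentiment(BaseModel):
    label: str
    confidence: float

@llm_function
def classify_sentiment(text: str) -> Sentiment:
    """Classify the sentiment of the given text."""

# Usage: classify_sentiment("I love this!") returns a Sentiment instance.
```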
Repository for our paper "DRS: Deep Question Reformulation With Structured Output".
This is the Python backend for InsightAI.
Structured Output OpenAI Showcase. A Prime Numbers Calculator that demonstrates OpenAI's structured output capabilities. This repository is public because many published LLM examples use outdated API calls, and this script aims to help users quickly experiment with structured outputs.
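For reference, a minimal sketch of the kind of current-API call the description refers to, using the openai Python SDK's parse helper with a Pydantic model (the schema and model name here are illustrative, not taken from the repository):

```python
from openai import OpenAI
from pydantic import BaseModel

class PrimeCheck(BaseModel):
    number: int
    is_prime: bool
    smallest_factor: int  # 1 if the number is prime

client = OpenAI()
completion = client.beta.chat.completions.parse(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Is 221 prime? If not, give its smallest factor greater than 1."}],
    response_format=PrimeCheck,
)
print(completion.choices[0].message.parsed)
```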
Open Source Deep Research
A sample application to demonstrate how to use Structured Outputs in OpenAI Chat Completions API with streaming, built using Next.js.
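The repository itself is built with Next.js; here is a rough Python sketch of the same idea, requesting a strict JSON schema response and accumulating the JSON as it streams in (schema and model are illustrative):

```python
import json
from openai import OpenAI

client = OpenAI()

schema = {
    "type": "object",
    "properties": {"summary": {"type": "string"}},
    "required": ["summary"],
    "additionalProperties": False,
}

stream = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Summarize why structured outputs are useful."}],
    response_format={
        "type": "json_schema",
        "json_schema": {"name": "summary", "schema": schema, "strict": True},
    },
    stream=True,
)

buffer = ""
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        buffer += delta  # partial JSON arrives token by token
print(json.loads(buffer))
```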
Develop an intuition about Large Language Models (LLMs)