Repositories tagged "local-llm"

Mintplex-Labs/anything-llm

The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, No-code agent builder, MCP compatibility, and more.

JavaScript · 49661 stars · updated 1 day ago

Local Deep Research achieves ~95% on SimpleQA benchmark (tested with GPT-4.1-mini). Supports local and cloud LLMs (Ollama, Google, Anthropic, ...). Searches 10+ sources - arXiv, PubMed, web, and your private documents. Everything Local.

Python · 3455 stars · updated 1 hour ago

A magical LLM desktop client that makes it easy for *anyone* to use LLMs and MCP.

Svelte · 502 stars · updated 4 days ago

A simple "Be My Eyes" web app with a llama.cpp/llava backend

JavaScript · 492 stars · updated 2 years ago

A Python script designed to translate large amounts of text with an LLM via the Ollama API; a minimal sketch of such a call follows below.

Python · 384 stars · updated 18 days ago
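As a rough illustration of the idea (not this project's actual code), the sketch below sends one chunk of text to a local Ollama server's documented /api/generate endpoint and reads back the translation. The model name, prompt wording, and chunking scheme are assumptions for the example.

```python
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint


def translate_chunk(text: str, target_lang: str = "English", model: str = "llama3.1") -> str:
    """Ask a locally running Ollama model to translate one chunk of text."""
    prompt = (
        f"Translate the following text into {target_lang}. "
        f"Return only the translation.\n\n{text}"
    )
    resp = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"].strip()


# Large documents are normally split into chunks so each request
# stays within the model's context window.
chunks = ["Bonjour tout le monde.", "Ceci est un test."]
print("\n".join(translate_chunk(c) for c in chunks))
```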

A local, privacy-first résumé builder using LLMs and Markdown to generate ATS-ready DOCX files with Pandoc; no cloud, no tracking. The Pandoc conversion step is sketched below.

TypeScript · 339 stars · updated 1 month ago
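For context only: the Markdown-to-DOCX step that such a tool typically delegates to Pandoc looks roughly like the call below. The file names and the reference template are placeholders, not this repository's actual layout.

```python
import subprocess

# Placeholder file names; a reference .docx controls fonts and styles in the output.
subprocess.run(
    ["pandoc", "resume.md", "-o", "resume.docx", "--reference-doc=reference.docx"],
    check=True,
)
```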

Local, OpenAI-compatible text-to-speech (TTS) API using Chatterbox, enabling users to generate voice-cloned speech anywhere the OpenAI API is used (e.g. Open WebUI, AnythingLLM). A sketch of a request against such an endpoint follows below.

Python · 313 stars · updated 18 days ago
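A minimal sketch of what "OpenAI-compatible" means in practice: a client posts to the standard /v1/audio/speech route and receives raw audio bytes. The local port, model name, voice id, and output format below are assumptions, not values documented by this project.

```python
import requests

# Assumed local server address; an OpenAI-compatible TTS server exposes the
# same /v1/audio/speech route as the OpenAI API, so existing clients can
# simply point their base URL here.
BASE_URL = "http://localhost:8000/v1"

resp = requests.post(
    f"{BASE_URL}/audio/speech",
    json={
        "model": "chatterbox",  # placeholder model name
        "voice": "default",     # placeholder voice / clone id
        "input": "Hello from a locally hosted text-to-speech server.",
    },
    timeout=120,
)
resp.raise_for_status()

with open("speech.wav", "wb") as f:
    f.write(resp.content)  # the endpoint returns raw audio bytes
```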

Chrome extension to summarize or chat with web pages and local documents using locally running LLMs. Keep all of your data and conversations private. 🔐

TypeScript · 299 stars · updated 1 year ago

A curated list of awesome platforms, tools, practices and resources that help run LLMs locally.

293 stars · updated 1 day ago

LocalineAI brings powerful AI capabilities directly to your Windows terminal while keeping your data completely private and secure. No cloud dependencies, no data sharing - just pure AI power at your fingertips.

290 stars · updated 4 months ago

A text-based user interface (TUI) client for interacting with MCP servers using Ollama. Features include multi-server support, dynamic model switching, streaming responses, tool management, human-in-the-loop, thinking mode, full model-parameter configuration, custom system prompts and saved preferences. Built for developers working with local LLMs.

Python · 285 stars · updated 6 days ago

A simple, intuitive toolkit for quickly implementing LLM-powered applications.

Python · 265 stars · updated 9 months ago