The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, No-code agent builder, MCP compatibility, and more.
Harness LLMs with Multi-Agent Programming
Local Deep Research achieves ~95% on SimpleQA benchmark (tested with GPT-4.1-mini). Supports local and cloud LLMs (Ollama, Google, Anthropic, ...). Searches 10+ sources - arXiv, PubMed, web, and your private documents. Everything Local.
Free, high-quality text-to-speech API endpoint to replace OpenAI, Azure, or ElevenLabs
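For an OpenAI-compatible endpoint like this, the usual integration path is to point an existing OpenAI client at the local server. A minimal sketch, assuming the server listens on localhost:8000 and accepts the placeholder model and voice names shown; these are not the project's documented defaults:

```python
# Minimal sketch: use the OpenAI Python SDK against a local, OpenAI-compatible TTS server.
# The base_url, model, and voice values below are assumptions for illustration only.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # point the SDK at the local server instead of api.openai.com
    api_key="not-needed",                 # local servers typically ignore the key, but the SDK requires one
)

with client.audio.speech.with_streaming_response.create(
    model="tts-1",   # placeholder model name; use whatever the local server advertises
    voice="alloy",   # placeholder voice name
    input="Hello from a locally hosted text-to-speech endpoint.",
) as response:
    response.stream_to_file("speech.mp3")  # write the streamed audio bytes to disk
```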
A simple "Be My Eyes" web app with a llama.cpp/llava backend
A magical LLM desktop client that makes it easy for *anyone* to use LLMs and MCP
A local, privacy-first résumé builder using LLMs and Markdown to generate ATS-ready DOCX files with Pandoc — no cloud, no tracking.
Chrome Extension to Summarize or Chat with Web Pages/Local Documents Using locally running LLMs. Keep all of your data and conversations private. 🔐
Local, OpenAI-compatible text-to-speech (TTS) API using Chatterbox, enabling users to generate voice-cloned speech anywhere the OpenAI API is used (e.g., Open WebUI, AnythingLLM)
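Because the route mirrors OpenAI's /v1/audio/speech API, it can also be called directly over HTTP. A minimal sketch using requests; the port, model, and voice values are assumptions, not the project's documented defaults:

```python
# Minimal sketch: call an OpenAI-style /v1/audio/speech route directly.
# The localhost address, model, and voice identifiers are placeholders.
import requests

resp = requests.post(
    "http://localhost:4123/v1/audio/speech",  # assumed local server address
    json={
        "model": "chatterbox",                # placeholder model identifier
        "voice": "cloned-voice",              # placeholder voice/clone identifier
        "input": "Voice cloning through an OpenAI-compatible endpoint.",
    },
    timeout=120,
)
resp.raise_for_status()

with open("cloned.wav", "wb") as f:           # audio bytes come back in the response body
    f.write(resp.content)
```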
A curated list of awesome platforms, tools, practices, and resources that help you run LLMs locally
Code with AI in VSCode; bring your own AI.
React Native Apple LLM plugin using Foundation Models
Your fully proficient, AI-powered, local chatbot assistant 🤖
Local coding agent with a neat UI
A text-based user interface (TUI) client for interacting with MCP servers using Ollama. Features include multi-server support, dynamic model switching, streaming responses, tool management, human-in-the-loop review, thinking mode, full model parameter configuration, custom system prompts, and saved preferences. Built for developers working with local LLMs.
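Under the hood, a client like this drives Ollama's local REST API. A minimal sketch of the kind of streaming chat request involved (not the project's own code; the model name is a placeholder):

```python
# Minimal sketch: streaming chat completion against Ollama's default local endpoint.
# The model name is a placeholder; error handling is kept minimal for brevity.
import json
import requests

payload = {
    "model": "llama3.2",  # placeholder; any locally pulled Ollama model works
    "messages": [{"role": "user", "content": "Summarize what MCP servers are in one sentence."}],
    "stream": True,
}

# Ollama streams newline-delimited JSON chunks from /api/chat on its default port 11434.
with requests.post("http://localhost:11434/api/chat", json=payload, stream=True, timeout=300) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines():
        if not line:
            continue
        chunk = json.loads(line)
        print(chunk.get("message", {}).get("content", ""), end="", flush=True)
        if chunk.get("done"):
            print()
            break
```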
💻 A simple, practical, and lightweight local AI chat client, written in Tauri 2.0 & Next.js.
LLM story writer with a focus on high-quality long output based on a user-provided prompt.
A Python package for developing AI applications with local LLMs.