Your Personal AI Assistant; easy to install and deploy on your own machine or in the cloud; supports multiple chat apps with easily extensible capabilities.
🔥 Official Firecrawl MCP Server - Adds powerful web scraping and search to Cursor, Claude, and any other LLM client.
The open-source LLMOps platform: prompt playground, prompt management, LLM evaluation, and LLM observability all in one place.
Control Gmail, Google Calendar, Docs, Sheets, Slides, Chat, Forms, Tasks, Search & Drive with AI - Comprehensive Google Workspace / G Suite MCP Server & CLI Tool
Automated TDD enforcement for Claude Code
The LLM Anti-Framework
Your autonomous engineering team in a CLI. Point Zeroshot at an issue, walk away, and return to production-grade code. Supports Claude Code, OpenAI Codex, OpenCode, and Gemini CLI.
Full computer-use for AI agents. Self-learning workflows. Native macOS. No screenshots required.
Prismer Cloud
Workspace template + MCP server for Claude Code, Codex CLI, Cursor & Windsurf. Multi-agent knowledge engine (ag-refresh / ag-ask) that turns any codebase into a queryable AI assistant.
Entity-level git merge driver. Resolves false conflicts git invents when independent agents edit the same file. ~95% fewer conflicts than line-based merging.
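For context, a custom merge driver like this plugs into git through `.git/config` and `.gitattributes`. The sketch below shows the standard registration mechanism; `entity-merge` is a hypothetical command name standing in for the project's actual binary.

```
# .git/config — register the driver. Git invokes it with the ancestor (%O),
# ours (%A), and theirs (%B) versions; the driver rewrites %A in place and
# exits 0 on a clean merge. "entity-merge" is a placeholder name.
[merge "entity"]
    name = entity-level merge
    driver = entity-merge %O %A %B

# .gitattributes — route source files through the driver instead of the
# line-based default.
*.py merge=entity
```

With this in place, `git merge` hands conflicting `*.py` files to the entity-level driver rather than git's built-in line-based three-way merge.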
NyaProxy acts as a smart, central manager for accessing various online services (APIs) – AI providers such as OpenAI, Gemini, and Anthropic, image generators, or almost any web service that uses access keys. It helps you use these services more reliably, efficiently, and securely.
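The core reliability trick behind such a proxy is rotating requests across a pool of access keys and skipping keys that have failed (rate-limited, revoked, and so on). A minimal sketch of that general technique, not NyaProxy's actual implementation or configuration format:

```python
from itertools import cycle

class KeyRotator:
    """Round-robin over a pool of API keys, skipping disabled ones.

    Illustration of the general key-rotation idea only; the class and
    method names here are hypothetical, not NyaProxy's API.
    """

    def __init__(self, keys):
        self._keys = list(keys)
        self._cycle = cycle(self._keys)
        self._disabled = set()

    def next_key(self):
        # Advance round-robin; skip keys marked as failed.
        for _ in range(len(self._keys)):
            key = next(self._cycle)
            if key not in self._disabled:
                return key
        raise RuntimeError("all API keys are disabled")

    def disable(self, key):
        # Mark a key as unusable (e.g. after a 429 or 401 response).
        self._disabled.add(key)

rotator = KeyRotator(["key-A", "key-B", "key-C"])
print(rotator.next_key())  # key-A
rotator.disable("key-B")
print(rotator.next_key())  # key-C (key-B is skipped)
```

A production proxy layers retry, per-key rate accounting, and automatic re-enabling on top of this loop, but the skip-and-rotate core is the same.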
The best way to create, deploy, and share MCP Servers
Agent Skills for Solopreneurs
Framework for AI agents to build and maintain an Obsidian wiki using Karpathy's LLM Wiki pattern
Easily create LLM tools and agents using plain Bash/JavaScript/Python functions.
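The general pattern behind function-based LLM tools is to derive the JSON-schema-style declaration that chat APIs expect from a plain function's signature and docstring. A hedged sketch of that idea in Python — `tool_schema` and `get_weather` are illustrative names, not this project's actual mechanism:

```python
import inspect

def tool_schema(fn):
    """Build an OpenAI-style tool declaration from a plain function.

    Sketch of the general technique only: parameter types come from
    annotations, required-ness from the absence of defaults.
    """
    type_map = {str: "string", int: "integer", float: "number", bool: "boolean"}
    props, required = {}, []
    for name, param in inspect.signature(fn).parameters.items():
        props[name] = {"type": type_map.get(param.annotation, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": {"type": "object", "properties": props, "required": required},
    }

def get_weather(city: str, days: int = 1):
    """Return a short weather summary for a city."""
    return f"{city}: sunny for {days} day(s)"

schema = tool_schema(get_weather)
print(schema["name"])                    # get_weather
print(schema["parameters"]["required"])  # ['city']
```

The same introspection idea carries over to Bash and JavaScript functions, where the metadata is typically read from comments instead of annotations.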
Open‑WebUI Tools is a modular toolkit designed to extend and enrich your Open WebUI instance, turning it into a powerful AI workstation. With a suite of more than 15 specialized tools, function pipelines, and filters, the project supports academic research, agentic autonomy, multimodal creativity, workflows, and more.
A command-line interface tool for serving LLMs using vLLM.
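For orientation, a wrapper like this typically drives vLLM's OpenAI-compatible server underneath. A sketch of that underlying usage (the model name is just an example; a local GPU install of vLLM is assumed):

```shell
# Start vLLM's OpenAI-compatible server on port 8000.
vllm serve Qwen/Qwen2.5-0.5B-Instruct --port 8000

# Query it with the standard OpenAI-style chat completions endpoint.
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "Qwen/Qwen2.5-0.5B-Instruct",
       "messages": [{"role": "user", "content": "Hello"}]}'
```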
Give each AI agent its own isolated machine with root, Docker, and systemd. Active defense detects and stops threats automatically.