hakan-personal-mcp
by sudohakan
Overview
A self-healing and self-improving multi-agent AI system designed to manage GitBook documentation and Postman collections, perform database and system operations, monitor other instances, and run automatic GitHub backups.
Installation
docker compose up -d
Environment Variables
- GITBOOK_URL
- POSTMAN_DIR
- LOG_LEVEL
- OLLAMA_URL
- CACHE_TTL
- GITHUB_TOKEN
- CODEX_API_KEY
- OPENAI_API_KEY
- CLAUDE_CODE_API_KEY
- ANTHROPIC_API_KEY
- AI_KEY_PASSWORD
- MONGODB_URI
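A minimal sketch of how a Node.js server might read these variables, assuming the runtime is Node; the defaults shown here are illustrative assumptions, not the project's documented values:

```javascript
// Read the documented environment variables into a config object.
// Variable names come from the README; fallback values are assumptions.
const config = {
  gitbookUrl: process.env.GITBOOK_URL,                         // no safe default
  postmanDir: process.env.POSTMAN_DIR,                         // no safe default
  logLevel: process.env.LOG_LEVEL ?? "info",                   // assumed default
  ollamaUrl: process.env.OLLAMA_URL ?? "http://localhost:11434", // common Ollama port
  cacheTtl: Number(process.env.CACHE_TTL ?? 300),              // seconds, assumed
  mongodbUri: process.env.MONGODB_URI,                         // no safe default
};
```

The API keys (`GITHUB_TOKEN`, `OPENAI_API_KEY`, etc.) would be read the same way; keeping them out of the compose file and in an untracked `.env` file avoids committing secrets.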
Security Notes
The server makes extensive use of `execAsync` and `spawn` to run shell commands and external processes: database backups and restores (`pg_dump`, `mysqldump`), system commands (`robocopy`, `npm build`, `git`), and even arbitrary process spawning for inter-MCP communication (`mcp_connect`). Many of these commands interpolate AI-generated or user-provided input directly into command strings, creating a significant command-injection risk if that input is not perfectly sanitized. The `systemOptimizationTools` explicitly elevate privileges (`runAsAdmin`) to perform powerful system modifications, which is a high risk if the server is compromised. Self-improvement is mitigated by `restrictedPaths` and an approval workflow, but the overall design grants so much operational power that it is critically important to run the server in an isolated environment with trusted inputs.
Similar Servers
inspector
A web-based client and proxy server for inspecting and interacting with Model Context Protocol (MCP) servers, allowing users to browse resources, prompts, and tools, perform requests, and debug OAuth authentication flows.
mcp_massive
An AI agent orchestration server, likely interacting with LLMs and managing multi-agent workflows.
2ly
Skilder is an infrastructure layer for AI agent tooling, providing a private tool registry and embedded runtimes for integrating with various agent frameworks and custom tools.
llms
A centralized configuration and documentation management system for LLMs, providing tools for building skills, commands, agents, prompts, and managing MCP servers across multiple LLM providers.