digital-twin-workshop
Verified Safe
by tinatuazon
Overview
Provides AI-powered query access to a professional digital twin profile over the Model Context Protocol (MCP), so AI tools such as GitHub Copilot can interactively explore the profile's background, skills, and career goals.
Installation
cd mcp-server && npm install && npm start
Environment Variables
- GROQ_API_KEY
- UPSTASH_VECTOR_REST_URL
- UPSTASH_VECTOR_REST_TOKEN
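Since the server depends on these three variables, a fail-fast check at startup avoids confusing runtime errors later. A minimal TypeScript sketch (the helper names `missingEnvVars` and `assertEnv` are illustrative, not part of the actual codebase):

```typescript
// Required configuration for the server (from the list above).
const REQUIRED_VARS = [
  "GROQ_API_KEY",
  "UPSTASH_VECTOR_REST_URL",
  "UPSTASH_VECTOR_REST_TOKEN",
] as const;

// Return every required name that is unset or blank in the given env map.
function missingEnvVars(env: Record<string, string | undefined>): string[] {
  return REQUIRED_VARS.filter((name) => !env[name]?.trim());
}

// Throw on startup if configuration is incomplete.
// In a Node process you would call this as assertEnv(process.env).
function assertEnv(env: Record<string, string | undefined>): void {
  const missing = missingEnvVars(env);
  if (missing.length > 0) {
    throw new Error(
      `Missing required environment variables: ${missing.join(", ")}`
    );
  }
}
```

Calling the check before any Groq or Upstash client is constructed keeps the error message actionable instead of surfacing as a failed API call.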
Security Notes
- API keys (Groq, Upstash) are supplied via environment variables, so no secrets are hardcoded.
- User queries are validated and truncated before being passed to the LLM.
- LLM system prompts include strict instructions to answer only from the provided context, mitigating prompt injection risks.
- `profile-loader.ts` reads from a local JSON file, which is safe because it is not user-controlled.
- No `eval` or other direct dynamic code execution is apparent.
- Network requests to the external AI and vector services use timeouts and retries, enhancing stability.

Overall, good security practices are in place for this type of application.
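The validate-then-truncate step can be sketched as follows. This is an illustrative example, not the server's actual code: the function name `sanitizeQuery` and the 500-character limit are assumptions.

```typescript
// Assumed cap on query length before the text reaches the LLM prompt.
const MAX_QUERY_CHARS = 500;

// Validate and truncate a user query before prompt assembly.
function sanitizeQuery(raw: string): string {
  const trimmed = raw.trim();
  if (trimmed.length === 0) {
    throw new Error("Query must not be empty");
  }
  // Replace control characters that could disrupt prompt formatting.
  const cleaned = trimmed.replace(/[\u0000-\u001f\u007f]/g, " ");
  // Truncate so an oversized query cannot blow the prompt budget.
  return cleaned.slice(0, MAX_QUERY_CHARS);
}
```

Truncating before interpolation bounds token usage; the strict system prompt then handles whatever adversarial text survives sanitization.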
Similar Servers
sparql-llm
An LLM-powered agent for generating, validating, and executing SPARQL queries against biomedical knowledge graphs, utilizing Retrieval-Augmented Generation (RAG) with endpoint-specific metadata and schema for improved accuracy.
mcp-local-rag
Local RAG server for developers enabling private, offline semantic search with keyword boosting on personal or project documents (PDF, DOCX, TXT, MD, HTML).
RagThisCode
Set up a RAG (Retrieval-Augmented Generation) system to chat with the code of any public or private GitHub repository.
nordstemmen-ai
Semantic search engine for public documents of Nordstemmen municipality, integrated with AI platforms via the Model Context Protocol (MCP).