
digital-twin-portfolio

Verified Safe

by TheaMarieM

Overview

An AI-powered portfolio that acts as a digital twin, offering interactive querying, interview simulation, and RAG-driven semantic search over professional experience. It is exposed via the Model Context Protocol (MCP) to AI assistants such as Claude Desktop and VS Code Copilot.

Installation

Run Command
python mcp/server.py
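
The run command above can be wired into an MCP client. A minimal sketch of a Claude Desktop entry, assuming the standard `claude_desktop_config.json` layout — the server name `digital-twin`, the relative path, and the env values shown are illustrative, not taken from the project:

```json
{
  "mcpServers": {
    "digital-twin": {
      "command": "python",
      "args": ["mcp/server.py"],
      "env": {
        "LLM_PROVIDER": "openai",
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
```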

Environment Variables

  • OPENAI_API_KEY
  • GROQ_API_KEY
  • UPSTASH_VECTOR_REST_URL
  • UPSTASH_VECTOR_REST_TOKEN
  • UPSTASH_VECTOR_INDEX
  • UPSTASH_REDIS_REST_URL
  • UPSTASH_REDIS_REST_TOKEN
  • USE_LOCAL_EMBEDDINGS
  • LOCAL_EMBEDDING_SERVICE_URL
  • OLLAMA_URL
  • OLLAMA_MODEL
  • EMBEDDING_MODEL
  • EMBEDDING_DIM
  • MAX_HISTORY_MESSAGES
  • SESSION_TTL_SECONDS
  • LLM_PROVIDER
  • OPENAI_CHAT_MODEL
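
The variables above might be collected in a `.env` file along these lines — a sketch only: every value below is a placeholder or an assumption, not a documented default of this project:

```
# Illustrative .env sketch — values are placeholders, not project defaults
LLM_PROVIDER=openai                # assumed to also accept "groq"
OPENAI_API_KEY=sk-...
OPENAI_CHAT_MODEL=gpt-4o-mini
UPSTASH_VECTOR_REST_URL=https://example-vector.upstash.io
UPSTASH_VECTOR_REST_TOKEN=...
UPSTASH_REDIS_REST_URL=https://example-redis.upstash.io
UPSTASH_REDIS_REST_TOKEN=...
USE_LOCAL_EMBEDDINGS=false
EMBEDDING_MODEL=text-embedding-3-small
EMBEDDING_DIM=1536
MAX_HISTORY_MESSAGES=20
SESSION_TTL_SECONDS=3600
```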

Security Notes

The project demonstrates good security practices: explicit rate limiting (Redis-based for chat; in-memory for RAG, as implemented in `app/api/rag/route.ts`), comprehensive input validation, and content filtering for sensitive information and prompt injection. Session IDs are generated securely, security headers are set on API responses, and secrets are read from environment variables. Calling `json.loads` on external API responses carries inherent risk, but this is standard practice when interacting with LLM and embedding services, and the calls are wrapped in `try-except` blocks. In-memory rate limiting for RAG in the Next.js backend could be a concern for multi-instance production deployments; the README addresses this by suggesting Redis for production.
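
The in-memory rate-limiting pattern described above can be sketched as a sliding-window limiter. This is an illustrative Python version, not the project's actual implementation (which lives in the TypeScript route `app/api/rag/route.ts`); the class and parameter names are invented for the example. It also shows why the multi-instance concern arises: the counter state lives in one process.

```python
import time
from collections import defaultdict, deque


class SlidingWindowRateLimiter:
    """In-memory sliding-window rate limiter (illustrative sketch).

    State is held in this process only, so limits are not shared across
    instances — the multi-instance production concern noted above, and
    the reason the README suggests Redis for production.
    """

    def __init__(self, max_requests: int, window_seconds: float):
        self.max_requests = max_requests
        self.window_seconds = window_seconds
        self._hits: dict[str, deque] = defaultdict(deque)

    def allow(self, key: str, now: float = None) -> bool:
        now = time.monotonic() if now is None else now
        hits = self._hits[key]
        # Evict timestamps that have aged out of the window.
        while hits and now - hits[0] >= self.window_seconds:
            hits.popleft()
        if len(hits) >= self.max_requests:
            return False  # over the limit within the current window
        hits.append(now)
        return True


limiter = SlidingWindowRateLimiter(max_requests=3, window_seconds=60)
results = [limiter.allow("client-1", now=t) for t in (0, 1, 2, 3)]
print(results)  # → [True, True, True, False]
```

A Redis-backed variant would replace the per-process `deque` with shared keys (e.g. via `INCR` plus `EXPIRE`), so all instances enforce one limit.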

Stats

Interest Score: 0
Security Score: 8
Cost Class: Medium
Avg Tokens: 1500
Stars: 0
Forks: 0
Last Update: 2025-11-19

Tags

AI, Portfolio, MCP, RAG, LLM