mcp-server-weaviate
by iteam1
Overview
This server provides tools to interact with a Weaviate vector database, enabling semantic, keyword, and hybrid search functionalities for Q&A and information retrieval within an agentic application context.
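To illustrate what hybrid search combines, here is a small sketch in the spirit of Weaviate's relative-score fusion: the vector and keyword (BM25) result sets are each min-max normalized, then blended with a weight alpha (alpha = 1 is pure vector search, alpha = 0 pure keyword search). The scores and document ids are made-up illustration data, not output from this server.

```python
def normalize(scores):
    """Min-max normalize a {doc_id: score} mapping to [0, 1]."""
    lo, hi = min(scores.values()), max(scores.values())
    span = (hi - lo) or 1.0  # avoid division by zero on uniform scores
    return {doc: (s - lo) / span for doc, s in scores.items()}

def hybrid_fuse(vector_scores, bm25_scores, alpha=0.5):
    """Blend normalized vector and keyword scores per document,
    returning (doc_id, fused_score) pairs sorted best-first."""
    v, k = normalize(vector_scores), normalize(bm25_scores)
    docs = set(v) | set(k)  # a doc may appear in only one result set
    fused = {d: alpha * v.get(d, 0.0) + (1 - alpha) * k.get(d, 0.0)
             for d in docs}
    return sorted(fused.items(), key=lambda item: item[1], reverse=True)

# Illustrative scores: cosine similarities vs. raw BM25 scores.
vector_scores = {"doc1": 0.92, "doc2": 0.75, "doc3": 0.40}
bm25_scores = {"doc2": 12.0, "doc3": 9.5, "doc4": 2.0}
print(hybrid_fuse(vector_scores, bm25_scores, alpha=0.5))
```

With alpha = 0.5, doc2 wins because it ranks well in both result sets, even though doc1 is the best pure-vector match.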
Installation
uv --directory /absolute/path/to/mcp-server-weaviate run mcp_server_weaviate --weaviate-http-port <weaviate-http-port> --weaviate-grpc-port <weaviate-grpc-port>
Environment Variables
- WEAVIATE_HTTP_PORT
- WEAVIATE_GRPC_PORT
- OPENAI_API_KEY
Security Notes
The provided `docker-compose.yml` explicitly sets `AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED: 'true'` for the Weaviate instance. This is a critical security vulnerability as it allows unauthenticated access to the database, potentially leading to data exposure and unauthorized modifications if deployed in a non-isolated environment. While convenient for local development, it poses a significant risk for production use without changes. OpenAI API keys are handled via environment variables, which is a good practice.
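One way to harden the instance before any non-local deployment is to disable anonymous access and require an API key. The fragment below is a sketch against Weaviate's standard authentication environment variables; the key value and user name are placeholders, and the service name is assumed to match the provided `docker-compose.yml`.

```yaml
# Sketch of a hardened docker-compose.yml fragment (placeholders, not real keys):
services:
  weaviate:
    environment:
      AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED: 'false'
      AUTHENTICATION_APIKEY_ENABLED: 'true'
      AUTHENTICATION_APIKEY_ALLOWED_KEYS: 'replace-with-a-strong-random-key'
      AUTHENTICATION_APIKEY_USERS: 'admin-user'
```

Clients (including this MCP server) would then need to present the configured key when connecting.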
Similar Servers
memorizer-v1
A .NET-based service for AI agents to store, retrieve, and search through memories using vector embeddings, featuring asynchronous chunking, version control, and relationship management.
nexus-dev
Provides a local RAG (Retrieval-Augmented Generation) system and persistent memory for AI coding agents to enhance their contextual understanding, cross-project learning, and tool-use capabilities.
simple-memory-mcp-server
A Python server designed to manage and serve memory for AI agents, facilitating their interaction with external Large Language Models or data sources.
workspace-qdrant-mcp
A semantic workspace platform that provides project-scoped vector database operations through a Model Context Protocol (MCP) server, backed by a high-performance Rust daemon for file watching, processing, and ingestion. It enables LLM agents to naturally interact with project knowledge through conversational memory and hybrid semantic search.