hayhooks
by deepset-ai
Overview
Deploy and serve Haystack Pipelines and Agents as REST APIs or MCP Tools, with OpenAI compatibility and Open WebUI integration, including support for RAG systems with file uploads and streaming.
Installation
pip install hayhooks
hayhooks run
Environment Variables
- OPENAI_API_KEY
- LOG
- HAYHOOKS_HOST
- HAYHOOKS_PORT
- HAYHOOKS_ROOT_PATH
- HAYHOOKS_PIPELINES_DIR
- HAYHOOKS_ADDITIONAL_PYTHON_PATH
- HAYHOOKS_USE_HTTPS
- HAYHOOKS_DISABLE_SSL
- HAYHOOKS_SHOW_TRACEBACKS
- HAYHOOKS_STREAMING_COMPONENTS
- HAYHOOKS_CORS_ALLOW_ORIGINS
- HAYHOOKS_CORS_ALLOW_METHODS
- HAYHOOKS_CORS_ALLOW_HEADERS
- HAYHOOKS_CORS_ALLOW_CREDENTIALS
- HAYHOOKS_CORS_ALLOW_ORIGIN_REGEX
- HAYHOOKS_CORS_EXPOSE_HEADERS
- HAYHOOKS_CORS_MAX_AGE
- HAYHOOKS_MCP_HOST
- HAYHOOKS_MCP_PORT
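As a sketch of how a few of these variables might be set when launching the server programmatically, the snippet below prepares an environment and invokes the `hayhooks run` CLI via `subprocess`. Every value shown (host, port, pipelines directory, origins) is an illustrative assumption, not a documented default, and it assumes list-valued CORS variables are passed as JSON arrays.

```python
import json
import os
import subprocess

# Illustrative configuration for a local deployment; all values here are
# examples chosen for this sketch, not documented defaults.
env = {
    **os.environ,
    "HAYHOOKS_HOST": "127.0.0.1",
    "HAYHOOKS_PORT": "1416",
    "HAYHOOKS_PIPELINES_DIR": "./pipelines",
    # Assumption: list-valued settings are supplied as JSON arrays.
    "HAYHOOKS_CORS_ALLOW_ORIGINS": json.dumps(["http://localhost:3000"]),
    "LOG": "INFO",
}

# Launch the server with the prepared environment (uncomment to run):
# subprocess.run(["hayhooks", "run"], env=env, check=True)
```

Keeping the configuration in a single `env` mapping makes it easy to audit what the server will see before it starts.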
Security Notes
The default CORS settings (`HAYHOOKS_CORS_ALLOW_ORIGINS=["*"]`) allow requests from any origin, which is a significant security risk if the server is exposed publicly without tighter controls. The RAG example's `docker-compose.yml` configures Elasticsearch with `xpack.security.enabled=false`, which is acceptable for local development but highly insecure in production. Sensitive API keys (e.g., `OPENAI_API_KEY`) are loaded from environment variables, which is good practice, though they still require careful secret management.
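One way to tighten the permissive default is to replace the `["*"]` allow-list with an explicit set of trusted origins before the server starts. The sketch below shows this; the origin names are hypothetical placeholders, and it assumes list-valued CORS variables are encoded as JSON arrays.

```python
import json
import os

# Hypothetical allow-list for a production deployment; replace these
# placeholders with the origins your frontends actually use.
trusted_origins = ["https://app.example.com", "https://admin.example.com"]

cors_env = {
    # Replace the permissive ["*"] default with an explicit allow-list.
    "HAYHOOKS_CORS_ALLOW_ORIGINS": json.dumps(trusted_origins),
    # Restrict methods and headers to what the clients actually need.
    "HAYHOOKS_CORS_ALLOW_METHODS": json.dumps(["GET", "POST"]),
    "HAYHOOKS_CORS_ALLOW_HEADERS": json.dumps(["Authorization", "Content-Type"]),
}
os.environ.update(cors_env)
```

An explicit allow-list fails closed: a new frontend must be added deliberately rather than being admitted by a wildcard.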
Similar Servers
mcpo
Exposes Model Context Protocol (MCP) tools as OpenAPI-compatible HTTP servers.
mcp-openapi-server
A Model Context Protocol (MCP) server that exposes OpenAPI endpoints as MCP tools, along with optional support for MCP prompts and resources, enabling Large Language Models to interact with REST APIs.
zeromcp
A minimal, pure Python Model Context Protocol (MCP) server for exposing tools, resources, and prompts via HTTP/SSE and Stdio transports.
jentic-sdks
The Jentic MCP Plugin enables AI-agent builders to discover, load, and execute external APIs and workflows via the Model Context Protocol (MCP), generating LLM-compatible tool definitions.