ollama-mcp-example
Verified Safe by flexnst
Overview
Sets up a local LLM environment with Ollama and a web UI (Open-WebUI), and integrates Model Context Protocol (MCP) tools for enhanced LLM capabilities such as web content retrieval.
Installation
make init
Environment Variables
- FETCH_USER_AGENT
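As a sketch of how this variable might be supplied (assuming the fetch MCP tool reads it as an ordinary environment variable, e.g. from a `.env` file consumed by Docker Compose; the value shown is purely illustrative):

```shell
# Hypothetical .env entry: override the User-Agent header the fetch tool
# sends when retrieving web content on behalf of the LLM.
FETCH_USER_AGENT="Mozilla/5.0 (compatible; ollama-mcp-example/1.0)"
```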
Security Notes
The project leverages Docker for service isolation. However, several services are configured with wildcard (`*`) CORS origins (`OLLAMA_ORIGINS: '*'`, `CORS_ORIGINS=*`, `--allow-origin=*`). While common for local development, this poses a security risk if the environment is exposed publicly. The `WEBUI_SECRET_KEY` for Open-WebUI is explicitly set to an empty string; in any production deployment it should be set to a strong secret so that sessions are signed securely. No `eval` calls or other apparently malicious patterns were found in the provided code.
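One way to address the empty secret is to generate a random value and append it to the environment file before starting the stack. A minimal sketch, assuming a `.env` file is used and `openssl` is available (the variable name matches the note above; the file path is an assumption):

```shell
# Generate a 64-character hex secret for Open-WebUI session signing
# and append it to the local .env file (assumed location).
WEBUI_SECRET_KEY="$(openssl rand -hex 32)"
echo "WEBUI_SECRET_KEY=${WEBUI_SECRET_KEY}" >> .env
```

Rotating this value invalidates existing sessions, so it is best generated once per deployment rather than on every start.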
Similar Servers
DevDocs
DevDocs is a web crawling and content extraction platform designed to accelerate software development by converting documentation into LLM-ready formats for intelligent data querying and fine-tuning.
mcp-client-for-ollama
An interactive terminal client for connecting local Ollama LLMs to Model Context Protocol (MCP) servers, enabling advanced tool use and workflow automation for local LLMs.
webscraping-ai-mcp-server
Integrates with WebScraping.AI to provide LLM-powered web data extraction, including question answering, structured data extraction, and HTML/text retrieval, with advanced features like JavaScript rendering and proxy management.
ollama-fastmcp-wrapper
A proxy service that bridges Ollama with FastMCP, enabling local LLM tool-augmented reasoning by exposing MCP servers' functionality to Ollama models.