mcp-openapi-server
Verified Safe · by ivo-toby
Overview
Exposes OpenAPI endpoints as Model Context Protocol (MCP) tools, enabling Large Language Models (LLMs) to discover and interact with REST APIs through a standardized protocol.
Installation
```bash
npx @ivotoby/openapi-mcp-server --api-base-url <YOUR_API_BASE_URL> --openapi-spec <YOUR_OPENAPI_SPEC_URL_OR_PATH>
```
Environment Variables
- API_BASE_URL
- OPENAPI_SPEC_PATH
- OPENAPI_SPEC_FROM_STDIN
- OPENAPI_SPEC_INLINE
- API_HEADERS
- SERVER_NAME
- SERVER_VERSION
- TRANSPORT_TYPE
- HTTP_PORT
- HTTP_HOST
- ENDPOINT_PATH
- TOOLS_MODE
- DISABLE_ABBREVIATION
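As a sketch of how these variables might be used in place of CLI flags, the launch below configures an HTTP transport. The variable names come from the list above, but every value shown (URLs, host, port, and the "http" transport name) is an illustrative assumption rather than something taken from the package documentation.
```bash
# Minimal sketch: variable names are from the list above; all values are assumptions.
export API_BASE_URL="https://api.example.com"                    # base URL of the target REST API
export OPENAPI_SPEC_PATH="https://api.example.com/openapi.json"  # spec URL or local file path
export TRANSPORT_TYPE="http"                                     # assumed value selecting the HTTP transport
export HTTP_HOST="127.0.0.1"                                     # assumed bind address
export HTTP_PORT="3000"                                          # assumed port
npx @ivotoby/openapi-mcp-server
```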
Security Notes
The server validates request origins on the HTTP transport and enforces a configurable maximum request body size (4 MB). It uses an `AuthProvider` interface for dynamic authentication, reducing reliance on hardcoded tokens that may expire. Placeholder tokens in the examples are clearly marked, and no `eval` or other obviously malicious patterns were found. Users are responsible for managing API keys and tokens securely (e.g., via environment variables or a custom `AuthProvider`) rather than hardcoding secrets in production.
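As a minimal sketch of that recommendation, the snippet below injects a bearer token from the environment at launch time instead of hardcoding it. `MY_API_TOKEN` is a hypothetical variable, and the `Header:Value` format assumed for `API_HEADERS` should be checked against the package documentation.
```bash
# Hypothetical secret handling: MY_API_TOKEN is assumed to be provided by a secrets manager or CI runner.
# The "Header:Value" syntax for API_HEADERS is an assumption; verify it against the package docs.
export API_HEADERS="Authorization:Bearer ${MY_API_TOKEN}"
npx @ivotoby/openapi-mcp-server \
  --api-base-url "https://api.example.com" \
  --openapi-spec "https://api.example.com/openapi.json"
```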
Similar Servers
mcpo
Exposes Model Context Protocol (MCP) tools as OpenAPI-compatible HTTP servers for integration with LLM agents and other applications.
tmcp
A framework for building Model Context Protocol (MCP) servers for AI agents, providing schema-agnostic tools, resources, and prompts, with optional OAuth 2.1 authentication and distributed session management.
boilerplate-mcp-server
This boilerplate provides a production-ready foundation for developing custom Model Context Protocol (MCP) servers in TypeScript to connect AI assistants with external APIs and data sources, exemplified by an IP geolocation lookup tool.
mcp
A TypeScript SDK for building and interacting with Model Context Protocol (MCP) servers, facilitating AI agent interaction through exposed tools, prompts, and resources via JSON-RPC over HTTP/SSE.