ocs-mcp-server
by AILLY2025
Overview
The OCS MCP Server acts as an API gateway for an OCS (Order Management System), providing tools for order review, order querying, and warehouse management to automate business processes.
Installation
docker run -p 8000:8000 -e OCS_BASE_URL=your_url ocs-mcp-server
Environment Variables
- OCS_BASE_URL
- OCS_ACCESS_TOKEN
- OCS_APPLICATION_KEY
- OCS_USE_DYNAMIC_AUTH
- OCS_USERNAME
- OCS_PASSWORD
- OCS_LOGIN_TYPE
- OCS_LOGIN_PATH
- OCS_REFERER
- OCS_USER_AGENT
- OCS_AUTH_COOKIE
- OCS_USER_INFO_COOKIE
- OCS_VERSION_COOKIE
- OCS_TOOL_AOP_ENABLED
- OCS_TOOL_AOP_RULES
- OCS_STRATEGY_RULE_ID
- OCS_STRATEGY_SELLER_AUTO_REMARK_KEY_FLAG
- MCP_SERVER_HOST
- MCP_SERVER_PORT
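As a minimal sketch of how these variables might be consumed (the function name and defaults below are assumptions, not taken from the project's actual `config.py`), a loader could look like this:

```python
import os

def load_ocs_config():
    """Read server configuration from environment variables.

    Hypothetical sketch: only OCS_BASE_URL is treated as required;
    everything else falls back to a default. The real config.py
    may use different names and defaults.
    """
    base_url = os.getenv("OCS_BASE_URL")
    if not base_url:
        raise RuntimeError("OCS_BASE_URL must be set")
    return {
        "base_url": base_url,
        "access_token": os.getenv("OCS_ACCESS_TOKEN", ""),
        "application_key": os.getenv("OCS_APPLICATION_KEY", ""),
        "use_dynamic_auth": os.getenv("OCS_USE_DYNAMIC_AUTH", "false").lower() == "true",
        "host": os.getenv("MCP_SERVER_HOST", "0.0.0.0"),
        "port": int(os.getenv("MCP_SERVER_PORT", "8000")),
    }
```

Failing fast on a missing `OCS_BASE_URL` (rather than baking in a fallback URL) keeps misconfiguration visible at startup instead of surfacing as opaque request errors later.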
Security Notes
CRITICAL: This codebase contains several security issues:
- Hardcoded Docker registry credentials (`DOCKER_USERNAME`, `DOCKER_PWD`) in `build.sh` pose a severe risk if the repository is public.
- Hardcoded `fallback_access_token`, `application_key`, default `username`/`password`, and cookie values are present in `config.py`.
- SSL/TLS certificate verification is disabled (`httpx.AsyncClient(verify=False)`, `aiohttp.ClientSession(ssl=False)`) in `services/ocs_service.py` and `services/auth_service.py`, leaving the application vulnerable to man-in-the-middle (MITM) attacks, especially in production environments.
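One way to remediate the disabled-verification issue, sketched with the standard library (both httpx and aiohttp accept an `ssl.SSLContext` in place of a bare `True`/`False`; the `ca_file` parameter here is an assumption about how an internal CA would be supplied):

```python
import ssl

def make_tls_context(ca_file=None):
    """Build an SSL context with certificate verification enabled.

    Instead of passing verify=False / ssl=False to the HTTP client,
    hand it a verifying context. ca_file lets the server trust an
    internal corporate CA without disabling verification globally.
    """
    ctx = ssl.create_default_context(cafile=ca_file)
    ctx.check_hostname = True
    ctx.verify_mode = ssl.CERT_REQUIRED
    return ctx
```

The context would then be passed as `httpx.AsyncClient(verify=make_tls_context())` or `aiohttp.ClientSession(connector=aiohttp.TCPConnector(ssl=make_tls_context()))`, keeping MITM protection intact even against self-signed internal endpoints.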
Similar Servers
fluidmcp
Orchestrates Model Context Protocol (MCP) servers and LLM inference engines (like vLLM) via a unified FastAPI gateway, enabling dynamic management, tool invocation, and multi-model LLM serving.
agentxsuite
A unified open-source platform for connecting, managing, and monitoring AI agents and tools across various Model Context Protocol (MCP) servers.
enterprise_mcp_server
Provides a robust, multi-component Model Context Protocol (MCP) solution: an API Gateway for routing and management, an Enterprise MCP Server for core services such as authentication and tool administration, and a Tool Server for operational tool execution. Designed for integration with clients like Cursor and Claude Code.
observe-community-mcp
Provides LLMs with intelligent access to Observe platform data through semantic search, automated dataset discovery, and metrics intelligence.