
Hands_on_LLM

by Theanh130124

Overview

Develop and demonstrate LLM agents interacting with external tools via MCP (Model Context Protocol) servers, featuring prompt caching for efficiency and a Streamlit UI for a banking assistant.

Installation

Run Command
python mcp/mcp_server.py

Environment Variables

  • MODEL_API_KEY
  • HF_API_KEY
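Since the server fails in unhelpful ways when these variables are absent, a minimal startup check can validate them up front. This is a sketch, not code from the repository; the `load_config` helper and `REQUIRED_VARS` names are hypothetical:

```python
import os

# Environment variables the server expects (per the list above).
REQUIRED_VARS = ["MODEL_API_KEY", "HF_API_KEY"]

def load_config() -> dict:
    """Read the required environment variables, failing fast if any are missing."""
    missing = [name for name in REQUIRED_VARS if not os.environ.get(name)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return {name: os.environ[name] for name in REQUIRED_VARS}
```

Failing fast at startup gives a clearer error than an authentication failure deep inside an LLM call.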

Security Notes

The LLM client configurations (`mcp/llm_call_mcp_sse.py` and `mcp/agent_call_mcp_sse.py`) use hardcoded placeholder keys (`api_key="FAKEAPI"` or `api_key="ANYTHING"`) when connecting to a custom LLM endpoint (`http://103.78.3.96:8000/v1`). LLM calls are therefore effectively unauthenticated, which is a significant security risk if the endpoint is publicly reachable or otherwise inadequately secured. Additionally, prompt caching stores queries and responses as plain JSON files on the local filesystem, which could expose sensitive information such as banking queries.
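One way to address the hardcoded-key issue is to read the key from the `MODEL_API_KEY` environment variable and refuse to start without it. The sketch below is an assumption about how the fix could look, not the repository's code; the `llm_client_config` helper is hypothetical, and the returned dict is meant to be passed as keyword arguments to whatever OpenAI-compatible client the scripts use:

```python
import os

def llm_client_config(base_url: str = "http://103.78.3.96:8000/v1") -> dict:
    """Build LLM client keyword arguments without a hardcoded placeholder key.

    Replaces the api_key="FAKEAPI" pattern: the key comes from the
    MODEL_API_KEY environment variable, and startup fails loudly if it is unset.
    """
    api_key = os.environ.get("MODEL_API_KEY")
    if not api_key:
        raise RuntimeError(
            "MODEL_API_KEY is not set; refusing to connect with a placeholder key"
        )
    return {"base_url": base_url, "api_key": api_key}
```

This keeps secrets out of source control and makes a missing key an explicit error rather than a silently unauthenticated request.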


Stats

  • Interest Score: 0
  • Security Score: 3
  • Cost Class: High
  • Avg Tokens: 1000
  • Stars: 0
  • Forks: 0
  • Last Update: 2025-11-27

Tags

LLM Agent, Tool Use, Caching, Streamlit, MCP Server