nova-llm
Verified Safe by GalacticQuasar
Overview
A full-stack LLM agent workflow with custom tool-calling capabilities and integration with Model Context Protocol (MCP) servers, supporting multiple Gemini models.
Installation
npm start

Environment Variables
- GEMINI_API_KEY
- PORT
- CLIENT_ORIGIN
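A local configuration might look like the following `.env` fragment. The values are placeholders for illustration, not defaults documented by the project; only the three variable names come from the list above.

```shell
# .env — example values only
GEMINI_API_KEY=your-gemini-api-key   # key for the Gemini API
PORT=3000                            # port the server listens on
CLIENT_ORIGIN=http://localhost:5173  # allowed origin for the client app
```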
Security Notes
The server reads its API key (GEMINI_API_KEY) and allowed client origin from environment variables, which is good practice. A rate limit on the streaming endpoint adds protection against abuse. The custom tool calls (`getTime`, `getRandomNumber`) use safe native JavaScript functions, so their arguments pose no direct injection risk. However, the server automatically executes `npx @modelcontextprotocol/server-sequential-thinking` to connect to an MCP server; while intended, this makes the server's security depend on that external package and on the `npm` ecosystem at runtime.
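The tool implementations themselves are not shown on this page. As a minimal sketch of what argument-safe tools of this kind could look like (only the tool names come from the project; the signatures and the dispatch table are assumptions):

```typescript
// Hypothetical implementations of the two custom tools. Both rely only
// on native JavaScript built-ins, so arguments supplied by the model
// cannot trigger code execution.

// Returns the current time as an ISO-8601 string.
function getTime(): string {
  return new Date().toISOString();
}

// Returns a random integer in [min, max], tolerating swapped bounds.
function getRandomNumber(min: number, max: number): number {
  const lo = Math.ceil(Math.min(min, max));
  const hi = Math.floor(Math.max(min, max));
  return lo + Math.floor(Math.random() * (hi - lo + 1));
}

// A dispatch table such as a function-calling loop might consult when
// the model requests a tool by name with JSON arguments.
const tools: Record<string, (args: Record<string, number>) => unknown> = {
  getTime: () => getTime(),
  getRandomNumber: ({ min, max }) => getRandomNumber(min, max),
};
```

Keeping every tool a pure function over its declared arguments is what makes the injection surface small: there is no `eval`, no shell, and no filesystem access for the model to reach through the arguments.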
Similar Servers
gemini-cli
Provides an A2A (Agent-to-Agent) server for the Gemini CLI, enabling external agents to interact with and utilize the CLI's capabilities for executing tasks and accessing tools.
mcp-use
A comprehensive framework for building full-stack Model Context Protocol (MCP) applications, including AI agents, MCP servers with UI widgets, and integrated debugging tools in both Python and TypeScript.
agentor
Build and deploy scalable AI agents that can interact with various tools and communicate via A2A and MCP protocols.
mcp-gemini-prompt-enhancer
A Model Context Protocol (MCP) server that provides a prompt optimization service for Large Language Models (LLMs) using Google Gemini, with advanced prompt engineering support and automatic PDF asset management.