nova-llm
Verified Safe · by GalacticQuasar
Overview
A full-stack LLM agent workflow with custom tool calling and Model Context Protocol (MCP) server integration, supporting multiple Gemini models.
Installation
npm start
Environment Variables
- GEMINI_API_KEY
- PORT
- CLIENT_ORIGIN
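The variables above might be consumed roughly as follows. This is an illustrative sketch, not the project's actual startup code; the default port and origin fallback are assumptions.

```javascript
// Sketch of reading the server's environment. The variable names come
// from the README; the defaults here are illustrative assumptions.
function loadConfig(env = process.env) {
  if (!env.GEMINI_API_KEY) {
    // Fail fast: the Gemini API key is the only hard requirement.
    throw new Error("GEMINI_API_KEY is required");
  }
  return {
    apiKey: env.GEMINI_API_KEY,
    port: Number(env.PORT ?? 3000),          // assumed default port
    clientOrigin: env.CLIENT_ORIGIN ?? "*",  // assumed CORS fallback
  };
}
```

Keeping the key out of source control and passing it only through the environment matches the security note below.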
Security Notes
The server reads its API key (GEMINI_API_KEY) and client origin from environment variables, which is good practice, and applies a rate limit to the streaming endpoint to protect against abuse. The custom tools (`getTime`, `getRandomNumber`) use safe native JavaScript functions, mitigating direct injection risks through their arguments. However, the server automatically executes `npx @modelcontextprotocol/server-sequential-thinking` to connect to an MCP server; while intentional, this introduces a runtime dependency on the security of that external package and of the npm ecosystem.
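The "safe native functions" claim can be pictured with a minimal sketch of the two tool handlers. The declaration shape below is an assumption for illustration, not the project's actual Gemini tool schema; the point is that model-supplied arguments are only coerced to numbers, never evaluated as code.

```javascript
// Hypothetical handlers for the two custom tools named above.
// Arguments arrive from the model, so they are validated before use.
const tools = {
  // No arguments: returns the current time as an ISO 8601 string.
  getTime: () => new Date().toISOString(),

  // Coerces min/max to finite numbers so malformed or malicious
  // arguments fail safely instead of reaching an eval-like sink.
  getRandomNumber: ({ min = 0, max = 1 } = {}) => {
    const lo = Number(min);
    const hi = Number(max);
    if (!Number.isFinite(lo) || !Number.isFinite(hi)) {
      throw new Error("min and max must be finite numbers");
    }
    return lo + Math.random() * (hi - lo);
  },
};
```

Because neither handler touches the filesystem, network, or a shell, the remaining attack surface is the spawned MCP subprocess, as the note above points out.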
Similar Servers
gemini-cli
The A2A (Agent-to-Agent) server implementation for the Gemini CLI, exposing tools and resources via the Model Context Protocol (MCP) to extend Gemini CLI capabilities.
mcp-use
A full-stack framework for building Model Context Protocol (MCP) servers, clients, and AI agents in Python and TypeScript, with support for UI widgets, code execution, and observability.
gemini-mcp-tool
A Model Context Protocol (MCP) server that enables AI assistants to interact with the Google Gemini CLI for comprehensive code and file analysis, structured edit suggestions, and creative brainstorming.
agentor
Build and deploy scalable AI agents that can interact with various tools and communicate via A2A and MCP protocols.