
allbeapi

Verified Safe

by TingjiaInFuture

Overview

Enables Large Language Models to interact with and execute local Python libraries and custom scripts by exposing them as Model Context Protocol (MCP) servers.

Installation

Run Command
allbeapi start <library_name>

Environment Variables

  • LOG_LEVEL
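
A hypothetical usage sketch combining the run command and the `LOG_LEVEL` variable; the accepted values are assumed to follow Python's standard logging levels, and `requests` stands in for any library you choose to expose:

```shell
# Expose the `requests` library as an MCP server with verbose logging.
# (LOG_LEVEL values such as DEBUG/INFO/WARNING are an assumption here.)
LOG_LEVEL=DEBUG allbeapi start requests
```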

Security Notes

The system's core functionality involves dynamic introspection, code generation, and execution of Python functions and methods on objects returned by tools. This design implies a strong trust boundary: the user *must* trust the Python libraries or custom scripts they choose to expose.

The `call-object-method` tool, while powerful for stateful workflows, allows LLMs to invoke arbitrary methods on stored Python objects. If an exposed function returns an object with methods that can perform sensitive operations (e.g., file system access, arbitrary command execution via `subprocess`), an LLM could potentially call these methods. While `analyzer.py` includes an "input complexity filter" to limit arguments, there is no explicit filtering or sandboxing for the methods available on *returned* objects.

Data does not leave the local network, but the local execution environment is not sandboxed. The dependency installer (`installer.py`) uses `pip install` with basic package name validation, which is standard but still involves executing external code.
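A minimal sketch of the risk described above, using hypothetical names (`ReportHandle`, `generate_report`, `call_object_method` are illustrations, not this server's actual API). An object returned by an exposed function can carry methods far more dangerous than the ones the tool intended to surface; a conservative mitigation is an explicit method allow-list:

```python
import subprocess

class ReportHandle:
    """Hypothetical object returned by an exposed tool function."""

    def text(self):
        # The intended, benign capability.
        return "quarterly summary"

    def run_shell(self, cmd):
        # Also present on the object: arbitrary command execution.
        # Nothing in an unfiltered call-object-method stops an LLM
        # from invoking this instead of text().
        return subprocess.run(
            cmd, shell=True, capture_output=True, text=True
        ).stdout

def generate_report():
    return ReportHandle()

# Mitigation sketch: only allow-listed methods may be invoked on
# stored objects.
ALLOWED_METHODS = {"text"}

def call_object_method(obj, method, *args):
    if method not in ALLOWED_METHODS:
        raise PermissionError(f"method {method!r} is not allowed")
    return getattr(obj, method)(*args)
```

Without the allow-list check, `getattr(obj, method)(*args)` is effectively arbitrary code execution over whatever the returned object can reach.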


Stats

Interest Score: 35
Security Score: 6
Cost Class: Low
Avg Tokens: 300
Stars: 2
Forks: 0
Last Update: 2025-12-05

Tags

LLM Agent Tools, Python Automation, API Generation, Stateful Operations, Local AI