
LLMling

by phil65

Overview

A declarative, YAML-configured Python framework for building LLM applications. It manages resources, prompts, and tools, and serves as a backend for MCP servers and Pydantic-AI agents.
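To illustrate the declarative style, a minimal config might look like the following. The key names here are assumptions based on the resource/prompt/tool model described above, not a verified schema:

```yaml
# Hypothetical minimal config -- key names are illustrative,
# not taken from the official LLMling schema.
resources:
  readme:
    type: path
    path: ./README.md

prompts:
  summarize:
    description: Summarize a resource
    template: "Summarize the following:\n{content}"

tools:
  word_count:
    import_path: mymodule.word_count  # hypothetical module
```

The server would then be started against this file with the run command shown below.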

Installation

Run Command
uvx mcp-server-llmling@latest start path/to/your/config.yml

Security Notes

The `register_code_tool` functionality in `src/llmling/config/runtime.py` uses `exec()` to dynamically execute Python code provided by the LLM. This is a critical security risk as a malicious or compromised LLM could execute arbitrary code. Additionally, resource loaders and toolsets (e.g., `PathResourceLoader`, `RepositoryResourceLoader`, `OpenAPITools`) allow access to arbitrary external URLs and Git repositories, posing risks like Server-Side Request Forgery (SSRF) and arbitrary file downloads. Proper sandboxing, strict input validation, and careful permission management are crucial if exposing these capabilities to an LLM, especially with untrusted input.
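To make the `exec()` risk concrete, here is a minimal sketch (not LLMling's actual code) of an AST-based pre-check on LLM-supplied source. Note how shallow such validation is: attribute tricks like `().__class__.__bases__` can still escape it, which is why process-level sandboxing is the safer approach.

```python
import ast

# Builtins we refuse to see referenced by name. This allow-list approach
# is deliberately naive -- it only demonstrates the problem space.
FORBIDDEN_NAMES = {"exec", "eval", "open", "__import__", "compile"}


def is_obviously_unsafe(source: str) -> bool:
    """Reject code that imports modules or names dangerous builtins.

    This is NOT a complete sandbox: it catches only the most obvious
    patterns and is easy to bypass via attribute access on objects.
    """
    try:
        tree = ast.parse(source)
    except SyntaxError:
        return True  # unparseable input is rejected outright
    for node in ast.walk(tree):
        if isinstance(node, (ast.Import, ast.ImportFrom)):
            return True
        if isinstance(node, ast.Name) and node.id in FORBIDDEN_NAMES:
            return True
    return False


print(is_obviously_unsafe("def add(a, b):\n    return a + b"))   # False
print(is_obviously_unsafe("import os\nos.system('rm -rf /')"))   # True
```

Even with such checks in place, code from an untrusted LLM should only ever run in an isolated process or container with minimal filesystem and network permissions.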

Similar Servers

fastmcp (22084)

FastMCP is an ergonomic interface for the Model Context Protocol (MCP), providing a comprehensive framework for building and interacting with AI agents, tools, resources, and prompts across various transports and authentication methods.

Category: Other · Security Score: 8 · Cost Class: Low

npcpy (1170)

Core library of the NPC Toolkit that supercharges natural language processing pipelines and agent tooling. It's a flexible framework for building state-of-the-art applications and conducting novel research with LLMs. Supports multi-agent systems, fine-tuning, reinforcement learning, genetic algorithms, model ensembling, and NumPy-like operations for AI models (NPCArray). Includes a built-in Flask server for deploying agent teams via REST APIs, and multimodal generation (image, video, audio).

Category: Other · Security Score: 2 · Cost Class: High

arcade-mcp (789)

Provides a framework and pre-built toolkits for integrating Large Language Models (LLMs) with various external services and databases, enabling AI agents to interact with the real world.

Category: Other · Security Score: 9 · Cost Class: Medium

Lynkr (225)

Lynkr is an AI orchestration layer that acts as an LLM gateway, routing language model requests to various providers (Ollama, Databricks, OpenAI, etc.). It provides an OpenAI-compatible API and enables AI-driven coding tasks via a rich set of tools and a multi-agent framework, with a strong focus on security, performance, and token efficiency. It allows AI agents to interact with a defined workspace (reading/writing files, executing shell commands, performing Git operations) and leverages long-term memory and agent learning to enhance task execution.

Category: Other · Security Score: 9 · Cost Class: Medium

Stats

Interest Score: 39
Security Score: 3
Cost Class: High
Avg Tokens: 5000
Stars: 17
Forks: 2
Last Update: 2025-12-09

Tags

LLM · AI · Framework · Python · Declarative · YAML · Tools · Prompts · Resources · MCP