divakar-2005-02-02.github.io
by Divakar-2005-02-02
Overview
Demonstrates how to build an MCP server and client in .NET that expose tools to a model, feeding the tool results into an Ollama LLM for enhanced responses.
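As an illustration of the server side, here is a minimal sketch of a stdio MCP server exposing a time tool, written against the official ModelContextProtocol C# SDK quickstart pattern. The tool name and body are assumptions for illustration; the demo's actual source is not shown on this page.

```csharp
using System;
using System.ComponentModel;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using ModelContextProtocol.Server;

// Host a stdio MCP server and auto-register all [McpServerTool] methods in this assembly.
var builder = Host.CreateApplicationBuilder(args);
builder.Services
    .AddMcpServer()
    .WithStdioServerTransport()
    .WithToolsFromAssembly();

await builder.Build().RunAsync();

[McpServerToolType]
public static class TimeTool
{
    // Hypothetical tool: returns the current local time for the LLM to use.
    [McpServerTool, Description("Returns the current local time as an ISO-8601 string.")]
    public static string GetCurrentTime() => DateTime.Now.ToString("o");
}
```

A client (such as an Ollama-backed MCP client) launches this executable, lists its tools over stdio, and injects tool results into the model's context.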
Installation
Download `nl-mcp-stdio-ollama-time-demo.exe` from the project's GitHub releases and run it.
Security Notes
The provided `index.md` file itself is safe and contains no malicious patterns. However, the project's instructions involve downloading and running a compiled executable (`.exe`) from GitHub releases. Running unverified executables from the internet carries inherent security risk, as the executable's source code is not available here for review.
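To wire the downloaded executable into an MCP client, a stdio command entry is typically added to the client's configuration. A hypothetical sketch in the common `mcpServers` JSON convention (the server key and path are placeholders, not taken from this project's docs):

```json
{
  "mcpServers": {
    "time-demo": {
      "command": "C:\\path\\to\\nl-mcp-stdio-ollama-time-demo.exe",
      "args": []
    }
  }
}
```

The client spawns the executable as a child process and speaks the MCP stdio transport over its stdin/stdout.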
Similar Servers
mcp-client-for-ollama
An interactive terminal client for connecting local Ollama LLMs to Model Context Protocol (MCP) servers, enabling advanced tool use and workflow automation for local LLMs.
mcp-dotnet-samples
This MCP server retrieves GitHub Copilot customizations, including instructions, agents, prompts, and collections, from the `awesome-copilot` repository to provide contextual guidance to AI models.
How-To-Create-MCP-Server
This project demonstrates how to set up a basic Model Context Protocol (MCP) server in .NET for interaction with AI tools like Copilot Chat.
ollama-fastmcp-wrapper
A proxy service that bridges Ollama with FastMCP, enabling local LLM tool-augmented reasoning by exposing MCP servers' functionality to Ollama models.