mcp-client-for-ollama

Community

by jonigl


A text-based user interface (TUI) client for interacting with MCP servers using Ollama. Features include agent mode, multi-server support, dynamic model switching, streaming responses, tool management, human-in-the-loop approval, thinking mode, model parameter configuration, a custom system prompt, and saved preferences. Built for developers working with local LLMs.

Installation

pip install --upgrade ollmcp
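
Once installed, the client is started with the `ollmcp` command. The flags in this sketch (`--model`, `--servers-json`) are assumptions based on the command-line arguments section referenced in the README below; verify the exact names with `ollmcp --help`:

```shell
# Launch with defaults (expects a local Ollama instance)
ollmcp

# Illustrative: pick a model and point at a JSON file of MCP server definitions
ollmcp --model qwen3 --servers-json ./mcp-servers.json
```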

Description

<p align="center">
  <img src="https://github.com/jonigl/mcp-client-for-ollama/blob/main/misc/ollmcp-logo-512.png?raw=true" width="256" />
</p>

<p align="center">
  <i>A simple yet powerful Python client for interacting with Model Context Protocol (MCP) servers using Ollama, allowing local LLMs to use tools.</i>
</p>

---

# MCP Client for Ollama (ollmcp)

[![Python 3.10+](https://img.shields.io/badge/Python-3.10+-blue.svg)](https://www.python.org/downloads/)
[![PyPI - ollmcp](https://img.shields.io/pypi/v/ollmcp?label=ollmcp-pypi)](https://pypi.org/project/ollmcp/)
[![PyPI - mcp-client-for-ollama](https://img.shields.io/pypi/v/mcp-client-for-ollama?label=mcp-client-for-ollama-pypi)](https://pypi.org/project/mcp-client-for-ollama/)
[![Build, Publish and Release](https://github.com/jonigl/mcp-client-for-ollama/actions/workflows/publish.yml/badge.svg)](https://github.com/jonigl/mcp-client-for-ollama/actions/workflows/publish.yml)
[![CI](https://github.com/jonigl/mcp-client-for-ollama/actions/workflows/ci.yml/badge.svg)](https://github.com/jonigl/mcp-client-for-ollama/actions/workflows/ci.yml)

<p align="center">
  <img src="https://raw.githubusercontent.com/jonigl/mcp-client-for-ollama/v0.15.0/misc/ollmcp-demo.gif" alt="MCP Client for Ollama Demo">
</p>

<p align="center">
  <a href="https://asciinema.org/a/jxc6N8oKZAWrzH8aK867zhXdO" target="_blank">🎥 Watch this demo as an Asciinema recording</a>
</p>

## Table of Contents

- [Overview](#overview)
- [Features](#features)
- [Requirements](#requirements)
- [Quick Start](#quick-start)
- [Usage](#usage)
- [Command-line Arguments](#command-line-arguments)
- [Usage Examples](#usage-examples)
- [How Tool Calls Work](#how-tool-calls-work)
- ✨ **NEW** [Agent Mode](#agent-mode)
- [Interactive Commands](#interactive-commands)
- [Tool and Server Selection](#tool-and-server-selection)
- [Model Selection](#model-selection)
- [Advanced Model Configuration](#advanced-model-configuration)
- [Server Reloading for Development](#server-reloading-for-development)
- [Human-in-the-Loop (HIL) Tool Execution](#human-in-the-loop-hil-tool-execution)
- [Performance Metrics](#performance-metrics)
- [Autocomplete and Prompt Features](#autocomplete-and-prompt-features)
- [Configuration Management](#configuration-management)
- [Server Configuration Format](#server-configuration-format)
- [Tips: Where to Put MCP Server Configs and a Working Example](#tips-where-to-put-mcp-server-configs-and-a-working-example)
- [Compatible Models](#compatible-models)
- [Ollama Cloud Models](#ollama-cloud-models)
- [Where Can I Find More MCP Servers?](#where-can-i-find-more-mcp-servers)
- [Related Projects](#related-projects)
- [License](#license)
- [Acknowledgments](#acknowledgments)

## Overview

MCP Client for Ollama (`ollmcp`) is a modern, interactive terminal application (TUI) for connecting local Ollama LLMs to one or more Model Context Protocol (MCP) servers, enabling advanced tool use and workflow automation. With a rich, user-friendly interface, it lets you manage tools, models, and server connections in real time, with no coding required. Whether you're building, testing, or just exploring LLM tool use, this client streamlines your workflow with features like fuzzy autocomplete, advanced model configuration, MCP server hot-reloading for development, and Human-in-the-Loop safety controls.
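
As the overview notes, the client connects to one or more MCP servers, which are declared in a JSON config file (covered under "Server Configuration Format" in the table of contents). As a hedged sketch only, assuming the widely used Claude-Desktop-style `mcpServers` layout; the `example-weather` server and its `uvx` command below are hypothetical placeholders, so check the project's docs for the exact schema it supports:

```json
{
  "mcpServers": {
    "example-weather": {
      "command": "uvx",
      "args": ["some-weather-mcp-server"],
      "env": {}
    }
  }
}
```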
## Features

- 🤖 **Agent Mode**: Iterative tool execution when models request multiple tool calls, with a configurable loop limit to prevent infinite loops
- 🌐 **Multi-Server Support**: Connect to multiple MCP servers simultaneously
- 🚀 **Multiple Transport Types**: Supports STDIO, SSE, and Streamable HTTP server connections
- ☁️ **Ollama Cloud Support**: Works seamlessly with Ollama Cloud models for tool calling, enabling access to powerful cloud-hosted models while using local MCP tools
- 🎨 **Rich Terminal Interface**: Interactive console UI with modern styling
- 🌊 **Streaming Responses**: View model outputs in real-time as they're generated
- 🛠️ **Tool Management**: Enable/disable specific tools or entire servers during chat sessions (see the sketch after this list for how tool calls flow)
- 🧑‍💻 **Human-in-the-Loop (HIL)**: Review and approve tool executions before they run for enhanced control and safety
- 🎮 **Advanced Model Configuration**: Fine-tune 15+ model parameters including context window size, temperature, sampling, repetition control, and more
- 💬 **System Prompt Customization**: Define and edit the system prompt to control model behavior and persona
- 🧠 **Context Window Control**: Adjust the context window size (num_ctx) to handle longer conversations and complex tasks
- 🎨 **Enhanced Tool Display**: Beautiful, structured visualization of tool executions with JSON syntax highlighting
- 🧠 **Context Management**: Control conversation memory with configurable retention settings
- 🤔 **Thinking Mode**: Advanced reasoning capabilities with visible thought processes for supported models (e.g., gpt-oss, deepseek-r1, qwen3, etc.)
- 🗣
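
To make the tool-calling features concrete: a client like this passes JSON-schema tool definitions to Ollama's chat API, inspects the reply for `tool_calls`, executes each requested tool (in ollmcp's case, by forwarding to an MCP server), and sends the results back so the model can produce a final answer. Below is a minimal sketch using the `ollama` Python package; it is not ollmcp's actual code, and `get_weather` is a hypothetical stand-in for a real MCP tool:

```python
import ollama  # assumes the official `ollama` Python package and a running Ollama server

def get_weather(city: str) -> str:
    """Hypothetical stand-in for a tool that ollmcp would invoke on an MCP server."""
    return f"Sunny, 22°C in {city}"

# Tool schema in the JSON-schema format Ollama's chat API accepts.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

messages = [{"role": "user", "content": "What's the weather in Paris?"}]

# Round 1: the model may answer directly or request one or more tool calls.
response = ollama.chat(model="qwen3", messages=messages, tools=tools)
messages.append(response["message"])

# Execute each requested tool and append its result as a `tool` message.
for call in response["message"].get("tool_calls") or []:
    args = call["function"]["arguments"]  # Ollama parses the arguments into a dict
    messages.append({"role": "tool", "content": get_weather(**args)})

# Round 2: the model composes its final answer from the tool output.
final = ollama.chat(model="qwen3", messages=messages, tools=tools)
print(final["message"]["content"])
```

The two-round loop is the core pattern; an agent mode like the one listed above simply repeats it until the model stops requesting tools or a configured iteration limit is reached.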
