automcp
Community · by NapthaAI
Easily convert tools, agents, and orchestrators from existing agent frameworks to MCP servers
# automcp

## 🚀 Overview

automcp allows you to easily convert tools, agents, and orchestrators from existing agent frameworks into [MCP](https://modelcontextprotocol.io/introduction) servers that can then be accessed through standardized interfaces by clients like Cursor and Claude Desktop.

We currently support deployment of agents, tools, and orchestrators as MCP servers for the following agent frameworks:

1. CrewAI
2. LangGraph
3. Llama Index
4. OpenAI Agents SDK
5. Pydantic AI
6. mcp-agent

## 🔧 Installation

Install from PyPI:

```bash
# Basic installation
pip install naptha-automcp

# Or with uv
uv add naptha-automcp
```

Or install from source:

```bash
git clone https://github.com/NapthaAI/automcp.git
cd automcp
uv venv
source .venv/bin/activate
pip install -e .
```

## 🧩 Quick Start

Create a new MCP server for your project:

1. Navigate to your project directory with your agent implementation:

```bash
cd your-project-directory
```

2. Generate the MCP server files via the CLI with one of the following flags (`crewai`, `langgraph`, `llamaindex`, `openai`, `pydantic`, `mcp_agent`):

```bash
automcp init -f crewai
```

3. Edit the generated `run_mcp.py` file to configure your agent:

```python
# Replace these imports with your actual agent classes
from your_module import YourCrewClass

# Define the input schema
class InputSchema(BaseModel):
    parameter1: str
    parameter2: str

# Set your agent details
name = "<YOUR_AGENT_NAME>"
description = "<YOUR_AGENT_DESCRIPTION>"

# For CrewAI projects
mcp_crewai = create_crewai_adapter(
    orchestrator_instance=YourCrewClass().crew(),
    name=name,
    description=description,
    input_schema=InputSchema,
)
```

4. Install dependencies and run your MCP server:

```bash
automcp serve -t sse
```

## 📁 Generated Files

When you run `automcp init -f <FRAMEWORK>`, the following file is generated:

### run_mcp.py

This is the main file that sets up and runs your MCP server.
It contains:

- Server initialization code
- STDIO and SSE transport handlers
- A placeholder for your agent implementation
- Utilities to suppress warnings that might corrupt the STDIO protocol

You'll need to edit this file to:

- Import your agent/crew classes
- Define your input schema (the parameters your agent accepts)
- Configure the adapter with your agent

## 🔍 Examples

### Running the examples

The repository includes examples for each supported framework:

```bash
# Clone the repository
git clone https://github.com/NapthaAI/automcp.git
cd automcp

# Install automcp in development mode
pip install -e .

# Navigate to an example directory
cd examples/crewai/marketing_agents

# Generate the MCP server files (use the appropriate framework)
automcp init -f crewai

# Edit the generated run_mcp.py file to import and configure the example agent
# (see the specific example's README for details)

# Add a .env file with the necessary environment variables

# Install dependencies and run
automcp serve -t sse
```

Each example follows the same workflow as a regular project:

1. Run `automcp init -f <FRAMEWORK>` to generate the server files
2. Edit `run_mcp.py` to import and configure the example agent
3. Add a `.env` file with the necessary environment variables
4. Install dependencies and serve using `automcp serve -t sse`

### CrewAI example

Here's what a typical configured `run_mcp.py` looks like for a CrewAI example:

```python
import warnings

from automcp.adapters.crewai import create_crewai_adapter
from pydantic import BaseModel
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("MCP Server")
warnings.filterwarnings("ignore")

from crew import MarketingPostsCrew


class InputSchema(BaseModel):
    project_description: str
    customer_domain: str


name = "marketing_posts_crew"
description = "A crew that posts marketing posts to a social media platform"

# Create an adapter for crewai
mcp_crewai = create_crewai_adapter(
    orchestrator_instance=MarketingPostsCrew().crew(),
    name=name,
    description=description,
    input_schema=InputSchema,
)

mcp.add_tool(
    mcp_crewai,
    name=name,
    description=description,
)


# Server entrypoints
def serve_sse():
    mcp.run(transport="sse")


def serve_stdio():
    # Redirect stderr to suppress warnings that bypass the filters
    import os
    import sys

    class NullWriter:
        def write(self, *args, **kwargs):
            pass

        def flush(self, *args, **kwargs):
            pass

    # Save the original stderr
    original_stderr = sys.stderr

    # Replace stderr with our null writer to prevent warnings
    # from corrupting the STDIO protocol
    sys.stderr = NullWriter()

    # Set environment variable to ignore Python warnings
    os.environ["PYTHONWARNINGS"] = "ignore"

    try:
        mcp.run(transport="stdio")
    finally:
        # Restore stderr for normal operation
        sys.stderr = original_stderr


if __name__ == "__main__":
    import sys

    if len(sys.argv) > 1 and sys.argv[1] == "sse":
        serve_sse()
    else:
        serve_stdio()
```
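Conceptually, an adapter like `create_crewai_adapter` wraps the orchestrator in a plain callable: keyword arguments are validated against the Pydantic input schema, then forwarded to the crew's `kickoff()`. A minimal sketch of that idea (the `EchoCrew` class and `make_adapter` helper here are hypothetical stand-ins for illustration, not part of automcp):

```python
from pydantic import BaseModel


class EchoCrew:
    """Hypothetical stand-in for a CrewAI crew: anything with a kickoff(inputs=...) method."""

    def kickoff(self, inputs: dict) -> str:
        return f"post about {inputs['project_description']} for {inputs['customer_domain']}"


class InputSchema(BaseModel):
    project_description: str
    customer_domain: str


def make_adapter(orchestrator, input_schema):
    # Validate keyword arguments against the schema, then forward them
    # to the orchestrator -- roughly the shape of an automcp adapter.
    def run_tool(**kwargs):
        validated = input_schema(**kwargs)  # raises ValidationError on bad input
        return orchestrator.kickoff(inputs=dict(validated))

    return run_tool


tool = make_adapter(EchoCrew(), InputSchema)
print(tool(project_description="AI agents", customer_domain="naptha.ai"))
# -> post about AI agents for naptha.ai
```

Because the resulting callable has a name, description, and schema attached via `mcp.add_tool(...)`, FastMCP can expose it as a standard MCP tool; the real adapters add framework-specific plumbing on top of this pattern.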
Similar servers
GitHub MCP
Full GitHub API integration: repositories, issues, pull requests, actions, and more.
Filesystem MCP
Secure filesystem access for reading, writing, and managing files with configurable permissions.
Context7 MCP
Access to up-to-date documentation for libraries and frameworks.
Serena MCP
A powerful MCP server for semantic code navigation and refactoring.