gptr-mcp


Community

by assafelovic


MCP server for enabling LLM applications to perform deep research via the MCP protocol

Installation

git clone https://github.com/assafelovic/gptr-mcp.git

Description

<div align="center" id="top">
<img src="https://github.com/assafelovic/gpt-researcher/assets/13554167/20af8286-b386-44a5-9a83-3be1365139c3" alt="Logo" width="80">

# 🔍 GPT Researcher MCP Server

[![Website](https://img.shields.io/badge/Official%20Website-gptr.dev-teal?style=for-the-badge&logo=world&logoColor=white&color=0891b2)](https://gptr.dev) [![Documentation](https://img.shields.io/badge/Documentation-DOCS-f472b6?logo=googledocs&logoColor=white&style=for-the-badge)](https://docs.gptr.dev/docs/gpt-researcher/mcp-server/getting-started) [![Discord Follow](https://dcbadge.vercel.app/api/server/QgZXvJAccX?style=for-the-badge&theme=clean-inverted&?compact=true)](https://discord.gg/QgZXvJAccX)
</div>

## Why GPT Researcher MCP?

While LLM apps can access web search tools with MCP, **GPT Researcher MCP delivers deep research results.** Standard search tools return raw results that require manual filtering, often include irrelevant sources, and waste context window space.

GPT Researcher autonomously explores and validates numerous sources, focusing only on relevant, trusted, and up-to-date information. Though slightly slower than standard search (~30 seconds), it delivers:

- ✨ Higher quality information
- 📊 Optimized context usage
- 🔎 Comprehensive results
- 🧠 Better reasoning for LLMs

## 💻 Claude Desktop Demo

https://github.com/user-attachments/assets/ef97eea5-a409-42b9-8f6d-b82ab16c52a8

## 🚀 Quick Start with Claude Desktop

**Want to use this with Claude Desktop right away?** Here's the fastest path:

1. **Install dependencies:**

   ```bash
   git clone https://github.com/assafelovic/gptr-mcp.git
   cd gptr-mcp
   pip install -r requirements.txt
   ```

2. **Set up your Claude Desktop config** at `~/Library/Application Support/Claude/claude_desktop_config.json`:

   ```json
   {
     "mcpServers": {
       "gptr-mcp": {
         "command": "python",
         "args": ["/absolute/path/to/gpt-researcher/gptr-mcp/server.py"],
         "env": {
           "OPENAI_API_KEY": "your-openai-key-here",
           "TAVILY_API_KEY": "your-tavily-key-here"
         }
       }
     }
   }
   ```

3. **Restart Claude Desktop** and start researching! 🎉

For detailed setup instructions, see the [full Claude Desktop Integration section](#-claude-desktop-integration) below.

### Resources

- `research_resource`: Get web resources related to a given task via research.

### Primary Tools

- `deep_research`: Perform deep web research on a topic, finding the most reliable and relevant information
- `quick_search`: Perform a fast web search optimized for speed over quality, returning search results with snippets. Supports any GPTR-supported web retriever, such as Tavily, Bing, or Google. Learn more [here](https://docs.gptr.dev/docs/gpt-researcher/search-engines)
- `write_report`: Generate a report based on research results
- `get_research_sources`: Get the sources used in the research
- `get_research_context`: Get the full context of the research

### Prompts

- `research_query`: Create a research query prompt

## Prerequisites

Before running the MCP server, make sure you have:

1. Python 3.11 or higher installed
   - **Important**: GPT Researcher >= 0.12.16 requires Python 3.11+
2. API keys for the services you plan to use:
   - [OpenAI API key](https://platform.openai.com/api-keys)
   - [Tavily API key](https://app.tavily.com)

You can also connect any other web search engine or MCP using GPTR-supported retrievers. Check out the [docs here](https://docs.gptr.dev/docs/gpt-researcher/search-engines).

## ⚙️ Installation

1. Clone the GPT Researcher repository:

   ```bash
   git clone https://github.com/assafelovic/gpt-researcher.git
   cd gpt-researcher
   ```

2. Install the gptr-mcp dependencies:

   ```bash
   cd gptr-mcp
   pip install -r requirements.txt
   ```

3. Set up your environment variables:

   - Copy the `.env.example` file to create a new file named `.env`:

     ```bash
     cp .env.example .env
     ```

   - Edit the `.env` file and add your API keys and other settings:

     ```bash
     OPENAI_API_KEY=your_openai_api_key
     TAVILY_API_KEY=your_tavily_api_key
     ```

   You can also add any other environment variable for your GPT Researcher configuration.

## 🚀 Running the MCP Server

You can run the MCP server in several ways:

### Method 1: Directly using Python

```bash
python server.py
```

### Method 2: Using the MCP CLI (if installed)

```bash
mcp run server.py
```

### Method 3: Using Docker (recommended for production)

#### Quick Start

The simplest way to run with Docker:

```bash
# Build and run with docker-compose
docker-compose up -d

# Or manually:
docker build -t gptr-mcp .
docker run -d \
  --name gptr-mcp \
  -p 8000:8000 \
  --env-file .env \
  gptr-mcp
```

#### For n8n Integration

If you need to connect to an existing n8n network:

```bash
# First, start the container
docker-compose up -d

# Then connect to your n8n network
docker network connect n8n-mcp-net gptr-mcp

# Or create a shared network first
docker network create n
```
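The prerequisites above boil down to two checks: Python 3.11+ and the API keys in your environment. A small preflight sketch can catch both before you run `python server.py`. The `preflight` helper and its variable list are illustrative, not part of gptr-mcp; extend the list with any other GPT Researcher settings your configuration needs.

```python
import sys

# Environment variables the Quick Start expects. Illustrative, not exhaustive.
REQUIRED_VARS = ("OPENAI_API_KEY", "TAVILY_API_KEY")

def preflight(env, version=None):
    """Return a list of problems; an empty list means server.py should start."""
    version = sys.version_info if version is None else version
    problems = []
    if version < (3, 11):
        problems.append("Python 3.11+ is required (GPT Researcher >= 0.12.16)")
    problems.extend(
        f"missing environment variable: {name}"
        for name in REQUIRED_VARS
        if not env.get(name)
    )
    return problems

# Example: an environment that is missing the Tavily key.
print(preflight({"OPENAI_API_KEY": "sk-..."}, version=(3, 12)))
# → ['missing environment variable: TAVILY_API_KEY']
```

Pass `os.environ` as `env` in real use; the version defaults to the running interpreter.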
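The Claude Desktop entry from the Quick Start can also be generated programmatically rather than hand-edited, which avoids JSON syntax slips in `claude_desktop_config.json`. A minimal sketch: `claude_desktop_config` is a hypothetical helper (not part of gptr-mcp), and the path and key values below are placeholders.

```python
import json

def claude_desktop_config(server_path, openai_key, tavily_key):
    """Build the claude_desktop_config.json structure from the Quick Start.
    All values are supplied by the caller; nothing here is a real credential."""
    return {
        "mcpServers": {
            "gptr-mcp": {
                "command": "python",
                "args": [server_path],
                "env": {
                    "OPENAI_API_KEY": openai_key,
                    "TAVILY_API_KEY": tavily_key,
                },
            }
        }
    }

cfg = claude_desktop_config("/absolute/path/to/gptr-mcp/server.py",
                            "your-openai-key-here", "your-tavily-key-here")
print(json.dumps(cfg, indent=2))
```

Write the dumped JSON to the config path for your platform, then restart Claude Desktop.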
