just-prompt

Community

by disler


just-prompt is an MCP server that provides a unified interface to top LLM providers (OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama).

Description

# Just Prompt - A lightweight MCP server for LLM providers

`just-prompt` is a Model Context Protocol (MCP) server that provides a unified interface to various Large Language Model (LLM) providers, including OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama. See how we use the `ceo_and_board` tool to make [hard decisions easy with o3 here](https://youtu.be/LEMLntjfihA).

<img src="images/just-prompt-logo.png" alt="Just Prompt Logo" width="700" height="auto">
<img src="images/o3-as-a-ceo.png" alt="o3 as a CEO" width="700" height="auto">

## Tools

The following MCP tools are available in the server:

- **`prompt`**: Send a prompt to multiple LLM models
  - Parameters:
    - `text`: The prompt text
    - `models_prefixed_by_provider` (optional): List of models with provider prefixes. If not provided, uses default models.
- **`prompt_from_file`**: Send a prompt from a file to multiple LLM models
  - Parameters:
    - `abs_file_path`: Absolute path to the file containing the prompt (must be an absolute path, not relative)
    - `models_prefixed_by_provider` (optional): List of models with provider prefixes. If not provided, uses default models.
- **`prompt_from_file_to_file`**: Send a prompt from a file to multiple LLM models and save responses as markdown files
  - Parameters:
    - `abs_file_path`: Absolute path to the file containing the prompt (must be an absolute path, not relative)
    - `models_prefixed_by_provider` (optional): List of models with provider prefixes. If not provided, uses default models.
    - `abs_output_dir` (default: "."): Absolute directory path to save the response markdown files to (must be an absolute path, not relative)
- **`ceo_and_board`**: Send a prompt to multiple 'board member' models and have a 'CEO' model make a decision based on their responses
  - Parameters:
    - `abs_file_path`: Absolute path to the file containing the prompt (must be an absolute path, not relative)
    - `models_prefixed_by_provider` (optional): List of models with provider prefixes to act as board members. If not provided, uses default models.
    - `abs_output_dir` (default: "."): Absolute directory path to save the response files and CEO decision (must be an absolute path, not relative)
    - `ceo_model` (default: "openai:o3"): Model to use for the CEO decision, in the format "provider:model"
- **`list_providers`**: List all available LLM providers
  - Parameters: None
- **`list_models`**: List all available models for a specific LLM provider
  - Parameters:
    - `provider`: Provider to list models for (e.g., 'openai' or 'o')
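To make the tool shapes concrete, here is what a `prompt` call could look like on the wire. This is a hedged sketch: the envelope follows the standard MCP `tools/call` JSON-RPC framing, the prompt text and model list are placeholder values, and in practice your MCP client builds this request for you.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "prompt",
    "arguments": {
      "text": "Summarize the tradeoffs between SQL and NoSQL databases.",
      "models_prefixed_by_provider": ["openai:gpt-4o-mini", "anthropic:claude-3-5-haiku"]
    }
  }
}
```

The server fans the prompt out to each listed model and returns one response per model; the provider prefixes on the model names are explained in the next section.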
## Provider Prefixes

> every model must be prefixed with the provider name
>
> use the short name for faster referencing

- `o` or `openai`: OpenAI
  - `o:gpt-4o-mini`
  - `openai:gpt-4o-mini`
- `a` or `anthropic`: Anthropic
  - `a:claude-3-5-haiku`
  - `anthropic:claude-3-5-haiku`
- `g` or `gemini`: Google Gemini
  - `g:gemini-2.5-pro-exp-03-25`
  - `gemini:gemini-2.5-pro-exp-03-25`
- `q` or `groq`: Groq
  - `q:llama-3.1-70b-versatile`
  - `groq:llama-3.1-70b-versatile`
- `d` or `deepseek`: DeepSeek
  - `d:deepseek-coder`
  - `deepseek:deepseek-coder`
- `l` or `ollama`: Ollama
  - `l:llama3.1`
  - `ollama:llama3.1`

## Features

- Unified API for multiple LLM providers
- Support for text prompts from strings or files
- Run multiple models in parallel
- Automatic model name correction using the first model in the `--default-models` list
- Ability to save responses to files
- Easy listing of available providers and models

## Installation

```bash
# Clone the repository
git clone https://github.com/yourusername/just-prompt.git
cd just-prompt

# Install dependencies with uv
uv sync
```

### Environment Variables

Create a `.env` file with your API keys (you can copy the `.env.sample` file):

```bash
cp .env.sample .env
```

Then edit the `.env` file to add your API keys (or export them in your shell):

```
OPENAI_API_KEY=your_openai_api_key_here
ANTHROPIC_API_KEY=your_anthropic_api_key_here
GEMINI_API_KEY=your_gemini_api_key_here
GROQ_API_KEY=your_groq_api_key_here
DEEPSEEK_API_KEY=your_deepseek_api_key_here
OLLAMA_HOST=http://localhost:11434
```

## Claude Code Installation

> In all these examples, replace the directory with the path to the just-prompt directory.

Default models are set to `openai:o3:high`, `openai:o4-mini:high`, `anthropic:claude-opus-4-20250514`, `anthropic:claude-sonnet-4-20250514`, `gemini:gemini-2.5-pro-preview-03-25`, and `gemini:gemini-2.5-flash-preview-04-17`.

If you use Claude Code right out of the repository, you can see in the `.mcp.json` file that we set these default models:

```json
{
  "mcpServers": {
    "just-prompt": {
      "type": "stdio",
      "command": "uv",
      "args": [
        "--directory",
        ".",
        "run",
        "just-prompt",
        "--default-models",
        "openai:o3:high,openai:o4-mini:high,anthropic:claude-opus-4-20250514,anthropic:claude-sonnet-4-20250514,gemini:gemini-2.5-pro-preview-03-25,gemini:gemini-2.5-flash-preview-04-17"
      ]
    }
  }
}
```
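For a quick smoke test outside of any MCP client, the same entry point can be launched by hand. A minimal sketch, assuming you are in the repository root and your API keys are already in `.env`; the two-model list here is just an example:

```bash
# Run the server over stdio with a custom default model list
# (mirrors the "command" and "args" from the .mcp.json above)
uv --directory . run just-prompt \
  --default-models "openai:o4-mini:high,anthropic:claude-sonnet-4-20250514"
```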
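Alternatively, if you prefer registering the server from the command line rather than editing `.mcp.json`, something along these lines should work with Claude Code's `claude mcp add` command. Treat the exact invocation as an assumption and check `claude mcp add --help`; replace `/path/to/just-prompt` and the model list with your own:

```bash
# Register just-prompt as a stdio MCP server in Claude Code;
# everything after "--" is the command Claude Code will launch
claude mcp add just-prompt -- \
  uv --directory /path/to/just-prompt run just-prompt \
  --default-models "openai:o3:high,anthropic:claude-opus-4-20250514"
```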
