# Osaurus

<p align="center">
  <img width="372" height="222" alt="Screenshot 2025-12-10 at 2 50 04 PM" src="https://github.com/user-attachments/assets/a144d08d-5179-4cb1-9a01-29a8fb0b5493" />
</p>

**Native macOS LLM server with MCP support.** Run local and remote language models on Apple Silicon with OpenAI-compatible APIs, tool calling, and a built-in plugin ecosystem.

Created by Dinoki Labs ([dinoki.ai](https://dinoki.ai))

**[Documentation](https://docs.osaurus.ai/)** · **[Discord](https://discord.gg/dinoki)** · **[Plugin Registry](https://github.com/dinoki-ai/osaurus-tools)** · **[Contributing](docs/CONTRIBUTING.md)**

---

## Install

```bash
brew install --cask osaurus
```

Or download from [Releases](https://github.com/dinoki-ai/osaurus/releases/latest).

After installing, launch from Spotlight (`⌘ Space` → "osaurus") or run `osaurus ui` from the terminal.

---

## What is Osaurus?

Osaurus is an all-in-one LLM server for macOS.
It combines:

- **MLX Runtime** — Optimized local inference for Apple Silicon using [MLX](https://github.com/ml-explore/mlx)
- **Remote Providers** — Connect to OpenAI, OpenRouter, Ollama, LM Studio, or any OpenAI-compatible API
- **OpenAI & Ollama APIs** — Drop-in compatible endpoints for existing tools
- **MCP Server** — Expose tools to AI agents via Model Context Protocol
- **Remote MCP Providers** — Connect to external MCP servers and aggregate their tools
- **Plugin System** — Extend functionality with community and custom tools
- **Developer Tools** — Built-in insights and server explorer for debugging
- **Apple Foundation Models** — Use the system model on macOS 26+ (Tahoe)

### Highlights

| Feature | Description |
| --- | --- |
| **Local LLM Server** | Run Llama, Qwen, Gemma, Mistral, and more locally |
| **Remote Providers** | OpenAI, OpenRouter, Ollama, LM Studio, or custom endpoints |
| **OpenAI Compatible** | `/v1/chat/completions` with streaming and tool calling |
| **MCP Server** | Connect to Cursor, Claude Desktop, and other MCP clients |
| **Remote MCP Providers** | Aggregate tools from external MCP servers |
| **Tools & Plugins** | Browser automation, file system, git, web search, and more |
| **Developer Tools** | Request insights, API explorer, and live endpoint testing |
| **Menu Bar Chat** | Built-in chat overlay with global hotkey (`⌘;`) |
| **Model Manager** | Download and manage models from Hugging Face |

---

## Quick Start

### 1. Start the Server

Launch Osaurus from Spotlight or run:

```bash
osaurus serve
```

The server starts on port `1337` by default.

### 2. Connect an MCP Client

Add to your MCP client configuration (e.g., Cursor, Claude Desktop):

```json
{
  "mcpServers": {
    "osaurus": {
      "command": "osaurus",
      "args": ["mcp"]
    }
  }
}
```

### 3. Add a Remote Provider (Optional)

Open the Management window (`⌘ Shift M`) → **Providers** → **Add Provider**.
Choose from presets (OpenAI, Ollama, LM Studio, OpenRouter) or configure a custom endpoint.

---

## Key Features

### Local Models (MLX)

Run models locally with optimized Apple Silicon inference:

```bash
# Download a model
osaurus run llama-3.2-3b-instruct-4bit

# Use via API
curl http://127.0.0.1:1337/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "llama-3.2-3b-instruct-4bit", "messages": [{"role": "user", "content": "Hello!"}]}'
```

### Remote Providers

Connect to any OpenAI-compatible API to access cloud models alongside local ones.

**Supported presets:**

- **OpenAI** — GPT-4o, o1, and other OpenAI models
- **OpenRouter** — Access multiple providers through one API
- **Ollama** — Connect to a local or remote Ollama instance
- **LM Studio** — Use LM Studio as a backend
- **Custom** — Any OpenAI-compatible endpoint

Features:

- Secure API key storage (macOS Keychain)
- C
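Because the server speaks the OpenAI chat-completions protocol, any plain HTTP client can drive it without an SDK. Below is a minimal sketch using only Python's standard library; the port (`1337`) and model name come from the examples above, and a running `osaurus serve` is assumed for the commented-out call at the end:

```python
import json
import urllib.request

# Default Osaurus endpoint (port 1337, per the Quick Start above).
OSAURUS_URL = "http://127.0.0.1:1337/v1/chat/completions"


def build_chat_request(model: str, prompt: str, stream: bool = False) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request for the local Osaurus server."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,
    }
    return urllib.request.Request(
        OSAURUS_URL,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def extract_reply(response_json: dict) -> str:
    """Pull the assistant's text out of a (non-streaming) chat.completion payload."""
    return response_json["choices"][0]["message"]["content"]


# Example call (requires `osaurus serve` running and the model downloaded):
# with urllib.request.urlopen(build_chat_request("llama-3.2-3b-instruct-4bit", "Hello!")) as r:
#     print(extract_reply(json.load(r)))
```

The same helper works unchanged against any remote provider you configure, since those are also exposed through the OpenAI-compatible endpoint.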