
Osaurus

A macOS LLM server compatible with the OpenAI and Anthropic APIs, offering an MCP server and native Apple Silicon support.

Dinoki AI · Since 2025-08-17

Detailed Introduction

Osaurus is a macOS-focused LLM server and developer toolkit that lets developers and creators run models locally or in the cloud through OpenAI- and Anthropic-compatible APIs. It provides an MCP (Model Context Protocol) server for integration with clients such as Cursor and Claude Desktop, and includes a menu bar chat, a plugin system, and developer tools for embedding model capabilities into the desktop ecosystem securely and with low latency.
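
Because the API layer mirrors the OpenAI request format, an existing HTTP client can talk to a local Osaurus instance with little more than a base-URL change. The Swift sketch below illustrates the idea; the host and port (127.0.0.1:8080), the /v1/chat/completions path, and the "local-model" identifier are assumptions for illustration and may differ from an actual Osaurus setup.

  import Foundation

  // Minimal sketch of calling a locally served, OpenAI-compatible chat
  // endpoint. Host, port, path, and model name are assumptions for
  // illustration; check the Osaurus documentation for actual values.
  let endpoint = URL(string: "http://127.0.0.1:8080/v1/chat/completions")!

  var request = URLRequest(url: endpoint)
  request.httpMethod = "POST"
  request.setValue("application/json", forHTTPHeaderField: "Content-Type")

  let payload: [String: Any] = [
      "model": "local-model",  // hypothetical identifier for a locally loaded model
      "messages": [
          ["role": "user", "content": "Summarize what an MCP server does."]
      ]
  ]
  request.httpBody = try? JSONSerialization.data(withJSONObject: payload)

  let semaphore = DispatchSemaphore(value: 0)
  URLSession.shared.dataTask(with: request) { data, _, error in
      defer { semaphore.signal() }
      if let error = error {
          print("Request failed: \(error.localizedDescription)")
      } else if let data = data, let body = String(data: data, encoding: .utf8) {
          print(body)  // raw JSON response in the OpenAI chat-completion format
      }
  }.resume()
  semaphore.wait()

Since the request shape matches the hosted OpenAI API, the same client code can be pointed at a cloud endpoint by swapping the URL, which is what makes local and remote backends interchangeable during development.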

Main Features

  • OpenAI and Anthropic compatible API layer for easy integration with existing tools and clients.
  • MCP server to enable context sharing and plugin extensions with desktop clients such as Cursor and Claude Desktop.
  • Native Apple Silicon support and local model execution to reduce latency and improve privacy.
  • Menu bar chat, plugin system, and developer tools for debugging and extension.

Use Cases

  • Run models locally on macOS for privacy-first inference and testing.
  • Provide a compatible model backend and MCP interface for desktop apps, enabling plugin-driven workflows.
  • Use as a local OpenAI/Anthropic-compatible endpoint in development and CI for offline testing and integration.
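
For the development and CI use case above, a common pattern is to make the API base URL configurable so the same client code targets either a hosted endpoint or a local Osaurus instance. The sketch below assumes a hypothetical LLM_BASE_URL environment variable and a default local address of 127.0.0.1:8080; both are illustrative, not Osaurus defaults confirmed here.

  import Foundation

  // Resolve the API base URL from the environment so CI or offline runs
  // can point at a local OpenAI-compatible server instead of a hosted one.
  // LLM_BASE_URL and the fallback address below are hypothetical.
  struct LLMConfig {
      let baseURL: URL

      static func fromEnvironment() -> LLMConfig {
          let raw = ProcessInfo.processInfo.environment["LLM_BASE_URL"]
              ?? "http://127.0.0.1:8080/v1"   // assumed local server address
          return LLMConfig(baseURL: URL(string: raw)!)
      }

      // Build the chat-completions URL relative to the configured base.
      var chatCompletionsURL: URL {
          baseURL.appendingPathComponent("chat").appendingPathComponent("completions")
      }
  }

  let config = LLMConfig.fromEnvironment()
  print("Sending requests to \(config.chatCompletionsURL)")

In a CI job the variable is simply left unset or pointed at the local server, so integration tests can run without network access or API keys.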

Technical Details

Osaurus is implemented in Swift on the native macOS stack and is designed as a developer-friendly inference and integration platform. Repository topics include mcp, llm, and apple-neural-engine. The project is released under the MIT License and targets scenarios that require local model deployment, low-latency inference, and desktop integration.

Categories: 🛠️ Dev Tools · 🛰️ Inference Service