Show HN: Osaurus – Ollama-Compatible Runtime for Apple Foundation Models
Osaurus is an open-source local inference runtime for macOS, written in Swift and optimized for Apple Silicon.
It lets you run Apple Foundation Models locally — fully accelerated by the Neural Engine — while also exposing OpenAI- and Ollama-compatible endpoints, so you can connect your favorite apps, tools, or clients without any code changes.
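To illustrate the "no code changes" part: because the server speaks the OpenAI wire format, a plain URLSession request is enough to talk to it. This is a minimal sketch, not lifted from the Osaurus docs; the host/port (127.0.0.1:11434, the common Ollama default), the /v1/chat/completions path, and the model name are assumptions you would check against the project's README.

```swift
import Foundation

// Minimal chat-completion client against a local OpenAI-compatible server.
// Host, port, path, and model id below are assumptions, not Osaurus defaults.
struct ChatMessage: Codable { let role: String; let content: String }
struct ChatRequest: Codable { let model: String; let messages: [ChatMessage] }
struct ChatResponse: Codable {
    struct Choice: Codable { let message: ChatMessage }
    let choices: [Choice]
}

func chat(_ prompt: String) async throws -> String {
    var request = URLRequest(url: URL(string: "http://127.0.0.1:11434/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(
        ChatRequest(model: "apple-foundation",  // hypothetical model id
                    messages: [ChatMessage(role: "user", content: prompt)])
    )
    let (data, _) = try await URLSession.shared.data(for: request)
    return try JSONDecoder().decode(ChatResponse.self, from: data).choices[0].message.content
}
```

Any existing OpenAI or Ollama client library should work the same way once its base URL points at the local server.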
Key points:
* Supports Apple Foundation Models natively (see the sketch after this list)
* Compatible with OpenAI & Ollama APIs
* ~7 MB binary, runs locally (no cloud, no telemetry)
* MIT Licensed, open source
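Native Apple Foundation Models support presumably means calling Apple's FoundationModels framework (macOS 26+) rather than bundling model weights. A rough sketch of that kind of call, under the assumption that the framework's base model is available on the machine:

```swift
import Foundation
import FoundationModels

// Rough sketch of a single on-device generation with Apple's FoundationModels
// framework; availability handling and error reporting are simplified.
func generate(_ prompt: String) async throws -> String {
    // The system model can be unavailable (unsupported hardware, model not downloaded, ...).
    guard case .available = SystemLanguageModel.default.availability else {
        throw NSError(domain: "FoundationModels", code: 1,
                      userInfo: [NSLocalizedDescriptionKey: "System model unavailable"])
    }
    let session = LanguageModelSession()
    let response = try await session.respond(to: prompt)
    return response.content
}
```

Exposing a call like this behind HTTP endpoints is what makes the same on-device model reachable from non-Swift clients.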
Project: https://osaurus.ai
Source: https://github.com/dinoki-ai/osaurus
We’re exploring what a local-first AI ecosystem could look like — where inference, privacy, and creativity all happen on your own hardware. Feedback and testing welcome!
Very cool! Since it’s possible to train foundation model adapters, is a library for user fine-tunes possible?
Not yet, but it could be supported in the future.