
Product
Melix
A local AI runtime built for Apple Silicon.
Melix runs machine learning inference entirely on-device, with no cloud dependency; your data never leaves the machine. It supports fine-tuning and benchmarking, and serves models through an OpenAI-compatible API.
Key Features
The capabilities that actually matter in production: interfaces, control, data flow, and operational workflows.

Product Context
What this product does, who it's for, and where it fits.
Melix runs machine learning inference entirely on-device, with no cloud dependency; your data never leaves the machine. It supports fine-tuning and benchmarking, and serves models through an OpenAI-compatible API.
Implementation
Implementation details, integration guidance, and operational reference.
Built to help teams evaluate adoption, complete integration, and run the product responsibly over time.
Technical details
01 MLX-powered local inference
Runs the Apple MLX framework directly on Apple Silicon hardware for fast, energy-efficient on-device model execution.
02 LoRA fine-tuning
Apply Low-Rank Adaptation to customize models locally without sending data to external training infrastructure.
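The idea behind Low-Rank Adaptation can be sketched in a few lines: the base weight matrix W stays frozen, and training only touches two small matrices B and A whose product forms a low-rank update, W' = W + B·A. The dimensions and values below are illustrative, not taken from Melix's implementation.

```python
# Minimal LoRA sketch in plain Python. Instead of updating a full d x d
# weight W, we train B (d x r) and A (r x d) with small rank r, and use
# the effective weight W' = W + B @ A. W itself is never modified.

def matmul(X, Y):
    """Plain-Python matrix multiply for small illustrative matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def lora_weight(W, A, B, alpha=1.0):
    """Effective weight W' = W + alpha * (B @ A); W stays frozen."""
    delta = matmul(B, A)
    return [[W[i][j] + alpha * delta[i][j] for j in range(len(W[0]))]
            for i in range(len(W))]

d, r = 4, 1                         # hidden size and LoRA rank (illustrative)
W = [[0.0] * d for _ in range(d)]   # frozen base weight: d*d parameters
B = [[1.0] for _ in range(d)]       # d x r trainable matrix
A = [[0.5] * d]                     # r x d trainable matrix

W_prime = lora_weight(W, A, B)
full_params = d * d                 # parameters a full update would touch
lora_params = d * r + r * d         # parameters LoRA actually trains
```

Because only B and A are trained, the adapter is a small artifact that can be stored, swapped, and activated independently of the base model, which is what makes fully local fine-tuning practical.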
03OpenAI-compatible HTTP API
Drop-in replacement for OpenAI API endpoints — existing tooling and coding agents work without modification.
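For a drop-in replacement, a client only needs to point its requests at the local endpoint. A minimal stdlib-only sketch of building an OpenAI-style chat completions request follows; the host and port (localhost:8080) are an assumption for illustration, not a documented Melix default.

```python
import json

# Assumed local endpoint -- substitute whatever host/port Melix is
# actually configured to listen on.
BASE_URL = "http://localhost:8080/v1"

def build_chat_request(model, messages):
    """Build the URL, headers, and JSON body for a chat completions call,
    following the OpenAI chat completions request schema."""
    url = f"{BASE_URL}/chat/completions"
    headers = {
        "Content-Type": "application/json",
        # A local backend typically ignores the key, but SDKs send one.
        "Authorization": "Bearer not-needed-locally",
    }
    body = json.dumps({"model": model, "messages": messages})
    return url, headers, body

url, headers, body = build_chat_request(
    "example-model", [{"role": "user", "content": "Hello"}])
# POST `body` to `url` with any HTTP client; the response follows the
# same schema an OpenAI client already expects.
```

Existing OpenAI SDK clients achieve the same thing by overriding their base URL to point at the local service instead of api.openai.com.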
04 Service-first architecture
Runs as a persistent background service with gRPC and HTTP/SSE support, accessible to any local application.
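On the HTTP/SSE side, streamed responses arrive as Server-Sent Events. The sketch below parses the generic SSE wire format (`data:` lines, events separated by blank lines); the sample payloads are illustrative and not taken from Melix's docs.

```python
def parse_sse(stream_lines):
    """Parse SSE `data:` lines into a list of event payloads.
    Events are separated by blank lines, per the SSE format."""
    events, buf = [], []
    for line in stream_lines:
        if line.startswith("data:"):
            buf.append(line[5:].strip())
        elif line == "" and buf:
            events.append("\n".join(buf))
            buf = []
    if buf:  # flush a trailing event with no final blank line
        events.append("\n".join(buf))
    return events

# Illustrative stream: two content deltas, then the [DONE] sentinel
# that OpenAI-style streaming APIs use to mark end-of-stream.
sample = [
    'data: {"delta": "Hel"}', "",
    'data: {"delta": "lo"}', "",
    "data: [DONE]", "",
]
events = parse_sse(sample)
```

Any local application can consume the stream this way over plain HTTP, while latency-sensitive local clients can use the gRPC interface instead.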
Resources
01 Homebrew Installation
Install and manage Melix via Homebrew. The formula handles runtime dependencies and service registration.
02 CLI Reference
Automate model loading, benchmark execution, and adapter activation through the command-line interface.
03 API Compatibility Guide
Configure existing OpenAI SDK clients to use Melix as a local backend with no code changes required.

CONTACT
Ready to run AI inference without cloud dependencies?
Melix keeps models, data, and inference on your machine — private, fast, and OpenAI-compatible.