Locally AI

Run AI models locally on your iPhone, iPad, and Mac

Free · LLM Server · Chat

Overview

Locally AI lets you run powerful open-source AI models directly on your Apple devices without an internet connection. Chat with Meta Llama, Google Gemma, Qwen, DeepSeek R1, and more—all processed entirely on-device for complete privacy. The app features voice conversations, Siri integration, Apple Shortcuts support, and Control Center access. Built on Apple's MLX framework, it's optimized for Apple Silicon to deliver fast, private AI assistance wherever you are.

Pricing: Free

Architecture: Apple Silicon

Key Features

  • Run AI models completely offline without internet
  • Support for Llama 3.2/3.1, Gemma 2/3, Qwen, DeepSeek R1, and more
  • Voice conversations with natural speech interaction
  • Siri integration for hands-free access
  • Control Center and Lock Screen quick access
  • Apple Shortcuts automation support
  • Customizable system prompts
  • Vision model support for image understanding
  • Built on MLX for Apple Silicon optimization
  • Complete privacy—all data stays on device

Tags

chat · voice input · voice synthesis