# Ayna

A native macOS, iOS, and watchOS ChatGPT client built for speed and simplicity.
## Features

- Fast & Native: Streaming chat interface tailored for Apple platforms.
- Multi-Provider: Works with OpenAI-compatible endpoints, including OpenAI, Azure OpenAI, GitHub Models, Gemini, and Claude providers.
- Multi-Model Chat: Compare responses from multiple models simultaneously.
- Apple Intelligence: Uses the on-device Apple Intelligence API when available (macOS/iOS).
- Local Models: Run models locally for complete privacy (macOS).
- MCP Support: Use Model Context Protocol (MCP) tools (macOS).
- Image Generation: Create images using models like `gpt-image-1`.
- Organization: Searchable conversations with auto-generated titles.
- Secure: API keys stored in Keychain; conversations encrypted on disk.
- Export: Save chats as Markdown or PDF.
- watchOS Companion: Quick chat access from your Apple Watch.
## Installation

- Grab the latest `.dmg` from the Releases page.
- Open the disk image and drag Ayna.app into your Applications folder.
- The app is quarantined because it is not notarized. To remove the quarantine, run this command in Terminal:

  ```shell
  xattr -dr com.apple.quarantine /Applications/Ayna.app
  ```

- Launch Ayna from Applications.
You can also install Ayna via Homebrew:

```shell
brew tap sozercan/ayna
brew install --cask ayna
```

## Requirements

- macOS 14.0 (Sonoma) or newer, iOS 17.0 or newer, or watchOS 10.0 or newer.
- An API key for OpenAI, Azure OpenAI, Gemini, or Claude, or a GitHub account for GitHub Models (optional if using local models).
## Configuration

- Open Settings (`Cmd+,`) → API.
- Select your provider and add a model:
  - OpenAI: Use the default endpoint or a custom OpenAI-compatible endpoint.
  - Azure OpenAI: Use `https://<resource>.openai.azure.com` with your deployment name as the model.
  - GitHub Models: Sign in with your GitHub account (OAuth).
  - Apple Intelligence: For on-device inference.
- Start chatting!
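Before adding a key in Ayna, you can sanity-check an OpenAI-compatible endpoint from the terminal. This is an illustrative sketch, not part of Ayna itself: the endpoint URL, the `gpt-4o` model name, and the `OPENAI_API_KEY` variable are placeholders for your own values.

```shell
# Minimal chat-completions request body; "gpt-4o" is a placeholder model name.
PAYLOAD='{"model": "gpt-4o", "messages": [{"role": "user", "content": "ping"}]}'
echo "$PAYLOAD"

# Uncomment to send the request (requires a valid key in OPENAI_API_KEY):
# curl -sS https://api.openai.com/v1/chat/completions \
#   -H "Content-Type: application/json" \
#   -H "Authorization: Bearer $OPENAI_API_KEY" \
#   -d "$PAYLOAD"
```

If the endpoint returns a JSON completion rather than an authentication error, the same key and base URL should work in Ayna's settings.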
## Multi-Model Chat

- Start a New Chat (`Cmd+N`).
- In the model selector, choose multiple models (e.g., GPT-4o and Claude 3.5 Sonnet).
- Send your prompt.
- Ayna will stream responses from all selected models in parallel, allowing you to compare their outputs side by side.
## MCP Tools

- Go to Settings → MCP Tools.
- Enable the default `wassette` runtime (requires the Wassette CLI; Ayna runs `wassette serve --stdio`), or add any other MCP server to give the AI more capabilities.
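Because the default runtime shells out to the Wassette CLI, it is worth confirming the binary is on your `PATH` before enabling it. A minimal check (how you install Wassette is up to you; this only verifies it is reachable):

```shell
# The default MCP runtime needs the Wassette CLI on PATH; an empty result
# means it is not installed (or not on PATH for GUI-launched apps).
WASSETTE_BIN="$(command -v wassette || true)"
if [ -n "$WASSETTE_BIN" ]; then
  echo "wassette found at $WASSETTE_BIN"
else
  echo "wassette not found; install the Wassette CLI before enabling the runtime"
fi
```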
## Keyboard Shortcuts

- `Cmd+N`: New conversation
- `Cmd+,`: Open Settings
- `Enter`: Send message
- `Shift+Enter`: New line
## Privacy

- No Telemetry: We don't track your usage.
- Local Storage: Conversations are encrypted and stored only on your device.
- Secure Keys: API keys are stored securely in the system Keychain.
## Contributing

Developers, please see CONTRIBUTING.md for build instructions, architecture details, and testing guidelines.
Found a bug? Open an issue.