JARVIS from Iron Man.

For real.

A voice-first AI assistant for macOS. It sees your screen, controls your apps, learns you over time — and runs fully local on your Mac.

"I use it every day to run my Mac." — dogfooded daily by the founder.

100%
Local Processing
Zero cloud dependency
<100ms
Response Time
Voice in, action out
4B
Parameters
On-device MLX model
Adaptability
Self-training pipeline
Core Capabilities

Three Pillars

Everything Jarvis does centers on three fundamental promises — control, intelligence, and privacy.

Pillar 1

Controls Your Computer

Not a chat window — a real agent.

Eyes (screen vision) and hands (GUI control). Drives Apple Calendar, Mail, Notes, Chrome, Terminal, and Claude Code via AppleScript. Full vision + click/type GUI control is shipping.
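The AppleScript bridge can be sketched in a few lines. This helper is illustrative only — not Jarvis's actual code — and the execution path is macOS-only, shelling out to the system `osascript` tool to create a note in Apple Notes:

```python
import subprocess


def make_note_script(title: str, body: str) -> str:
    """Build an AppleScript snippet that creates a note in Apple Notes."""
    # Escape double quotes so arbitrary dictated text survives
    # the AppleScript string literal.
    t = title.replace('"', '\\"')
    b = body.replace('"', '\\"')
    return (
        'tell application "Notes"\n'
        f'    make new note with properties {{name:"{t}", body:"{b}"}}\n'
        "end tell"
    )


def run_applescript(script: str) -> str:
    """Execute a script via osascript (macOS only) and return its stdout."""
    result = subprocess.run(
        ["osascript", "-e", script], capture_output=True, text=True, check=True
    )
    return result.stdout.strip()


# On a Mac, this would create the note:
# run_applescript(make_note_script("Groceries", "oat milk, espresso beans"))
```

The same pattern — build a script string, hand it to `osascript` — extends to Calendar, Mail, Chrome, and Terminal.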

Pillar 2

Learns You

Self-training on Apple Silicon.

When Jarvis can't do something, it interviews you, generates synthetic training data, runs a critic pass, fine-tunes itself via MLX, and hot-swaps the new adapter — live. Evolving info goes to memory; stable info goes into weights.

Pillar 3

Fully Yours

100% local. Zero cloud.

MLX baseline model (jarvis:saturday-4b) runs on-device. Your data, your voice, your habits — never leave the machine. The last assistant you'll ever need.

Self-Improvement Loop

Learns. Adapts. Evolves.

When Jarvis encounters something it can't do, it doesn't stop. It learns — generating its own training data and fine-tuning itself on your Apple Silicon hardware.

01

Interview

Claude Sonnet 4.6, acting as the planner, interviews you to understand the capability gap and what you need.

Natural conversation, contextual understanding
02

Generate

Produces 30–100 synthetic training examples tailored to the identified capability gap.

Automated data synthesis pipeline
03

Critic & Review

Automated critic pass + optional human review ensures training data quality.

Quality gate before fine-tuning
04

Fine-Tune

MLX LoRA fine-tuning on Apple Silicon. Training progress is visualized on the orb in real time.

Hot-swap adapter without restart
05

Evaluate & Deploy

Eval gate ensures the new skill meets quality thresholds before going live.

Two tracks: tool_skill + knowledge
Evolving info → memory  |  Stable info → weights  |  Consolidation → when you're AFK
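The five steps above can be sketched as one orchestration loop. Everything here is an assumption standing in for the real system: the function names, the `Skill` type, and the stub bodies are placeholders for the actual planner interview, data synthesis, critic pass, and MLX training calls. Only the routing rule at the end — stable info into weights, evolving info into memory — follows the description above:

```python
from dataclasses import dataclass, field


@dataclass
class Skill:
    name: str
    examples: list = field(default_factory=list)
    kind: str = "tool_skill"  # or "knowledge" — the two tracks


MEMORY, WEIGHTS = [], []  # evolving info vs. baked-in info


def interview(gap: str) -> Skill:
    # Step 1: a planner model would interview the user here.
    return Skill(name=gap)


def generate(skill: Skill, n: int = 30) -> Skill:
    # Step 2: synthesize n training examples for the identified gap.
    skill.examples = [f"{skill.name} example {i}" for i in range(n)]
    return skill


def critic(skill: Skill) -> Skill:
    # Step 3: drop examples the critic rejects (stub: keep everything).
    skill.examples = [e for e in skill.examples if e]
    return skill


def fine_tune(skill: Skill) -> str:
    # Step 4: a LoRA fine-tune would run here; return the adapter path.
    return f"adapters/{skill.name}.safetensors"


def evaluate_and_deploy(skill: Skill, adapter: str, stable: bool) -> str:
    # Step 5: gate on evals, then route — stable info goes into weights
    # (hot-swapped adapter), evolving info goes into memory.
    (WEIGHTS if stable else MEMORY).append(adapter if stable else skill.name)
    return "deployed" if stable else "memorized"


def self_improve(gap: str, stable: bool) -> str:
    skill = critic(generate(interview(gap)))
    return evaluate_and_deploy(skill, fine_tune(skill), stable)
```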
Development Timeline

Roadmap

From foundation to full autonomy — the path to the assistant that never stops improving.

Phase 0–1
Shipped

Foundation

Tool-use refactor, screen vision (eyes), GUI control (hands)

Phase 2–3
In Progress

Integration

MCP host, Google APIs, always-on menu bar presence

Phase 4–5
Roadmap

Safety & Power

Safety/undo system, browser superpowers with autonomous navigation

Phase 6–9
Roadmap

Autonomy

Long-running tasks, continuous self-improvement loop

Under the Hood

Built for Performance

Every layer of the stack is optimized for local-first, real-time AI on Apple Silicon.

AI Core

Claude Sonnet 4.6
Planner & orchestrator
MLX
On-device fine-tuning
jarvis:saturday-4b
Local baseline model
LoRA
Adapter-based skills

Vision & Control

Screen Vision
Real-time screen parsing
GUI Control
Click/type automation
AppleScript
Native app integration
MCP Host
Tool protocol (roadmap)

Platform

FastAPI + WebSocket
Real-time communication
Three.js
3D orb UI (Vite/TS)
SQLite + FTS5
Memory, tasks, notes
Apple Silicon
MLX-optimized hardware
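The shape of an FTS5-backed memory layer can be sketched with stock SQLite. This minimal example is a sketch, not Jarvis's actual schema: it assumes only that the bundled SQLite was compiled with FTS5 (true for standard Python builds) and invents its own table and sample rows:

```python
import sqlite3

# In-memory database for the sketch; the real store would be a file on disk.
con = sqlite3.connect(":memory:")
con.execute("CREATE VIRTUAL TABLE memory USING fts5(topic, content)")
con.executemany(
    "INSERT INTO memory VALUES (?, ?)",
    [
        ("calendar", "standup moved to 9:30 on Tuesdays"),
        ("preferences", "prefers Chrome over Safari for research"),
    ],
)


def recall(query: str) -> list[str]:
    """Full-text search over stored memories, best match first."""
    rows = con.execute(
        "SELECT content FROM memory WHERE memory MATCH ? ORDER BY rank",
        (query,),
    )
    return [r[0] for r in rows]
```

FTS5's built-in `rank` ordering gives relevance-sorted recall with no extra infrastructure, which is what makes a single SQLite file viable for memory, tasks, and notes.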

Voice

Web Speech API
Voice input
ElevenLabs / Fish
High-fidelity TTS
macOS say
Fallback TTS
Always-on
Menu bar presence (roadmap)
<100ms
Voice-to-action latency
0
Bytes sent to cloud
4B
Parameters on-device
Early Access

The Last Assistant
You'll Ever Need

Join the waitlist for early access. Be among the first to experience JARVIS on your Mac — fully local, fully yours.

No spam. Your email stays local until early access rolls out. Built for people who take their privacy seriously.