yabby

Your AI agent. Off the grid. Under your roof.

yabby turns an Apple Silicon Mac into a hardened, fully local AI agent stack. LLM inference runs natively on-device. Email, calendar, search, DNS, backups, and remote access run on hardware you already own. No cloud LLM providers. No hardcoded identity anywhere; a guided wizard asks for everything. The install runs in two phases: bootstrap on the Mini, then install from your MacBook over SSH. It takes about 20 minutes plus the model download.
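The two-phase flow can be sketched in shell. The hostname, user, and script names below are hypothetical placeholders, not yabby's real commands; the wizard supplies the actual values, and the README on GitHub documents the real steps.

```shell
# Illustrative sketch of the two-phase install. Every name here is a
# placeholder, not a real yabby command.

# Hypothetical mDNS name and admin user for the Mini; both can be
# overridden via the environment.
MINI_HOST="${MINI_HOST:-yabby-mini.local}"
MINI_USER="${MINI_USER:-admin}"

# Phase 1: at the Mini's own keyboard, run the bootstrap, e.g.:
#   ./bootstrap.sh

# Phase 2: from the MacBook, drive the rest of the install over SSH, e.g.:
#   ssh "${MINI_USER}@${MINI_HOST}" './install.sh'

echo "Phase 2 would target ${MINI_USER}@${MINI_HOST}"
```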


Status

Beta (v0.7.x). Two full dress-rehearsal rebuilds have come through clean end-to-end. Expect to file an issue or two; that’s the whole point. See releases for the latest tag and CHANGELOG for what landed when.

The short version

It is not the easiest way to have an AI agent. It is, as far as we can tell, the easiest way to have one that nobody else can read, train on, throttle, deprecate, or turn off.

For the full architecture, component table, and shopping list, read the README on GitHub. This site is just the welcome mat.