I Built a Fully Offline AI Butler. His Name is Alfred.
A few weeks ago, I didn't have a single local AI model running on my machine. Today I can say "Hey Alfred" out loud and a British butler answers me: summarising news, opening apps, telling me the time, and doing it all completely offline, on my own hardware, with zero subscriptions and zero data leaving my house. Here's exactly how I got there, and why you should probably do the same.

Why run a local LLM at all?

The honest answer: privacy and resilience. Every time you type something into ChatGPT or Claude, that conversation goes to a server somewhere. For most things, that's fine. But there's something fundamentally different about an AI that runs entirely on your own machine: one that works when your internet goes down, one that doesn't bill you per query, and one that has no idea what you asked it at 2am.

I've also been retrenched before because of AI automation. That experience made me want to understand this technology, not just consume it. R...