
Why I Built DevBunker

Configuring local AI dev environments on Apple Silicon takes way too long. Here's why I packaged mine into a blueprint anyone can clone.

I have a Mac Studio M3 Ultra. It can run 70B parameter models locally, faster than most cloud inference endpoints for single-user workloads. It's a genuinely ridiculous machine.

But every time I spun up a new project, I spent the first two to three hours fighting the same setup problems: Docker resource limits not tuned for Apple Silicon, Ollama not listening on the socket my tools expected, the reverse proxy misconfigured, model weights overflowing unified memory. Hours of yak shaving before writing a single line of actual product code.

The fix is obvious in hindsight

I already had a working configuration. I'd tuned it across a dozen projects over two years. The insight was that I should package it, not recreate it. DevBunker is just that: my personal AI dev stack blueprint, extracted into a clean repo that anyone with an M-series Mac can clone and run.

It includes Terraform configs for local infrastructure provisioning, Docker Compose stacks optimized for unified memory, Ollama setup with sensible defaults, Open WebUI wired up and ready to go, and a dev proxy so you're hitting one URL instead of managing five ports.
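To make the wiring concrete, here's a rough sketch of the kind of Compose stack described, not DevBunker's actual config. It assumes the standard `ghcr.io/open-webui/open-webui` image and a default Ollama install; on macOS, Docker containers can't use the M-series GPU, so the usual pattern is to run Ollama natively on the host and point the container at it via `host.docker.internal`.

```yaml
# Hypothetical sketch: Open WebUI in Docker, talking to Ollama
# running natively on the macOS host (for GPU/unified-memory access).
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"   # UI at http://localhost:3000
    environment:
      # host.docker.internal resolves to the macOS host from inside the container;
      # 11434 is Ollama's default port.
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
    volumes:
      - open-webui-data:/app/backend/data   # persist chats and settings
    restart: unless-stopped

volumes:
  open-webui-data:
```

The dev proxy layer mentioned above would then sit in front of services like this one, so you hit a single URL instead of juggling per-service ports.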

Why a paid product and not open source?

It could be open source. But I've seen what happens to free dev tools: they either get abandoned, or the maintainer burns out trying to support every edge case for free. A small price tag creates a natural filter. The people who pay are the people who actually want to use it, not just collect it. It also lets me keep it updated.

At $49 one-time, if it saves you two hours of setup time, it's paid for itself at any reasonable hourly rate for a developer.

What's next

The initial release covers a single-machine setup. Future versions will cover multi-model routing, GPU memory profiling utilities, and configurations for specific use cases like RAG pipelines and code generation workloads. Buyers get all future updates.

If you have an M-series Mac and you're doing any local AI development, check it out.