Seeing Inside the Machine
Introducing Gemma Tuner for macOS
I remember the first time I saw a beating human heart.
A twenty-something was holding it in his hands.
A man had been stabbed, his ventricle pierced.
The ER docs cracked his chest open and plunged their hands into the hole like someone fishing for a lost set of keys in their car at night.
I was a 19-year-old EMT, standing on the trauma room overlook at LA County-USC Medical Center.
Below me, a tattooed man was bleeding out on a table. I got to see the inside of this man. His beating heart. His inflating lungs. His pulsing aorta.
After a few minutes they rushed him to the operating room, the resident jogging alongside the gurney, hand still inside the chest, squeezing.
It was the human as a machine – that’s the only way I can describe it.
You spend your whole life looking at people from the outside — the curves, the skin, the faces. It’s like staring at a car and only ever seeing the paint job and the tinted windows. And then someone pops the hood and you see the pipes, the wiring, the machinery of life itself.
Humans are machines.
Now we are creating machines that act like humans. “Anthropic” indeed.
We’re in the middle of the biggest technological shift since electricity, and only the high priests of AI have seen an AI model learn.
Billions of people use ChatGPT, Gemini, Claude. They argue about whether AI is conscious. They debate whether it will take their jobs. Pundits and politicians write op-eds about what these systems are and what they mean for humanity.
Almost none of them have ever watched one learn.
We treat AI like a black box that talks back. Like a magical oracle that arrived fully formed from the cloud. We have opinions about the soul of something we've never seen the guts of.
That’s not understanding. That’s superstition.
In truth, AI is magic. But it is also a machine. Just like us. But you can’t know that – you won’t feel it in your bones – until you’ve seen its beating heart with your own eyes.
I built something to change this.
It’s called Gemma Tuner, and it’s a free, easy-to-use, open-source toolkit that lets you train and fine-tune Google’s new Gemma 4 AI models on your Mac, like you were an AI researcher at DeepMind. Text, images, audio — all three modalities, running natively on Apple Silicon. No NVIDIA GPU rental. No cloud account. No PhD required.
It went viral on Twitter this week. Front page of Hacker News. Google’s official Gemma account even tweeted it.
But the tool isn’t really the point. The experience is the point.
When you run Gemma Tuner, a step-by-step wizard walks you through everything — what kind of training you want. Model. Dataset. Learning rate. It’s all there.
On the suggestion of a friend, I even threw in a sample dataset so you can start immediately. Answer a few questions, hit enter, and training begins.
And then you see it.
A real-time training visualizer opens — dark background, orange curves, neon green attention maps.
Your machine is learning.
I designed it to feel like a Robinhood stock chart: alive, urgent, like you’re watching something grow and evolve in real time. Because you are.
The loss curve sweeps across the screen — that’s how wrong the model is, how surprised it is by the truth. You watch it drop. The attention heat map lights up, showing you which parts of the input the model is focusing on. Gradient signal strength, step size, RAM usage — everything exposed. Everything visible.
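That “surprise” isn’t a metaphor bolted on after the fact — the loss most language models train on is just the negative log of the probability the model assigned to the correct answer. Here’s a minimal sketch in plain Python (the function name and the toy probabilities are mine, not Gemma Tuner’s):

```python
import math

def cross_entropy(probs, target_index):
    # The loss is the negative log probability the model gave to the truth:
    # near zero when the model was confident and right,
    # large when the model was "surprised" by the correct answer.
    return -math.log(probs[target_index])

# A model that put 90% of its belief on the right token: low loss.
confident = cross_entropy([0.05, 0.90, 0.05], target_index=1)

# A model that gave the right token only 10%: high loss.
surprised = cross_entropy([0.80, 0.10, 0.10], target_index=1)

print(confident)  # ~0.105
print(surprised)  # ~2.303
```

Training nudges the model’s weights so that, averaged over the dataset, this number falls — which is exactly the downward sweep you watch on the chart.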
You are teaching the sand to think. Not someone else. You.
You are seeing inside your machine. Your model. On your laptop.
At LA County-USC, they built a catwalk inside the trauma room so students could see the blood and guts – the miracle and the machinery of life – up close, with their own two eyes.
We have anatomy labs for medical students and shop class for high schoolers and chemistry sets for kids. We need the same thing for AI — a way for ordinary people to crack open the chest and see the beating heart of these machines.
That’s why I built this thing.
I invite you and your kids – especially your kids – to take it for a spin.




