Issue #001

The License Isn't the Work

March 31, 2026 · 3 min read

Your team has an AI license. That's not adoption. That's a gym membership.

Everyone signs up in January. Everyone has access to the equipment. Almost nobody is doing the actual work required to get results. The gym looks full. The progress isn't there.

The licenses are bought. The tools are installed. Leaders are reporting that adoption is up. And the codebase underneath it all is quietly becoming something nobody wants to look at too closely.

I know this because we lived it.

What We Actually Built

When we first rolled out AI coding tools to our team, we moved fast. The productivity promise was real. The pressure to move was too. So we gave engineers access, showed them the basics, and let them run.

Output went up immediately. PRs were opening faster. Features were getting drafted more quickly. The metrics looked good. We didn't see what was accumulating underneath.

What nobody tells you when they're selling you the license is that AI doesn't produce the same output twice. Two engineers prompting the same problem will generate two different solutions. Two different patterns. Two different approaches to error handling. Two different ways to manage state. If your team doesn't have alignment on what good looks like before AI starts generating code, you will get every version of good at once. All of it shipped. All of it in production.

A few months in, we had a junk drawer with a deployment pipeline attached to it. The codebase was inconsistent in ways that weren't visible from any single PR. You had to look at the whole system to see what was happening. We had been creating faster and breaking faster, and the team was starting to feel it. Reviews were taking longer because reviewers had no mental model of what the code should look like. Engineers on an incident call were getting slower at debugging because the system no longer had consistent patterns to reason through. Output was going up, and trust in the output was going down at the same time.

That's the gym membership problem at team scale. Everyone is on the machines. Nobody is following the same program. And the results are nowhere near what anyone expected when they signed up.

The Work

Managing the unpredictability of AI output is the work. Getting your team aligned before the AI generates anything ... that's the work. Documenting your patterns, your error handling standards, your architectural decisions in a form the tool can learn from ... that's the work. Building guardrails so the AI starts from your standards instead of its defaults ... that's the work.

None of that comes with the license. And none of it gets easier just because things are being created faster.

We had to stop and do the alignment work before the AI could add real value. We documented how our team makes decisions. We wrote down the patterns we expect to see in PRs before we asked AI to write them. We aligned on error handling, logging, state management, and architectural standards before we built anything on top of them. Then we built hooks, rules, and workflows that encoded those standards so the tool started from what we'd agreed on rather than what it guessed.

It took weeks of deliberate, unglamorous work. And when we came out the other side, the codebase stabilized in ways that felt like productivity. Not output ... productivity.

Why This Is Getting Harder

The current environment is making this harder. The pressure to move fast is real. The vendors want to show you demos, not implementation plans. The consultants want to talk about transformation, not the month of documentation work that has to happen first. Meanwhile, engineering teams are holding on for dear life. Creating faster. Breaking faster. Trying to figure out what accountability looks like when AI wrote the code that failed.

The leaders who come out of this well won't be the ones who adopted fastest. They'll be the ones who did the work before they scaled.

The AI Leadership Audit maps exactly what that work looks like across six chapters you can work through with your team ... jonoherrington.com/leadership-audit.