Latest

When AI Makes You Forget How to Code
The friction that builds judgment is disappearing
A junior engineer told me he couldn't explain his own code. Not because he wasn't working. Because he wasn't being forced to learn anymore.
Previous
Buying AI access was the easy part. Now learn to fly.
Procurement is not adoption. The leaders who are winning the AI transition did one thing the others didn't: they went first. They used the tools, showed their teams what the learning curve looked like, and built the conditions for real capability to develop. Buying licenses created access. It didn't create any of that.
Proximity built that instinct. Drift is spending it.
When AI scales output without scaling review, the quality gate breaks. The problem isn't the tool. It's the leader who lost the reference point for what good looks like in their own system.
Your team is prompting from scratch every time. That's the problem.
If I had to roll out AI to my engineering team again, I wouldn't start with the tools. I'd start with one repeatable workflow, map every step, define what good looks like, and turn it into a pipeline that compounds. Here's how that works in practice.
The gap between your senior and junior engineers is compounding in ways most leaders aren't watching.
A junior engineer told me he wasn't sure he understood everything the AI was outputting for him. He was reading it, checking it, shipping it. But he couldn't reliably tell if it was right. That conversation opened a harder question about what AI is actually doing to the engineers who never built the foundation first.
The governance gap nobody talks about when your team ships AI-generated code
Technical leaders are being asked to set AI standards, define acceptable output quality, and govern what gets into production. You cannot do any of that if you've lost your own connection to how your team builds. Here's what the drift looks like, and what it costs you when AI has already doubled your team's output.
AI usage numbers can look healthy while your engineers quietly stop being able to explain their own systems
When I reviewed my team's AI adoption dashboard last quarter, every number was moving in the right direction. What the dashboard couldn't show me was who had stopped understanding the systems they were shipping. Adoption metrics and skill development metrics are not the same thing. I had been treating them like they were.
Why compliance dashboards can't tell you what's actually happening with AI on your engineering team
More than 500 engineers described what happened when their companies chose mandates over conversations. The dashboards showed adoption. The engineers showed compliance. What didn't appear anywhere was what wasn't working.
You get 3x of whatever the path of least resistance produces.
Jono Herrington pushed for higher velocity. The team delivered. Then he sat at a lunch table bragging about doubled output while his engineers knew exactly what they'd done to produce that number. The same pattern is playing out in every AI productivity mandate right now.
When AI scales faster than your engineering culture, the codebase tells you the truth.
Six weeks after giving our team AI coding tools without guardrails, the codebase looked like a junk drawer with a CI/CD pipeline. Our first instinct was to fix the AI config. The right answer was to fix the humans first.