I've seen engineers lose the spark in their eyes for their craft when the job shrinks to planning prompts and reviewing output. The sheer pace of AI is wearing engineers down.
Procurement is not adoption. The leaders who are winning the AI transition did one thing the others didn't: they went first. They used the tools, showed their teams what the learning curve looked like, and built the conditions for real capability to develop. Buying licenses created access. It didn't create any of that.
When AI scales output without scaling review, the quality gate breaks. The problem isn't the tool. It's the leader who lost the reference point for what good looks like in their own system.
If I had to roll out AI to my engineering team again, I wouldn't start with the tools. I'd start with one repeatable workflow, map every step, define what good looks like, and turn it into a pipeline that compounds. Here's how that works in practice.
A junior engineer told me he wasn't sure he understood everything the AI was generating for him. He was reading it, checking it, shipping it. But he couldn't reliably tell whether it was right. That conversation opened a harder question about what AI is actually doing to the engineers who never built the foundation first.
Technical leaders are being asked to set AI standards, define acceptable output quality, and govern what gets into production. You cannot do any of that if you've lost your own connection to how your team builds. Here's what the drift looks like, and what it costs you when AI has already doubled your team's output.
When I reviewed my team's AI adoption dashboard last quarter, every number was moving in the right direction. What the dashboard couldn't show me was who had stopped understanding the systems they were shipping. Adoption metrics and skill development metrics are not the same thing. I had been treating them like they were.
More than 500 engineers described what happened when their companies chose mandates over conversations. The dashboards showed adoption. The engineers showed compliance. What appeared nowhere was everything that wasn't working.