Sessions available starting July 13, 2026

A Diagnostic Workshop for Engineering Leadership Teams

The AI Leadership Audit

Your engineering org adopted AI. Your leadership didn’t.

AI is accelerating your teams faster than your leadership practices can keep up. This workshop diagnoses the six gaps that open when speed outpaces judgment ... before they cost you your best people.

Get in the Queue

The Real Problem With AI Adoption

Every engineering org is under pressure to adopt AI. Most are handling it the same way. Top-down mandates. Adoption dashboards. All-hands announcements about the future of productivity.

The tools get installed. The metrics climb. Leadership declares success.

Meanwhile, underneath the dashboard, something else is happening. Your strongest engineers are losing skills they spent years building. Your leaders are approving work they can no longer evaluate. Your team is fragmenting into five different AI workflows with no shared standards. The engineers who used to love the hard problems are quietly checking out because nobody asked them how AI should fit into their work.

The gap between what leadership sees and what the team experiences is growing wider every sprint. AI just made it grow faster.

This workshop diagnoses that gap. In four hours. With your leadership team in the room.

What We Diagnose

Six patterns that show up in every engineering org navigating AI adoption. Each one is invisible on a dashboard. Each one is obvious to the engineers living with it.

01

The Mandate Problem

Leadership rolls out AI with an email and a dashboard. No pilot. No feedback loop. No conversation about where AI helps and where it doesn't. Adoption becomes compliance. The engineers who care the most are the first ones to disengage.

The best AI adoption I've led started with a week of exploration and zero deliverables. I was one step ahead of my team. That's all a leader needs to be. The worst AI adoption I've seen started with a mandate and ended with a team that stopped caring about the quality of what they shipped.

How did your team adopt AI, and who decided how it would be used?

02

The Judgment Decay Problem

Leaders who stopped building can't evaluate AI-generated output. They approve PRs they can't fully read. They commit timelines based on assumptions they can no longer pressure-test. The gap between leadership decisions and engineering reality widens every quarter.

I still read pull requests. Not to approve them. To understand how my teams think about problems. I still prototype before I promise a timeline to a VP. The moment I stop understanding how our systems break, I lose the ability to make good decisions about them. Most leaders lost that ability months ago, and their teams already know it.

When was the last time your leadership team touched the system they're making decisions about?

03

The Measurement Problem

Someone saw a headline about AI making developers 3x more productive. Now that's the target. Nobody measured what 1x looked like. Velocity became a growth metric instead of a planning tool. Story points are being inflated to survive expectations that were never grounded in reality.

A 3x improvement in code generation means nothing if your deployment pipeline takes 4 hours. A 3x improvement in ticket throughput means nothing if half those tickets are rework from last sprint. Speed without direction is expensive chaos. The best engineering leaders I've worked with never asked for 3x anything. They asked what was in the way and removed it.

What are you actually measuring, and does it tell you what you think it does?

04

The Atrophy Problem

Engineers directing AI agents all day are producing more output than ever. They're also losing the muscle memory for debugging, for thinking critically through edge cases, for understanding why certain patterns break under load.

AI is a multiplier. But a million times zero is still zero. The engineers who use AI best are the ones who built deep skills first and use AI to extend them. The engineers who skipped the fundamentals are crashing at a higher altitude with a bigger smile. Your team has both. Your leaders need to know the difference.

If AI stopped working tomorrow, could your team still do the jobs they were hired for?

05

The Recovery Problem

Before AI, engineering had natural downtime built into the day. Waiting for builds. Writing mechanical tests. Refactoring that was almost meditative. Those moments weren't wasted time. They were recovery time.

AI eliminated all of it. Every minute is now high-stakes evaluation. Did the model get this right? Does this fit the architecture, or does it just pass the tests? The mental model most leaders carry is that AI saves time, so engineers should have more capacity. The reality is that AI compresses effort and engineers hit cognitive walls earlier. Your team's burnout is compounding, and the dashboard can't see it.

Where in your team's day is nobody asking them to evaluate anything?

06

The Culture Problem

Five engineers using AI five different ways. No shared standards for how to evaluate output. No agreement on when to trust AI and when to override it. Engineers citing ChatGPT in architecture reviews instead of defending decisions in their own words.

AI didn't create this problem. It exposed teams that never had a shared decision-making framework. The teams where AI is working well are the ones that had strong engineering culture before AI showed up. The teams where AI is creating chaos are the ones that were already running on convenience instead of standards. AI just made it visible to everyone at the same time.

Does your team have a shared framework for when and how to use AI, or did everyone just figure it out on their own?

How the Workshop Works

Before the Session

Each participant receives the AI Leadership Audit: a diagnostic document that walks through all six patterns with real scenarios, self-assessment prompts, and questions designed to surface what most leadership teams avoid discussing. They come in having already sat with the uncomfortable parts.

During the Session (4 Hours)

Hour 1

The Diagnosis

Facilitated assessment. Each leader identifies which of the six patterns is most present in their org. The room gets honest about what AI adoption actually looks like on their teams versus what the dashboard shows.

Hour 2

The Evidence

Small group work. Leaders examine the specific systems, habits, and decisions that reveal their real AI leadership gaps. Not what they intend to do. What they actually do. What their engineers actually experience.

Hour 3

The Conversation

The part most teams avoid. Structured dialogue where leaders hear how their patterns are landing with the people building the software. Facilitated so it stays productive. Not personal.

Hour 4

The Rebuild

Each leader leaves with one concrete system change they will implement in the next 30 days. Not a development plan with quarterly milestones. One thing. Specific. Measurable. Visible to their team within a week.

After the Session

30-day follow-up check-in to assess what changed and what didn’t. The audit doesn’t end in the room. It ends when the team notices something is different.

Who This Is For

Engineering leadership teams at companies where AI adoption is happening faster than the leadership practices supporting it. Specifically:

  • VPs of Engineering, CTOs, and Senior Directors leading teams of 30–150+ engineers through AI-accelerated workflows.

  • Leadership teams where AI tools are adopted but adoption feels chaotic, inconsistent, or forced.

  • Organizations where the best engineers are frustrated and leadership can't pinpoint why.

  • HR and L&D leaders looking for programs that address AI readiness and leadership capability simultaneously.

  • Any engineering leader willing to hear what their team has already concluded about how AI adoption is being led.

Who Facilitates This

Jono Herrington

Jono built and led Converse’s global digital engineering org at Nike, scaling the team across the US, Serbia, and India. He’s spent 15+ years building platforms and the teams behind them.

He still writes code. He still reads pull requests. He still prototypes before he promises a timeline. He led AI adoption across a distributed engineering team without a single mandate ... and watched 500+ engineers in an online thread describe what happens when their leaders chose mandates instead.

Every diagnostic pattern in this workshop comes from a failure he experienced first or documented from the front lines of engineering leadership during the AI transition.

Every mirror in this workshop comes from a mistake he made first.

Investment

Half-Day Workshop

$15,000*

Up to 15 participants

  • Pre-session AI Leadership Audit document for every participant
  • Facilitated session with Jono Herrington
  • 30-day follow-up check-in
  • Post-session summary of diagnostic findings and recommendations

Full-Day Intensive

$25,000*

Up to 25 participants

  • All six diagnostic patterns with extended exercises
  • Team-level AI workflow assessment
  • Individual action plans
  • Pre-session AI Leadership Audit document for every participant
  • Facilitated session with Jono Herrington
  • 30-day follow-up check-in
  • Post-session summary of diagnostic findings and recommendations

*Travel and lodging billed at cost for all engagements.

This workshop is typically funded through AI readiness, digital transformation, or engineering enablement budgets.

Stay in the Loop

Want to know when sessions open?

Not ready to reach out yet? Drop your email and you’ll hear when new dates are announced.

Next Step

Get in the Queue

The first session runs July 13. Spots are limited to 15 people per session. Send a note and we’ll have a 20-minute conversation to confirm the diagnostic is the right fit for your team.