We've released four new workflows to the team in the last month. Built AI systems for client reporting, creative production, campaign analysis. Sent the links in Slack. Ran the demos. Recorded the walkthroughs.
Nobody's using them.
Not because the team is resistant. They're not. Every single person I've sat down with one-on-one wants to get there. They can see what's possible. They've watched the demos, nodded along, asked good questions. And then Wednesday comes around and they're back to the way they've always worked.
Our head of operations put it perfectly in a Monday planning session: process is outpacing adoption. We're proving out new processes faster than the team can adopt the behaviours behind them. He's right. And the instinct — mine included — is to run more training. More sessions. More documentation. More Slack messages with shiny new tools attached.
That instinct is wrong.
Every person who's crossed over to working with AI properly did it the same way. Not in a training session. Not from a recording. Not from a Slack message with a link to a new workflow.
They sat down with someone, opened a real client project, worked on a real problem with real data, and built something. That's it. That's the only thing that works.
One of our senior account managers said it at the end of a hands-on session this week: "I feel like I've just taken a big step forward — there's a couple of things that are going to be big unlocks for me." Thirty-four minutes. One session. Real client data. Not a slide deck. Not a recording he'd watch at 2x speed and forget by Friday.
Our head of AI and automation has been living this frustration. He's built the workflows, documented the processes, released them to the team. And then he watches people not use them. His conclusion: we need to actually show the team, not just send it in Slack and say "this is a new tool for you to use." He's right. But even "showing" undersells it. Showing creates excitement. Building creates capability.
Training teaches the tool. Building teaches the thinking.
The gap isn't knowledge. Everyone on the team knows Claude exists. Most of them have an account. Some have experimented. The gap is the distance between knowing what AI can do in the abstract and knowing what to ask it for your specific client, your specific data, your specific problem at nine-thirty on a Tuesday morning.
One of our paid media specialists knows the prompts exist. He knows he should be using them. His director put it plainly: he's still running the same Google Ads tasks he was doing when he first started. He knows. He just hasn't crossed over.
That crossing-over moment can't happen in a group training. Our co-founder identified this months ago — the aha moment takes time, and you have to show people individually. He's got smart, motivated people who genuinely want to adopt AI, and he still has to sit with them one-on-one to make it click.
This isn't a criticism of the team. It's a recognition of how the shift actually works. You can't teach someone to think with AI. You can only build with them until they start doing it on their own.
Here's the thing I kept getting wrong in my head. I was thinking about this as an adoption problem across the whole company — how do we get forty-plus people building with AI? But that frames it as if every person matters equally. They don't. Not in terms of value, but in terms of leverage.
Our directors run the company. They hold the client relationships, make the strategic calls, carry the context for every account in their portfolio. They're not managers who delegate and review. They're the ones doing the work — the thinking, the planning, the problem-solving that drives results.
Which means the directors crossing over to Claude Code isn't a nice-to-have. It's the whole game. Because AI amplifies what you already bring. If you've got fifteen years of paid media strategy in your head, Claude Code doesn't just make you a bit faster. It makes you dramatically more capable. You can build a full campaign restructure visualisation during a planning call. You can pull together a client strategy portal from raw data in hours, not weeks. The depth of experience becomes the prompt quality. The strategic judgement becomes the editorial filter.
And it's exponential. A small capability difference at the senior level — one director who's crossed over versus one who hasn't — produces a massive difference in output. Not twenty percent more. Five times more. Because they're applying the tools to higher-leverage problems with richer context.
We have roughly five people who can do this work today. We need that to be fifteen within a month. And the next ten need to be our most senior people — the directors and leads who carry the most context and make the biggest decisions. Then the layer below them. The compounding only works if it starts at the top.
We need to actively be in there working with people. That's the line I keep coming back to. Not sending resources. Not running workshops. Sitting next to someone — physically or virtually — while they build something real, and staying there until they don't need you anymore.
But the priority order matters. Every week a director hasn't crossed over is a week of exponential output left on the table. Not because they're failing — because the leverage of a senior person with these tools is so disproportionate that even a short delay costs more than it appears to.
You can bucket the team. We're starting to do that — separating people whose capabilities are about to accelerate from people still getting the fundamentals right. Different sessions, different pace, different expectations. But the sequencing is clear: directors first, then their leads, then everyone else. Each person who crosses over can pull the next one across.
We're not going to train our way to adoption. We're going to build our way there. One person, one client, one session at a time — starting with the people whose crossing over changes everything downstream.