Who’s Telling Your Story?

Your people are not waiting for you to figure out your AI strategy before forming opinions about it.

They're reading the headlines. They're watching other companies announce layoffs alongside AI investment. They're talking to each other. And if you haven't told them how you're thinking about AI - what it means for the work, for their roles, for what you value - they're filling that gap with whatever's loudest in the environment right now.

Right now, what's loudest is fear.

The silence isn't neutral

Most leaders who aren't talking about AI aren't staying silent out of indifference. They're waiting. Waiting until the strategy is clearer. Until there's something concrete to say. Until they have answers rather than open questions.

That instinct is understandable. It's also costly.

When people don't hear from you, they don't sit in comfortable uncertainty. They construct a story from the signals available to them - the industry news, the rumours, the things left unsaid. And in a low-trust environment, the story they construct tends toward the worst version of events.

Recent surveys consistently show employees are more anxious about AI than leaders assume. Adoption is accelerating, but confidence in how it’s being implemented is not keeping pace.

What people actually need to hear

Here's the difficult part. Some of the anxiety is rational.

Job security is an obvious concern, but so is relevance. People worry about whether AI use will be monitored and used to evaluate them, and about whether their judgment still matters when a tool can produce an answer in seconds. These fears don't require a restructuring announcement to take hold - they're already present, and silence feeds them.

Reassurance without substance doesn't hold. Well-intentioned messages that get ahead of what you genuinely know don't land as comfort. They land as spin, and people can tell the difference.

What they need is honesty about how you're thinking - even when the thinking is incomplete.

That means being explicit about your intentions. What do you want AI to make possible? What problems are you trying to solve? What role do your people play in that? If efficiency gains are part of the picture, what happens to the time and capacity that frees up?

It also means being honest about what you don't yet know. Leaders who communicate through uncertainty - "here's what we're working through, here's how we're approaching it, here's how you'll be part of it" - build more trust than leaders who go quiet until they have a polished answer.

You can't lead a narrative you're not in

Your communication networks are already carrying something. The question is whether you're putting signal into them or leaving that to chance.

Think back to Edition 3: communication networks shape what people believe is possible. Trust networks shape what people are willing to try. If those networks are currently carrying anxiety and speculation, the answer isn't better policy documents or an all-hands with slides. It's consistent, honest communication from leaders who are visibly engaging with the reality their people are living.

That means creating space for genuine questions, not just managed messaging. It means being willing to hear what people are worried about and responding honestly - including when the honest response is "I don't know yet, but here's how we're thinking about it."

Psychological safety is the precondition for everything else. People who don't trust leadership intent don't experiment openly. They don't share what's working. They don't flag what's going wrong. The learning stops, and with it, the capability.

That starts with what you say, and when you say it.

Where to start

Most leaders already know more about where they stand on AI than they've shared with their teams. The barrier isn't clarity. It's the discomfort of speaking before you have everything figured out.

That discomfort is worth pushing through. The adaptive capability I described in the last edition - the organizational ability to keep learning as AI keeps evolving - cannot be built in an environment of anxiety and mistrust. Communication is not the soft side of AI adoption. It's the infrastructure.

Next

There is no ‘there’