When the Narrative Outruns Reality

AI washing has been making headlines. Recent debate around Block and Jack Dorsey is one example of how quickly external narratives can move ahead of what is actually visible from the outside. Whether or not any individual case proves the point, the wider leadership risk is real.

It is easy to look at this and see it as a Big Tech problem. The scale may be different. The scrutiny may be different. But the underlying dynamic is not. This can happen in any organization. And when the narrative outruns reality, trust is usually the first thing to go.

The speed at which narratives now move, across media, investor conversations, boardrooms and industry events, means leaders need to pay attention to the gap between what is being said externally and what is actually true internally. Not once, but continuously.

It is worth checking whether your AI story is consistent, credible and defensible - and what it costs when it is not.

That gap rarely opens on purpose. External narratives get shaped by what sounds confident, what signals momentum, what boards, funders or clients want to hear. Internal reality is usually messier: uneven adoption, unclear policies, unresolved questions, uncertainty about strategy, and anxiety about what all of this means for roles.

It does not stay invisible for long.

People notice inconsistency. And when they do, what starts to erode is not just confidence in the AI strategy. It is confidence in leadership.

Getting this right is not about polished messaging. It is about knowing what you actually believe about AI in your organization, being honest about where things stand, and being willing to say the same thing to everyone.

Because the cost is not abstract.

It shows up in motivation, trust and performance. Your best people have options. If they do not believe the story you are telling about AI, they will look for a leader whose story feels more credible. And the people who stay but do not believe it are unlikely to build the capability you need.

Adoption runs on trust and curiosity. You cannot mandate either.

Three questions worth asking, and asking regularly:

If a new hire asked your team to describe your AI strategy, what would they actually say?

Are there questions about AI and roles that you have been avoiding answering directly?

When did you last talk openly about what is not working, not just what is?

Selective storytelling is part of the problem: sharing the wins and saying less about the friction. So is borrowed language that sounds right but does not reflect reality. So is vague positivity that signals enthusiasm without saying anything specific. And so is silence, which people will fill in for themselves.

The most credible position right now is also the most honest one: we are all finding our way here. That is not a weakness to manage around. It is the truth to lead with.

Right now, credibility comes from alignment. From saying the same true thing internally and externally. And from making sure your story can survive contact with reality.
