The January Window

People come back from holidays with energy for change. They have stepped away from the daily grind long enough to see what has been frustrating them. They are more open to doing things differently.

I think this is a powerful moment for leaders who know how to harness it.

Too often, organizations waste this energy on top-down planning. Targets are set. Priorities are cascaded. New initiatives are announced. Employees are treated as recipients of strategy rather than sources of insight.

I have found there is a better way.

Create space for people to share what they are learning, identify solutions together, and build momentum through peer connection rather than mandate. This is a fundamentally democratic approach to capability building. It signals that you are listening, that you recognize people’s appetite for improvement, and that you trust them to help solve real problems.

When it comes to AI adoption, this matters even more than usual.

What I have seen is that the most valuable AI applications in organizations are rarely sitting in a strategy deck. They are scattered across the workforce in the form of experiments, workarounds, and solutions that most people never hear about.

January gives you a brief opening to harvest that intelligence before everyone gets pulled back into business as usual.

The Problem and Use Case Workshop

Instead of announcing your 2026 AI strategy, start by surfacing the problems your people are facing and exploring solutions together.

Bring together a cross-section of the company for a working session focused on real workflow friction. Share what has already been tried. Identify what is actually working. Generate use cases worth piloting now.

I do not see this as a brainstorm. It is a structured conversation designed to activate the peer networks through which AI capability actually spreads.

Who Should Be in the Room

Small enough for real conversation. Large enough for diverse perspectives.

You need people already experimenting with AI, bringing lived learning. Skeptics and people frustrated with current tools, bringing reality checks. Workflow owners from different functions, bringing context about where work actually happens.

Avoid making it all senior leadership. That kills candor. Avoid making it all individual contributors. That limits follow-through.

Do include people from sales, marketing, operations, customer teams, HR, strategy, and anywhere else repetitive cognitive work happens, with a deliberate mix of seniority levels.

Who Should Facilitate

This works best with a skilled facilitator who can draw out insight without steering the room. If you do not have strong internal facilitation capacity, bring in external support.

The goal is not consensus. It is clarity about what is actually happening and where value might sit.

An Effective Structure

1. Start with real problems
Begin with workflow pain, not AI possibilities.

Where are people copying, pasting, and reformatting information?
What tasks are repetitive, data-heavy, or generative?
Where are bottlenecks quietly eating time and energy?

This grounds the conversation in lived friction rather than theoretical opportunity.

2. Surface and explore use cases
Where are people already using AI, officially or unofficially, to address these problems? What has worked? What failed? What did they learn?

As patterns emerge, dig into specifics.

  • A sales team member shares that post-meeting admin used to take hours. They started using voice notes and AI to generate structured summaries, CRM updates, and follow-ups. The time spent dropped dramatically.

The impact is immediately visible. A task that quietly consumes time across the team becomes easier, faster, and more consistent. A sales lead recognizes that what looked like a personal workaround could address a team-wide bottleneck.

Someone in customer success recognizes the same pattern and asks whether the approach could be piloted across both teams.

  • A department head describes the constant burden of tracking regulatory changes and updating procedures to remain compliant. AI could monitor updates and draft first-pass revisions, with human verification built in.

The opportunity cuts across multiple departments, but it also raises questions about accuracy, oversight, and governance. It is clearly valuable work, but it would need careful testing and explicit verification workflows before being relied on.

  • A strategy lead explains how they are already using AI to synthesize newsletters, reports, and industry publications into weekly insight briefs, turning information overload into actionable intelligence.

What changes is not just speed, but signal. Marketing and product teams immediately see relevance, because the output connects external noise to concrete strategic implications.

  • An HR leader shares a pilot onboarding assistant that answers common questions and directs new hires to the right resources, reducing email traffic and shortening the time it takes for people to feel productive.

Other teams quickly recognize similar challenges in training, internal procedures, and technical documentation. The pattern is not about onboarding alone, but about how knowledge is transferred inside the organization.

Each example reveals not just a use case, but a transferable pattern.

3. Prioritize by value and feasibility
Assess each use case on two dimensions.

Value: time saved, risk reduced, quality improved, probability of achieving business goals increased.
Feasibility: ability to implement using existing tools, skills, and governance.

Start with the high-value, high-feasibility work. These are the quick wins that build confidence and capability.

4. Create peer connections deliberately
The workshop’s real power is not just in the list of use cases. It is in the network activation.

Make it explicit. Who is already doing what? Who wants to learn from them? Who wants to pilot next? Who has the same problem in a different part of the organization?

This is how capability spreads.

What Happens After Matters Most

A workshop only creates value if it produces action.

Before people leave the room, capture the problems surfaced, document prioritized use cases with value and feasibility notes, identify champions for each pilot, and record the peer connections that should continue.

Assign clear ownership and short timelines, usually four to six weeks. 

Schedule a follow-up. What worked? What did not? What should scale, stop, or change?

Create a simple channel for ongoing peer learning. A Slack channel, monthly show-and-tell, whatever fits your culture. Light-touch but consistent.

Why This Works

AI capability spreads through peer influence and solutions to specific problems, not through mandates and policy.

This approach does not roll out technology. It activates the knowledge networks that already exist inside organizations.

I believe January is the best time not for grand announcements, but for listening well and building momentum where it already wants to grow.
