
Getting Your Team to Think Strategically About AI

Summary:

Most teams respond to AI with tactics – a demo here, a pilot there. Strategic thinking is a skill that can be taught, starting with three deceptively simple questions.

I was enjoying the coffee and the great conversation. A colleague and I were catching up – she’d recently moved into a bigger leadership role, and we were talking through the adjustments. The classic stuff: she missed the chance to do the actual work, her calendar was now wall-to-wall management meetings, and she was spending more time writing performance plans than solving interesting problems. But she was doing well with it. Smart, capable, figuring things out on her own the way good leaders tend to.

Then she asked me a question that stopped me cold.

“How do you get your team to think strategically?”

Simple question. I didn’t have a simple answer. In fact, I found myself getting quieter and quieter, trying to figure it out for myself. I’d always just let people find their own level – focus on what they knew, deliver their piece of the puzzle, and trust that the strategic thinking would happen somewhere above them in the org chart. Her question made me realize I hadn’t invested the right time with my own teams on this, and that was an uncomfortable thing to admit over coffee.

I’ve been thinking about that conversation a lot lately, because the question has gotten significantly more urgent. “Think strategically” used to mean something like “connect your daily work to the bigger business objectives.” That’s still true, but now there’s an AI-shaped layer on top of everything, and most teams are responding to it tactically – someone sees a demo, someone runs a pilot, someone builds a chatbot for customer service. These aren’t bad instincts, but they’re plays without a game plan. And if the first three articles in this series have made anything clear, it’s that starting with the technology instead of the change is how you end up with impressive demos and disappointing results.

Strategic thinking about AI is a skill. It can be taught. And right now, most organizations aren’t teaching it – they’re just hoping it happens on its own.

Tactics Without a Game Plan

Let me describe what tactical AI looks like, because you’ll probably recognize it. Someone on the team watches a webinar about generative AI and comes back excited. A department head sees a competitor’s press release about an AI initiative and wants to know why we’re not doing that. The IT group sets up a sandbox environment and invites people to experiment. A vendor shows up with a slick demo that promises to automate something painful. All of this activity feels productive. It generates energy, it fills meeting agendas, and it gives leadership something to point to when the board asks “what are we doing about AI?”

But it’s tactics. It’s individual plays with no game plan connecting them. Nobody’s asked the harder questions: which of these experiments actually connects to a business outcome we care about? What would have to change in our operations, our customer relationships, or our data infrastructure to capture real value? And what are we choosing not to pursue, so we can focus our limited time and attention on the things that matter most?

There’s a classic Harvard Business Review article by Collis and Rukstad – Can You Say What Your Strategy Is? – that lays out the simplest framework I’ve ever seen for distinguishing strategy from tactics. Strategy answers three questions: what’s your objective, what’s your scope, and what’s your advantage? Or even simpler: what you do, where you play, and how you win. Tactics are the plays you run inside that frame. Without the frame, you’re just running plays and hoping they add up to something.

That’s what I see in most organizations right now. Lots of AI plays. No AI strategy. And the teams running those plays are working hard, producing results, and genuinely trying to do the right thing – but nobody’s given them the strategic frame to evaluate whether what they’re building actually matters to the business. It’s the same pattern I’ve seen with every technology wave, and the fix is the same too: teach people to think strategically before you ask them to act tactically.

What You Do, Where You Play, How You Win

The beauty of the Collis and Rukstad framework is that it works at every level – company, division, team, even individual contributor. And when you layer AI on top of it, the questions get specific enough to be genuinely useful. If you’ve been following my work on the Five Building Blocks, you’ll see them show up naturally here – not as a separate framework, but as the lenses that make each strategic question sharper.

What you do is your Products and Services – not how you fulfill them, but what people actually pay you to do or provide. This is the most fundamental strategic question, and it’s the one AI is reshaping most dramatically. A manufacturer whose equipment can predict its own maintenance needs is offering something fundamentally different from one that sends a tech when something breaks. A professional services firm that delivers AI-augmented analysis alongside its consulting is selling a different product than one that sells hours. The strategic AI question here is: could AI change what we are to our customers, not just how efficiently we deliver it?

Where you play is your Customers – and by customers I mean the whole ecosystem: markets, channels, partners, the relationships that connect you to the people who pay you. The strategic AI question here is: what do our customers actually need from us that they’re not getting, and could AI help us deliver it? Not “can we put a chatbot on the website” but “do we understand our customer relationships well enough to know where AI could genuinely improve the experience?” If your sales team can’t articulate why customers choose you over the competition, AI isn’t going to figure that out for them. But if they can articulate it, AI might help them deliver on that promise at a scale they couldn’t reach before.

How you win is where the Building Blocks really stack up, because your competitive advantages tend to layer across several of them. Your products might give you a defensible moat – unique capabilities protected by IP that competitors can’t easily replicate. Your Operations might be the differentiator – the ability to deliver faster, with better quality, with richer information alongside the product than anyone else in your space. Your people might be the advantage – deep relationships with customers, institutional knowledge that takes years to build, the kind of trust that doesn’t transfer when someone changes vendors. And Data underpins all of it – you can’t execute on any of these advantages if your data is trapped in silos, your systems don’t connect, and your team can’t get to the information they need when they need it.

I worked with a mid-market manufacturer that had their production planners spending half their week manually reconciling data between two systems that didn’t talk to each other. An AI tool could automate that reconciliation in minutes. But the strategic insight wasn’t the automation – it was what happened after the automation. Those planners suddenly had half their week back, and the question became: what should they be doing with that time? The tactical answer is “more of the same stuff.” The strategic answer is “they should be analyzing production patterns and making recommendations that nobody has time for today.” Same tool, completely different value depending on whether you’re thinking tactically or strategically. That’s an operations advantage – execution excellence – and AI didn’t create it. AI just made it possible to actually pursue it.

The Building Blocks don’t map to Collis and Rukstad in neat little boxes, and that’s actually the point. Strategic thinking means seeing how the pieces connect across those boundaries, not just optimizing inside each one.

It’s just as important to know what you don’t do. We can’t be all things to all people, and AI makes it tempting to try because the barrier to entry is so low. Every department can spin up a pilot. Every team can find a use case. But strategic thinking means prioritizing – focusing your limited time, attention, and organizational energy on the things that create the most value, and having the discipline to say no to the things that don’t, even when they’re shiny and easy to demo.¹

Teaching the Skill

My coffee companion’s insight, the one that stuck with me, was deceptively simple: her team was comfortable thinking tactically because nobody had ever taught them to think any other way. They weren’t resisting strategic thinking – they’d just never been shown what it looked like or given the space to practice it.

This is where most organizations fall down. They hire smart, capable people, put them in roles that reward tactical execution, promote the ones who execute best, and then wonder why nobody’s thinking strategically. You get what you build for, and most organizations are built for tactics.

So how do you actually teach strategic thinking about AI? Not in a classroom, and not with a framework presentation (though I’ve just given you a good one). You teach it by changing the questions you ask in the conversations you’re already having.

Start with the 80/20 question. When someone brings you an AI use case – and they will, because everyone’s got one – ask them: “Of all the things we could do with AI, is this in the 20% that would drive 80% of the value?” Most of the time, the honest answer is no. That’s not a reason to kill the idea, but it’s a reason to put it in context. The discipline of asking “is this the highest-value thing we could be working on?” is a strategic skill, and every time you ask it, you’re teaching your team to ask it themselves.

Then ask the “what changes” question. For any AI initiative that passes the 80/20 filter, push the conversation past the technology: “If this works exactly as planned, what changes in how we operate? Who’s affected? What processes have to be redesigned? What data do we need that we don’t have today?” This is the start-with-the-change discipline from Article 1, embedded into your team’s thinking process. You’re not asking them to build a business case – you’re asking them to think through the implications before they fall in love with the solution.

And ask the “what are we not doing” question. This is the hardest one, because saying no to an AI initiative in 2026 feels like saying no to the future. But every yes costs attention, resources, and organizational energy. Every pilot that doesn’t connect to a strategic outcome is a distraction that makes the real work harder. Teaching your team to evaluate what not to pursue is just as valuable as teaching them to identify what to pursue – maybe more so, because the temptation to do everything is stronger than it’s ever been.

The Conversations Nobody Wants to Have

Here’s the thing about strategic thinking: once you start doing it well, it surfaces uncomfortable truths. That’s actually the point – but it doesn’t make the truths any easier to sit with.

When your team starts asking “what you do, where you play, how you win” in the context of AI, some of the answers are going to be hard. Some processes aren’t worth saving – they exist because of limitations that AI eliminates, and the strategic move is to let them go, not automate them. Some roles are going to change significantly, not because AI replaces people but because the knowing work that filled their days gets absorbed by tools, and what’s left requires a different set of skills. Some AI opportunities are going to require capabilities your team doesn’t have yet, which means either developing those capabilities or bringing in people who have them – and both of those conversations are loaded.

This is where everything in this series connects. You can’t have these conversations if you started with the technology instead of the change, because you won’t have the strategic frame to explain why the hard decisions are necessary. You can’t have them if your team doesn’t understand their own work deeply enough to distinguish what’s essential from what’s just familiar. And you certainly can’t have them without empathy for what it feels like to hear that the work you’ve been doing for years is about to change in ways you didn’t ask for and can’t fully predict.

Strategic thinking isn’t just about identifying opportunities. It’s about creating the conditions where your team can look at those opportunities honestly – including the ones that are scary – and make good decisions together. That requires trust, it requires understanding, and it requires a leader who’s done the internal work to lead the conversation with both clarity and compassion.

Where This Goes Next

So far, we’ve moved from mindset to action. Start with the change. Understand the gap between knowing and understanding. Lead with empathy. And now, teach your team to think strategically about where AI fits in your business.

But thinking and doing are different things. At some point, someone on your team is going to have to build something – run a pilot, test a tool, try an idea. And the environment you create for that experimentation determines whether it produces real learning or just expensive demos. In the next article, we’ll talk about what it takes to build an environment where AI innovation actually happens – and why most organizations get it wrong by making it either too safe or too risky.

If you’re working through how to lead AI into your business and want practical frameworks – not vendor hype – join our mailing list for the rest of this series and more.

24 February, 2026

¹ Working through this “what you do, where you play, how you win” framework for your specific business is exactly the kind of conversation JazzAI is designed for.
