Key Takeaways

AI Leadership: Companies leading in AI adoption see 1.5x higher revenue growth over three years; 74% of organizations haven't shown tangible value yet.

Usage Metrics: Half of companies using AI have no insight into its workforce impact, which calls the business value of their spend into question.

Workflow Bottleneck: Economic benefits come from redesigning workflows, not just adopting AI tools.

Competitive Threat: AI-centric companies outperform by cutting coordination layers and raising output per employee; mid-market firms must adapt.

Improvement Rate: Organizational capacity to learn and adapt matters more than starting point in AI adoption.

In most boardrooms, AI lives somewhere between the IT budget and the innovation team's quarterly update. It gets a few slides. Maybe a demo. Leadership nods, asks about adoption rates, and moves on to margin pressure and headcount planning.

The people running these meetings aren't unintelligent. They're running a playbook that worked for decades. Revenue target goes up 10%, headcount goes up 8%, budget scales proportionally. Growth equals more people, more spend, more capacity.

While it's true that AI has become a scapegoat for layoffs, it's also true that the technology is breaking that traditional equation, and most leadership teams haven't caught up.


The AI-ready Leadership Gap is Measurable Now

Boston Consulting Group surveyed more than 1,000 C-suite executives across 59 countries and found that companies leading in AI adoption have achieved 1.5 times higher revenue growth, 1.6 times greater shareholder returns, and 1.4 times higher returns on invested capital over three years compared to companies still experimenting.

Meanwhile, 74% of organizations have yet to show tangible value from their AI investments.

The Conference Board's 2026 C-Suite Outlook Survey surfaced something more revealing: 98% of board members identified measuring AI ROI as a priority, compared to just 33% of CEOs. Boards increasingly see AI as a capital allocation question. Many CEOs still treat it as strategic exploration.

That disconnect explains a lot. When AI sits under IT, it stays a tools conversation. When it shows up alongside revenue growth and margin protection on the board agenda, it becomes something else entirely: an operating model conversation.

The difference between those two framings is where the performance gap starts.

Join the People Managing People community for access to exclusive content, practical templates, member-only events, and weekly leadership insights—it’s free to join.


Usage Metrics are a Dead End

There's a predictable maturity curve that companies go through. Stage one tracks licenses distributed, active users, number of prompts, training completion rates. These metrics feel concrete. They're politically safe. They fit neatly into a slide deck.

Then someone asks what any of it means for the business.

That question usually causes confusion. Not because the answer doesn't exist, but because the organization hasn't built the connective tissue between AI activity and financial outcomes.

How are you going to use AI to create value? Not what’s the ROI of AI. Those are two very different questions. And a leader needs to understand what the difference is between those two.

Charlene Li, Strategic Advisor and Founder of Quantum Networks Group

Most don't yet. The ROI question invites spreadsheet logic. The value creation question forces you to rethink how the business actually works. Heidi Farris, CEO of workforce analytics firm ActivTrak, sees the measurement failure across hundreds of companies.

They measure activity instead of behavior change. Companies track logins, queries, and seat licenses, and call it an AI measurement program. That’s not measurement, that’s hope.

Heidi Farris, CEO of ActivTrak

ActivTrak research found that 50% of companies using AI aren't measuring its workforce impact at all, meaning half the market has no visibility into whether their spend is producing results.

The leadership teams that push past this point stop trying to measure company-wide adoption. They anchor AI to a small number of high-leverage workflows and start tracking what changes at the economic level.

In marketing, that means campaign production cycle time or output per marketer. In support, it means cost per ticket and escalation rates. In services, it means delivery cycle time and revenue per employee.
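As a sketch, tracking one of those workflows at the economic level might look like the following. All figures here are hypothetical, invented for illustration, not taken from the companies or research cited in this article:

```python
# Hypothetical pilot numbers for a support workflow: compare a baseline
# quarter against a quarter with AI-assisted triage in place.
baseline = {"tickets": 1200, "support_cost": 54_000, "avg_cycle_hours": 9.5}
pilot    = {"tickets": 1350, "support_cost": 48_600, "avg_cycle_hours": 6.2}

def cost_per_ticket(period: dict) -> float:
    """Total support spend divided by tickets resolved in the period."""
    return period["support_cost"] / period["tickets"]

# Relative change vs. baseline; negative values mean improvement.
cost_delta = cost_per_ticket(pilot) / cost_per_ticket(baseline) - 1
cycle_delta = pilot["avg_cycle_hours"] / baseline["avg_cycle_hours"] - 1

print(f"Cost per ticket: {cost_delta:+.0%}")
print(f"Cycle time:      {cycle_delta:+.0%}")
```

The point of the sketch is the unit of measurement: money and time per unit of work, not logins or seat licenses.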

PwC's 2025 Global AI Jobs Barometer analyzed nearly a billion job ads and thousands of company financial reports and found that industries most exposed to AI saw revenue-per-employee growth of 27% between 2018 and 2024, roughly three times the growth in least-exposed industries. Since GenAI's emergence in 2022, productivity growth in those exposed industries has nearly quadrupled.

That data tells a story. But revenue per employee doesn't improve because you hand people a chatbot. It improves because someone redesigned the work.

The Real Bottleneck is Workflows

Tool deployment without workflow redesign is the most common and most expensive mistake companies make with AI. A team gets access to an AI tool. They use it to do the same work slightly faster. The productivity gain is real but marginal, and it rarely shows up in the financial model because nobody restructured the process around it.

An RGP survey of 200 U.S. CFOs found that 66% expect significant AI ROI within two years, but only 14% report meaningful value today. That gap doesn't close by distributing more licenses. It closes when leadership starts asking different questions in planning meetings.

Instead of "how many people do we need to hit the number," the question becomes "what portion of this work should still require a person at all." That reframing changes the entire planning conversation. It affects headcount forecasting, role design, capital allocation, and pricing strategy.

Leaders who take this seriously fund process redesign, not just tools. They allocate budget for change management and enablement alongside software licenses, because they recognize that the bottleneck was never the technology. It was the workflow.

The Competitor You Should Worry About

The competitive pressure tends to announce itself quietly. A newer entrant ships faster, prices more aggressively, or responds to customers at a speed that doesn't make sense under traditional cost assumptions.

The first instinct is to explain it away. They're burning VC cash. They're cutting corners on quality. Their model won't scale.

Sometimes those explanations are right. But increasingly, the real answer is structural. Companies built around AI-assisted workflows from day one carry less coordination overhead, fewer management layers, and fundamentally different unit economics.

That distinction matters more for mid-market companies than for anyone else. Enterprise playbooks assume you can throw capital and headcount at the problem. Startups can rebuild from scratch. Mid-market companies sit in between, often running legacy operating models while competing against organizations that never inherited them.

The productive response isn't panic hiring or slashing headcount. It's running honest pilots aimed at economic compression: shorter cycle times, lower error rates, fewer coordination layers, more output per employee.

The goal is to learn whether your operating model can absorb meaningful AI integration, or whether the model itself needs to change.

What Separates AI-Ready Leaders

Executive teams aren't failing because they lack intelligence or ambition. They're failing because their instincts are calibrated for a different kind of change.

Executives who came up managing linear growth in stable systems default to incrementalism. When AI presents a nonlinear opportunity, they unconsciously shrink it down to something manageable. "Let's pilot." "Let's experiment." "Let's monitor."

These are reasonable responses in normal conditions. But discontinuities compound faster than incremental thinking can respond.

There's also a structural blindness at play. Executives think in functions and departments. AI transforms tasks. If you never break roles down into component tasks, you can't see where the leverage actually exists.

An analyst's job looks like one thing from an org chart perspective. Broken into its constituent tasks, some of those tasks are prime candidates for AI augmentation, and the role that remains after that redesign might look very different.

BCG's research reinforces this: 62% of AI's value comes from core business functions like operations, sales, and R&D, not from support functions where most companies start their experiments. Focusing AI on the periphery of the business will yield peripheral results.

What Compounding Looks Like

Organizations pulling ahead aren't just more efficient. They're improving at a faster rate, and that rate difference compounds.

  • Early movers redesign workflows. Those redesigned workflows produce shorter cycle times and faster feedback loops. Faster feedback means more learning cycles. More learning cycles mean quicker improvement. Late movers can copy the tools, but they can't copy two years of workflow adaptation and operational muscle memory.
  • Revenue per employee rises. This creates strategic optionality. The company can reinvest margin into growth, compete more aggressively on price, or attract better talent. That optionality itself compounds.
  • Decision velocity increases. This one gets overlooked. AI-mature teams use AI in strategic planning, analysis, scenario modeling, and synthesis. That reduces friction in executive decision-making. If one company makes strategic decisions 30-40% faster and iterates on feedback, the advantage builds quarter over quarter.

The gap widens because their organizational learning speed is structurally higher.
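A toy calculation makes the compounding concrete. The 2% and 5% quarterly improvement rates below are assumptions chosen for illustration, not figures from the research cited above:

```python
# Illustrative only: two companies improving at different quarterly rates.
def output_after(quarters: int, quarterly_gain: float, start: float = 1.0) -> float:
    """Relative output index after compounding a per-quarter improvement rate."""
    return start * (1 + quarterly_gain) ** quarters

late_mover = output_after(8, 0.02)   # improves 2% per quarter
early_mover = output_after(8, 0.05)  # improves 5% per quarter

# Relative gap after two years (8 quarters) of compounding.
gap = early_mover / late_mover - 1
print(f"After two years the early mover is ~{gap:.0%} ahead")
```

A three-point difference in quarterly improvement rate opens a roughly 26% output gap in two years, and the gap keeps widening from there.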

Where Serious Leaders Start

When a CEO moves from curiosity to commitment, the first action usually isn't buying anything. It's declaring a specific economic outcome and assigning accountability.

Something like:

  • "We are going to increase revenue per employee by 25% over 18 months."
  • "We are going to reduce service delivery cycle time by 40%."

That kind of declaration ties AI to a financial metric that shows up in board reporting. It forces workflow redesign because you can't hit those targets with tools alone.
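A quick back-of-the-envelope shows what the first of those example targets demands, using the 25%-over-18-months figure from the declaration above:

```python
# A 25% revenue-per-employee lift over 18 months = 6 quarters of compounding.
target_lift = 1.25
quarters = 6

# Per-quarter improvement rate needed to compound to the target.
quarterly_rate = target_lift ** (1 / quarters) - 1
print(f"Required improvement: ~{quarterly_rate:.1%} per quarter")  # ≈ 3.8%
```

Sustaining nearly 4% improvement every quarter is the kind of pace that forces process redesign rather than tool distribution.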

The second move is structural. One executive, often the COO or a newly created role, gets enterprise-level ownership of workflow redesign across functions. Their mandate is to map how work actually gets done, break roles into tasks, identify automation leverage points, redesign the flow, and measure economic impact.

The third move is cultural, and it's the one that sticks. The CEO changes the default questions in planning meetings. Instead of "what headcount do you need," they ask "what would this function look like if AI were embedded by default" and "what part of this work is uniquely human." When leadership consistently asks those questions, the organization adapts.

The Rate of Improvement Matters More Than the Starting Point

There's a tendency to look at the performance data and feel like the window has closed. It hasn't. BCG's research shows that even among leading companies, the capabilities are still developing. It doesn't really matter who started first. Right now, it's about who is building the organizational capacity to learn and adapt faster.

But capacity doesn't build itself. It requires leadership teams willing to question the architecture of how their company creates value, not just optimize what already exists.

That's the real capability gap. The willingness to model nonlinear change, tolerate strategic ambiguity, and redesign rather than optimize.

We can't think of AI-ready leadership as a credential or a title. It's a posture toward the business. The leaders who have it are already redesigning their AI operating models. The ones who don't are still measuring adoption rates and hoping the performance gap stops widening.

The numbers suggest it won't.

David Rice

David Rice is a longtime journalist and editor who specializes in covering human resources and leadership topics. He has covered a variety of industries for both print and digital publications in the United States and UK.
