
Most AI projects don't die from bad technology. They die from good technology meeting unprepared humans.

Last year, MIT research made headlines around the world with the finding that 95% of generative AI pilots fail to deliver measurable ROI. The failure rate has little to do with model capability. Organizations deploy AI tools at breakneck speed with no governance structure, no accountability for training, and no clear purpose communicated to teams. Within weeks, systems break because the humans weren't ready.

People use tools inconsistently, so outputs vary wildly. Quality tanks. Employees become dependent on systems they don't understand, and their core skills atrophy. Under pressure to deliver results, they abandon official tools entirely and turn to whatever personal AI feels easier.

We now know this as shadow AI.

The cost isn't just wasted development time. It's productivity losses, degraded output quality, compliance exposures, and strategic misdirection.

Why AI Pilots Fail

Leaders see AI adoption as a technology deployment when it's actually an organizational transformation.

A deployment is a tech challenge. You gather the right data for the tool you choose, sort out licenses and access, and flip the switch.

Transformation is a human challenge. It touches behaviors, workflows, ways of working, personal identities, and capabilities. The work happens in how people think, decide, and collaborate, not in the tool itself.

I don’t think it’s wise to treat generative AI as a tech deployment. It really is more of a change management exercise because it involves getting people to think differently about how they work and it involves them changing their behaviors. And then eventually you want those behaviors to turn into habits.

Glen Cathey

SVP of Talent Advisory at Randstad Enterprise

When you skip the transformation work, you get chaos disguised as innovation.

The pressure to move fast is real. Everyone assumes their competitors are sprinting ahead, so they skip the hard work of system design and workflow redesign. Some organizations take a cavalier approach from the top, driven by desires to cut costs, increase efficiency, or be seen as cutting edge.

The result? 42% of companies scrapped their AI initiatives in 2025, up sharply from just 17% the year before.

The Three Capability Gaps Killing AI ROI

The skills gap isn't one problem. It's three distinct capability deficits that compound into organizational failure.

1. Technical judgment for deployment decisions

Your team needs to understand when to use the tool and when not to. That's different from knowing how to use it.

Too often, people lack the judgment to evaluate whether AI output is good enough, appropriate for the context, or even answering the right question. They can't distinguish between "the tool generated something" and "the tool generated something useful."

This has nothing to do with SQL skills or version control. It's about developing the discernment to make smart AI deployment decisions in real work contexts.

Taylor Blake, SVP of AI Labs at Degreed, points to a fundamental disconnect.

"The difference between an AI demo and AI in practice can be a huge gap. And you just don't really know until you get hands on and you have to see and feel those problems," he said.


2. Quality standards and evaluation frameworks

Most organizations show people how the tool works. That only scratches the surface.

What's missing is the capability to evaluate outputs against meaningful standards. Without clear quality frameworks, people default to "it produced something, so I'll use it."

The ROI gap matters because your actual return comes from making people better at their highest-value work, not from generating more output. In a world where everyone has access to the same tools, creating an environment where people apply AI strategically differentiates your business outcomes.

3. Creative application and problem-solving

Most training completely fails here. It teaches features instead of inspiring creative problem-solving.

Maybe someone has no problem generating marketing copy once they have creative direction. But when it comes to analyzing options and deciding on a direction, they get stuck. That's their bigger opportunity, the place where AI could reshape their approach to work.

Showcasing different features to inspire creative application helps employees tackle their most time-consuming challenges. That's where transformation actually happens.

The Identity Crisis You're Ignoring

Personal identity doesn't show up in most AI rollout plans. It should.

When AI starts handling parts of someone's role, it creates profound insecurity and anxiety. People fear what will happen if they don't adapt. That fear manifests in two destructive ways.

1. They resist or sabotage the company's AI efforts. Not overtly, but through passive non-adoption, workarounds, and quiet undermining of official systems.

2. They turn to tools that feel easier to use, regardless of whether those tools actually serve the business need. This creates the shadow AI problem: 90% of workers use personal AI tools like ChatGPT daily for job tasks, while only 40% of companies have official LLM subscriptions.

The data on employee anxiety is staggering. 65% of employees are anxious about AI replacing their job. About two-thirds are concerned about not knowing how to use AI ethically. This anxiety directly undermines adoption, with up to 70% of AI-related change initiatives failing due to employee pushback or inadequate management support.

Justin Angsuwat, Chief People Officer at Culture Amp, has observed something counterintuitive in his organization's rollout of an AI coach.

Sometimes assuming that the senior or high performing folks would be the first to jump on and kind of master AI is an assumption that’s not always coming to fruition because I think unlearning how you’ve done things is actually pretty hard. If you’ve been doing the same thing for 20 years, part of your identity is wrapped up in how you get to that answer.

Justin Angsuwat

Chief People Officer at Culture Amp

The paradox is brutal: people are simultaneously anxious about falling behind and actively undermining the systems designed to help them succeed.

Doing the Human Work First

For a COO or CHRO about to roll out AI tools, the intervention starts with communication and trust-building.

Employees need to understand why you're doing it this way, what they will gain from it, and how this shapes their future—both with the company and professionally.

They may still not buy in completely. But transparency creates the foundation for genuine adoption.

Angsuwat's team focused on building confidence before worrying about perfect implementation.

"Our goal was to improve employee confidence using AI because it's kind of hard to measure an aha moment," he explains. "It was intentionally about learning, trying things, just having a crack, really. It wasn't about delivering some polished outcome, which really kind of took the pressure off."

The reframe that matters is to help people see working with AI as an opportunity to develop new skills and rethink what their core contribution is. Not as a threat to their current role, but as a chance to evolve into something more valuable.

But training alone isn't enough. Angsuwat discovered this the hard way.

"What was interesting was even as we went through this six week program, some people followed along and they would do the little prompts, they go create their computer game, and then you'd go into a regular day to day workflow. And then they were intimidated again."

This requires leadership to do the hard work upfront before deployment, not after problems emerge.

Cathey emphasizes the need to create space for experimentation.

"I think it's really important for companies to recognize that when there is change involved, people are going to have to slow down to speed up. No one is going to go from a newbie to an expert in a day. It takes a process and you have to give your people space and time to experiment safely."

The Governance Blind Spot

When leadership unleashes tools without clear ownership, accountability structures, or visibility into usage patterns, you create a free-for-all.

Most companies don't track what work people are actually doing with the technology or how they're using it. That's a recipe for a governance crisis.

Without an AI governance framework, organizations face reputational damage from inconsistent or problematic outputs, loss of customer trust when quality degrades, financial losses from wasted investment and rework, and regulatory penalties from compliance failures.

The operational danger is immediate. Without clear governance structures, you can't assign responsibility when AI systems fail or lead to negative consequences. Establishing accountability is essential for addressing issues and improving systems over time.

The skills gap goes beyond individual capabilities. It's rooted in the absence of organizational systems that create visibility, accountability, and continuous improvement.

The $5.5 Trillion Question

Over 90% of global enterprises will face critical skills shortages by 2026. The projected losses from sustained skills gaps: $5.5 trillion in global market performance.

That might sound theoretical or sensational, but 94% of leaders face AI-critical skill shortages today, according to the World Economic Forum. One in three reports gaps of 40% or more.

The competitive pressure is real, but the rush to deploy without building capability is creating a bigger problem. Organizations that pursue external partnerships achieve deployment success rates of 67% compared to 33% for internal builds. The "build it ourselves" mentality that worked for traditional software actively sabotages AI success.

Meanwhile, 75% of organizations report they are either nearing, at, or past the change saturation point. The average employee experienced 10 planned enterprise changes in recent years, up from just two in 2016. More than half of IT professionals say they've accelerated AI rollout over the last 24 months.

The human cost of moving too fast compounds the technical failures.

4 Steps to Stop Your Pilots from Failing

If you're a COO or CHRO about to deploy AI, here's what needs to happen before you flip the switch.

1. Define the transformation, not just the deployment

Map which workflows will actually change. Identify whose roles will be affected and how. Define what success looks like beyond "tool is deployed" and establish clear ownership for training, governance, and ongoing support.

2. Build the communication infrastructure

Articulate why you're adopting this specific approach. Explain what employees will gain professionally and address identity and anxiety concerns directly. Create feedback loops to surface problems early.

3. The first 30 days: create capability-building systems

Go beyond "here's how it works" training to "here's how to think with it." Help people identify their highest-value problems to solve, and establish quality standards and evaluation frameworks. Showcase creative applications that inspire strategic use.

Give people permission to slow down initially. Recognize that behavioral change takes time, and people need space to experiment without fear of productivity penalties.

4. Ongoing: establish governance and visibility

Track actual usage patterns and outcomes. Create accountability structures for quality and compliance. Build feedback mechanisms to improve the system. Address shadow AI before it becomes a crisis.

Critical reality check: Only 15% of US employees report that their workplaces have communicated a clear AI strategy. If your team can't articulate why you're doing this and how it benefits them, you're already behind.

The Bridge You Need to Build

Adoptions are failing because organizations are deploying tools faster than they're building the human infrastructure to govern, implement, and leverage them effectively.

The capability mismatch is plain to see, and the costs are measurable as competitive pressure continues to mount. But the solution isn't to slow down AI adoption. It's to build human capability development in parallel. That means treating this as the organizational transformation it actually is, not as a tech project with a go-live date.

Get it right, and you turn AI into durable competitive advantage. Skip the human work and you join the 95% whose pilots never deliver ROI.

Leaders act like they're choosing between speed and readiness. In reality, they're choosing between sustainable transformation and expensive failure.

David Rice

David Rice is a long time journalist and editor who specializes in covering human resources and leadership topics. His career has seen him focus on a variety of industries for both print and digital publications in the United States and UK.
