Key Takeaways

AI Challenges: Executives are often misled about AI's capabilities, ignoring fundamental organizational issues affecting deployment.

Data Issues: The most common barriers to effective AI integration remain lack of executive buy-in, poor data quality, and weak data literacy.

Infrastructure Needs: Successful AI applications require robust infrastructure, often demanding a 12-to-18-month implementation roadmap.

Leadership Alignment: C-suite collaboration is crucial to address shared problems instead of pursuing isolated department strategies.

Operational Discipline: Real, sustainable AI success stems from disciplined operations, not just new technology or tools.

Sometime in the last two years, most C-suite executives convinced themselves they had an AI problem. The board wants a strategy. Competitors are announcing pilots. The CFO is asking about ROI timelines. So they hired a Chief AI Officer, stood up a task force, and started shopping vendors.

None of that addresses the actual problem.

Eric Gonzalez has spent years inside organizations as a fractional chief data officer, helping companies untangle the gap between what executives believe AI will do and what their operations can actually support. 


The Same Three Answers

He tells a story about a panel he ran a few years back where the audience — a room of data and technology leaders — was asked to identify the top obstacles blocking healthy data infrastructure. The results came back as a word cloud. The same three things rose to the surface consistently: executive buy-in, data quality, and data literacy.

"If you asked that question in 2005," Gonzalez said at the Optimized AI Conference in Atlanta, "those same things are going to be plaguing organizations."

He's right. And the implications are more uncomfortable than most executives want to sit with.

The organizational debt that makes AI deployment so difficult wasn't created by AI. It was created by years of fragmented systems, unresolved governance questions, siloed teams, and misaligned incentives that nobody fixed because the cost of fixing them always seemed higher than the cost of working around them.

Then generative AI arrived, and suddenly those same problems became very expensive to ignore.

The pattern Gonzalez describes is consistent across industries and leadership functions. Go to conferences focused on AI, operations, or people, and you’ll likely hear some version of this: Organizations chase a new technology hoping it will resolve problems that are fundamentally about people and process.

It happened with data engineering. It happened when data science was labeled the sexiest job in America. Now it's happening again. The technology changes. The underlying dysfunction remains.

The Challenge of AI

What makes this moment different is the size of the bet. AI investments are larger, faster, and more visible than anything that came before.

When a pilot fails, it's too often because the data feeding it is unreliable, or because no one established clear ownership of the models being deployed, or because two departments were operating on entirely different assumptions about what the tool was supposed to do.

Gonzalez points to a telling example. A healthcare payer with more than 25 million members deployed generative AI not for claims adjudication but for writing denial letters — a compliance requirement that had previously required enormous manual effort. The result was an 80-90% reduction in overhead for that process. No multimodal agent. No enterprise-wide autonomous system. A narrow, well-defined problem with clean enough data to actually solve it.

The unsexy work came first. That's almost always how the genuine wins happen.

The Conditions for Value

The executives making real progress share a particular discipline: they stopped asking where AI can be used and started asking what must be true for AI to deliver value here. The distinction sounds subtle. It isn't. 

The first question leads to a proliferation of pilots that never escape proof of concept. The second question forces an honest accounting of whether the foundation — data quality, governance, cross-functional alignment, clear ownership — is actually in place.

Most of the time, it isn't. And the honest answer to "what must be true" is a 12-to-18-month roadmap that starts with infrastructure no one wants to fund because it doesn't show up in a demo.

This is where C-suite alignment matters more than any individual function's strategy. The CHRO worried about workforce disruption, the CIO managing technical debt, the COO redesigning workflows, the CFO evaluating returns — they are not looking at different problems. They are looking at the same problem from different vantage points. 

When those perspectives don't connect, AI investments often get sequenced wrong, accountability diffuses, and the gap between expectation and result widens until someone's budget gets cut.

Gonzalez frames the binding constraint clearly.

"Technology is rarely what stops organizations," he said. "Political structure, siloed ownership, competing priorities, and misaligned incentives are what stop them. Those are leadership problems. They require leadership solutions, not another vendor, not a better model, not a shinier tool."

Durable progress looks boring from the outside. It's rearchitecting platforms, resolving data governance questions that have lingered for years, making the hard trade-offs that come with centralizing ownership of AI development and deployment. None of it presents well in a quarterly review. All of it compounds.

But AI rewards operational discipline. It doesn't create it.

David Rice

David Rice is a long time journalist and editor who specializes in covering human resources and leadership topics. His career has seen him focus on a variety of industries for both print and digital publications in the United States and UK.
