Governance Shift: AI governance is evolving from compliance to a competitive advantage, affecting procurement and development.
Agentic Demand: By 2026, 40% of enterprise applications will integrate task-specific AI agents, requiring new governance frameworks.
Market Disparity: Despite projected growth, the AI governance market remains small, leaving organizations to address infrastructure gaps themselves.
Billions are flowing into AI infrastructure and algorithms. But the part of transformation that actually determines success for organizations — people, processes, governance — barely registers.
The question at several sessions during HumanX in San Francisco last week wasn't whether artificial intelligence would reshape how companies operate. We’re already seeing that happen. Instead, the focus was on where to place the next bet.
At a session focused on early-stage investment, panelists named the companies they were most excited about: a video intelligence platform, an autonomous defense boat manufacturer, a consumer wellness app. Not one touched the space where most AI transformations actually fail.
That, in itself, is a market signal.
According to Menlo Ventures' 2025 State of Generative AI in the Enterprise report, U.S. enterprises spent $37 billion on generative AI in 2025, a 3.2x increase from the $11.5 billion spent in 2024. The largest share, roughly $19 billion, went to the application layer — software and products built on top of underlying models.
Infrastructure absorbed the rest. Rodrigo Liang, co-founder and CEO of SambaNova Systems, described inference demand as a market whose growth is "yet to really come." Micron Technology alone has committed over $25 billion in capital expenditures to address the memory shortage AI's buildout has created.
Meanwhile, the global AI governance market — the tooling that helps organizations manage how AI actually operates inside their workforce — was valued at $308.3 million in 2025, according to Grand View Research. Less than 1% of what enterprises spent on generative AI overall.
A $308 Million Answer to a $37 Billion Problem
Boston Consulting Group's research on AI adoption found that 74% of companies are struggling to achieve or scale value from AI despite years of investment. The primary reason is that roughly 70% of implementation challenges come from people and process-related issues. Technology problems account for 20%. Algorithms are the layer attracting the most capital, but only account for 10%.
The pattern among companies that are actually scaling results follows the logic of the old 70-20-10 model: leaders put 70% of their investment into people and processes, 20% into technology and data, and 10% into algorithms.
That framework, rooted in organizational research at the Center for Creative Leadership and applied to AI transformation by BCG, has become a reliable diagnostic for why enterprise programs stall. Most organizations are spending on the 10% and 20% and wondering why the 70% isn't moving.
The venture market reflects the same distortion. The investors at that HumanX panel weren't ignoring the organizational side of AI; they simply weren't funding it.
Change management infrastructure, workforce transition support, and governance tooling are all hard to productize. Organizational change requires specificity. It resists subscription packaging.
What the Conference Floor Confirmed
A separate HumanX session on AI governance made the consequences concrete. Navrina Singh, founder and CEO of Credo AI, has watched this pattern long enough to name it directly.
"Many companies, when we speak to them, are like, 'Let's wait for an incident to happen, and then we'll invest in AI governance if we need to.' Guess what? By that time, they're going to be irrelevant."
Singh, who works across Fortune 500 clients in ten sectors, argues that governance is shifting from a compliance consideration to a competitive one.
Saahil Jain, CTO of You.com, put it more bluntly.
Governance isn't "a nice to have, but a critical feature," he said. "We're very enterprise focused, so in some ways, I view it as a necessary but insufficient condition to driving adoption among enterprises in a global market."
Both described a procurement environment where Fortune 500 buyers, having spent the past two years prototyping AI vendors, are now scrutinizing whether those vendors can be trusted with enterprise data and systems.
Singh noted the governance function itself is changing shape, moving from centralized compliance teams to embedded practice, with builders incorporating guardrails at the point of development rather than governance teams adding oversight after the fact.
That shift matters because it means the demand for governance tooling is broadening, reaching engineering and product functions that previously sat outside the conversation.
The Agent Problem
The urgency sharpens around agentic AI. Gartner projects that 40% of enterprise applications will be integrated with task-specific AI agents by the end of 2026, up from less than 5% today.
Liang, whose company builds inference infrastructure for exactly this environment, described what that transition demands. As agents orchestrate work autonomously across systems, enterprises will need entire ecosystems to certify outputs.
"Industries like banking, insurance, governments, you have to produce certifiable outputs," he said. "You need to be able to trace it."
That infrastructure barely exists as a commercial product.
Grand View Research projects the governance market will reach roughly $3.6 billion by 2033, a 36% compound annual growth rate that reflects genuine demand. But the starting point is so small, and the agentic deployment curve so steep, that market growth alone won't deliver what organizations need on the timeline they actually face.
BCG's 74% isn't a mystery against that backdrop. The vendor market has optimized for what's scalable and sellable (compute, models, applications) and left the organizational infrastructure of AI transformation largely for companies to build on their own.
Leaders treating governance as foundational now, before the market matures, aren't just managing risk. They're building something they won't be able to buy.
