A new SHRM report puts a number on a problem that's been building for two years: 57% of HR professionals working in states with workforce-related AI laws say they aren't aware those laws exist.
It is a striking gap, and it signals that AI governance has entered a harder phase: the laws are on the books, but the people expected to comply with them don't know they exist.
The finding comes from SHRM's 2026 State of AI in HR report. As states including Colorado, Illinois, and New York have moved to regulate how employers use AI in hiring and employment decisions, the assumption inside many organizations has been that legal would track the laws and HR would implement the policies. What the SHRM data suggests is that neither is happening with any consistency.
Ravin Jesuthasan, a future of work strategist who has spent years studying how organizations distribute accountability for workforce decisions, was direct when asked where AI compliance sits inside today's typical enterprise.
"I don't think most organizations have an answer yet," he said. "The problem is structural. A lot of it is falling through the cracks between some of the legacy functions."
That gap is compounded by shadow AI. Many employees are using general-purpose AI tools to inform or assist in employment decisions, often without any visibility from HR or legal.
When those tools touch screening, performance evaluation, or promotion, they may trigger disclosure and bias-testing requirements under state law. If leadership doesn't know the tools are being used, disclosure becomes functionally impossible.
Laws Going into Effect
Colorado's AI Act, taking effect June 30, 2026, requires employers using high-risk AI systems in employment decisions to conduct impact assessments and notify affected individuals.
The Illinois Artificial Intelligence Video Interview Act mandates disclosure when AI analyzes candidate interviews.
New York City Local Law 144 requires bias audits for automated employment decision tools.
Each carries its own definitions, thresholds, and enforcement mechanisms, creating a patchwork of obligations that no single function inside most organizations is equipped to track alone.
Accountability and Governance
The accountability question is where things get particularly tangled. Legal understands the regulatory landscape in the abstract. HR owns the employment decisions. Technology manages the tools. What the SHRM data implies is that these three aren't connecting into anything coherent.
Jesuthasan sees some organizations beginning to adapt.
"You're seeing not just new roles and functions emerge, but some of the legacy functions get recalibrated," he said.
He pointed specifically to CHROs who have taken on additional responsibility for AI enablement, describing it as an effort to stretch existing functional boundaries to close the gaps rather than wait for a new org chart to form around them.
That kind of evolution has been visible across the industry for the past 18 months. People enablement, head of AI transformation, workforce intelligence lead — the language is shifting because the job is shifting.
Whether expanded mandates come with actual authority over compliance decisions, and the budget to build the infrastructure compliance requires, is still an open question at most companies.
Awareness is a prerequisite, not a destination. Knowing a law exists doesn't tell you who inside your organization is responsible for it, or whether your current AI deployments fall within its scope. Those are decisions that require someone to own them explicitly, and right now, most organizations haven't made that call.
