Validated Fear: Job displacement due to AI has become a real and immediate concern for employees.
Unexpected Fear: Shadow AI usage reveals trust issues, with employees using unapproved tools to meet job demands.
Confidence Crisis: Increased AI usage is coupled with decreased employee confidence in effectively using these tools.
Emerging Anxiety: FOBO, the Fear of Becoming Obsolete, worries employees about their skills losing relevance.
Workload Concern: AI increases work efficiency, leading to higher expectations and contributing to employee burnout.
In June 2024, Gartner identified five employee fears driving AI resistance: job displacement, AI making work harder or less interesting, bias in AI systems, lack of transparency, and loss of autonomy. At the time, those were reasonable predictions about what might happen.
Eighteen months later, we have data on what actually happened. The fears have evolved in ways that make the 2024 list look quaint. Some got worse. Some morphed into different problems entirely. And new anxieties emerged that no one saw coming.
If you're still using the 2024 playbook for addressing AI resistance, you're designing interventions for fears your employees no longer have, or that have since evolved into something else.
The Fear That Got Validated: Job Displacement Became Real
Job displacement was always going to be the obvious fear. What changed is that it stopped being theoretical.
Nearly 55,000 US job cuts were directly attributed to AI in 2025, according to Challenger, Gray & Christmas. Workday eliminated 8.5% of its workforce to "reallocate resources toward AI investments." Amazon cut 14,000 corporate roles. Salesforce reduced customer support by 4,000 positions, with CEO Marc Benioff stating AI now handles up to half the company's work.
Employees watched this happen. They saw the press releases. They knew people who got laid off. The fear isn't speculative anymore.
The velocity changed too. Layoffs aren't a distant possibility anymore. They're quarterly announcements with AI explicitly cited as the reason, whether that's true or not.
Employees aren't forming opinions about whether this might affect them someday. They're watching it happen to people who had the same job they do.
By mid-2025, employees at organizations undergoing comprehensive AI-driven redesign were significantly more worried about job security (46%) than those at less-advanced companies (34%), according to BCG research. The pattern holds: the closer people get to actual AI deployment, the more threatened they feel. That should tell you something about how reassurance campaigns are landing.
The intervention here isn't better messaging. Employees have stopped believing promises that "AI won't replace you, it will augment you" because they've watched it not be true for thousands of their peers. What might actually work is showing them the skills that matter for roles that aren't disappearing, with investment that proves the organization means it.
The Fear No One Predicted: Shadow AI Exploded
In 2024, nobody was tracking shadow AI as a major employee fear. By early 2026, it's everywhere.
Between 78% and 86% of employees now use unapproved AI tools at work, depending on which study you read. Not occasionally. Regularly. Security professionals, the people who should know better, are the worst offenders at nearly 90%.
This feels more like desperation than rebellion to me. A majority of employees say they're willing to accept security risks to meet deadlines. Organizations rolled out AI mandates without providing adequate tools, training, or time to learn, so employees went around them.
Many workers now say they trust AI more than their colleagues. Think about that. Your people are turning to ChatGPT instead of asking the person sitting next to them because the AI is faster, doesn't judge them for not knowing something, and won't tell their manager they're struggling.
Shadow AI reveals a trust crisis that has nothing to do with the technology. Employees don't believe their organizations will give them what they need to succeed, so they're finding their own solutions and hiding the evidence.
The fix isn't blocking ChatGPT at the network level. It's asking why people felt they needed to go around you in the first place. When 90% of employees admit to using personal AI tools for work while only 14% pay for them, you have a systemic gap between what people need and what you're providing.
The Fear That Morphed: Confidence Collapsed While Usage Increased
Remember when the concern was that employees wouldn't adopt AI tools? That problem solved itself. What replaced it is worse.
AI usage jumped 13% in 2025, according to ManpowerGroup. Great news, until you see that confidence in using those tools dropped 18% over the same period. People are using AI because they have to, not because they believe they're doing it well.
Seventy-five percent of employees don't feel confident using AI in their day-to-day work. They're clicking buttons, submitting prompts, and hoping for the best. The gap between "everyone's using it" and "almost no one feels competent" is where adoption goes to die.
This confidence collapse hits older workers hardest. Baby boomers saw a 35% decrease in AI confidence. Gen X dropped 25%. These aren't people resisting technology. They're people who built expertise over decades watching it become potentially irrelevant overnight, with no clear path to rebuild.
Organizations assumed that rolling out tools would be enough. Train people on the interface, send them a few tutorials, move on. But confidence doesn't come from knowing which buttons to click. It comes from understanding when to use AI, when not to, and how to verify the output isn't garbage.
When people lose confidence, they don't stop using the tools. They just stop trusting their own judgment about when and how to use them. That's how you get a workforce that's technically compliant but functionally incompetent.
The Fear With A New Name: FOBO Emerged
In 2024, we talked about job displacement. In 2026, employees have a more specific anxiety: FOBO, the Fear of Becoming Obsolete.
This is different from worrying you'll get laid off. FOBO is the creeping sense that your skills are degrading in real time, that you're falling behind faster than you can catch up, and that the window to stay relevant is closing while you're still trying to figure out what relevant means.
Fifty-two percent of workers worry about AI's impact on their future in the workplace, according to Pew Research. Forget "will I have a job next year." The question keeping them up at night is "will I matter in five years."
This fear is particularly acute for younger professionals, who are watching entry-level learning opportunities disappear. The grunt work that taught people how to think, how to spot patterns, and how to develop judgment is being automated away. What used to be a two-year learning curve is now three months of prompting an AI.
FOBO shows up as employees staying in jobs they hate because movement feels riskier than stagnation. Sixty-four percent are "job hugging," clinging to current roles despite burnout because they don't trust they can compete for something better. Many fear that companies are using AI as cover for layoffs rather than genuine transformation, making job security feel even more precarious.
The intervention can't be another training program. FOBO isn't about technical skills. It's about employees questioning whether their fundamental value proposition still matters. That requires showing people how human judgment, context, and expertise create outcomes AI can't replicate. Not in a motivational poster sense. In a "here's the work we need humans to do that the AI actually can't" sense.
The Fear That Got Worse: AI Made Work Harder, Not Easier
In 2024, researchers worried AI might make work less interesting. The reality is more brutal. AI made work faster, which made organizations demand more of it.
You save two hours using an AI tool to draft a report. Your manager assigns you three more reports. The efficiency gain doesn't give you time back. It resets expectations about how much you should produce.
This is technostress, and it's pervasive. Employees report being interrupted by meetings, emails, and notifications about every two minutes. AI tools were supposed to reduce this cognitive load. Instead, they added another layer of demands: learn the tool, integrate it into your workflow, use it to do more work than you did before.
Workers describe an "always-on" culture where AI blurs the boundaries between work and life. The tools are so accessible that there's no good reason not to be working. Respond to that email at 9 PM using AI to draft it. Join that meeting from your phone while you're at dinner. The technology makes it possible, so the expectation becomes that you will.
Some organizations are discovering this the hard way. They implement AI, productivity metrics improve, and six months later engagement scores crater because people are burning out from doing more work at a faster pace with no end in sight.
The fix is recognizing that productivity gains are not the same as business outcomes. If you're using AI to do more of the wrong work faster, you haven't improved anything. You've just accelerated toward burnout.
What Changed Between 2024 and 2026
The 2024 fears were about what AI might do. The 2026 fears are about what organizations are doing with AI.
Employees aren't afraid of the technology. They're afraid of leaders who treat AI transformation like a technology project instead of a fundamental restructuring of how work happens, who demand adoption without providing support, and who use efficiency gains to pile on more work rather than create space for people to adapt.
The research is consistent. Trust in direct managers is the strongest predictor of whether people engage with organizational change. That trust gets built through managers who understand what their people are experiencing and address it specifically.
Your AI transformation isn't failing because of the technology. It's failing because you're treating employee resistance as an irrational response to change instead of a rational response to how you're managing that change.
The fears have evolved. Your interventions need to catch up.
