If your organization still views HR as the compliance cops standing by to squash complaints about an AI mandate, this episode is going to upend that thinking. David sits down with Tim Fisher—Head of AI at Black & White Zebra and longtime observer of organizational behavior—to dissect the real challenge of AI adoption in business. What he’s learned from being an “HR outsider” is simple but staggering: most companies are treating AI like a tech rollout when the real work is massive human change at scale—the kind of transformation HR has always lived in, even if others failed to notice.
From the cavernous disconnect between what HR is being asked to do with AI and what they’re actually equipped to do, to why HR leaders must be at the table before strategy is set (not as cleanup crew), today’s conversation strips away the myth of AI as just another productivity tool. Tim and David map the human, technical, and leadership gaps that are shaping every AI effort today—and why success in 2026 will look nothing like a rushed compliance sprint.
What You’ll Learn
- Why AI transformation is fundamentally a human transformation, not just a technical rollout.
- How HR’s real value lies in understanding human change at scale, not HR compliance chores.
- The authority gap masquerading as a “readiness” problem—and why HR needs decision power, not just a seat at the table.
- Why successful AI transformations will be slower than CEOs imagine, and what that means for planning.
- The messy reality of AI implementation: from shadow usage to sabotage to shifting work dynamics.
Key Takeaways
- HR is not compliance police. Organizations have misvalued HR for years—but HR is the unit that actually understands change at scale.
- AI is a forcing function. It’s forcing organizations to ask why they do things the way they do them—and whether they should be doing them at all.
- Human + tech alignment is non-negotiable. Getting HR and tech leaders structurally closer early in the process solves more problems than picking the latest tool.
- Authority matters more than readiness. Asking HR to lead people change without real decision power is just cleanup duty disguised as strategy.
- Transformation isn’t a “magic button.” Real change requires new ways of working, clearer human-in-the-loop practices, and honestly confronting how workflows are designed.
- Quick wins ≠ transformation. Short-term productivity gains are helpful, but without a holistic, long-term plan, they won’t sustain or differentiate.
- People use AI even when it’s banned. Shadow usage is widespread; it’s not the technology that’s the problem—it’s unaddressed anxiety and lack of trust.
- Expect slow and visible change. AI success won’t look like a stealth rollout; it will look like coordinated, cross-functional effort that’s seen and owned internally.
Chapters
- 00:00 – Why this conversation matters
- 01:33 – HR assumptions vs reality
- 03:31 – AI as a human challenge
- 05:04 – Gap between HR’s ask and capability
- 06:03 – HR and tech need alignment
- 07:34 – Authority gap, not readiness
- 09:11 – Real implementation blockers
- 14:33 – Shifting work dynamics
- 16:47 – Shadow use and sabotage
- 18:53 – Quick wins vs true transformation
- 22:24 – Leadership and mission misalignment
- 24:42 – Existential work questions
- 28:32 – 2025’s biggest surprise
- 33:36 – What AI discourse is missing
- 39:06 – What will define success in 2026
- 40:41 – Advice for HR leaders starting now
Meet Our Guest

Tim Fisher is the Vice President of AI at Black & White Zebra, where he leads the company’s AI strategy and implementation across products, content, and operations. With deep experience in applied AI, automation, and emerging technologies, Tim focuses on turning advanced tools into practical, scalable solutions that drive real business impact. He is a thoughtful leader and frequent voice on how organizations can responsibly adopt AI to enhance decision-making, efficiency, and innovation.
Related Links:
- Join the People Managing People Community
- Subscribe to the newsletter to get our latest articles and podcasts
- Check out this episode’s sponsor: Intuit QuickBooks Payroll
- Connect with Tim on LinkedIn
- Check out Black & White Zebra
David Rice: You know that embarrassing moment when you realize you had no idea what your colleague does? Well, Tim Fisher had that moment recently, but with all of you in HR. He's the Head of AI at Black & White Zebra, our parent company here at People Managing People, and he's spent years in leadership roles working closely with HR professionals. But like so many others, he thought the job was essentially compliance cops.
And as he's been looking under the hood at the challenges HR faces with AI, he's realized that this is the only function in most organizations that understands how massive human change at scale actually is. And right now, organizations are treating AI like a tech rollout when it's actually the largest organizational challenge since the Industrial Revolution.
When he looks at the gap between what HR is being asked to do with AI and what you're actually equipped to do, his answer is pretty simple. It's a cavern. Today we're gonna cover why the most important thing you can do right now is get HR and tech leaders much closer, how AI is serving as a forcing function to finally ask, why do we do it this way?
The authority gap that's being disguised as a readiness problem and why successful transformations will be slower than your CEO wants and involve HR from day zero.
I'm David Rice. This is People Managing People. And if your organization still thinks HR's job is to make people stop complaining about your AI mandate, this conversation with my colleague Tim is going to challenge all of that. So, let's get into it.
All right. So Tim, welcome.
Tim Fisher: Thank you very much.
David Rice: You know, you came in as an HR outsider, and I love this point of view 'cause technically I'm an HR outsider, right? So maybe I'm a bit biased, but I think it's a valuable point of view. I'm curious what surprised you most about HR's actual reality versus maybe some assumptions that you had coming in?
Tim Fisher: I think they were all assumptions. I think that's fair to say. It's funny, so in my role, which I know we'll talk maybe a little bit about, I get to spend a lot of time with different parts of the organization, sort of in a consulting fashion.
So there's this naivety that I have coming into every conversation and every organization. But I have to say, this is so embarrassing in retrospect, thinking about it, actually saying it out loud to you: there's this assumption that all HR is what I now know is called compliance HR, which is like, you know, mom and dad at work.
It's like Big Brother, like these are the people you run to and tattle on. But shocker, it's way more than that, and far more important than that, and far more valuable to the business than that. And it's embarrassing because I've been in leadership roles for a long time and worked really closely with people when I clearly did not understand what they did at all.
So, yeah, it's been a, it's been a big change.
David Rice: It's not unusual. I'll say this, HR folks get it all the time. But I think for so long there was this narrative, especially when I first started covering it like five years ago, that HR is undervalued. And what I've seen this year is that it's more like misvalued rather than undervalued.
That's probably a better word. Yeah, like you said, the assumption is compliance cops, but the reality is they're the only people in most organizations who understand how massive a challenge human change at scale is. And you know, I think we've seen this year more than ever that the challenge tied to AI is a human challenge.
Tim Fisher: It most certainly is. Yeah.
David Rice: What are some of the areas where you feel like organizations are getting it wrong right now in terms of AI transformation?
Tim Fisher: I mean, is it fair to say everything?
David Rice: Yeah, it's fair.
Tim Fisher: So I think there's lots of places that are getting it wrong. I mean, you said it very pointedly: it is a human challenge.
So many organizations still see AI as a technical rollout, and while there are parts of that that are true, the reality is this is the largest organizational change since the Industrial Revolution, and it's not being treated that way, handled that way, or thought about that way.
And I think, and I wouldn't have said this a couple of years ago, HR leaders have to be the people who help make all of this happen. So it's not realizing that this is almost all about humans, and then secondarily, you know, to our conversation, not realizing how important the human resources organization has to be to help make it happen.
David Rice: I'm curious, 'cause now you've been looking at HR a little bit and you're seeing some of the challenges. We went to the Gartner conference down in Orlando and talked to I don't know how many different people down there. And I think one of the things we both realized is that everybody's in such different places with it.
It depends on industry, it depends on the maturity of the organization, or just the general willingness. But I'm kind of curious: when you think about that gap between what HR is being asked to do with AI and what they're actually equipped to do, how would you describe that?
Tim Fisher: It's a cavern. It's nuanced, right? Everybody's in a different position. Some organizations have HR leaders who spend time with their C-level folks, who have a deep understanding of the business, and who are super close to their CTOs or CIOs. Most organizations aren't in that situation. And like I said, the whole thing is change management masquerading as a tech rollout.
So I think this is so basic, but, and we heard this at Gartner too, the most important thing anybody could do right now is get their HR leaders much, much closer to their technology leaders. I think that's the most visible cavernous gap at the moment. I think a lot of wonderful things would flow out of that almost automatically, just building that relationship.
But that's something that I consistently see. And of course, based on everything that we heard and the people we talked to, I don't think that would surprise anyone.
David Rice: It's interesting, I was thinking about this earlier. We had this narrative for a long time about HR getting its seat at the table, and then it sort of felt like that kind of happened.
Now I feel like they got brought in in time for this challenge. But what I'm afraid is gonna happen is that this isn't gonna be a capability gap, it's gonna be an authority gap that gets disguised as a readiness problem. Right? HR leaders get asked to lead a huge transformation like this, but they're not given the actual power to lead it.
So they're being brought in after a vendor gets selected, after a strategy is set, after the rollout timeline's been decided, but they weren't in the room when it was decided. And then they're told, okay, you gotta manage the people side of this. Well, that's just cleanup duty. That's not leadership.
You know what I mean? That's not being in the core conversation, and that matters. I think HR is equipped to manage this massive human change. I mean, that's literally their job in a lot of ways. But organizations treat 'em like they're only equipped to manage compliance administration, like we talked about at the start.
Right? So you get this bizarre dynamic where organizational leadership says, well, we need HR to lead our AI transformation, so to speak. And what they actually mean is, we need you to make people stop complaining about the fact that we've mandated that they use AI for this and this.
Right? Or that they can't use it for that. That's not a valuable use of HR's time, I don't think.
Tim Fisher: No, I completely agree with that. And again, just with all my realizations about this part of the organization, I think that AI is potentially going to serve more as a forcing function than a thing in and of itself, at least in the very near term.
By that I mean, outside of HR, I talk about this all the time: organizations do what they do, and what they've been doing for a long time. And we maybe see seismic shifts in the marketplace happen a few times a generation now, or when you and I were kids, it was once a generation, all these big things, right?
And so it's been a forcing function for organizations, for the first time ever, to ask themselves: why do we do something the way that we do it? Is this even the best way to do it? These are questions that they could have been asking themselves long before LLMs came out, or AI was any part of the conversation.
It's a healthy conversation to look inward and say, are we doing things the right way? Are they the most efficient? You know, most of the time a business will be bought and sold a couple of times, or a management consultancy will come in, before these kinds of big changes get made, right?
And so AI has been the forcing function. And I think in the world of HR, AI is going to be a forcing function that makes organizations have that aha moment: we probably should bring in the people who work with our most valuable resource at the very beginning of conversations, instead of doing what I think your word was, cleanup duty. Yeah.
David Rice: I'm curious, you know, like what are some of the biggest implementation challenges that you see? And I mean like the unglamorous stuff that doesn't make it into case studies.
Tim Fisher: All of it is very unglamorous. One of the big challenges that I see, but don't hear talked about nearly enough, is the different kind of working relationship between, I'm gonna coin them, "builders," the people who automate things, which traditionally has been strictly developers. But as you and I both know, and even a lot of the HR leaders that we've talked to know, people now have the ability to build things on their own, or interact with technology in new ways, because of LLMs. So whoever it is in your organization, whether it's the engineers or the people adjacent to them, it's the relationship between those folks and the people who actually do the jobs. And that relationship is not something that typically exists. I came from a really big organization, like 4,000 people, big verticals, giant engineering departments, giant product departments, giant human resources, giant editorial, whatever.
Right? Hundreds, thousands of people. The way these things typically work, when, say, a website needs something done differently, I don't know, they need a design change or they wanna roll out a feature or something, maybe the editorial staff will say, we need something done, and there's this long chain of operations where an engineer gets a ticket to go do something.
Those people never talk to each other. There's a big disconnect, no relationship between the people that are building the thing and the people that use the tool or provide information to it or whatever. But now we have engineers that are automating things and essentially teaching an AI how to do a job that they don't know how to do, because they're not the ones that do it, just because they have engineer in their title and they're usually the ones that code.
I know I'm sort of all over the place with this one, but what I'm trying to get at is that the users, the people that participate in the system, are typically not the engineers. The engineers are the ones doing the grunt work under the hood, right? Those people have to be much, much closer.
It's not too different from the conversation we just had from a leadership perspective, where HR leadership comes in so late to the game, after all these decisions have been made. Down in the trenches it's exactly the same thing. And so what we have are engineers being told by the business to automate something. But in an AI world, that means teaching an AI how to do a job that they don't know how to do. And so what happens is the people who actually do know how to do the job are coming in so late that it's not doing the job properly. That's a very unsexy, complicated, messy sort of situation that isn't getting talked about enough. The entire way in which we build products, run projects, and do all the things that every business on earth does has to fundamentally change because of these tools, which we treat like they're little robotic people but are still, in some ways, programming like they're traditional software.
And that's just not how this is gonna be successful. That's one big one. Another one, and we've already used this word a thousand times: change. I mean, people hate change. Even a really well designed AI agent system that slips right into some digital workflow is not going to be adopted, people won't use it, if you don't figure a bunch of stuff out. Again, like I was just talking about: was the team involved from the start, or just the technology folks? Are people being educated on how to use it? Is there a feedback path? Like, how do I tell an agent that it's not doing its job properly?
Are the right humans in the loop? We hear that phrase all the time, humans in the loop, but are the right people at the right places in this new workflow where something gets done? Oftentimes not. Is there overhead being added back in around where the AI is put in? Like, is there overhead to interact with the AI at the start of the workflow, or to do something with the information it produces?
There's all these like really complicated questions. And so many people in leadership still see AI as a magic button. It's like, well, I just went to ChatGPT and made my itinerary for my Paris trip. And so I expect the same sort of magic button behavior internally here. It doesn't work like that.
It's just a lot harder than it appears, I guess. So those are the two big things that come to mind: the change, and the different working relationship that the people who build things in organizations have with the people who use things in the organization.
David Rice: Yeah. I mean, it's changing the dynamics of so many different roles, right? I was thinking about a UX designer, or even a UX writer, right? That was a role for a while where you'd kind of advise, or say what something should look like or feel like or read like. And now you can just build the whole thing and be like, hey, can you make this better?
And then hand that to the engineer. You could basically just come with a prototype, and they just have to fine-tune it and make sure the backend works and all that. It's changed the entire dynamic of so much work. And I think about the opportunity that exists there, but also the creep factor of, actually, should that person be doing that? You know, like...
Tim Fisher: Yeah, I think we've all worked with, you know, maybe CEOs or other leaders where you're like, oh, he's playing designer again, or whatever it is, right? Yeah. Everyone has an opinion, and now everyone is empowered to demonstrate that, maybe in a larger way. There's two sides to that.
I am not a designer in any way, shape, or form. I struggle with stick figures. And so when I'm trying to articulate an idea, these tools are very powerful for me, because I can describe it in a language that I understand and I know the LLM understands. And then an image generation model or a coding model can draw something up, or literally create a prototype, and that allows me to communicate in a way that I've never been able to communicate with my designers or my engineers or my product people before. That is a dangerous place for some people to be, because if I wasn't disciplined enough, it could encourage me to push my organization down a path that wasn't particularly high quality, simply because it was easy for me to show it off.
That's definitely something to watch out for.
David Rice: There are two things that we came across in 2025 that I'm curious to see how we're gonna address in 2026. The first is the AI sabotage part, where people are literally just using it poorly on purpose to get poor results and turn those in, essentially to make a case that it's not working.
And then the other thing is shadow usage, where people are doing all kinds of things that we don't know about. And I think shadow usage isn't in and of itself problematic, in the sense that it's okay if people wanna develop different skills or try their hand at some other thing. But it does get into a problem area where they're just not doing what they're meant to be doing at all. You know, like they're putting company stuff into, say, things that you haven't paid for, and that can get to be a bit of an issue. I'm not sure how this is gonna shift, but I think it's gotta be a conversation that starts with trust: we trust you to do your work, and we want you to trust us that you're gonna be here tomorrow, for one thing.
Because I think that anxiety is driving a lot of it.
Tim Fisher: Yeah. Just a quick sidebar, I think a CISO might disagree with the "shadow usage is okay" perspective. No, for sure they would. You know, there are definitely some considerations there, but those are very manageable and are not the real problem.
Yeah, the sabotage and the concern. Look, I use the phrase all the time: it could have been any technology, any big disruptive technology, but it's all just a forcing function. Organizations are gonna figure out: are we gonna go the path of radical honesty and transparency, and are we going to think about what people will do with their time before we talk about the things we take away from them?
I hear very little conversation around this. There's this sort of high-level, well, everybody gets to level up, everybody gets to do work that AI can't do, and while, in and of itself, that might be a true statement, I don't hear organizations talk about it very often. Why isn't that part of the conversation at the beginning?
It should be. I think the world used to move so slowly that we sort of wiggled our way through the change almost by accident, and this is moving too quickly. It requires organizations to be purposeful and thoughtful and I don't see a lot of that going on unfortunately.
David Rice: Even just with the use cases, people are scattershot in thinking about which ones they wanna consider.
Another thing we've seen this year is a lot of people chasing quick wins, productivity, efficiency metrics, right? That being people's sort of North Star. And I think, certainly by the time we got to the end of 2025, a lot of people had realized that's not actually gonna justify the investment, or that's not gonna be sustainable over the long term or differentiate as much.
So what are some of the ways people can get outta that mentality and think more holistically? 'Cause when we say "AI transformation," it sounds cool, but a lot of people look at it and go, yeah, alright, well, what does that mean? How should they be thinking about it?
Tim Fisher: I think it's actually okay to think about it in two different ways at the same time. I think that total transformation, literally rethinking how everything gets done in your business because there is a groundbreaking set of tools that didn't exist when all those processes were put together, has to happen.
That's top down. So, how do I zoom all the way out? How do I blow everything up, not actually, but how might I rebuild my business from scratch if these tools were available today? Which is what every startup is doing, right? How do I do that? But also recognize that is not the same thing as a series of quick wins added together.
An organization needs to think longer term and top down: how do we change? And they also need to be practical, think bottoms up, and ask themselves: what are ways that we can use AI, in quicker-win ways, to augment the poor processes we have right now and get some ROI for that investment? But it's a really tempting thing to confuse the bottoms-up work with total transformation, and it is not the same thing.
You can plug every hole and swap out every joint in a workflow, and that doesn't mean the organization looks like what the startup in your world is putting together, the startup that is not encumbered at all by anything that happened yesterday or the way anybody did anything last week. So organizations have to do a really complicated thing, which is work on total transformation, slower and in a more appropriate way, as well as get the most out of some of the stuff they can in the short term, if that made any sense. Look, we both worked at a place, we've all worked at different places, and we know how easy it can be to live in the here and now. And so it requires a lot of discipline to do both of those things simultaneously.
David Rice: You didn't get into human resources to chase time sheets, or wrestle with payroll errors. But when your systems don't communicate, you get dragged down. That's where Intuit QuickBooks Payroll comes in. QuickBooks Payroll is your business management solution that connects HR, payroll, time tracking, and finance in one powerful platform. AI and automation do the heavy lifting. No silos, no steep learning curve.
QuickBooks Payroll can help you cut down on the chaos and focus on what matters most, being a human resource. Better systems lead to better people management and better workplaces. Discover how QuickBooks Payroll can help you today. Learn more by going to quickbooks.com/payroll. That's quickbooks.com/payroll.
The other piece to this: we talk about change management, we talk about the technology itself, we talk about the human challenge, right? But there's another problem, which comes down to leadership alignment. If you haven't got good alignment from the top, and it's not even just about messaging, it's about understanding how what you're doing with it is feeding the bigger mission.
If that's not there across the C-suite, and even one level down to your directors or managers, you're not gonna find the results that you want. I mean, is the disconnect there, though, in the AI strategy overall, or in the integration into workflows?
Tim Fisher: I think it's both. And I know I have AI in my title, but one of the fun things I like to say is that it doesn't make any sense. Again, I'm gonna use this phrase again, I need to put it on a t-shirt: I think it's a forcing function for change. I think the question every organization has to answer is, what does change look like at this organization?
How much can we take on per unit of time? What does our plan for completely reinventing our business look like? I think it's about every organization building a capability of constant internal transformation. And you have to figure out how to do that without changing everyone's job descriptions every seven days, and you have to figure out how to do that in a way that allows you to communicate coherently to your organization about what you're doing and why, when sometimes you're figuring it out, as the CEO, all along the way. That's a dynamic that hasn't existed before. Even the idea of a job description: I think we all still inherently do this thing where we see a job description, we apply for a job, and we expect that job is going to be what we do for maybe the rest of our lives.
That world doesn't exist anymore. So how does the entire world adapt, or start becoming comfortable being uncomfortable all the time? Sorry, I always philosophize, I can't help myself. But as floaty as some of these concepts are, I think they're real. I think it's so complicated because no one in the history of the human race, and I mean individuals and organizations, has ever been in a situation where, in their lifetime, they're gonna have to go through rounds of change that only happened over generations before. That's a fundamental shift in work. And you know, if we're still treating HR people like they're second-class citizens in the organization, and we're still thinking about hiring and retention the same way that we did even five years ago, and we're still thinking of technology as something that sits over in one corner of the office, none of this is gonna work. How to get from here to there is really unclear.
David Rice: Yeah. One thing I've realized, the more you listen, you know, you've been to several conferences, I've been to quite a few webinars lately, and the one thing I keep realizing is that nobody really knows what they're doing. You know what I mean? I was thinking about this the other day.
They invent the printing press, and all of a sudden there's a wave of exploratory things within literature, within philosophy, within human thought, right? Because we could now communicate in such a different way. And I'm like, this is a new wave in how we're gonna process information for ourselves, you know, not as a society necessarily, but on a personal level. How you process information is likely going to change very drastically in the next five years, the same way it did as the internet became the norm. And I'm like, that'll be two of those changes in my lifetime. The people who went through the printing press, I can't say for sure that they didn't experience any other wild changes, but I'm guessing that life just moved a lot slower back then, and that was the big shift of that time. And I'm like, it's a lot of stress on modern humanity that all these changes keep happening every 10 years, you know? I've seen the internet, social media, and AI now, and I'm not even 45.
Tim Fisher: It's a lot. I think the cleanest way to say it is we're not wired for this. We're not made to adapt at this level this quickly, but we have to figure it out.
David Rice: And I think part of it, and this is kind of a weird thing to say at this particular moment, is that everybody's gotta get a little more comfortable being a bit philosophical and a bit existential.
What is our purpose? Because it doesn't take much to see the reality where, in 10 years, what is it that we do? You said earlier, the level-up thing, and the number one thing that's always driven me nuts about the AI conversation in the workplace is, well, it's gonna free people up to do all these other things.
I'm like, okay. Like what? And nobody ever says anything. Nobody ever has the answer to that question. And I'm like, well, if you're gonna keep saying that, somebody's gotta tell me what that is at some point.
Tim Fisher: I mean, look, I think there are some specific departments and organizations where you could see, if my marketers had more time to do X, if they weren't in spreadsheets or whatever, that would be amazing for this department or this organization.

Much more often than not, though, the answer is not at all clear, and certainly at a really high level, on a societal level, it's definitely not clear. I made a comment to someone a few months ago that we don't have time to wait for the philosophers to write books. We have to be them. We have to figure this out as we go.

One analogy, or metaphor, I always mess those up, is building and flying the plane at the same time. I remember I was super happy when the image generators got good enough that I could start whipping out images like that for very specific slides and decks as I was giving presentations.

'Cause it was such a great way to describe how we were even thinking about AI inside of the organization I was in. And I think it's an honest way to talk about how no one's standing around with all the answers while we try to get 'em on the phone or read their book. No one has them.

You know when you get to a point in your life when you realize that, for the most part, no one's got their hands on the wheel anywhere and everyone's just sort of winging it? I remember having that moment in adulthood, you know, with politics and business and everything, and I think it's never been more clear than right now.

And as scary as that is, it's also okay and normal, and a little frightening all at the same time.
David Rice: Let me ask you this: what was your biggest "I didn't expect this" moment in 2025?
Tim Fisher: I think I'm surprised that so many people use it and pretend they don't. I've seen the numbers: a lot more than half of the people in organizations where they're not allowed to use it at all use it every day. Even at a really high level, it's a really high percentage.

I don't remember the last report I saw, but it was very high, even though the organizations themselves don't report that sort of usage, so people are using their own personal accounts and things like that to do work. I think it's interesting that there's a dichotomy between the way people talk about how much AI scares them and the reality of how much they use it.

I think it's a testament to how incredibly effective a knowledge tool it can be, and how, if you use it properly, like a partner, it can be really productive and not particularly scary. The numbers surprise me, honestly. I think we just recently celebrated the three-year anniversary of ChatGPT, last month or something.

I thought by now, and maybe we do this with all things, that it would be more extreme one way or the other. I thought that there would be an enormous pushback and organizations would ban it and people wouldn't want to use it, or it would be maybe even a little more integrated into everything we do, maybe more officially. No one has a crystal ball, and it's been interesting to watch, but I'm surprised how many people use it every single day, how many people already treat it like it can't go away. That's a testament to something really effective.

But at the same time, we see all the numbers about how few organizations have rolled out truly official use cases and permanent implementations of AI. Again, I think that takes us back to the top-down and bottoms-up transformational work. Bottoms-up work is just so much easier. It's so much easier for individual people who do very specific things and understand their jobs deeply to make the connection very soon that, if I made this automation or I had ChatGPT do this for me, my job would get better. Those things happen sooner than "we've reinvented marketing at our company." Those are harder.
David Rice: Yeah, I mean, from my perspective it's maybe the discourse mismatch. But you know what I'm gonna hand my biggest surprise to? We had an episode of the podcast where, you know, people write in.

And some guy somewhere found a way to use it to write pickup lines for his coworkers. And I was actually just really impressed with the ingenuity, like he was feeding it their Slack conversations and making really nuanced ones, you know.
Tim Fisher: I'll roll my eyes and say "humans," but when you think about it, and this is getting nerdy for a second:

All the AI work that happens every time you talk to ChatGPT is actually happening on something called a GPU. It's a video card. For those of you that know anything about computers, it's the thing that makes your video games work, the same kind of chip that's in your PlayStation 5. They're in a room, stacked on top of each other, and that's what runs AI.

I think about that a lot, because so much ingenuity actually starts with fun or silliness. I remember when ChatGPT came out, right away everybody was writing stories and poems. They were doing creative things, because, wow, a software program did something kind of novel, which is not something software does.

I remember hearing over and over again, I sat in meetings at my previous employer, and they're like, this is useless, there's no value here. And some of us would say, just zoom out one level. It is a tool that understands and produces language. I'm sure we can do something with that. We're a company that deals in language all day long. It just sometimes takes longer to see.

So as funny as the pickup line thing is, that's probably the beginning of what will become something productive, because it seems like that's just the way it goes every time.
David Rice: So, you know, you obviously follow all this stuff as closely as I do.

I have to ask you: when you hear the AI discourse and the way it's changed in the last year, what do you think about that? Because a year ago we were having conversations about things we probably don't even think about now, some of it. Right now, I'd say probably 70% of the conversation I see on LinkedIn is pretty technical, either scientific or practical implementation stuff.

Maybe 10% of it is bubble talk, you know, financial stuff. And then 20% is about the people challenge, right? Some of that gets pretty clickbaity. But I don't know, what are we not talking about enough in this conversation?
Tim Fisher: I'm a fan of the movement of those numbers, in the sense that it was so much doom and gloom and clickbaity stuff for a while.

The only thing you could find in the public discourse was "we're all gonna die" or "every disease will be cured by December." It was one of the two. Obviously, anyone who actually understands how it works or pays attention rolled their eyes at both of those things. The only practical conversations were deep in the technical.

They were in the forums where people were coding with it, conversations that eventually turned into startups and things like that. People were like, okay, there's some low-hanging fruit here that, based on just a basic understanding of how this all works, we should be able to pull off.
So that's been really cool, and the practical is starting to get a little more common. What I would like to see, and again, maybe this is just minor nerdiness and what I enjoy talking about and figuring out, is using technology to solve problems. That's really exciting to me. I actually don't get a big kick out of the image generation or the video or anything like that for the purposes of simple entertainment.

I'm in the background getting excited about an n8n workflow or a Zapier workflow or something, 'cause I'm like, there's this thing I hate doing that I could make go away, or there's this thing I always wanted to do that would need an army of humans to pull off. Even basic things: I'm a consumer of lots of types of information, and I would love an AI system that reads it all for me and sends me a digest at the end of the day that says, here's what happened in the world of X, Y, Z.

That sort of exists, but not really. Those are the things that are exciting to me. I would like to see more of that conversation happening, the things that are happening in the bowels of companies, talking about workflows and things like that, which isn't a very sexy conversation. So I guess what I'm saying is, the tech-focused practical conversations don't have to shift; I would just like more practical conversations to happen in public in addition to them.

Because we read a lot about those, people coding and stuff like that. It's all very cool and very practical and very helpful. But there are so many other ways these tools can help other parts of the business outside of engineering, and certainly at home and stuff like that.

So I would like to see that happen. My fear, however, is that the conversations about a bubble are just gonna get more and more common. It's too tempting to write articles that people will click on instead of ones they might find actually useful.
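The end-of-day digest Tim describes can be sketched in a few lines. This is a minimal, hypothetical version: it pulls headlines out of an RSS 2.0 document and formats them into a plain-text digest. The feed content, function names, and topic label are all illustrative, and the actual summarization step, in practice, would be handed to an LLM or an n8n/Zapier workflow rather than a plain list.

```python
# A minimal sketch of a "here's what happened today" digest (illustrative only).
import xml.etree.ElementTree as ET

def extract_headlines(rss_xml: str) -> list[str]:
    """Pull the <title> text from each <item> in an RSS 2.0 document."""
    root = ET.fromstring(rss_xml)
    return [item.findtext("title", default="") for item in root.iter("item")]

def build_digest(headlines: list[str], topic: str) -> str:
    """Format collected headlines into a simple end-of-day digest."""
    lines = [f"Here's what happened in the world of {topic}:"]
    lines += [f"  - {h}" for h in headlines]
    return "\n".join(lines)

# Placeholder feed standing in for the real sources you'd subscribe to.
sample = """<rss><channel>
  <item><title>Model release A</title></item>
  <item><title>Workflow tool B adds agents</title></item>
</channel></rss>"""

print(build_digest(extract_headlines(sample), "AI"))
```

Scheduling this daily and swapping the list formatting for an LLM summary is the part the automation tools mentioned above handle.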
David Rice: Yeah, economic collapse is always, you know, a sexy headline.
Tim Fisher: It's super sexy. Yeah. Shouldn't be. But it is.
David Rice: Yeah. I think the part that scares me a little bit is the reaction we saw when the Google Veo stuff came out and we started seeing all this AI video proliferation, and same thing with Sora, right? There was sort of this "you can't trust anything."

We don't know what's real. There'll be more of that, and it'll get more sophisticated. And the problem is the reaction to the next tool, whatever that is. These things tend to get more and more toxic. And then you keep finding out about, oh, there's this guy in Sri Lanka who uses it just to create these specific types of videos.

And so the discourse around it will continue to degrade as it gets used to spread misinformation. I don't know. I think we end up in a place, after a certain period of time, where there's just so much entrenchment and toxicity that there's no conversation to actually be had anymore.

Given the speed at which this is advancing, I worry that might become a possibility. So I think...
Tim Fisher: I commented earlier about how we're not wired for this. There was never any natural pressure whatsoever to adapt this quickly to anything. Nothing in the world changes this fast naturally.

And you know, we also get tired. The news cycle, the change cycle: it's exhausting. On the other hand, unless I'm being tricked on a daily basis, I'm surprised how little awfulness has happened, at least that we're aware of, around some of this stuff. I'm kind of shocked, actually.
David Rice: I mean, a lot of it is just, like, veggie superheroes, like the one I posted, you know, the guy who just turns into broccoli. A lot of it is just sort of whatever. You can call it slop 'cause it is just content, but it's not particularly useful or damaging in a lot of cases. It's just stuff, like trinkets, but for the internet.

Everybody has that aunt who just has all the stuff, and you're like, you don't have any shelf space, but you asked for picture frames for Christmas.

It's the same thing. We don't need any more stuff flashing at us, but we invented this thing that makes a lot of stuff super fast.
Tim Fisher: I think the scary thing, with the quality of tools available to us, is that it's not that a bunch of people need to do bad things; it's that one person needs to do something really bad.

You know, we've all learned over the last 20 years, thanks to social media, that every single person has an audience of everyone, so the opportunity for damage is high. I guess we'll see.
David Rice: Well, I will close with two things. One, what's your prediction for what will separate successful transformations from failed ones in 2026?
Tim Fisher: I mean, I don't know. It's sort of our whole conversation, right? I think successful transformations will be slower than the CEO wants. I think they will be well televised internally. And, not to be silly given the topic of this conversation, they'll have the HR team involved from day zero.

And again, back to your very first question, my perspective on this part of the org is just so incredibly different than it was when I came into it thinking it was just about compliance. I mean, no organization is starting to think about this today, I hope not, but if they were, I would say:

Get your HR team involved immediately, as much as or more than your tech team. Plan on it taking 10 times longer than you think; there is no AI magic button to push. And plan on being really honest with your organization about what you don't know and what you do know. I think if somebody were to do all those things right now, they might even catch up to and surpass organizations that are still thinking about it as, you know, "we're rolling out Gmail."

It's not the same thing at all.
David Rice: Probably nobody's starting today, but let's just say they were for a second, and as an HR leader, they've gotta start their AI journey tomorrow. What advice are you giving them?
Tim Fisher: I would say trust your gut, for sure. I would say your job is more important than you probably realized.

I would say that, to the extent that your organization doesn't understand what you do or the value you bring, make fixing that the first thing on your to-do list. I think that'll be key to the success versus failure of AI in your org.
David Rice: Tim, thanks for coming on. We'll have to do this again sometime.
Tim Fisher: Alright, sounds good. Thanks David.
David Rice: Alright, listeners, until next time. As always, head on over to People Managing People, sign up for the newsletter, check out our AI Transformation Explorer, check out our upcoming events, and let's be ready for the next podcast.

This is our last show of the year, which is how we're closing out the year. So Happy New Year, Merry Christmas, happy holidays, and we'll see you on the other side.
