Episode 260: Why People Analytics Needs a Product Mindset to Thrive with AI (with Ashar Khan)
People analytics has spent years building credibility through data. Now the pressure is different. Business leaders aren’t just asking for insight - they’re expecting direction. Where should we invest? What should we stop doing? What risks are we not seeing yet?
But many teams still find themselves pulled back into reporting cycles, ad-hoc requests, and an overemphasis on metrics that don’t always lead to better decisions.
So what shifts when people analytics starts operating more like a product and less like a project function?
In this episode, David Green is joined by Ashar Khan, Head of People Insights and Solution Design at Autodesk, to explore how the function evolves from delivering data to shaping choices at scale.
Join this conversation as they discuss:
The skills and mindsets modern people analytics teams need beyond technical expertise
What an effective people analytics operating model looks like in practice
The core capabilities required to bridge HR technology and HR strategy
Where “metric fixation” leads organisations toward false confidence and poor decisions
Why the assumption that AI automatically means “fewer people” misses the bigger picture
Practical advice for CHROs building or redesigning a people analytics function today
This episode is sponsored by Worklytics.
How productive is your organisation, really? Worklytics makes it clear - with privacy-first insights from everyday work data. See how meeting volume, manager effectiveness, collaboration health, and AI adoption are impacting your team’s focus, efficiency, and outcomes - so you can make smarter decisions, faster.
No surveys. No assumptions. Just clear insight into work. Right now, Worklytics is offering podcast listeners a free 30-day trial of their productivity analytics dashboard.
Learn more at worklytics.co/productivity
This episode of the Digital HR Leaders podcast is brought to you by Worklytics.
[0:00:00] David Green: When I speak to people analytics leaders, I find that the function is increasing its influence over the people and business strategy. People analytics teams are expected to influence technology choices, shape workforce strategies, and help leaders think through some pretty complex trade-offs around talent, productivity, and where to invest. But to do this effectively means moving beyond being a rearview-mirror reporting team to become a strategic, forward-looking function that helps the organisation see around corners. This calls for a high level of business acumen, a product mindset, an ability to influence, and a real partnership with the business. So, how do people analytics teams navigate all of that? What skills and capabilities move that needle? And what does that mean for its operating model?
To help us unpack this, I'm delighted to be joined in this episode by Ashar Khan, Head of People Insights and Solution Design at Autodesk. Ashar sits right at the intersection of people analytics, technology and people strategy. And today, we are going to dig into what that shift really looks like in practice. We explore the critical capabilities teams need beyond technical skills, how to avoid what we call 'metric fixation', how to move from a focus on projects to scaling products, and Ashar's advice for those leaders building or evolving a people analytics function today. We have quite a lot to cover. So, with that, let's get the conversation started.
Ashar, welcome to the show. It's good to see you. To kick things off, could you share with our listeners a little bit about yourself and your role at Autodesk?
[0:01:51] Ashar Khan: Sure, yeah. David, thanks for having me. Super-excited for this conversation. Longtime listener and keen to now be a participant in the podcast. So, yeah, I'll tell you a bit about myself. I've spent most of my career in the people analytics domain, across mostly tech companies, sometimes financial services. I joined Autodesk about five years ago. And the role I joined in was core people insights, but it has evolved over time. So, I'll tell you a bit about how it's evolved. As I said, I joined to really start the people analytics team at Autodesk, we call it people insights, and in our early years, we were very much a traditional people analytics team, very focused on descriptive insights, predictive reporting, really developing a centre of excellence that was focused on a single source of truth on insights. We also had an employee listening programme that we inherited from the culture team, so a lot of our focus was around revamping that, putting more data-driven insights into that. That was our first couple of years.
Over time, we actually added capabilities around research. So, we added quantitative and qualitative researchers. And that evolution was intentional, because we wanted to make sure we answered those first-order questions first, for example, what is our attrition rate, before we start talking about whether attrition is hurting Autodesk, or whether we should be thinking about it differently. And what we wanted to be very intentional about is that our research should have both quantitative and qualitative lenses. And I think that has been a really strong differentiator for us because, David, as we've spoken about before, you'll agree the quantitative side is only one side of the coin, right? The qualitative piece adds a lot of context. And there's a lot of synergy between each of those, where our quantitative research feeds into what our qualitative questions should be, and vice versa. So, that was the first evolution; we added research into the team.
Over time actually, our Chief People Officer, Rebecca Pearce, noticed that separate from the insights world, there was this widening gap between what our stated HR strategy was and what our HR technology experience was like with either vendors or in-house builds. And that was mostly because there was no bridge between the HR strategy CoEs and our technology partners internally or our SaaS partners. So, we then moved from being a people insights team to now what we call people insights and solution design. And solution design is all about bridging that gap between HR strategy and HR technology, and we really sit at that intersection. And our remit really is to ensure that there's no disconnect between our strategy and technology. And critically, we're not the decision-makers, we're not the implementers, we act as custodians of our HR technology. And one of the most significant things we've also done is to open up avenues for our CoE teams. So, in the past, we were very much focused on platform tools, like Workday. And now, we always start with Workday, but consider building on top of Workday, buying best-in-class tools in addition to Workday. So, that's how our team has evolved.
Our most recent change is actually adding a people AI centre of excellence within my team. We're at a very early stage on that, but really what we are doing is we have two lenses. One is internal facing, so how should HR -- we call it people and places, so if you hear me say that, I mean HR -- how should our people and places team use AI to really automate repetitive work so that we can move our effort up the value chain, right? So, it's not about automating all our work, it's actually about moving our focus up the value chain. So, that's internally within HR. But then externally for the enterprise, how do we advise the enterprise on the evolving people strategy when AI becomes prevalent at work, and we're seeing that happen already? So, especially on the second lens, we're really excited about this work, but still in very early stages. So, I know, sort of a long answer to "what do you do", but I thought it might be informative to talk about the evolution of the team since I've been here.
[0:05:38] David Green: No, that's really helpful, Ashar, and I think that will feed into a lot of the conversation that we have. Certainly, from the research that we're doing at Insight222, we're increasingly finding that your peers, the people responsible for people insights and people analytics in their organisations, are increasingly right at the fulcrum of that kind of AI strategy for HR, whether that's inward-facing, as you said, but also external-facing to the organisation as well, supporting things like workforce planning and upskilling, and all those sorts of things. So, it's really interesting how you pull those strands together. So, the next question, and I'm sure there are multiple ways to answer this: as a people analytics leader, Ashar, what keeps you awake at night?
[0:06:22] Ashar Khan: Well, yeah, as you rightly pointed out, lots does. Actually, I sleep very well. But what worries me is, I think first and foremost, I've either seen or been part of teams in the past that develop an ivory tower of insights. So, technically sound, probably bleeding-edge insights, but actually not business relevant, right? So, it's questions no one's asking. Sometimes questions people aren't asking are still interesting and business relevant. But what really gets you is when the question isn't even business relevant, right? That's what I fear the most, and that's something I'm constantly trying to check within our team. It's a difficult one, to be honest, because in our domain, we're scientists, right, so we want to explore objective truth, we want to investigate everything there is to investigate. We also have a tendency to fall in love with the problem sometimes. So, it's something that I constantly find myself pulling back from, to really ask myself and ask the team, what is the business impact of this work we're doing? And that's sort of our guiding compass. We make sure we go back to that on a regular basis.
Another one, I think this is a perennial sort of people analytics problem, is our stakeholders not truly understanding the value of people analytics work, right? So, in the past, many moons ago, people analytics was perceived as the team that makes dashboards. And we still do, right? Any good people analytics team worth its salt will make dashboards, because that is part of our function, it's part of creating that objective measure of truth. There's a lot of value in that. But what I fear is too many teams stop there. They're not able to get to the second-order value, third-order value, when you move from an insight to a deeper question that you can start investigating, because you already have that first metric. So, I think that's one that worries me for our industry of practice more so than the teams I'm leading.
Then, you asked for everything that worries me. Actually, something that worries me perhaps the most is just the rapid evolution cycles in AI. And it worries me and excites me, it terrifies me, but it is the most interesting work that I think our teams are doing across the entire HR industry. I think it fundamentally reinvents what HR work looks like, but also what work in general looks like, right? And I think the people analytics or HR lens on that makes us at the centre of the conversation, if we have the wherewithal to show up to that conversation in the right way.
[0:09:02] David Green: I know that you founded the people analytics function at Autodesk, and you talked in your introduction about how that's evolved over the last few years. I'd love to understand, and again, this is more of a general conversation really, to hear from a people analytics leader, what are some of the critical skills you look for in your people analytics team, and how has that evolved over time?
[0:09:23] Ashar Khan: Yeah, it's a good question, because it's not always obvious, right? Because if you think about a people analytics team, if you have a naïve point of view, which I know your listeners won't, but just entertain me, the first skill that might come to mind is technical skills. And I think they're important. And obviously, you can't run an analytics team without technical skills. But what I've seen is technical skills are becoming table stakes, right, so they're not a differentiator. And you mentioned critical skills, right? So, I wouldn't count most technical skills as critical skills; there are some, but most, I think, are table stakes.
What is really a critical skill, and I mentioned it earlier with this ivory-tower problem, right? You really solve the ivory-tower problem by having true business acumen, really understanding the company that you work for, how it creates value for customers, what the business model is, what levers the company pulls to be able to make more money. I can't overstate how important it is for everyone to know that, obviously, but in particular a savvy analytics team, because that is the answer to what questions we should ask. So, once you understand that well, every piece of work you're doing, you're investing in, has an ROI. So, I would say business acumen for me is number one, it's been number one for a while. And even with emerging AI skills, I don't think business acumen is replaced as number one.
The other one I would say, and this might be more particular to my team, because as I said, we've also included solution design, is this concept of product thinking, right? So, really thinking of the strategy that people and places are trying to create and trying to understand how a product fits in that strategy; to see, does this strategy need a specific bespoke technology? What type of experience does it need? How do you develop that product into an MVP and then launch it? What does it look like after launch? How do you maintain it? I think my personal bias is, I really don't have strength in maintaining things. I've always been a tinkerer, I've been a builder. Ask anyone on my team, I have a bit of the shiny-object syndrome. So, even for me, this product-thinking mindset is a continuous sort of evolution, because I have a proclivity for the early part of product life cycles, but the maintenance part of the life cycle is something that's not intrinsic to me. So, I have to constantly remind myself to think in that way.
Storytelling with data is important, and it will continue to be important. The way we tell stories with data will evolve. We haven't touched much yet on how AI changes people analytics, but we're already seeing that. So, our team is already using AI to extract stories, and then our teams are acting as editors of the stories, as opposed to the ones doing the copywriting of the story. But that being said, it's still an important skill to be able to really discern and judge the output that the AI is giving: is that the right story? So, those are the ones that I think have been important for at least the last couple of years.
At the tail end, you mentioned evolution, right? So, with that evolution, in addition to those, I think what becomes even more critical than it's ever been before is this real emphasis on data quality, data governance, and true data engineering skills. I think this is a differentiator. The technical know-how is a differentiator, but so are the capability and the mindset of data governance, because of something I mentioned earlier. AI models thrive off of data, right? So, you could buy the best SaaS tools in the world that have AI built into them, you could build your own AI models. But you need high-quality data that has business relevance. I'm not just talking about whether your Workday data is correct; it's whether you've manipulated and transformed that data into use cases that can be picked up by AI, and prevented it from hallucinating because you've given it high-quality data. I think that is the emerging critical skill, and it's where my team is certainly investing a lot of time and effort.
[0:13:37] David Green: This episode is sponsored by Worklytics. How productive is your organisation, really? Worklytics makes it clear with privacy-first insights from everyday work data. See how meeting volume, manager effectiveness, collaboration health, and AI adoption are impacting your team's focus, efficiency, and outcomes, so you can make smarter decisions faster. No surveys, no assumptions, just clear insight into work. Right now, Worklytics is offering podcast listeners a free 30-day trial of their Productivity Analytics dashboard. Learn more at worklytics.co/productivity.
Looking forward, and again gazing a little bit into the crystal ball, it may be something you're already thinking about, or it might be a more general kind of view on the function and evolution of people analytics: what capabilities do you think people analytics teams will need far more of in the next 12 to 24 months?
[0:14:58] Ashar Khan: Yeah, we spoke about that a bit already in terms of investing in data engineering. I think that's a significant one. I think business acumen is a perennial one. I do think that it sort of depends on the structure of the people organisation you're in. But HR teams are turning to people analytics teams as the custodians of AI. Talking to a lot of my peers, I think it's a mixed bag. There are some teams that already have the capability in house to be able to build AI rapidly and prototype, whereas others don't. So, if you're in one of those teams, I'd say building that capability is critical, because you don't have to become an expert in AI overnight, but you need to know more than the rest of your HR peers. And actually, the effort needed to learn these AI tools is tending to zero, right? So, I have a technical background, but I haven't coded in a while because my team is much better at it than I am. But I find myself tinkering with some of these low-code tools, and I'm always amazed by how much you're able to do without a technical background. Having a technical background helps in the implementation, but getting to sort of 80% viability is remarkably easy. And really, what you need to get to 80% viability is understanding the problem and understanding the application, which are non-technical skills, right?
So, that's what I would say. I think if you're part of a team that doesn't have the AI chops yet, don't let it be this insurmountable hurdle ahead of you. You can actually make a lot of progress really, really quickly if you use some of these low-code platforms that are available.
[0:16:36] David Green: And actually, something else you were saying, which is probably a look at the future in terms of future capabilities in the function: you talked about product mindset, and obviously you've got that responsibility with the solution design piece as well. And it's quite interesting, actually, because if you look at HR generally now, and this isn't necessarily about Autodesk, clearly some of the more transactional, operational stuff will be done by technology moving forward, and different companies will move at different trajectories around that. So, obviously, the positive vision of this is that it will free up HR to be more strategic and add more value, and I'm an optimist, so I think that's where it's going to go. But this means this product mindset is something that HR professionals generally are going to need; even if they're not directly responsible for it themselves, they're going to have to develop that mindset. And I just wonder, Ashar, based on your experience of doing this, what guidance would you offer to HR professionals or people analytics professionals listening who want to develop their product mindset a bit more?
[0:17:46] Ashar Khan: Yeah, it's a great prompt because I think it's not native to people who've grown up in HR to have a product mindset, because we've never had to before. What I would say is to develop that product mindset, I think unlocking from the current paradigm is step zero or step one, right? That's to unlock beyond what currently exists. So, I find a lot of times when I'm speaking to either stakeholders within our HR org or peers in other HR orgs, they're locked too much into what our current technology experience is or what our current product experience is. And then, there's a lot of power from starting from a blank slate to say, "Okay, what would we build if we didn't have anything right now?" And then, that allows you to really think about what the ideal scenario is. You may not be able to get there, but at least you have a destination.
In order to get there, the second recommendation I would make is really rapid prototyping. Again, this is something that's not inherent in the way HR has traditionally operated. We've operated in a way where, for example, if you want to implement payroll, which, if anyone listening ever has, my condolences, you sort of have the six-month project where you go away, you do a lot of deep work, and then four months later, you have UAT, right? And then you have a two-month UAT, and then you implement. Those timelines are not product-mindset native, right? So, you have to put things in front of stakeholders and users much more rapidly, be very comfortable with products that you know are not showtime-ready, to get early rapid feedback and prototyping. And actually, in my point of view, in our case at Autodesk, that doesn't end at product launch. So, we are rapidly prototyping and testing all our products continuously. Some of our products are built only for our people and places team. So, we have a lot of products built around our employee listening lifecycle.
So, every time we have a survey launch, we have specific tools that get updated that understand comments and understand sentiment and help people sort of cascade insights to the enterprise. So, my team is always thinking of this, "Okay, so our next launch is coming. What is new? How do we hook users in with this new functionality, new capability? What did we hear last time that we can improve this time? And critically, what did we monitor last time that we can improve this time?" So, it's not just if a user was irritated enough to tell you, but did we notice people drop off when certain behaviour was happening? Did we notice that people didn't find a certain functionality that we think they should find? I might be preaching to the choir here, but data is your friend. If you have that data around how people are using your product, and when I say product, it could be a product the people analytics team is building for itself as a people analytics tool, so it doesn't have to be an enterprise-grade product, then in either case, using that data to guide your next step is critical. And I think that will be something that a lot of people analytics teams will feel very comfortable doing.
[0:20:39] David Green: And I guess it's also, bringing that together, it's also having that customer mindset, isn't it, of developing products for the customers? They might be internal customers, but they're customers.
[0:20:49] Ashar Khan: Exactly right. I mean, you hit the nail on the head. You'd be surprised how many times that gets missed. And I'll admit it gets missed within my team; sometimes it gets missed by me, in terms of, are we really representing the customer's needs here? Because HR tends to sometimes have echo chambers within our organisation. So, we start viewing the customer as the CoE that we're building this for, or we're developing something for the comp team, right? We're actually not, we're developing something for all our managers to use; the comp team is our stakeholder, they're custodians of the tool, they are sponsors of the tool, but they are not the customer. And it's very telling when, even in shorthand, me or someone on my team will talk about the customer being the CoE, because it really tells you about the mindset. And that mindset is something that you have to change to make sure that in every product decision you're making, you're thinking of the end customer as opposed to stakeholders in the middle.
[0:21:46] David Green: What does your people analytics operating model look like at Autodesk? And again, we haven't covered this as a specific question, but you've definitely talked about parts of it already.
[0:21:58] Ashar Khan: Yeah, for sure. I can tell you a bit about how we're organised. That'll also help. But yeah, certainly, we've spoken about the capabilities we've evolved over time. From how we organise ourselves, we have a pillar called data science and engineering. So, that is our team that, as the name suggests, has our data scientists and data engineers. We have front-end engineers on this team as well, because we manage our own portal that houses all our dashboards, our research papers, any custom apps we build; all of that is housed in our custom portal. So, our front-end engineers are a critical resource to manage that experience. That is our biggest "product", and that faces all of our people team. So, data science and engineering is sort of the build engine for us. Then, we have our employee listening and research team. So, that team manages all our surveys, including our sort of regular all-employee surveys, but also our lifecycle surveys. But also, this is the same team that does our quantitative and qualitative listening. And it's very intentional to house these capabilities under the same leader, because there's a lot of synergy between deep employee listening and research from both "lenses". Now, this team works very closely with data science and engineering, because often when there's deep analysis happening, we have to pull in resources from data science and engineering to do so. So, those are our two primary mechanisms to build outcomes and build products.
Then, our third pillar is our insight partners. And our insight partners wear two hats. So, one, they take all of the products that our data science and engineering team and employee listening and research teams are making, and they go to market with that. And they're aligned to various parts of people and places. So, we have insight partners supporting various CoEs, like our total rewards team and our culture, diversity and belonging team, and each of the other CoEs we have, like talent acquisition and talent management. But also, these insight partners are aligned to people business partners, because the nature of a question that comes from a people business partner is very different to what a CoE needs, and we want to make sure we meet both those needs, especially because we see the people business partners as our direct line into the business, right? And that's part of our operating model, that we actually very rarely go directly to the business, if ever. It's always through a people business partner, because the people business partner has the broader context of the org. And talking about business acumen, business acumen is their entire job, right? So, they are the experts in business acumen, and we always involve them in any analysis, and certainly when we're cascading insights to leaders. So, the insight partners, that's the first hat they wear.
The second hat is that they are also our sort of solution designers. So, because they're working so closely with the CoEs, they're the ones who can influence that CoE around what the solution design should be. Again, we're not the ones that implement a technology, we're not the ones that maintain a technology, we actually don't have decision rights on it. But because we know the domain so deeply, we work closely with their data, we work closely to understand their strategies, and because we have a technical lens on it, our insight partners are well positioned to influence leaders across people and places on what their tech strategy should be and how it should evolve. And they are all sort of reading from the same playbook, because we have a defined people technology strategy for Autodesk. That's a multi-year journey, I have to admit, but it's important, because what we wanted to prevent is each insight partner developing their own niche technology strategies with the CoEs they support.
The newest pillar is our AI people strategy. It's brand new, we're actually in hiring mode for that, but it includes both those lenses, in terms of building AI for people and places, but also advising the enterprise on the people strategy implications of AI. So, that includes specialists on AI from a technical lens, so people who actually build AI products, but also subject matter experts on compliance and legal and ethical considerations on AI, and also change managers, because we want to make sure, especially for people and places, as we turn on AI capabilities in the near future, that someone manages that change centrally, someone who's closely aligned to the AI strategy. So, as opposed to managing the change like a typical SaaS tool rollout, we actually want to tell the story of what AI adoption in people and places looks like. So, that's why we've added that additional capability.
[0:26:31] David Green: I want to take a short break from this episode to introduce the Insight222 People Analytics Program, designed for senior leaders to connect, grow, and lead in the evolving world of people analytics. The programme brings together top HR professionals with extensive experience from global companies, offering a unique platform to expand your influence, gain invaluable industry insight and tackle real-world business challenges. As a member, you'll gain access to over 40 in-person and virtual events a year, advisory sessions with seasoned practitioners, as well as insights, ideas and learning to stay up-to-date with best practices and new thinking. Every connection made brings new possibilities to elevate your impact and drive meaningful change. To learn more, head over to insight222.com/program and join our group of global leaders.
For those listening that maybe aren't people analytics professionals and experts in quantitative and qualitative methods, can you just explain what you mean by that when it comes to listening, and how bringing quantitative and qualitative data together helps you deliver more impact from an employee listening perspective?
[0:28:02] Ashar Khan: Yeah, great question. So, as an example, we run our surveys for all employees every six months. We might find that a particular question, so let's say, and I'm going to make up examples here, we have a question on decision-making, and we might find that that number is trending in a direction that we don't fully understand. So, it's maybe trending down, or it's flatlining when we expect it to go up. When we involve our research team, the first lens they'll take is understanding that data more deeply. So, instead of looking at that snapshot of data to say, "The score is 80. Oh no, panic", they'll actually understand it deeply by linking it to other data sets across our enterprise. So, they'll combine decision-making with perhaps data points that are available in our data lake. They would say, "Okay, how does decision-making interplay with tenure? How does it interplay with the role you're in? Are there insights we can glean from that at the outset?" They could also do some advanced analytics at that stage to understand, for example through this concept of decision trees, what leads to someone having a lower score, and how it is linked to other outcomes.
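The decision-tree idea Ashar describes, finding which attributes best separate low scorers on a survey item, can be sketched in a few lines of Python. This is a minimal illustration, not Autodesk's actual pipeline: the data, column names, and the score cut-off are all invented.

```python
# Hypothetical sketch: link a survey item ("decision-making" score)
# to other attributes and use a shallow decision tree to see which
# attributes best separate low scorers. All data below is made up.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

# Toy joined dataset: one row per (anonymised) respondent
df = pd.DataFrame({
    "tenure_years":   [0.5, 1, 2, 8, 10, 0.3, 6, 7, 0.8, 9],
    "engagement":     [55, 60, 70, 85, 90, 50, 80, 88, 58, 86],
    "decision_score": [40, 45, 60, 85, 90, 35, 80, 88, 42, 87],
})

# Label a "low" decision-making perception (illustrative cut-off)
y = (df["decision_score"] < 65).astype(int)
X = df[["tenure_years", "engagement"]]

# A shallow tree keeps the splits interpretable for stakeholders
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# Which attribute drives the splits?
importances = dict(zip(X.columns, tree.feature_importances_))
print(importances)
```

In practice the interesting output is the tree's split rules themselves (for example, a tenure threshold below which low scores cluster), which a research team can then turn into hypotheses for qualitative follow-up.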
So, as an example, it might be that if someone generally has lower engagement and has lower scores on question X, Y, Z, then they're much more likely to have lower scores on decision-making. Or we could add non-employee-listening data points into it. So, we partner with Worklytics, as you know, to understand employee networks, right? On an anonymous basis, we are able to understand how people communicate across Autodesk using some of their communication data. So, we can actually combine that at the aggregate level into the same analysis to say, "Okay, are people who are most siloed in the organisation, people who don't have as much communication, people who aren't perhaps attending meetings as frequently, or people who are not interacting with a large part of Autodesk, do they have a lower perception of decision-making?" Or actually, the inverse might be true. They might have a higher sentiment on decision-making because they're not exposed to the decision-making at Autodesk.
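Combining aggregated collaboration metrics with survey results, as described above, is essentially a join at the org-unit level followed by a simple association check. A hedged sketch; all figures, column names, and org units here are invented, and real network data would arrive already anonymised and aggregated upstream:

```python
# Hypothetical sketch: join aggregated (never individual-level)
# collaboration metrics to survey results at the org-unit level.
import pandas as pd

# Aggregate collaboration metrics per org unit (invented numbers)
collab = pd.DataFrame({
    "org_unit": ["A", "B", "C"],
    "avg_cross_team_contacts": [12.0, 4.5, 9.0],
})

# Average survey score on the decision-making item, per org unit
survey = pd.DataFrame({
    "org_unit": ["A", "B", "C"],
    "decision_score": [78.0, 55.0, 70.0],
})

merged = collab.merge(survey, on="org_unit")

# A simple correlation hints whether more siloed units (fewer
# cross-team contacts) perceive decision-making differently
corr = merged["avg_cross_team_contacts"].corr(merged["decision_score"])
print(round(corr, 2))
```

With only a handful of org units this is directional at best; the point is that the join happens on aggregates, so no individual's communication behaviour is ever linked to their survey response.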
So, all of these questions allow us to truly understand the score. And that was one of our big goals as we inherited the employee listening programme. Because when we inherited it, it sat in a non-technical team. So, the most value you could get out of it was saying your score went up or down. But really, what we were able to do is answer second-order and third-order questions about what this means. So, that's the quantitative lens so far.
Once we get deep enough on the quantitative lens, we have some hypotheses. So, it could be people who are new to Autodesk have low decision scores, or people who are siloed; I'm making all this up just as an example. Then the qualitative lens allows us to add much more context to that. So, our qualitative research team would then go and do focus groups, they would do structured interviews, and truly understand from people. We would do it for a small subsection of Autodesk, because Autodesk is 15,000 people and it's not scalable to talk to everyone, but deeper conversations with a subset. And typically, we aim to get a representative subset of the group we're investigating. So, sometimes it's a representative subset of Autodesk, sometimes it's a representative subset of, for example, low-tenure employees, because that's the example I gave. So, that'll really allow you to understand beyond the metric, beyond, "Oh, decision-making is low"; it'll allow you to understand what people meant when they said decision-making was low.
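[Editor's note] The representative subset Ashar mentions is essentially stratified sampling: draw focus-group participants so each stratum appears in roughly the same proportion as in the full population. A minimal sketch, with a made-up `region` field standing in for whatever attribute defines the group under investigation:

```python
# Hedged sketch of proportional stratified sampling for qualitative research.
# Population shape, field names, and sample size are assumptions for illustration.

import random
from collections import defaultdict

def stratified_sample(population, stratum_key, sample_size, seed=0):
    """Sample proportionally from each stratum of the population."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for person in population:
        strata[person[stratum_key]].append(person)
    sample = []
    for members in strata.values():
        # Allocate seats in proportion to stratum size (at least one each).
        quota = max(1, round(sample_size * len(members) / len(population)))
        sample.extend(rng.sample(members, min(quota, len(members))))
    return sample

# Hypothetical population: 15,000 employees, unevenly spread across regions.
population = (
    [{"id": i, "region": "AMER"} for i in range(7500)]
    + [{"id": i, "region": "EMEA"} for i in range(7500, 12000)]
    + [{"id": i, "region": "APAC"} for i in range(12000, 15000)]
)

subset = stratified_sample(population, "region", sample_size=100)
counts = {r: sum(1 for p in subset if p["region"] == r)
          for r in ("AMER", "EMEA", "APAC")}
print(counts)
```

With these proportions (50/30/20), the 100-person subset mirrors the population, so the focus groups point at where the quantitative signal actually lives rather than a random slice of the company.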
But it's critical that you combine that quantitative and qualitative lens together, because if you didn't have the quantitative lens, you don't know where to point your qualitative lens. You're just going and talking to a random subset of people across Autodesk that don't represent where your issue is. So, that's one example. But candidly, the research space is one of my most favourite spaces in our team, because there's so much that you can do and there's so much intersection of different methods between quantitative and qualitative research. We have such rich data sets on listening, but also our core employee data. So, yeah, I could talk about this forever, to be honest, but I just wanted to make sure I gave one hypothetical example, to your point, for listeners to get familiar with what this might look like.
[0:32:25] David Green: Thank you for that, Ashar, I think it's really helpful. You provide a very powerful reason why, in the research that we've done at Insight222 on the evolving operating model for people analytics, employee listening should be a core capability of people analytics. Because not all of us are able to analyse qualitative and quantitative data, but the concept of putting them together and actually getting to the real answers and the real insights that help drive decisions and drive positive change is so important.
[0:32:59] Ashar Khan: Yeah. And actually, something I should have said earlier on: this is an evolution for Autodesk too. As I said, we inherited this from the culture team. As we inherited it, one of the biggest frameworks we put together was a trust framework around data. So, we actually can't access anyone's data on an individual basis. Everything sits behind lock and key. Everything we're able to aggregate is through queries that bring out aggregate scores, but we're able to query it in a sophisticated manner. So, we don't have an ability to see, "How did Ashar respond versus how did David respond?" And I think that's critical, because that's the way we're able to actually maintain a very healthy response rate within our listening system, and at the same time, get deeper insights. Now, I'll say that's a really delicate balance, because I've seen it done other ways, where the analytics leads the change management too much, and actually you end up losing trust.
I didn't mention it earlier as a fear, but in this context, one of our fears was that we might end up losing employee trust, and that would be indicated through lower response rates, or through fewer comments or comments that indicated people weren't engaged. Actually, we found that not to be the case at all over the past few years. But since we're on this topic, I wanted to make sure I called that out for listeners. That's a huge piece of work, to build that trust and maintain that trust.
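[Editor's note] The trust framework Ashar outlines, individual responses behind lock and key with only aggregate scores queryable, is commonly implemented with a minimum-group-size suppression rule. A hedged sketch of that idea; the threshold of five and the data shapes are assumptions for illustration, not Autodesk's actual system:

```python
# Minimal sketch of anonymity-preserving aggregation for survey data.
# Group names, scores, and the suppression threshold are all hypothetical.

MIN_GROUP_SIZE = 5  # suppress any aggregate computed over fewer people

def aggregate_score(responses, group_key):
    """Return mean scores per group, hiding groups below the anonymity threshold."""
    groups = {}
    for response in responses:
        groups.setdefault(response[group_key], []).append(response["score"])
    results = {}
    for group, scores in groups.items():
        if len(scores) < MIN_GROUP_SIZE:
            results[group] = None  # too few respondents: never exposed
        else:
            results[group] = round(sum(scores) / len(scores), 1)
    return results

# Hypothetical responses: individual rows never leave this aggregation layer.
responses = (
    [{"team": "Engineering", "score": s} for s in (70, 75, 80, 85, 90)]
    + [{"team": "Design", "score": s} for s in (60, 65)]  # only 2 respondents
)

print(aggregate_score(responses, "team"))
```

The point of the design is that analysts can query in sophisticated ways, but no query path ever returns "how did Ashar respond versus how did David respond": small groups come back suppressed.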
[0:34:26] David Green: Yeah. And actually, again, listening to what you said earlier, when you were talking through how you're building that additional AI people strategy component in your team, you talked about specialist AI skills. But the second thing you mentioned was subject matter experts on compliance, on ethics, on trust, because if you don't build that foundational trust layer, you're in real trouble, I think. And I think it's again good guidance for anyone that's looking at doing something similar in their organisation: think about that at the outset, not as an afterthought.
[0:34:58] Ashar Khan: Yeah, and we really had, as a leadership team, lots of discussions about whether that role sits in a people analytics team or whether it stays in legal. We already have legal business partners that are deeply involved, and this profile likely won't have a law degree, and likely won't be a true subject matter expert in every country's AI laws as they evolve, because we already have those legal experts in-house that are aligned to our people and places. But there's real value in having someone in the team whose entire job is to make sure we are going through the internal processes we've already established, someone with the rigour to make sure that every POC has legal involved from the outset, as opposed to, "Oh, we've done the POC for the past three months, we're going to consider scaling up", and then legal gets involved and we realise we're at risk, right?
So, it's a question that leaders should ask themselves: does it sit in their team or does it sit in legal? Often, there's no right answer. For us, we found that actually this is an area that we want to overinvest in. So, there is an overlap, for sure. We continue to think of our legal business partners as the decision-makers, as the subject matter experts in this space. But having someone within the team who can ensure momentum and ensure rigour on it gives us one more layer of certainty that this is going to be looked at.
[0:36:23] David Green: Very good. We're going to go back three or four questions, Ashar. We've had a great conversation so far, and I appreciate we probably overlapped some of the things I was planning to ask you, which is great. That's how the best conversations should go, to be honest with you. You mentioned a couple of things that I'm going to pull back on. You talked about the danger of metric fixation. So, clearly, again, you've been on a journey with people insights at Autodesk and achieved a lot over the last five years. I mean, it's obvious just listening to you without even getting into the detail. Sometimes we hear from HR leaders, when they hear people analytics, they jump straight to metrics and dashboards. So, it's a two-part question, really. Where have you seen metric fixation create not necessarily bad decisions, well, it could be bad decisions, or it could be not utilising the power of analytics as much as you could do? And then, how do you get past that number and overcome it?
[0:37:31] Ashar Khan: Yeah, it's a fascinating question. Certainly, I see metric fixation across our industry. If we find metric fixation in a people analytics team that we're managing, or in the enterprise that we're a part of as a people analytics leader, I think the onus is on the people analytics leader to influence and change that, right? So, it's not something the people analytics leader can sit back from and say, "Oh no, this enterprise is really fixated on metrics. I wish I could do more to influence their decisions, but they only want the metric". Because really what's happening is, likely we're not giving them the reasons they should care beyond the metric. So, at many companies, or even early in my career, when we didn't have the rigour in people analytics, what we were delivering was the metric, right? So, what would end up happening is leaders would start managing the number rather than managing the decision the number was meant to inform.
So, as an example, giving regular updates on attrition and setting attrition targets, there's a healthy way to do that. But not going one step further and saying, "This is why the attrition target matters. This is how attrition higher or lower than the target ends up impacting the company. These are the ways that we believe, based on our quantitative or qualitative research, you can influence attrition. These are the drivers of attrition". If you don't do all of that work, and you just give people an attrition number, "David, your team's attrition was 10%, but our target was 9%. You're 1% over", right? And if leaders see that message every quarter, obviously they're going to manage the number. They're going to start talking about, "Oh, our attrition needs to be lower", without really a sense of how to do that, or of whether a 10% target is even relevant for David. Perhaps David leads a team that in the industry generally has a 20% attrition rate. So, David's attrition rate is very healthy, right?
So, I think what's happening is metric fixation is a natural consequence of incomplete value propositions that the people analytics leader is giving, right? I would say, if you have the full value proposition that you're putting in front of your HR team and your business leaders, it's almost impossible to have metric fixation, because you've given them the full story about, "This is the metric, this is why it matters, this is how you can change it". And at that stage, I would imagine most astute business leaders would move away from the metric and more about the levers they can use to influence the metric.
[0:40:09] David Green: And I think it's back to what you said earlier as well: "Okay, why are we measuring that?" And then, that's where your business acumen perhaps comes in. And as you said, is it good? Is it bad? Is it going up? Is it going okay? How do I compare to comparable teams in the business? How do we compare to our competitors? What about the geographic location? Who are we losing people to? Do we care? Are they the skills that we want to retain within the organisation? Why are we losing them? What can we do about it? There are so many follow-up questions beyond the metric, aren't there, which then help the manager actually take actions that not just improve the metric, if that's what you want to do, but actually improve it in the right way. So, very good.
[0:40:45] Ashar Khan: That's the fun work. Measuring attrition is no one's favourite part of the job. And I think that's where attrition gets a bad rep in people analytics, because it was where we started, but it's still important. The interesting part starts after you've measured: the insight of the why and the factors that are influencing it. So, there's so much more interesting work beyond it. And beyond anything else, for us as scientists and constant tinkerers and builders, it's more fun to do that work than to just measure the number and call it a day.
[0:41:30] David Green: Yeah, and as I've heard others call it, the metric is the 'what', but the 'so what' and the 'now what' are what we can really help leaders get to. Very good. One question on AI then, and then we'll have one for CHROs or senior HR leaders that are building a people analytics function. So, again, one of the things that we're seeing with AI is all these stories about CEOs saying they need fewer people. I'm not sure that's strictly true, by the way. It's a bit like saying all CEOs want everyone back in the office five days a week. That's not true either. Some business leaders have jumped to the conclusion that AI means we need fewer people. I'd love to hear your take on that, and particularly, in general terms, how HR and people analytics teams can support the business with this.
[0:42:19] Ashar Khan: Yeah, I mean, it's a narrative I've also heard externally. And I think there's a bit of headline management happening there, managing eyeballs. But to anyone who's thinking even in some depth about it, that's deeply incomplete framing. I don't think anyone who's thought about this in any depth can say that AI means we need fewer people. The reality is, AI is going to change the nature of work fundamentally, right? And that's going to happen. That's already happening for a lot of roles, and over time will happen for all roles. I think it'll automate certain portions of most people's roles, 'portions' being the critical word. And then, I think it will augment everything else that's left. And really what happens is, we don't necessarily need fewer people, because actually people need to move their focus up the value chain. So, human focus becomes the imperative, even more so than before, because in the past, human focus wasn't needed for a lot of the manual tasks that AI will automate. So, this concept of judgment and validation, of being able to really discern whether what the AI is doing is the right thing, becomes critical. I'm sure you've heard about the concept of human-in-the-loop. Human-in-the-loop is critical, because AI is good at its job when that job is well-defined and when data is pristine. But both of those conditions are rarely, if ever, met.
So, I think often, the hyperbole around needing fewer people doesn't accurately assess how critical humans will still be in judging the outputs of AI. But again, as I said, we're able to move our focus up the value chain. Rather than doing the work, we are supervising the work that agents are doing and managing the work that AI is doing. So, all of that becomes actually deeply more interesting for everyone involved. Beyond the judgment and validation I mentioned, I think a critical emerging skillset with AI will be this concept of systems thinking. So, how do you think of your entire work as a system, as opposed to, "This is my task, this is what I'm doing"? Because that systems-thinking approach allows you to re-engineer your work with AI at the core, as opposed to AI slapped on top of your existing broken tasks and broken processes. Too often, I see people jump to, "Oh, my job is X, Y, and Z. I'm just going to have AI do those exact same steps", as opposed to taking a step back to say, "I want to get to the outcome of Z. I don't know if I need X and Y. So, let's reinvent and let's think of new ways of optimising this work with AI natively built in, as opposed to slapped on top". And that's a huge piece of work. None of that means that we necessarily need fewer humans.
So, as I mentioned, anyone who's thought about this in any depth will quickly come to the conclusion, it's a naïve point of view to say AI will definitely mean we need fewer people.
[0:45:49] David Green: Two questions to go, Ashar. So, the first one, you've been in people analytics for a long time, you've built this function at Autodesk, and it's clearly adding more and more impact year on year. If you were advising a CHRO who's maybe building a people analytics function today, or maybe more realistically shifting from a reporting function to something that's plus-plus, what's the key guidance that you'd give them?
[0:46:19] Ashar Khan: Yeah, it's a great question. One thing I'll outline at the outset: I don't think there's a one-size-fits-all model or approach. I've been part of many people analytics teams, and before this, I founded other people analytics teams. The way I ran those teams was very different to the way I built and run the team at Autodesk, because the needs of the enterprise were fundamentally different, right? So, as a CHRO thinks about this, I would ask them to be crystal clear on who this function serves, who the customer is, going back to your framing earlier. And often, there's not only one answer, but it's important to delineate who the various customers are. That influences decisions around capabilities, decisions around scale, decisions around SaaS models versus in-house. A lot of people analytics you can buy at this stage, right? That's gaining big momentum. But based on who it serves, you might find that you want to take the time to invest and build from scratch to get more bespoke analytics. So, who the function serves is a really critical question, which sometimes gets glossed over with, "Oh, it just does analytics". But to what end? That is sometimes not filled out.
Then, to build on that: what problems is it looking to solve? Beyond the basic metrics and reporting layer that you're locked into, what are the business problems that need to be solved, and that will be illuminated through a people analytics approach or data? Both of those questions really inform how the team should look and what model works for that team. Beyond that, a few tips I would give. For an early team, the most important thing to do is invest in data foundations. I might sound like a broken record, but truly what allowed us to scale was a really solid data foundation built by our data engineering function. They are the backbone of our team still. And as I said, we're over-investing in them even more, because with the advent of AI, we see that capability becoming more critical. So, that's something that, again, I sometimes see people analytics teams skip over to get to the product quicker. They will just say, "Okay, the data foundation, I'm going to pull from the HCM", or something.
Then, finally, the last thing I would say is, try to resist the urge to scale too fast, or to scale before credibility is built. It may seem, as I'm talking now, like we have a sizeable team. I think the reason we have a sizeable team is that we started as a tiny team. We started as a team of me and one data engineer, and the data engineer was a data scientist-data engineer hybrid. And over time, by showing value, we were able to scale up pretty rapidly over four years. But if I had made the business case to my CHRO three months in, to say, "Actually, my long-term vision is this team with these pillars", I would have been laughed out the door, right? So, there's a lot to be said for early wins, and not just early wins, but business-relevant early wins, to build credibility before you start approaching the topic of scale.
It's important to call this out because often, the teams that are talking about their analytics practices are the ones that have already scaled. So, it seems like that is the only way to have a team, but actually not, that couldn't be further from the truth. Those teams got to that scale and size through building credibility with business-relevant quick wins.
[0:49:57] David Green: Very good. Some really, really good advice there. I'm going to see what you can do now with the last one, which is the question of the series, Ashar. So, get your crystal ball out. This is a question we're asking every guest in this series. What do you foresee to be the role of HR in 2030?
[0:50:17] Ashar Khan: That's a challenging one, because even if you asked me what the role of HR will be in 2028, I would struggle with it. And some of that comes because my answer to what keeps me up at night is the pace of change, right? But extrapolating out to 2030, or as far as I can extrapolate, I think HR becomes almost this architect of organisational capability, of how the organisation creates value through employees. I think right now we're skimming the surface of that, because right now we have too much of a focus on programmes: what is my comp programme, what is my talent acquisition programme? And I think in order to get to that concept of the architect of organisational capability, I know it sounds sort of heady, but really what we need to think about, instead of programmes, is this concept of systems, and systems being interconnected, not fixated on specific CoEs. So, what is our workforce system? And that workforce system has each of our existing CoEs, which may not exist in the future, implicated, from talent acquisition to comp to people business partners to learning and development. So, thinking through that systems mindset is going to become critical.
Underpinning all of that, certainly by 2030, I think we have to have cracked this concept of skills. And this is something, I think, that we've spoken about many times, David, and that HR as an industry has spoken about so many times. But I think that's a critical unlock, and I hope we get there before 2030, because really, that allows us to speak more intelligently to the business than we currently do. Right now, there's a gap between our understanding of the business and what we can do, and I think the way to bridge that gap is to truly understand skills, because they become the currency through which HR is able to add value, and then that gets implicated in everything we do.
I'd be remiss not to mention AI, right? And I've tried to moderate how much I mention AI in this call, just because I don't want to add to the hype cycle, but I think it truly belongs here. By 2030, the H part of HR might also need to be augmented with some sort of AI assets. This is where the 2030 framing truly makes sense to me: by 2030, I do see us becoming fully integrated with human-AI collaboration, so teams being blends of human and AI. By that time, AI certainly would have, or likely would have, gained a lot more autonomy to make decisions on behalf of the enterprise, with humans treating AI truly as a peer. Right now, I don't think we're there yet. I think it's aspirational to say even in the short term we can get to peer status, and the reason we have things like human-in-the-loop is because we do need to supervise. But by 2030, I think we can certainly start viewing AI as a peer.
So, I think those are the biggest ones. As for how we get there, I think the trajectory is correct. What we're seeing from many sophisticated HR teams, really embracing the challenge of AI not just through the HR lens but through the enterprise lens, is our way to become that strategic advisor to the business on how to get to that 2030 vision.
[0:53:42] David Green: Very good. Well, at the risk of recency bias, Ashar, I've thoroughly enjoyed this conversation. Of the ones I've had with your peers in people analytics teams, it's one of the ones I've enjoyed the most, I think. Can you share with listeners how they can learn more about you and your work, and also more about Autodesk, if they want to?
[0:54:04] Ashar Khan: Yeah. So, if you're interested in learning more about me, definitely find me on LinkedIn. That's the best avenue. I share ideas and follow leaders and thought leaders like yourself, David, and others. I think one of my most favourite things about our people analytics community is that it truly is a community. So, I love engaging with a number of peers who I've known for a long time through the practice, but also with newcomers. And that's one of the most exciting things: there's a really strong, fresh batch of talent that's entered people analytics and is looking to upend traditional wisdom. So, if you're one of those, please, I'd love to hear from you for sure. And then, to learn more about Autodesk, the corporate website is a great way, and finding us on social media is a great idea too.
Autodesk, I'd say, is one of those companies that you may not have heard of, but you've certainly interacted with something that was built using our products. We are the company that allows you to design and make anything. So, it's very likely a product you're using, or a building you're sitting in, or a TV show you're watching was built using Autodesk software. So, that might be the parting challenge I leave for listeners: go research Autodesk and find something around you that was built using Autodesk. That's something I enjoyed a lot when I joined Autodesk, learning about everything the company does.
[0:55:35] David Green: Well, Ashar, thanks very much for being on the show. I look forward to seeing you probably at an Insight222 meeting in the coming months.
[0:55:42] Ashar Khan: Yes, you will. Thank you, David. Thank you for having me. I really enjoyed this conversation.
[0:55:47] David Green: Thanks again to Ashar for joining me today for what was a hugely enjoyable conversation. To everyone listening, I'd love to hear your reflections. What resonated most with you from the discussion with Ashar? You can join the discussion on LinkedIn. Just look out for my post about this episode and share your thoughts. I always enjoy hearing what you take away from these conversations. And if you found today's episode valuable, be sure to subscribe, rate, and share it with a colleague or friend. It really helps us keep bringing these kinds of thoughtful, forward-looking conversations to HR leaders and professionals around the world. To stay connected with us at Insight222, follow us on LinkedIn, visit insight222.com, and sign up for our bi-weekly newsletter at myHRfuture.com for the latest research, tools, and trends shaping the future of HR and people analytics.
That's all for now. Thank you for tuning in and we'll be back next week with another episode of the Digital HR Leaders podcast. Until then, take care and stay well.