myHRfuture

Episode 98: How to Measure Culture and Organisational Behaviour (Interview with Hani Nabeel)

This week’s podcast guest is Hani Nabeel, Chief Behavioural Scientist at iPsychTec, who discusses how behaviours are the leading indicator of culture, and why he thinks it is time to stop relying on measuring the outcomes of culture, like engagement, during a culture transformation.

Throughout the episode, Hani and I discuss:

  • The use of neural thinking, neural technology, and predictive analytics to measure culture

  • How to connect culture to business outcomes and demonstrate the value of culture transformation

  • Various case studies from companies who are tackling cultural transformation and measuring success

Support for this podcast comes from iPsychTec. You can learn more by visiting https://ipsychtec.com/

You can listen to this week’s episode below or via your podcast app of choice; just click the corresponding image to access it via the podcast website here.

Interview Transcript

David Green: Today, I am delighted to welcome Hani Nabeel, Chief Behavioural Scientist at iPsychTec, to The Digital HR Leaders Podcast.

Hani, it is great to see you again. Thanks for being on the show. Can you provide listeners with a brief introduction to yourself and your role at iPsychTec?


Hani Nabeel: I am a behavioural scientist, but I didn’t start my career in behavioural science or psychology, oddly enough. I started life in engineering and physics. Then I spent many years, 10 in fact, flying passengers all around the world as an airline pilot. I made it up to captain and then did probably the strangest thing a pilot could do, which is leave the flight deck. I had become fascinated by human performance and the human role in the flight deck, so I went after my passion.

I went back to university to study for two master’s degrees: one in occupational psychology with a focus on quantitative behavioural science, and one in advanced research methods. All of that really started my journey into understanding behaviours and how they manifest in our workplaces. My early work was actually back on the flight deck, understanding in particular how cockpit design nudges the correct behaviour, which was great fun. That drove a lot of the insights and the early beginnings of the research that led to the development of CultureScope. This is where CultureScope, and particularly iPsychTec, comes into the story.

It took us seven years and the largest ever study of its kind to develop CultureScope. When I say largest, it covered over 51,000 participants across 60 diverse organisations in 61 countries, and it was all about understanding behaviours and how to measure them accurately. And from that, how do we provide analytics? Particularly insightful and actionable analytics, as well as predictive analytics and a roadmap for improving culture management.
So it has been part of an exciting journey.

iPsychTec as a business, as you might imagine, is heavily in the world of people analytics. However, we are not generalists; we focus on the behavioural science side of analytics.


David Green: We have known each other for a while, but I didn't know you used to be a pilot. 


Hani Nabeel: For the last two years of my flying career I was even promoted to captain, and then I left, which was a bit strange.


David Green: So you used to fly people to the destinations they wanted to go to, and now, you could argue, you are helping companies get to the culture and behaviours they need to be successful. Nice analogy there.

So let's start with culture. What do organisations often get wrong about culture? What are some of the common pitfalls that you have seen along your journey?

Hani Nabeel: The pitfalls I see now are the same ones I saw many years ago, before I even started the research. When you ask any organisation now, how do you deal with culture? What metrics do you look at? You are going to get some interesting and varied answers, some of which are more common than others.

They will say, we do engagement surveys, so we understand people’s opinions and how engaged they are. We may look at customer or client feedback: do they stay with us? They may look at business performance. In regulated industries, they may think about incidents, accidents, issues, all sorts, a whole platter of analytics.

If you were to plot all of that together, if you and I made a master sheet now, you would notice that all of it is important, but it is all outcomes of culture. So we seem to constantly measure the outcomes of culture. There is nothing wrong with that, of course, but I want to draw an obvious insight from it.

If you want to change anything about culture, to improve it, maintain it, sustain it, you have got to measure the inputs, and the obvious input, you don’t have to be a psychologist to see it, is how things get done. It is behaviour. So it is odd that we are measuring only the outcome because, as basic common-sense science would tell you, if you are not measuring the input then you are just chasing your tail. You are constantly chasing lagging indicators, what has already happened. But to change anything, you need to work on the input.

So I often hear, and I am sure you have too: we run an engagement survey, it tells us all we need to know about culture. My argument is no, it doesn’t. It tells you about an outcome of culture. You need to understand behaviours, the leading indicators, to know what is really going on and how to impact the future as well.


David Green: Yep. And then how, by changing some of those behaviours, you can get better outcomes, or worse, not that you would want worse.


Hani Nabeel: Yes and that is the critical missing piece in the story. 


David Green: In terms of culture, there are so many definitions around, but I would love to hear yours. What is your definition of culture?


Hani Nabeel: So I want to simplify it, even if I am using simple language: it is the way things get done in an organisation. It is the way your people behave towards each other, towards the outside world, towards your partners and your suppliers, all of this really. And interestingly enough, at any given second in every organisation, of whatever size, people are enacting that culture.

So every day, without having to think about it, people become somewhat programmed to enact it.
And that is really a simple definition to think about.


David Green: And that is the great link with behaviour, because ultimately every person in your company is a representation of your culture, and that manifests itself in the way they behave, whether that is towards customers, each other, suppliers, etc.


Hani Nabeel: Absolutely correct.


David Green: I think that is a nice definition of culture, nice and easy for everyone to follow and understand. So how do you connect culture to business outcomes and really demonstrate the value of culture transformation? And perhaps as part of that, we can discuss how research methods have advanced over the last 10 years in terms of measuring culture?


Hani Nabeel: Yeah. This is the question that, interestingly enough, I asked myself even before I started the research, in exactly the same words. So it is fascinating to answer it at this stage now, because it is how CultureScope was born.

So to answer your question accurately, the first thing we have got to think about is this: we know that behaviour is the leading indicator of culture, and we know culture is a key differentiator, it is your brand value. I am sure, and hopefully you agree, it is your leading edge over your competitors, right? It is your culture. Products and services don’t make themselves, and down the road they are not going to be your competitive edge.

So ultimately we need to understand culture: measure it, and let’s be accurate, bottle it, change it, drive it.
If you can’t measure it, you can’t fix it or maintain it; really simple stuff.

So the first thing you need to do in your journey towards good analytics is ask, how do I measure culture accurately? It has got to be a valid diagnostic; you can’t put rubbish in and expect great analytics and insights out, it is pointless.
So that is the first part of the journey.

Measurement featured heavily in our research, so let me give you and your audience an insight into that. We started there because we didn’t know, there was no existing research: what behaviours can we measure that are relevant for culture? So early on, we made a shortlist of every behaviour we could measure using advanced psychometric techniques and online diagnostics.

That came to 229 behaviours in total, far too many of course, and along the way we eliminated whatever doesn’t vary from one company to another, because if it doesn’t vary, you are not measuring anything.

We also started connecting those behaviours to multiple outcomes: performance, people leaving, people staying, people happy or not, incidents, accidents, you name it.
We ended up with 30 factors, which is the base measurement we have today: 30 factors of behaviour that are important because they vary and they drive multiple outcomes.
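For readers who want to picture the elimination step Hani describes, here is a minimal sketch, in Python with invented column names and thresholds, of dropping behaviours that barely vary between organisations and keeping those that also relate to an outcome. It is a generic illustration, not iPsychTec's actual pipeline.

```python
# Hypothetical sketch of the elimination step described above: drop behaviours
# that barely vary between organisations, keep those that relate to an outcome.
# Column names and thresholds are illustrative, not iPsychTec's method.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 5000
behaviour_a = rng.normal(0, 1, n)            # varies a lot across people and orgs
behaviour_b = rng.normal(0, 0.05, n)         # barely varies anywhere
df = pd.DataFrame({
    "org": rng.integers(0, 60, n),           # 60 organisations, as in the study
    "behaviour_a": behaviour_a,
    "behaviour_b": behaviour_b,
    # a binary outcome (e.g. stayed vs left) loosely driven by behaviour_a
    "outcome": (behaviour_a + rng.normal(0, 1, n) > 0).astype(int),
})

kept = []
for col in ["behaviour_a", "behaviour_b"]:
    between_org_var = df.groupby("org")[col].mean().var()  # do org averages differ?
    outcome_link = abs(df[col].corr(df["outcome"]))        # crude link to the outcome
    if between_org_var > 1e-3 and outcome_link > 0.05:
        kept.append(col)

print("behaviours retained:", kept)   # expect: ['behaviour_a']
```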

The second key point, and this is really important, is that we measure behaviours in two ways. We have two measurement points, not one: for each person in the organisation we measure what they do at work, and, in a separate measurement point, what behaviours they observe around them. So we get to know, en masse, what the system around them is driving and how they live within it. You have got what I lovingly call “the place and its people”:
we get to know what the place drives and how the people live it.

The reason being that, as sadly you will have seen, culture has often been cast as the villain. How often do we hear that, and we keep saying, it is the fault of these people here. Well no, it is not; sometimes the system is driving it.

When I say the system, I mean the place: rewards, organisational structure, operating models. They drive behaviours; however, we rarely design them to drive the right behaviours.

So I am a big advocate of what you may have heard called “culture by design”, and we can talk about that later. It is really about designing, and don’t be scared of that. You design your financial systems, you design your products, but have you designed your culture? People say, no, it designs itself. No, no, no: you can design it to drive the right behaviours you want to see.

The second part of the answer to your question is that we have got to connect all these behaviours to a plethora of outcomes. I usually start with one or two, so I challenge any organisation I work with to think about the thematics. Is it compliance? Is it risk management? Is it innovation? Is it creativity? Is it performance? We invite them to bring a plethora of outcomes and, believe it or not, CultureScope can automatically link the behaviours measured with any outcome you throw at it. In fact, in our latest development programme it is a neural engine that thinks about the data for you, so you don’t have to be an expert mathematician. We connect the outcomes with behaviours to tell you three critical things. Here we go. First, why do things happen? You might have a lot of insight into what has happened today; I am not going to add value there, you know that. You know whether people are engaged or not, whether people have left or not, whether accidents have happened or not, whether people have performed well or not, but you don’t know why. We have got to tell you why things happened.
Second, I have got to tell you how to improve things. We do that through predictive analytics, but also with what is technically called path analysis, which gives you a roadmap: which outcomes do I need to work on first, second, third, fourth? If you have 20 outcomes you want to improve, imagine I could tell you the optimal route to work through them.
Don’t invest in all 20 at the same time, you don’t have to do that, but know what to do first, second, and third.

And the final thing we absolutely must do is tell you where.

So where am I great, and where do I need to actually drive those behaviours? I talk about behaviour presence and absence as a segue into that.

And if I can tell you all of this, then you have got, I hope through common sense, a genuinely brilliant way of managing your culture. Something that, as you might imagine from your own journey in people analytics, is missing.
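As an illustration of the path-analysis "roadmap" idea Hani mentions, here is a toy two-step path model in Python. The variable names (a speaking-up behaviour, near-miss reporting, incidents) and the use of plain OLS regression are assumptions made for the sketch, not CultureScope's actual model.

```python
# A toy illustration of the "roadmap" idea via a two-step path model:
# behaviour -> intermediate outcome -> final outcome. Generic sketch with
# made-up variable names, not the CultureScope engine.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
speaking_up = rng.normal(size=n)                            # a measured behaviour
near_miss_reports = 0.6 * speaking_up + rng.normal(size=n)  # intermediate outcome
incidents = -0.5 * near_miss_reports + rng.normal(size=n)   # final outcome

# Path a: behaviour -> intermediate outcome
a = sm.OLS(near_miss_reports, sm.add_constant(speaking_up)).fit().params[1]
# Path b: intermediate outcome -> final outcome (controlling for the behaviour)
X = sm.add_constant(np.column_stack([near_miss_reports, speaking_up]))
b = sm.OLS(incidents, X).fit().params[1]

print(f"indirect effect of speaking up on incidents: {a * b:.2f}")
# A sizeable indirect effect suggests working on the behaviour first, then the
# reporting outcome, as the order of the roadmap rather than tackling everything at once.
```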


David Green: So to take your second point, how to improve things: let’s say you want to improve safety. You might have to focus on leadership behaviours, for example, or on training for those directly involved in safety, before you actually see an impact on safety itself. Just a simple example there for listeners, hopefully.


Hani Nabeel: Absolutely. 
So in the CultureScope world, to simplify the data they see: remember, they see how their people behave, what the place is driving, and the outcomes. With culture by design we connect these, so what they can do with that data is say: if it is the people whose behaviours are absent, focus on capability and training; you have got to focus on the people.
If it is the place that is not driving those behaviours, focus on two things: opportunity and reward. How do you nudge behaviours from the place? Give people the opportunity, and then reward the people who enact it.

You have got to get all three; you cannot isolate one. You can’t say, well, our people are there, so it doesn’t matter about the place. No. To get the behaviour to manifest fully, to use the term you used before, we have got to focus on whether people are capable, number one, and then that the opportunity is there and the reward is there.


David Green: Research methods have obviously advanced over the last 10 years in terms of measuring culture, and the work you have done during that time is proof of that. Presumably the research methods have advanced in line with our technological capability, and probably analytics as well. I would love to hear your views on how things have advanced over the last 10 years or so?


Hani Nabeel: Absolutely. Just generally, first of all, the people analytics world is changing rapidly and solving so many complex problems. Now I could argue, again as a big advocate of our industry, that HR generally should absolutely embrace this and run with this. I still see some fear or resistance, if you don't mind me saying.

The technology has advanced so much that you have now got incredible cloud computing that is safe. We are obviously a cloud application, a software-as-a-service product. That product is basically all there for you and can do incredible heavy lifting for you: as an example, measuring those behaviours and connecting them with a plethora of outcomes, complexity that would take you hours and days to try to understand. Through neural thinking, the neural brain for us, it basically does all the multilevel modelling, all the regression analysis, understands whether it is a linear outcome or a binary outcome and which regression applies, to start really giving you the insights on the behaviours you need to work on. Here are the two or three behaviours to work on, and here is your ROI. We even tell them, you can improve your outcomes, or the one outcome, by X amount, by 10, 20, 30, if you just focus on these behaviours.
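To make the "linear outcome versus binary outcome" point concrete, here is a generic sketch of automatically picking a regression family from the outcome type and then ranking behaviours by the size of their effect. The behaviour names and the simple coefficient ranking are illustrative assumptions, not how the CultureScope engine works internally.

```python
# Generic sketch of the kind of automation described above: pick a regression
# family based on the outcome type, then rank behaviours by their effect.
# Variable names and model choice are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def fit_outcome_model(behaviours: pd.DataFrame, outcome: pd.Series) -> pd.Series:
    X = sm.add_constant(behaviours)
    if set(outcome.unique()) <= {0, 1}:          # binary outcome -> logistic regression
        model = sm.Logit(outcome, X).fit(disp=0)
    else:                                        # continuous outcome -> linear regression
        model = sm.OLS(outcome, X).fit()
    # rank behaviours by the absolute size of their coefficient
    return model.params.drop("const").abs().sort_values(ascending=False)

rng = np.random.default_rng(2)
behaviours = pd.DataFrame(rng.normal(size=(1000, 3)),
                          columns=["collaboration", "accountability", "risk_awareness"])
attrition = (0.8 * behaviours["accountability"] + rng.normal(size=1000) < 0).astype(int)

print(fit_outcome_model(behaviours, attrition))  # 'accountability' should rank first
```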

So it used to be really complex; it would take 5 to 10 mathematicians sitting in a darkened room with all the SPSS and so on to try to give you the solution.

For us, you press a button, and in maybe 20 seconds you have got a solution.

So the world of compliance and data, and the safe use of cloud computing, and there are many platforms out there that make it safe, means that this is at your fingertips. This stuff is out there, and you can’t just sit back and continue doing what we have done before, because it didn’t help us.
I would argue that if culture had been managed so well that it had been the hero all these years, I probably wouldn’t exist, and you and I probably wouldn’t even be doing this podcast, because everything would be great. But it isn’t.


David Green: It wouldn't be a problem to solve.

Hani Nabeel: Sadly, as you know, a conveyor belt of big, best-known brands in the world have been in trouble and been fined heavily for it.


David Green: And I guess the ability to put this into technology makes the quality of the research that informs that technology even more important.

So you have told us a little bit about the research programme that preceded CultureScope, and I will take the next two questions together.

It might be nice to summarise that initial research programme, the one you said took over seven years.
But I also understand that you have recently revalidated that research as well. So maybe as part of your answer, talk about the initial work and then how and why you revalidated it?

Hani Nabeel: So the original research was five studies, with a sixth study tagged on later for outcomes, and it took seven years. Basically, because we were going from the ground up, we had to go for maximum behaviour measurement, maximum sample size, and maximum diversity in sampling, because we just didn’t know. It was unusual: no hypothesis, ground up, let the data tell you.

And with that, we tested multiple different diagnostics. What I mean by that is forced-choice, scenario-based items.
We also deployed item responses in a matrix of behaviours, ranked most to least. And we deployed your typical normative Likert scale to measure behaviours; we didn’t know what would work.

So you can see from that research that we just tried to go from the ground up, and it taught us a lot about behaviour measurement, or about the validity of behaviour measurement to be specific, and its reliability. That is why we homed in on a diagnostic that is forced-choice and scenario-based, mixed with item response theory, which ranks the responses. Together that gave us validity. We also started testing something that had never been done before: can the diagnostic be dynamic? So in CultureScope today we use our own engine for computerised adaptive testing. Imagine that each respondent answering the diagnostic is, in real time, connected to the brain, and it is measuring accuracy and validity per behaviour measured. And we thought, we are not stupid, we are not going to keep asking the same question again, so we vary. We might ask different questions, additional questions, in order to drive the validity.
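For readers unfamiliar with computerised adaptive testing, here is a bare-bones, textbook-style loop under a simple one-parameter IRT model: ask the most informative remaining item at the current trait estimate, update the estimate, repeat. The item bank, response model, and stopping rule are assumptions made purely for illustration, not iPsychTec's diagnostic.

```python
# Bare-bones illustration of computerised adaptive testing under a one-parameter
# (Rasch) IRT model: keep asking the item closest in difficulty to the current
# trait estimate until the test length cap is reached. Generic sketch only.
import numpy as np

rng = np.random.default_rng(3)
item_difficulties = np.linspace(-2, 2, 40)   # a hypothetical item bank
true_theta = 0.7                             # the respondent's true trait level

def prob_endorse(theta, b):
    return 1.0 / (1.0 + np.exp(-(theta - b)))   # 1PL response model

theta, asked, responses = 0.0, [], []
for _ in range(12):                              # cap the test length
    # pick the unasked item whose difficulty is closest to the current estimate
    remaining = [i for i in range(len(item_difficulties)) if i not in asked]
    item = min(remaining, key=lambda i: abs(item_difficulties[i] - theta))
    asked.append(item)
    responses.append(rng.random() < prob_endorse(true_theta, item_difficulties[item]))
    # crude estimate update: grid search over theta for the best likelihood
    grid = np.linspace(-3, 3, 121)
    p = prob_endorse(grid[:, None], item_difficulties[asked])
    loglik = np.where(responses, np.log(p), np.log(1 - p)).sum(axis=1)
    theta = grid[np.argmax(loglik)]

print(f"estimated trait level after {len(asked)} items: {theta:.2f}")
```

In practice an adaptive engine would also stop early once the estimate is precise enough, which is how a long fixed-form diagnostic can be shortened without losing accuracy.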

Regulators often tell us that this is the most advanced, non-invasive way of measuring behaviour, and it just cannot be gamed. It is not invasive: there are no wearables, and it is not done in front of anyone.

So that just gives you a sense of the depth you need to go to for the diagnostic. But it is all available easily, at the press of a button.

The other research we have done along the way is around variability: understanding which of those 229 behaviours vary and by how much, in order to understand what is relevant, because if they never vary from one company to another, why are we measuring them?

So we went through a process of elimination in our second study, and then in the third study we started doing test-retest for reliability. Psychometrics have to be reliable: if there is no reason for change, is the result the same? That is basic psychometrics, and a good way of doing it.
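Test-retest reliability, as Hani describes it, boils down to administering the same measure twice when nothing should have changed and correlating the two sets of scores. A minimal sketch with simulated data, purely for illustration:

```python
# Minimal sketch of a test-retest reliability check: score the same people twice
# with no intervening change and correlate the scores. Simulated data only.
import numpy as np

rng = np.random.default_rng(4)
true_score = rng.normal(size=500)                       # stable underlying behaviour
time_1 = true_score + rng.normal(scale=0.3, size=500)   # first administration
time_2 = true_score + rng.normal(scale=0.3, size=500)   # retest, nothing has changed

retest_r = np.corrcoef(time_1, time_2)[0, 1]
print(f"test-retest reliability: {retest_r:.2f}")        # ~0.9 suggests a stable measure
```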

We also went as far as understanding the relationship between the place and the people, through multiple methods of measurement, of course, and multiple dimensions of behaviour. Is there a relationship between the place and the people, as we would expect?

In fact, in our complex matrices, this is quite important: can we see a relationship between two separate organisations that have nothing to do with each other, which would invalidate the data? So we have done some really heavy lifting there, trying to do good analytics to understand that.
And finally, the prediction of outcomes: we have modelled a plethora of outcomes for different organisations, to see whether there is a predictive model and what its effect size is, relative to the number of people in that particular cohort. Is it genuinely predictive? Is it telling us something that is true?

We have been through a hell of a journey. I am trying to be careful to give you the journey without using complex words for the sake of it.

Yes, we have revalidated, and you might wonder why. We launched commercially in 2015, so the diagnostic was fine up until then, but is it valid now, is it still relevant? So we did a huge piece of research yet again, over all of last year, and we have just released it now.

Over 48,000 participants across 50 countries, but only nine organisations. And you may say, why only nine? We are not trying to rebuild it. Before, we didn’t know what we were looking for, but if I wanted to look at the specific relationship, or lack of one, which is what there should be, between those nine, I had to have a control group.

The research has depth, but it has a control as well. Whereas before, when we were making the tool, it had to have depth and breadth, here it is about the depth.

We did another six studies, and this time it was about understanding whether there is a relationship between those behaviours, or whether they are mutually exclusive, which they should be. Accurate measurement should mean that we measure one thing without automatically measuring another.

We went as far, in one of our studies, as using an AI method to generate randomised data: can the machine replicate the data in a way that invalidates what is true of the measured behaviour? The answer was no, it couldn’t.
And of course, as many people typically do, we looked at the eigenvalues of every single item yet again, which we had done in our first study. All of this we publish quite openly. We are not holding that research back, and we send it to anybody who is interested. There is some heavy-lifting maths in there.
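The eigenvalue check Hani refers to is a standard psychometric step: inspect the eigenvalues of the item correlation matrix to see how many underlying factors the items actually support. Here is a small simulated sketch, with made-up loadings, not the published CultureScope analysis:

```python
# Small sketch of an eigenvalue check: look at the eigenvalues of the item
# correlation matrix to judge how many factors the items support.
# Simulated items with made-up loadings, purely illustrative.
import numpy as np

rng = np.random.default_rng(5)
n, latent = 1000, rng.normal(size=(1000, 2))           # two true underlying factors
loadings = np.array([[0.8, 0.0], [0.7, 0.1], [0.1, 0.8], [0.0, 0.7], [0.05, 0.75]])
items = latent @ loadings.T + rng.normal(scale=0.5, size=(n, 5))   # five observed items

corr = np.corrcoef(items, rowvar=False)                # 5 x 5 item correlation matrix
eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]
print(np.round(eigenvalues, 2))
# Roughly two eigenvalues above 1 (the Kaiser rule of thumb) would suggest the
# five items reflect two factors rather than five independent behaviours.
```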

We also checked it against validity measures from good existing research, in order to demonstrate validity as well.

So yeah, some exciting work went into revalidating: did we need to change anything? And that is what I mean, you have got to stay on top of your game. If you are offering the world something they can truly use, they have got to be confident that, end to end, it is as good as it was.

David Green: In summary, did you identify any major differences or did it just validate the initial work that was done, up until 2015? 


Hani Nabeel: It revalidated the work. The only thing we discovered is that we may be able to improve the computerised adaptive testing a little, to reduce the testing time. The testing time without adaptive testing used to be long per diagnostic, around 22 minutes, which in the scheme of things might not seem much. But when we added adaptive testing, we managed to bring it down to a 12-minute average completion time, which is really good. I think we can bring it down even more, to an average of about 10 minutes, so the revalidation may have helped us fine-tune a little.
Basically, the other findings were all about, yep, we are still as good as we were and everything is as it should be.


David Green: I think that is a good lesson for companies out there that are, for instance, building an algorithm for predictive attrition, because, let’s be honest, lots of companies built those in the past and are probably refining them again now. You constantly need to validate what the algorithm is telling you, because situations change. Obviously we have got this big thing called hybrid work now, remote working, which wasn’t the case two years ago. We had people working remotely, but maybe not exclusively.

And also a good lesson, I think, from a technology point of view: obviously what you are building is complex. I am wondering, and again this is something for organisations that are maybe building things themselves to measure culture and behaviours, how are you dealing with the complexity of building these massive predictive engines?

Hani Nabeel: We are careful about the terminology we use and the methods we use. You may have heard me talk about neural thinking, the neural technology, rather than what I think you often hear, machine learning and AI and so on. I just want to be honest with the world and also be driven by the methods. Basically, since we have done a lot of research manually and we know the methods we are using, the question becomes: can I build a brain where I can tell it everything I have done, and the brain will do all the heavy lifting, so that we make it really simple for our customers? They just press a button and it does it. And we ourselves don’t have to go into a darkened room for hours on end to come up with the answers.

So create a methodology. Test its robustness against multiple outcomes and multiple inputs, which is what we have done. Make sure that you create not complex algorithms, but clear algorithms and methods that take you through the whole process, and test that. Once you do that, that is how you program the neural brain to think.
We treated all the inputs as neurons, as pods, literally, that is how it works, and the thinking brain is how it connects them together. Essentially, when I say a neuron, what I mean is that we taught the brain, the CultureScope brain, how to think about the data. It is not learning from the data, which is where machine learning comes in.
So we are accurate about what we are doing. I am only saying this because sadly I do hear, are we moving into machine learning and AI, and we do tend to bandy these terms about without thinking. Be careful and be purposeful: do I want it to learn from the data, or do I want it to tell me what is going on?

The danger with applying machine learning and AI now is that it could learn that the wrong behaviours drive the right outcomes, and give the wrong advice to our customers and clients, which would be disastrous. At the moment we are not doing that: the brain thinks about the data and simply shows you which behaviours you have got to drive for an outcome, and by how much. We hand it back to humans, you as the receiver, to decide: that makes sense, that is the right outcome, and yes, this is the right model.

Now, obviously, in our journey we might go into machine learning and AI, but first we need to go heavily into augmented analytics. It is a complex world, and we have got to be careful. A system like that is learning all the time, and we have got to make sure it is learning the right behaviours and not the wrong ones; we have got to have that in place. So be clear about what you are building, and make sure the methodology you deploy is robust.
And again, that is the point of revalidating all of it every now and then.

That is really the best advice I can give. 


David Green: That is really helpful, Hani. And I know that you are living this with CultureScope and iPsychTec as well, managing the complexity as well as the legal and ethical concerns. You have actually launched a user forum, haven’t you? Can you tell us a little bit more about which groups of people it is important to bring together when measuring behaviours? And maybe give us an insight into some of the conversations that are happening?


Hani Nabeel: Yeah, sure. The user forum, which we call The CultureScope Club, is something we dreamt about for a long time.
It was actually born in December last year, for the first time. I can’t tell you how excited and ecstatic I was; it was a very emotional moment. Every user can join, every company out there, including the companies we have worked with in the past. Sometimes they think the problem is unique to their industry: oh, it is just me, just my industry. But human beings are human beings, and they exist in every industry. So if you think about behaviours, this is a human issue, which we have got to think about.

So the user forum is about bringing together people who have been with us through the journey, and being able to share best practices around measurement, understanding what it tells us, and what you have done about it that works, so we can learn from each other.

Ultimately, think about this. Some of our customers, like Lloyds Banking Group, have been with us for five years now. They have their own people who actually interrogate the data, they are certified, so they use it as a product. They have a CultureScope department, people who have CultureScope lead in their job title. What is nice about it is that they sit in a dedicated culture division, or culture practice, that reports not just to HR but even to the board, to give them the insights they need at their fingertips, which is incredibly powerful. Beyond that, take companies like Visa or BP; they are all customers. Some are at different stages: Bank of Ireland is new on the journey with us. With EDF we are in many of the nuclear sites, to understand a lot of the complexity around safety and so on.

So you can imagine bringing all of those into one place and, guess what, David? It is not even chaired by us. It is a rotating chair. Lloyds Banking Group are chairing the first four sessions over the whole year, December to December, then they will pass the baton on, and so on.

So we are there to help, but we are not even running them, and that is why it is emotionally exciting for me. It is about bringing together all of those people who have dealt with these issues, some quite well, some still on the journey; the common thread here is people. It is people.
It was really fascinating listening to the first one. I have never been in a two-hour session that was nonstop and had such intense emotion around culture. And it helps HR practitioners and culture practitioners. Sometimes, like with Deutsche Bank’s fintech known as Breaking Wave, the CEO turned up. It is quite exciting what I am seeing. There is not one obvious job title sitting around this table. There are people from culture, from compliance, from risk, CEOs, COOs, all of them coming together in one place to talk about one thing.

So it just shows you this is not a divisional problem. Yes, I know we tend to address HR, but a lot of our customers aren’t solely in HR: some are in compliance, some in risk management, some in safety, some in innovation, and some in transformation. It would be incredibly powerful if we can connect these communities up, so watch this space.

I am creating that true sense of community, if I can use that word, at CultureScope, and that is really exciting. I will try and report back to you on the progress of it. 


David Green: I look forward to it. Obviously we have known each other for a few years, and one thing that has always struck me about CultureScope as a product is that, as you said, most of the technology firms I know traditionally target HR and people analytics professionals. But you have worked directly with boards in very large organisations, you mentioned some of the names there, and I think that shows it is the cultural element and those behaviours, with their impact on outcomes, that really resonate. It also perhaps suggests that in some companies HR is not quite ready for this discussion yet.
I would love to hear your thoughts on that, or whether it has changed?


Hani Nabeel: It is not changing fast enough. In our world, there are two different things to consider. One is that HR does not own culture, that is the interesting thing. Culture is owned by everybody, and yes, the leaders are the custodians. And remember, we tell leaders it is not just about you behaving the way you want; you have to create the environment, remember we kept talking about the place. So HR is a facilitator to help, but they don’t own culture. That is probably why it is slightly different for us, as there are multiple stakeholders that own culture, if we can call it that.

But the second thing I want to talk about is that people analytics and HR need to have, I don’t know how to put it, a better marriage. It is kind of a marriage at the moment; the relationship just needs to improve. It is getting better, but I sometimes feel, maybe it is just a feeling, that some HR professionals shy away from or are worried about people analytics. It is not there to replace their jobs, it is there to enhance what they do, with a language they have never had before.
Imagine if you could walk into a board meeting and say, I know the impact of what our people do on our outcomes, and I am going to tell you, accurately, what we need to change. Just that basic language could be a game changer.

All of this is possible today. It is at their fingertips, but maybe we just need to keep educating, educating, educating, along with practical application, is what I would say. The best thing to do is just to do it.


David Green: And one of the things that I think helps HR professionals is hearing stories and examples of this work and how it has helped. I appreciate that you may not be able to name companies, but do you have any case studies you could share with us that illuminate some of the challenges we have discussed?


Hani Nabeel: Absolutely.
So, some simple stories. The simplest one is a client asking, what drives sales performance for the salespeople who sell our services? Some are great, some are not. We struggle to recruit sometimes. Is there a culture element to this or not? There might not be, but to be able to answer and say, actually there are four behaviours you need to focus on, and if you do that, you are 28 times more likely to get superstars. Suddenly they go from not knowing how culture can drive sales, to actually knowing and focusing on it.

Obviously the results have been incredible since they did that. They went from not knowing to knowing what to focus on. Because they were hiring salespeople and saying to me, yeah, but they have done it before, why isn’t it working here? Well, it is because here is a different place. What is different? It is culture, right?

So that is a simple story, but the more interesting stories are about how we can stop incidents happening: accidents, dangerous accidents, near misses. What is the POS data telling us when we connect it to behaviours? Which behaviours lead to that?
We have some big stories, particularly around HSBC, if you remember, where we really cut our teeth, around the world, in 71 countries. Which behaviours work for or against financial crime? To actually be able to know, in 71 countries, which behaviours are likely to stop financial crime, which is what we are looking for, and to be able to work on those and show the regulator that you have improved the behaviours and the outcomes you are after, to clear every regulatory issue. It is powerful.
If you want to link it to outcomes, just imagine that story right there.

So with transformation, we have gone from, we don’t know how we are going to transform, to me saying, what you need to do to improve is...

For one company, they had a baseline success probability of 20%, which is just terrible for any transformation. We went from that to 69%. How? We could accurately tell them who the change champions are, because their behaviours already mark them out. Before, they were randomly selecting people, and that doesn’t help.

We could tell them the size of the ask: what you need to change and how to enable people. Finally, we could tell them the best route for transformation through behaviours.
So we have gone from not being able to answer those three questions, and just ticking them off anyway, to being able to say, I can tell you exactly who the people are that will help you, what you need to do around them, and how they are going to do it. And then suddenly you get success.

And so we do have real success. Those are all real stories with real outcomes from our journey. We have got so many case studies that we are really passionate about.

EDF decided to identify every single use case with us, for every nuclear site we have done this year, which I was humbled by. They said, we actually want to do this, we want to put it together, we want to tell the world what we did. We are on the journey, we are nowhere near acing it, but wow, what we have been able to do, we want to document it. That is something we have been doing with them as well.

So there are some incredible stories out there, just so many, probably beyond the scope of this discussion.


David Green: I am sure we could probably have a discussion just on those.

It is interesting. You talked about the brain, which is effectively what you are building, and what does a brain absorb? Knowledge. And what does analytics do? It gives people knowledge to help make the right decisions. As you say, transformations are notoriously complex and have a high failure rate. Well, if you can move from a 20% to a 69% chance of success, that is transformational, if you will pardon the pun. And as you said, identifying who those champions are, because they are already exhibiting the behaviours you need, is invaluable for a CEO and the board.

We are winding down now. Hani, I know you and I could probably talk about this all day, but we can’t. If you had to distil it, what would be your number one piece of advice for organisations looking to get a better handle on their culture?

Hani Nabeel: It is an answer I will give in a backwards sort of way, which is weird, but let me do it.

You must be brave enough to use the term culture by design, properly. You design lots of things in your company: you design your office, your furniture, your systems, you might even have designed a talent plan.
But you rarely have a culture plan. In your own career, I bet you have rarely heard the phrase, we have designed the optimal culture. I bet you haven’t heard that.

But imagine if you were actually empowered to do that. So be brave and design the culture you want. That means you have got to design the reward system, the operating model, even your furniture, believe it or not, the physical workspace, around the behaviours you want.

And ultimately, you want those behaviours because they are a predictor of your outcomes.
So culture by design, of course, has to start with accurate measurement. Be clear: am I measuring what really drives culture? And if I am not, be honest. There is no point in saying, yeah, we are doing it with that engagement survey, which is people’s opinion, which is an outcome measure. Measure, measure, measure. I always say, what can’t be measured cannot be fixed, sustained, or improved.
So: correct measurement, and correct analytics that connect to insightful and actionable insights. Correctly connect the outcomes you are after. Be clear, not some sporadic outcomes that have nothing to do with what you are looking for. Input the data, the behaviours, and learn what it is telling you.

Then, based on what it is telling you, be clear about taking it from actionable insight to interventions that really work. Often you can learn from yourself. What we sometimes show our clients is that there is already a place in their business that is exhibiting those behaviours: the people do, the place does, and it gets the right outcome. So you don’t have to dream things up; go and talk to them, see what is going on. Is it the water they are drinking? Is it the furniture? What is it?

If you learn from them, then you can deploy that where in the organisation those behaviours are absent and you are already getting the wrong outcome.

So be brave about culture by design, but the answer can be very simple: measure, connect, make the insight actionable, and we even go as far as giving you the path to greatness.


David Green: Love that, culture by design. A very nice way of putting it. So now, Hani, the last question before we ask you to provide your contact details and stuff for people listening. 
This is the question we are asking everyone in this series, which you are kindly sponsoring. How does behavioural science help to improve the workplace? 


Hani Nabeel: Behavioural science brings the people element back into the equation. Simple as that.

I was talking to an organisation that is doing a security review with a very large, well-established organisation. A security review tends to cover people, process, and technology; in fact, people, process, technology is a framing applied in many contexts around the world. We tend to focus on the process and the technology, but it was interesting that in this security review, when they were talking about people, that part was weak. So now, in the security review, for instance, they are including the people properly.

People, let’s be specific, means how they act and how they behave at work; that is what you are looking at. You are not trying to deploy a personality measurement tool. Be specific: what helps is knowing how they behave at work. They might be behaving in a way that is constructive for the business or disruptive, but they just do it without thinking, because it is what the place wants them to be.
So I normally say, I know incidents happen, issues happen, but nobody comes to work to do bad things, no one. And those who do are the very few, by the way. It is your systems and processes that can drive the wrong behaviours.

So focus back on people is the obvious thing I would say. Bring that back in, even in people analytics; what are we saying? Bring people back into the equation. If you don’t have behavioural science as part of that, I would humbly and passionately argue, are you really taking people analytics to its full potential?


David Green: Great way to wrap up the conversation. Hani, thanks very much for being a guest on The Digital HR Leaders Podcast. Can you let listeners know how they can stay in touch with you, follow you on social media, and find out more about CultureScope and iPsychTec?


Hani Nabeel: Yes, absolutely. LinkedIn is a great way; I am on LinkedIn, so feel free to connect. Our website, ipsychtec.com, is always live and updated. There is a nice video on there at the beginning as well, and we put out many stories about the webinars and podcasts we do, even with our customers, which is great fun.
I would say where we are active is our website and LinkedIn; those are the main two ways to connect, and I look forward to whoever reaches out, even if they just want to know how we do things, it would be a great pleasure.


David Green: It has been great, Hani. I have seen you on this journey; I remember when you won the award at the Wharton People Analytics Conference all the way back in 2017, when we actually did conferences face to face, and let’s hope in 2022 we get back to some of that again. It has been a pleasure to have you on the show, Hani, thank you very much.