How can AI be used to Reduce Bias in Recruitment?
Unconscious bias is a huge problem in the workplace, especially in areas like recruitment, promotion and performance management, and it is a major barrier to efforts to improve diversity and inclusion. Whilst training and awareness can help, as humans we all have inherent unconscious biases. So how can technology and data science help? And what steps do you need to take to ensure that technologies like AI minimise bias rather than perpetuate it?
This is the topic of this week’s episode of the Digital HR Leaders podcast, and our guest is Frida Polli, co-founder and CEO of pymetrics. Prior to starting pymetrics, Frida spent 10 years as an academic neuroscientist at Harvard and MIT.
In this extract taken from their conversation, Frida and David discuss the misconceptions that exist with regard to how AI is being used in recruitment and how we can leverage this powerful technology to reduce current biases that exist.
This episode of the Digital HR Leaders podcast is a must-listen for people working in or interested in recruiting, HR technology, behavioural science, people analytics, and diversity and inclusion.
According to a survey conducted by the Conference Board in 2018, failure to attract top talent is the number one concern of senior executives today. This marks a shift from earlier concerns about global recession, disruption caused by technology and the fear of new competition.
With the rising cost of hiring along with growing fears about attracting and retaining top talent, it is not surprising that employers are turning to AI and technology to solve their problems, especially as AI promises to pinpoint the right candidates in a fraction of the time, saving money and removing the need for human interaction, subjectivity and bias.
Common misconceptions of AI
There are a number of misconceptions associated with AI and the ways in which it’s used. Frida outlines that one of the most common misconceptions she encounters is the belief that any AI application used for recruiting will inevitably contain the biases of its human creator, because at the end of the day artificial intelligence is simply a machine copying a human, right? This can certainly happen if care is not taken to examine and validate the algorithms being used. Amazon, for example, discovered that its machine learning models were introducing biases favouring men because they had been trained on a male-dominated set of CVs. However, Frida rightly points out that, much like electricity, AI is a fiercely powerful tool; it is the design the technologists have in mind when developing AI programs that really matters. If the program is carefully crafted and validated, the fear of AI replicating human bias need not be a concern.
“Artificial intelligence is an enabling layer that's just like electricity. And electricity can be a huge powerful force for good and it can also unfortunately be used in a harmful way. Technology is neutral. AI is neutral. It's really the design that the technologists have in mind when they create the artificial intelligence that matters at the end of the day.”
Types of unconscious bias in recruiting
When it comes to hiring, while we should be focusing on a candidate’s professional competency and ability to do the job at hand, as humans we often experience unconscious biases.
What is unconscious bias?
Unconscious or implicit biases are learned stereotypes that are automatic, unintentional, deeply engrained, universal, and able to influence behaviour.
According to Wikipedia, there are 180 social, memory and decision-making biases that affect individuals. Some of the common ones impacting recruitment are:
Confirmation bias is the tendency to favour or interpret new evidence as confirmation of one's existing beliefs or theories. If an interviewer has made an implicit judgement based on information gauged from the candidate’s CV, they’re likely to focus on asking interview questions that surface information confirming that belief.
Personal similarity bias is the tendency to seek out or favour others just like ourselves. Research conducted on hiring practices indicates that hiring managers in fact favour candidates that are most similar to themselves in terms of where they are from, their background or educational experiences, despite these similarities not being related to job performance.
Halo effect is a type of cognitive bias, in which the overall impression of a candidate influences how you feel and think about their character. The halo effect is largely based on first impressions. It occurs when the hiring manager likes a candidate and uses that as a basis for assuming he or she will be good at the job, rather than objectively assessing their skills and abilities.
How recruiting using AI reduces unconscious bias
While we know that as humans we are prone to expressing unconscious biases when recruiting candidates for roles, the question arises as to how we might eliminate such bias by leveraging AI. During the interview, David asks Frida what they’re doing at pymetrics to ensure that they’re eliminating any bias when designing their AI algorithms.
Frida outlines a number of steps that they follow to ensure they’re developing a product that removes bias from the recruitment process.
Firstly, they develop local job models for each organisation they work with that are reflective of that climate rather than attempting to use a universal model.
“We understand that they don’t have a sales profile that will work anywhere, and we'll go into a company we'll have your top performing salespeople go through our platform. We'll compare their traits against the baseline, and we'll say these are the traits that make someone successful.”
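To make this first step concrete, here is a minimal sketch of what comparing top performers’ traits against a baseline could look like. The trait names, scores and the effect-size threshold are all invented for illustration; this is not pymetrics’ actual method.

```python
# Hypothetical sketch of a "local job model": find the traits on which a
# company's top performers differ most from a baseline population.
# All trait names and numbers here are made up for illustration.
from statistics import mean, stdev

def distinguishing_traits(top_performers, baseline, threshold=0.5):
    """Return traits whose standardised mean difference (a simple
    Cohen's d estimate) between top performers and the baseline
    exceeds the threshold."""
    results = {}
    for trait, top_scores in top_performers.items():
        base_scores = baseline[trait]
        # Simple pooled standard deviation for the effect-size estimate
        pooled_sd = (stdev(top_scores) + stdev(base_scores)) / 2
        d = (mean(top_scores) - mean(base_scores)) / pooled_sd
        if abs(d) >= threshold:
            results[trait] = round(d, 2)
    return results

# Toy data: top salespeople differ on risk tolerance but not attention
top = {"risk_tolerance": [7, 8, 9, 8], "attention": [5, 6, 5, 6]}
base = {"risk_tolerance": [5, 6, 5, 4], "attention": [5, 6, 5, 6]}
print(distinguishing_traits(top, base))  # only risk_tolerance survives
```

The key design point reflected here is that the model is built per company: the same function run against a different organisation’s top performers would surface a different trait profile.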
They also use a job analysis and predictive validation process to “understand how those traits map to the actual job that they're performing because obviously people want to know that. And so that's a big part for us in any case the validation process. It's a concurrent validation process that then we follow on with a predictive validation process. So then after we've had the algorithm live for a while, we'll then collect performance data, retention data and so on to validate it in a predictive way. So that's the validation piece. We also have construct validity and other types of validity that are important in the occupational psychology or IO world. And we have a whole team of Occ Psychs or IOs that have helped us really be buttoned up in that.”
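The predictive-validation step Frida describes can be sketched as follows: once the algorithm has been live for a while, correlate the scores it gave candidates at hire with the performance data collected later. The correlation function and the toy numbers are assumptions for illustration, not pymetrics’ actual validation pipeline.

```python
# Minimal sketch of predictive validation: correlate algorithm scores
# at hire with later performance ratings. Data here is invented.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation between algorithm scores and outcomes."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Algorithm scores at hire vs. performance ratings a year later
scores = [0.9, 0.7, 0.4, 0.8, 0.3]
performance = [4.5, 3.8, 2.9, 4.1, 2.5]
print(f"predictive validity r = {pearson_r(scores, performance):.2f}")
```

A high positive correlation is evidence that the algorithm’s scores actually predict job performance, which is what distinguishes validated AI selection from a black box.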
Finally, they’ve created an audit process. “Essentially it's just like any audit process. You check the outcomes. You say, okay this algorithm I'm going to run it on a test group of real people that have given us pymetrics data. And we say okay between men and women are they getting an equal pass score? are people of different ethnic backgrounds getting equal pass scores? And if the answer is no because we're a white box or a glass box we can go back and say oh this feature is causing it. Okay let's remove it or de-weight it and then run it again. And again, when we say run a model, we're running hundreds of models. So, if one is showing that we can go and get one that's equally as good and then is not having that biased effect. And it's a pre-deployment audit process that we've open-sourced on GitHub so anyone can go and check it out and see what we're doing and we and that's what we give to a client is we'll say we won't release an algorithm unless it's passed our de-biasing process.”
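The audit loop Frida describes can be sketched in a few lines: run the model on a test group, compare pass rates between demographic groups, and refuse to release the model if the rates diverge too far. The specific threshold used below is the “four-fifths” ratio common in US adverse-impact analysis, which is an assumption on my part; the model, groups and data are all invented.

```python
# Hypothetical pre-deployment bias audit: check whether all groups
# pass a scoring model at comparable rates. Model and data are toys.

def pass_rates(candidates, model, cutoff=0.5):
    """Share of each group scoring at or above the cutoff."""
    return {
        group: sum(model(f) >= cutoff for f in feats) / len(feats)
        for group, feats in candidates.items()
    }

def audit(candidates, model, ratio_floor=0.8):
    """Return (passed, rates): passed is False if any group's pass
    rate falls below ratio_floor times the highest group's rate."""
    rates = pass_rates(candidates, model)
    best = max(rates.values())
    passed = all(r >= ratio_floor * best for r in rates.values())
    return passed, rates

# Toy model: a candidate's score is just the mean of their features
toy_model = lambda f: sum(f) / len(f)
test_group = {
    "group_a": [[0.9, 0.7], [0.6, 0.6], [0.2, 0.3]],
    "group_b": [[0.8, 0.6], [0.1, 0.2], [0.3, 0.2]],
}
ok, rates = audit(test_group, toy_model)
print(ok, rates)  # audit fails: group_b passes far less often
```

When the audit fails, the glass-box approach described above kicks in: because you can inspect which feature drives the gap, you can remove or de-weight it and re-run the audit against one of the many alternative models.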
While bias can potentially be built into AI or machine learning processes, many AI tools now exist that can help inform our hiring processes and reduce bias. Importantly, these AI tools can be tested for bias. By ensuring detailed validation and rigorous testing of the AI algorithm, it is possible to leverage AI in your recruitment process to pinpoint the right candidates in a fraction of the time, without human subjectivity and bias, while generating cost savings for your organisation.
Ultimately, embracing AI tools and machine learning provides us with the opportunity to make decisions grounded in unbiased data. It’s important to remember that the AI tool doesn’t make the decision; we do. What it does do is provide an additional data point that can help inform the decisions we make and ensure that they’re data-driven and free from bias.
There is a lot of hype surrounding AI in HR and its impact on work, but Artificial Intelligence, automation, augmentation, and the Future of Work are much more than just buzzwords. We need to interpret the real opportunities that new technologies can offer HR in how it can improve its own function and the organisation that it supports. If you’re interested in learning more about the opportunities offered by AI and Digital HR then you might be interested in our online training courses that focus on Digital HR. They cover a range of topics that walk you through the critical areas to include in your Digital HR strategy in more detail.
The myHRfuture academy is a learning experience platform for HR professionals looking to invest in their careers. We have several training courses targeted both at specialists in HR systems or digital HR and at client-facing roles such as HR Business Partners. Our content helps HR professionals to become more digital and data-driven, and will help you to navigate the complexities of the HR technology landscape and think about how to use HR technology to improve employee experience. See below for information on all of the courses that we have available on digital HR and HR technology for learners of all levels, as well as more information on our learning platform, which combines bite-sized learning content with articles, research, case studies, podcasts, videos and access to a community collaboration platform for you to share ideas with peers and get answers from experts.
ABOUT THE AUTHOR
Ian Bailie is the Managing Director of myHRfuture.com and an advisor and consultant for start-ups focused on HR technology and People Analytics, including Adepto, Worklytics and CognitionX. In his previous role as the Senior Director of People Planning, Analytics and Tools at Cisco Systems, he was responsible for delivering the tools and insights to enable and transform the planning, attraction and management of talent across the organisation globally. Ian is passionate about HR technology and analytics and how to use both to transform the employee experience and prepare companies for the Future of Work.