Khyati Sundaram: Is AI “black box bias” sabotaging your talent pipeline?


Imagine that a colleague of yours is helping you source the perfect candidate for a new role, says Khyati Sundaram.

They come to you with a shortlist of candidates they like the look of. But when you ask them how they whittled down their list, they can’t tell you.

You can’t confirm precisely which criteria your colleague scored candidates against, why they chose these specific individuals, or on what basis candidates who didn’t make the shortlist were dismissed.

This lack of information would probably make you feel a little uncomfortable or suspicious, right?


Well, this is in essence what’s happening when AI is helping companies source and shortlist candidates for roles.

What is “black box bias”?

Most off-the-shelf AI models are a “black box”, meaning we cannot see how or why a model is making its decisions.

We may have programmed the model to look for candidates who meet a certain set of criteria for our role; but depending on the data that model was originally trained on, it might also be making decisions based on learned rules we can’t oversee. I call this “black box bias”.

“Black box bias” is bad news for your talent pipeline. Most garden-variety and open-source AI models have been trained on swathes of internet data going back decades. Research has categorically shown that these models perpetuate social bias, meaning their perception of “top talent” has centuries of racial and gender oppression baked in. In other words, it’s not a fair fight.

“Black box bias” in action

In a recent experiment, Bloomberg asked a text-to-image AI tool to generate pictures of people doing different kinds of jobs. The analysis found that images generated for “high-paying” jobs were dominated by subjects with lighter skin tones, while subjects with darker skin tones were more commonly generated by prompts like “fast-food worker” and “social worker.”

Most occupations in the dataset were dominated by men, except for low-paying jobs like housekeeper and cashier. And men with lighter skin tones represented the majority of subjects in every high-paying job, including “politician,” “lawyer,” “judge” and “CEO.” The data speaks for itself.

AI models mirror and amplify the biases we see all around us, with potentially sinister consequences when it comes to helping you source candidates for your next role. When black box AI feeds you a shortlist, you can’t know which talented individuals didn’t make the cut due to arbitrary factors like name, age, address, degree or even skin tone.

Likewise, you can’t be sure the people on your list are the best: they might simply display the surface-level characteristics we’ve historically associated with certain types of jobs.

How do we fix it?

The only way to correct for “black box bias” is to be more discerning about which AI models we choose to use in recruitment.

Off-the-shelf AI models won’t cut it. We need new, ethical AI models trained on data cleaned of historical biases and stripped of anything which could trigger bias, such as our names, gender or where we went to university.
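To make the idea of "stripping bias-triggering data" concrete, here is a minimal sketch of redacting proxy fields from a candidate record before any model scores it. The field names and record structure are hypothetical illustrations, not Applied's actual implementation or schema.

```python
# Minimal sketch: remove bias-prone proxy fields from a candidate record
# before it reaches a scoring model. All field names are hypothetical
# examples, not any real recruitment system's schema.

# Attributes known to act as proxies for protected characteristics.
BIAS_PRONE_FIELDS = {"name", "gender", "age", "address", "university"}

def anonymise(candidate: dict) -> dict:
    """Return a copy of the record with bias-prone fields removed,
    keeping only skill-relevant information."""
    return {k: v for k, v in candidate.items() if k not in BIAS_PRONE_FIELDS}

candidate = {
    "name": "A. Example",
    "gender": "F",
    "university": "Example University",
    "skills": ["SQL", "stakeholder management"],
    "work_sample_score": 87,
}

print(anonymise(candidate))
# Only the skill-relevant fields remain.
```

The point of the sketch is that redaction happens before scoring: the model never sees the proxy attributes, so it cannot learn or apply rules based on them.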

We also need recruitment AI models to be explainable so a human can stay in the loop. This means doing away with black box AI so HR leaders can see how and why decisions have been made by AI models, and step in to correct for biased selections when necessary.
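What "explainable" can mean in practice: a scoring function whose criteria, weights and per-criterion contributions are all visible, so an HR reviewer can see exactly why a candidate ranked where they did and step in to override a biased result. The criteria and weights below are illustrative assumptions, not a real product's methodology.

```python
# Minimal sketch of an explainable shortlisting score: every criterion
# and weight is declared up front, and the output includes the
# per-criterion breakdown so a human reviewer can audit the ranking.
# Criteria and weights are illustrative assumptions.

WEIGHTS = {"work_sample": 0.6, "structured_interview": 0.4}

def score(candidate_scores: dict) -> dict:
    """Return the weighted total plus each criterion's contribution."""
    breakdown = {
        criterion: candidate_scores[criterion] * weight
        for criterion, weight in WEIGHTS.items()
    }
    return {"total": sum(breakdown.values()), "breakdown": breakdown}

result = score({"work_sample": 80, "structured_interview": 70})
print(result)
# {'total': 76.0, 'breakdown': {'work_sample': 48.0, 'structured_interview': 28.0}}
```

Contrast this with a black box model: here, nothing about the decision is hidden, so a biased outcome can be traced to a specific criterion and corrected.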

Finally, we need to shift our focus away from proxies on CVs to demonstrable skills. Skills-based hiring is the best way to source top talent in an empirical and objective way. Our research at Applied shows that skills-based hiring drives a 4x increase in candidate ethnic diversity and a 93 percent retention rate.

In the absence of clear regulation about the use of AI for recruitment in the UK, it’s up to HR leaders to make the right choices. And we know that doing so pays off: diverse teams perform better, are more productive and make more money. Diversity is an important factor for talent attraction, too: 76 percent of employees and job seekers say diversity is important when considering job offers.

But when black box bias isn’t challenged, diversity – and your talent pipeline – is compromised. You know what to do.

__

Khyati Sundaram is the CEO of Applied: a behavioural science-backed tool which helps companies hire fairly and without bias. Before joining Applied, Khyati co-founded her own company and worked in investment banking with JP Morgan and RBS. She holds an MSc in Economics from the London School of Economics and an MBA from London Business School.
