HRreview Header

Khyati Sundaram: Is AI “black box bias” sabotaging your talent pipeline?


Imagine that a colleague of yours is helping you source the perfect candidate for a new role, says Khyati Sundaram.

They come to you with a shortlist of candidates they like the look of. But when you ask them how they whittled down their list, they can’t tell you.

You can’t confirm which criteria your colleague scored candidates against, why they chose these particular individuals, or on what basis candidates who didn’t make the shortlist were dismissed.

This lack of information would probably make you feel a little uncomfortable or suspicious, right?

 


Well, this is, in essence, what’s happening when AI helps companies source and shortlist candidates for roles.

What is “black box bias”?

Most off-the-shelf AI models are a “black box”, meaning we cannot see how or why a model is making its decisions.

We may have programmed the model to look for candidates who meet a certain set of criteria for our role; but depending on the data the model was originally trained on, it might also be making decisions based on learned rules we can’t oversee. I call this “black box bias”.

“Black box bias” is bad news for your talent pipeline. Most garden-variety and open-source AI models have been trained on swathes of internet data going back decades. Research has categorically shown that these models perpetuate social bias, meaning their perception of “top talent” has centuries of racial and gender oppression baked in. In other words, it’s not a fair fight.

“Black box bias” in action

In a recent experiment, Bloomberg asked a text-to-image AI tool to generate pictures of people doing different kinds of jobs. The analysis found that images generated for “high-paying” jobs were dominated by subjects with lighter skin tones, while subjects with darker skin tones were more commonly generated by prompts like “fast-food worker” and “social worker.”

Most occupations in the dataset were dominated by men, except for low-paying jobs like housekeeper and cashier. And men with lighter skin tones represented the majority of subjects in every high-paying job, including “politician,” “lawyer,” “judge” and “CEO.” The data speaks for itself.

AI models mirror and amplify the biases we see all around us, with potentially sinister consequences when it comes to helping you source candidates for your next role. When black box AI feeds you a shortlist, you can’t know which talented individuals didn’t make the cut due to arbitrary factors like name, age, address, degree or even skin tone.

Likewise, you can’t be sure the people on your list are the best candidates: they might simply share surface-level characteristics we’ve historically associated with certain types of jobs.

How do we fix it?

The only way to correct for “black box bias” is to be more discerning about which AI models we choose to use in recruitment.

Off-the-shelf AI models won’t cut it. We need new, ethical AI models cleaned of historical biases and of any data that could trigger bias, such as our names, our gender or where we went to university.
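As a rough illustration of what “cleaning” candidate data can mean in practice, here is a minimal sketch of stripping bias-triggering fields from a record before it reaches any screening model. The field names and structure are hypothetical, not drawn from any particular tool:

```python
# Hypothetical candidate record fields that research links to biased
# screening decisions: name, gender, age, address, university, photo.
BIAS_TRIGGER_FIELDS = {"name", "gender", "age", "address", "university", "photo_url"}

def anonymise(candidate: dict) -> dict:
    """Return a copy of the record with identifying fields removed,
    keeping only skill-relevant information."""
    return {k: v for k, v in candidate.items() if k not in BIAS_TRIGGER_FIELDS}

candidate = {
    "name": "A. Example",
    "gender": "F",
    "university": "Example University",
    "skills": ["SQL", "stakeholder management"],
    "work_sample_score": 82,
}

print(anonymise(candidate))
# {'skills': ['SQL', 'stakeholder management'], 'work_sample_score': 82}
```

Redaction like this only addresses direct identifiers; a model can still infer protected attributes from correlated signals, which is why explainability and human oversight, discussed next, matter too.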

We also need recruitment AI models to be explainable so a human can stay in the loop. This means doing away with black box AI so HR leaders can see how and why decisions have been made by AI models, and step in to correct for biased selections when necessary.

Finally, we need to shift our focus away from proxies on CVs to demonstrable skills. Skills-based hiring is the best way to source top talent in an empirical and objective way. Our research at Applied shows that skills-based hiring drives a 4x increase in candidate ethnic diversity and a 93 percent retention rate.

In the absence of clear regulation about the use of AI for recruitment in the UK, it’s up to HR leaders to make the right choices. And we know that doing so pays off: diverse teams perform better, are more productive and make more money. Diversity is an important factor for talent attraction, too: 76 percent of employees and job seekers say diversity is important when considering job offers.

But when black box bias isn’t challenged, diversity – and your talent pipeline – is compromised. You know what to do.


Khyati Sundaram is the CEO of Applied: a behavioural science-backed tool which helps companies hire fairly and without bias. Before joining Applied, Khyati co-founded her own company and worked in investment banking with JP Morgan and RBS. She holds an MSc in Economics from the London School of Economics, as well as an MBA from London Business School.
