Richard Justenhoven: The four main challenges to overcome when using AI in assessment


The goal of any recruitment process is to identify the right person for the job. The closer you match the individual to the requirements of the role, the more effective that person will be. You certainly don’t need Artificial Intelligence (AI) to achieve this, but AI will help you do it more quickly and efficiently.

The reality is that AI excels at two things: analysing massive amounts of data and performing ‘narrow’ tasks – the kind of work you might outsource to a shared service centre. AI’s narrow tasks are comparable to household chores – you wouldn’t wash dishes in a washing machine or put clothes in a dishwasher. Each machine performs one specific task and relies on a human to operate it. A household’s time-saving machines – like an AI’s algorithms – are built for a single purpose and aren’t made to be swapped.

AI helps make processes easier by providing useful information, at various stages, that will help make a final recruitment decision. It also has three other key benefits: reducing bias, improving legal defensibility and increasing candidate engagement.


However, as AI in assessment grows at a rapid pace, four less obvious issues – beyond the technical challenges – must be addressed:

Defensibility:

The process of selecting candidates – be it for entry into the organisation or promotion within – must always be legally defensible. It must not be discriminatory nor favour particular candidates based on gender, race or any of the other characteristics outlined in equal opportunity legislation. There are also the rights of the individual to consider – including the right to be informed how assessment information will be used. Under the General Data Protection Regulation (GDPR), you need to make sure candidates know how and why assessment is being used, including any profiling to make decisions. But regardless of AI, this is good practice anyway.

Bear in mind too, that standardised ‘plug-and-play’ AI systems are available – but they won’t differentiate your employer brand. If your competitors use the same systems, you’ll all be chasing the same talent. Also, these systems utilise ‘deep learning networks’ which learn as they go. This sounds promising but actually it makes it very difficult to explain exactly why candidates were accepted or rejected. These systems therefore lead you to make selection decisions that you can’t defend, which leaves you vulnerable to litigation from disgruntled candidates. Only custom AI systems offer the ability to make transparent and defensible selection decisions.

Time:

Custom AI systems mirror human behaviour and replicate the best practice of your assessors and raters. To achieve this, you have to pre-feed the system with relevant information. It can take up to six months to ‘train’ an AI system to assess candidates in exactly the same way that your assessors and raters would judge them. Managing this lead time will be a major challenge for organisations.

Chief human resources officers (CHROs) should therefore be forming project teams now to look at custom AI models for video interviewing and other recruitment processes. Otherwise you’ll always be six months behind those pioneering companies that have already invested in this technology.

Ethics:

There is an ethical question around how much support you take from an AI system. For example, are you happy for an AI system to reject your candidates? Or would you prefer it to ‘flag up’ unsuitable candidates so you can review and check their details? How to use AI ethically will be a key consideration for many employers.

AI’s role should be restricted to providing additional information and enhancing efficiency. Recruiters should always set the objectives when hiring. AI can then deliver useful information, at various stages of the selection process, that will support a final decision.

Data handling:

AI excels at analysing massive amounts of data. However, when so much data is involved, the results can be misinterpreted or even deliberately abused. Good data handling practices will be essential not just for confidentiality but also for maintaining your organisation’s reputation. AI should be used carefully and honourably to help you predict which candidates will be effective in the role – and engaged by your organisation.

With this in mind, there are four guidelines to help you get AI in assessment right.

1. Recruiters should set the initial goal and make the final hiring decision. AI is simply there to support and assist the process.

2. Interviewing remains a human activity. What impression does a candidate get from being interviewed by an avatar?

3. Standard AI systems play a limited role; custom AI systems are needed to differentiate your employer brand and to offer transparent, defensible decisions.

4. Remain aware of ethical considerations, not least how much support you take from an AI system.

In summary, using AI wisely makes it possible to predict closely which people will perform best in the specific roles available. They are also likely to be the most engaged – which supports productivity and retention. There is no doubt that AI will be used more and more in assessment, but getting it right is essential for all organisations.

Richard Justenhoven is the product development director within Aon's Assessment Solutions. A leading organisational psychologist, Richard is an acknowledged expert in the design, implementation and evaluation of online assessments and a sought after speaker about such topics.
