
Elouisa Crichton: AI hiring tools – what recruiters need to know about discrimination risks


For UK employers, the growing use of AI in recruitment brings opportunities but also significant legal risks, as biases within AI hiring systems may leave employers exposed to discrimination claims under the Equality Act 2010.

Qualitative findings from a 2025 University of Melbourne study revealed that 13 AI hiring systems available on the global market (albeit tested solely in Australia) discriminated against applicants who wear religious coverings, request workplace accommodations for disabilities, or have names that the systems associated with Black applicants.

Additionally, candidates whose first language isn’t English or who have speech conditions often score poorly in AI-powered interviews due to inaccurate transcription.

Other studies concur with these findings. A 2023 paper from the Nanjing College of Economics and Management found that algorithmic bias resulted in discriminatory hiring practices based on gender, race, colour and personality traits. These studies underscore the reality that it is difficult for the data used to train AI to be neutral, and that biases can consequently be baked into systems without developers' knowledge.

So, while AI can help save time when recruiting, companies must be aware of the risks these systems present for hiring processes.

Preventing discrimination

To ensure AI-supported recruitment is conducted fairly, businesses must routinely inspect how their AI systems are trained and how they perform in practice.

This can be achieved through accountability and scrutiny of AI processes. Regular audits of AI systems can identify patterns of bias or unfair candidate treatment. The University of Melbourne’s 2025 study identified that employers “need a better understanding of the [AI hiring systems] rolled out in their organisations and their potential to cause harm at scale”.

Analysis of implementation and results will help deepen understanding of these tools. Noting and reporting any identified patterns of potential discrimination allows data scientists to adjust the systems and introduce more diverse training data.
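To make the idea of an audit concrete, the sketch below shows one common check: comparing shortlisting rates across candidate groups against the widely cited "four-fifths" benchmark. The groups, figures and threshold are hypothetical assumptions for illustration only; they do not come from the studies or tools discussed above, and a low ratio is a prompt for investigation rather than proof of discrimination.

```python
from collections import defaultdict

def selection_rates(outcomes):
    """Share of candidates shortlisted in each group.

    `outcomes` is a list of (group_label, was_shortlisted) pairs,
    hypothetical audit data rather than output from any specific AI hiring tool.
    """
    totals, selected = defaultdict(int), defaultdict(int)
    for group, shortlisted in outcomes:
        totals[group] += 1
        if shortlisted:
            selected[group] += 1
    return {g: selected[g] / totals[g] for g in totals}

def adverse_impact_ratios(rates):
    """Compare each group's selection rate to the best-performing group.

    A ratio below 0.8 (the 'four-fifths' heuristic) flags a pattern
    worth investigating; it is not, on its own, proof of discrimination.
    """
    benchmark = max(rates.values())
    return {g: rate / benchmark for g, rate in rates.items()}

# Illustrative only: made-up shortlisting outcomes for two candidate groups.
sample = ([("group_a", True)] * 40 + [("group_a", False)] * 60
          + [("group_b", True)] * 25 + [("group_b", False)] * 75)

rates = selection_rates(sample)
for group, ratio in adverse_impact_ratios(rates).items():
    flag = "review" if ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rates[group]:.0%}, impact ratio {ratio:.2f} ({flag})")
```

In practice, checks of this kind would sit alongside the human oversight and governance measures described below, rather than replace them.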

Establishing clear AI governance, for example through a committee or board that sets usage policies (particularly for hiring new staff) and holds designated responsibility for human oversight of the systems' decision-making, is a step in the right direction towards countering the potential biases of AI tools.

Transparency is also crucial for fostering candidate trust. Being open with candidates about the use of AI and disclosing how their information, including personal data, will be processed is a necessary measure to comply with data protection obligations, but it can also help reassure candidates about the nature of the process they have signed up for.

Ensuring an organisation's internal teams understand the new technology is also vital. HR teams and hiring managers using AI tools must recognise the capabilities and limits of their software, especially given the significant risk of discrimination in recruitment, and be clear about their role as overseers of the tools' implementation.

Training to use AI in recruitment

Specific training for AI-powered recruitment is critical in helping employees understand how to use the technology while avoiding potentially discriminatory outcomes. This might involve learning to critique data sets more proficiently or to interrogate the AI's reasoning behind its candidate selections.

User-friendly tools which require little-to-no coding knowledge are ideal for widespread use within an organisation for the purposes of recruitment.

Making staff comfortable using AI to assist in recruitment can help reduce the risk of perverse decisions and maximise its successful use. As AI becomes increasingly embedded in the market, understanding its risks and opportunities isn't just advisable; it's essential for business survival and growth.

Implementing AI will allow for far greater efficiency from staff, but it must be handled with care given the potentially discriminatory nature of these systems.

Partner at Dentons | [email protected]

Elouisa is a partner in Dentons' People Reward and Mobility practice in Glasgow. She is an expert in employment and equality law and works on Scottish, UK and international matters.

She is accredited by the Law Society of Scotland as a specialist in both employment law and discrimination law. Chambers and Partners ranked her as an Employment Star Associate in 2024, and Legal 500 named her Employment Rising Star of the Year in 2023 and has ranked her as a Rising Star in 2022, 2023 and 2024.
