Elouisa Crichton: AI hiring tools – what recruiters need to know about discrimination risks


For UK employers, the shift towards AI-assisted recruitment brings opportunities but also significant legal risks, as biases within AI hiring systems may leave employers exposed to discrimination claims under the Equality Act 2010.

Qualitative findings from a 2025 University of Melbourne study revealed that 13 AI hiring systems available on the global market (albeit tested solely in Australia) discriminated against applicants who wear religious coverings, request workplace accommodations for disabilities, or have names that the AI system perceived to be Black.

Additionally, candidates whose first language isn’t English or who have speech conditions often score poorly in AI-powered interviews due to inaccurate transcription.


Other studies concur with these findings. A 2023 paper by the Nanjing College of Economics and Management found that algorithmic bias resulted in discriminatory hiring practices based on gender, race, colour, and personality traits. These studies underscore the reality that the data used to train AI is rarely neutral, and consequently that biases can be baked into systems without the knowledge of developers.

So, while AI can help save time when recruiting, companies must be aware of the risks these systems present for hiring processes.

Preventing discrimination

To ensure AI-supported recruitment is conducted fairly, businesses must routinely scrutinise their AI systems and the data used to train them.

This can be achieved through accountability and scrutiny of AI processes. Regular audits of AI systems can identify patterns of bias or unfair candidate treatment. The University of Melbourne’s 2025 study identified that employers “need a better understanding of the [AI hiring systems] rolled out in their organisations and their potential to cause harm at scale”.

Analysis of implementation and results will help deepen understanding of these tools. Noting and reporting any identified patterns of potential discrimination allows data scientists to adjust and implement diverse training data.
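As a purely illustrative aid, the kind of pattern-spotting audit described above can be sketched in a few lines of code. The example below computes shortlisting rates per candidate group and flags any group whose rate falls well below the highest-scoring group's, using the "four-fifths" rule of thumb as a screening threshold. The data, the group labels, and the 0.8 threshold are all hypothetical assumptions for illustration; the four-fifths rule is a US-origin heuristic, not a legal standard under the Equality Act 2010, so flagged results would signal a need for closer human review rather than a legal conclusion.

```python
# Minimal sketch of a disparate-impact audit over AI screening outcomes.
# Assumes a list of (group, shortlisted) records exported from the hiring
# tool; the sample data and 0.8 threshold are illustrative only.
from collections import defaultdict

def selection_rates(outcomes):
    """Return the shortlisting rate for each candidate group."""
    totals = defaultdict(int)
    selected = defaultdict(int)
    for group, shortlisted in outcomes:
        totals[group] += 1
        if shortlisted:
            selected[group] += 1
    return {g: selected[g] / totals[g] for g in totals}

def flag_adverse_impact(outcomes, threshold=0.8):
    """Flag groups whose rate is below `threshold` times the highest rate."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return [g for g, rate in rates.items() if rate < threshold * best]

# Hypothetical export: group A shortlisted at 40%, group B at 20%.
sample = [("A", True)] * 40 + [("A", False)] * 60 + \
         [("B", True)] * 20 + [("B", False)] * 80
print(flag_adverse_impact(sample))  # → ['B'] (0.20 < 0.8 * 0.40)
```

A real audit would, of course, run on actual system outputs and pair any flagged pattern with human investigation of the underlying training data and decision logic.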

Establishing clear AI governance is a step in the right direction towards countering the potential biases of AI tools. A committee or board can set usage policies, particularly for hiring new staff, and assign designated responsibility for human oversight of the systems' decision-making.

Transparency is also crucial for fostering candidate trust. Being open with candidates about AI use and disclosing how information, including personal data, will be processed is a necessary measure to comply with data protection obligations, but it can also help reassure candidates about the nature of the process they have signed up for.

Ensuring an organisation's internal teams understand the new technology is also vital. HR teams and hiring managers using AI tools must recognise the capabilities and limits of their software, especially given the significant risk of discrimination in recruitment, and be clear about their role as overseers of the tools' implementation.

Training to use AI in recruitment

Specific training for AI-powered recruitment is critical in helping employees understand how to use the technology while avoiding potentially discriminatory actions. This might involve learning to critique data sets more proficiently or to interrogate the AI about its candidate selections.

User-friendly tools which require little-to-no coding knowledge are ideal for widespread use within an organisation for the purposes of recruitment.

Making staff comfortable using AI to assist in recruitment can help reduce the risk of flawed decisions and maximise its successful use. As AI becomes increasingly embedded in the market, understanding its risks and opportunities isn't just advisable; it's essential for business survival and growth.

Implementing AI will allow for far greater efficiency from staff – but it must be handled with care due to the potential discriminatory nature of these systems.

Partner at Dentons

Elouisa is partner within Dentons' People Reward and Mobility practice in Glasgow. She is an expert in employment and equality law and works on Scottish, UK and international matters.

She is accredited by the Law Society of Scotland as a specialist in both employment and discrimination law. Chambers and Partners ranks her as the Employment Star Associate 2024. Legal 500 chose her as the Employment Rising Star of the Year 2023, and has ranked her as Rising Star in 2022, 2023 and 2024.
