Does AI Help or Hinder Inclusive Hiring?
Introduction
As UK organisations increasingly adopt AI tools for recruitment, from CV screening to interview analytics, an important question arises: is AI making hiring fairer, or reinforcing bias? This article explores both sides, drawing on real-life examples and practical steps to support inclusive hiring.
Where AI Supports Inclusive Hiring
1. Removing Biased Language in Job Ads
AI tools can flag gendered, exclusionary or culturally loaded terms in job descriptions and suggest more neutral wording. This helps attract a broader, more diverse range of candidates from the start.
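A minimal sketch of the kind of check such tools perform: scanning a job advert for words that research has associated with gendered coding. The word list below is a small illustrative sample, not a vetted lexicon, and real tools use far richer models.

```python
# Illustrative check for gender-coded language in a job advert.
# GENDER_CODED is a tiny sample word list for demonstration only.

GENDER_CODED = {
    "ninja": "masculine-coded",
    "rockstar": "masculine-coded",
    "dominant": "masculine-coded",
    "aggressive": "masculine-coded",
    "nurturing": "feminine-coded",
    "supportive": "feminine-coded",
}

def flag_coded_terms(advert_text):
    """Return a list of (word, coding) pairs found in the advert."""
    words = advert_text.lower().split()
    return [(w, GENDER_CODED[w]) for w in words if w in GENDER_CODED]

advert = "We need an aggressive ninja developer"
print(flag_coded_terms(advert))
# [('aggressive', 'masculine-coded'), ('ninja', 'masculine-coded')]
```

In practice, a tool would also suggest neutral alternatives (for example, "driven" rather than "aggressive") rather than simply flagging terms.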
2. Standardising Initial Screening
AI can reduce unconscious bias in the early stages by focusing on objective criteria rather than subjective impressions. This opens doors for candidates from non-traditional backgrounds who might otherwise be overlooked.
3. Widening Applicant Reach
Some platforms use predictive analytics to match skills across sectors, highlighting applicants with transferable experience. This can surface overlooked talent.
Example: MoD’s Use of Textio for Inclusive Job Adverts
The UK Ministry of Defence has embedded Textio, an AI-powered writing assistant, into its recruitment advert process. Public sector transparency records confirm its use to:
- Promote gender-neutral language, reduce jargon, and produce clearer adverts
- Support the Disability Confident Scheme by ensuring adverts explain guaranteed interview opportunities for qualified disabled applicants
- Pair Textio usage with inclusive recruitment policies, including diverse interview panels and structured sift guidance
Although MoD does not publish exact uplift figures, its comprehensive integration of AI-enabled inclusive language tools into policy, training and panelling suggests meaningful impact on widening access. This example shows AI being used as part of a systemic approach to inclusion, not simply as a bolt-on tool.
Where AI Risks Undermining Inclusion
1. Discrimination via Inferred Characteristics
Some AI tools infer protected characteristics, such as gender or ethnicity, from candidate data. This can result in unlawful discrimination and breach UK data protection laws, especially where candidates are not informed.
2. Penalising Disabled or Neurodivergent Candidates
AI-driven video interview tools can score candidates based on facial expressions, tone or body language. These tools often lack the nuance needed to fairly assess neurodivergent individuals or candidates with disabilities.
3. Repeating Past Bias
AI trained on historical data can replicate existing patterns of discrimination. In one high-profile case, Amazon abandoned a recruitment tool that downgraded CVs containing the word "women’s" because it had been trained on ten years of male-dominated hiring data.
Recent studies also show that generative AI tools may favour men for higher-paid roles or devalue experience related to disability or social impact. However, bias can be significantly reduced when tools are trained with inclusive datasets and evaluated by diverse teams.
How to Use AI Responsibly in Hiring
To harness the benefits of AI while protecting against risk, consider these good practice steps:
- Assess the risks before implementation. This includes checking for potential indirect discrimination.
- Ask for transparency from providers. Request documentation about how the tool was trained and tested across different demographic groups.
- Keep humans involved. AI should inform hiring decisions, not make them independently.
- Offer alternative formats. Ensure candidates can engage with the process in ways that suit their needs, particularly those with access requirements.
- Monitor outcomes. Regularly review who is being shortlisted, interviewed and hired, and look for patterns that suggest bias.
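The outcome-monitoring step above can be sketched in code. This example uses the "four-fifths" heuristic, a common rough check (of US origin, not a UK legal standard) that flags a group whose shortlisting rate falls below 80% of the highest group's rate. The numbers are hypothetical.

```python
# Sketch of outcome monitoring: compare shortlisting rates across groups
# and flag any group below the four-fifths (0.8) heuristic threshold.
# Data is hypothetical; the threshold is a rough check, not a legal test.

def selection_rates(outcomes):
    """outcomes: dict mapping group -> (shortlisted, applied)."""
    return {g: s / a for g, (s, a) in outcomes.items()}

def adverse_impact_flags(outcomes, threshold=0.8):
    """Flag groups whose rate is below `threshold` of the best group's rate."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {g: (rate / best) < threshold for g, rate in rates.items()}

# Hypothetical shortlisting numbers per demographic group
outcomes = {
    "group_a": (40, 100),  # 40% shortlisted
    "group_b": (25, 100),  # 25% shortlisted -> ratio 0.625, flagged
}

print(adverse_impact_flags(outcomes))
# {'group_a': False, 'group_b': True}
```

A flag here is a prompt for human investigation, not proof of discrimination: small sample sizes and confounding factors mean any pattern should be reviewed in context.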
Final Thoughts
AI in recruitment is here to stay, but inclusive hiring doesn’t happen by accident. It requires deliberate choices about the tools we use, the data we rely on, and the way we treat candidates throughout the process.
Used well, AI can help streamline hiring and reduce bias. Used carelessly, it can do the opposite. The difference lies in how thoughtfully it is applied, and how accountable we are for the outcomes it creates.
Next Steps
At Inclusive Hiring Works, we’re here to support you on that journey. We provide:
- Inclusive Hiring Health Checks – Comprehensive audits of hiring processes
- Workshops & Training for Talent Acquisition Teams – Practical skills to drive inclusivity
- Inclusive Hiring Training for Hiring Managers – Ensuring equitable hiring decisions
Want to make inclusive hiring the norm in your organisation?
Book your Free Discovery Call to learn how we can support your business!