
AI: How to build a successful team


For those of us old enough to have seen the film Blade Runner when it was first released, AI was part of a futuristic fantasy world. AI is no longer the future – it has arrived. Whether we fully embrace the opportunities it gives us or are fearful of the uncertainty that it presents, it is becoming an important part of everyday life. Recruitment is no exception and AI is increasingly being used by both employers and candidates. So what are the good, bad and downright ugly aspects of it?

When recruiting staff, AI can automate aspects of the hiring process. Screening applications, particularly in large-scale recruitment exercises, can result in substantial time savings. Some of the more tedious tasks, such as setting up interviews and ensuring that the minimum qualifications for the role are met, can easily be automated. Systems can also be set up to respond automatically to regularly asked questions, which allows candidates to get responses quickly and ensures consistency.

Much has also been made of how AI can play an important role in preventing subconscious bias from creeping into decision-making (and in defending against any complaint that this has occurred). A computer program doesn’t have opinions or experiences which affect its responses.

So why does the suggestion of using AI in recruitment strike fear into the heart of many an employment lawyer? It is not simply that many of us are so old that we still consider the internet to be a thing of magic. Algorithms can perpetuate existing biases, either because they are present in historical data or because of how the selection criteria are worded. Automating systems and processes also risks missing the more nuanced aspects of us humans and selecting only candidates who fit a standard set of criteria. Missing out those with a less conventional employment history can lead to a homogenised workforce when diversity can be a real bonus.

The rigid application of selection criteria can also lead to inadvertent discrimination. An application automatically rejected because of gaps in employment history could discriminate against a candidate who has taken time out to raise a family or undertaken caring responsibilities. High levels of absence may be considered undesirable, but can AI accurately distinguish between disability and non-disability related absences? Employment tribunals are not going to accept the ‘computer said no’ defence.

Some companies are even starting to use systems to analyse a candidate’s personality, for example, using video interviews to check body language and facial expressions against the desired criteria in a role. But how comfortable do we all feel about a computer judging our personality traits and how does this work for candidates who are, for example, neurodiverse and may struggle with face-to-face interactions?

AI cannot take into account that impossible-to-explain feeling that someone will be a good ‘fit’ for your company. Every organisation has a personality, built up over time, that reflects everyone who works in it. It is difficult to input the culture and identity of a workplace into a programme to find the best candidate for you.

It’s not just companies who are using AI in the recruitment process. Job candidates are also taking advantage of the benefits when preparing applications and drafting responses to expected interview questions. AI programmes can provide increasingly intuitive and individualised responses, which means it can be almost impossible to know if a candidate’s responses are from the heart. It is nothing new for candidates to ‘enhance’ their CVs or get someone else to complete their application for them, but AI takes this to another level. Unchecked, this can mean a person is appointed to a role that they are not qualified for or suited to, which in turn results in wasted recruitment costs and an impact on business efficiency if staff cannot be retained.

These issues can generally be resolved by careful programming combined with good old-fashioned oversight and double-checking by an actual person. On the plus side, using AI to quickly analyse data can also provide a method for checking that diversity is not being impacted.
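As a purely illustrative sketch of what that kind of data check might look like, the short Python snippet below compares selection rates across candidate groups and flags any group whose rate falls well below the highest. The applicant data and group labels are hypothetical, and the four-fifths threshold is just one commonly cited heuristic, not a legal test.

```python
# Hypothetical illustration: comparing automated-screening outcomes by group.

def selection_rates(outcomes):
    """outcomes: list of (group, selected) pairs -> selection rate per group."""
    totals, selected = {}, {}
    for group, was_selected in outcomes:
        totals[group] = totals.get(group, 0) + 1
        selected[group] = selected.get(group, 0) + int(was_selected)
    return {g: selected[g] / totals[g] for g in totals}

def disparity_flags(rates, threshold=0.8):
    """Flag any group whose selection rate is below `threshold` times the
    highest group's rate (the widely cited 'four-fifths' heuristic)."""
    best = max(rates.values())
    return {g: rate / best < threshold for g, rate in rates.items()}

# Hypothetical applicants: (group label, whether the screening selected them).
applicants = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

rates = selection_rates(applicants)   # group_a: 0.75, group_b: 0.25
flags = disparity_flags(rates)        # group_b flagged for review
```

A flagged group is a prompt for that good old-fashioned human review, not a conclusion in itself: the underlying criteria would still need examining to see why the disparity arises.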

Face-to-face interviews will remain an essential part of the process, whether in person or onscreen. The candidate gets to see the human side of the company and the recruiter can fully assess their suitability. Recruiters would also be wise to ask questions similar to those posed in the job application to compare how candidates respond, along with some less predictable follow-up questions to ensure the individual has the abilities their application suggests.

AI is here. It provides the opportunity for those within the London Markets to gain the edge by streamlining recruitment processes, without losing our humanity. 

[This article was prepared without the use of AI.]

Vanessa Latham

Employment and Discrimination

