Despite anti-discrimination laws, recruiter bias is as prevalent now as it was 50 years ago, and prejudices about gender, ethnicity and age are limiting people’s job prospects. The knock-on effects on society and business are serious, so what can recruiters do to reduce the effect of implicit bias on who gets hired?
The bias crisis: what’s in a name?
Writing the perfect CV isn’t easy. Each word must be carefully chosen to maximise the chances of landing your dream job. But what if the most important word in the document isn’t about your education, career history or experience but is simply your name?
Researchers at the Centre for Social Inequality in Oxford sent thousands of near-identical fake CVs to a wide range of employers. The only differences between them were the applicant’s name and the inclusion of a second language, designed to signal the sender’s ethnicity. On average, applicants thought to be from ethnic minorities had to send 60% more CVs to get a similar chance of a call-back, despite having an identical cover letter and CV. The problem was particularly bad for fictitious candidates from majority-Muslim countries. Despite Britain’s anti-discrimination laws, the report found a level of discrimination here similar to that in other European countries, and almost no sign of progress compared with similar studies undertaken 50 years ago.
Other studies using the same methodology have found similar results for gender, with women being around 30% less likely to be contacted by recruiters. The discrimination is worse for male-associated jobs like engineering, or if the candidate has children. In science, this bias goes beyond merely getting hired: female students are penalised in university applications, and men are awarded grants 1.4 times more often than women, despite the two groups applying in similar numbers. And there’s evidence that recruiters discriminate against certain ages, overweight candidates (especially overweight women) and unattractive people.
The big impacts of a hidden problem
Biases in recruitment aren’t just harmful to candidates, but also to business and academia. A report from Royal Society Open Science argues that diverse teams are better problem solvers and decision makers. Humans are bad at detecting their own biases, but very good at spotting other people’s, so having a mixed group means these traps are more likely to be spotted. A diverse group is also more likely to come up with a wider range of solutions to any given issue, which increases the likelihood of finding the best one. According to a report from 2018, businesses with diverse senior management are 21% more likely to have above-average profits.
What can we do to level the playing field?
The UK’s anti-discrimination laws on their own are clearly not a solution to the problem, but there are measures and procedures companies can use to decrease bias.
Better job ads
Bias can start very early in the recruitment process, meaning some demographics are less likely even to apply. Some research suggests it can help for companies to remove gender-associated language from job descriptions. And words like ‘bright’, ‘bubbly’ or ‘dominant’ come with gender-associated baggage that can make references for women read poorly compared to those for men.
A seemingly simple solution is to remove things like names, genders and nationalities from CVs and grant applications, meaning people are reviewed solely on their qualities and abilities. Whilst some institutions have started doing this, most companies don’t, so some disadvantaged applicants have taken to using male names or ‘whitening’ their CVs to try to avoid being victims of bias. How much impact the blind CV approach can have depends a lot on the interview process. Whilst it’s hard to interview someone in person without finding out their age, gender or appearance, it is possible to include blinded skills assessments and even preliminary online interviews by text chat.
Diverse hiring committees
Another type of bias, ‘affinity bias’, where people want to hire candidates who remind them of themselves, also causes problems. A diverse hiring panel doesn’t just tackle affinity bias; it also puts diverse interviewees at ease. Technology company Intel implemented a rule that hiring panels needed to include at least two women and/or members of underrepresented communities, and the percentage of hires who were either women or people of colour went from 32% to 45%.
Recruiter bias is usually implicit: recruiters aren’t consciously aware they’re choosing one gender or ethnicity over another, so simply making people aware of this might help reduce it. A study from 2015 found a two-and-a-half-hour workshop was enough to reduce the levels of implicit bias in participants, and a follow-up from 2017 found this had a significant impact on their departments’ hiring practices: they recruited more women. However, this is a single success story from a mountain of studies, and a 2017 meta-analysis found that, overall, there is little change in behaviour resulting from training. Implicit bias training isn’t a silver bullet, and a lot more research is required before we fully understand what works.
Using AI hiring tools
Some have suggested eliminating bias by eliminating the people: perhaps AI could be used to avoid stereotyping candidates. Amazon developed just such a machine-learning program, trained on ten years’ worth of CVs, but it absorbed the biases inherent in its training data and penalised any CV containing the word ‘women’s’.
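This failure mode is easy to reproduce. The following is a minimal, hypothetical sketch (invented toy data, not Amazon’s actual system): a crude scoring rule fitted to biased historical hiring decisions ends up assigning a negative weight to a word like ‘women’s’, simply because past decisions under-hired CVs containing it.

```python
# Toy illustration with invented data: a model fitted to biased
# historical decisions learns to penalise the word "womens".
from collections import Counter

# Hypothetical past decisions: (CV keywords, 1 = hired / 0 = rejected).
# The historical data systematically rejects CVs mentioning "womens"
# (e.g. "women's chess club captain").
history = [
    (["python", "leadership"], 1),
    (["python", "womens", "leadership"], 0),
    (["java", "teamwork"], 1),
    (["java", "womens", "teamwork"], 0),
    (["python", "teamwork"], 1),
    (["womens", "python"], 0),
]

def word_scores(data):
    """Score each word by the hire rate among CVs containing it,
    minus the overall hire rate -- a crude stand-in for the weights
    a real classifier would learn from the same data."""
    base = sum(label for _, label in data) / len(data)
    counts, hires = Counter(), Counter()
    for words, label in data:
        for w in set(words):
            counts[w] += 1
            hires[w] += label
    return {w: hires[w] / counts[w] - base for w in counts}

scores = word_scores(history)
# "womens" receives the lowest (negative) score: the bias in the
# decisions, not anything about the candidates, drives the weight.
print(sorted(scores.items(), key=lambda kv: kv[1]))
```

The point of the sketch is that nothing in the code mentions gender: the penalty emerges entirely from the labels, which is why training on historical decisions reproduces historical bias.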
Where does this leave us?
The most important takeaway is that companies need to adopt an evidence-based approach to rooting out their biases, without blindly throwing money at the problem. While it’s unpalatable, admitting that every one of us has unconscious biases can be a good first step towards making personal changes. And at an institutional level, we need to draft new policies and procedures that mitigate our implicit biases and make the hiring process inherently fairer. Hopefully, the more we tackle the problem now, the easier it will be in future as diversity becomes the norm.
You can read more about hiring process and practice in our Advocacy and Policy section.
By Georgia Mills.
Georgia Mills is a freelance science writer and podcast producer. She likes good wine, bad films and ugly dogs. Follow her on Twitter at @georgiamills2.