This article was published on December 2, 2020

Study shows how AI exacerbates recruitment bias against women

An experimental hiring algorithm reflected the gender biases of human recruiters


Image credit: Anna Shvets from Pexels

A new study from the University of Melbourne has demonstrated how hiring algorithms can amplify human gender biases against women.

Researchers from the University of Melbourne gave 40 recruiters real-life résumés for jobs at UniBank, which funded the study. The résumés were for roles as a data analyst, finance officer, and recruitment officer, which Australian Bureau of Statistics data shows are respectively male-dominated, gender-balanced, and female-dominated positions.

Half of the recruitment panel was given the résumés with the candidates' stated genders. The other half was given the exact same résumés, but with traditionally male and female names swapped. For instance, "Mark" might become "Sarah," and "Rachel" might become "John."

The panelists were instructed to rank each candidate and collectively pick the top and bottom three résumés for each role. The researchers then reviewed their decisions.


They found that the recruiters consistently preferred résumés from the apparently male candidates, even though they had the same qualifications and experience as the women. Both male and female panelists were more likely to give men's résumés a higher rank.

Data suggests 70% of data analysts in Australia are men. If an algorithm is trained to rank candidates based on these statistics, it could assume that a male name is a desirable quality for the position. [Chart credit: The University of Melbourne]

The researchers then used the data to create a hiring algorithm that ranked each candidate in line with the panel's preferences, and found that it reflected their biases.
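The study doesn't publish the algorithm itself, but the mechanism is straightforward to demonstrate. Below is a minimal sketch (assuming Python with NumPy and scikit-learn; the data and feature names are invented for illustration, not taken from the study) of how a classifier trained on biased shortlisting decisions absorbs that bias as a positive weight on gender:

```python
# A minimal sketch (not the study's actual code) of how a ranking model
# trained on human preferences inherits their bias. All data below is
# simulated; feature names are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1_000

# Identical qualification distributions for both genders.
experience = rng.normal(5, 2, n)   # years of experience
skills = rng.normal(0, 1, n)       # standardized skills score
is_male = rng.integers(0, 2, n)    # 1 = apparently male name

# Simulated recruiter labels: qualifications matter, but a male name
# adds a bonus -- the human bias the panel exhibited.
logit = 0.5 * experience + 1.0 * skills + 0.8 * is_male - 2.5
shortlisted = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression().fit(
    np.column_stack([experience, skills, is_male]), shortlisted
)

# The learned weight on `is_male` comes out positive: the model has
# absorbed the panel's preference for male candidates.
print(dict(zip(["experience", "skills", "is_male"], model.coef_[0].round(2))))
```

Running this prints a clearly positive coefficient on `is_male`, even though male and female candidates were generated with identical qualifications.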


"Even when the names of the candidates were removed, AI assessed résumés based on historic hiring patterns where preferences leaned towards male candidates," said study co-author Dr Marc Cheong in a statement.

"For example, giving advantage to candidates with years of continuous service would automatically disadvantage women who've taken time off work for caring responsibilities."
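To make Dr Cheong's point concrete, here is a similar hypothetical sketch (again with invented data, not the study's) in which the gender column is excluded entirely, yet a model trained to reward continuous service still scores apparently male candidates higher, because career gaps correlate with gender:

```python
# A minimal sketch of a proxy feature leaking gender after names are
# removed. Simulated data only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 1_000
is_male = rng.integers(0, 2, n)

# Career gaps correlate with gender (e.g. time off for caring duties),
# so "continuous service" acts as a proxy even with names removed.
career_gap_years = rng.exponential(0.5 + 1.5 * (1 - is_male))
continuous_service = rng.normal(6, 2, n) - career_gap_years

# Train on past hires that rewarded continuous service ("historic
# hiring patterns"); the gender column is never shown to the model.
hired = continuous_service + rng.normal(0, 1, n) > 6
model = LogisticRegression().fit(continuous_service.reshape(-1, 1), hired)

scores = model.predict_proba(continuous_service.reshape(-1, 1))[:, 1]
print("mean score, apparently male:  ", scores[is_male == 1].mean().round(2))
print("mean score, apparently female:", scores[is_male == 0].mean().round(2))
```

The model never sees gender, yet its scores differ between the groups, which is exactly the blind-spot the quote describes.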

The study relied on a small sample of data, but these types of gender biases have also been documented at large companies. Amazon, for example, had to shut down an algorithmic hiring tool after discovering it was discriminating against female applicants, because its models had been trained predominantly on résumés submitted by men.

"Also, in the case of more advanced AIs that operate within a 'black box' without transparency or human oversight, there is a danger that any amount of initial bias will be amplified," added Dr Cheong.

The researchers believe the risks can be reduced by making hiring algorithms more transparent. But we also need to address our inherent human biases before they're baked into the machines.
