This article was published on May 31, 2018

AI is better than you at hiring diversely



Countless studies show that diversity — whether it’s based on race, age, gender, or socioeconomic status — is good for business. It adds more perspectives, opinions, knowledge, and skills to the table. But we know that companies, big and small, are still facing issues with hiring diverse workforces.

The reality is, we are inherently biased. We can’t stop ourselves from automatically favoring people who resemble us. That’s why it might be time to admit that tech could do a better job at hiring than we do.

In the past few years, we’ve seen the arrival of a number of startups aiming to fight unconscious bias in hiring — using software. Paris-based company Goshaba, for example, lets job candidates play cognitive games to make the recruiting process more efficient and inclusive.

The company was co-founded by Camille Morvan, who taught cognitive science and organizational psychology at Harvard. In 2014, she switched to entrepreneurship following a simple observation: recruiters tend to fixate solely on CVs and cover letters while ignoring candidates’ soft skills. But things are changing, says Morvan:

We’re seeing a real shift with large corporates becoming increasingly convinced of the benefits of objective, data-based, fair recruiting. In particular, they have observed the danger of unconscious biases in recruiting. Diversity is not only a key ethical question for companies but it helps them attract and retain the best talents.

For example, we will shortly be working with EDF Energy to target candidates just starting their career from school or university. Diversity and inclusion is a huge strategic programme for the energy firm, in a sector that has been quite white and male-dominated. They want to change that.

Headstart, which is based in London, has a similar mission but uses machine learning to determine which candidates are the best technical and cultural fits.


The recruitment process was designed to help companies move away from purely qualification-based hiring and also take personality, interests, and motivations into account. Letting algorithms match applicants with the roles they best fit can significantly reduce unconscious bias.
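One way such matching can work, in spirit, is to score applicants and roles on the same set of traits and pair each applicant with the closest profile. The traits, numbers, and similarity measure below are purely illustrative; Headstart and similar platforms do not publish their actual models.

```python
import math

# Illustrative trait-based matching: applicants and roles are scored on the
# same traits (e.g. risk tolerance, attention to detail, planning) and paired
# by profile similarity. All numbers and trait choices here are made up.

def cosine_similarity(a, b):
    """Cosine of the angle between two trait vectors (1.0 = identical profile)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

applicant = [0.8, 0.4, 0.6]          # this applicant's trait profile
roles = {
    "sales":      [0.9, 0.3, 0.5],   # hypothetical ideal profile for sales
    "accounting": [0.2, 0.9, 0.8],   # hypothetical ideal profile for accounting
}

# Recommend the role whose ideal profile is closest to the applicant's.
best_role = max(roles, key=lambda r: cosine_similarity(applicant, roles[r]))
```

Matching on measured traits rather than on names or schools is what removes the obvious demographic signals; how fair the outcome is still depends on what the trait scores themselves correlate with.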

Designing around bias

Siri Uotila is a research fellow at the Harvard Kennedy School’s Women and Public Policy Program. She has done extensive research on how bias affects our decision making and on how we can design environments that leave less room for biased decisions.

According to Uotila, there are many ways to improve hiring decisions – and, from an HR perspective, to make the hiring process more effective in general.

What we always recommend is to never do unstructured interviews. If you want to do interviews as the first step of hiring, make sure they are structured and equal for all applicants. What we would encourage even more than interviewing is blind recruiting and requiring a work-sample test.

Blind recruiting means removing personal information from an application: your name and anything else that signals your demographics. This lets the essential parts of your application — your skills and qualifications — stand out.
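In code, blind screening amounts to stripping identifying fields before a reviewer ever sees the application. This is a minimal sketch; the field names are hypothetical, and a real system would also need to scrub demographic hints from free-text fields.

```python
# Minimal sketch of blind screening: remove fields that signal demographics
# before the application reaches a reviewer. Field names are illustrative.

IDENTIFYING_FIELDS = {"name", "photo", "date_of_birth", "gender", "address", "university"}

def blind_screen(application: dict) -> dict:
    """Return a copy of the application with identifying fields removed."""
    return {k: v for k, v in application.items() if k not in IDENTIFYING_FIELDS}

applicant = {
    "name": "Jane Doe",
    "gender": "female",
    "skills": ["Python", "SQL"],
    "work_sample_score": 87,
}
print(blind_screen(applicant))
# {'skills': ['Python', 'SQL'], 'work_sample_score': 87}
```

Only the skills and the work-sample score survive the screen, which is exactly the combination Uotila recommends reviewers judge on.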

This is a tried and tested method. One of the most notable examples comes from classical orchestras. For decades they were heavily dominated by male players, with the share of female musicians sometimes as low as five percent. After the issue was recognized in the 70s and 80s, musicians were required to audition behind a curtain, which increased the number of female players significantly.

The other way to make hiring processes more effective and fair is to require work samples. Instead of letting a potential employee explain what they are good at, how about putting them to a test? Let them show their skills rather than describe them.

Let AI do the hiring

This is where Priyanka Jain, head of growth at the hiring platform company Pymetrics, comes in. Pymetrics offers large companies a hiring platform that excludes personal information about applicants, making the process as unbiased as possible.

The goal for Pymetrics is to give all applicants an equal chance at being considered for a job, no matter their gender, ethnicity, or socioeconomic background.

Pymetrics has you play neuroscience games built around different tasks: how prone are you to taking risks, for example, or how impulsive are you? These tasks assess your cognitive and emotional traits, and the data they gather, Jain explains, is much denser than a resume could ever be.

So how does it work? You log in to the platform. You don’t send in a resume. You solve the tasks, and depending on how you score, you either will or won’t be invited to a second-round interview.

Example of a task you solve through the Pymetrics hiring platform: for each pump of the balloon, you collect some money. The more you pump, the more you collect — but the balloon will burst at some point. Is it better to pump less and collect smaller amounts across many balloons, or pump more and collect larger amounts on fewer?
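The trade-off the balloon game measures can be made concrete with a small expected-value sketch. The payout and burst parameters below are invented for illustration; they are not Pymetrics’ actual settings.

```python
import random

def play_balloon(pumps, burst_range=32, cents_per_pump=5, rng=random):
    """Pump `pumps` times; return the payout in cents, or 0 if the balloon bursts.
    The burst point is drawn uniformly from 1..burst_range (illustrative model)."""
    burst_at = rng.randint(1, burst_range)
    return 0 if pumps >= burst_at else pumps * cents_per_pump

def expected_payout(pumps, burst_range=32, cents_per_pump=5):
    """You keep the money only if the burst point exceeds your pump count."""
    return (burst_range - pumps) / burst_range * pumps * cents_per_pump

# With a uniformly distributed burst point, the expected payout peaks at
# half the range: pumping cautiously or recklessly both earn less on average.
best = max(range(1, 32), key=expected_payout)   # -> 16 pumps
```

How far a player’s chosen pump count sits from that optimum is one simple way a game like this can quantify risk appetite.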

This method works very similarly to the blind-screening approach Uotila recommends for avoiding bias. And according to Jain, the results already speak for themselves.

Pymetrics’ clients are usually very large companies (like Unilever, Tesla, and LinkedIn), because developing accurate algorithms that understand the skills a company needs requires data from many employees.

According to Pymetrics’ client impact statistics, 18 percent more women and 16 percent more people from minority backgrounds were hired. Furthermore, more people with a community-college background were hired than ever before.

Can AI eliminate bias completely?

It’s clear that technical solutions like Pymetrics can help people with more diverse backgrounds get their foot in the door, but what happens when they’ve actually gotten the job? Can we ensure that fairer hiring policies will also result in promotions for more diverse employees?

Uotila admits that this is another challenge companies will face. At some point the algorithms’ reach ends, and people – co-workers, managers, and employees – have to interact, and there bias will always occur. But Uotila thinks that if more and more large companies implement blind screening and solutions like Pymetrics’ hiring platform, it is a good step in the right direction.

It’s very hard to say whether this will also help more women and minorities get promoted to higher layers after having been at a company for a while. However, if diversity numbers improve in general, that will at some point become the new standard.

It seems there definitely are some tangible solutions for securing unbiased hiring and increased diversity. And as Pymetrics’ results already show, tech can play a vital role in that solution. The question now is whether it can also have a “trickle up” effect and secure more diversity in the higher layers as well.

Many larger brands now recognize the significant commercial benefits that come from building a diverse workforce. EDF Energy is passionate about recruiting from diverse backgrounds, which is why the upcoming edition of The EDF Energy Pulse Awards includes this theme. 

“It’s been proven that you get better innovation, performance and business outcomes from diverse groups of people,” says Fiona Jackson, head of diversity, inclusion and employer branding at EDF Energy.

The EDF Energy Pulse Awards helps entrepreneurs and startups turn innovative ideas and products into reality through support and investment. Find out more or join the conversation.
