This article was published on July 9, 2019

This AI-powered tool helps you write effective and inclusive job ads


HR departments everywhere, here’s a little secret: women don’t respond well to job ads requesting “superstars” or “ninjas.” That’s what hiring platform Unitive (now part of HR-software company TalVista) discovered in 2015 when researching how to take gender bias out of the employee recruitment process, including writing job descriptions, evaluating applications, and interviewing applicants.

Although it’s illegal to target job ads at a specific race, religion, or gender, research has shown time and again that gendered words and phrases like “boastful” and “sympathetic” reinforce gender stereotypes, especially in the workplace.

A team of social scientists argued that biased language in job ads sustains gender inequality. They coded a list of adjectives and verbs, labeling them as either feminine or masculine, then searched a popular job site for those gendered words.

The researchers found that job ads in typically male-dominated fields, like software programming, used masculine-coded language such as “competitive,” “dominate,” and “leader.” They also discovered that the mere presence of masculine words discourages women from applying, because it makes them feel like “they don’t belong.”

Examples of gendered words commonly used in job ads that may deter women from applying, as identified by Unitive.
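To get a feel for how such an audit works, here’s a minimal Python sketch in the spirit of that study: label a handful of words as masculine- or feminine-coded, then count how often each group appears in an ad. The word lists and the example ad are illustrative only, not the researchers’ actual data.

import re

# Illustrative word lists, not the study's actual coded vocabulary.
MASCULINE_CODED = {"competitive", "dominate", "leader", "ninja", "superstar"}
FEMININE_CODED = {"support", "responsible", "understanding", "dependable", "commitment"}

def gendered_word_counts(ad_text):
    """Count how many masculine- and feminine-coded words appear in the ad."""
    words = re.findall(r"[a-z]+", ad_text.lower())
    return {
        "masculine": sum(w in MASCULINE_CODED for w in words),
        "feminine": sum(w in FEMININE_CODED for w in words),
    }

ad = "We need a competitive ninja who can dominate deadlines and support the team."
print(gendered_word_counts(ad))  # {'masculine': 3, 'feminine': 1}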

Employing AI to reveal gender bias

Gender imbalance is apparent throughout the entire job-search process: men apply for a job when they meet only 60 percent of the qualifications, but women apply only if they meet 100 percent of them.

The reason we all should care about this isn’t just political — gender equality is good for business, too. According to research, diverse teams build better products. A study by the Boston Consulting Group (BCG) found that diverse companies produce 19 percent more revenue. Job ads that deter women from applying are not only problematic but also bad for business.

Terms like “proven” and “under pressure” used in job ads tend to attract more male candidates. But when companies use gender-neutral descriptions, they draw a broader applicant pool and fill positions three weeks faster than with ads using biased language. That’s according to Textio, an AI-powered writing-enhancement service that analyzes job descriptions in real time, highlighting jargon and words often perceived as masculine or feminine.

Using data science to reveal hidden gender bias in job descriptions, Textio works like a spell-checker for biased language, offering replacement options for every highlighted phrase where a gender-neutral synonym is available. The tool also provides a “Tone Meter” showing the overall gender tone of a document compared with the industry as a whole. Some of Textio’s suggestions may seem minor at first glance, but swapping out “exceptional” for “extraordinary” has been shown statistically to attract more female applicants.

Textio’s Tone Meter helps employers stay gender-neutral.
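Conceptually, the spell-check-style part of such a tool can be sketched in a few lines of Python: flag known gendered terms, propose neutral substitutes, and roll the matches up into a rough tone label. The replacement table and the scoring below are invented for illustration and are not Textio’s data or method.

# Hypothetical replacement table, for illustration only (not Textio's data).
NEUTRAL_ALTERNATIVES = {
    "ninja": "expert",
    "rockstar": "skilled professional",
    "dominate": "excel in",
    "competitive": "motivated",
}

def review_ad(ad_text):
    """Suggest neutral replacements and return a crude overall tone label."""
    suggestions = []
    flagged = 0
    for term, neutral in NEUTRAL_ALTERNATIVES.items():
        if term in ad_text.lower():
            flagged += 1
            suggestions.append('Consider replacing "{}" with "{}".'.format(term, neutral))
    tone = "masculine-leaning" if flagged >= 2 else "roughly neutral"
    return suggestions, tone

tips, tone = review_ad("Seeking a competitive rockstar to dominate our market.")
print(tone)  # masculine-leaning
for tip in tips:
    print(tip)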

Wicked, maniacal Amazon

Marissa Coughlin, Senior Director of Communications at Seattle-based Textio, believes employers write job posts to reach people who bring the culture they want. “The language used can be pretty indicative of their company culture,” Coughlin said.

In 2017, Textio analyzed 10 companies and the language patterns in their job posts. It found that Amazon used “wickedly” 33 times more often than the rest of the tech industry, and “maniacal” 11 times more often. “Hiring language like this can become very cyclical. Wickedly, for example, has been shown to statistically attract more men to a role,” Coughlin added. “Ultimately, Textio is enabling people to uncover these patterns and change them before they publish the job post so they can have a diverse result.”

What qualifies as gendered language?

A language pattern is considered gendered if it statistically changes the proportion of men and women who respond to a job post. Textio’s bias tool recently found that whenever a man is hired, the original job post he responded to averages almost twice as many masculine-tone phrases as feminine. And in jobs where a woman was hired, the results were the exact opposite: twice as many feminine-tone phrases as masculine in the job post.
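To make that definition concrete, here is a small, self-contained Python sketch of the kind of test one could run, using made-up counts: compare the share of women who responded to posts containing a given phrase against posts without it, and check whether the difference is statistically significant. This illustrates the general idea only; it is not Textio’s actual methodology.

from math import sqrt, erf

def two_proportion_z_test(women_a, total_a, women_b, total_b):
    """Two-sided p-value for a difference in two response proportions."""
    p_a, p_b = women_a / total_a, women_b / total_b
    pooled = (women_a + women_b) / (total_a + total_b)
    se = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (p_a - p_b) / se
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical counts: 120 of 400 applicants were women for ads using a phrase,
# versus 190 of 400 for comparable ads without it.
p = two_proportion_z_test(120, 400, 190, 400)
print("p-value: {:.6f}".format(p))  # a small p-value suggests the phrase is gendered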

“The bias in your original job post or recruiting email predicts who you’re going to hire. This makes sense; the language you use changes who applies to your job, and you’re much more likely to hire a woman into a tech role if your pipeline has several women to consider,” Coughlin explained. 

Location matters too

Textio’s research found that different phrases perform differently depending on geography. “The corporate cliché ‘synergy’ in a job listing works particularly well in Salt Lake City, Honolulu, and Phoenix, but not in Miami, Philadelphia, or Washington, DC. Whereas the phrase ‘intense’ helps jobs fill quickly in Portland, Denver, and Dallas, but not in Cleveland, San Francisco, or Chicago,” said Coughlin.

Textio’s bias meter works to avoid not only gender bias but minority bias too. “One compelling example with respect to different demographic profiles is that Textio sees that corporate clichés turn away candidates in general, but they tend to deter candidates from underrepresented groups at a higher rate,” Coughlin said.

Measuring TNW’s gender bias

The first step to improving diversity in the workplace is taking a long, hard look at your own. Even companies that make a conscious effort to recognize bias can fall short when it comes to diversity and inclusion. So we put TNW to Textio’s gender test using a current vacancy for a Senior Back-end Developer.

The results, to put it lightly, weren’t great. The tool found that the ad “missed equal opportunity statements” and “included fixed mindset language.” The Tone Meter showed similar results, deeming the job ad “very masculine” compared with over 152,000 other engineering job posts in the Netherlands. Considering that the programming industry is already very male-dominated, this is nothing to be proud of.

TNW’s results from its job ad for a software engineer.

To improve diversity and inclusion in industries where women are seriously underrepresented, extra attention should be paid to language that might discourage women, people of color, or members of minority groups from applying. So TNW definitely needs to do better here in the future.

Finding the gender-neutral alternative  

Gendered language in job postings that emphasizes competitiveness and assertiveness over teamwork and relationships may discourage women from applying. A study of job advertisements in the UK identified the top male-gendered words as “analyze,” “competitive,” “active,” and “confident,” whereas the top female-gendered words were “support,” “responsible,” “understanding,” “dependable,” and “commitment.”

A study published by Hire More Women in Tech identified gender-neutral alternatives to popular words and phrases used in job ads.

Credit: Hire More Women in Tech

It’s worth keeping in mind that while technology is black and white, the world of humans is not. We can never solely rely on technology to prevent gender bias in job ads, but tools like Textio have the potential to create broader shifts in hiring strategies. 

By combining a passion for people with the power of today’s intelligent machines, Randstad supports people and organizations in realizing their true potential. Learn more about their innovative HR solutions here.
