This article was published on September 9, 2017

Researchers’ algorithm highlights gender bias in sports journalism

Three Cornell researchers built a language-model-based algorithm to study gender bias in sports journalism, and it is apparently capable of flagging atypical questions that human observers might miss. Their research paper was originally published last year and was discussed by the New York Times this week.

The algorithm was built specifically to ferret out questions that weren’t related to the topic of tennis. By comparing in-game commentary to post-game questions and noting the differences, the researchers trained it to recognize which questions were more on-topic than others. They discovered that women were more likely than men to receive these atypical questions.
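The researchers’ exact implementation isn’t reproduced here, but the core idea can be sketched: train a simple language model on in-game commentary, then score each interview question by how surprising it looks under that model, with higher perplexity meaning less game-related. The minimal Python sketch below assumes a bigram model with Laplace smoothing; the function names and toy data are illustrative, not the researchers’ actual code.

```python
import math
from collections import Counter

def train_bigram_model(sentences):
    """Count unigrams and bigrams from tokenized in-game commentary."""
    unigrams, bigrams = Counter(), Counter()
    for tokens in sentences:
        padded = ["<s>"] + tokens + ["</s>"]
        unigrams.update(padded)
        bigrams.update(zip(padded, padded[1:]))
    return unigrams, bigrams

def perplexity(tokens, unigrams, bigrams):
    """Per-token perplexity of a question under the commentary model.
    Laplace (add-one) smoothing keeps unseen bigrams from zeroing out."""
    vocab = len(unigrams)
    padded = ["<s>"] + tokens + ["</s>"]
    log_prob = 0.0
    for prev, cur in zip(padded, padded[1:]):
        p = (bigrams[(prev, cur)] + 1) / (unigrams[prev] + vocab)
        log_prob += math.log(p)
    return math.exp(-log_prob / (len(padded) - 1))

# Toy data: commentary sentences and two hypothetical post-match questions.
commentary = [
    "she breaks serve and takes the first set".split(),
    "a strong forehand winner down the line".split(),
]
unigrams, bigrams = train_bigram_model(commentary)

on_topic = "how did you break serve in the first set".split()
off_topic = "was there anything in particular you bought when shopping".split()

# Higher perplexity = less like game commentary, i.e. more "atypical".
print(perplexity(on_topic, unigrams, bigrams))
print(perplexity(off_topic, unigrams, bigrams))
```

Under this scheme, a question full of tennis vocabulary scores close to the commentary model, while shopping talk stands out as statistically unusual, which is the signal the researchers used to compare questions posed to women and men.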

Several of the questions that scored as more out there, meaning less likely to relate to tennis, skewed towards what one might call stereotypically “female” questions, such as “Do you know of players who get their nails done on-site?” or “Was there anything in particular you bought when you went shopping?”

Meanwhile, the men were asked questions slightly closer to the point, such as whether they had doubts, or “What does it mean to you if you are indeed an inspiration for people who are not tall?”

I can picture someone asking a woman those latter questions. I can’t, in a hundred years, picture someone asking a man about his nails and his shopping habits.
