This article was published on April 20, 2021

US companies need to hold themselves accountable for racist algorithms — or the FTC will do it for them

The regulator also cautioned against overpromising about AI

The US Federal Trade Commission has issued a stark warning to companies using sexist and racist algorithms: hold yourself accountable — or the regulator will do it for you.

An official blog post by staff attorney Elisa Jilson noted that the FTC Act prohibits the sale or use of racially biased algorithms.

Jilson added that it’s illegal to use biased algorithms that lead to credit discrimination on the basis of race, color, religion, national origin, sex, marital status, age, or because a person receives public assistance.

“It’s important to hold yourself accountable for your algorithm’s performance,” Jilson wrote. “Our recommendations for transparency and independence can help you do just that. But keep in mind that if you don’t hold yourself accountable, the FTC may do it for you.”

Jilson expressly warned companies not to exaggerate what their algorithms do and whether they deliver fair results. Those that can’t back up their claims with evidence are violating FTC rules.

“The result may be deception, discrimination – and an FTC law enforcement action,” she wrote.

The rules could affect a vast range of businesses, from car manufacturers that overstate the capabilities of self-driving vehicles to AI hiring companies whose algorithms are built on data that lacks gender diversity.

Ryan Calo, a University of Washington law professor, said the blog post signals a shift in how the FTC is thinking about enforcing the rules.

“The concreteness of the examples coupled with repeated references to statutory authority is uncommon,” he tweeted.

It remains unclear, however, exactly how the FTC will assess algorithmic bias.

Nonetheless, the blunt warnings about selling discriminatory AI systems and overpromising on their capabilities suggest that stricter enforcement is on the way.
