We hear an awful lot about how artificial intelligence can be used to solve hard statistical challenges, but we hear much less about how it could solve emotional problems.
Yet this field already has a name, affective computing, and one of its leading firms today is Affectiva.
The startup is a spinout of MIT’s Media Lab, where researchers were working on new technologies to enhance emotional communication, and yes, it already started offering ‘Emotion As A Service’ late last year.
The company has what is thought to be the world’s largest database of emotions, gathered using facial recognition technology to analyze over 3.8 million faces from 75 countries and collating over 40 billion different data points.
The company uses the Facial Action Coding System (FACS), originally devised in the 1970s to help carve up, count and ultimately understand facial expressions.
Becoming a certified FACS coder typically takes hundreds of hours of training, whereas Affectiva has trained algorithms to understand and categorize different facial expressions in fractions of a second.
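To give a sense of what FACS coding involves, here is a toy sketch that maps detected facial "action units" (AUs) to candidate emotions. The AU numbers and combinations below follow commonly cited FACS interpretations (for example, AU6 plus AU12 for a felt "Duchenne" smile); this is a simplified illustration, not Affectiva's actual pipeline.

```python
# A few well-known action units from the FACS taxonomy
AU_NAMES = {
    1: "inner brow raiser",
    4: "brow lowerer",
    6: "cheek raiser",
    12: "lip corner puller",
    15: "lip corner depressor",
}

# Commonly cited AU combinations for two basic emotions
EMOTION_RULES = {
    "happiness": {6, 12},   # cheek raiser + lip corner puller
    "sadness": {1, 4, 15},  # inner brow raiser + brow lowerer + lip corner depressor
}

def classify(active_aus: set[int]) -> list[str]:
    """Return emotions whose required action units are all active."""
    return [emotion for emotion, required in EMOTION_RULES.items()
            if required <= active_aus]

print(classify({6, 12}))     # a felt "Duchenne" smile -> ['happiness']
print(classify({1, 4, 15}))  # -> ['sadness']
```

A human coder spots and tallies these action units frame by frame; Affectiva's contribution is doing the same categorization automatically, at speed.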
Its cofounder and chief strategy and science officer is Rana el Kaliouby. She was on the BBC World Service radio show The Forum last night to talk about exactly what they’ve been up to.
“As more and more of our lives migrate online, emotions are missing from our online digital experiences so we want to bring emotional intelligence back into our digital world. We say, with our artificial intelligence technology, we’re bringing artificial intelligence to life.”
This research has enabled the company to come up with some interesting cross-cultural findings so far.
For example, women have been found to express much more positive emotions than men, which isn’t that surprising. But in the US, women smile 40 percent more often than men, while in the UK the researchers found no significant difference.
Like other areas of AI, the technology still has some way to go, as even el Kaliouby admits. Critics argue that FACS alone can’t truly understand the underlying emotions behind a smile pinned to someone’s face.
The company has already launched an SDK so that app developers, designers and researchers can incorporate emotions into their apps today. Market research and gaming are obvious industries that could benefit from understanding feelings in real time.
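As a rough sketch of how an app might consume such an SDK in real time, the snippet below smooths noisy per-frame emotion scores with a rolling average. The `detect_emotions` function is a hypothetical stand-in stub, not Affectiva's API; real emotion SDKs similarly return per-frame confidence scores.

```python
from collections import deque

def detect_emotions(frame) -> dict[str, float]:
    """Hypothetical stub standing in for an SDK call; returns confidences 0..1."""
    return {"joy": 0.8, "surprise": 0.1}  # canned scores for illustration

class EmotionSmoother:
    """Rolling average over recent frames to stabilize noisy per-frame scores."""

    def __init__(self, window: int = 5):
        self.history = deque(maxlen=window)  # old frames drop off automatically

    def update(self, frame) -> dict[str, float]:
        self.history.append(detect_emotions(frame))
        keys = self.history[0].keys()
        n = len(self.history)
        return {k: sum(scores[k] for scores in self.history) / n for k in keys}

smoother = EmotionSmoother()
print(smoother.update(frame=None))  # {'joy': 0.8, 'surprise': 0.1}
```

Smoothing like this matters for the use cases above: a game reacting to a single noisy frame would feel jittery, while a short rolling window tracks genuine shifts in expression.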