While speaking at today’s TNW conference, Dr. Mitu Khandaker put forth the idea that one potential avenue in the evolution of AI lies not in engineering or machine learning, but in game design.
Dr. Khandaker, creative partnerships director with Spirit AI and an assistant professor at the NYU Game Center, was tackling the problem of how to create “compelling emotional involvement with simulated characters.” The key to giving AI more conversational skills, she says, is to design them with context. In other words:
“We need to give our AI stories. It’s not enough to expect our AI to exist, unstuck from time and space, devoid of context. If we do that, we’re not really motivated to humanize them. To be compelling to interact with, they need to live within a story, within a certain narrative context.”
According to Dr. Khandaker, we all have underlying mental models of how a conversation is supposed to work, and a user’s particular idea of a conversation ought to be supported — in other words, they ought to feel that the thing they’re talking to can understand them. To help our AI do that, we need designers and creators who are used to giving AI particular voices and lives.
Who does that on a daily basis? Game designers, who routinely craft AI-driven figures in games known as non-player characters (NPCs).
Dr. Khandaker points out that there’s a difference between crafting an AI with good conversational skills and convincing users they’re talking to a human. The AI we converse with isn’t necessarily the same as “conversational AI.” For example, we treat Siri or Alexa more like search engines than like people we’re actually talking with.
But one insight gleaned from gamers is that they’re more willing to accept the internal boundaries of the world they’re in when it’s a compelling one — in other words, they’re willing to buy into the fiction.
Giving an AI a context and a story might, for example, give users more reason to be polite to Siri or Alexa. Currently, some users address these AI with abusive or problematic language, which is troubling since many of them are coded female — Amazon has already had to rewrite Alexa’s script in response to numerous attempts to sexually harass her.
Having AI become characters which represent diverse perspectives could help integrate them more comfortably into our lives. Dr. Khandaker believes game design could be the next step in creating AI that feels — and doesn’t just sound — more realistic and immersive.