Chatbots and similar forms of applied artificial intelligence are on the rise. Sure, it may feel a bit strange to interact with robots, but most communication attempts actually turn out to be successful.
Chatbots came onto the scene in 2011, as technology created new ways to provide customer support that aligned with – and evolved alongside – ever-changing communication habits and demands.
Chatbots are seen as a fun and easy way to handle customer service. Siri and Alexa are just two of the many conversational assistants businesses rely on.
The future is Nao
Earlier this year, Hilton started a test with a concierge robot for its chain of hotels in the US.
The project is based on a partnership with IBM’s Watson program.
The robot – named Connie, after the chain’s founder Conrad Hilton – is available to answer questions from customers. Connie knows everything about the hotel, neighborhood restaurants, tourist attractions, and so on.
The robot used in the project is Nao, a so-called humanoid bot from French company Aldebaran. Powered by Watson services for natural language processing, Connie also employs WayBlazer – an IBM partner that offers personalized recommendations for travelers.
Peppered with emotion
Just like Nao, Pepper is a popular robot used in several trial projects as well. The robot was co-developed by Aldebaran and Japan-based SoftBank Robotics.
Pepper is an emotional humanoid bot, meaning it recognizes human emotions and acts accordingly.
Once again, IBM’s Watson services are making that happen. Pepper uses Watson through an SDK that enables developers to tap into IBM’s cognitive computing, to tailor the robot’s abilities to the customer’s specific needs.
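To make the idea of "recognizing emotions and acting accordingly" concrete, here is a minimal sketch of an emotion-aware response loop. It is purely illustrative: the function names and keyword matching are hypothetical stand-ins, not the actual Watson SDK API, which in a real deployment would handle the emotion analysis in the cloud.

```python
# Hypothetical sketch of an emotion-aware response loop, similar in spirit
# to how a Pepper-style robot might route user speech through an
# emotion-analysis service. All names here are illustrative; this is NOT
# the real Watson SDK API.

def analyze_emotion(utterance: str) -> str:
    """Toy stand-in for a cloud emotion-analysis call.

    A real robot would send the utterance to a service such as IBM
    Watson; here we just match a few keywords.
    """
    lowered = utterance.lower()
    if any(word in lowered for word in ("great", "thanks", "love")):
        return "joy"
    if any(word in lowered for word in ("broken", "angry", "terrible")):
        return "anger"
    return "neutral"


def respond(utterance: str) -> str:
    """Pick a reply style based on the detected emotion."""
    emotion = analyze_emotion(utterance)
    replies = {
        "joy": "Glad to hear it! How else can I help?",
        "anger": "I'm sorry about that. Let me get a human colleague.",
        "neutral": "Okay. Could you tell me a bit more?",
    }
    return replies[emotion]


print(respond("This kiosk is broken and I'm angry"))
```

The point of the pattern is the split: a generic robot body on one side, a swappable cognitive service on the other, which is exactly what an SDK-based integration makes possible.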
Thanks to the cognitive capabilities embedded in the robot, people benefit from this new technology in ways they never expected to be possible.
In a commercial environment, today’s robots mainly act as eye-catchers. When a customer arrives at a bank, for example, it’s cool to have a robot tell them which meeting room has been booked for their appointment.
The presence of the robot, however, opens the door to a new level of efficiency. In the doctor’s waiting room, the robot could come up and ask a few questions – such as the patient’s name or the reason for the visit – collecting information for the doctor and helping streamline the administration of patient visits.
In this type of interaction, the use of natural language is key. But since natural language happens to be one of Watson’s strongest capabilities – and Watson is a self-learning solution – the adoption of robots and cognitive computing is expected to increase rapidly over the next couple of years.
This post was brought to you by IBM Bluemix. Yes, TNW sells ads. But we sell ads that don’t suck.