
This article was published on January 20, 2014

Microsoft Research built a smart elevator that uses AI to figure out what floor you’re going to


Microsoft Research is taking artificial intelligence to a whole new level. Imagine a smart elevator that can figure out which floor a person wants, based on their history and other factors. Okay, now you don’t have to imagine it anymore.

In an interview with Bloomberg, Head of Microsoft Research Peter Lee explains that AI is the company’s biggest focus right now. Check it out for yourself (the elevator part starts at 2:40):

Lee explains how a Microsoft Research team set up a bunch of sensors in front of the elevators. These watched what people did for about three months, without any additional programming and without facial recognition software.

Over that time period, the AI system learned how people behaved and began to understand their intentions. After the training period, the learning portion was turned off, and the intelligent system could control the elevator and act on the user’s behalf.
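The article doesn’t say what model Microsoft Research actually used, but the train-then-freeze pattern Lee describes can be sketched in a few lines. Everything below — the class name, the frequency-count “model,” the person/hour features — is an assumption for illustration, not MSR’s implementation:

```python
from collections import Counter, defaultdict


class ElevatorPredictor:
    """Toy sketch of the train-then-freeze setup described above.

    During the learning period, observed trips are recorded; after
    freeze(), no new data is taken in and the system only predicts.
    """

    def __init__(self):
        # (person, hour-of-day) -> how often each destination floor was chosen
        self.history = defaultdict(Counter)
        self.learning = True

    def observe(self, person, hour, floor):
        """Record a trip — only while the learning period is active."""
        if self.learning:
            self.history[(person, hour)][floor] += 1

    def freeze(self):
        """End the training period; from now on the system only acts."""
        self.learning = False

    def predict(self, person, hour):
        """Guess the most likely floor, or None if there's no history
        (in which case the rider presses a button as usual)."""
        counts = self.history.get((person, hour))
        if not counts:
            return None
        return counts.most_common(1)[0][0]
```

In the lunch scenario Lee gives below, a rider who usually goes to floor 2 at noon would get that floor predicted automatically, while an unknown rider would simply fall back to the buttons.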

Here’s Lee offering a specific use case:

If your environment knows, for example, that it’s lunch time, that you had spoken yesterday about having lunch with a colleague on the second floor, and that it notices that you seem to be now leaving your office to go to the elevator, the elevator can be smart enough to take you, without your need to operate anything, to your colleague.

Lee’s job is to think further ahead than just what will be the next killer device to hit the market. Most of us currently sit down in front of a computer or take out our phone on the go and operate it, but Microsoft Research is looking to the next era of computing: “We think in the future, you won’t be operating computers, but instead computers will be working on your behalf.”

See also: Microsoft Research uses Kinect to translate between spoken and sign languages in real time, and Microsoft Research and the UN team up to build a computational model of ecosystems across the world.

Top Image Credit: Bloomberg
