Napier Lopez is a writer based in New York City. He's interested in all things tech, science, and photography related, and likes to yo-yo in his free time. Follow him on Twitter.
3D is here to stay. Well, probably not with TVs, but instead with a number of technologies that can better interact with three-dimensional space, such as 3D printing and depth-sensing cameras. In fact, Intel's RealSense camera has easily been the company's biggest talking point this CES, rather than a fancy new processor or chipset.
Needless to say, we spent some time checking the technology out; here are some of the coolest uses we saw for RealSense.
When strolling around Intel's booth, I spotted an animated character contorting its face into a variety of realistic facial expressions – I assumed it was a tech demo for a new graphics chip. But then I saw it was replicating the facial expressions of the person standing in front of the monitor (and the RealSense camera mounted on it) with almost scary accuracy.
The company behind the performance capture, Faceshift, has previously worked with commercial applications, but the RealSense technology they showed off was their first consumer product. I gave it a try, and frankly, was blown away. RealSense was able to capture minute movements in my lips, cheeks and eyes with almost no lag at all.
Our videogame avatars are about to get way more realistic.
When we reported on HP's crazy Sprout computer in late October, it seemed like a gimmicky product more than something that would ever be legitimately useful. But seeing it in person, I can't deny there are some fascinating possibilities to consider.
If you're not familiar, it's an all-in-one PC with two screens – one's a monitor, the other is a projection on your desk. Both are touch-enabled (you can use a stylus on the bottom one, making it like a giant Wacom tablet), and you can use the embedded RealSense camera to scan objects onto the computer to later interact with them.
We saw the computer scan a variety of items ranging from bananas to a face mask, and the machine handled the process with aplomb. Once the scan is complete, you have an object you can play around with and interact with on the computer, and easily integrate into your design projects, even if just as a starting point.
The Sprout is a first generation product, so it may not get much traction in its current form. But if Intel is able to get the adoption for RealSense it’s hoping for, it could be a glimpse at the future of how we interact with our devices.
Okay, so maybe it wasn't a hologram in the typical sci-fi sense of a completely three-dimensional projection, but Intel did show off a setup where a piano keyboard appeared to float in mid-air via an array of mirrors and lenses. You could play the keyboard by basically tapping at the air. And again, it worked scarily well.
In fact, it worked almost too well: I assumed the computer featured some sort of ultrasonic haptic feedback, because it felt like I was pressing something as I moved around the keys in mid-air. When I asked the on-site representative, he said there wasn’t any, and, well, I thought I was going crazy. But he assured me that I wasn’t; apparently the perception of touch feedback with holograms is a common experience. Weird. But cool.
The future is 3D, just not the kind you thought
These were just some of my favorite uses for RealSense that we were able to check out on the show floor, but the company also showed off some other applications, such as spatially-aware drones, robot insects, and even creating a 3D model of a person by circling around them with a tablet.
It’s hard to say when exactly RealSense will really hit its stride with the mainstream, but with Intel behind it in full force, it’s only a matter of time. I sure look forward to when it does.