“I was at Mobile World Congress, in the booth of Dolby, and I had on these headphones. I was watching a trailer from some action movie on a phone, and it sounded great, but there was something missing. I couldn’t feel anything.” – Dennis Sheehan, of Immersion.
That’s precisely the problem, as our world has become one of interactive glass, largely devoid of buttons. Argue what you will, but the disappearance of tactile feedback from our device interactions has taken some getting used to, rather than ever feeling natural. That’s precisely what Immersion is working to fix.
You might have never heard of Immersion, and that’s probably for a good reason. They’re the company behind the company. They’re the ones who are making the software that drives the haptic feedback hardware in many of the devices that you’re using today, including the Galaxy S III and the Galaxy Note. When you touch a button on the screen of these devices, you’re touching the work of Immersion.
But we’re not just talking about button presses. When Sheehan asked me about my familiarity with haptics, I had to admit that most of it had to do with keyboards. What he showed me, however, was the future of where touch is going on mobile, and a bit of that future already feels like something we can use today.
The first thing that I saw was the integration of haptic feedback into an Android tablet, where the keyboard actually felt good. The sloppy, near-flapping feeling has always been my complaint about the touch keyboards of most Android phones, but this felt punchy and satisfying.
Next up was a guitar application on the Galaxy Note called Solo that used the Immersion SDK to give tactile feedback to each string. From the low E to the high E, each string feels different when you play a guitar, and the app did an amazing job of reproducing the density of each one.
It could be argued that interactions like these are expected, and thus not really a big deal. So what’s next? This is where I was blown away. Haptics offer a depth to simple interactions that is beyond anything we have today, and could very well be the UX of tomorrow.
Sheehan showed me a demo application on his Nexus S that allowed for in-app highlighting via haptics. For instance, if you have an email inbox that’s overloaded, a developer could tie in with the Immersion SDK to have an email app that produces a vibration when an important contact’s email scrolls past on your screen. The demo that I used had light vibration for the scrolling movement, but it ramped up in intensity as I got closer to the highlighted contact’s message.
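Immersion hasn’t published the code behind this demo, but the effect I felt is easy to describe in logic: a faint baseline buzz while scrolling that ramps toward full strength as the highlighted message approaches. As a rough, platform-agnostic sketch (the function name, the linear ramp, and all parameters are my own assumptions, not Immersion’s API), it might look like this:

```python
def ramp_intensity(current_index: int, target_index: int,
                   base: float = 0.1, peak: float = 1.0,
                   falloff: int = 5) -> float:
    """Return a vibration amplitude in [0, 1] that grows as the
    scroll position nears the highlighted message.

    base    -- the faint "scrolling" buzz felt everywhere
    peak    -- full-strength buzz at the target message itself
    falloff -- how many list items away the ramp begins
    """
    distance = abs(current_index - target_index)
    if distance >= falloff:
        return base  # far away: only the light scrolling vibration
    # Linearly interpolate from base up to peak as distance shrinks
    return base + (peak - base) * (1 - distance / falloff)
```

An email app would call something like this on every scroll step and feed the result to the device’s vibration motor, so the buzz under your finger grows as the important message slides into view.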
It’s one of those a-ha moments where everything just makes sense. Why weren’t we getting this sort of touch-intensive feedback already?
The next demo was of something that could easily be a Facebook photo gallery. While flipping through pictures, I would feel increasing levels of feedback based upon how many comments a picture had. No comments? No vibration. Numerous? A quick buzz told me that something was going on, and I should tap the picture to bring up the comments display.
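The logic of that gallery demo is just as simple to sketch. Again, this is an illustrative guess at the behavior I felt, not Immersion’s actual code; the function name and timing values are hypothetical:

```python
def comment_buzz_ms(comment_count: int, per_comment_ms: int = 15,
                    max_ms: int = 120) -> int:
    """Return a vibration duration in milliseconds for a photo as it
    flips into view: no comments, no buzz; more comments, a longer
    (capped) buzz inviting you to tap for the comment display."""
    if comment_count <= 0:
        return 0  # silent: nothing worth tapping here
    # Scale with activity, but cap it so a viral photo doesn't rattle
    # the whole phone
    return min(comment_count * per_comment_ms, max_ms)
```

Each swipe in the gallery would trigger one short pulse of this duration, so your fingertip learns which photos are worth a second look before your eyes do.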
What’s probably most interesting about properly-done haptic feedback is that it feels very natural. Blame it on the PlayStation or Xbox if you so choose, but we’ve become accustomed to feeling vibrations when we’re interacting with some electronics. When I was scrolling through pictures, it seemed natural that a buzz would mean that I’d need to tap the image to find out more information. The same is true with the increasing vibration while scrolling in an email app. It had that pleasing feel that I’d normally associate with the flipping sound of a Rolodex. It just felt right.
So why don’t we have better haptic feedback in more apps? According to Sheehan, that comes down to developer outreach. More people have to know about what’s available to them, and they have to know how easy it is to integrate haptics into their apps. Immersion has developed tools to make it extremely simple, including the app in the image above, where a developer can find the feedback that they want and then simply use the code displayed on the screen to add the interaction to their app.
The final demo that I saw today was one that I can’t wait to see in action more often. Remember that quote from Sheehan above, where he talked about the lack of feeling with mobile video? Immersion has solved the problem with technology that should be available to the developer public soon. I watched a video clip where haptic feedback made the entire experience more engaging. The audio seemed louder and clearer, and the video had more pop to it. For a few seconds, I was sitting in the future, and it felt very good.
Want more from CTIA 2012? Check out all of our coverage here.