
This article was published on August 1, 2019

AI researchers developed a fully imaginary keyboard for touchscreens and VR

A trio of researchers from the Korea Advanced Institute of Science and Technology (KAIST) recently developed an eyes-free, AI-powered, invisible keyboard interface that positions itself based on where you choose to set your hands when you’re ready to type.

Billions of dollars’ worth of R&D and marketing research go into building better peripherals every year. But, realistically, little has changed since the advent of the mouse and keyboard. Touchscreens and virtual keyboards are a fair proxy for mobile users, but there’s no replacement for a full-sized QWERTY keyboard you can use all ten fingers on.

Various solutions have come along, including voice control and swipe-to-type input, but they’re brittle and far more prone to missed or errant strokes than hardware keyboards. That’s why researchers Ue-Hwan Kim, Sahng-Min Yoo, and Jong-Hwan Kim decided to rethink the entire concept. They developed a keyboard interface that’s fully imaginary.

According to the team’s research paper:

First of all, the proposed I-Keyboard is invisible, which maximizes the utility of the screens on mobile devices. Users can view the content of an application in full screen and type freely at the same time. To further improve usability, I-Keyboard does not have pre-defined layout, shape or size of keys. Users can start typing on any position at any angle on touch screens without worrying about the keyboard position and shape.


The I-Keyboard proposed by the researchers requires no calibration or tuning to work. You simply start typing anywhere on the touchscreen, like you would on a physical keyboard, and it uses deep learning to discern what you’re trying to type. Per the paper:

I-Keyboard comes with DL-based decoding algorithm which does not require a calibration step. The proposed deep neural decoder (DND) effectively handles both hand-drift and tap variability and dynamically translates the touch points into words.
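To get a feel for what that decoder is doing, here’s a rough sketch in Python of the kind of model that maps a stream of raw touch coordinates to characters. To be clear, the architecture, layer sizes, alphabet, and CTC-style greedy decoding below are our own illustrative assumptions for the sake of the example, not the researchers’ actual deep neural decoder.

```python
# Illustrative sketch only: a toy sequence decoder that turns raw touch
# coordinates into characters -- roughly the shape of the problem the DND
# solves. The architecture, sizes, and alphabet are assumptions, not the
# paper's model.
import torch
import torch.nn as nn

ALPHABET = "abcdefghijklmnopqrstuvwxyz' "   # assumed output alphabet
BLANK = len(ALPHABET)                        # CTC-style "no character" token

class ToyTouchDecoder(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        # Each touch point is an (x, y) pair normalized to [0, 1].
        self.rnn = nn.GRU(input_size=2, hidden_size=hidden, num_layers=2,
                          batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, len(ALPHABET) + 1)  # +1 for blank

    def forward(self, touches):                # touches: (batch, seq, 2)
        feats, _ = self.rnn(touches)
        return self.head(feats)                # (batch, seq, vocab) logits

# Example: decode a burst of 12 taps (random here, so the output is gibberish).
model = ToyTouchDecoder()
taps = torch.rand(1, 12, 2)
pred = model(taps).argmax(dim=-1).squeeze(0)

decoded, prev = [], BLANK
for idx in pred.tolist():                      # greedy CTC-style collapse
    if idx != BLANK and idx != prev:
        decoded.append(ALPHABET[idx])
    prev = idx
print("decoded (untrained):", "".join(decoded))
```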

This might sound a bit like magic. How can it possibly know what you’re trying to type if you’re just poking your fingers all over the place? After all, one of the biggest problems with soft keyboards is that it’s virtually impossible to keep your fingers on the correct keys without looking: with no physical edges to cue our sense of touch, our hands gradually drift away from their original position. But there’s some pretty cool science behind I-Keyboard. Instead of trying to pin down finger positions with precision, the algorithm figures out what you’re trying to do and constantly adjusts its invisible keyboard to fit the imaginary one in your head.
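If you want a concrete picture of that kind of continuous adjustment, here’s a deliberately simplified, hypothetical sketch (not the paper’s algorithm): keep re-estimating how far the user’s hands have drifted from where the invisible layout currently sits, and let the layout follow them.

```python
# Hypothetical illustration of drift compensation, not the paper's method:
# the (invisible) key layout keeps shifting toward where recent taps land.
import numpy as np

# Assumed nominal centers for two keys, in normalized screen coordinates.
KEY_CENTERS = {"f": np.array([0.40, 0.50]), "j": np.array([0.60, 0.50])}

class DriftingLayout:
    def __init__(self, alpha=0.1):
        self.offset = np.zeros(2)   # current estimate of hand drift
        self.alpha = alpha          # how quickly new taps update the estimate

    def key_center(self, key):
        return KEY_CENTERS[key] + self.offset

    def observe_tap(self, key, tap_xy):
        # Exponential moving average of how far taps land from where we
        # currently place that key, so the layout follows the user's hands.
        error = np.asarray(tap_xy) - self.key_center(key)
        self.offset += self.alpha * error

layout = DriftingLayout()
# Simulate hands resting 5% of the screen to the right of the starting layout.
for _ in range(30):
    layout.observe_tap("f", KEY_CENTERS["f"] + np.array([0.05, 0.0]))
print("estimated drift:", layout.offset.round(3))  # approaches [0.05, 0.0]
```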

Credit: Kim et al.

There’s still a lot of work to be done on I-Keyboard. While it’s currently able to perform with a whopping 95.8 percent accuracy, it only does so at about 45 WPM. Still, that’s a marginal improvement over current soft keyboard technology.

With further development and better touch interfaces, the developers believe I-Keyboard could become a fully imaginary replacement for physical keyboards. The implications for VR technology – where eyes-free typing is sorely needed – are myriad, and the ability to type anywhere on a touchscreen (even on top of displayed information) could be a game changer for UX design. Imagine reclaiming 100 percent of your phone’s screen real estate, even while typing.
