New Motion-Sensing Keyboard From Microsoft Enables a Wide Range of New Abilities

Microsoft’s research division presented a prototype keyboard that can interpret simple hand gestures at the 2014 ACM CHI Conference on Human Factors in Computing Systems in Toronto, where the work won a Best Paper Award.

What’s most interesting about this new prototype is that it combines gestures with traditional typing, an approach that hasn’t really been tried before. With 64 sensors that work in pairs to detect the hands’ movements as they brush over the keys, the prototype could serve as a bridge between traditional input methods and increasingly prevalent touch devices. The underlying sensing technology is similar to that of Microsoft’s Kinect.
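The article doesn’t describe how those paired sensor readings become gestures, but a rough sketch can illustrate the general idea: track where hand activity is concentrated along a row of proximity sensors and watch how that center of activity moves over time. Everything below, from the sensor layout and sampling format to the threshold values, is a hypothetical illustration rather than a detail of Microsoft’s prototype.

```python
# Hypothetical sketch: inferring a left/right swipe from a time series of
# proximity readings across a row of sensors. The sensor model, data format,
# and thresholds are assumptions for illustration, not Microsoft's design.

from typing import List, Optional

def detect_swipe(frames: List[List[float]], threshold: float = 0.5) -> Optional[str]:
    """Given per-frame proximity readings (0.0 = far, 1.0 = touching),
    estimate whether the hand's center of activity moved left or right."""
    centers = []
    for frame in frames:
        active = [(i, v) for i, v in enumerate(frame) if v >= threshold]
        if not active:
            continue
        # Weighted average of active sensor indices approximates hand position.
        total = sum(v for _, v in active)
        centers.append(sum(i * v for i, v in active) / total)

    if len(centers) < 2:
        return None  # not enough activity to call it a gesture
    delta = centers[-1] - centers[0]
    if delta > 1.0:
        return "swipe_right"
    if delta < -1.0:
        return "swipe_left"
    return None

# Example: a blob of activity drifting from the left sensors to the right ones.
frames = [
    [0.9, 0.8, 0.1, 0.0, 0.0, 0.0, 0.0, 0.0],
    [0.2, 0.9, 0.8, 0.1, 0.0, 0.0, 0.0, 0.0],
    [0.0, 0.1, 0.8, 0.9, 0.2, 0.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.2, 0.8, 0.9, 0.3, 0.0],
]
print(detect_swipe(frames))  # swipe_right
```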

Some gestures can replace existing keyboard shortcuts, such as Alt+Tab for switching back and forth between applications, much as some ergonomic keyboards on the market let users remap their keys. According to Microsoft senior research engineer Stuart Taylor, the idea behind the project is to let users keep their hands on or near the keyboard while they type and gesture.
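In practice, that kind of substitution amounts to a lookup from recognized gestures to the actions their shortcuts would have triggered. The sketch below is purely illustrative; the gesture names, actions, and mapping are assumptions, with app switching being the only example the article itself mentions.

```python
# Hypothetical sketch: mapping recognized gestures to the actions that
# existing keyboard shortcuts would perform. Names here are assumptions.

GESTURE_ACTIONS = {
    "swipe_right": "switch_to_next_app",      # stands in for Alt+Tab
    "swipe_left": "switch_to_previous_app",   # stands in for Alt+Shift+Tab
    "two_hand_lift": "show_desktop",          # stands in for Win+D
}

def handle_gesture(gesture: str) -> None:
    action = GESTURE_ACTIONS.get(gesture)
    if action is None:
        return  # unrecognized gestures are ignored rather than misfired as keystrokes
    print(f"Performing action: {action}")

handle_gesture("swipe_right")  # Performing action: switch_to_next_app
```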

“What we’ve found is that for some of the more complicated keyboard shortcut combinations, performing gestures seems to be a lot less overhead for the user,” said Taylor.

He explained that it’s not meant to replace a mouse, saying, “It’s less about fine-grain navigation, which would still be performed with a mouse or touchpad.”

Although the keyboard is designed to interpret many different gestures, the prototype shown at the conference supported only a few.

There hasn’t been any word on when this keyboard might reach the market, and it’s possible it never will. Still, a product like this could give Microsoft a leg up on the competition.