Smartphone understands gestures

Professor Otmar Hilliges and his team at ETH Zurich have developed a new app that enables users to operate their smartphone with gestures. This development expands the range of potential interactions with such devices.

Reminiscent of sign language: gesture control significantly expands the range of smartphone functionality. (Screenshot: ETH Zurich)

It does seem slightly odd at first: you hold the phone in one hand and move the other in the air above its built-in camera, making gestures that resemble sign language. Sometimes you move your index finger to the left, sometimes to the right. You can spread out your fingers, or imitate a pair of pliers or the firing of a pistol. These gestures are not, however, intended for communicating with deaf people; they are for controlling your smartphone.

By mimicking the firing of a pistol, for example, a user can switch to another browser tab, change the map’s view from satellite to standard, or shoot down enemy planes in a game. Spreading out your fingers magnifies a section of a map or scrolls the page of a book forwards.

All this gesturing wizardry is made possible by a new type of algorithm developed by Jie Song, a Master’s student in the working group headed by Otmar Hilliges, Professor of Computer Science. The researchers presented the app to an audience of industry professionals at the UIST symposium in Honolulu, Hawaii.

In-air gestures around unmodified mobile devices. (Video: ETH Zurich)

Intelligent programming conserves memory

The program uses the smartphone’s built-in camera to register its environment; it evaluates neither depth nor colour. The information it does register – the shape of the gesture and the parts of the hand – is reduced to a simple outline, which is matched against a set of stored gesture outlines. The program then executes the command associated with the gesture it observes. It also recognises the hand’s distance from the camera and warns the user when the hand is either too close or too far away.
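The article does not publish the implementation, but the pipeline it describes – reduce the camera image to a silhouette outline, match it against stored gesture outlines, then dispatch the matching command – can be sketched in a few lines. Everything below is an illustrative assumption rather than the authors’ code: the OpenCV calls stand in for whatever the app uses internally, and the thresholds, template table and command mapping are invented for demonstration.

```python
# Illustrative sketch only: outline-based gesture matching as described above.
# Thresholds, templates and the command table are assumptions; the actual
# ETH algorithm is not published in this article.
import cv2

MIN_AREA, MAX_AREA = 5_000, 120_000   # assumed bounds on silhouette size (pixels)
MATCH_THRESHOLD = 0.15                # assumed similarity cutoff for matchShapes

# Hypothetical templates: one stored outline (contour) per known gesture.
GESTURE_TEMPLATES = {}                             # e.g. {"pistol": contour, ...}
COMMANDS = {"pistol": "switch_tab", "spread": "zoom_in"}   # assumed mapping

def classify_frame(frame):
    """Reduce a camera frame to a hand outline and match it to stored gestures."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)          # no colour is evaluated
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    outline = max(contours, key=cv2.contourArea)   # assume the hand is the largest blob

    # The silhouette's area serves as a rough proxy for hand-to-camera distance.
    area = cv2.contourArea(outline)
    if area > MAX_AREA:
        return "warn: hand too close"
    if area < MIN_AREA:
        return "warn: hand too far away"

    # Match the outline against each stored gesture; a lower score means more similar.
    best, best_score = None, MATCH_THRESHOLD
    for name, template in GESTURE_TEMPLATES.items():
        score = cv2.matchShapes(outline, template, cv2.CONTOURS_MATCH_I1, 0.0)
        if score < best_score:
            best, best_score = name, score
    return COMMANDS.get(best)   # command to execute, or None if nothing matched
```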

“Many movement-recognition programs need plenty of processor and memory power”, explains Hilliges, adding that their new algorithm uses a far smaller portion of computer memory and is thus ideal for smartphones. He believes the application is the first of its kind that can run on a smartphone. The app’s minimal processing footprint means it could also run on smart watches or in augmented-reality glasses.
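The article gives no concrete figures, but a back-of-the-envelope comparison makes the memory argument plausible: a raw camera frame occupies roughly a megabyte, while a compact shape descriptor of an outline fits in a few dozen bytes. The VGA frame size and the choice of Hu moments as descriptor are assumptions for illustration only.

```python
# Back-of-the-envelope sketch (assumed figures): why outline matching stays small.
import cv2
import numpy as np

raw_frame_bytes = 640 * 480 * 3                 # one VGA colour frame: ~0.9 MB
print(f"raw frame: {raw_frame_bytes / 1e6:.1f} MB")

# A contour can be collapsed into a handful of shape statistics. Hu moments,
# for instance, describe an outline with just seven floating-point numbers.
outline = np.array([[[0, 0]], [[100, 0]], [[100, 50]], [[0, 50]]],
                   dtype=np.int32)              # toy rectangular contour
descriptor = cv2.HuMoments(cv2.moments(outline)).flatten()
print(f"descriptor: {descriptor.nbytes} bytes for {descriptor.size} values")
```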

More control

The program currently recognises six different gestures and executes their corresponding commands. Although the researchers have tested 16 outlines, this is not the app’s theoretical limit. What matters is that gestures generate unambiguous outlines. Gestures that resemble others are not suitable for this application. “To expand its functionality, we’re going to add further classification schemes to the program”, says the ETH researcher.
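The requirement that outlines be unambiguous suggests a natural admission test when new gestures are added: reject any candidate whose outline lies too close to an already stored template. A hypothetical sketch, reusing the assumed shape-distance metric from the earlier example:

```python
# Hypothetical admission test: only accept a new gesture if its outline is
# clearly distinguishable from every template already stored.
import cv2

AMBIGUITY_THRESHOLD = 0.3   # assumed minimum shape distance between gestures

def register_gesture(name, outline, templates):
    """Add a gesture template, rejecting outlines that resemble existing ones."""
    for existing_name, existing in templates.items():
        distance = cv2.matchShapes(outline, existing, cv2.CONTOURS_MATCH_I1, 0.0)
        if distance < AMBIGUITY_THRESHOLD:
            raise ValueError(
                f"'{name}' too similar to '{existing_name}' (distance {distance:.2f})")
    templates[name] = outline
```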

He is convinced that this new way of operating smartphones greatly increases the range of interactivity. The researcher’s objective is to keep the gestures as simple as possible, so that users can operate their smartphone effortlessly.

But will smartphone users want to adapt to this new style of interaction? Otmar Hilliges is confident they will. Gesture control will not replace touchscreen control, but supplement it. “People got used to operating computer games with their movements.” Touchscreens, Hilliges reminds us, also required a very long adjustment period before making a big impact in consumers’ lives. He is therefore certain that this application – or at least parts of it – will find its way onto the market.

Reference

Song J, Sörös G, Pece F, Fanello S, Izadi S, Keskin C, Hilliges O: In-air Gestures Around Unmodified Mobile Devices. ACM Symposium on User Interface Software and Technology (UIST), Honolulu, Hawaii, 7 October 2014.
