As if touchscreen technology wasn’t enough, scientists at ETH Zurich have come up with a way to control smartphones with hand gestures. With just a few moves in front of a camera, users can now zoom into a page, change browser tabs, shoot down enemies in a game, and much more.
The technology was developed by Jie Song, a master’s student working under Professor Otmar Hilliges at ETH Zurich. Song created an algorithm that translates hand gestures, similar to sign language, into commands for a smartphone, such as swiping a page to the right without actually touching the screen.
At this time, the program responds to six different gestures, each mapped to a different command. The camera analyzes its surroundings without relying on color or depth information: it registers the shape of the hand and the gesture being made, then communicates the corresponding command to the phone. The program can also warn users when their hand is too far from the camera.
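The mapping described above, six recognized hand shapes dispatched to six phone commands, plus a distance check, can be sketched roughly as follows. Note that the gesture names, commands, and distance threshold here are illustrative assumptions, not details of the ETH Zurich implementation:

```python
# Hypothetical sketch of a gesture-to-command dispatch like the one
# described in the article. All names and values are assumptions.

GESTURE_COMMANDS = {
    "open_palm": "swipe_right",
    "fist": "select",
    "point": "zoom_in",
    "two_fingers": "zoom_out",
    "thumb_up": "next_tab",
    "pinch": "fire",
}

MAX_DISTANCE_CM = 50.0  # assumed cut-off beyond which the hand is "too far"

def dispatch(gesture: str, distance_cm: float) -> str:
    """Map a recognized hand shape to a phone command, or warn on range."""
    if distance_cm > MAX_DISTANCE_CM:
        return "warning: hand too far from camera"
    return GESTURE_COMMANDS.get(gesture, "ignored: unknown gesture")

print(dispatch("open_palm", 30.0))  # swipe_right
print(dispatch("fist", 80.0))       # warning: hand too far from camera
```

In a real system the `gesture` input would come from a classifier running on camera frames; the point of the sketch is simply that recognition and command dispatch are separate steps.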
The scientists have tested 16 hand outlines so far, and there is no hard limit on how many the application could support; the real constraint is the processing power and memory needed to execute additional commands on the phone. The algorithm currently uses only a small portion of a device’s memory, making it well suited to smartphones.
Hilliges says he does not believe the technology will replace touchscreens, but he does see it as a fitting supplement to them.