University of Washington researchers said some smartphones are starting to incorporate 3-D gesture sensing based on cameras, but cameras consume significant battery power and require a clear view of the user's hands.
    
Researchers have now developed a new form of low-power wireless sensing technology that recognises gestures in the space around the phone.
    
The technology - developed in the labs of Matt Reynolds and Shwetak Patel, UW associate professors of electrical engineering and of computer science and engineering - uses the phone's wireless transmissions to sense nearby gestures. It therefore works even when the device is out of sight in a pocket or bag, and could easily be built into future smartphones and tablets.
    
"Today's smartphones have many different sensors built in, ranging from cameras to accelerometers and gyroscopes that can track the motion of the phone itself," Reynolds said.
    
"We have developed a new type of sensor that uses the reflection of the phone's own wireless transmissions to sense nearby gestures, enabling users to interact with their phones even when they are not holding the phone, looking at the display or touching the screen," he said.
    
When a person makes a call or an app exchanges data with the Internet, a phone transmits radio signals on a 2G, 3G or 4G cellular network to communicate with a cellular base station.
    
When a user's hand moves through space near the phone, the user's body reflects some of the transmitted signal back toward the phone.
    
The new system uses multiple small antennas to capture the changes in the reflected signal, then classifies those changes to identify which gesture was performed.
    
In this way, tapping, hovering and sliding gestures could correspond to various commands for the phone, such as silencing a ring, changing which song is playing or muting the speakerphone.
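The classification step described above can be sketched roughly as follows. This is not the team's actual algorithm; the feature values, gesture names and gesture-to-command mapping are invented for illustration, and a simple nearest-centroid classifier stands in for whatever the researchers used.

```python
import numpy as np

# Hypothetical training data: each row is a feature vector of
# reflected-signal amplitude changes measured at four small antennas.
# All numbers and gesture labels are invented for this sketch.
TRAINING = {
    "tap":   np.array([[0.9, 0.1, 0.1, 0.1], [0.8, 0.2, 0.1, 0.0]]),
    "hover": np.array([[0.4, 0.4, 0.4, 0.4], [0.5, 0.5, 0.4, 0.5]]),
    "slide": np.array([[0.1, 0.3, 0.6, 0.9], [0.0, 0.2, 0.5, 0.8]]),
}

# Example mapping of recognised gestures to phone commands,
# echoing the commands mentioned in the article.
COMMANDS = {
    "tap": "silence ring",
    "hover": "mute speakerphone",
    "slide": "change song",
}

def classify(sample):
    """Return the gesture whose training centroid is nearest to `sample`."""
    centroids = {g: vecs.mean(axis=0) for g, vecs in TRAINING.items()}
    return min(centroids, key=lambda g: np.linalg.norm(sample - centroids[g]))

gesture = classify(np.array([0.05, 0.25, 0.55, 0.85]))
print(gesture, "->", COMMANDS[gesture])  # slide -> change song
```

In practice the features would come from continuously sampling the reflected signal at each antenna while the phone transmits, rather than from fixed vectors.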
    
Because the phone's wireless transmissions pass easily through the fabric of clothing or a handbag, the system works even when the phone is stowed away.
    
A group of 10 study participants tested the technology by performing 14 different hand gestures - including hovering, sliding and tapping - in various positions around a smartphone.
    
For each participant, the phone was first calibrated on that user's own hand movements and then trained to respond to them. The team found the smartphone recognised gestures with about 87 per cent accuracy.
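The per-user calibration procedure might be sketched like this, again with invented feature vectors and a simple nearest-neighbour matcher standing in for the team's actual training method:

```python
import numpy as np

def calibrate(examples):
    """Store one user's labelled gesture examples as (label, vector) pairs."""
    return [(label, np.asarray(vec)) for label, vec in examples]

def recognise(model, sample):
    """Match a new gesture to the nearest calibration example (1-NN)."""
    sample = np.asarray(sample)
    label, _ = min(model, key=lambda lv: np.linalg.norm(sample - lv[1]))
    return label

# Hypothetical calibration pass for a single participant.
user_model = calibrate([
    ("tap",   [0.9, 0.1, 0.1]),
    ("hover", [0.4, 0.4, 0.4]),
    ("slide", [0.1, 0.5, 0.9]),
])

# Hypothetical test gestures with their true labels; the last one is
# ambiguous and gets misclassified, so accuracy is below 100 per cent.
tests = [
    ([0.85, 0.15, 0.10], "tap"),
    ([0.42, 0.38, 0.41], "hover"),
    ([0.12, 0.48, 0.88], "slide"),
    ([0.30, 0.30, 0.60], "slide"),
]
correct = sum(recognise(user_model, x) == y for x, y in tests)
print(f"accuracy: {correct / len(tests):.0%}")  # accuracy: 75%
```

The study's roughly 87 per cent figure is simply this kind of fraction, measured over 10 participants performing 14 gestures each.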
    
Researchers will present their project, called SideSwipe, at the Association for Computing Machinery's Symposium on User Interface Software and Technology in Honolulu next month.