WiSee offers through-wall gesture recognition

June 5, 2013 | 09:32

Tags: #gesture-control #gesture-recognition

Companies: #research #university-of-washington

A group of researchers from the University of Washington's Department of Computer Science and Engineering have developed a method of recognising a user's gestures regardless of where in the house they might be - using nothing more than existing Wi-Fi signals.

Dubbed WiSee, the system claims to be able to recognise gestures both in line-of-sight from the equipment and in non-line-of-sight. More impressively, the WiSee technology also works through walls: meaning you could, for example, pause a video playing in the living room while taking a quick bathroom break.

The system works by looking for the Doppler shift of a wave as its source and observer move relative to one another - the change in frequency that makes an ambulance siren sound higher-pitched as it approaches and lower-pitched as it recedes. Applying this well-known phenomenon to Wi-Fi signals, the team has worked out a way to track the tiny Doppler shift imparted on a signal when it reflects off a hand moving towards or away from the receiver unit.
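To get a feel for the scale involved - this is a back-of-the-envelope estimate, not code from the WiSee paper - the two-way Doppler shift from a reflector works out at roughly twice the hand's speed multiplied by the carrier frequency, divided by the speed of light:

```python
# Back-of-the-envelope estimate of the Doppler shift a moving hand
# imparts on a reflected Wi-Fi signal (illustrative, not WiSee code).

SPEED_OF_LIGHT = 3e8  # metres per second

def doppler_shift_hz(hand_speed_mps: float, carrier_hz: float) -> float:
    """Approximate two-way Doppler shift for a reflector moving
    towards (positive) or away from (negative) the receiver."""
    return 2 * hand_speed_mps * carrier_hz / SPEED_OF_LIGHT

# A hand gesture at roughly 0.5 m/s on a 5GHz Wi-Fi carrier:
print(doppler_shift_hz(0.5, 5e9))    # ~16.7 Hz
# The same gesture on a 2.4GHz carrier:
print(doppler_shift_hz(0.5, 2.4e9))  # ~8 Hz
```

Either way, the shift is in the tens of hertz at most - hence the accuracy problem described below.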

It's a system that requires incredible accuracy: a Wi-Fi transmission sits on a carrier frequency of 2.4GHz or 5GHz and occupies a channel around 20MHz wide, while the Doppler shift produced by a human hand moving somewhere within the transmission path is a handful of hertz - a change of just a few parts per billion of the carrier frequency. The team claims to have cracked that problem, transforming the received signal into a narrowband pulse with a frequency of just a few hertz, which can then be tracked for Doppler shifts.
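For the curious, here's a minimal sketch of the general idea - not the team's actual pipeline - assuming the wideband reception has already been reduced to one complex sample per transmitted symbol (a per-symbol channel estimate, say). A long FFT over that narrowband series resolves frequency components down to a few hertz, with the resolution set by the window length:

```python
# Minimal sketch of resolving a few-hertz Doppler shift from a
# narrowband time series; NOT the WiSee team's actual implementation.
# Assumes the wideband reception has already been reduced to one
# complex sample per symbol (e.g. a per-symbol channel estimate).
import numpy as np

SYMBOL_RATE = 250.0    # assumed samples per second after reduction
WINDOW_SECONDS = 0.5   # longer windows give finer frequency resolution

def dominant_doppler_hz(samples: np.ndarray) -> float:
    """Return the strongest frequency component in a complex narrowband
    series; resolution is roughly 1/WINDOW_SECONDS hertz."""
    n = int(SYMBOL_RATE * WINDOW_SECONDS)
    windowed = samples[:n] * np.hanning(n)
    spectrum = np.fft.fftshift(np.fft.fft(windowed))
    freqs = np.fft.fftshift(np.fft.fftfreq(n, d=1.0 / SYMBOL_RATE))
    return float(freqs[np.argmax(np.abs(spectrum))])

# Synthetic example: a 16 Hz tone standing in for a hand moving
# towards the receiver, buried in noise.
t = np.arange(0, WINDOW_SECONDS, 1.0 / SYMBOL_RATE)
series = np.exp(2j * np.pi * 16.0 * t) + 0.3 * (np.random.randn(t.size)
                                                + 1j * np.random.randn(t.size))
print(dominant_doppler_hz(series))  # ~16.0
```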

The team's work is an extension of previous attempts to track the occupants of a room using Doppler shifts in Wi-Fi and mobile signals. Unlike that research, which was targeted at creating a means for law enforcement, military and emergency service personnel to see 'through' walls, WiSee has definite civilian applications.

In prototype form, the system has proved successful: the team's proof-of-concept implementation, built around high-priced Ettus USRP N210 software-defined radio equipment, has been tested in a two-bedroom apartment as well as an office environment, with a claimed accuracy of 94 per cent in recognising gestures and translating them into control of a connected computing device.

Unlike rival systems such as Microsoft's Kinect or the soon-to-launch Leap Motion, WiSee is completely free of line-of-sight requirements: the user doesn't even need to be in the same room as the machine he or she is controlling. Better still, while the prototype uses expensive signal processing gear, the team claims that the system is suitable for implementation on existing wireless access point hardware - pointing to a future where gesture control arrives as standard.

Qifan Pu, lead researcher on the project, claims that the system can use Multiple Input Multiple Output (MIMO) Wi-Fi hardware to eliminate interference from people other than the user who might be waving their hands around, locking the tracking to the active user. The system also includes a 'startup sequence' of gestures that must be performed before the WiSee software will accept commands, acting both as a personal identifier for a given user and as a means to stop the system rebooting your computer every time you scratch your nose.
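As a rough illustration of how such a gate might behave in software - the gesture labels and the four-gesture preamble below are hypothetical, not details from the paper - the receiver simply discards classified gestures until the startup sequence has been completed:

```python
# Minimal sketch of gating gesture commands behind a startup sequence.
# The gesture names and the preamble itself are illustrative assumptions,
# not the sequence described in the WiSee paper.

STARTUP_SEQUENCE = ["push", "pull", "push", "pull"]  # hypothetical preamble

class GestureGate:
    """Ignore classified gestures until the startup sequence is seen."""

    def __init__(self):
        self._progress = 0
        self.active = False

    def feed(self, gesture: str) -> bool:
        """Feed one classified gesture; return True if it should be
        forwarded to the controlled device as a command."""
        if self.active:
            return True
        if gesture == STARTUP_SEQUENCE[self._progress]:
            self._progress += 1
            if self._progress == len(STARTUP_SEQUENCE):
                self.active = True   # user identified; accept commands from now on
        else:
            self._progress = 0       # a stray movement resets the sequence
        return False

gate = GestureGate()
for g in ["wave", "push", "pull", "push", "pull", "swipe-left"]:
    if gate.feed(g):
        print("command accepted:", g)  # only "swipe-left" gets through
```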

The team has published a working draft paper (PDF warning) of its research, as well as a demonstration video reproduced below.

