Microsoft Research has announced a breakthrough in hand-tracking technology, one it claims will bring accurate, natural movement mapping both to virtual reality and to more prosaic human-machine interaction.
Announced this weekend by Microsoft's research arm, the work comes from a team of scientists investigating ways to track users' hand and finger movements as a means of controlling a virtual environment - including in virtual reality - using natural motion, without the need to hold special motion controllers or don gloves. Using a combination of novel techniques and a hand-tracking algorithm originally written back in the 1990s, the team claims it has cracked the problem.
'How do we interact with things in the real world? Well, we pick them up, we touch them with our fingers, we manipulate them,' said Jamie Shotton, computer vision researcher at Microsoft's Cambridge lab. 'We should be able to do exactly the same thing with virtual objects. We should be able to reach out and touch them. We're getting to the point that the accuracy is such that the user can start to feel like the avatar hand is their real hand. This has been a research topic for many, many years, but I think now is the time where we're going to see real, usable, deployable solutions for this.'
Microsoft's Handpose technology, which the company points out is still a research project, is capable of tracking a user's hand motion and mapping it to a virtual pair of hands - either on a flat monitor or in a three-dimensional virtual reality display. Using these virtual hands, users can interact with knobs, dials, levers, and even a stuffed bunny - and the claimed 1:1 tracking prevents any loss of proprioception, the brain's innate sense of where its limbs are located.
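Microsoft has not published Handpose's internals in this announcement, but the idea behind 1:1 tracking can be sketched in a few lines: every tracked hand joint is copied directly onto the avatar skeleton, with at most a rigid offset into the scene and no scaling or smoothing that would make the virtual hand drift from where the brain expects it. The joint names and tracker output below are hypothetical, purely for illustration.

```python
# Illustrative sketch only - not Microsoft's actual Handpose pipeline.
# A 1:1 mapping preserves relative joint positions exactly, so the avatar
# hand moves precisely as the real hand does.
from dataclasses import dataclass

@dataclass
class Joint:
    name: str
    x: float
    y: float
    z: float

def map_one_to_one(tracked, offset=(0.0, 0.0, 0.0)):
    """Copy tracked joints onto the avatar with an optional rigid offset.

    Because each joint is translated by the same vector, distances between
    joints are unchanged - the property that keeps proprioception intact.
    """
    ox, oy, oz = offset
    return [Joint(j.name, j.x + ox, j.y + oy, j.z + oz) for j in tracked]

# Hypothetical tracker output for two fingertip joints (metres):
tracked = [Joint("index_tip", 0.10, 0.25, 0.40),
           Joint("thumb_tip", 0.05, 0.20, 0.38)]

# Place the avatar hand one metre into the virtual scene:
avatar = map_one_to_one(tracked, offset=(0.0, 0.0, 1.0))
```

Any per-joint scaling or filtering added to this loop would break the 1:1 guarantee the article describes, which is why latency and accuracy, rather than clever interpolation, are the hard problems here.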
The Handpose technology demonstration is reproduced below, and Microsoft is already investigating the potential to add the capabilities to commercial software - including its Skype messaging platform - under the codename Project Prague.