Sunday, May 9, 2010

Gesture-Based Control in Multi-Robot Systems

Summary
The authors design a way to use hand gestures to control a multi-robot system. Hidden Markov Models are used to recognize gestures captured by a CyberGlove. There are six gestures: opening, opened, closing, pointing, waving left, and waving right. The HMM has a state for each of these gestures plus a wait state for non-gesture motion. They also use a gesture spotter, which selects the gesture corresponding to the state with the highest score at the end of the sequence, or the wait state if no gesture wins.
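
Below is a minimal sketch of how a spotter like this could work: a forward pass over a discrete HMM whose hidden states stand for the six gestures plus the wait state, returning whichever state scores highest at the end of the codeword sequence. This is my own illustration, not the authors' implementation; the state names, the spot_gesture function, and all the parameters are made up.

```python
import numpy as np

# Hypothetical labels: one hidden state per gesture plus a "wait" state,
# loosely following the paper's description (names are my own).
STATES = ["wait", "opening", "opened", "closing",
          "pointing", "wave_left", "wave_right"]

def spot_gesture(obs, pi, A, B):
    """Run the HMM forward algorithm over a codeword sequence and return
    the label of the state with the highest score at the last time step.

    obs : sequence of codeword indices (ints)
    pi  : (S,) initial state probabilities
    A   : (S, S) state transition matrix
    B   : (S, K) codeword emission probabilities
    """
    alpha = pi * B[:, obs[0]]             # initialize forward scores
    alpha /= alpha.sum()                  # normalize to avoid underflow
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]     # propagate, then weight by emission
        alpha /= alpha.sum()
    return STATES[int(np.argmax(alpha))]  # "wait" means no gesture spotted

# Toy example with made-up parameters (7 states, 8 codewords).
rng = np.random.default_rng(0)
S, K = len(STATES), 8
pi = np.full(S, 1.0 / S)
A = rng.dirichlet(np.ones(S), size=S)
B = rng.dirichlet(np.ones(K), size=S)
print(spot_gesture([3, 1, 4, 1, 5], pi, A, B))
```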

They evaluated the HMM on codeword sequences containing both gestures and non-gestures. With the wait state, the HMM recognized gestures with 96% accuracy and a false positive rate of 1.6 per 1000.

To control the robots, there are two modes of interaction. Local robot control lets the user drive the robot from the robot's point of view: pointing forward moves the robot forward, regardless of how the robot is oriented in the world. Global robot control lets the user point at the place where he wants the robot to go.
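
A rough sketch of the difference between the two modes (my own illustration with made-up function names and a simplified interface, not the paper's architecture): the same pointing gesture maps to a robot-frame velocity in local mode but to a world-frame goal in global mode.

```python
import math

def local_control(point_angle, speed=0.2):
    """Local mode (sketch): the pointing angle is taken in the robot's own
    frame, so pointing straight ahead drives the robot forward no matter
    how it is oriented in the world. Returns (forward, sideways) velocity."""
    return speed * math.cos(point_angle), speed * math.sin(point_angle)

def global_control(goal_xy, robot_pose):
    """Global mode (sketch): the user points at a spot on the floor; that
    spot is a goal in the world frame, and the robot steers toward it."""
    rx, ry, rtheta = robot_pose
    dx, dy = goal_xy[0] - rx, goal_xy[1] - ry
    distance = math.hypot(dx, dy)
    bearing = math.atan2(dy, dx) - rtheta   # turn needed, in the robot's frame
    return distance, bearing

# Example: the robot is at (1, 0) facing +y; the user points at (1, 2).
print(global_control((1.0, 2.0), (1.0, 0.0, math.pi / 2)))  # ~ (2.0, 0.0)
```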

Discussion
This work is similar to what I am doing for my robotics project, although I focus on single-robot control and on the global robot control described here, and I only use a 1-nearest-neighbor classifier to recognize gestures. In any case, this is a very interesting paper. I would have liked to expand my work to use HMMs and a 6D tracking system so I could get more accurate readings for my project. However, HMMs proved too difficult to learn, and the Flock did not work.
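
For comparison, the 1-nearest-neighbor recognizer I mention is roughly this simple (a sketch with made-up feature vectors and labels, not my actual project code):

```python
import numpy as np

def classify_1nn(sample, templates, labels):
    """1-nearest-neighbor gesture classification (sketch): compare a new
    feature vector against stored templates and return the closest label."""
    dists = np.linalg.norm(np.asarray(templates) - np.asarray(sample), axis=1)
    return labels[int(np.argmin(dists))]

# Toy example with made-up 3-dimensional glove features.
templates = [[0.9, 0.1, 0.0],   # "point"
             [0.1, 0.9, 0.0],   # "wave_left"
             [0.1, 0.0, 0.9]]   # "wave_right"
labels = ["point", "wave_left", "wave_right"]
print(classify_1nn([0.8, 0.2, 0.1], templates, labels))  # "point"
```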

------------------------------------------------------------
Soshi Iba, J. Michael Vande Weghe, Christiaan J. J. Paredis, and Pradeep K. Khosla. An Architecture for Gesture-Based Control of Mobile Robots.
