Friday, May 14, 2010

Real-Time Robust Body Part Tracking for Augmented Reality Interface

This paper presents an interface that tracks body parts without limiting the user's freedom of movement. The system can recognize whether the user is wearing long or short sleeves. The authors use a calibrated camera to capture images of the hands, head, and feet, then convert the detected 2D body parts into an approximate 3D posture. Their algorithm is the following (pretty much copied here):
  • obtain a foreground image by removing the background and the user's shadow from the original image
  • detect a face in the image using the face texture
  • extract contour features
  • track the user's head using a particle filter
  • detect and track the two hands by segmenting the skin-blob image
  • detect and track the feet using the contour of the lower body, and estimate the 3D body pose
  • extract meaningful gestures from the position of the right hand
  • visualize all augmented objects and the user
They performed an experiment evaluating 2D tracking performance with short and long sleeves separately, using a BeNature system that recognizes simple gestures. The measured tracking error was 5.48 pixels when the user wears long sleeves and 9.16 pixels with short sleeves.
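A per-pixel error figure like the 5.48 and 9.16 values above is typically the mean Euclidean distance between tracked and ground-truth positions across frames; the paper doesn't state its exact formula, so the following metric is an assumption on my part:

```python
import math

def mean_pixel_error(tracked, ground_truth):
    """Mean Euclidean distance (in pixels) between per-frame tracked
    positions and hand-labeled ground-truth positions."""
    assert len(tracked) == len(ground_truth)
    dists = [math.hypot(tx - gx, ty - gy)
             for (tx, ty), (gx, gy) in zip(tracked, ground_truth)]
    return sum(dists) / len(dists)
```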

Analysis
I think this is quite a unique method of tracking the human body, or at least the hands, feet, and head. I like the robustness of the system in detecting whether the user is wearing long or short sleeves. However, I would like to see more user studies to determine whether the system can be used for other purposes.
