Commented on the following blogs
Franck Norman
Drew Logsdon
This paper presented an embedded eye tracker for context-awareness and eye-based human-computer interaction. The authors designed goggles with dry electrodes integrated into the frame and a small microcontroller for signal processing. Eye gaze is a good way to express intention and attention covertly, which makes it a good input mode; however, dwelling is still used for confirmation. The authors used EOG rather than other tracking methods because it is easily implemented in a lightweight system.
The goggles were designed to be wearable and lightweight, require low power, provide adaptive real-time signal processing for context-aware interaction, and compensate for EOG signal artefacts. The system would detect and ignore blinks, detect eye movements, and map each movement onto one of a set of basic directions. The movements were encoded as a string, in which certain character combinations were recognized as gestures.
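The string-encoding approach described above can be sketched roughly as follows. This is only an illustration of the idea, not the paper's actual implementation: the direction labels and the gesture vocabulary here are made up for the example.

```python
# Minimal sketch of string-based eye-gesture recognition.
# Direction labels and gesture patterns are illustrative, not from the paper.

# Each detected saccade is mapped to a single character.
DIRECTIONS = {"up": "U", "down": "D", "left": "L", "right": "R"}

# Hypothetical gesture vocabulary: a gesture is a fixed direction sequence.
GESTURES = {
    "LRLR": "wake_up",
    "UDUD": "confirm",
    "ULDR": "cancel",
}

def encode(movements):
    """Turn a list of detected eye-movement events into a character string,
    dropping any event (e.g. a blink) that is not a basic direction."""
    return "".join(DIRECTIONS[m] for m in movements if m in DIRECTIONS)

def recognize(movements):
    """Return the name of the first known gesture whose encoding appears
    in the movement string, or None if no gesture matches."""
    s = encode(movements)
    for pattern, name in GESTURES.items():
        if pattern in s:
            return name
    return None

print(recognize(["left", "right", "blink", "left", "right"]))  # wake_up
```

The appeal of this representation is that blink removal and gesture matching both become simple string operations, which is cheap enough to run on a small microcontroller.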
The trial consisted of a computer game with 8 levels in which participants had to perform a specific gesture. High scores were given to those who completed levels quickly. They found that the EOG signals can be efficiently processed to recognize eye gestures; however, 30% of the subjects had trouble focusing.
Commentary
I think that relative eye tracking is probably much better than exact eye tracking for future mouse-free interaction methods. The user can glance in the general direction of where they want the mouse to go.
The results were relatively lacking, and there were no statistics. The 30% figure was given for the number of people who had trouble concentrating, but how many people were tested? What was the average time for each level? I would like to see where this research goes and whether the system receives much more thorough testing.
I agree that many of the questions were left unanswered. It seems to me that this paper is an introduction to things to come, and hopefully our questions will be answered in the future.
The study certainly was lacking. I would have been curious to know what the different levels of difficulty consisted of and how long it took for users to become accustomed to performing the gestures.
"I think that relative eye tracking is probably much better than exact eye tracking for future methods to use mouse free interaction."
I have to disagree on that point. Using their discrete forms of measurement to do motion control, you would first need to move your eyes into one of eight directional positions. Once you've arrived at the desired location by jumping back and forth between those discrete measurements, you would then need to return to the "home" position, which means positioning your eyes at the center, or adjusting your head to compensate. To use the mouse analogy, that's like moving your mouse to the center of the mouse pad, or moving the mouse pad to the center of the mouse. Exact eye tracking is still more powerful because it mimics what we already do naturally with our eyes.
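The jump-and-recenter scheme being criticized here can be made concrete with a short sketch. The eight directions, the fixed per-saccade step size, and the requirement to return to a neutral "home" position between events are all assumptions for illustration, not details from the paper.

```python
# Sketch of cursor control driven by discrete eye-direction events.
# Step size and direction set are assumptions for illustration.

# Eight discrete directions as unit steps on a grid (screen y grows downward).
STEPS = {
    "U": (0, -1), "D": (0, 1), "L": (-1, 0), "R": (1, 0),
    "UL": (-1, -1), "UR": (1, -1), "DL": (-1, 1), "DR": (1, 1),
}

STEP_PIXELS = 20  # hypothetical cursor jump per recognized saccade

def move_cursor(pos, events):
    """Apply a sequence of discrete direction events to a cursor position.
    Between events the user must recenter their eyes at the neutral
    'home' position, which is the overhead the comment objects to."""
    x, y = pos
    for e in events:
        dx, dy = STEPS[e]
        x += dx * STEP_PIXELS
        y += dy * STEP_PIXELS
    return (x, y)

print(move_cursor((100, 100), ["R", "R", "UR"]))  # (160, 80)
```

Reaching an arbitrary target takes many such jump-recenter cycles, whereas exact gaze tracking would read the target position off a single glance, which is the core of the disagreement above.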