EyeKeys: A Real-time Vision Interface Based on Gaze Detection from a Low-grade Video Camera
John J. Magee, Matthew R. Scott, Benjamin N. Waber and Margrit Betke
Boston University Computer Science Department

Status: Completed

Some people are so severely paralyzed that they can control only the muscles of their eyes; for them, communication is limited to the interpretation of eye movements. Currently available human-computer interface systems are often intrusive, require special hardware, or use active infrared illumination. We present a system that runs on an average PC with video input from an inexpensive USB camera. The face is tracked using multi-scale template correlation. Symmetry between the left and right eyes is exploited to detect whether the computer user is looking at the camera or off to the left or right side. The detected eye direction can then be used to control applications such as spelling programs or games. We developed the game "BlockEscape" to gather quantitative results for evaluating our interface with test subjects, and we compared our system to a mouse-substitution interface.
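The symmetry idea can be illustrated with a minimal sketch: when the user looks at the camera, the left eye patch and the horizontally mirrored right eye patch are nearly identical; a sideways gaze breaks that symmetry. The function names, the threshold value, and the darkest-column iris heuristic below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def iris_x(eye):
    # Assumption: the darkest column of a grayscale eye patch
    # approximates the horizontal iris position.
    return int(np.argmin(eye.mean(axis=0)))

def gaze_direction(left_eye, right_eye, sym_threshold=0.1):
    """Classify gaze as 'center', 'left', or 'right' from two
    equal-sized grayscale eye patches (hypothetical sketch)."""
    mirrored = np.fliplr(right_eye)
    # Mean absolute difference, normalized to [0, 1]: near zero
    # when the eyes are mirror images (user looking at the camera).
    score = np.abs(left_eye.astype(float) - mirrored.astype(float)).mean() / 255.0
    if score < sym_threshold:
        return "center"
    # Symmetry is broken: use the iris offset in the left eye patch
    # to decide which side the user is looking toward (assumption).
    w = left_eye.shape[1]
    return "left" if iris_x(left_eye) < w // 2 else "right"
```

For example, synthetic 10x10 patches with a dark "iris" column centered in both eyes yield "center", while patches with both irises shifted toward one side yield the corresponding direction.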

Appears in the proceedings of the 2004 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW'04), Volume 10: Workshop on Real-Time Vision for Human-Computer Interaction (RTV4HCI), Washington, D.C., June 2004, pp. 159-166.