18-03-2011, 11:59 AM
Eyegaze Human-Computer Interface
ABSTRACT
The Eyegaze System provides an eye-controlled human-computer interface (HCI), allowing people to interact with computers by pointing with their eyes. A video camera mounted below the computer monitor unobtrusively observes the user's eye and specialized image processing software analyzes the video images of the eye and determines the eye's gaze point on the monitor screen in real time. Early applications of the Eyegaze System addressed an HCI for people with severe motor disabilities. Simply by looking at control keys displayed on a computer screen a disabled user can type, generate synthesized speech, control lights and appliances, operate a telephone, play games, and run DOS-compatible off-the-shelf software.
Creating an eyegaze HCI that accommodates a variety of physical disabilities presented our engineering team with several technical challenges. The Eyegaze System has to be accurate enough for a user to trigger the 5/8-inch keys of an on-screen computer keyboard. The calibration procedure needs to be simple, and the system needs to maintain calibration when the user leaves the computer and returns. Finally, the system needs to be tolerant of the uncontrolled head motion that some users cannot suppress. The accuracy and calibration objectives have been achieved; a solution for accommodating head motion is under development.
INTRODUCTION
There are many kinds of eye tracking devices, ranging from galvanometric sensors which measure voltages across the eye, to video image processors which examine optical images of the eye (Mason, 1969; Merchant and Morrisette, 1973; Cornsweet, 1973). Eye trackers employing image processing are by far the most accurate and reliable, and are therefore preferable (Young and Sheena, 1975).
Image processing eye trackers exist in two categories: head-mounted and remote. For disabled people operating computers, it is appropriate to sense the eye unobtrusively, with remotely mounted cameras. The user need not be mechanically "hooked up" to access the system, and has no need for cumbersome equipment on his body.
In 1988, LC Technologies completed development of the first Eyegaze Computer System designed for use by people with severe motor disabilities. Eyegaze is a PC-based system, requiring only the control of one eye. Selections are made by fixing the gaze in control "keys" on the screen. Nothing is attached to the user.
As illustrated in Figure 1, a video camera located below the computer screen continually observes the user's eye, and specialized image-processing software determines the eye orientation and projects the subject's gazepoint on the computer display. With a person sitting between 18 and 24 inches from the computer screen, the system predicts the gazepoint with an average accuracy of better than 1/4 inch. The system also generates information regarding pupil diameter, blinking, and eye fixations, useful for other eyetracking applications.
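The stated accuracy can also be expressed as a visual angle, which is how eye-tracker accuracy is usually quoted. A minimal sketch (the function name and figures are illustrative, not from the paper) converts the 1/4-inch on-screen error over the 18-24 inch viewing range:

```python
import math

def angular_accuracy_deg(linear_error_in, viewing_distance_in):
    """Convert a linear on-screen error (inches) to visual angle (degrees)."""
    return math.degrees(math.atan(linear_error_in / viewing_distance_in))

# A 1/4-inch error at the near and far ends of the 18-24 inch range
near = angular_accuracy_deg(0.25, 18.0)  # roughly 0.8 degrees
far = angular_accuracy_deg(0.25, 24.0)   # roughly 0.6 degrees
```

So "better than 1/4 inch" corresponds to an angular accuracy of about 0.6 to 0.8 degrees of visual angle, depending on where the user sits in the allowed range.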
METHOD
The Eyegaze System uses the pupil-center/corneal-reflection method to determine the eye's gaze direction. A low-power infrared light emitting diode (LED) located in the center of the camera lens illuminates the eye (Hutchinson, 1989). As shown in Figure 2, the LED generates a small, very bright reflection off the surface of the eye's cornea and, because it is located at the center of the camera lens, the LED causes the bright-pupil effect by reflecting light off the retina.[1] The computer calculates the person's gazepoint, i.e. the coordinates of where on the display he is looking, based on the relative positions of the pupil center and corneal reflection within the video image of the eye.
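The core of the pupil-center/corneal-reflection method is the offset vector between the pupil center and the corneal glint in the camera image: the vector rotates with the eye but is largely insensitive to small head translations. A simplified sketch of the idea, assuming a per-axis linear mapping with hypothetical calibration coefficients (the real system's model is not specified here):

```python
def gaze_offset(pupil_center, glint):
    """Offset of the pupil center from the corneal reflection, in image
    pixels. This vector changes with eye rotation but is largely
    insensitive to small head translations."""
    return (pupil_center[0] - glint[0], pupil_center[1] - glint[1])

def gazepoint(offset, coeffs):
    """Map the offset to screen coordinates with a per-axis linear model.
    coeffs = (ax, bx, ay, by) would come from calibration; the values
    used below are purely illustrative."""
    dx, dy = offset
    ax, bx, ay, by = coeffs
    return (ax * dx + bx, ay * dy + by)

# Illustrative image measurements (pixels) and calibration coefficients
offset = gaze_offset((322.0, 240.5), (310.0, 236.0))   # (12.0, 4.5)
screen = gazepoint(offset, (40.0, 320.0, 40.0, 240.0))  # (800.0, 420.0)
```

A linear model is the simplest possible choice; practical systems typically use a richer mapping that also accounts for corneal curvature and screen geometry.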
Prior to operating the eyetracking applications, the Eyegaze System must learn several physiological properties of a person's eye in order to project his gazepoint accurately. It must know the radius of curvature of the eye's cornea and the angular offset between the eye's optical and foveal axes. The system learns these parameters by performing a calibration procedure. To calibrate, the user fixes his gaze on a sequence of small circles that the computer displays at specific locations on the screen. The calibration procedure usually takes about 15 seconds and can be performed independently.
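One common way to turn such fixation samples into a gaze mapping is a least-squares fit per screen axis. The sketch below (hypothetical helper names; the paper does not specify the fitting method) fits a linear model screen = a * offset + b for each axis from (offset, screen-point) pairs collected while the user fixates the calibration circles:

```python
def fit_axis(offsets, targets):
    """Least-squares fit of target = a * offset + b for one screen axis."""
    n = len(offsets)
    mean_o = sum(offsets) / n
    mean_t = sum(targets) / n
    var = sum((o - mean_o) ** 2 for o in offsets)
    cov = sum((o - mean_o) * (t - mean_t) for o, t in zip(offsets, targets))
    a = cov / var
    b = mean_t - a * mean_o
    return a, b

def calibrate(samples):
    """samples: list of ((dx, dy), (screen_x, screen_y)) pairs, one per
    calibration circle the user fixated."""
    dxs = [s[0][0] for s in samples]
    dys = [s[0][1] for s in samples]
    xs = [s[1][0] for s in samples]
    ys = [s[1][1] for s in samples]
    ax, bx = fit_axis(dxs, xs)
    ay, by = fit_axis(dys, ys)
    return ax, bx, ay, by

# Synthetic samples generated from a known mapping, to show the fit
samples = [((-10.0, -5.0), (-80.0, 40.0)),
           ((0.0, 0.0), (320.0, 240.0)),
           ((10.0, 5.0), (720.0, 440.0))]
ax, bx, ay, by = calibrate(samples)
```

Because the synthetic samples are exactly linear, the fit recovers the generating coefficients (a = 40, b = 320 horizontally; a = 40, b = 240 vertically); with real, noisy fixation data the least-squares fit averages out measurement error across the calibration points.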
The Eyegaze System can save calibration results for future use, and it will retain current calibration data even if the user moves away from the system. When he returns to his position in front of the camera, Eyegaze will resume its gazepoint determination, enabling the user to continue to operate the system.