254 Current Advances in Amyotrophic Lateral Sclerosis

**7. Conclusions**

We have developed a new eye-gaze input system for people with severe physical disabilities. The system detects the horizontal and vertical eye-gaze components of users under natural light, such as that from an incandescent, fluorescent, or LED lamp. By using this system, users can input text or commands to a PC. We have also developed application programs for the eye-gaze input system, including a text input system, a PC operation support system, and a fixed- […] quality of life (QoL) of ALS patients is improved. However, in order to provide additional improvements in QoL, a more versatile environment for eye-gaze input is required. For example, some users would like to explore newer Web services such as Facebook and Twitter, and it is difficult to develop new software for these users individually. To resolve this problem, we need to improve our interface for mouse operation by eye gaze (presented in Section 4-3).

As shown in Section 4-2, if a user gazes at the indicator for a desired input, that input is easily decided upon, because the application program can recognize which indicator is being viewed. The interface for mouse operation can move the cursor to the user's gaze point; however, it is difficult for this type of interface to recognize which icon is being viewed. To resolve this problem fundamentally, we are developing an interface that utilizes information on both eye gaze and eye blinks. Many such interfaces have been proposed, but no truly practical system has been developed. When using this type of interface, unconscious eye blinks occur; in other words, input errors are often attributable to unconscious blinks. This phenomenon is known as the "Midas touch problem."

We think that if involuntary (unconscious) blinks can be recognized, input errors can be significantly decreased. In fact, we are presently developing an eye-gaze input system that can recognize voluntary blinks. Most conventional methods for measuring eye blinks analyze images of the eye (and its surrounding skin) captured by a video camera. Commonly used NTSC video cameras are capable of detecting eye blinks; however, it is difficult for these cameras to measure the detailed temporal changes that occur during an eye blink, because a blink is completed relatively quickly (within a few hundred milliseconds). The eye-gaze input system also uses an NTSC camera, and it is therefore necessary to take account of this problem.

NTSC video cameras capture moving images at 60 fields/s, and pairs of field images are combined to produce interlaced images at a rate of 30 frames/s (fps). We have proposed a new method that uses NTSC video cameras to measure eye blinks [13]. This method utilizes the non-interlaced eye images captured by an NTSC video camera: the odd- and even-field images of the NTSC format, which are generated by splitting the interlaced NTSC frames. The proposed method therefore has twice the time resolution of the standard NTSC format, so the detailed temporal changes that occur during an eye blink can be measured. By using this new method of eye blink detection, we can develop a next-generation eye-gaze input system that is more user-friendly.
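The field-splitting and blink-timing ideas above can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the published method: the frame layout (a list of pixel rows), the brightness-based eye-openness measure, the 0.5 openness cutoff, the even-before-odd field order, and the 250 ms voluntary-blink threshold are all parameters assumed for the sketch.

```python
# Sketch: double the temporal resolution of an interlaced NTSC stream by
# splitting each 30 fps frame into its two 60 fields/s half-images, then
# time eye blinks on the resulting 60 Hz sequence.
# All thresholds below are illustrative assumptions, not published values.

FIELD_RATE_HZ = 60.0  # NTSC delivers 60 fields/s (two fields per frame)

def split_fields(frame):
    """Split an interlaced frame (a list of pixel rows) into two field
    images: rows 0, 2, 4, ... form the even field; rows 1, 3, 5, ... the
    odd field. Temporal field order is assumed even-first here."""
    return frame[0::2], frame[1::2]

def openness(field, threshold=128):
    """Toy eye-openness measure: the fraction of bright pixels in the
    field. A closed eye (lid and lashes over the sclera) scores lower."""
    pixels = [p for row in field for p in row]
    return sum(p >= threshold for p in pixels) / len(pixels)

def blink_durations(frames, closed_below=0.5):
    """Return blink durations in seconds, measured on the 60 Hz field
    sequence. Each frame contributes two samples, giving twice the time
    resolution of the 30 fps frame sequence."""
    samples = []
    for frame in frames:
        even, odd = split_fields(frame)
        samples.extend([openness(even), openness(odd)])
    durations, run = [], 0
    for s in samples:
        if s < closed_below:
            run += 1                      # eye closed in this field
        elif run:
            durations.append(run / FIELD_RATE_HZ)
            run = 0
    if run:                               # blink still open-ended at stream end
        durations.append(run / FIELD_RATE_HZ)
    return durations

def is_voluntary(duration_s, min_voluntary_s=0.25):
    """Assumed decision rule: deliberately long blinks count as input;
    short (typically unconscious) blinks are ignored."""
    return duration_s >= min_voluntary_s
```

At 60 fields/s, a blink spanning 6 consecutive closed fields lasts 100 ms and would be rejected as unconscious under the assumed threshold, while one spanning 18 closed fields lasts 300 ms and would be accepted as a voluntary input; at the plain 30 fps frame rate the same blinks would be sampled only half as often.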

**Author details**

Abe Kiyohiko1\*, Ohi Shoichi2 and Ohyama Minoru2

\*Address all correspondence to: abe@kanto-gakuin.ac.jp

1 College of Engineering, Kanto Gakuin University, Kanazawa-ku, Yokohama-shi, Kanagawa, Japan

2 School of Information Environment, Tokyo Denki University, Inzai-shi, Chiba, Japan

### **References**


[6] Wang J.G., Sung E. Study on Eye Gaze Estimation. IEEE Transactions on Systems, Man and Cybernetics, 2002;32(3) 332-350.

[7] Gips J., DiMattia P., Curran F.X., Olivieri P. Using EagleEyes - an Electrodes Based Device for Controlling the Computer with Your Eyes - to Help People with Special Needs. Proceedings of 5th International Conference on Computers Helping People with Special Needs, 1996, 77-83.

[8] Abe K., Ohi S., Ohyama M. An Eye-gaze Input System based on the Limbus Tracking Method by Image Analysis for Seriously Physically Handicapped People. Adjunct Proceedings of 7th ERCIM Workshop "User Interfaces for All," 2002, 185-186.

[9] Abe K., Ohi S., Ohyama M. An Eye-Gaze Input System Using Information on Eye Movement History. Proceedings of 12th International Conference on Human-Computer Interaction, 2007;6, 721-729.

[10] Abe K., Ohi S., Ohyama M. Eye-gaze Detection by Image Analysis Under Natural Light. Proceedings of 14th International Conference on Human-Computer Interaction, 2011;2, 19-26.

[11] Simpson R.C., Koester H.H. Adaptive One-switch Row-column Scanning. IEEE Transactions on Rehabilitation Engineering, 1999;7(4) 464-473.

[12] Stark L., Vossius G., Young L.R. Predictive Control of Eye Tracking Movements. IRE Transactions on Human Factors in Electronics, 1962;3, 52-57.

[13] Abe K., Ohi S., Ohyama M. Automatic Method for Measuring Eye Blinks Using Split-Interlaced Images. Proceedings of 13th International Conference on Human-Computer Interaction, 2009;1, 3-11.
