Automatic Eye Winks Interpretation System for Human-Machine Interface

  • Che Wei-Gang¹
  • Chung-Lin Huang¹,²
  • Wen-Liang Hwang³

        EURASIP Journal on Image and Video Processing 2007, 2007:065184

        DOI: 10.1155/2007/65184

        Received: 2 January 2007

        Accepted: 21 August 2007

        Published: 8 October 2007


        This paper proposes an automatic eye-wink interpretation system that serves as a human-machine interface for severely disabled users. Our system (1) applies a support vector machine (SVM) to detect the eyes, (2) uses a template-matching algorithm to track the eyes, (3) uses an SVM classifier to determine whether each eye is open or closed and converts the eye winks into a sequence of binary codes (0 or 1), and (4) applies dynamic programming to translate the code sequence into a valid command. Unlike previous eye-gaze tracking methods, our system identifies the open/closed state of the eyes and interprets eye winking as commands for a human-machine interface. In the experiments, our system demonstrates robust performance and high accuracy.
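        The last stage above maps a noisy binary wink sequence to the nearest valid command. A minimal sketch of that idea, using classic dynamic-programming edit distance (the paper's actual codebook and DP formulation are not reproduced here; the command table below is hypothetical):

```python
def edit_distance(a: str, b: str) -> int:
    """Dynamic-programming (Wagner-Fischer) edit distance between code strings."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i          # cost of deleting all of a[:i]
    for j in range(n + 1):
        d[0][j] = j          # cost of inserting all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # match/substitution
    return d[m][n]

# Hypothetical codebook: each codeword is a wink pattern (0 = open, 1 = closed).
COMMANDS = {
    "1101": "move cursor left",
    "1011": "move cursor right",
    "1111": "select",
}

def decode(observed: str) -> str:
    """Return the command whose codeword is nearest to the observed sequence,
    so isolated misclassified frames still resolve to a valid command."""
    return COMMANDS[min(COMMANDS, key=lambda c: edit_distance(observed, c))]
```

For example, the corrupted sequence "1110" still decodes to "select", since it is one edit from "1111" but two edits from the other codewords.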


        Authors’ Affiliations

        1. Department of Electrical Engineering, National Tsing-Hua University
        2. Department of Informatics, Fo-Guang University
        3. Institute of Information Science, Academia Sinica




        © Wei-Gang et al. 2007

        This article is published under license to BioMed Central Ltd. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.