Document Type: Full Research Paper


1 M.Sc. Student, Faculty of Biomedical Engineering, Amirkabir University of Technology

2 Assistant Professor, Faculty of Biomedical Engineering, Amirkabir University of Technology



Eye gaze tracking now has a wide range of applications in human–computer interaction. One such application lets people with disabilities execute commands using the trajectory of their eye gaze instead of a hand or foot. Various methods have been proposed, and some of them track the eye gaze successfully; however, they typically require specific conditions or training, or are not capable of real-time performance. In this paper, we propose a framework that tracks eye gaze in real time using a simple, low-cost webcam of the kind mounted on ordinary laptops. The method relies heavily on a weighted normalized correlation function within an adaptive template-matching approach. The implemented system tracks the face and, in real time, extracts eye features such as the iris position, the eye corners, and the sclera region; these features are then used for gaze estimation. The influence of illumination changes, background alterations, different faces, and head movements is minimized as far as possible. The resulting gaze-tracking system can control the motion of the mouse cursor and click on an on-screen keyboard in real time.
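The core machinery described above — matching an eye or face template against each frame by normalized correlation — can be sketched as follows. This is a minimal, unweighted variant in plain NumPy for grayscale patches; the paper's actual method uses a *weighted* normalized correlation and adaptively updated templates, which are not reproduced here, and the function names are purely illustrative.

```python
import numpy as np

def normalized_correlation(patch, template):
    # Zero-mean normalized correlation between two equally sized patches.
    # Subtracting the means gives some robustness to uniform illumination
    # changes, one of the concerns mentioned in the abstract.
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    if denom == 0.0:
        return 0.0
    return float((p * t).sum() / denom)

def match_template(image, template):
    # Exhaustive search: slide the template over every position in the
    # image and keep the location with the highest correlation score.
    h, w = template.shape
    H, W = image.shape
    best_score, best_pos = -1.0, (0, 0)
    for y in range(H - h + 1):
        for x in range(W - w + 1):
            score = normalized_correlation(image[y:y + h, x:x + w], template)
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score
```

In an adaptive scheme, the template returned from the best-matching window would be blended back into the stored template after each frame, so the tracker follows gradual appearance changes of the eye region.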


