Document Type: Full Research Paper

Authors

1 M.Sc. Student, Biomechanics Department, Faculty of Mechanical Engineering, Iran University of Science and Technology

2 Assistant Professor, Biomechanics Department, Faculty of Mechanical Engineering, Iran University of Science and Technology

DOI: 10.22041/ijbme.2011.13195

Abstract

This paper presents the prototyping of an online, low-cost, video-based, and practical eye tracker called the "Dias Eye Tracker". Disabled people can use the proposed system to communicate with a computer. What distinguishes the system from other low-cost eye trackers is the accuracy of its gaze estimation, the different application modules of its software, and the lightweight wireless hardware, which can be mounted on the user's head. This paper introduces the software, hardware, and methods of the system. In addition, two pupil-tracking methods are compared, and an uncertainty analysis of the system's mapping function is carried out. The performance of the designed eye tracker was evaluated by analyzing the answers to three questionnaires completed by disabled users after performing three specific tasks. The results show that the system performs well for interaction with a computer.
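For context on the mapping function mentioned in the abstract: low-cost video-based eye trackers of this kind commonly map the detected pupil center to screen coordinates through a low-order polynomial fitted from calibration samples. The sketch below illustrates that general idea only; the polynomial order, function names, and calibration procedure are assumptions for illustration and are not taken from the paper.

# A minimal sketch (not the authors' code) of a polynomial gaze mapping:
# pupil coordinates (px, py) from the eye camera are mapped to screen
# coordinates (sx, sy) via a second-order polynomial whose coefficients
# are fitted by least squares from calibration points.
import numpy as np

def design_matrix(px, py):
    """Second-order polynomial terms of the pupil position."""
    px, py = np.asarray(px), np.asarray(py)
    return np.column_stack([np.ones_like(px), px, py, px * py, px**2, py**2])

def calibrate(pupil_xy, screen_xy):
    """Fit mapping coefficients from calibration samples.

    pupil_xy  : (N, 2) pupil-center positions recorded while the user
                fixates known targets
    screen_xy : (N, 2) corresponding target positions on the screen
    """
    A = design_matrix(pupil_xy[:, 0], pupil_xy[:, 1])
    coeffs, *_ = np.linalg.lstsq(A, screen_xy, rcond=None)
    return coeffs  # shape (6, 2): one coefficient column per screen axis

def estimate_gaze(coeffs, px, py):
    """Map a new pupil position to an estimated screen position (1, 2)."""
    return design_matrix([px], [py]) @ coeffs

An uncertainty analysis of such a mapping can then propagate the pupil-detection error through the fitted polynomial to bound the gaze-estimation error on the screen.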

References

[1] Parnianpour M., Mardanbegi D., Sarshar M., Eye Movement Tracking (Eye Tracking) and Its Applications in Ergonomics [in Persian]; 1st International Conference on Ergonomics of Iran; Tehran; 1387 (2008).
[2] Eye tracking; http://www.wikipedia.com; 2008.
[3] Kumar M., Paepcke A., Winograd T., EyePoint: Practical Pointing and Selection Using Gaze and Keyboard; In Proceedings of CHI. San Jose, California, USA: ACM Press; 2007.
[4] Skovsgaard H., Noise Tolerant Selection by Gaze Controlled Pan and Zoom; 2008.
[5] Ward D.J., Blackwell A.F., MacKay D.J.C., Dasher: a gesture-driven data entry interface for mobile computing; Hum.-Comput. Interact.; 2002; 17: 199–228.
[6] Tall M., EyeTube, Gaze Group; IT University of Copenhagen; 2008.
[7] Babcock J.S., Pelz J.B., Building a lightweight eyetracking headgear; Symposium on Eye Tracking Research & Applications, San Antonio, Texas, ACM; 2004; 109–114.
[8] Li D., Babcock J.S., Parkhurst D.J., openEyes: a low-cost head-mounted eye-tracking solution; Symposium on Eye Tracking Research & Applications, San Diego, California, ACM; 2006; 95–100.
[9]  Zielinski P., Opengazer: open-source gaze tracker for ordinary webcams; http://www.inference.phy.cam.ac.uk/opengazer/.
[10] Mardanbegi D., Malakzadeh M.R., Design and Construction of an Eye Tracker for the Rehabilitation of Disabled People in Communicating with a Computer [in Persian]; 15th Iranian Conference on Biomedical Engineering; Mashhad; 1387 (2008).
[11] Mardanbegi D., Malakzadeh M.R., Performance Evaluation of an Eye Movement Tracker for the Rehabilitation of Quadriplegic Disabled People in Communicating with a Computer [in Persian]; 16th Iranian Conference on Biomedical Engineering; Tehran; 1388 (2009).
[12] Duchowski A.T., Eye Tracking Methodology: Theory and Practice; Springer, London; in press.
[13] Noureddin B., Lawrence P.D., Man C.F., A non-contact device for tracking gaze in a human computer interface; Computer Vision and Image Understanding; 2005; 98: 52–82.
[14] Beymer D., Flickner M., Eye gaze tracking using an active stereo head; IEEE Conf. Comput. Vis. Pattern Recogn; 2003; 2: 451–458.
[15] Li D., Winfield D., Parkhurst D.J., Starburst: A hybrid algorithm for video-based eye tracking combining feature-based and model-based approaches; In Proceedings of the IEEE Vision for Human-Computer Interaction Workshop at CVPR; 2005; pp. 1–8.
[16] Morimoto C.H., Mimica M.R.M., Eye gaze tracking techniques for interactive applications; Computer Vision and Image Understanding; 2005; 98: 4–24.
[17] NASA, Task Load Index (TLX): Computerized version (Version 1.0); Human Performance Research Group, NASA Ames Research Center; Moffett Field, CA; 1986.
[18] Chin J.P., Diehl V.A., Norman K.L., Development of an instrument measuring user satisfaction of the human-computer interface; Proc. ACM CHI '88; 1988; 213–218.
[19] Lewis J.R., IBM Computer Usability Satisfaction Questionnaires: Psychometric Evaluation and Instructions for Use; International Journal of Human-Computer Interaction; 1995; 7(1): 57–78.