Text Entry by Gazing and Smiling

DOI: 10.1155/2013/218084


Abstract:

Face Interface is a wearable prototype that combines voluntary gaze direction and facial activations for pointing at and selecting objects on a computer screen, respectively. The aim was to investigate the functionality of the prototype for entering text. First, three on-screen keyboard layouts were developed and tested ( ) to find a layout better suited to text entry with the prototype than the traditional QWERTY layout. The task was to enter one word ten times with each layout by pointing at letters with gaze and selecting them by smiling. Subjective ratings showed that a layout with large keys on the edges and small keys near the center of the keyboard was rated as the most enjoyable, clearest, and most functional. Second, using this layout, the second experiment ( ) compared entering text with Face Interface to entering text with a mouse. The results showed a text entry rate of 20 characters per minute (cpm) for Face Interface and 27 cpm for the mouse. For Face Interface, the keystrokes per character (KSPC) value was 1.1 and the minimum string distance (MSD) error rate was 0.12. These values compare well with those of other similar techniques.

1. Introduction

Recently, there have been several attempts to develop alternative human-computer interaction (HCI) methods that combine eye tracking with another measurement of human behavior. One line of investigation has been to measure signals that originate from the human facial expression system [1–4]. One reason for using facial muscle behavior in HCI is that the human facial system is versatile in communication, and this versatility could serve HCI systems as well [3]. Pointing and selecting, as well as text entry, are the most common tasks in HCI, so being able to carry them out with acceptable performance is essential for an HCI solution to be considered fit for use. The potential of the human facial system has already been exploited in eye tracking research. For example, eye blinks have been used for selecting objects while gaze direction has been used for pointing [5, 6]. The choice of eye blinks stems from the fact that video-based eye trackers that image the eyes can recognize whether the eyes are open or closed [5, 7]. The relation to the facial muscle system comes from the fact that eye blinks result from activation of the orbicularis oculi facial muscle [8]. While video-based eye trackers track the eyes, computer vision
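The KSPC and MSD error-rate figures quoted above follow the standard text-entry metrics described by Soukoreff and MacKenzie [41]. The following Python sketch shows how such values are typically computed; the example strings and keystroke count are illustrative assumptions, not data from the study.

```python
# Sketch of the standard text-entry metrics (after Soukoreff & MacKenzie [41]).
# The example strings and keystroke count below are hypothetical.

def keystrokes_per_character(keystrokes: int, transcribed: str) -> float:
    """KSPC: total keystrokes made divided by characters in the transcribed text."""
    return keystrokes / len(transcribed)

def minimum_string_distance(presented: str, transcribed: str) -> int:
    """Levenshtein (edit) distance between the presented and transcribed text."""
    m, n = len(presented), len(transcribed)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if presented[i - 1] == transcribed[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution
    return d[m][n]

def msd_error_rate(presented: str, transcribed: str) -> float:
    """MSD error rate: edit distance normalized by the longer of the two strings."""
    return minimum_string_distance(presented, transcribed) / max(len(presented), len(transcribed))

if __name__ == "__main__":
    presented = "the quick brown fox"
    transcribed = "the quikc brown fox"          # one transposed pair -> edit distance 2
    print(keystrokes_per_character(21, transcribed))  # hypothetical 21 keystrokes -> ~1.11
    print(msd_error_rate(presented, transcribed))     # 2 / 19 -> ~0.11
```

A text entry rate in cpm is then simply the number of transcribed characters divided by the task time in minutes, which is how figures such as the 20 cpm and 27 cpm above are normally derived.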

References

[1]  C. A. Chin, A. Barreto, J. G. Cremades, and M. Adjouadi, “Integrated electromyogram and eye-gaze tracking cursor control system for computer users with motor disabilities,” Journal of Rehabilitation Research and Development, vol. 45, no. 1, pp. 161–174, 2008.
[2]  J. San Agustin, J. C. Mateo, J. P. Hansen, and A. Villanueva, “Evaluation of the potential of gaze input for game interaction,” PsychNology Journal, vol. 7, no. 2, pp. 213–236, 2009.
[3]  V. Surakka, M. Illi, and P. Isokoski, “Gazing and frowning as a new human-computer interaction technique,” ACM Transactions on Applied Perception, vol. 1, no. 1, pp. 40–56, 2004.
[4]  V. Surakka, P. Isokoski, M. Illi, and K. Salminen, “Is it better to gaze and frown or gaze and smile when controlling user interfaces?” in Proceedings of the 11th International Conference on Human-Computer Interaction (HCI '05), CD-ROM, p. 7, July 2005.
[5]  B. Ashtiani and I. S. MacKenzie, “BlinkWrite2: an improved text entry method using eye blinks,” in Proceedings of the ACM Symposium on Eye-Tracking Research and Applications (ETRA '10), pp. 339–346, March 2010.
[6]  A. Sesin, M. Adjouadi, M. Cabrerizo, M. Ayala, and A. Barreto, “Adaptive eye-gaze tracking using neural-network-based user profiles to assist people with motor disability,” Journal of Rehabilitation Research and Development, vol. 45, no. 6, pp. 801–818, 2008.
[7]  H. Heikkilä and K. J. Räihä, “Simple gaze gestures and the closure of the eyes as an interaction technique,” in Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '12), pp. 147–154, March 2012.
[8]  A. J. Fridlund, Human Facial Expression: An Evolutionary View, Academic Press, San Diego, Calif, USA, 1994.
[9]  A. Królak and P. Strumiłło, “Eye-blink detection system for human-computer interaction,” Universal Access in the Information Society, vol. 11, no. 4, pp. 409–419, 2012.
[10]  P. M. Fitts, “The information capacity of the human motor system in controlling the amplitude of movement,” Journal of Experimental Psychology, vol. 47, no. 6, pp. 381–391, 1954.
[11]  V. Rantanen, P. H. Niemenlehto, J. Verho, and J. Lekkala, “Capacitive facial movement detection for human-computer interaction to click by frowning and lifting eyebrows,” Medical and Biological Engineering and Computing, vol. 48, no. 1, pp. 39–47, 2010.
[12]  V. Rantanen, T. Vanhala, O. Tuisku et al., “A wearable, wireless gaze tracker with integrated selection command source for human-computer interaction,” IEEE Transactions on Information Technology in Biomedicine, vol. 15, no. 5, pp. 795–801, 2011.
[13]  O. Tuisku, V. Surakka, Y. Gizatdinova et al., “Gazing and frowning to computers can be enjoyable,” in Proceedings of the 3rd International Conference on Knowledge and Systems Engineering (KSE '11), pp. 211–218, October 2011.
[14]  O. Tuisku, V. Surakka, T. Vanhala, V. Rantanen, and J. Lekkala, “Wireless face interface: using voluntary gaze direction and facial muscle activations for human-computer interaction,” Interacting with Computers, vol. 24, no. 1, pp. 1–9, 2012.
[15]  I. S. MacKenzie, “Fitts' law as a research and design tool in human-computer interaction,” Human-Computer Interaction, vol. 7, no. 1, pp. 91–139, 1992.
[16]  S. A. Douglas and A. K. Mithal, “Effect of reducing homing time on the speed of a finger-controlled isometric pointing device,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’94), pp. 411–416, April 1994.
[17]  C. Ware and H. H. Mikaelian, “An evaluation of an eye tracker as a device for computer input,” in Proceedings of the SIGCHI/GI Conference on Human Factors in Computing Systems and Graphics Interface (CHI '87), pp. 183–188, Ontario, Canada, April 1987.
[18]  P. Majaranta and K. J. Räihä, “Twenty years of eye typing: systems and design issues,” in Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '02), pp. 15–22, New Orleans, La, USA, March 2002.
[19]  F. E. Sandnes and A. Aubert, “Bimanual text entry using game controllers: relying on users' spatial familiarity with QWERTY,” Interacting with Computers, vol. 19, no. 2, pp. 140–150, 2007.
[20]  K. J. Räihä and S. Ovaska, “An exploratory study of eye typing fundamentals: dwell time, text entry rate, errors, and workload,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’12), pp. 3001–3010, May 2012.
[21]  O. Špakov and P. Majaranta, “Scrollable keyboards for casual typing,” PsychNology Journal, vol. 7, no. 2, pp. 159–173, 2009.
[22]  X. Bi, B. A. Smith, and S. Zhai, “Quasi-Qwerty soft keyboard optimization,” in Proceedings of the 28th Annual CHI Conference on Human Factors in Computing Systems (CHI '10), pp. 283–286, April 2010.
[23]  I. S. MacKenzie and S. X. Zhang, “Design and evaluation of a high-performance soft keyboard,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’99), pp. 25–31, May 1999.
[24]  D. Li, J. Babcock, and D. J. Parkhurst, “openEyes: a low-cost head-mounted eye-tracking solution,” in Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '06), pp. 95–100, San Diego, Calif, USA, March 2006.
[25]  W. J. Ryan, A. T. Duchowski, E. A. Vincent, and D. Battisto, “Match-moving for area-based analysis of eye movements in natural tasks,” in Proceedings of the ACM Symposium on Eye-Tracking Research and Applications (ETRA '10), pp. 235–242, March 2010.
[26]  H. Aoki, J. P. Hansen, and K. Itoh, “Learning to interact with a computer by gaze,” Behaviour and Information Technology, vol. 27, no. 4, pp. 339–344, 2008.
[27]  J. P. Hansen, A. S. Johansen, D. W. Hansen, K. Itoh, and S. Mashino, “Command without a click: dwell time typing by mouse and gaze selections,” in Human-Computer Interaction (INTERACT '03), M. Rauterberg, M. Menozzi, and J. Wesson, Eds., pp. 121–128, IOS Press, Amsterdam, The Netherlands, 2003.
[28]  J. P. Hansen, K. Tørning, A. S. Johansen, K. Itoh, and H. Aoki, “Gaze typing compared with input by head and hand,” in Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA ’04), pp. 131–138, San Antonio, Tex, USA, March 2004.
[29]  D. J. Ward and D. J. C. MacKay, “Artificial intelligence: fast hands-free writing by gaze direction,” Nature, vol. 418, no. 6900, p. 838, 2002.
[30]  O. Tuisku, P. Majaranta, P. Isokoski, and K. J. Räihä, “Now Dasher! Dash away!: longitudinal study of fast text entry by eye gaze,” in Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA ’08), pp. 19–26, March 2008.
[31]  J. R. Helmert, S. Pannasch, and B. M. Velichkovsky, “Influences of dwell time and cursor control on the performance in gaze driven typing,” Journal of Eye Movement Research, vol. 2, no. 4, pp. 1–8, 2008.
[32]  P. Majaranta, I. S. MacKenzie, A. Aula, and K. J. Räihä, “Effects of feedback and dwell time on eye typing speed and accuracy,” Universal Access in the Information Society, vol. 5, no. 2, pp. 199–208, 2006.
[33]  D. A. Norman and D. Fisher, “Why alphabetic keyboards are not easy to use: keyboard layout doesn’t much matter,” Human Factors, vol. 24, no. 5, pp. 509–519, 1982.
[34]  M. M. Bradley, “Measuring emotion: the self-assessment manikin and the semantic differential,” Journal of Behavior Therapy and Experimental Psychiatry, vol. 25, no. 1, pp. 49–59, 1994.
[35]  C. E. Osgood, “The nature and measurement of meaning,” Psychological Bulletin, vol. 49, no. 3, pp. 197–237, 1952.
[36]  R. Menzies, A. Waller, and H. Pain, “Peer interviews: an adapted methodology for contextual understanding in user-centred design,” in Proceedings of the 13th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS '11), pp. 273–274, Dundee, UK, October 2011.
[37]  G. Bradski and A. Kaehler, Learning OpenCV: Computer Vision with the OpenCV Library, O'Reilly Media, Sebastopol, Calif, USA, 2008.
[38]  D. Li, D. Winfield, and D. J. Parkhurst, “Starburst: a hybrid algorithm for video-based eye tracking combining feature-based and model-based approaches,” in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '05), pp. 1–8, San Diego, Calif, USA, June 2005.
[39]  V. Rantanen, J. Verho, J. Lekkala, O. Tuisku, V. Surakka, and T. Vanhala, “The effect of clicking by smiling on the accuracy of head-mounted gaze tracking,” in Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '12), pp. 345–348, March 2012.
[40]  P. O. Kristensson and S. Zhai, “Relaxing stylus typing precision by geometric pattern matching,” in Proceedings of the 10th International Conference on Intelligent User Interfaces (IUI ’05), pp. 151–158, San Diego, Calif, USA, January 2005.
[41]  R. W. Soukoreff and I. S. MacKenzie, “Metrics for text entry research: an evaluation of MSD and KSPC, and a new unified error metric,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’03), pp. 113–120, April 2003.
[42]  R. Bates, “Have patience with your eye mouse! Eye-gaze interaction with computers can work,” in Proceedings of the 1st Cambridge Workshop on Universal Access and Assistive Technology (CWUAAT '02), pp. 33–38, March 2002.
[43]  S. Zhai, “What's in the eyes for attentive input,” Communications of the ACM, vol. 46, no. 3, pp. 34–39, 2003.
[44]  R. J. K. Jacob, “The use of eye movements in human-computer interaction techniques: what you look at is what you get,” ACM Transactions on Information Systems, vol. 9, no. 2, pp. 152–169, 1991.
[45]  M. Ashmore, A. T. Duchowski, and G. Shoemaker, “Efficient eye pointing with a fisheye lens,” in Proceedings of Graphics Interface 2005 (GI ’05), pp. 203–210, Ontario, Canada, May 2005.
[46]  A. B. Barreto, S. D. Scargle, and M. Adjouadi, “A practical EMG-based human-computer interface for users with motor disabilities,” Journal of Rehabilitation Research and Development, vol. 37, no. 1, pp. 53–64, 2000.
[47]  R. W. Levenson, P. Ekman, and W. V. Friesen, “Voluntary facial action generates emotion-specific autonomic nervous system activity,” Psychophysiology, vol. 27, no. 4, pp. 363–384, 1990.
[48]  P. Majaranta, U. K. Ahola, and O. Špakov, “Fast gaze typing with an adjustable dwell time,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’09), pp. 357–360, Boston, Mass, USA, April 2009.
[49]  J. O. Wobbrock, J. Rubinstein, M. W. Sawyer, and A. T. Duchowski, “Longitudinal evaluation of discrete consecutive gaze gestures for text entry,” in Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA ’08), pp. 11–18, Savannah, Ga, USA, March 2008.
[50]  M. Porta and M. Turina, “Eye-S: a full-screen input modality for pure eye-based communication,” in Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA ’08), pp. 27–34, March 2008.
[51]  X. Yong, M. Fatourechi, R. K. Ward, and G. E. Birch, “The design of point-and-click system by integrating a self-paced brain-computer interface with an eye-tracker,” IEEE Journal on Emerging and Selected Topics in Circuits and Systems, vol. 1, no. 4, pp. 590–602, 2011.
[52]  J. J. Darragh and I. H. Witten, The Reactive Keyboard, Cambridge University Press, New York, NY, USA, 1992.
