Haptic Addition to a Visual Menu Selection Interface Controlled by an In-Vehicle Rotary Device

DOI: 10.1155/2012/787469


Abstract:

Today, several vehicles are equipped with a visual display combined with a haptic rotary device for handling in-vehicle information system tasks while driving. This experimental study investigates whether a haptic addition to a visual interface interferes with or supports secondary task performance and whether haptic information can be used without taking the eyes off the road. Four interfaces were compared during simulated driving: visual only, partly corresponding visual-haptic, fully corresponding visual-haptic, and haptic only. Secondary task performance and subjective mental workload were measured, and the participants were interviewed. It was found that some haptic support improved performance. However, when more haptic information was used, the results diverged in terms of task completion time and interface comprehension: some participants did not sense all of the haptic information provided, some did not comprehend the correspondence between the haptic and visual interfaces, and some did. Interestingly, the participants managed to complete the tasks when using haptic-only information.

1. Introduction

As complexity in vehicles increases, new techniques are being developed to reduce the demands on a driver’s attention [1–5]. Because driving is mainly a visual task [6], many new systems have been developed to reduce visual load by providing supporting auditory or haptic information. For example, a haptic rotary device can provide haptic information intended to support interaction with a visual user interface. In this paper, we focus on this type of haptic information. Today, several cars are outfitted with haptic rotary devices to help the driver handle secondary tasks [7]. This type of haptic information comprises kinaesthetic and tactile sensations [8] provided through active touch [9]. An exploratory procedure of repeated hand movement [10], in this case turning the rotary device back and forth, is required to perceive the haptic information. The haptic information includes a ridge placed between menu items [7] and special haptic effects for scrolling through a list or searching for radio stations. These kinds of haptic effects could help a driver if they are designed to extend or correlate with the visual information, that is, if the haptic interface provides information similar to that of the visual interface. This redundant information may help drivers perform actions without looking at the visual display. If the driver knows that a desired function is three steps to the right in the menu, the driver can select the correct function by simply counting the haptic ridges, a
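To make the ridge-counting idea concrete, a minimal sketch follows. It is an assumed illustration, not the study's actual software: the class name HapticRotaryMenu, the menu entries, and the end-stop behaviour are hypothetical. The model simply treats every boundary between menu items as one haptic ridge, so a driver who knows that the target lies, say, three steps clockwise can reach it eyes-free by counting ridges and then pressing the knob.

# Minimal sketch (assumed model): a rotary menu in which each boundary
# between items produces a haptic ridge (detent), so an eyes-free user can
# reach a known target by counting detents instead of looking at the display.
class HapticRotaryMenu:
    def __init__(self, items):
        self.items = list(items)   # menu entries shown on the visual display
        self.position = 0          # index of the currently highlighted item
        self.detents_felt = 0      # ridges crossed since the last selection

    def turn(self, steps):
        """Turn the knob; positive = clockwise. Each step crosses one ridge."""
        direction = 1 if steps > 0 else -1
        for _ in range(abs(steps)):
            new_pos = self.position + direction
            if 0 <= new_pos < len(self.items):   # end stop: no ridge beyond the list
                self.position = new_pos
                self.detents_felt += 1           # one haptic ridge per item boundary

    def press(self):
        """Press the knob to select the highlighted item."""
        chosen = self.items[self.position]
        self.detents_felt = 0
        return chosen

# Eyes-free use: the driver knows the target is three steps clockwise,
# turns until three ridges have been felt, and then presses.
menu = HapticRotaryMenu(["Radio", "Phone", "Climate", "Navigation"])
menu.turn(3)
assert menu.detents_felt == 3
print(menu.press())   # -> Navigation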

References

[1]  P. Bengtsson, C. Grane, and J. Isaksson, “Haptic/graphic interface for in-vehicle comfort functions—a simulator study and an experimental study,” in Proceedings of the 2nd IEEE International Workshop on Haptic, Audio and Visual Environments and their Applications, pp. 25–29, Ottawa, Canada, September 2003.
[2]  G. E. Burnett and J. M. Porter, “Ubiquitous computing within cars: designing controls for non-visual use,” International Journal of Human Computer Studies, vol. 55, no. 4, pp. 521–531, 2001.
[3]  K. Prynne, “Tactile controls,” Automotive Interiors International, pp. 30–36, 1995.
[4]  C. Spence and C. Ho, “Multisensory interface design for drivers: past, present and future,” Ergonomics, vol. 51, no. 1, pp. 65–70, 2008.
[5]  W. W. Wierwille, “Demands on driver resources associated with introducing advanced technology into the vehicle,” Transportation Research Part C, vol. 1, no. 2, pp. 133–142, 1993.
[6]  M. Sivak, “The information that drivers use: is it indeed 90% visual?” Perception, vol. 25, no. 9, pp. 1081–1089, 1996.
[7]  D. Grant, “Two new commercial haptic rotary controllers,” in Proceedings of the EuroHaptics, pp. 451–455, Munich, Germany, June 2004.
[8]  V. Hayward, O. R. Astley, M. Cruz-Hernandez, D. Grant, and G. Robles-De-La-Torre, “Haptic interfaces and devices,” Sensor Review, vol. 24, no. 1, pp. 16–29, 2004.
[9]  J. J. Gibson, “Observations on active touch,” Psychological Review, vol. 69, no. 6, pp. 477–491, 1962.
[10]  R. L. Klatzky, S. J. Lederman, and D. E. Matula, “Haptic exploration in the presence of vision,” Journal of Experimental Psychology, vol. 19, no. 4, pp. 726–743, 1993.
[11]  H. Alm and L. Nilsson, “The effects of a mobile telephone task on driver behaviour in a car following situation,” Accident Analysis and Prevention, vol. 27, no. 5, pp. 707–715, 1995.
[12]  C. Collet, A. Guillot, and C. Petit, “Phoning while driving I: a review of epidemiological, psychological, behavioural and physiological studies,” Ergonomics, vol. 53, no. 5, pp. 589–601, 2010.
[13]  C. Collet, A. Guillot, and C. Petit, “Phoning while driving II: a review of driving conditions influence,” Ergonomics, vol. 53, no. 5, pp. 602–616, 2010.
[14]  J. Engström, E. Johansson, and J. Östlund, “Effects of visual and cognitive load in real and simulated motorway driving,” Transportation Research Part F, vol. 8, no. 2, pp. 97–120, 2005.
[15]  D. Lamble, T. Kauranen, M. Laakso, and H. Summala, “Cognitive load and detection thresholds in car following situations: safety implications for using mobile (cellular) telephones while driving,” Accident Analysis and Prevention, vol. 31, no. 6, pp. 617–623, 1999.
[16]  T. C. Lansdown, N. Brook-Carter, and T. Kersloot, “Distraction from multiple in-vehicle secondary tasks: vehicle performance and mental workload implications,” Ergonomics, vol. 47, no. 1, pp. 91–104, 2004.
[17]  D. L. Strayer and W. A. Johnston, “Driven to distraction: dual-task studies of simulated driving and conversing on a cellular telephone,” Psychological Science, vol. 12, no. 6, pp. 462–466, 2001.
[18]  C. D. Wickens, “Multiple resources and performance prediction,” Theoretical Issues in Ergonomic Science, vol. 3, no. 2, pp. 159–177, 2002.
[19]  M. S. Prewett, L. Yang, F. R. B. Stilson et al., “The benefits of multimodal information: a meta-analysis comparing visual and visual-tactile feedback,” in Proceedings of the 8th International Conference on Multimodal Interfaces, pp. 333–338, ACM Press, Alberta, Canada, November 2006.
[20]  H. S. Vitense, J. A. Jacko, and V. K. Emery, “Multimodal feedback: an assessment of performance and mental workload,” Ergonomics, vol. 46, no. 1–3, pp. 68–87, 2003.
[21]  M. Mulder, M. Mulder, M. M. van Paassen, and D. A. Abbink, “Haptic gas pedal feedback,” Ergonomics, vol. 51, no. 11, pp. 1710–1720, 2008.
[22]  C. Ho, H. Z. Tan, and C. Spence, “Using spatial vibrotactile cues to direct visual attention in driving scenes,” Transportation Research Part F, vol. 8, no. 6, pp. 397–412, 2005.
[23]  J. B. F. Van Erp and H. A. H. C. Van Veen, “Vibrotactile in-vehicle navigation system,” Transportation Research Part F, vol. 7, no. 4-5, pp. 247–256, 2004.
[24]  F. Asif, J. Vinayakamoorthy, J. Ren, and M. Green, “Haptic controls in cars for making driving more safe,” in Proceedings of the IEEE International Conference on Robotics and Biomimetics (ROBIO '09), pp. 2023–2028, Guilin, China, 2009.
[25]  G. Costagliola, S. Di Martino, F. Ferrucci, G. Oliviero, U. Montemurro, and A. Paliotti, “Handy: a new interaction device for vehicular information systems,” in Proceedings of the Mobile Human-Computer Interaction (Mobile HCI '04), S. Brewster and M. Dunlop, Eds., vol. 3160 of Lecture Notes in Computer Science, pp. 264–275, Springer, Glasgow, UK, 2004.
[26]  J. M. Porter, S. Summerskill, G. Burnett, and K. Prynne, “BIONIC – ‘eyes-free’ design of secondary driving controls,” in Proceedings of the Accessible Design in the Digital World Conference 2005, Dundee, Scotland, August 2005.
[27]  A. Tang, P. McLachlan, K. Lowe, C. R. Saka, and K. MacLean, “Perceiving ordinal data haptically under workload,” in Proceedings of the 7th International Conference on Multimodal Interfaces (ICMI '05), pp. 317–324, ACM, Trento, Italy, October 2005.
[28]  A. Rydström, R. Broström, and P. Bengtsson, “Can haptics facilitate interaction with an in-vehicle multifunctional interface?” IEEE Transactions on Haptics, vol. 2, no. 3, pp. 141–147, 2009.
[29]  S. J. Lederman and S. G. Abbott, “Texture perception: studies of intersensory organization using a discrepancy paradigm, and visual versus tactual psychophysics,” Journal of Experimental Psychology, vol. 7, no. 4, pp. 902–915, 1981.
[30]  C. Grane and P. Bengtsson, “Menu selection based on haptic and/or graphic information,” in Proceedings of the 11th International Conference on Human-Computer Interaction, G. Salvendy, Ed., Las Vegas, Nev, USA, July 2005.
[31]  W. M. Bergmann Tiest and A. M. L. Kappers, “Haptic and visual perception of roughness,” Acta Psychologica, vol. 124, no. 2, pp. 177–189, 2007.
[32]  E. Gentaz and Y. Hatwell, “Haptic processing of spatial and material object properties,” in Touching for Knowing: Cognitive Psychology of Haptic Manual Perception, Y. Hatwell, A. Strieri, and E. Gentaz, Eds., pp. 123–159, J. Benjamins, Amsterdam, The Netherlands, 2003.
[33]  M. A. Heller, “Visual and tactual texture perception: intersensory cooperation,” Perception and Psychophysics, vol. 31, no. 4, pp. 339–344, 1982.
[34]  DaimlerChrysler AG, Lane Change Test—User Guide 1.2, DaimlerChrysler AG, Research and Technology, Stuttgart, Germany, 2004, http://people.usd.edu/~schieber/pdf/LCT-UserGuide.pdf.
[35]  S. Mattes, “The lane-change-task as a tool for driver distraction evaluation,” in Quality of Work and Products in Enterprises of the Future, H. Strasser, K. Kluth, H. Rausch, and H. Bubb, Eds., pp. 57–60, Ergonomia, Stuttgart, Germany, 2003.
[36]  T. Ivergård, Handbook of Control Room Design and Ergonomics, Taylor and Francis, London, UK, 1989.
[37]  A. Rydström and P. Bengtsson, “Haptic, visual and cross-modal perception of interface information,” in Proceedings of the Human Factors Issues in Complex System Performance, D. de Waard, G. R. J. Hockey, P. Nickel, and K. A. Brookhuis, Eds., pp. 399–409, Shaker Publishing, Maastricht, The Netherlands, 2007.
[38]  S. G. Hart and L. E. Staveland, “Development of NASA-TLX (Task Load Index): results of empirical and theoretical research,” in Human Mental Workload, P. A. Hancock and N. Meshkati, Eds., pp. 139–183, North-Holland, Amsterdam, The Netherlands, 1988.
[39]  M. B. Miles and A. M. Huberman, Qualitative Data Analysis, SAGE, Thousand Oaks, Calif, USA, 1994.
[40]  S. Guest and C. Spence, “What role does multisensory integration play in the visuotactile perception of texture?” International Journal of Psychophysiology, vol. 50, no. 1-2, pp. 63–80, 2003.
[41]  M. O. Ernst and H. H. Bülthoff, “Merging the senses into a robust percept,” Trends in Cognitive Sciences, vol. 8, no. 4, pp. 162–169, 2004.
[42]  S. A. Wall and W. S. Harwin, “Interaction of visual and haptic information in simulated environments: texture perception,” in Proceedings of the Haptic Human-Computer Interaction 2000, S. Brewster and R. Murray-Smith, Eds., pp. 108–117, Springer, Glasgow, UK, August-September 2000.
[43]  I. Rock and J. Victor, “Vision and touch: an experimentally created conflict between the two senses,” Science, vol. 143, no. 3606, pp. 594–596, 1964.
[44]  S. J. Lederman, G. Thorne, and B. Jones, “Perception of texture by vision and touch. Multidimensionality and intersensory integration,” Journal of Experimental Psychology, vol. 12, no. 2, pp. 169–180, 1986.
[45]  P. M. McDonnell and J. Duffett, “Vision and touch: a reconsideration of conflict between the two senses,” Canadian Journal of Psychology, vol. 26, no. 2, pp. 171–180, 1972.
[46]  C. Grane and P. Bengtsson, “Serial or parallel search with a multi-modal rotary device for in-vehicle use,” in Proceedings of the 2nd International Conference on Applied Human Factors and Ergonomics (AHFE '08), W. Karwowski and G. Salvendy, Eds., USA Publishing, 2008.
