Testing Two Tools for Multimodal Navigation

DOI: 10.1155/2012/251384


Abstract:

The latest smartphones with GPS, electronic compasses, directional audio, touch screens, and so forth hold potential for location-based services that are easier to use and that let users focus on their activities and on the environment around them. Rather than interpreting maps, users can search for information by pointing in a direction, and database queries can be created from GPS location and compass data. Users can also be guided to locations through point and sweep gestures, spatial sound, and simple graphics. This paper describes two studies testing two applications with multimodal user interfaces for navigation and information retrieval. The applications allow users to search for information and get navigation support using combinations of point and sweep gestures, nonspeech audio, graphics, and text. The tests show that users appreciated both applications for their ease of use and for letting them interact directly with the surrounding environment.

1. Introduction

Visual maps have a number of advantages as a tool for navigation, for example, overview and high information density. In recent years, new technologies have radically broadened how and in what contexts visual maps can be used and displayed. This development has spawned a plethora of new tools for navigation. Many of these are based on graphics, are meant for the eye, and use traditional map metaphors; the Google Maps application included in iPhone and Android smartphones is one example. However, visual maps are usually abstract representations of the physical world and must be interpreted in order to be of use. Interpreting a map and relating it to the current surroundings is a relatively demanding task [1]. Moreover, maps often require the users' full visual attention, disrupt other activities, and may weaken the users' perception of the surroundings. All in all, maps are in many ways demanding tools for navigation.

One major challenge for developers of smartphone-based navigation services is handling the inaccuracy of sensor data, especially GPS location data. The location provided by GPS in urban environments is often very inaccurate from a pedestrian's perspective. Furthermore, the accuracy is heavily influenced by nearby buildings as well as other factors such as the positions of the satellites and the weather.

This paper addresses the problems described above. The demands that current navigation tools place on users' attentional and cognitive resources were addressed using multimodal user interfaces built on a mix of audio, pointing gestures, graphics, and text.
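At its core, the pointing-based information search described above reduces to a geometric test: given the GPS position and the compass bearing, select the points of interest whose bearing from the user falls inside an angular sector around the pointed direction. The following is a minimal sketch of that idea in Python; it is not the authors' implementation (the applications themselves were built on Java MIDP [23]), and the coordinates, point-of-interest names, and the 30° half-angle (a plausible value in the spirit of the pointing-gesture angle sizes studied in [16]) are all hypothetical.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from (lat1, lon1) to (lat2, lon2), in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def in_pointing_sector(user_lat, user_lon, heading, poi_lat, poi_lon, half_angle=30.0):
    """True if the point of interest lies within +/- half_angle degrees of the heading."""
    bearing = bearing_deg(user_lat, user_lon, poi_lat, poi_lon)
    # Signed angular difference between bearing and heading, normalized to (-180, 180].
    diff = (bearing - heading + 180.0) % 360.0 - 180.0
    return abs(diff) <= half_angle

# Hypothetical points of interest: (name, latitude, longitude).
pois = [("cafe", 55.7100, 13.2110), ("museum", 55.7000, 13.1850)]
user_lat, user_lon = 55.7058, 13.1932  # GPS fix (illustrative coordinates)
heading = 45.0                         # compass bearing the user is pointing along

matches = [name for name, lat, lon in pois
           if in_pointing_sector(user_lat, user_lon, heading, lat, lon)]
print(matches)  # -> ['cafe']: only the POI inside the pointed sector is returned
```

The same signed angular difference can also drive the guidance modalities the abstract mentions: its sign and magnitude could, for instance, be mapped to stereo panning or to the rate of a nonspeech audio cue, so that the sound converges as the user sweeps toward the target.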

References

[1]  A. M. MacEachren, How Maps Work: Representation, Visualization, and Design, The Guilford Press, 1995.
[2]  K. Tsukada and M. Yasumura, “ActiveBelt: belt-type wearable tactile display for directional navigation,” in Proceedings of the 6th International Conference on Ubiquitous Computing (UbiComp '04), pp. 384–399, Springer, 2004.
[3]  M. Frey, “CabBoots: Shoes with integrated guidance system,” in Proceedings of the 1st International Conference on Tangible and Embedded Interaction, pp. 245–246, ACM, February 2007.
[4]  T. Amemiya, H. Ando, and T. Maeda, “Lead-me interface for a pulling sensation from hand-held devices,” ACM Transactions on Applied Perception, vol. 5, no. 3, article 15, pp. 1–17, 2008.
[5]  D. Spath, M. Peissner, L. Hagenmeyer, and B. Ringbauer, “New approaches to intuitive auditory user interfaces,” in Proceedings of the Conference on Human Interface: Part I (HCII '07), M. J. Smith and G. Salvendy, Eds., vol. 4557 of Lecture Notes in Computer Science, pp. 975–984, 2007.
[6]  J. M. Loomis, J. R. Marston, R. G. Golledge, and R. L. Klatzky, “Personal guidance system for people with visual impairment: a comparison of spatial displays for route guidance,” Journal of Visual Impairment and Blindness, vol. 99, no. 4, pp. 219–232, 2005.
[7]  R. Kramer, M. Modsching, and K. ten Hagen, “Development and evaluation of a context-driven, mobile tourist guide,” International Journal of Pervasive Computing and Communications, vol. 3, no. 4, pp. 378–399, 2007.
[8]  L. Evett, S. Battersby, A. Ridley, and D. Brown, “An interface to virtual environments for people who are blind using Wii technology—mental models and navigation,” Journal of Assistive Technologies, vol. 3, no. 2, pp. 26–34, 2009.
[9]  D. McGookin, S. Brewster, and P. Priego, “Audio bubbles: employing non-speech audio to support tourist wayfinding,” in Proceedings of the 4th International Conference on Haptic and Audio Interaction Design (HAID '09), pp. 41–50, Springer, 2009.
[10]  S. Holland, D. R. Morse, and H. Gedenryd, “AudioGPS: spatial audio navigation with a minimal attention interface,” Personal Ubiquitous Computing, vol. 6, no. 4, pp. 253–259, 2002.
[11]  S. Strachan, P. Eslambolchilar, and R. Murray-Smith, “GpsTunes—controlling navigation via audio feedback,” in Proceedings of the 7th International Conference on Human Computer Interaction with Mobile Devices and Services (MobileHCI '05), pp. 275–278, ACM, September 2005.
[12]  M. Jones, S. Jones, G. Bradley, N. Warren, D. Bainbridge, and G. Holmes, “Ontrack: dynamically adapting music playback to support navigation,” Personal and Ubiquitous Computing, vol. 12, pp. 513–525, 2008.
[13]  M. Liljedahl, N. Papworth, and S. Lindberg, “Beowulf: an audio mostly game,” in Proceedings of the 4th International Conference on Advances in Computer Entertainment Technology (ACE '07), pp. 200–203, June 2007.
[14]  D. McGookin, C. Magnusson, M. Anastassova, W. Heuten, A. Rentería, and S. Boll, Proceedings of the Workshop on Multimodal Location Based Techniques for Extreme Navigation, Helsinki, Finland, 2010.
[15]  M. Anastassova, C. Magnusson, M. Pielot, G. Randall, and G. B. Claassen, “Using audio and haptics for delivering spatial information via mobile devices,” in Proceedings of the 12th International Conference on Human-Computer Interaction with Mobile Devices and Services (Mobile HCI'10), pp. 525–526, ACM, September 2010.
[16]  C. Magnusson, K. Rassmus-Gröhn, and D. Szymczak, “Angle sizes for pointing gestures,” in Proceedings of the Workshop on Multimodal Location Based Techniques for Extreme Navigation, Helsinki, Finland, 2010.
[17]  C. Magnusson, B. Breidegard, and K. Rassmus-Gröhn, “Soundcrumbs—Hansel and Gretel in the 21st century,” in Proceedings of the 4th International Conference on Haptic and Audio Interaction Design (HAID '09), 2009.
[18]  C. Magnusson, M. Molina, K. Rassmus-Gröhn, and D. Szymczak, “Pointing for non-visual orientation and navigation,” in Proceedings of the 6th Nordic Conference on Human-Computer Interaction: Extending Boundaries (NordiCHI '10), pp. 735–738, ACM, October 2010.
[19]  M. Pielot and S. Boll, “‘In fifty meters turn left’: why turn-by-turn instructions fail pedestrians,” in Proceedings of the Workshop on Using Audio and Haptics for Delivering Spatial Information via Mobile Devices (MobileHCI '10), Lisbon, Portugal, 2010.
[20]  T. Djajadiningrat, S. Wensveen, J. Frens, and K. Overbeeke, “Tangible products: redressing the balance between appearance and action,” Personal and Ubiquitous Computing, vol. 8, pp. 294–309, 2004.
[21]  P. Hekkert, “Design aesthetics: principles of pleasure in design,” Psychology Science, vol. 48, no. 2, pp. 157–172, 2006.
[22]  S. G. Hart and L. E. Staveland, “Development of NASA-TLX (Task Load Index): results of empirical and theoretical research,” in Human Mental Workload, pp. 139–183, 1988.
[23]  Java MIDP 2.0, http://jcp.org/aboutJava/communityprocess/final/jsr118/index.html.
[24]  S. Robinson, M. Jones, P. Eslambolchilar, R. Murray-Smith, and M. Lindborg, “‘I did it my way’: moving away from the tyranny of turn-by-turn pedestrian navigation,” in Proceedings of the 12th International Conference on Human-Computer Interaction with Mobile Devices and Services (Mobile HCI '10), pp. 341–344, ACM, September 2010.
