OALib Journal
ISSN: 2333-9721

RoboTable: An Infrastructure for Intuitive Interaction with Mobile Robots in a Mixed-Reality Environment

DOI: 10.1155/2012/301608


Abstract:

This paper presents the design, development, and testing of a tabletop interface called RoboTable, an infrastructure supporting intuitive interaction with both mobile robots and virtual components in a mixed-reality environment. With a flexible software toolkit and specifically developed robots, the platform enables various modes of interaction with mobile robots. Using this platform, prototype applications were developed for two application domains: RoboPong investigates the efficiency of the RoboTable system in game applications, and ExploreRobot explores the possibility of using robots and intuitive interaction to enhance learning.

1. Introduction

In the past few years, much development has taken place in the research field of human-computer interaction. Several research approaches in this field, including tabletop interaction, tangible user interfaces (TUIs), augmented reality, and mixed reality, show great promise for bringing new interaction styles to related research domains. The research presented in this paper attempts to integrate several of these approaches to create a mixed-reality environment for novel human-robot interaction (HRI). Because the horizontal surface of a table permits the placement of objects, and its large area enables spreading, piling, and organizing items, digital tabletop user interfaces are becoming increasingly popular for supporting natural and intuitive interaction [1–3]. A TUI is a user interface in which a person interacts with digital information via the physical environment; it gives physical form to digital information and computation, facilitating the direct manipulation of bits [4]. Such physical interactions are natural and intuitive for human beings because they enable two-handed input and can provide spatial and haptic feedback [5].
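The coupling described above — a camera-based tracker reporting the poses of marked physical objects, which the system mirrors as virtual proxies so that virtual content can react to them — can be sketched as follows. This is a minimal illustration only; the class and event names are hypothetical assumptions, not the RoboTable toolkit's actual API, and a real system would receive events from a fiducial tracker such as reacTIVision [14] rather than construct them by hand:

```python
# Hypothetical sketch of a tabletop mixed-reality coupling.
# FiducialEvent and VirtualWorld are illustrative names, not the RoboTable API.
from dataclasses import dataclass
import math


@dataclass
class FiducialEvent:
    """One marker update, as a camera-based tabletop tracker might report it."""
    marker_id: int
    x: float      # normalized tracker coordinates, 0..1
    y: float
    angle: float  # orientation in radians


class VirtualWorld:
    """Maintains virtual proxies of tracked physical robots so that
    virtual objects (balls, obstacles, etc.) can react to them."""

    def __init__(self, width_mm: float, height_mm: float):
        self.width_mm = width_mm
        self.height_mm = height_mm
        self.proxies = {}  # marker_id -> (x_mm, y_mm, angle)

    def on_fiducial(self, ev: FiducialEvent) -> None:
        # Scale normalized tracker coordinates to table millimetres
        # and update (or create) the robot's virtual proxy.
        self.proxies[ev.marker_id] = (
            ev.x * self.width_mm,
            ev.y * self.height_mm,
            ev.angle,
        )

    def distance_mm(self, a: int, b: int) -> float:
        """Distance between two proxies, e.g. for collision tests
        between a physical robot and a virtual object."""
        (xa, ya, _), (xb, yb, _) = self.proxies[a], self.proxies[b]
        return math.hypot(xb - xa, yb - ya)


world = VirtualWorld(width_mm=1000.0, height_mm=700.0)
world.on_fiducial(FiducialEvent(marker_id=1, x=0.25, y=0.5, angle=0.0))
world.on_fiducial(FiducialEvent(marker_id=2, x=0.75, y=0.5, angle=math.pi))
print(world.distance_mm(1, 2))  # 500.0
```

In a full implementation the proxy bodies would live inside a physics engine (the references point to JBox2D [15]), so that a virtual ball bounces off a physical robot exactly as it would off a virtual paddle.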
In this paper, we present RoboTable, an infrastructure that combines tabletop interaction with TUIs to support intuitive interaction with mobile robots. This framework can create a mixed-reality environment in which interaction with real robots and virtual objects can be combined seamlessly. This capability also extends the robot entity into the virtual world, enabling rich and complex HRIs to be supported in various applications. Based on the RoboTable framework, we have developed two prototype applications for proof-of-concept purposes. RoboPong is a tabletop game in which a robot player also participates. This game supports touch input for virtual objects and graspable interaction with robots simultaneously. ExploreRobot is

References

[1]  E. Hornecker, “‘I don't understand it either, but it is cool’—visitor interactions with a multi-touch table in a museum,” in Proceedings of the IEEE International Workshop on Horizontal Interactive Human Computer System (IEEE TABLETOP '08), pp. 113–120, Amsterdam, The Netherlands, October 2008.
[2]  A. Mahmud, O. Mubin, J. R. Octavia, et al., “Affective tabletop game: a new gaming experience for children,” in Proceedings of the 2nd IEEE International Workshop on Horizontal Interactive Human-Computer Systems (IEEE TABLETOP '07), pp. 44–51, Newport, RI, USA, October 2007.
[3]  M. R. Morris, A. M. Piper, A. Cassanego, A. Huang, A. Paepcke, and T. Winograd, “Mediating group dynamics through tabletop interface design,” IEEE Computer Graphics and Applications, vol. 26, no. 5, pp. 65–73, 2006.
[4]  H. Ishii, “Tangible bits: beyond pixels,” in Proceedings of the 2nd International Conference on Tangible and Embedded Interaction (ACM TEI '08), pp. 15–25, Bonn, Germany, February 2008.
[5]  D. Rosenfeld, M. Zawadzki, J. Sudol, and K. Perlin, “Physical objects as bidirectional user interface elements,” IEEE Computer Graphics and Applications, vol. 24, no. 1, pp. 44–49, 2004.
[6]  J. Kato, D. Sakamoto, M. Inami, and T. Igarashi, “Multi-touch interface for controlling multiple mobile robots,” in Proceedings of the 27th International Conference Extended Abstracts on Human Factors in Computing Systems (ACM CHI '09), pp. 3443–3448, Boston, Mass, USA, April 2009.
[7]  C. Guo, J. E. Young, and E. Sharlin, “Touch and toys: new techniques for interaction with a remote group of robots,” in Proceedings of the 27th International Conference Extended Abstracts on Human Factors in Computing Systems (ACM CHI '09), pp. 491–500, Boston, Mass, USA, April 2009.
[8]  C. Guo and E. Sharlin, “Utilizing physical objects and metaphors for human robot interaction,” in Proceedings of the Artificial Intelligence and the Simulation of Behaviour Convention (AISB '08), AISB Press, Aberdeen, UK, April 2008.
[9]  P. Frei, V. Su, B. Mikhak, and H. Ishii, “Curlybot: designing a new class of computational toys,” in Proceedings of the Conference on Human Factors in Computing Systems ‘The Future is Here’ (ACM CHI '00), pp. 129–136, The Hague, The Netherlands, April 2000.
[10]  M. Kojima, M. Sugimoto, A. Nakamura, M. Tomita, H. Nii, and M. Inami, “Augmented coliseum: an augmented game environment with small vehicles,” in Proceedings of the 1st IEEE International Workshop on Horizontal Interactive Human-Computer Systems (IEEE TABLETOP '06), pp. 3–8, Adelaide, Australia, January 2006.
[11]  J. Leitner, M. Haller, K. Yun, W. Woo, M. Sugimoto, and M. Inami, “IncreTable, a mixed reality tabletop game experience,” in Proceedings of the International Conference on Advances in Computer Entertainment Technology (ACM ACE '08), pp. 9–16, Yokohama, Japan, December 2008.
[12]  D. Calife, J. L. Bernardes, and R. Tori, “Robot arena: an augmented reality platform for game development,” Computers in Entertainment, vol. 7, no. 1, Article ID 11, 2009.
[13]  J. Y. Han, “Low-cost multi-touch sensing through frustrated total internal reflection,” in Proceedings of the 18th Annual Symposium on User Interface Software and Technology (ACM UIST '05), pp. 115–118, Seattle, Wash, USA, 2005.
[14]  M. Kaltenbrunner and R. Bencina, “ReacTIVision: a computer-vision framework for table-based tangible interaction,” in Proceedings of the 1st ACM International Conference on Tangible and Embedded Interaction (ACM TEI '07), pp. 69–74, Baton Rouge, La, USA, 2007.
[15]  http://www.jbox2d.org.
[16]  http://www.mt4j.org/mediawiki/index.php/Main_Page.
[17]  http://bluecove.org.
[18]  P. Marshall, “Do tangible interfaces enhance learning?” in Proceedings of the 1st International Conference on Tangible and Embedded Interaction (ACM TEI '07), pp. 163–170, Baton Rouge, La, USA, 2007.
[19]  H. Mi, A. Krzywinski, and M. Sugimoto, “RoboStory: a tabletop mixed reality framework for children's role play storytelling,” in Proceedings of the 1st International Workshop on Interactive Storytelling for Children (ACM IDC '10), Association for Computing Machinery, Barcelona, Spain, June 2010.
[20]  T. Fong, I. Nourbakhsh, and K. Dautenhahn, “A survey of socially interactive robots,” Robotics and Autonomous Systems, vol. 42, no. 3-4, pp. 143–166, 2003.
