%0 Journal Article
%T RoboTable: An Infrastructure for Intuitive Interaction with Mobile Robots in a Mixed-Reality Environment
%A Haipeng Mi
%A Aleksander Krzywinski
%A Tomoki Fujita
%A Masanori Sugimoto
%J Advances in Human-Computer Interaction
%D 2012
%I Hindawi Publishing Corporation
%R 10.1155/2012/301608
%X This paper presents the design, development, and testing of a tabletop interface called RoboTable, an infrastructure supporting intuitive interaction with both mobile robots and virtual components in a mixed-reality environment. With a flexible software toolkit and specifically developed robots, the platform enables various modes of interaction with mobile robots. Using this platform, prototype applications are developed for two application domains: RoboPong investigates the efficiency of the RoboTable system in game applications, and ExploreRobot explores the possibility of using robots and intuitive interaction to enhance learning.

1. Introduction

In the past few years, much development has taken place in the research field of human-computer interaction. Several approaches in this field, including tabletop interaction, tangible user interfaces (TUIs), augmented reality, and mixed reality, show great promise for bringing new interaction styles to related research domains. The research presented in this paper integrates several of these approaches to create a mixed-reality environment for novel human-robot interaction (HRI).

Because the horizontal surface of a table permits the placement of objects, and its large area allows items to be spread, piled, and organized, digital tabletop user interfaces are becoming increasingly popular for supporting natural and intuitive interaction [1–3]. A TUI is a user interface in which a person interacts with digital information via the physical environment: it gives physical form to digital information and computation, facilitating the direct manipulation of bits [4]. Such physical interactions are natural and intuitive for human beings, because they enable two-handed input and can provide spatial and haptic feedback [5].

In this paper, we present RoboTable, an infrastructure that combines tabletop interaction with TUIs to support intuitive interaction with mobile robots. This framework creates a mixed-reality environment in which interaction with real robots and virtual objects can be combined seamlessly. It also extends the robot entity into the virtual world, enabling rich and complex HRI in various applications. Based on the RoboTable framework, we have developed two prototype applications for proof-of-concept purposes. RoboPong is a tabletop game in which a robot also participates as a player; the game simultaneously supports touch input for virtual objects and graspable interaction with robots. ExploreRobot is an application that investigates how robots and intuitive interaction can enhance learning.
%U http://www.hindawi.com/journals/ahci/2012/301608/