%0 Journal Article
%T Blind Sailors' Spatial Representation Using an On-Board Force Feedback Arm: Two Case Studies
%A Mathieu Simonnet
%A Eamonn Ryall
%J Advances in Human-Computer Interaction
%D 2013
%I Hindawi Publishing Corporation
%R 10.1155/2013/163718
%X Using a vocal, auditory, and haptic application designed for maritime navigation, blind sailors are able to set up and manage their voyages. However, how best to present this information remains a crucial issue for better understanding spatial cognition and improving navigation without vision. In this study, we asked two participants to use SeaTouch on board and to manage the ship's heading during navigation in order to follow a predefined itinerary. Two conditions were tested. In the first, blind sailors consulted the updated ship positions on the virtual map presented in an allocentric frame of reference (i.e., facing north). In the second, they used the force feedback device in an egocentric frame of reference (i.e., facing the ship's heading). Spatial performance tended to show that the egocentric condition was better for controlling the course during displacement, whereas the allocentric condition was more efficient for building a mental representation and remembering it after the navigation task.
1. Introduction. Nowadays, it is well known that blind people can take advantage of virtual navigation. Indeed, various experiments have shown interesting results in different types of environments. Jansson and Pedersen [1] studied a virtual map of the states of North America and showed that, despite the participants' motivation, it was difficult to navigate with a haptic mouse (VTPlayer) that provided only cutaneous feedback (two matrices of pins). Gutierrez [2] used a force feedback device (GRAB Project) to assess a geographical haptic representation of Madrid (Spain), asking twelve blind people to create and depict a route between two points. Participants completed the tasks without major difficulty and stated that the application was attractive and easy to use because of the combination of force feedback and audio information. At the same time, Magnusson and Rassmus-Gröhn [3] investigated the transfer between an egocentric haptic virtual environment and a real street in a district of Lund (Sweden). They asked participants to prepare and then complete an itinerary from a bus stop to a music hall. Results showed that blind people who were good at navigating with a cane were also good at exploring with the haptic device. Then, in the street, participants were able to complete the itinerary learnt in the virtual environment; a transfer therefore occurred between the virtual and real worlds. In this respect, Lahav and Mioduser [4] compared nonvisual spatial representations obtained after having explored real and virtual
%U http://www.hindawi.com/journals/ahci/2013/163718/