Exploring Sensor Gloves for Teaching Children Sign Language

DOI: 10.1155/2012/210507


Abstract:

This research investigates whether a computer and an alternative input device in the form of sensor gloves can be used in the process of teaching children sign language. The work is important because no current literature investigates how sensor gloves can be used to assist children in learning sign language. The research presented in this paper has been conducted by assembling hardware into sensor gloves and by designing software capable of (i) filtering out sensor noise, (ii) detecting intentionally posed signs, and (iii) correctly evaluating signals in signs posed by different children. Findings show that the devised technology can form the basis of a tool that teaches children sign language, and that there is potential for further research in this area.

1. Introduction

Communication involves the exchange of information, and this can only occur effectively if all participants use a common language [1]. Deaf people need an efficient nonauditory means of expressing and interpreting information in order to communicate, and sign language has proven effective in communicating across a broad spectrum of requirements, from everyday needs to sophisticated concepts. Australian sign language (Auslan) is the native sign language used in Australia, where this research has been conducted, but the work is equally applicable to other signed languages. Intuitive and efficient tools for teaching sign language are important both to ensure that hearing impaired people are able to develop extensive social networks with deaf and hearing people, and to ensure that deaf people are able to obtain the best possible education and services within the community.

This research investigates whether a computer, together with an alternative input device in the form of sensor gloves, can be used in the process of teaching children Auslan. Each sign consists of a number of parts: hand shape, place of articulation, orientation, path of movement, and nonsign components including facial expression [1]. For this research we focus on the hand shape component as one important aspect of a sign. We wish to use a computer because computers can act as an ideal medium for conveying details of sign language such as hand shapes, location, and hand movements. In addition, the learner can work at their own pace, at a place and time convenient to them. The learner can target the vocabulary relevant to their circumstances, and multimedia can provide supplementary information to enhance the learning experience.
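As an illustration of the kind of signal processing the abstract describes, the following is a minimal sketch, in Python, of how noisy flex-sensor readings from a glove might be smoothed and a deliberately held hand shape detected. The sensor names, window size, tolerance, and template hand shape are illustrative assumptions only and are not taken from the paper.

# A minimal sketch (not the authors' implementation) of glove-signal processing:
# smoothing noisy flex-sensor readings and detecting an intentionally held hand
# shape. All names, thresholds, and the template sign are illustrative assumptions.

from collections import deque
from statistics import mean

WINDOW = 5            # assumed moving-average window (samples)
HOLD_SAMPLES = 10     # assumed number of stable samples that count as "intentional"
TOLERANCE = 0.15      # assumed per-finger tolerance on normalised flex values (0..1)

# Hypothetical template for one hand shape: normalised bend per finger.
TEMPLATE = {"thumb": 0.2, "index": 0.9, "middle": 0.9, "ring": 0.9, "little": 0.9}

def smooth(history: deque) -> dict:
    """Average the last WINDOW raw readings per finger to filter out sensor noise."""
    return {finger: mean(sample[finger] for sample in history) for finger in TEMPLATE}

def matches(reading: dict, template: dict) -> bool:
    """True if every finger is within TOLERANCE of the template hand shape."""
    return all(abs(reading[f] - template[f]) <= TOLERANCE for f in template)

def detect_sign(stream) -> bool:
    """Return True once the template shape has been held for HOLD_SAMPLES frames."""
    history, held = deque(maxlen=WINDOW), 0
    for raw in stream:                      # raw: dict mapping finger -> normalised bend
        history.append(raw)
        if len(history) < WINDOW:
            continue
        held = held + 1 if matches(smooth(history), TEMPLATE) else 0
        if held >= HOLD_SAMPLES:
            return True
    return False

if __name__ == "__main__":
    # Simulated stream: noise, then the child holds the target hand shape.
    noise = [{f: 0.5 for f in TEMPLATE}] * 10
    posed = [{f: v + 0.02 for f, v in TEMPLATE.items()}] * 20
    print("Sign detected:", detect_sign(noise + posed))

A working system would additionally need per-child calibration so the same template can be matched against hands of different sizes, which relates to the paper's third requirement of correctly evaluating signals in signs posed by different children.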

References

[1]  G. R. Karlan, “Manual communication with those who can hear,” in Manual Communication: Implications for Education, H. Bornstein, Ed., pp. 151–185, Gallaudet University Press, Washington, DC, USA, 1990.
[2]  K. Ellis and K. Blashki, “Children, Australian sign language and the web; the possibilities,” pp. 281–287.
[3]  K. Ellis and K. Blashki, “The digital playground: kindergarten children learning sign language via multimedia,” AACE Journal, vol. 15, no. 3, pp. 225–253, 2007.
[4]  gizmag, “The acceleglove—capturing hand gestures in virtual reality,” September 2003, http://www.gizmag.com/go/2134/.
[5]  J. L. Hernandez-Rebollar, N. Kyriakopoulos, and R. W. Lindeman, “A new instrumented approach for translating American sign language into sound and text,” in Proceedings of the 6th IEEE International Conference on Automatic Face and Gesture Recognition, pp. 547–552, May 2004.
[6]  G. Grimes, Digital Data Entry Glove Interface Device, AT&T Bell Labs, 1983.
[7]  S. S. Fels and G. E. Hinton, “Glove-Talk: a neural network interface between a data-glove and a speech synthesizer,” IEEE Transactions on Neural Networks, vol. 4, no. 1, pp. 2–8, 1993.
[8]  J. Kramer and L. Leifer, “The talking glove: an expressive and receptive “verbal” communication aid for the deaf, deaf-blind, and nonvocal,” in Proceedings of the Computer Technology, Special Education, and Rehabilitation Conference, pp. 335–340, 1987.
[9]  J. Hernandez, N. Kyriakopoulos, and W. Lindeman, The AcceleGlove: A Whole-Hand Input Device for Virtual Reality, ACM Press, 2002.
[10]  D. Sturman and D. Zeltzer, “A survey of glove-based input,” IEEE Computer Graphics and Applications, vol. 14, no. 1, pp. 30–39, 1994.
[11]  M. Mohandes and S. Buraiky, “Automation of the Arabic sign language recognition using the powerglove,” AIML Journal, vol. 7, no. 1, pp. 41–46, 2007.
[12]  vrlogic, “CyberGlove,” March 2009, http://www.vrlogic.com/html/immersion/cyberglove.html.
[13]  R. M. McGuire, J. Hernandez-Rebollar, T. Starner, V. Henderson, H. Brashear, and D. S. Ross, “Towards a one-way American sign language translator,” in Proceedings of the 6th IEEE International Conference on Automatic Face and Gesture Recognition (FGR '04), pp. 620–625, May 2004.
[14]  J. L. Hernandez-Rebollar and E. Mendez, “Interactive American sign language dictionary,” in Proceedings of the ACM SIGGRAPH International Conference on Computer Graphics and Interactive Techniques, p. 26, Los Angeles, Calif, USA, August 2004.
[15]  S. Simon and K. Johnson, “Improving the efficacy of motion analysis as a clinical tool through artificial intelligence techniques,” in Pediatric Gait: A New Millennium in Clinical Care and Motion Analysis Technology, pp. 23–29, 2000.
[16]  Infusion Systems, “Infusion systems,” June 2012, http://www.Infusionsystems.com.
[17]  K. Ellis, M. Quigley, and M. Power, “Experiences in ethical usability testing with children,” Journal of Information Technology Research, vol. 1, no. 3, pp. 1–13, 2007.
