
Practical Application of a Tongue-Operated Joystick Device with Force Feedback Mechanism

DOI: 10.4236/ica.2019.103006, PP. 90-106

Keywords: Tongue, Joystick, Tactile Sensation, Mobility, Reaction Force Feedback


Abstract:

The human tongue has excellent motor control and tactile sensitivity. For individuals with severe disabilities, a tongue-operated interface device can be used to operate life-support equipment such as powered wheelchairs and robotic manipulators. A joystick-type device can directly translate various tongue motions into the behavior of external equipment. In addition, the user can interactively communicate with the equipment through tactile feedback, which helps the user control the equipment safely and skillfully. Considering these factors, in a previous study [1] we developed a novel tongue-operated joystick device with a reaction force feedback mechanism. We described the design process, including an analysis of human tongue movement and tactile sensation, and demonstrated the fundamental performance of reaction force feedback with the prototype device. In this study, we discuss the shape of the operational part manipulated by the tongue. Two types of operational tools are prepared, and their operability and the perceptibility of reaction force feedback are compared. Furthermore, we confirm the effectiveness of reaction force feedback for operating the joystick device safely and skillfully while controlling a mobile robot in an unknown environment.
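The abstract describes two control-loop ideas: translating joystick deflection directly into equipment motion, and returning a reaction force to the tongue so the user can sense the environment. The sketch below is a hypothetical illustration of both, not the authors' implementation: it maps a 2-axis deflection to differential-drive wheel speeds and computes a feedback force that grows as the robot approaches an obstacle. All constants (`MAX_SPEED`, `WHEEL_BASE`, `MAX_FORCE`, `SAFE_DIST`) are assumed values.

```python
# Hypothetical sketch of the two control ideas in the abstract (not the
# authors' actual method): joystick-to-motion mapping plus distance-based
# reaction force feedback. All parameter values are assumptions.

MAX_SPEED = 0.5    # m/s, assumed robot speed limit
WHEEL_BASE = 0.3   # m, assumed distance between drive wheels
MAX_FORCE = 1.0    # N, assumed feedback actuator limit
SAFE_DIST = 1.0    # m, distance beyond which no feedback is applied


def joystick_to_wheels(x, y):
    """Map joystick deflection (x: lateral, y: forward, both in [-1, 1])
    to left/right wheel speeds of a differential-drive robot."""
    v = y * MAX_SPEED                      # forward velocity command
    w = x * (2 * MAX_SPEED / WHEEL_BASE)   # turn rate from lateral deflection
    left = v - w * WHEEL_BASE / 2
    right = v + w * WHEEL_BASE / 2
    return left, right


def reaction_force(obstacle_dist):
    """Reaction force pushed back against the tongue: zero beyond
    SAFE_DIST, rising linearly to MAX_FORCE at contact."""
    if obstacle_dist >= SAFE_DIST:
        return 0.0
    return MAX_FORCE * (1.0 - obstacle_dist / SAFE_DIST)
```

With this mapping, a pure forward deflection (`x=0, y=1`) drives both wheels at `MAX_SPEED`, while the feedback force gives the user a tactile warning that strengthens as the robot nears an obstacle, consistent with the safety role the abstract attributes to reaction force feedback.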

References

[1]  Ohba, T. and Kajikawa, S. (2017) Tongue-Operated Joystick Device with Reaction Force Feedback Mechanism. 2017 IEEE International Conference on Advanced Intelligent Mechatronics, Munich, 3-7 July 2017, 207-212. https://doi.org/10.1109/AIM.2017.8014019
[2]  Rudigkeit, N., Gebhard, M. and Graser, A. (2015) Evaluation of Control Modes for Head Motion-Based Control with Motion Sensors. 2015 IEEE International Symposium on Medical Measurements and Applications, Turin, 7-9 May 2015, 135-140. https://doi.org/10.1109/MeMeA.2015.7145187
[3]  Barea, R., Boquete, L., Mazo, M. and Lopez, E. (2002) System for Assisted Mobility Using Eye Movements Based on Electrooculography. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 10, 209-218. https://doi.org/10.1109/TNSRE.2002.806829
[4]  Krolak, A. and Strumillo, P. (2008) Vision-Based Blink Monitoring System for Human-Computer Interfacing. 2008 IEEE International Conference on Human System Interactions, Krakow, Poland, 25-27 May 2008, 994-998. https://doi.org/10.1109/HSI.2008.4581580
[5]  Iturrate, I., Antelis, J.M., Kubler, A. and Minguez, J. (2009) A Noninvasive Brainactuated Wheelchair Based on a P300 Neurophysiological Protocol and Automated Navigation. IEEE Transactions on Robotics, 25, 614-627. https://doi.org/10.1109/TRO.2009.2020347
[6]  Sasaki, M., Suhaimi, M.S.A.B., Ito, S. and Rusydi, M.I. (2015) Robot Control System Based on Electrooculography and Electromyogram. Journal of Computer and Communications, 3, 113-120. https://doi.org/10.4236/jcc.2015.311018
[7]  Shima, K., Fukuda, O. and Tsuji, T. (2012) EMG-Based Control for a Feeding Support Robot Using a Probabilistic Neural Network. 2012 IEEE RAS/EMBS International Conference on Biomedical Robotics and Biomechatronics, Rome, 24-27 June 2012, 1788-1793. https://doi.org/10.1109/BioRob.2012.6290876
[8]  Wei, L. and Hu, H. (2011) A Hybrid Human-Machine Interface for Hands-Free Control of an Intelligent Wheelchair. International Journal of Mechatronics and Automation, 1, 97-111. https://doi.org/10.1504/IJMA.2011.040040
[9]  Kandel, E.R., Schwartz, J.H. and Jessell, T.M. (2000) Principles of Neural Science. 4th Edition, McGraw-Hill, New York.
[10]  Kimura, T. (2012) User Interface Detects Tongue Movement with Kinect.
https://www.youtube.com/watch?v=jWIl3CtH6SE
[11]  Saponas, T.S., Kelly, D., Parviz, B.A. and Tan, D.S. (2009) Optically Sensing Tongue Gestures for Computer Input. 22nd Annual ACM Symposium on User Interface Software and Technology, Victoria, 4-7 October 2009, 177-180. https://doi.org/10.1145/1622176.1622209
[12]  Draghici, O., Batkin, I., Bolic, M. and Chapman, I. (2013) The MouthPad: A Tongue-Computer Interface. 2013 IEEE International Symposium on Medical Measurements and Applications, Gatineau, 4-5 May 2013, 315-319. https://doi.org/10.1109/MeMeA.2013.6549759
[13]  Struijk, L.N.S.A. (2006) An Inductive Tongue Computer Interface for Control of Computers and Assistive Devices. IEEE Transactions on Biomedical Engineering, 53, 2594-2597. https://doi.org/10.1109/TBME.2006.880871
[14]  Struijk, L.N.S.A., Egsgaard, L.L., Lontis, R., Gaihede, M. and Bentsen, B. (2017) Wireless Intraoral Tongue Control of an Assistive Robotic Arm for Individuals with Tetraplegia. Journal of NeuroEngineering and Rehabilitation, 14, 110. https://doi.org/10.1186/s12984-017-0330-2
[15]  Lund, M.E., Christensen, H.V., Caltenco, H.A., Lontis, E.R., Bentsen, B. and Andreasen Struijk, L.N.S.A. (2010) Inductive Tongue Control of Powered Wheelchairs. IEEE 32nd Engineering in Medicine and Biology Society Conference, Buenos Aires, Argentina, 31 August-4 September 2010, 3361-3364. https://doi.org/10.1109/IEMBS.2010.5627923
[16]  Kim, J., Huo, X., Minocha, J., Holbrook, J., Laumann, A. and Ghovanloo, M. (2012) Evaluation of a Smartphone Platform as a Wireless Interface between Tongue Drive System and Electric-Powered Wheelchairs. IEEE Transactions on Biomedical Engineering, 59, 1787-1796. https://doi.org/10.1109/TBME.2012.2194713
[17]  Tang, H. and Beebe, D.J. (1999) Tactile Sensitivity of the Tongue on Photolithographically Fabricated Patterns. Proceedings of the First Joint BMES/EMBS Conference Serving Humanity, Advancing Technology, Atlanta, GA, 13-16 October 1999, 633.
[18]  Kaczmarek, K.A. (2011) The Tongue Display Unit (TDU) for Electrotactile Spatiotemporal Pattern Presentation. Scientia Iranica, 18, 1476-1485. https://doi.org/10.1016/j.scient.2011.08.020
[19]  Sampaio, E., Maris, S. and Bach-y-Rita, P. (2001) Brain Plasticity: Visual Acuity of Blind Persons via the Tongue. Brain Research, 908, 204-207. https://doi.org/10.1016/S0006-8993(01)02667-1
[20]  Vuillerme, N., Pinsault, N., Chenu, O., Fleury, A., Payan, Y. and Demongeot, J. (2009) A Wireless Embedded Tongue Tactile Biofeedback System for Balance Control. Pervasive and Mobile Computing, 5, 268-275. https://doi.org/10.1016/j.pmcj.2008.04.001
[21]  Droessler, N.J., Hall, D.K., Tyler, M.E. and Ferrier, N.J. (2001) Tongue-Based Electrotactile Feedback to Perceive Objects Grasped by a Robotic Manipulator: Preliminary Results. 23rd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Istanbul, 25-28 October 2001, 1404-1407.
