
Designing of a Personality Based Emotional Decision Model for Generating Various Emotional Behavior of Social Robots

DOI: 10.1155/2014/630808


Abstract:

All humans feel emotions, but individuals express them differently because each has a different personality. We design an emotional decision model that focuses on the personality of the individual. The personality-based emotional decision model is composed of four linear dynamic systems: a reactive dynamic system, an internal dynamic system, an emotional dynamic system, and a behavior dynamic system. Each dynamic system calculates output values that reflect the personality, which is encoded in its system, input, and output matrices. These responses are reflected in the final emotional behavior through the behavior dynamic system, as in humans. The final emotional behavior contains multiple emotional values, so a social robot can show various emotional expressions. We perform experiments using a cyber robot system to verify that the personality-based emotional decision model generates various emotions according to the personality.

1. Introduction

Social robots communicate with various humans in daily life environments. They play with humans as toys and friends [1–9], provide information as guides [10, 11] or teachers [12], and help humans [13, 14]. In this process, emotional communication is the key to making humans regard robots as friends, and for this reason researchers have designed artificial emotional systems for social robots. Breazeal designed a three-dimensional emotional space model consisting of arousal, valence, and stance, and developed Kismet and Leonardo [15, 16]. Miwa et al. proposed a three-dimensional emotional space model consisting of activation, pleasantness, and certainty [17]; an emotional vector calculated by a quadratic differential equation decides the final emotion of the humanoid robot WE-4RII. Lee et al. proposed a linear affect-expression space model consisting of surprise, angriness, and sadness for Doldori [18]. Karg et al. designed an affect model based on a piecewise linear system to model transitions of affect [19]. Kanoh et al. proposed an emotional model using a three-dimensional emotional space for ifbot [20]. Becker-Asano and Wachsmuth designed the “WASABI” affect simulation architecture, which uses a three-dimensional emotion space called PAD (Pleasure-Arousal-Dominance) space [21]. In these studies, social robots generate and express their emotions during human-robot interaction. However, emotions are known to depend on contextual and cultural background, and human emotional systems produce different emotions and behaviors under the same external stimuli. One of the key factors behind these individual differences is personality.
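As a reading aid, the four-subsystem cascade described in the abstract can be pictured as standard discrete-time linear systems, x(k+1) = A x(k) + B u(k), y(k) = C x(k), with the personality entering through the choice of the system matrix A, input matrix B, and output matrix C. The sketch below is a minimal illustration under that assumption; the class, the two-dimensional states, all matrix values, and the wiring between the four subsystems are hypothetical placeholders, not the authors' implementation.

```python
# Minimal sketch (not the paper's code): four cascaded discrete-time linear
# systems whose A, B, C matrices encode a "personality". All matrix values
# and the inter-system wiring below are illustrative assumptions.
import numpy as np

class LinearDynamicSystem:
    """x(k+1) = A x(k) + B u(k);  y(k) = C x(k)."""

    def __init__(self, A, B, C):
        self.A, self.B, self.C = map(np.asarray, (A, B, C))
        self.x = np.zeros(self.A.shape[0])  # internal state starts at rest

    def step(self, u):
        self.x = self.A @ self.x + self.B @ np.asarray(u)
        return self.C @ self.x

# Hypothetical 2-state subsystems. A different personality would use different
# A, B, C entries; e.g., an A with eigenvalues closer to 1 makes responses
# decay more slowly (longer-lasting moods).
reactive  = LinearDynamicSystem(A=0.2 * np.eye(2), B=np.eye(2),       C=np.eye(2))
internal  = LinearDynamicSystem(A=0.9 * np.eye(2), B=0.5 * np.eye(2), C=np.eye(2))
emotional = LinearDynamicSystem(A=0.7 * np.eye(2), B=np.eye(2),       C=np.eye(2))
behavior  = LinearDynamicSystem(A=0.5 * np.eye(2), B=np.eye(2),       C=np.eye(2))

stimulus = np.array([1.0, 0.0])      # e.g., a positive external stimulus
for k in range(5):
    r = reactive.step(stimulus)      # fast, short-lived reaction
    i = internal.step(stimulus)      # slower internal response
    e = emotional.step(r + i)        # emotion driven by both responses (assumed wiring)
    b = behavior.step(e)             # behavior vector holding multiple emotional values
    print(k, b)
```

Because every stage is linear, the response to a given stimulus is determined entirely by these matrices, which is what lets one fixed architecture produce different emotional behavior for each personality.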

References

[1]  T. Shibata, K. Ohkawa, and K. Tanie, “Spontaneous behavior of robots for cooperation—emotionally intelligent robot system,” in Proceedings of the 13th IEEE International Conference on Robotics and Automation, vol. 3, pp. 2426–2431, Minneapolis, Minn, USA, April 1996.
[2]  M. Fujita and H. Kitano, “Development of an autonomous quadruped robot for robot entertainment,” Autonomous Robots, vol. 5, no. 1, pp. 7–18, 1998.
[3]  H. S. Ahn, I. K. Sa, D. W. Lee, and D. Choi, “A playmate robot system for playing the rock-paper-scissors game with humans,” Artificial Life and Robotics, vol. 16, no. 2, pp. 142–146, 2011.
[4]  M. Fujita, “On activating human communications with pet-type robot AIBO,” Proceedings of the IEEE, vol. 92, no. 11, pp. 1804–1813, 2004.
[5]  M. Häring, N. Bee, and E. André, “Creation and evaluation of emotion expression with body movement, sound and eye color for humanoid robots,” in Proceedings of the 20th IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN ’11), pp. 204–209, Atlanta, Ga, USA, August 2011.
[6]  B. Gonsior, S. Sosnowski, C. Mayer et al., “Improving aspects of empathy and subjective performance for HRI through mirroring facial expressions,” in Proceedings of the 20th IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN ’11), pp. 350–356, Atlanta, Ga, USA, August 2011.
[7]  H. S. Ahn, D. W. Lee, D. Choi et al., “Development of an android for singing with motion capturing,” in Proceedings of the 37th Annual Conference of the IEEE Industrial Electronics Society (IECON ’11), pp. 63–68, 2011.
[8]  H. S. Ahn, D. W. Lee, D. Choi, D. Y. Lee, H. Lee, and M. H. Baeg, “Development of an incarnate announcing robot system using emotional interaction with humans,” International Journal of Humanoid Robotics, vol. 10, no. 2, pp. 1–24, 2013.
[9]  M. Scheeff, J. Pinto, K. Rahardja, S. Snibbe, and R. Tow, “Experiences with Sparky, a social robot,” in Socially Intelligent Agents, K. Dautenhahn, A. Bond, L. Cañamero, and B. Edmonds, Eds., vol. 3 of Multiagent Systems, Artificial Societies, and Simulated Organizations, pp. 173–180, Springer, New York, NY, USA, 2002.
[10]  W. Burgard, A. B. Cremers, D. Fox et al., “The interactive museum tour-guide robot,” in Proceedings of the 15th National Conference on Artificial Intelligence, pp. 11–18, 1998.
[11]  M. Bennewitz, F. Faber, D. Joho, M. Schreiber, and S. Behnke, “Towards a humanoid museum guide robot that interacts with multiple persons,” in Proceedings of the 5th IEEE-RAS International Conference on Humanoid Robots (ICHR ’05), pp. 418–423, Tsukuba, Japan, December 2005.
[12]  J. Solis, M. Bergamasco, K. Chida, S. Isoda, and A. Takanishi, “The anthropomorphic flutist robot WF-4 teaching flute playing to beginner students,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA ’04), vol. 1, pp. 146–151, May 2004.
[13]  H. S. Ahn, I. K. Sa, and J. Y. Choi, “PDA-based mobile robot system with remote monitoring for home environment,” IEEE Transactions on Consumer Electronics, vol. 55, no. 3, pp. 1487–1495, 2009.
[14]  H. S. Ahn, Y. M. Beak, I. K. Sa, W. S. Kang, J. H. Na, and J. Y. Choi, “Design of reconfigurable heterogeneous modular architecture for service robots,” in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS ’08), pp. 1313–1318, Nice, France, September 2008.
[15]  C. Breazeal, Designing Sociable Robots, MIT Press, Cambridge, Mass, USA, 2002.
[16]  W. D. Stiehl, L. Lalla, and C. Breazeal, “A ‘somatic alphabet’ approach to ‘sensitive skin’,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA ’04), vol. 3, pp. 2865–2870, May 2004.
[17]  H. Miwa, K. Itoh, M. Matsumoto et al., “Effective emotional expressions with emotion expression humanoid robot WE-4RII: integration of humanoid robot hand RCH-1,” in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS ’04), vol. 3, pp. 2203–2208, October 2004.
[18]  H. S. Lee, J. W. Park, and M. J. Chung, “A linear affect-expression space model and control points for mascot-type facial robots,” IEEE Transactions on Robotics, vol. 23, no. 5, pp. 863–873, 2007.
[19]  M. Karg, S. Haug, K. Kühnlenz, and M. Buss, “A dynamic model and system-theoretic analysis of affect based on a piecewise linear system,” in Proceedings of the 18th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN ’09), pp. 238–244, Toyama, Japan, October 2009.
[20]  M. Kanoh, S. Kato, and H. Itoh, “Facial expressions using emotional space in sensitivity communication robot ‘ifbot’,” in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS ’04), vol. 2, pp. 1586–1591, October 2004.
[21]  C. Becker-Asano and I. Wachsmuth, “Affect simulation with primary and secondary emotions,” in Intelligent Virtual Agents, H. Prendinger, J. Lester, and M. Ishizuka, Eds., vol. 5208 of Lecture Notes in Computer Science, pp. 15–28, Springer, Berlin, Germany, 2008.
[22]  I. Wilson, “The artificial emotion engine, driving emotional behavior,” in Proceedings of the AAAI Spring Symposium on Artificial Intelligence and Interactive Entertainment, pp. 76–80, 2000.
[23]  S. Kshirsagar and N. Magnenat-Thalmann, “A multilayer personality model,” in Proceedings of the 2nd International Symposium on Smart Graphics (SMARTGRAPH ’02), pp. 107–115, ACM, Hawthorne, NY, USA, 2002.
[24]  H. Ushida, Y. Hirayama, and H. Nakajima, “Emotion model for life-like agent and its evaluation,” in Proceedings of the 15th National Conference on Artificial Intelligence/10th Conference on Innovative Applications of Artificial Intelligence (AAAI ’98/IAAI ’98), pp. 62–69, Menlo Park, Calif, USA, July 1998.
[25]  J. Bates, “The role of emotion in believable agents,” Communications of the ACM, vol. 37, no. 7, pp. 122–125, 1994.
[26]  S. Reilly, Believable social and emotional agents [Ph.D. thesis], Department of Computer Science, Carnegie Mellon University, Pittsburgh, Pa, USA, 1996.
[27]  K. R. Scherer, “The role of culture in emotion-antecedent appraisal,” Journal of Personality and Social Psychology, vol. 73, no. 5, pp. 902–922, 1997.
[28]  J. L. Tsai, Y. Chentsova-Dutton, L. Freire-Bebeau, and D. E. Przymus, “Emotional expression and physiology in European Americans and Hmong Americans,” Emotion, vol. 2, no. 4, pp. 380–397, 2002.
[29]  H. S. Ahn and J. Y. Choi, “Emotional behavior decision model based on linear dynamic systems for intelligent service robots,” in Proceedings of the 16th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN ’07), pp. 786–791, Jeju, Republic of Korea, August 2007.
[30]  H. S. Ahn, P. J. Kim, J. H. Choi et al., “Emotional head robot with behavior decision model and face recognition,” in Proceedings of the International Conference on Control, Automation and Systems (ICCAS ’07), pp. 2719–2724, Seoul, Republic of Korea, October 2007.
[31]  H. S. Ahn, Y. M. Baek, J. H. Na, and J. Y. Choi, “Multi-dimensional emotional engine with personality using intelligent service robot for children,” in Proceedings of the International Conference on Control, Automation and Systems (ICCAS ’08), pp. 2020–2025, Seoul, Republic of Korea, October 2008.
[32]  H. S. Ahn, J. Y. Choi, and D. W. Lee, “Universal emotional behavior decision system for social robots,” in Discrete Event Robots, iConceptPress, Hong Kong, 2012.
[33]  D. Lee, H. S. Ahn, and J. Y. Choi, “A general behavior generation module for emotional robots using unit behavior combination method,” in Proceedings of the 18th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN ’09), pp. 375–380, Toyama, Japan, October 2009.
