
**9**

**Speech Communication with Humanoids: How People React and How We Can Build the System**

Yosuke Matsusaka

*National Institute of Advanced Industrial Science and Technology (AIST)*

*Japan*

**1. Introduction**

Robots are expected to help increase the quality of our life. Robots are already widely used in industry to liberate humans from repetitive labour. In recent years, entertainment has been gaining momentum as an application in which robots can be used to increase people's quality of life (Moon, 2001)(Wada et al., 2002).

We have been developing the robot TAIZO as a demonstrator of human health exercises. TAIZO encourages the human audience to engage in the health exercise by demonstrating it (Matsusaka et al., 2009). During a demonstration, the robot TAIZO and the human demonstrator stand in front of the human audience and demonstrate together. For this to work, the human demonstrator has to control the robot while they themselves are demonstrating the exercise to the audience.

A quick and easy method for controlling the robot is therefore required. Furthermore, in a human-robot collaborative demonstration, the method of communication used between the human and the robot can be used to affect the audience.

In this chapter, we introduce the robot TAIZO and its various functions. We also evaluate the effect of using voice commands compared to keypad input during the robot-human collaborative demonstration. In Section 2 we explain the social background behind the development of TAIZO. In Section 2.5 we discuss the effects of using voice commands compared to key input in human-robot collaborative demonstration. In Section 2.6 we present an overview of the system used in TAIZO. Section 2.7 contains the evaluation and discussion of the results from experiments in which we measured the effect of using voice commands through simulated demonstrations. Finally, in Section 2.10 and Section 2.11 we discuss the benefits and problems of applying a humanoid robot to this application. In the latter part of the chapter, we discuss how to develop the communication function for the humanoid robot.

Recently, the "behavior-based" scripting method has been applied in many practical robotic systems. The application presented by Brooks (1991) used a hierarchical structure model; more recent applications (Kaelbling, 1991)(Yartsev et al., 2005) use a state transition model (a finite state automaton) to model the situation of the system. The developer incrementally builds up the script by adding behaviors, each of which fits one small situation. A diverse situation-understanding ability can be realized as a result of this long-term incremental development.

The behavior-based scripting method can also be applied to communication robots by incorporating speech input into the situation model. Application of the behavior-based
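The state-transition scripting model described above can be sketched as follows. This is a minimal illustrative example, not TAIZO's actual implementation: the class and behavior names are hypothetical, but the structure follows the text — each registered behavior covers one small situation (a state/event pair), and the script grows by incrementally adding behaviors.

```python
# Hypothetical sketch of behavior-based scripting as a finite state automaton:
# each behavior handles one (state, event) situation, and the developer
# extends the script incrementally by registering new behaviors.

class BehaviorScript:
    def __init__(self, initial_state):
        self.state = initial_state
        self.behaviors = {}  # (state, event) -> (action, next_state)

    def add_behavior(self, state, event, action, next_state):
        """Incrementally add one behavior that fits one small situation."""
        self.behaviors[(state, event)] = (action, next_state)

    def handle(self, event):
        """Dispatch an input event (e.g. a recognized voice command)."""
        action, next_state = self.behaviors.get(
            (self.state, event), (None, self.state))  # unknown events: stay put
        if action is not None:
            action()
        self.state = next_state
        return self.state


# Usage: a toy exercise-demonstration script (behavior names are illustrative).
log = []
script = BehaviorScript("idle")
script.add_behavior("idle", "start", lambda: log.append("greet audience"), "demo")
script.add_behavior("demo", "next", lambda: log.append("show next exercise"), "demo")
script.add_behavior("demo", "stop", lambda: log.append("bow and finish"), "idle")

script.handle("start")  # idle -> demo
script.handle("next")   # demo -> demo
script.handle("stop")   # demo -> idle
```

Because the dispatch table is keyed by (state, event), adding a behavior for a new situation never requires touching the existing ones, which is what makes the long-term incremental development described above practical.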

