Recognizing Emotional Body Language Displayed by a Human-like Social Robot
Authors
D. McColl · G. Nejat
Autonomous Systems and Biomechatronics Laboratory, Department of Mechanical and Industrial Engineering, University of Toronto, 5 King's College Road, Toronto, ON M5S 3G8, Canada
e-mail: [email protected]

Abstract
Natural social human–robot interactions (HRIs) require that robots have the ability to perceive and identify complex human social behaviors and, in turn, also be able to display their own behaviors using similar communication modes. Recently, it has been found that body language plays an important role in conveying information about changes in human emotions during human–human interactions. Our work focuses on extending this concept to robotic affective communication during social HRI. Namely, in this paper, we explore the design of emotional body language for our human-like social robot, Brian 2.0. We develop emotional body language for the robot using a variety of body postures and movements identified in human emotion research. To date, only a handful of researchers have focused on the use of robotic body language to display emotions, with a significant emphasis being on the display of emotions through dance. Such emotional dance can be effective for small robots with large workspaces; however, it is not as appropriate for life-sized robots such as Brian 2.0 engaging in one-on-one interpersonal social interactions with a person. Experiments are presented to evaluate the feasibility of the robot's emotional body language based on human recognition rates. Furthermore, a unique comparison study is presented to investigate the perception of human body language features displayed by the robot with respect to the same body language features displayed by a human actor.
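The abstract does not include an implementation, but the workflow it describes, mapping emotions to body postures and evaluating them by human recognition rates, can be sketched roughly as follows. All joint names, angle values, and the recognition-rate computation in this sketch are illustrative assumptions, not details taken from the paper.

from collections import Counter

# Hypothetical mapping from emotion to a static body posture, expressed as
# joint angles in degrees. These joint names and values are illustrative only.
EMOTION_POSTURES = {
    "happiness": {"head_pitch": 10, "torso_lean": 5, "arm_raise": 40},
    "sadness":   {"head_pitch": -30, "torso_lean": -15, "arm_raise": 5},
    "anger":     {"head_pitch": -5, "torso_lean": 10, "arm_raise": 25},
    "fear":      {"head_pitch": -20, "torso_lean": -10, "arm_raise": 15},
}

def recognition_rate(displayed_emotion, participant_labels):
    """Fraction of participants whose label matches the displayed emotion."""
    counts = Counter(participant_labels)
    return counts[displayed_emotion] / len(participant_labels)

# Example: 30 hypothetical participants label the posture shown for "sadness".
labels = ["sadness"] * 24 + ["fear"] * 4 + ["anger"] * 2
print(recognition_rate("sadness", labels))  # 0.8

In such a setup, each emotion's posture would be sent to the robot's joint controllers, and the recognition rate over all participants would indicate how readable that emotional display is.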
Similar Resources
Personalized Emotional Expressions to Improve Natural Human-Humanoid Interaction
We need to prepare robots for the shift from laboratories and industrial environments into human residential areas. This is one of the reasons why current trends in the field of human-robot interaction are expanding into the social experience of users, often involving artificial emotions. Emotional technology in its two forms – as an expression of artificial emotions of the systems and as sys...
Classifying a Person's Degree of Accessibility From Natural Body Language During Social Human-Robot Interactions
For social robots to be successfully integrated and accepted within society, they need to be able to interpret human social cues that are displayed through natural modes of communication. In particular, a key challenge in the design of social robots is developing the robot's ability to recognize a person's affective states (emotions, moods, and attitudes) in order to respond appropriately durin...
Children Interpretation of Emotional Body Language Displayed by a Robot
Previous results show that adults are able to interpret different key poses displayed by the robot and also that changing the head position affects the expressiveness of the key poses in a consistent way. Moving the head down leads to decreased arousal (the level of energy), valence (positive or negative), and stance (approaching or avoiding), whereas moving the head up produces an increase along...
Mirror my emotions! Combining facial expression analysis and synthesis on a robot
Everyday human communication relies on a large number of different communication mechanisms like spoken language, facial expressions, body pose or gestures. Facial expressions are one of the main communication mechanisms and pass large amounts of information between human dialogue partners [22]. Therefore, the analysis and the synthesis of facial expressions are important steps towards an intui...
Dynamics, Stability Analysis and Control of a Mammal-Like Octopod Robot Driven by Different Central Pattern Generators
In this paper, we numerically studied both kinematic and dynamic models of a biologically inspired mammal-like octopod robot walking with a tetrapod gait. Three different nonlinear oscillators, working as central pattern generators, were used to drive the robot's legs. In addition, a new, relatively simple and efficient model was proposed and investigated. The introduced model of the gait ge...
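As a rough illustration of the central-pattern-generator idea mentioned in that abstract, the sketch below integrates a single Hopf oscillator whose output could drive one leg joint. The choice of oscillator and all parameter values are assumptions for demonstration; they are not the controllers studied in the paper.

import math

def hopf_step(x, y, mu=1.0, omega=2.0 * math.pi, dt=0.001):
    """One Euler step of a Hopf oscillator; its limit cycle has radius sqrt(mu)."""
    r2 = x * x + y * y
    dx = (mu - r2) * x - omega * y
    dy = (mu - r2) * y + omega * x
    return x + dx * dt, y + dy * dt

# Integrate for one second; x(t) settles onto a stable rhythm that could be
# mapped to a hip-joint angle. Coupling several such oscillators with fixed
# phase offsets is one common way to generate a walking gait.
x, y = 0.1, 0.0
trajectory = []
for _ in range(1000):
    x, y = hopf_step(x, y)
    trajectory.append(x)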
Journal: I. J. Social Robotics
Volume: 6, Issue: -
Pages: -
Publication date: 2014