Recognizing robotic emotions: facial versus body posture expression and the effects of context and learning
The ALIZ-E project is focused on creating a social robot to interact with diabetic children. To interact socially with children, a robot needs to be able to express emotions. The iCat robot already has a good set of emotional expressions, and now the Nao robot will also be programmed to express emotions. The iCat expresses emotions through facial expressions, whereas the Nao will use body posture. First, movements for the Nao were created, and these postures were validated as good emotional expressions. Fourteen children between 8 and 9 years old participated in a second study, in which the emotional expressions of the iCat and the Nao were compared. Besides facial expression versus body posture, the effects of context and of a second exposure were also investigated. For both robots, the movements were recognized as the correct emotions significantly more often than chance. No difference in recognition accuracy was found between the iCat and the Nao. When the emotions were expressed in a matching context, recognition rates were significantly higher than without context. The number of interactions also plays a role in correct recognition of emotions: in the second session, the emotions were recognized significantly better than during the first contact.