Child - EMYS™ interaction

The perception of EMYS™ (FLASH's head) when interacting with humans was examined in an experiment involving 50 schoolchildren aged 8-11 years. Scenarios of EMYS™ behavior were prepared to encourage the children to play with the robot. The robot was controlled via Gostai Studio, in which a simple scenario with two games was implemented. The experiment was designed to examine both how the robot's emotional expressions affect the interaction and whether the children are able to decode the intended emotions correctly. The robot was programmed to work autonomously and carry out two game scenarios; each participant went through both. The first, called "making faces", encouraged the children to repeat facial expressions made by EMYS™. In the second ("toy") scenario, the robot expressed an emotion and asked the children to show a toy of the color corresponding to that expression. Four labeled boxes with toys of different colors were placed near the subject: green toys corresponded to joy, red to anger, blue to sadness, and yellow toys were to be shown when the expression did not fit any of the three previous groups. Using its vision system, EMYS™ recognized the color of the toy and reacted accordingly, i.e. by praising or criticizing. After each session the children watched the videotaped interaction from the first game and were asked which emotions EMYS™ had shown. The experimental procedure thus combined an affect description assessment ("making faces") with an affect matching assessment ("toy"). The interaction with a single child lasted about 5-10 minutes. All sessions were recorded by two cameras from two different angles. After the interaction the participants were interviewed and answered questions on demographic data, how they perceived EMYS™, and how they liked the interaction.
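To make the "toy" scenario concrete, the following is a minimal sketch of its control logic: the robot expresses an emotion, the child shows a toy, the vision system classifies the toy's color, and the robot praises or criticizes depending on whether the color matches the expressed emotion. The actual robot was driven from Gostai Studio, so this Python sketch, including the robot interface (express_emotion, detect_toy_color, say) and the spoken feedback, is purely illustrative and not the study's real implementation.

```python
import random

# Color expected for each expressed emotion; yellow covers the expressions
# that do not fit joy, anger, or sadness.
EXPECTED_COLOR = {
    "joy": "green",
    "anger": "red",
    "sadness": "blue",
    "surprise": "yellow",
    "disgust": "yellow",
    "fear": "yellow",
}


class MockRobot:
    """Stand-in for the robot interface so the sketch can run on its own."""

    def express_emotion(self, emotion):
        print(f"[EMYS] expressing {emotion}")

    def detect_toy_color(self):
        # In the real setup the vision system classified the toy held up
        # by the child; here we just pick a color at random.
        return random.choice(["green", "red", "blue", "yellow"])

    def say(self, text):
        print(f"[EMYS] {text}")


def toy_scenario_round(robot, emotion):
    """One round: show an emotion, observe the child's toy, react."""
    robot.express_emotion(emotion)
    shown_color = robot.detect_toy_color()
    if shown_color == EXPECTED_COLOR[emotion]:
        robot.say("Well done, that is the right color!")      # praise
    else:
        robot.say("Hmm, that toy does not match my expression.")  # criticize


def run_toy_scenario(robot, rounds=6):
    for emotion in random.sample(list(EXPECTED_COLOR), rounds):
        toy_scenario_round(robot, emotion)


if __name__ == "__main__":
    run_toy_scenario(MockRobot())
```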

From a psychological viewpoint, the study on children interacting with EMYS™ serves several purposes. Firstly, it investigates the emotional expressiveness of EMYS™. The robotic head can show six different emotions (anger, sadness, surprise, joy, disgust, fear). Because of its design as a kind of "turtle head", realized with three rotating and flapping discs and thus differing significantly from a human face, its means of displaying these emotions differ from those of humans (as described by Ekman & Friesen, 1975) and are limited in certain respects (e.g. it cannot raise the corners of its mouth or wrinkle its nose). It was investigated whether these emotions could be recognized by schoolchildren, and whether the recognition rates differ from those reported for schoolchildren recognizing human facial expressions of the same emotions. Furthermore, the association of variables such as the children's engagement or personality with the recognition rates was examined. To reduce method-related biases, two different tasks were used, representing the two main methods in emotion recognition research (affect description assessment and affect matching assessment; see Gross & Ballif, 1991). Secondly, the study investigated the children's engagement in the interaction with the autonomous EMYS™ and the variables that influence it, which is relevant for building longer-term relationships with EMYS™. Variables that may affect engagement include the age, sex, and personality of the child, the perceived personality of EMYS™, the perceived emotionality of EMYS™, prior experience with robots, and others. Furthermore, it was investigated whether the recognition of EMYS's emotional expressions is relevant for the children's engagement.
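As an illustration of the kind of analysis this design supports, the sketch below computes per-emotion recognition rates and relates each child's overall recognition score to an engagement rating. The data layout (one row per child and emotion, with a correct/incorrect flag and a per-child engagement score) and the example values are assumptions for demonstration only, not the study's actual coding scheme or results.

```python
import pandas as pd
from scipy.stats import pearsonr

# Hypothetical coding: child id, emotion shown by EMYS, whether the child
# labeled it correctly, and an overall engagement rating for that child.
data = pd.DataFrame({
    "child":      [1, 1, 2, 2, 3, 3],
    "emotion":    ["joy", "fear", "joy", "fear", "joy", "fear"],
    "correct":    [1, 0, 1, 1, 1, 0],
    "engagement": [4.5, 4.5, 3.0, 3.0, 5.0, 5.0],
})

# Recognition rate for each expression shown by the robot.
recognition_rates = data.groupby("emotion")["correct"].mean()
print(recognition_rates)

# Per-child recognition score versus rated engagement.
per_child = data.groupby("child").agg(
    recognition=("correct", "mean"),
    engagement=("engagement", "first"),
)
r, p = pearsonr(per_child["recognition"], per_child["engagement"])
print(f"correlation r = {r:.2f}, p = {p:.3f}")
```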

PUBLICATION:

  • J. Kędzierski, R. Muszyński, C. Zoll, A. Oleksy, and M. Frontkiewicz. 2013. "EMYS - Emotive Head of a Social Robot". International Journal of Social Robotics, pages 237-249, Springer Netherlands, doi: 10.1007/s12369-013-0183-1 [LINK-OpenAccess].