EMYS™ evokes positive emotions, and people feel safe around him. They perceive both his appearance and his behaviors as human-like.
When situated in a home environment, EMYS™ can open up new possibilities: encouraging people to be more active, helping children learn, or simply assisting with everyday activities.
EMYS™ can take advantage of a built-in dynamic PAD-based model of emotion. The robot's mood is computed as the integral of all incoming emotional stimuli. Thus, EMYS™ engages in interaction with humans on both a rational and an emotional level.
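As an illustration, the mood-as-integral idea can be sketched as a leaky discrete-time accumulator over PAD (pleasure, arousal, dominance) values. The class names and the decay constant below are assumptions for the sketch, not EMYS' actual implementation:

```python
from dataclasses import dataclass

@dataclass
class PADState:
    pleasure: float = 0.0
    arousal: float = 0.0
    dominance: float = 0.0

class MoodModel:
    """Discrete-time mood integrator: the mood decays toward neutral
    each step while accumulating every incoming emotional stimulus."""

    def __init__(self, decay: float = 0.95):
        self.decay = decay  # assumed decay rate toward the neutral mood
        self.mood = PADState()

    def update(self, stimulus: PADState) -> PADState:
        # Integrate the new stimulus while letting the old mood fade.
        self.mood = PADState(
            pleasure=self.decay * self.mood.pleasure + stimulus.pleasure,
            arousal=self.decay * self.mood.arousal + stimulus.arousal,
            dominance=self.decay * self.mood.dominance + stimulus.dominance,
        )
        return self.mood
```

With a decay below 1.0, repeated positive stimuli raise the mood toward a bounded level, and the mood relaxes back to neutral once stimuli stop, which matches the integrative behavior described above.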
His vision system comprises a high-resolution camera mounted in his nose and a Microsoft Kinect device acting as an advanced motion sensor. These allow him to detect humans and report the positions of particular parts of their bodies. The Kinect also provides gesture recognition algorithms, 3D face tracking, and recognition of some facial features.
Both video streams can be processed by algorithms that provide basic image processing (e.g., blurring, thresholding, and morphological operations), object detection (e.g., human faces or certain body parts), color and movement detection, and much more.
His audio system comprises a microphone mounted in his middle disc and a premium speaker. His facial movements are synchronized with uttered phrases according to the spoken visemes. An optional Kinect sensor adds speech recognition and detection of voice direction using its built-in microphone array.
Physical contact with robots and their autonomous behavior have a large impact on perceived human-likeness and credibility. Being able to touch EMYS™ and see him react can greatly improve users' general attitude toward him.
Information gathered by the robot could potentially be used to stimulate everyday human-robot interaction. A robotic companion could become an interface to the external world, enabling the human to use the above-mentioned media in an accessible manner. This is especially important for people unfamiliar with modern means of communication.
Perhaps the most advanced affective mind available, FAtiMA (FearNot! Affective Mind Architecture), based on the Ortony, Clore, and Collins (OCC) appraisal theory of emotions, has also been successfully integrated with EMYS' control system. The decision system can also be assisted by a human operator.
Thanks to the integrated computational model of emotion with appraisal capabilities, information gathered from the Internet, such as news, weather forecasts, and messages, can be affectively assessed to extract its emotional meaning.
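In heavily simplified form, such affective assessment of incoming text could be sketched as a lexicon lookup. The `VALENCE` table and the `appraise` function are purely hypothetical stand-ins for the OCC-based appraisal performed by the actual system:

```python
# Hypothetical valence lexicon; the real system appraises events through
# the OCC model in FAtiMA rather than by keyword matching.
VALENCE = {
    "win": 0.8, "sunny": 0.6, "record": 0.4,
    "storm": -0.5, "crash": -0.9, "delay": -0.3,
}

def appraise(text: str) -> float:
    """Average the valence of known words; 0.0 means emotionally neutral."""
    scores = [VALENCE[w] for w in text.lower().split() if w in VALENCE]
    return sum(scores) / len(scores) if scores else 0.0
```

The resulting score could then be fed into the mood model as a pleasure stimulus, so that, for example, a gloomy weather forecast nudges the robot's mood downward.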