For their final thesis project, they set out to explore expressiveness in robots in order to humanize them, which would lead people to "interact with them more naturally," as they explained in their project.
Garat explained that, beyond the design, they implemented a Facial Action Coding System on the robot’s face inspired by the work of Dr. Paul Ekman, the psychologist whose research underpins the TV series “Lie to Me” and who dedicated much of his life to studying the universality of facial expressions across diverse human groups. “His research shows that the way humans smile is independent of ethnic origin,” added Garat, who serves as Assistant Coordinator of the Electronics Laboratories at the School of Engineering.
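The article does not describe how the students wired FACS into their robot, but the general idea can be sketched: each expression is decomposed into FACS Action Units (AUs), and each AU drives an actuator on the face. The sketch below is purely illustrative; the AU sets are simplified from Ekman's published descriptions, and every servo name and angle is a made-up placeholder, not the project's actual hardware.

```python
# Illustrative sketch: map Ekman-style expressions to FACS Action Units,
# then to hypothetical servo angles for an animatronic face.
# AU numbers follow FACS; all servo names and angles are invented.

# Simplified subset of Action Units per expression, with an
# intensity in [0, 1] for each AU.
EXPRESSION_AUS = {
    "happiness": {6: 0.8, 12: 1.0},           # cheek raiser, lip corner puller
    "sadness":   {1: 0.7, 4: 0.5, 15: 0.9},   # inner brow raiser, brow lowerer,
                                              # lip corner depressor
    "surprise":  {1: 1.0, 2: 1.0, 26: 0.8},   # brow raisers, jaw drop
}

# Hypothetical mapping: AU -> (servo name, neutral angle, full-intensity angle).
AU_SERVOS = {
    1:  ("brow_inner", 90, 60),
    2:  ("brow_outer", 90, 55),
    4:  ("brow_lower", 90, 110),
    6:  ("cheek",      90, 70),
    12: ("lip_corner", 90, 50),
    15: ("lip_corner", 90, 120),
    26: ("jaw",        90, 130),
}

def servo_angles(expression: str) -> dict:
    """Return servo name -> target angle (degrees) for a named expression."""
    angles = {}
    for au, intensity in EXPRESSION_AUS[expression].items():
        name, neutral, full = AU_SERVOS[au]
        # Linear interpolation between the neutral and full-intensity position.
        angles[name] = neutral + intensity * (full - neutral)
    return angles

print(servo_angles("happiness"))
```

In a real build, each computed angle would be sent to a servo controller; the point here is only that an "expression" reduces to a table of coordinated actuator positions, which is also why, as noted below, the robot displays sadness without being sad.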
The result can thus be used to study facial micro-expressions in order to detect moods or deception. Beyond experimenting with this technology, Garat noted that the idea is to eventually develop a range of products that can be marketed within the region.
The project consists of an animatronic platform, that is, a device that simulates a living being in both appearance and behavior. “We decided to build an animatronic face, although our research can be extended to all types of movements for a robot. In other words, with a few modifications, we could build an arm or any other part of the body,” stated Gaspar.
Despite robotics' many attempts to realize the recurring human fantasies depicted in film and literature, nothing so far indicates that robots will ever acquire the ability to experience feelings.
According to Gaspar, “a facial expression simply means a set of movements to a robot, not an emotional state. In other words, a robot can display an expression of sadness but will not actually be sad.”