
Emo can predict a human smile 839 milliseconds before it happens and smile back
Right now, in most humanoid robots, there’s a noticeable delay before they can smile back at a person, often because the robots are imitating a person’s face in real time. “I think a lot of people actually interacting with a social robot for the first time are disappointed by how limited it is,” says Chaona Chen, a human-robot interaction researcher at the University of Glasgow in Scotland. “Improving robots’ expression in real time is important.”
Cameras in the robot’s eyes let it detect subtleties in human expressions, which it then emulates using 26 actuators underneath its soft, blue face. To train Emo, the researchers first put it in front of a camera for a few hours. Much as a mirror helps people learn how their muscles shape their expressions, watching itself on camera while researchers ran random motor commands on the actuators let Emo learn the relationship between activating the actuators in its face and the expressions they created. “Then the robot knows, OK, if I want to make a smiley face, I should actuate these ‘muscles,’” Hu says.
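The training Hu describes is, in effect, self-modeling through motor babbling: send random commands, watch the resulting face, then invert that mapping. Below is a minimal, hypothetical sketch of the idea in Python; the landmark count, the linear face simulator and every function name are illustrative assumptions, not the team’s actual code.

```python
# Minimal sketch (not the authors' code) of the self-modeling stage:
# drive the face actuators with random commands, observe the resulting
# facial landmarks, and fit an inverse model so a desired expression can
# be translated back into actuator commands. The linear face simulator
# below stands in for the real camera plus 26-motor hardware.
import numpy as np

rng = np.random.default_rng(0)
N_ACTUATORS = 26          # motors under the robot's face (from the article)
N_LANDMARKS = 2 * 113     # hypothetical count of tracked 2-D facial keypoints

# Stand-in for the real robot: an unknown mapping from motor commands
# to observed landmark positions (the camera's view of the face).
_true_map = rng.normal(size=(N_ACTUATORS, N_LANDMARKS))

def observe_face(command: np.ndarray) -> np.ndarray:
    """Pretend to actuate the face and read landmarks off the camera."""
    return command @ _true_map + 0.01 * rng.normal(size=N_LANDMARKS)

# 1. Motor babbling: random commands paired with recorded observations.
commands = rng.uniform(-1.0, 1.0, size=(2000, N_ACTUATORS))
landmarks = np.array([observe_face(c) for c in commands])

# 2. Fit an inverse model, landmarks -> actuator commands (least squares).
inverse_model, *_ = np.linalg.lstsq(landmarks, commands, rcond=None)

# 3. Reproduce a target expression by running its landmarks through the inverse.
target = observe_face(rng.uniform(-1.0, 1.0, size=N_ACTUATORS))
predicted_command = target @ inverse_model
print("landmark error:",
      np.linalg.norm(observe_face(predicted_command) - target))
```

The real system would replace the linear stand-in with the physical face and a learned, nonlinear self-model, but the loop is the same: babble, observe, invert.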
Next, the researchers played videos of humans making facial expressions. By analyzing nearly 800 videos, Emo learned which early muscle movements signaled which expressions were about to occur. In thousands of further tests with hundreds of other videos, the robot correctly predicted what facial expression a human would make and re-created it in sync with the person more than 70 percent of the time. Beyond smiling, Emo can create expressions that involve raising the eyebrows and frowning, Hu says.
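Conceptually, this second stage is a prediction problem: from the first instants of a person’s facial motion, guess which expression is coming so the matching actuator command can be issued early enough to land in sync. The sketch below illustrates that idea on synthetic data with an off-the-shelf classifier; the clip sizes, labels and model choice are assumptions for illustration, not the researchers’ published method.

```python
# Illustrative sketch (not the published model) of the prediction stage:
# given the first few frames of facial-landmark motion, predict which
# expression is coming so the robot can prepare it ahead of time.
# The synthetic "clips" below stand in for the ~800 real human videos.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
N_LANDMARKS, EARLY_FRAMES, N_EXPRESSIONS = 50, 5, 4  # illustrative sizes

# Fake dataset: each clip is the early landmark motion (frames x landmarks,
# flattened), labeled with the expression the person eventually makes.
n_clips = 800
prototypes = rng.normal(size=(N_EXPRESSIONS, EARLY_FRAMES * N_LANDMARKS))
labels = rng.integers(0, N_EXPRESSIONS, size=n_clips)
clips = prototypes[labels] + 0.5 * rng.normal(
    size=(n_clips, EARLY_FRAMES * N_LANDMARKS))

X_train, X_test, y_train, y_test = train_test_split(
    clips, labels, test_size=0.25, random_state=0)

# Simple classifier from early motion to the upcoming expression.
predictor = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out prediction accuracy: {predictor.score(X_test, y_test):.2f}")

# At run time, the predicted label would be handed to the inverse face
# model (previous sketch) so the matching expression starts early enough
# to arrive in sync with the person's.
```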
The robot’s timely smiles could relieve some of the awkward and eerie feelings that delayed reactions in robots can cause. Emo’s blue skin, too, was designed to help it avoid the “uncanny valley” effect. If people think a robot is supposed to look like a human, “then they will always find some difference or become skeptical,” Hu says. Instead, with Emo’s rubbery blue face, people can “think about it as a new species. It doesn’t have to be a real person.”
The robot has no voice right now, but integrating generative AI chatbot capabilities, like those of ChatGPT, into Emo could make its reactions even more apt.
