A camera captures images of a person's face. Through the algorithm developed by the Creative Machines Lab, the robot can use the details of the face in the image to learn and mimic the facial expression in real time. (Creative Machines Lab/Columbia Engineering)
For the past five years, AI researchers at Columbia University's Creative Machines Lab have been working on EVA, an animatronic robot that can watch you and mimic your facial expressions.
"The idea for EVA took shape a few years ago, when my students and I began to notice that the robots in our lab were staring back at us through plastic, googly eyes," said Columbia engineering professor Hod Lipson, who is the director of the Creative Machines Lab, in a news release.
Lipson also noticed humanizing features such as googly eyes and nametags affixed to a restocking robot that he saw at a grocery store.
"People seemed to be humanizing their robotic colleagues by giving them eyes, an identity, or a name," he said. "This made us wonder, if eyes and clothing work, why not make a robot that has a super-expressive and responsive human face?"
EVA doesn't have a set of predefined expressions. Instead, a camera captures images of a person's face, and an algorithm developed by the Creative Machines Lab lets EVA use the details of the face in each image to learn and mimic the expression in real time.
To teach EVA to control its own face, the researchers also filmed hours of footage of the robot while its facial motors made random faces. From that footage, EVA learned how its motor commands shape its expressions, all without human supervision.
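The lab's actual models are not public in this article, but the training loop described above can be sketched in miniature: the robot "babbles" random motor commands, observes the resulting facial configuration, and fits an inverse model that maps a desired expression back to motor commands. Everything below is an illustrative assumption, with a simple linear mapping standing in for the real camera-and-silicone pipeline, and the feature and motor counts chosen for the toy example (the article states EVA's face uses 10 motors).

```python
# Toy sketch of self-supervised "motor babbling" -- not the Creative
# Machines Lab's actual code. A hidden linear map stands in for the
# physics of EVA's face; the learner never sees it directly.
import numpy as np

rng = np.random.default_rng(0)

N_MOTORS = 10    # EVA's face is driven by 10 motors (per the article)
N_FEATURES = 6   # hypothetical expression features, e.g. landmark offsets

# Unknown "physics": how motor commands deform the face.
TRUE_MAP = rng.normal(size=(N_FEATURES, N_MOTORS))

def observe_face(commands: np.ndarray) -> np.ndarray:
    """Stand-in for the camera watching EVA's own face."""
    return TRUE_MAP @ commands

# 1. Babbling phase: issue random commands, record the observed faces.
commands = rng.uniform(-1, 1, size=(500, N_MOTORS))
faces = commands @ TRUE_MAP.T

# 2. Fit the inverse model (face features -> motor commands) by least squares.
inverse, *_ = np.linalg.lstsq(faces, commands, rcond=None)

# 3. Mimicry: given a target expression, predict commands that reproduce it.
target = observe_face(rng.uniform(-1, 1, size=N_MOTORS))
predicted_commands = target @ inverse
reproduced = observe_face(predicted_commands)
print(np.allclose(reproduced, target, atol=1e-6))
```

In this sketch the inverse model can reproduce any reachable expression because the toy face is linear; the real system reportedly learns a similar command-to-expression relationship from video with neural networks rather than least squares.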
Prior to EVA, the vast majority of animatronics had relied on hard materials such as metal, which move too stiffly to mimic human tissue. For EVA, the researchers used a soft blue silicone material for the robot's skin, with the colour inspired by the Blue Man Group.
The researchers relied on 3D-printed parts to manufacture EVA's components. Behind the silicone skin is a series of nylon cables connected to 10 motors within EVA's face, mimicking the facial muscles that humans have. Motors on EVA's neck also enable it to nod its head.
"The greatest challenge in creating EVA was designing a system that was compact enough to fit inside the confines of a human skull while still being functional enough to produce a wide range of facial expressions," said Columbia undergraduate student Zanwar Faraj, who led the team that constructed the motor system, in a news release.
EVA was also designed to be relatively inexpensive. The total cost of materials is approximately US$900, a far cry from similar robots, which have cost up to US$165,000.
With the growing presence of robots, AI and automation in our daily lives, the team hopes that EVA can be a step in the right direction when it comes to making these machines more emotionally engaging and humanlike.
"There is a limit to how much we humans can engage emotionally with cloud-based chatbots or disembodied smart-home speakers," said Lipson. "Our brains seem to respond well to robots that have some kind of recognizable physical presence."