“If we haven't really understood human emotions, can we actually ... put them into machines?”

Bilge Mutlu, University of Wisconsin–Madison
say cheese

Robots are learning to smile and it's making humans cringe

Researchers from Columbia University have designed a robot that can smile just like a human, but just how humans will react to it remains to be seen.

by Sarah Wells

When it comes to expressing emotions, human faces have a lot to say.

Without speaking a word, we can signal our disgust to those around us with pursed lips and a furrowed brow. Our joy is expressed just as fast: Eyes open wide and lips upturned. Like an upside-down flag signaling otherwise unnoticeable distress on a ship at sea, our facial expressions act as a bridge between our internal life and the outside world.

While other humans are generally good at picking up on these small signals, we may soon have another group with which to communicate: intelligent robot companions. From service robots delivering our takeout to companion bots bonding with our grandparents, it’s becoming more important to design robots that use emotion-like signaling to efficiently relate to humans.

But achieving this feat is easier said than done, and getting it wrong could doom a robot to reside in the “uncanny valley” — forever ruining its hopes of a true human relationship.

Enter Eva, a blue-skinned, body-less robot designed by Boyuan Chen, a computer science Ph.D. student at Columbia University, and colleagues from Columbia’s Creative Machines Lab. With 12 tiny muscle-like actuators built into its face, Eva is prepared to express a myriad of human emotions — from fear to joy to disgust.

Chen tells Inverse that he and the rest of the team behind Eva aren’t even sure exactly how many emotions it can express.

“Can you tell me how many expressions you can make?” Chen asks me. “It’s a very hard question to answer and it’s the same for the robot. I'm happy to see this happen because if we know the exact number [of emotions], that means there are limits. We do not know the number, [so] there are no limits.”

Eva is made using a 3D-printed and assembled skull with a blue, flesh-like face mask placed on top.

Faraj et al.

Why create a smiling robot?

As for why you’d want to create a smiling robot at all, Chen says that developing robots that can hold their own in human-like interactions — such as reading distress in a human companion and responding accordingly with a comforting face — will be essential for improving human-machine interactions in the future.

Using emotions as a stepping stone toward building emotional and physical intelligence will help robots in the future intuitively know how to help humans, Chen explains, instead of needing to be explicitly programmed to do so.

“When robots see that other people may need help, you want a robot to actively help the people instead of me asking for help and programming it to help us,” he says.

Paula Niedenthal is an emotions researcher and professor of psychology at the University of Wisconsin–Madison. She tells Inverse that working toward emotionally expressive robots is important as well because humans will read emotions into these robots no matter what. Take, for example, the food delivery robots milling about the streets around UW–Madison.

“The robot's behavior rather than a facial expression can look really emotional because it accelerates when there’s danger,” says Niedenthal. “For example, if a robot is crossing the road and then comes across a car, there’s kind of a panicked rearing or running away. That makes you actually feel a kind of relationship with it, both sympathy and wanting to use that agent in the future.”

Eva can show a wide range of expressions, starting from six basic human emotions: anger, disgust, fear, joy, sadness, and surprise.

Robots that can expertly express these kinds of emotions will have a better chance at building human relationships and even persuading humans to do what they want, like taking their medicine on time.

But having a robot nail this interaction every time is an incredibly difficult task, says Bilge Mutlu, an associate professor of computer science and psychology at UW–Madison. Even humans don’t always get it right.

“When you look at the psychology literature, our understanding of emotions is incomplete,” Mutlu says. “And if we haven't really understood human emotions, can we actually simplify them and put them into machines? That's an open question.”

What is the uncanny valley?

For humans, these mismatched interactions can be uncomfortable or awkward, but with robots, they can be downright creepy, thanks to the uncanny valley.

The concept was proposed by Japanese roboticist Masahiro Mori in the 1970s to describe a humanoid robot that looks close — but not quite close enough — to a real human.

According to the uncanny valley concept, humans are comfortable interacting with a more abstract and cute robot (think Pixar’s WALL-E) or an incredibly human-like robot (think Battlestar Galactica’s Cylons). However, there exists a so-called valley between these two extremes where robots look neither truly human nor comfortably robotic. This feeling of the “uncanny” might be the shiver you get when walking through a wax museum or when watching a video of a robotic Einstein.

Psychologically, scholars have theorized that this discomfort with the not-quite-human may stem from an instinctual fear or distrust of dead human bodies, explains Niedenthal.

To steer clear of the uncanny valley, researchers will typically try to keep their robots on the cuter, more abstract side of the curve, says Mutlu. However, when it comes to programming emotions into a bot, he says, avoiding creepiness altogether is all but impossible.

Does Eva creep you out? You’re not alone.

Faraj et al.

How does it work — The uncanny valley was a challenge that Chen and colleagues were willing to take on when designing Eva.

To start, they chose to design Eva as just a disembodied head. While this may sound instinctively creepy, the researchers explain in their April 2021 paper on the project that this design choice helps viewers more clearly separate the robot from humans in their minds. For similar reasons, the team also chose to leave Eva’s skull exposed and to give its face a distinctly non-human color: blue.

Admittedly, Chen says this decision was influenced partially by the lab's affection for the 2009 movie “Avatar.”

In addition to their 2021 paper published in the journal HardwareX, the team recently presented a second paper at the 2021 International Conference on Robotics and Automation that further describes Eva’s latest hardware and software developments.

Underneath its blue skin, Eva is equipped with:

  • 42 “muscles”
  • Expressive, hand-milled eyes
  • Base knowledge of six basic human emotions (anger, disgust, fear, joy, sadness, and surprise)

Running an off-the-shelf learning framework on a small Raspberry Pi built into its skull, Eva is able to “look” at human expressions in person or through video and realistically mimic them. It does this by mapping the human face as a set of discrete points, similar to the dot arrays used in motion tracking for CGI in movies. Eva then works out how that pattern of dots would look on its own face and moves its facial actuators to bring the new expression to life.
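The team’s own software is open source, but to make the pipeline above concrete, here is a minimal Python sketch of the general idea — detect facial landmark “dots,” normalize them, and map them to motor commands. It is not the Columbia code: the landmark count, the linear mapping (a stand-in for the learned model), and the weights are all illustrative assumptions; only the 42-actuator figure comes from the list above.

```python
# Illustrative sketch only -- NOT the Creative Machines Lab's actual code.
# Idea: landmark "dots" from a face -> normalized feature vector -> actuator commands.
import numpy as np

NUM_LANDMARKS = 68   # assumed dlib-style 68-point landmark set
NUM_ACTUATORS = 42   # matches the "muscle" count reported for Eva


def normalize_landmarks(points: np.ndarray) -> np.ndarray:
    """Center the landmark cloud and scale it to unit size, so the mapping
    doesn't depend on where the face sits in the frame or how large it is."""
    centered = points - points.mean(axis=0)
    scale = np.linalg.norm(centered) or 1.0
    return (centered / scale).ravel()  # flatten (68, 2) -> (136,)


class ExpressionMimic:
    """Toy linear map from landmark features to actuator positions.
    The real system learns such a mapping; random weights stand in here."""

    def __init__(self, num_landmarks: int, num_actuators: int):
        rng = np.random.default_rng(0)
        self.weights = rng.normal(scale=0.1, size=(num_actuators, num_landmarks * 2))

    def actuator_commands(self, landmarks: np.ndarray) -> np.ndarray:
        features = normalize_landmarks(landmarks)
        raw = self.weights @ features
        # Squash to [0, 1]: each command is a fraction of that actuator's travel.
        return 1.0 / (1.0 + np.exp(-raw))


if __name__ == "__main__":
    # Stand-in for one frame's detected (x, y) landmarks.
    fake_landmarks = np.random.default_rng(1).uniform(0, 1, size=(NUM_LANDMARKS, 2))
    mimic = ExpressionMimic(NUM_LANDMARKS, NUM_ACTUATORS)
    print(mimic.actuator_commands(fake_landmarks)[:5])  # first few motor commands
```

In practice, the “stand-in” linear layer would be replaced by a model trained on examples of the robot watching its own face, which is what lets it translate a human expression into its own geometry rather than copying pixel positions directly.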

Altogether, the team reports that Eva can be manufactured for just $900.

What’s next — Eva is still in its infancy, but Chen says he’s excited to see how other researchers will use this open-source platform to design their own emotional robots — from changing the skin tone to programming real human interactions for Eva. In the future, Chen hopes that expressive robots like Eva will find a home in education or healthcare, helping to care for humans when others can’t.

And as for whether or not Chen finds Eva’s smile creepy, he says he could never be scared by a smile like that.

“This is hard for me because the robot is like our baby,” says Chen. “I absolutely love every part of it.”
