The study’s lead author, Yuhang Hu, a PhD student at Columbia Engineering in Hod Lipson’s lab. Credit: John Abbott/Columbia Engineering

Imagine a resident walking up to a robot with a human-like head and receiving a quick smile in return. It’s a scenario that soon might become commonplace across senior living settings, thanks to groundbreaking research from Columbia University.

In a new study appearing in Science Robotics, the Columbia team introduced Emo, a robot that can both anticipate and mimic human facial expressions in real time. Emo can predict a forthcoming smile about 840 milliseconds before a resident actually smiles and then mirror the expression as it unfolds, much like a companion who truly understands the resident's emotions.

The team, led by Hod Lipson, PhD, faced significant hurdles in creating Emo. They had to develop a mechanically versatile robotic face, equipped with 26 actuators to produce a wide range of nuanced expressions. But the real breakthrough was teaching the robot to recognize and respond to human facial expressions naturally and promptly.
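The study does not publish its control code, but the anticipate-and-mirror loop it describes can be pictured roughly as in the sketch below. Every name here (capture_face_frame, predict_expression, the lookahead constant, the placeholder models) is hypothetical and for illustration only; only the 840-millisecond horizon and the 26 actuators come from the reporting above.

```python
# Conceptual sketch only -- not the authors' code. The models are stubbed
# with random placeholders; in the real system they are learned from data.

import time
import numpy as np

LOOKAHEAD_MS = 840        # horizon at which the study says a smile can be predicted
NUM_ACTUATORS = 26        # the robotic face reportedly uses 26 actuators
CONTROL_PERIOD_S = 0.033  # ~30 Hz control loop (assumed)


def capture_face_frame():
    """Hypothetical: grab a frame from the eye cameras (stubbed with noise)."""
    return np.random.rand(64, 64)


def predict_expression(frame_history):
    """Hypothetical: a learned model maps recent frames to the expression
    expected roughly LOOKAHEAD_MS in the future (random placeholder here)."""
    return np.random.rand(NUM_ACTUATORS)


def expression_to_actuators(expression):
    """Hypothetical: convert the target expression into commands
    for the 26 facial actuators."""
    return np.clip(expression, 0.0, 1.0)


def send_actuator_commands(commands):
    """Hypothetical hardware interface; here it just prints a summary."""
    print(f"driving {len(commands)} actuators, mean position {commands.mean():.2f}")


frames = []
for _ in range(5):  # a few iterations for demonstration
    frames.append(capture_face_frame())
    target = predict_expression(frames[-10:])  # anticipate the upcoming expression
    send_actuator_commands(expression_to_actuators(target))
    time.sleep(CONTROL_PERIOD_S)
```

Because the commands are issued for a moment slightly in the future, the mirrored expression can land at roughly the same instant as the human's rather than a beat behind.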

“I think predicting human facial expressions accurately is a revolution in human-robot interaction,” said Yuhang Hu, the study’s lead author and a PhD student at Columbia Engineering.

“When a robot can co-express emotions with people in real-time, it not only enhances the quality of interaction but also fosters trust between humans and robots,” he added.

So, how does Emo make such a human connection? Its human-like head is covered with soft silicone skin and has high-resolution cameras built into each eye, allowing for lifelike interactions and crucial nonverbal communication through eye contact.

In the future, as robots like Emo become more prevalent in senior living communities, they could play a vital role in supporting residents’ well-being and social interactions. Such a robotic companion would not only seem to understand residents’ emotions; it also could respond to them in ways that feel genuinely empathetic.

The team currently is exploring the integration of verbal communication into the robot, potentially leveraging a large language model such as ChatGPT. As robots increasingly resemble humans in behavior, Lipson emphasized the importance of addressing related ethical implications.
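Nothing about that integration has been published yet, so the sketch below is only a guess at how spoken input and the detected expression might be folded into a prompt for a language model; every function here (transcribe_speech, describe_expression, generate_reply) is hypothetical, and a real system would call an actual speech and LLM API rather than these stubs.

```python
# Purely speculative sketch of wiring a language model alongside the
# expression loop; all functions below are hypothetical placeholders.

def transcribe_speech(audio_chunk):
    """Hypothetical speech-to-text stage (stubbed)."""
    return "How are you today?"


def describe_expression(actuator_state):
    """Hypothetical: summarize the currently detected or mirrored expression."""
    return "smiling"


def generate_reply(utterance, expression):
    """Hypothetical call into a large language model such as ChatGPT;
    a real integration would send this prompt to the model's API."""
    prompt = (f"The resident said: '{utterance}'. "
              f"They appear to be {expression}. Reply warmly in one sentence.")
    return f"[LLM reply to prompt: {prompt}]"


print(generate_reply(transcribe_speech(None), describe_expression(None)))
```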

“While this capability opens doors to numerous beneficial applications, from senior living assistants to educational tools, it’s crucial for both developers and users to approach it with caution and ethical awareness,” he noted.

“Yet, it’s undeniably thrilling. Picture a world where conversing with a robot feels as natural and comforting as chatting with a friend.”