The Human-Robot Collaboration and Companionship Lab has developed a soft robotic skin that can change its texture to express the robot's internal state, such as its emotions. The prototype changes its texture through a combination of goosebumps and spikes.

Robot prototype will let you feel how it’s ‘feeling’

The robot prototype expresses its "anger" with both its eyes and its skin, which turns spiky when fluidic actuators beneath the surface inflate in response to its "mood."

In 1872, Charles Darwin published his third major work on evolutionary theory, “The Expression of the Emotions in Man and Animals,” which explores the biological aspects of emotional life.

In it, Darwin writes: “Hardly any expressive movement is so general as the involuntary erection of the hairs, feathers and other dermal appendages … it is common throughout three of the great vertebrate classes.” Nearly 150 years later, the field of robotics is starting to draw inspiration from those words.

“The aspect of touch has not been explored much in human-robot interaction, but I often thought that people and animals do have this change in their skin that expresses their internal state,” said Guy Hoffman, assistant professor and Mills Family Faculty Fellow in the Sibley School of Mechanical and Aerospace Engineering (MAE).

Inspired by this idea, Hoffman and students in his Human-Robot Collaboration and Companionship Lab have developed a prototype of a robot that can express “emotions” through changes in its outer surface. Its skin covers a grid of texture units (TUs) whose shapes are controlled by fluidic actuators, based on a design developed in the lab of Hoffman’s MAE colleague Rob Shepherd.

Their work is detailed in a paper, “Soft Skin Texture Modulation for Social Robots,” presented in April at the International Conference on Soft Robotics in Livorno, Italy. Doctoral student Yuhan Hu was lead author; the paper was featured May 16 in IEEE Spectrum, a publication of the Institute of Electrical and Electronics Engineers.

Hoffman, whose TEDx talk on “Robots with ‘soul’” has been viewed nearly 3 million times, said the inspiration for designing a robot that gives off nonverbal cues through its outer skin comes from the animal world, based on the idea that robots shouldn’t be thought of in human terms.

“I’ve always felt that robots shouldn’t just be modeled after humans or be copies of humans,” he said. “We have a lot of interesting relationships with other species. Robots could be thought of as one of those ‘other species,’ not trying to copy what we do but interacting with us with their own language, tapping into our own instincts.”

Part of our relationship with other species is our understanding of the nonverbal cues animals give off – like the raising of fur on a dog’s back or a cat’s neck, or the ruffling of a bird’s feathers. Those are unmistakable signals that the animal is somehow aroused or angered; the fact that they can be both seen and felt strengthens the message.

“Yuhan put it very nicely: She said that humans are part of the family of species, they are not disconnected,” Hoffman said. “Animals communicate this way, and we do have a sensitivity to this kind of behavior.”

At the same time, there’s a lot of technology being developed featuring active materials, which can change their shape and properties on demand. In fact, one of the innovators in that area – Shepherd, leader of the Organic Robotics Lab – happens to work about five Upson Hall doors away from Hoffman.

“This is one of the nice things about being here at Cornell,” Hoffman said. “Rob is right down the hall, and this is how I discovered this technology. This kind of close collaboration is in a large part what I was so excited to join Cornell for.”

Guy Hoffman, assistant professor of mechanical and aerospace engineering, heads the Human-Robot Collaboration and Companionship Lab.

Hoffman and Hu’s design features an array of texture units in two shapes, goosebumps and spikes, which map to different emotional states. The actuation units for both shapes are integrated into texture modules, with fluidic chambers connecting bumps of the same kind.
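To make that mapping concrete, here is a minimal sketch of how a software layer might drive such a skin. It is illustrative only and not taken from the paper: names such as TextureUnit, EMOTION_MAP and set_pressure() are hypothetical placeholders for whatever pump and valve interface a real build would use, and the specific emotion-to-shape assignments are assumptions.

```python
# Hedged sketch (not the authors' code): map an emotion label to which group of
# texture units to inflate, and how much. All names here are hypothetical.

from dataclasses import dataclass


@dataclass
class TextureUnit:
    """One group of fluidic chambers sharing a shape (goosebumps or spikes)."""
    shape: str             # "goosebumps" or "spikes"
    pressure: float = 0.0  # normalized inflation level, 0.0 (flat) to 1.0 (fully raised)

    def set_pressure(self, level: float) -> None:
        # Clamp to the valid range before commanding the (hypothetical) pump driver.
        self.pressure = max(0.0, min(1.0, level))


# Assumed mapping of emotional states to (shape, inflation level); the exact
# assignments are for illustration only.
EMOTION_MAP = {
    "calm":    ("goosebumps", 0.2),
    "excited": ("goosebumps", 0.8),
    "angry":   ("spikes", 1.0),
}


def express(emotion: str, units: list[TextureUnit]) -> None:
    """Inflate the texture units whose shape matches the emotion; deflate the rest."""
    shape, level = EMOTION_MAP[emotion]
    for unit in units:
        unit.set_pressure(level if unit.shape == shape else 0.0)


if __name__ == "__main__":
    skin = [TextureUnit("goosebumps"), TextureUnit("spikes")]
    express("angry", skin)
    print([(u.shape, u.pressure) for u in skin])  # spikes raised, goosebumps flat
```

The key design idea this sketch tries to capture is that each shape is plumbed as its own group of connected chambers, so selecting an "emotion" reduces to choosing which group to pressurize and to what level.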

The team tried two different actuation control systems, with minimizing size and noise level a driving factor in both designs. “One of the challenges,” Hoffman said, “is that a lot of shape-changing technologies are quite loud, due to the pumps involved, and these make them also quite bulky.”

Hoffman does not have a specific application for his robot with texture-changing skin mapped to its emotional state. At this point, just proving that this can be done is a sizable first step. “It’s really just giving us another way to think about how robots could be designed,” he said.

Future challenges include scaling the technology to fit into a self-contained robot – whatever shape that robot takes – and making the technology more responsive to the robot’s immediate emotional changes.

“At the moment, most social robots express [their] internal state only by using facial expressions and gestures,” the paper concludes. “We believe that the integration of a texture-changing skin, combining both haptic [feel] and visual modalities, can thus significantly enhance the expressive spectrum of robots for social interaction.”

Media Contact: Jeff Tyson