Knepper examines emotions behind human-robot relations
By Bill Steele
When robots become common in our homes and workplaces, how will we relate to them? Will we treat them with dignity and respect or regard them as second-class citizens? Could we develop emotional relationships with them?
“We engineers have a duty to plan for this,” says Ross Knepper, assistant professor of computer science. Knepper addressed these questions in an April 24 lecture, “Autonomy, Embodiment and Anthropomorphism: the Ethics of Robotics,” part of a weekly series on “The Emergence of Intelligent Machines: Challenges and Opportunities.”
The robots we meet will be “embodied” – there’s a machine standing there; it’s not just the disembodied voice of an artificial intelligence running in a box. And they will be “autonomous,” making their own decisions about what to do: You’re not controlling the machine with a joystick. (This is not free will, Knepper emphasized; since the robot is following its programming, what you see is “time-shifted human operation.”)
But as a result of this combination, we anthropomorphize, responding to the robot as if it were human.
Knepper has built small robots that assemble Ikea furniture. Part of their programming is to ask a human for help when needed: “Please locate part number F-13.” He has found that when this happens, visitors to his lab will converse with the robots as if they were small children.
Some people paint faces on their Roomba robot vacuum cleaners. Robots made in the shape of dogs or stuffed animals sometimes elicit the same responses as real animals.
The elephant in the room turns out to be sex robots (recently introduced in Japan). In surveys, many people say they are a good thing because they could cut down on prostitution and prevent the spread of STDs. But most agree that such robots should not express emotions.
Whatever kinds of robots are involved, we may find ourselves in the “uncanny valley” – a familiar concern for makers of animated films – where something that seems almost, but not quite, human arouses uneasiness or even revulsion.
Knepper opened with a currently viral video of a little girl who talks to a water heater, thinking it is a robot, and ends up hugging it. But then there was hitchBOT, a robot sent out to hitchhike across the United States that ended up vandalized. “An experiment in humanity that failed,” Knepper said.
The future may depend on how engineers teach robots to behave, but the bottom line, Knepper said, is: “How we treat robots mirrors how we treat people.”
It is not a problem for engineers alone.
The lecture series, although open to the public, is part of a course, CS 4732, “Ethical and Social Issues in AI.”