Teaching robots to solve their own problems

“I’m sorry, Dave. You told me to take out the trash, but there’s a big box blocking the door to the trash chute.” If the speaker were a human child, you might just say, “So go back and move the box.”

But what do you say to a robot? If robots are to do useful work for us, “we want to get all these robots thinking for themselves about how to solve problems,” said Ross Knepper, assistant professor of computer science. Knepper is about to embark on a project to make robots and other artificial intelligences “persistently autonomous,” so they can deal with unexpected problems.

The work is supported by a $360,000, three-year grant from the U.S. Air Force Office of Scientific Research’s Young Investigator Research Program, designed to enhance early career development of outstanding young investigators in science and engineering.

Previous approaches to dealing with unexpected obstacles have included brute force (just plunge ahead, no matter what’s in the way), giving the robot many alternative routes to the goal, and, of course, asking a human for help.

“It’s really about introspection,” Knepper explained. “The robot must look at a model of itself and be able to reason about itself.”

Knepper plans to base corrective action on what he calls “inverse semantics,” using what the robot might ask a human to do as a basis for instructions to itself. It could then use its model of itself to draw up a plan of action.
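The inverse-semantics idea described above can be sketched in a few lines: the robot phrases the help it would ask a human for, then translates that same request into a plan for itself. This is a minimal illustration only; all of the names here (`Obstacle`, `generate_request`, `plan_self_action`) are assumptions for the sketch, not Knepper’s actual implementation.

```python
from dataclasses import dataclass


@dataclass
class Obstacle:
    name: str
    movable: bool


def generate_request(obstacle):
    """Phrase the help the robot *would* ask a human for."""
    if obstacle.movable:
        return f"please move the {obstacle.name} out of the doorway"
    return f"please find another route around the {obstacle.name}"


def plan_self_action(request):
    """Turn the natural-language request back into a self-directed plan."""
    if request.startswith("please move"):
        return ["grasp object", "relocate object", "resume task"]
    return ["replan route", "resume task"]


box = Obstacle("box", movable=True)
request = generate_request(box)
print(request)                    # please move the box out of the doorway
print(plan_self_action(request))  # ['grasp object', 'relocate object', 'resume task']
```

The key design point is that the request to the human and the instruction to the robot share one representation, so the robot’s model of itself can consume the same language it would use to ask for help.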

As an early test, Knepper will collaborate with Carla Gomes, director of the Cornell Institute for Computational Sustainability, to deploy autonomous drones to monitor the condition of vegetation in Kenya. Self-navigating aircraft will collect images of agricultural land, and computer analysis of those images can detect drought, a predictor of poverty pockets that will require intervention by government or nongovernmental organizations. If the drones are driven off course by weather or restricted by national borders or no-fly zones, they must be able to recognize that there’s a problem and revise their course to return to the original target area and complete the mission.
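The deviation-and-recovery behavior described above can be sketched as two checks: detect that the drone has drifted outside the survey area, and compute a waypoint back onto it. The bounding-box model of the target area and the function names are assumptions for illustration, not part of the actual system.

```python
def off_course(position, target_area, tolerance=0.5):
    """True if the drone has drifted outside the target bounding box.

    target_area is ((xmin, ymin), (xmax, ymax)); tolerance allows small
    drift before triggering a replan.
    """
    (xmin, ymin), (xmax, ymax) = target_area
    x, y = position
    return not (xmin - tolerance <= x <= xmax + tolerance and
                ymin - tolerance <= y <= ymax + tolerance)


def corrective_waypoint(position, target_area):
    """Clamp the current position back onto the target area's boundary."""
    (xmin, ymin), (xmax, ymax) = target_area
    x, y = position
    return (min(max(x, xmin), xmax), min(max(y, ymin), ymax))


area = ((0.0, 0.0), (10.0, 10.0))
print(off_course((12.0, 5.0), area))           # True: blown past the east edge
print(corrective_waypoint((12.0, 5.0), area))  # (10.0, 5.0): nearest point back inside
```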

This work will also address what researchers working with robotic vehicles call the “coverage problem,” which arises in projects like spraying for mosquitoes or automated harvesting. The drones will draw a map as they go, which will help if they need to return and fill in missed areas. They may also rely on satellite images and previous maps.
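The map-as-you-go bookkeeping behind the coverage problem can be sketched as a survey grid: mark each cell as it is imaged, then list the cells still missing so the drone can revisit them. The grid representation is an assumption for illustration.

```python
def missed_cells(rows, cols, visited):
    """Return the cells in a rows x cols survey grid not yet covered."""
    return [(r, c) for r in range(rows) for c in range(cols)
            if (r, c) not in visited]


visited = {(0, 0), (0, 1), (1, 0)}   # cells already imaged on the first pass
gaps = missed_cells(2, 2, visited)
print(gaps)  # [(1, 1)] -- the one cell left to revisit
```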

A “fully autonomous vehicle,” Knepper notes, is still far in the future.

Media Contact

Melissa Osgood