A glimpse of the technological future was on display in the Duffield Hall atrium May 18, where 23 teams of students showcased their sensing, grasping and flying robots for the public.
The event was the capstone of the new course Robot Learning (Computer Science 4758), taught by assistant professor Ashutosh Saxena. Students were required to build and program robots that could autonomously accomplish specific tasks, from recognizing certain objects to detecting obstacles.
The course teaches students algorithms for robotic applications, drawing on machine learning and artificial intelligence techniques.
Arjun Prakash '10 and teammates demonstrated a helicopter robot platform outfitted with sensors to navigate buildings. The future application, the electrical and computer engineering major said, might be to send the flying machine into a burning building to ascertain the location of people or objects. The robot was programmed to recognize corridors or stairs, Prakash explained.
"If it doesn't detect a corridor or stairs, it assumes it's in an open room and hovers," he said.
Several groups worked on different aspects of a robotic arm. One team demonstrated the arm's ability to recognize pictures taped to small drawer handles. The grasping end of the arm would then grab and open the drawer.
Some projects were devoted to what are considered high-level robotic tasks. Graduate student Changxi Zheng, who studies computer graphics, worked on an algorithm to let a robot figure out how to grab various kitchen items and carefully load them into a dishwasher. And computer science graduate student Yun Jiang worked on a planning algorithm for assembling an object from a kit of parts using a robotic arm.
Teaching a robot to classify different objects is no easy task, as a group comprising electrical and computer engineering graduate student Rick Wu, computer science major Nathan Lloyd '10 and mechanical and aerospace engineering major Sebastian Castro '10 can attest. They wrote code to make a robotic arm assess and classify kitchen items into three basic shapes: bottles or cups, things with handles, and plates or bowls. Once it determined an object's type, the robot would ideally grab it the correct way.
"Our code helps them figure out where to approach it and how to approach it," Lloyd said.