
LOS ANGELES — Move over, C-3PO: Cornell University computer-science geeks have created a robot that can tell if you want a beer and pour it for you. Baristas may want to wake up and smell the coffee, too. This robot can guess whether students are hankering for java and pour it for them.

Kodiak the robot was as handy with a lager as with a latte, could open refrigerator and microwave doors, and could even tidy up, the robotics researchers say. In tests, the hard-wired humanoid correctly anticipated a student’s next move between 57 percent and 82 percent of the time, depending on how far into the future it was “anticipating.”

Programmers broke down human movements, uses of objects, possible trajectories of motion and the intentions of activities, such as eating dinner or having a beer, said computer scientist Ashutosh Saxena, who works on personal robotics at Cornell. Along with graduate student Hema Koppula, Saxena built an algorithm around 120 videos of 10 common activities, giving the robot the flexibility to “rate” the next move in a sequence and move accordingly.

“You can think about human activities as a document in which there is a basic alphabet of what people can do, and we sequence things together to do long-term activities,” Saxena said. “We all do very basic things, like move our arms, get up, hold something, eat something or drink something. Now we can put this together in a sequence to do a variety of things.”
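Saxena’s “alphabet” analogy lends itself to a rough sketch in code: treat an activity as a sequence of basic actions, and rate each candidate next action by how often it followed the current one in example sequences. The action names, training sequences, and counting scheme below are invented for illustration; they are not the researchers’ actual model.

```python
# Illustrative sketch (not the Cornell algorithm): rate likely next
# "basic actions" using simple transition counts from example sequences.
from collections import Counter, defaultdict

# Hypothetical training sequences of basic actions (assumed data).
sequences = [
    ["reach", "grasp_cup", "pour", "drink"],
    ["reach", "grasp_cup", "drink"],
    ["reach", "grasp_door", "open_fridge"],
]

# Count how often each action follows another.
transitions = defaultdict(Counter)
for seq in sequences:
    for prev, nxt in zip(seq, seq[1:]):
        transitions[prev][nxt] += 1

def rank_next(action):
    """Rate candidate next actions after `action`, best first."""
    counts = transitions[action]
    total = sum(counts.values())
    return [(nxt, n / total) for nxt, n in counts.most_common()]

print(rank_next("reach"))  # "grasp_cup" scores highest in this toy data
```

In this toy version, the robot would pick the top-rated action and plan around it, which is the flavor of the behavior described above, minus the vision and trajectory modeling doing the real work.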

Equipped with a Microsoft Kinect 3D camera, the robot would visualize the student’s motions and the location of key objects, such as a coffee cup, Saxena said. When Kodiak’s guess of what came next proved wrong, it would reassess new possibilities, much as a GPS mapping program changes driver instructions after a wrong turn.
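The reassess-on-error behavior can be illustrated with a toy loop: keep a ranked set of activity hypotheses, and when a new observation rules one out, drop it and re-rank the rest. The hypotheses, scores, and consistency table here are assumptions made up for the sketch, not anything from the Cornell system.

```python
# Toy illustration (assumed, not the actual robot code): maintain scored
# activity hypotheses and re-rank them when an observation rules some out,
# GPS-style.
hypotheses = {"make_coffee": 0.6, "get_beer": 0.3, "microwave_meal": 0.1}

# Assumed mapping: which activities each observed action is consistent with.
consistent_with = {"open_fridge": {"get_beer", "microwave_meal"}}

def observe(action):
    """Drop hypotheses inconsistent with `action`, then renormalize scores."""
    global hypotheses
    hypotheses = {h: s for h, s in hypotheses.items()
                  if h in consistent_with[action]}
    total = sum(hypotheses.values())
    hypotheses = {h: s / total for h, s in hypotheses.items()}

observe("open_fridge")        # coffee-making was the top guess, now ruled out
print(max(hypotheses, key=hypotheses.get))  # prints "get_beer"
```

The real system scores full motion trajectories from camera data rather than filtering a hand-written table, but the reassessment idea is the same.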

The tasks were long, and some were complex, such as eating an entire dinner. Kodiak had to learn to pour beverages without interfering with other activities, or assist someone trying to microwave an entree, Saxena said.

“If it looks like they are picking something up, they may be putting something in the fridge,” Saxena said. “And if they are doing it, then this must be the trajectory. … the robot can plan around it or plan for it. It considers all the possibilities and it scores them.”

Saxena and Koppula tested their programming on students who were not given explicit instructions beyond, say, “Cook something in the microwave,” or, “Get a beer.”

The robot performed better when predicting an action within one second, correctly anticipating the next move 82 percent of the time. But because the possibilities multiply, accuracy dropped to 71 percent in the three-second time frame and 57 percent in a 10-second time frame, according to the researchers, who will present their work at the International Conference on Machine Learning in Atlanta in mid-June.
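The drop-off with longer horizons is a matter of combinatorics: if each moment offers several plausible next actions, the number of possible futures grows exponentially with the look-ahead window. A back-of-the-envelope illustration, with an arbitrary assumed branching factor of 3 plausible actions per step:

```python
# With an assumed branching factor of 3 plausible actions per step,
# the space of possible futures explodes as the prediction horizon grows.
branching = 3
for horizon in (1, 3, 10):
    print(f"{horizon}-step horizon: {branching ** horizon} possible futures")
```

A one-step guess chooses among a handful of futures; a ten-step guess contends with tens of thousands, which is why accuracy falls as the robot looks further ahead.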

The programming feat was no parlor trick. It has applications wherever humans and robots work side by side. Robots already have made big inroads in factory assembly lines and are being implemented in operating rooms and offices, where “telepresence” robots zip around like broomsticks on Segways, performing tasks at the behest of distant employees.

“I think the real applications are whenever there is a robot and a human together,” Saxena said.

Of course, there’s the whole “dude, bring me a brew” thing, too.