Students demo autonomous robotic systems

August 22, 2012
By Anne Ju

Chuck Yang, M.Eng. '12, models a pair of glasses that mirror a computer screen, providing the user with a heads-up display as he communicates with a robot on a search-and-rescue mission. Image: Lindsay France

(Phys.org) -- Pop into Cornell's Autonomous Systems Lab in Rhodes Hall any given day, and a mechanical arthropod might be negotiating a steep ramp, or a Roomba-like rover could be cleaning up a cluttered room.

Students led by Mark Campbell, professor of mechanical and aerospace engineering, and Hadas Kress-Gazit, assistant professor in the same department, are helping to bring robotics out of the rigid hard-wired programming systems of yore into more sophisticated, integrated and automated functions for a variety of robot platforms. Several student researchers showed off their latest contributions at the end of last semester with a project demo.

"What we want to do is create machines that can do things autonomously," explained Kress-Gazit, whose research interests include a high-level software toolkit called Linear Temporal Mission Planning (LTLMop). "That means from the simplest things -- 'Don't collide with something,' to the more complex."

For instance, Robert Villalba '15 used LTLMoP to create high-level commands for a spiderlike robot that's smart enough to traverse different terrains without being programmed with every move; instead, it reacts to an environment based on broad specifications.

"The idea is to use English and tell the robot what you want it to do," Villalba said. The robot, for example, walks with a relaxed gait on a flat surface; when it encounters an incline, it adjusts its gait with more exaggerated movements to aid its climb.

Another group might someday put campus tour guides out of a job. Ahmed Elsamadisi '14 and his group demonstrated their design of a robotic system that can give a prerecorded campus tour -- and knows where it's going with minimal human supervision.

They used a rolling mechanical Segway platform and QR code-like tags along the walls of Rhodes Hall for a "vision" system so the robot could orient itself. Preloaded with specifications, the robot can negotiate hallways and corners on its own, providing audio recordings along the way. It can adjust its behavior depending on what it encounters -- such as a cluster of people walking by.
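Tag-based localization of this kind follows a simple pattern: wall tags with known positions let the robot correct its position estimate whenever one comes into view. The sketch below is hypothetical, assuming invented tag IDs and map coordinates rather than the team's actual system:

```python
# Hypothetical sketch of tag-based localization, in the spirit of the
# Segway tour robot described above. Tag IDs and coordinates are
# invented for illustration.

TAG_MAP = {                # tag id -> (x, y) hallway position, meters
    "RH-101": (0.0, 0.0),
    "RH-102": (8.5, 0.0),
    "RH-103": (8.5, 6.0),
}

def localize(seen_tag, dead_reckoning_estimate):
    """Prefer an absolute fix from a known tag over drifting odometry."""
    if seen_tag in TAG_MAP:
        return TAG_MAP[seen_tag]       # snap to the tag's known position
    return dead_reckoning_estimate     # no tag visible: keep the estimate

print(localize("RH-102", (8.1, 0.4)))  # -> (8.5, 0.0), corrected by the tag
print(localize(None, (8.1, 0.4)))      # -> (8.1, 0.4), estimate unchanged
```

Between tag sightings the robot relies on its own motion estimate, which drifts; each tag it spots resets that drift, which is what lets it "constantly inform itself and update its knowledge."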

"The robot has to constantly inform itself and update its knowledge based on what it sees," Elsamadisi said. "The full goal would be [for it] to learn habits and respond with verbal communication."

Annie Dai '12 demonstrated a robotic universal "gripper," a balloon filled with coffee grounds that hardens around objects to pick them up. Integrated with a mobile platform, it patrols the "bedrooms" of a mock apartment to find trash, pick it up and drop it off in receptacles. Like the arthropod and the Segway, this robot operates with a high-level understanding of its environment, reacting to situations as they arise, guided by simple English commands.

Yet another group, with the goal of improving search-and-rescue missions, designed a software interface with an interactive map of an area (in this case, the engineering quad) that a person could input information into -- such as, "go here, not there"; "there may be something interesting to look at here, but not there" -- using a touchpad. In turn, the robot could relay information back to the human.

And to really make things futuristic, the students loaded their interface into a pair of monitor goggles that mirrors the computer screen. This provides the wearer with a heads-up display -- more convenient than looking down at a screen, and better suited to a search-and-rescue environment.

"The idea is to leverage things robots are good at, like doing long, monotonous or dangerous missions, with the things humans are good at, which is interpreting scenes and doing high-level decision making and pattern recognition," said Nisar Ahmed, a postdoctoral associate who works on the project, which is called Husion.

These projects and more all demonstrate the lab's unifying theme: increasing the autonomy of robots, Kress-Gazit said.

"Robotics has different flavors, but here we focus on machines that can do things autonomously with minimum human intervention, and still be safe while doing something interesting," she said.

