Willow Garage Blog

ICRA 2010 is fast approaching, and we're excited to pack up and head north
for a week. Until then, we'd like to share some of the work you can
expect to see from us in Anchorage. Over the next month, we'll
periodically run blog posts spotlighting some of the papers we'll be
presenting and the workshops we'll be participating in. If you see something
that strikes your fancy, drop on by!

This full-day workshop will take place on Monday, May 3 (9:00 am-12:30 pm and 2:00 pm-5:30 pm) in the Dena'ina Center, Tubughnenq' 4.

Another workshop to look for is the Workshop on Mobile
Manipulation, organized by
Patrick Pfaff, Wolfram Burgard, Kris Hauser, Sachin Chitta, and Oliver
Brock. This workshop will discuss the state of the art of
mobile manipulation research. Willow Garage's Kaijen Hsiao, Matei Ciocarlie, Gil Jones, and Sachin Chitta will be among the participants,
presenting "Contact-Reactive Grasping of
Objects with Partial Shape Information." For a glimpse of their work,
see the video on reactive grasping, below.

The full-day workshop will take place on Friday, May 7 (9:00 am-12:30 pm and 2:00 pm-5:30 pm) in the Dena'ina Center, Tubughnenq' 3.

The towel-folding is the work of Pieter Abbeel's
group at Berkeley. Jeremy Maitin-Shepard spent many a long night with
one of our PR2 alpha prototypes, here at Willow Garage, adapting their
research to ROS and the PR2 platform. Marco Cusumano-Towner and Jinna
Lei also contributed to the work. The towel-folding is all the more
impressive due to the level of robustness they were able to achieve:
they successfully folded 50 out of 50 towels. As we've learned with our
plugging-in code, once you've made something robust, it's easy to make
it faster and more efficient in the future.

We're hoping to put together a more complete write-up to highlight and
describe the work that was done. In the meantime, they will be
presenting their work at ICRA 2010, and you can read
their paper here.

Willow Garage will be at RoboGames 2010, which is being held from
April 23-25 at the San Mateo Fairgrounds. Stop by Saturday and Sunday to see what we've managed to
steal away from the lab.
RoboGames has always been a fun event for us, and we're excited to participate this year as a sponsor. Last year, Counter
Revolution managed to take some chunks out of the combat stadium,
and this year one of our interns will be competing in the firefighting
competition.

We'll have 8.5"x11" posters as giveaways in our booth, so come grab one and check out our robots! You can also see more of Josh's awesome artwork in person this weekend at Wondercon in San Francisco, and you can see some making-of images in Josh's Flickr photoset.

We invite the community to submit papers presenting their latest work on
novel methods for advanced semantic perception in cluttered
environments, sensor-based control that accounts for robot-environment
dynamics, task-oriented planning algorithms, and human-robot cooperative
manipulation. Please see the website for a more complete list of
topics. We welcome preliminary results, particularly with compelling
videos or live demonstrations. Whether or not you submit a paper, you
are invited to attend the workshop.

The submission deadline is May 8, 2010; check the website for details.

On April 14, 2010, Willow Garage will be at the National Robotics Week event at Paul Brest Hall on the Stanford University campus. We'll be able to steal a PR2 and Texai away from the lab, so you can check out our robots! The event runs from 12-6 pm, and is free to the public.

When talking with people face-to-face, we may experience the "Cocktail Party
Effect": even in a crowded, noisy room, we can use our binaural
hearing to focus our listening on a single person speaking. With current telepresence technologies, however, we lose this important ability. Thankfully, researchers are already developing tools that help bring this ability back across remote connections.

The HARK system integrated well with ROS and our Texai. The Texai was outfitted with a green salad bowl helmet embedded
with eight microphones, and there is now a hark package for ROS. Using this setup, the HARK team put together three demos showing off the potential of this technology for telepresence.

In the first demo, four people, including one present
through a second Texai, talk over each other while the HARK-Texai
separates out each voice. The
second demo shows sound-source localization, with the direction and power of each sound displayed in a radar
chart. The final demo combines these two capabilities into a powerful new interface for the remote operator: the Texai pilot can
determine where various sounds and voices are coming from, and select
which sound to focus on. The HARK system then provides the pilot with the desired audio, cutting out any background noise or additional
voices. Even in a crowded room, you can have a one-on-one conversation.
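
To make the pilot-side selection step concrete, here is a purely hypothetical sketch in Python: plain tuples stand in for whatever localization message HARK actually publishes, and the 'focus_source' topic is an illustrative placeholder rather than the real hark package interface.

```python
#!/usr/bin/env python
# Hypothetical sketch of the pilot-side selection step described above.
# The tuples below stand in for whatever localization message HARK publishes,
# and the 'focus_source' topic is an illustrative placeholder, not its real API.
import rospy
from std_msgs.msg import Int32

rospy.init_node('pilot_focus_sketch')
focus_pub = rospy.Publisher('focus_source', Int32)

# One frame of localization output: (source id, azimuth in degrees, power).
# A radar chart in the pilot's interface would plot exactly this kind of data.
sources = [(0, -45.0, 12.3), (1, 10.0, 30.1), (2, 75.0, 8.7)]

# Pick the loudest source and ask the separation pipeline to focus on it.
loudest_id = max(sources, key=lambda s: s[2])[0]
rospy.sleep(1.0)  # let the publisher connection come up
focus_pub.publish(Int32(loudest_id))
```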

HARK is the result of a close collaboration between HRI-JP and Kyoto
University, driven by Professor Okuno's passion for making computer and robot audition helpful to the hearing
impaired. HARK is provided free and open source for research purposes and can be licensed for commercial applications.

One of the ROS libraries currently under development is a web infrastructure that allows you to control the robot and various applications via a normal web browser. The web browser is a powerful interface for robotics because it is ubiquitous, especially with the availability of fully-featured web browsers on smartphones.

The web_interface
stack for ROS allows you to connect to a web-enabled ROS robot, see
through its cameras, and launch applications. Under the hood is a
JavaScript library that can send and receive ROS
messages, as well as call ROS services.
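
For readers less familiar with ROS, here is a minimal Python (rospy) sketch of the two primitives the JavaScript library exposes to the browser: publishing a message on a topic and calling a service. The topic and service names below are illustrative placeholders, not part of the web_interface API.

```python
#!/usr/bin/env python
# Minimal rospy sketch of the two ROS primitives the web interface builds on:
# publishing a message on a topic and calling a service. The 'chatter' topic
# and 'reset_demo' service are illustrative placeholders, not real interfaces.
import rospy
from std_msgs.msg import String
from std_srvs.srv import Empty

rospy.init_node('web_interface_sketch')

# Publishing: the JavaScript library does the equivalent of this on behalf
# of the browser, serializing the message and sending it to the robot.
pub = rospy.Publisher('chatter', String)
rospy.sleep(1.0)  # give subscribers a moment to connect
pub.publish(String('hello from the browser'))

# Calling a service: the other primitive the library wraps.
rospy.wait_for_service('reset_demo')
reset = rospy.ServiceProxy('reset_demo', Empty)
reset()
```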

With just a couple of clicks from any web browser, you can start up and
calibrate a robot. We've written applications for basic capabilities
like joystick control and tucking the arms of the PR2. We've also
written more advanced applications that let you select the locations of
outlets on a map, and instruct the PR2 where to plug in. We hope to see many more
applications available through this interface so that users can control
their robot easily with any web-connected device.
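
On the ROS side, a basic application like joystick control ultimately comes down to publishing standard messages. Below is a hedged sketch assuming a 'cmd_vel' topic carrying geometry_msgs/Twist commands for the base; the actual topic name used on the PR2 may differ.

```python
#!/usr/bin/env python
# Hedged sketch of what a joystick-style teleop app amounts to in ROS:
# streaming velocity commands to the base. The 'cmd_vel' topic name is an
# assumption; the PR2's base controller may listen on a different topic.
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('web_teleop_sketch')
pub = rospy.Publisher('cmd_vel', Twist)

cmd = Twist()
cmd.linear.x = 0.2   # drive forward at 0.2 m/s
cmd.angular.z = 0.1  # turn gently to the left

rate = rospy.Rate(10)  # base controllers expect a steady command stream
while not rospy.is_shutdown():
    pub.publish(cmd)
    rate.sleep()
```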

We're also developing 3D visualization capabilities based on the O3D extension that is
available with upcoming versions of Firefox and Chrome. This 3D
visualization environment is already being tested as a user interface
for grasping objects.

All of these capabilities are still under active development and not recommended for use yet, but we hope that they will become useful platform capabilities in future releases.