When we first wrote about NASA's humanoid robot "Robonaut" in September 2003,
it had already developed hands dexterous enough to wield tools such as wrenches.
By January 2004, it was developing sensitive skin and the ability to understand
spoken human commands. Now it has a leg, and wheels. And the wheels are getting
smarter. Carnegie Mellon University, which is at the forefront of autonomous
vehicle research, is already testing a robotic "astrobiologist without a space
suit" for NASA in Chile's barren Atacama desert. NASA's robotics role also
extends to robot "swarms" for planetary rovers and (later) for unmanned
spaceship swarms. But whereas most if not all other swarm R&D uses only simple
algorithms in the individual robotic units, NASA is giving them pretty good
brains. We have always thought the Japanese had a strong lead in robotics, but
this string of news from NASA suggests it is premature to write off the US.

South Korea has developed formidable prowess in technology and is nipping at
Japan's heels on several fronts, the latest of which is healthcare robots. The
Koreans have some catching up to do, though, as evidenced by a robocop that
recently wowed shoppers while out on patrol in a Japanese shopping arcade.

A relatively inexpensive development kit puts sophisticated
robotic vision and tracking capabilities in the hands of well-heeled hobbyists.
The result could be a spurt in innovative robotics applications.

NASA's "Robonaut" recently took its first steps, using a single "space leg"
plus its agile arms and hands to move around the outside of a simulated Space
Station, says a NASA press release. It has also taken its first ride, on a
Segway scooter. Whether on the leg or on wheels, Robonaut's head, torso, arms,
and hands stay the same.

The mission of the one-legged robots will be to remain anchored on the
outside of spacecraft, ready for routine maintenance or emergencies. They would
not be autonomous (at least, not at first), but would be teleoperated wirelessly
by astronauts inside the spacecraft.

The wheeled Robonauts are intended for planetary work, and will also be
teleoperated.

A robotics and life sciences team made up of researchers from NASA, Carnegie
Mellon University, the University of Tennessee, and Chile's Universidad
Catolica del Norte is testing and refining an autonomous rover called "Zoe."
The aim is a robot that can autonomously seek and identify life on other
planets: "an astrobiologist without a space suit," as a NASA scientist put it.
This is the second year of a three-year project, whose goal is for an
instrument-laden Zoe to operate autonomously over a 30-mile journey. A goal of
the current phase is to test Zoe's life-detection capabilities against those of
humans.

A fluorescence imager will be located beneath the rover to detect the
presence of molecules indicative of life. Powered by sunlight, Zoe is expected
to travel 2 kilometers each solar day, with a maximum speed of 100 centimeters
per second. By contrast, the current Mars rovers travel 0.007 kilometers in one
solar day, with a maximum speed of 5 centimeters per second.
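Taken at face value, those figures imply an enormous jump in capability. A quick
sanity check of the arithmetic, using only the numbers quoted above (Python used
purely for illustration):

```python
# Figures quoted above for Zoe vs. the current Mars rovers.
zoe_km_per_sol, mer_km_per_sol = 2.0, 0.007   # distance covered per solar day
zoe_cm_per_s, mer_cm_per_s = 100.0, 5.0       # maximum speed

range_ratio = zoe_km_per_sol / mer_km_per_sol # roughly 286x more ground per day
speed_ratio = zoe_cm_per_s / mer_cm_per_s     # 20x faster at top speed
print(f"daily range: {range_ratio:.0f}x, top speed: {speed_ratio:.0f}x")
```

In other words, Zoe is designed to cover in a single solar day what the current
Mars rovers would need the better part of a year to traverse.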

NASA scientists are developing "very complex AI software that enables a higher
level of robotic intelligence," reports John Bluck in Red Nova. Today's space
rovers have very simple AI and can make only very simple decisions, so large
teams of humans must "micromanage" them, to borrow a term from business. The
process is slow because it takes a long time for sensor data to reach Earth, be
analyzed, and new instructions to be signaled back to Mars. A really
intelligent robot could analyze the data from its own sensors on the spot and
take action without waiting for human input.
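The communication delay alone makes the case. Radio signals travel at the speed
of light, and the Earth-Mars separation varies between roughly 55 and 400
million kilometers, so even before any analysis, a one-way signal takes several
minutes (an illustrative calculation, not a figure from the article):

```python
# Rough one-way radio delay between Earth and Mars. The distances are the
# commonly cited minimum and maximum separations, not from the article.
C_KM_PER_S = 299_792.458  # speed of light in km/s

for label, dist_km in [("closest approach", 55e6), ("farthest", 400e6)]:
    delay_min = dist_km / C_KM_PER_S / 60
    print(f"{label}: ~{delay_min:.0f} minutes one way")
```

That works out to roughly 3 to 22 minutes each way, before any human analysis
even begins, which is why on-the-spot decision-making matters.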

The complex, state-of-the-art AI software being developed is based on an
architecture NASA calls Intelligent Deployable Execution Agents (IDEA). NASA
plans to deploy (first on Earth, of course) a collection of IDEA-enabled rovers
to undertake such tasks as mapping. A large robot team not only gives better
coverage of a large area of land but also has built-in redundancy: if one or
more of the robots fails, "you still can accomplish the mission." A team can
also do tasks more complex than a single rover could achieve.

IDEA-driven swarms of robotic spacecraft could also one day be sent to make
scientific observations of "planets, moons and other celestial objects."

Robot aides for the sick and disabled were recently demonstrated in Australia
by their Korean makers. The robots can be programmed to move a patient and fetch
food and drinks by spoken or gestured command. It will be "several years" before
they are ready for use, however, and in the meantime Australian scientists are
focusing their efforts on a home healthcare system, in trials, that enables
patients to test their own blood pressure, heart rate, and lung function and
send the readings directly to their doctor.

The T63 Artemis patrolling guard robot was put through its paces at a shopping
arcade in Japan on a busy Friday night recently. Artemis shines lights and
sounds an alarm if it "spots anything untoward," as the Mainichi Shimbun put
it. Its first security patrol, in February, was limited to a distance of only 5
meters, but the recent test covered an 80-meter shopping arcade, including the
parking lot, where it detected and memorized car license plates. "Revelers,"
said the Mainichi Shimbun, "watched in amazement as it swept through the
arcade."

Israel's Steadicopter Ltd. is now selling (to military customers only) a
5-foot-long, US$125,000, autonomous robotic helicopter. It flies unaided
(except for being told where to go), has a range of 13 kilometers, and carries up to 18
kilograms of video equipment. Though designed primarily for military
surveillance, it could also be useful for high-voltage cable inspection and news
photography.

Seiko Epson is developing a flying robot that "looks like a miniature
helicopter and is about the size of a giant bug," reports the AP's Yuri
Kageyama. The prototype is 3.35 inches tall, weighs 0.4 ounces, and receives
its flight plan via a Bluetooth wireless connection. "On board is a 32-bit
microcontroller, a super-thin motor, a digital camera that sends blurry images
and a tiny gyro-sensor," writes Kageyama.

The robot "barely managed to get off the ground in a demonstration this week,"
he said. "It crashed off the table at one point and required long waits for
battery changes. It can fly just three minutes at a time, for now, and its
lift was wobbly because the machine's precision is not much better than a
wind-up toy." It will be ready for commercial launch in "two or three years."

Canesta's EP DevKit enables researchers and hobbyists to create applications
for sight-enabled machines using the company's Equinox radar-like image sensor
chip, which transmits unobtrusive pulses of light and times the round trip of
photons as they bounce off surrounding objects, reports Sebastian Rupley in PC
Magazine. The development kit builds a 3-D relief map from the more than 50 3-D
frames per second captured by the chip, and can track moving objects such as
waving hands.
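The round-trip-timing principle the chip relies on is easy to sketch: distance
is half the round-trip time multiplied by the speed of light, which means a
depth sensor must resolve times on the order of nanoseconds. An illustrative
calculation (not Canesta's actual signal processing):

```python
# Illustrative time-of-flight depth calculation (not Canesta's actual
# implementation): distance = (speed of light * round-trip time) / 2.
C_M_PER_S = 299_792_458.0  # speed of light in m/s

def depth_m(round_trip_s: float) -> float:
    """Depth of a reflecting surface given the measured photon round trip."""
    return C_M_PER_S * round_trip_s / 2

# An object about 1.5 m away returns photons after roughly 10 nanoseconds.
print(f"{depth_m(10e-9):.3f} m")  # ~1.499 m
```

The nanosecond scale of those round trips is what makes a single-chip sensor of
this kind a genuinely hard engineering problem, and why a ready-made kit is
attractive to hobbyists.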

The US$7,500 kit includes a single-chip 3-D camera, a USB interface, and a
Windows-based software development environment. The technology is used in
virtual keyboards, where the sensor chips can track the fingers of a person
typing on projected, rather than real, keys; but it could also be used to
enable user interfaces, video games, and even dumb machines such as a TV set to
respond to gestures. "You'd be able to point at a TV to turn it on," an
executive told Rupley, "or take an open hand and close it into a fist to turn
it off. You could give a thumbs-up to turn the volume up, or down to turn it
down." Notebook computers could turn on when users flip them open, and doors
could open for recognized entrants. Home security robots could distinguish
between a cat and a person.

By providing a development kit, the company hopes to spur the development of
innovative applications, and it probably will.

A remote-controlled wireless crawling robot developed by Carnegie Mellon
University roboticists has successfully inspected hundreds of feet of
114-year-old 8-inch gas main still in use under New York City. "Explorer" is
"segmented like a link sausage," says an article in Space Daily, "with front
and rear fisheye cameras and lights," giving the remote operator images of a
pipe's interior, as well as other data. "This kind of remote inspection
technology is truly enabling and will change the face of infrastructure
maintenance," its lead developer said; it is just "the tip of the iceberg when
it comes to using high-tech wireless inspection devices in areas traditionally
thought to be inaccessible to human beings."

The linked-sausage shape enables Explorer to make 90-degree turns in elbows and
tees. An engineer at Con Edison said, "This will significantly reduce our costs
per foot of pipe inspected, especially in the all-too-common situation where
multiple excavations are presently needed to locate the point of water
intrusion into our low-pressure, cast-iron system."