Putting robots to work

Monthly Archives: May 2015

The first version of Roomba to hit the market was more popular than we developers had dared hope. An initial factory order for 15,000 units quickly turned into an order for 25,000, followed by orders for many, many more.

The company decided to fill in the product line, so we developed several new models with features designed to appeal to different market segments. Three of the new robots had gray bumpers; the fourth was all red, including the bumper.

When early production units arrived from the factory we checked to see that they worked properly. The robots with gray bumpers passed every test, but the red robot was a different story. It began strong—cleaning well, turning away from cliffs, and escaping from collisions. When cleaning was done the robot detected the charging station beacon from the proper distance, turned toward it, and advanced confidently. But then, only a couple of feet from its goal, the robot became confused, hesitated, and wandered off in the wrong direction. Every attempt ended the same way: no matter where or how we started it, the robot could never reach its source of electrical nourishment.

We were all baffled. The red robot had exactly the same homing electronics as the other robots; it ran exactly the same code; the beacon sent exactly the same signal to all the robots. The only difference we knew of was the color of the bumper. How could that possibly affect the way the robot behaved?

The sensor Roomba uses to detect the beacon is mounted on the bumper. Above the sensor is a little plastic dome that directs IR energy toward the detector. As the robot turns left or right, the dome moves into different parts of the beam and the signal changes. For the robot to operate correctly, the IR energy reaching the detector must come from nowhere but the plastic dome.

After many theories and tests we finally discovered that the gray bumpers were opaque to IR while the red bumper was translucent. The gray bumpers prevented IR energy from reaching the detector except after being directed through the dome, while the red bumper let IR energy reach the detector from any direction. The robot saw the beacon everywhere; no wonder it became confused. We confirmed our diagnosis by putting black tape on the back side of the red bumper—everything worked fine.
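The homing behavior described above can be sketched in a few lines. This is purely illustrative—not iRobot's actual code—and the function name, signal values, and threshold are all assumptions. The idea is that as the robot rocks its heading, the dome sweeps through the beacon's beam, and the difference between readings tells the robot which way to turn:

```python
def homing_step(signal_left, signal_right, threshold=0.05):
    """Choose a drive command from two IR readings taken slightly
    left and right of the robot's current heading.

    Hypothetical sketch: returns 'forward' when the readings are
    nearly balanced (the dome is centered on the beam); otherwise
    turns toward the stronger side.
    """
    diff = signal_right - signal_left
    if abs(diff) <= threshold:
        return "forward"
    return "turn_right" if diff > 0 else "turn_left"
```

In terms of this sketch, the translucent red bumper let stray IR reach the detector from every direction, so the left and right readings stayed nearly equal no matter where the robot pointed—the comparison produced no usable gradient, and the robot could never settle on a heading.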

The red bumper mystery was a fun if perplexing debugging exercise. But I think it teaches a lesson in a completely different area: Be wary of drawing conclusions from simulations. A simulation includes only those features one believes in advance to be significant. But the real world always seems to find ways to make obscure things suddenly take on vital importance. At least it does in relatively unexplored areas like robotics.

In the early ’90s I worked briefly at a company called Denning Mobile Robotics. At the time I arrived, about eight years after the company’s founding, Denning had developed an ambitious robot called Sentry. Sentry was a security robot designed to patrol the corridors of a warehouse, office, or other facility after hours.

Sentry was a marvel of engineering—especially considering the technology available at the time. Sentry used a ring of sonar sensors to detect obstacles and follow walls; it used infrared and microwave motion sensors to detect intruders and a video camera to transmit a picture back to the security station. Sentry could follow a programmed path (relying on previously installed, active beacons) and would automatically return to its charging station to recharge its batteries. The robot gave impressive demonstrations.

Over the course of several years many talented, capable people designed, built, and programmed Sentry. These robot pioneers were justifiably proud of their achievement. But a commercial robot must please its customers, not its builders, and this is where the trouble started.

Sentry was placed at several customer sites. Denning management was confident that Sentry would be well received and gave the trial companies favorable terms. But after a few months all the robots were sent back to Denning. No one wanted to buy or lease Sentry.

Engineers sitting around the lab might imagine that a security robot would frequently encounter intruders. Maybe the voice of the guard relayed through the robot would instruct the would-be burglar to surrender or flee. Maybe the robot would even give chase. Unfortunately, Denning discovered that's not what security staff spend most of their time doing. Instead, guards do things like check the doors to make sure that they are locked, turn off the lights and the coffee pot, and maybe turn down the thermostats to save energy. Sentry couldn't do any of those things.

Sentry could roll along a corridor and report unexpected movement. For that customers had to outfit their office or warehouse with beacons and pay Denning $75,000. And someone was still needed to check that the doors were locked and the coffee pot turned off. Companies concluded that Sentry’s service wasn’t worth the price.

Denning’s example of fruitless effort and dashed hopes—involving people I knew—dramatically illustrated to me that great technology isn’t enough. Building a really cool robot nobody wants is just an exercise in disappointment. If your work is to count for something, you have to solve a problem people want solved at a price they’re willing to pay. Fulfilling a customer need at a competitive price makes a robot practical. But achieving practicality is deceptively hard.