The TUGs are designed to transport things like patient meals, medicines, medical waste, and linens within the hospital. Hospital staff use a touch screen to send the robot to a specific location. The TUG has built-in sensors and maps that enable it to navigate its way autonomously through corridors, into elevators, and around corners and obstacles.
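The touch-screen dispatch workflow might be pictured as a simple destination queue: staff enter a stop, and the robot works through its stops in order. This is only a toy sketch for illustration; the class and method names here are hypothetical, not Aethon's actual software.

```python
from collections import deque


class TugDispatcher:
    """Toy model of the touch-screen dispatch workflow (all names hypothetical)."""

    def __init__(self):
        self._stops = deque()

    def send_to(self, destination):
        """Staff select a destination on the touch screen."""
        self._stops.append(destination)

    def next_stop(self):
        """The robot pops the next queued destination, or None when idle."""
        return self._stops.popleft() if self._stops else None


dispatcher = TugDispatcher()
dispatcher.send_to("Pharmacy")
dispatcher.send_to("Ward 3 linen closet")
print(dispatcher.next_stop())  # first queued stop comes out first
```

A first-in, first-out queue is the natural fit here, since deliveries are presumably served in the order staff request them.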

What especially intrigued me was that the TUG can talk. In fact, Matt Simon, the author of the Wired article, calls the robot “chatty.” I didn’t see any mention of the robot’s speech abilities on the Aethon website, though I could have overlooked it.

In the video, the TUG makes a few announcements: that it has called the elevator, that it’s waiting for the elevator to arrive, and that it’s about to board, so please step aside. It appears the TUG delivers this spoken information whether or not anyone is actually there to hear it.
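The announcement sequence from the video could be modeled as a tiny table mapping the robot’s current state to a spoken line. This is a minimal sketch, and the phrasings are my own guesses, not the TUG’s actual utterances:

```python
from enum import Enum, auto


class ElevatorState(Enum):
    """Hypothetical stages of the TUG's elevator routine."""
    CALLING = auto()
    WAITING = auto()
    BOARDING = auto()


# Invented phrasings; the real TUG's lines are not documented here.
ANNOUNCEMENTS = {
    ElevatorState.CALLING: "I have called the elevator.",
    ElevatorState.WAITING: "I am waiting for the elevator to arrive.",
    ElevatorState.BOARDING: "I am about to board the elevator. Please step aside.",
}


def announce(state):
    """Return the spoken line for the robot's current elevator state."""
    return ANNOUNCEMENTS[state]


for state in ElevatorState:
    print(announce(state))
```

Speaking at every state transition, listener or not, matches what the video shows: the robot narrates its routine unconditionally rather than first detecting bystanders.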

But giving the TUG a voice seems like the best way for it to communicate its intentions. The robot has to operate within a dynamic social environment. To succeed, it can’t just barge blindly through the hallways, but it also can’t afford to be seen as standoffish and unpredictable. Being able to say what it’s up to goes a long way toward making it seem less alien. The TUG doesn’t come across as any less robotic for being able to talk, but it does become more of an accepted part of the social fabric.

It would be interesting to see what new capabilities the TUG might gain if it were equipped with speech recognition and NLP technology. If you were a patient, you might be able to ask it what was for lunch. In the best case, if you didn’t like the hospital menu, you could send it across the street for a burger. But that’s probably wishful thinking.