A Georgia Tech study found they do. It involved participants being guided to a meeting room by a robot.

On the way to the room, the robot was shown to be unreliable: it entered the wrong room and drove in circles before exiting to the correct one, or it simply stopped, at which point participants were told it was broken.

While the participants were in the meeting room, the hallway was filled with artificial smoke and an alarm sounded.

During the evacuation, the robot pointed to the right, even though the green emergency exit sign was straight ahead.

Almost all of the 42 participants evacuated by following the robot they had earlier seen malfunctioning.

The researchers surmise that in high-stress situations people look to authority figures for guidance, and that participants assumed the robot knew the quickest way out.

The research, partially funded by the Air Force Office of Scientific Research, examined how humans could come to trust robots. A better question might be how to stop humans from putting too much trust in them.