This has a nice range (up to 4 meters), and the output signal varies with distance. So not only am I able to detect the beacon, I can also estimate the distance to it. However, turning this into a reliable method is not easy. There are some drawbacks here as well; the most annoying one is that the motors also emit ultrasonic sound, creating a lot of noise.
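As a minimal sketch of the idea, assuming a receiver board that already turns the ultrasonic signal into a DC level on an analog pin (the pin, noise threshold, and calibration constant below are all placeholders that would need tuning for a real setup):

```cpp
// Minimal sketch: the receiver board is assumed to output a DC level
// proportional to the received ultrasonic signal on an analog pin.
// Pin, threshold, and calibration constant are placeholders.
const int RECEIVER_PIN = A0;

float readLevel() {
  long sum = 0;
  for (int i = 0; i < 32; i++) {
    sum += analogRead(RECEIVER_PIN);  // 0..1023 on a 10-bit ADC
  }
  return sum / 32.0;  // averaging smooths out some of the motor noise
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  float level = readLevel();
  if (level > 20.0) {            // below this, treat it as pure noise
    // Signal strength drops with distance; a simple inverse fit
    // (K measured at known distances) gives a rough estimate.
    const float K = 15000.0;     // hypothetical calibration constant
    Serial.println(K / level);   // estimated distance, roughly in cm
  }
  delay(100);
}
```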

Sound also reflects off walls and other objects. In addition, ultrasonic sensors can be much less reliable (and have shorter range) outdoors, especially if it's windy.

If you have problems with wall reflections for IR, then the next step might be to use a camera and some image-recognition code. For example, you could mount a bright yellow ball of known diameter on top of the other device. The camera can scan for this object and, when it sees it, calculate the angle to it and approximate the distance from its apparent size. This can be a very robust solution, but it costs more in both dollars and compute power. (It's not doable on an Arduino by itself.)
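Here is a sketch of the geometry only (the detection itself would come from an image-recognition library on a more powerful board). It assumes a simple pinhole camera model, where an object of known real diameter appears proportionally smaller with distance; all the numbers are illustrative:

```cpp
#include <cstdio>
#include <cmath>

const double PI = 3.14159265358979;

// Pinhole model: an object of known real diameter appears smaller with
// distance, so distance = realDiameter * focalLengthPx / pixelDiameter.
double distanceFromSize(double realDiameterCm, double focalLengthPx,
                        double pixelDiameter) {
  return realDiameterCm * focalLengthPx / pixelDiameter;
}

// Horizontal angle to the ball, from its pixel offset from image center.
double bearingDegrees(double pixelOffsetX, double focalLengthPx) {
  return std::atan2(pixelOffsetX, focalLengthPx) * 180.0 / PI;
}

int main() {
  // e.g. a 6 cm ball that shows up 40 px wide, 80 px left of center,
  // seen by a camera with a focal length of about 500 px:
  std::printf("distance: %.1f cm\n", distanceFromSize(6.0, 500.0, 40.0)); // 75.0
  std::printf("bearing:  %.1f deg\n", bearingDegrees(-80.0, 500.0));      // ~-9.1
  return 0;
}
```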

I managed to get a robot to follow my hand using SHARP IR sensors. The algorithm went something like: if the sensor reading is too low (the hand has moved away), drive forward. With a single sensor I could only get the robot to follow straight ahead, but with another robot that had six or seven sensors at different angles, I could get it to follow my hand through turns as well. I'm not sure how well it would work for following another robot.
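A minimal sketch of that loop for a single analog SHARP sensor might look like this (the pin, thresholds, and motor helpers are placeholders; SHARP sensors read higher when the target is closer, and the extra "far" cutoff just keeps the robot from chasing empty air):

```cpp
const int IR_PIN = A0;
const int NEAR_THRESHOLD = 450;  // hand is close enough: stop
const int FAR_THRESHOLD  = 250;  // below this, nothing in range: also stop

void driveForward() { /* set motor pins for forward motion */ }
void stopMotors()   { /* stop both motors */ }

void setup() {}

void loop() {
  int reading = analogRead(IR_PIN);
  if (reading > FAR_THRESHOLD && reading < NEAR_THRESHOLD) {
    driveForward();   // hand detected but too far: close the gap
  } else {
    stopMotors();     // either nothing in range, or already close enough
  }
  delay(50);
}
```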

As a confirmed perfectionist with everything I build, I couldn't stay away and had to make this robot even better, also thanks to the nice feedback I've received. So I did two things:

1) For the autonomous, human-following software: I improved the ultrasonic detection algorithm and the movement logic. The robot now follows its user more precisely, and its speed varies with the detected signal: if the robot sees the user at a greater distance, it engages at a higher speed; if it is closer, it proceeds with smaller steps. The relationship is not linear, so I spent some time finding the best formula.
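As a rough illustration of that kind of nonlinear mapping (the thresholds and the squared curve below are illustrative, not the actual formula from my firmware):

```cpp
// Map a measured distance to a motor PWM speed. Far away -> fast,
// close -> small careful steps. Squaring the normalized distance makes
// the robot ease off quickly as it closes in. All constants are made up.
int speedForDistance(float distanceCm) {
  const float MIN_DIST = 20.0;    // closer than this: stand still
  const float MAX_DIST = 200.0;   // beyond this: full speed
  if (distanceCm <= MIN_DIST) return 0;
  if (distanceCm >= MAX_DIST) return 255;
  float t = (distanceCm - MIN_DIST) / (MAX_DIST - MIN_DIST); // 0..1
  return (int)(255 * t * t);      // nonlinear: gentle near, aggressive far
}

// usage: analogWrite(MOTOR_PWM_PIN, speedForDistance(measuredCm));
```

In the end I'm quite pleased; we can see some nice improvements compared to the previous two videos: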

2) For the remote-control software, where the user drives the robot from a phone, I had the idea of making the rover report its frontal sensor readings (the proximity, in centimeters, to any detected obstacle) back to the smartphone. So movement commands flow from the phone to the robot, and sensor readings go the opposite way, from robot to phone. The Android app now lets the user turn the lights on and off, and, using the frontal distance sensor, I have drawn a red line showing the proximity to an obstacle. The robot can be driven this way without actually seeing what it is heading toward; this "radar" is enough to keep a clean path.
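A minimal sketch of what such a two-way loop could look like on the robot side, assuming an HC-SR04-style front sensor and a Bluetooth module on the hardware serial port (the one-letter commands and the "D:<cm>" telemetry format are made-up conventions, not necessarily what my app uses):

```cpp
const int TRIG_PIN = 12;   // hypothetical wiring
const int ECHO_PIN = 11;

// Read the front distance in centimeters (HC-SR04-style trigger/echo).
unsigned int readFrontCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  unsigned long us = pulseIn(ECHO_PIN, HIGH, 30000UL);  // 30 ms timeout
  return us / 58;  // echo round-trip microseconds -> centimeters
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(9600);   // Bluetooth module assumed on hardware serial
}

void loop() {
  // Commands from the phone: one character each.
  if (Serial.available()) {
    switch (Serial.read()) {
      case 'F': /* drive forward */  break;
      case 'B': /* drive backward */ break;
      case 'L': /* turn left */      break;
      case 'R': /* turn right */     break;
      case 'S': /* stop */           break;
      case '1': /* lights on */      break;
      case '0': /* lights off */     break;
    }
  }
  // Telemetry to the phone: front distance for the red proximity line.
  Serial.print("D:");
  Serial.println(readFrontCm());
  delay(100);
}
```

Here is another demo: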