OK, there are a few games where frequency makes a difference, but the step from 1600 to 2133 MHz still gives a smaller performance increase than the step from 1333 to 1600 MHz, as these AnandTech tests show: Link1, Link2. And faster memory is more expensive than slower memory. The difference can be $50, and that's a lot if you are on a tight budget.

You are trying to use extremely dated reviews that are completely useless and irrelevant. DDR3-1600 and DDR3-2400 memory cost the same in America, hate to break it to you. Ivy Bridge and Haswell have better memory controllers than Sandy Bridge does; that review is from 2011, and Sandy Bridge tops out at DDR3-2133, which is why its results look so minimal. Even so, if you look at the results closely, you will find that, as expected, every benchmark shows the faster memory performing better. In other words, if the benchmarks say it is better, hate to break it to you, that will apply in scenarios beyond benchmarking too. That's why benchmarks exist: to expose performance differences in a controlled environment...
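For context on why the frequency-versus-price argument is trickier than it looks, here is a back-of-the-envelope comparison of peak bandwidth and true CAS latency for a few DDR3 speed grades. This is a sketch: the specific kits and CL values are typical retail examples, not figures from either review being argued about.

```python
# Rough comparison of DDR3 kits: peak dual-channel bandwidth vs. CAS latency.
# The speed/CL pairs below are typical retail examples (illustrative only).
kits = {
    "DDR3-1600 CL9":  (1600, 9),
    "DDR3-2133 CL11": (2133, 11),
    "DDR3-2400 CL11": (2400, 11),
}

for name, (mt_s, cl) in kits.items():
    # Dual-channel peak bandwidth: transfers/s * 8 bytes/transfer * 2 channels
    bandwidth_gbs = mt_s * 8 * 2 / 1000
    # CAS latency in nanoseconds: CL cycles / memory clock (clock = MT/s / 2)
    latency_ns = cl / (mt_s / 2) * 1000
    print(f"{name}: {bandwidth_gbs:.1f} GB/s peak, {latency_ns:.2f} ns CAS")
```

The point of the numbers: DDR3-2133 CL11 has about 33% more peak bandwidth than DDR3-1600 CL9 *and* slightly lower absolute latency (about 10.3 ns vs. 11.25 ns), because the higher clock more than offsets the extra cycles. So "faster" kits are not necessarily a latency trade-off.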

Two out of five game tests, F1 2012 and The Elder Scrolls V: Skyrim, showed us that both bandwidth and latency can influence frame rates significantly. Both variables appear equally important, too. We might have guessed we'd see the results we did; after all, both titles are already known to be less graphics-bound than the others.

On the other hand, Metro 2033, Battlefield 3, and Aliens vs. Predator demonstrated no changes at all. The performance of the first two titles is most consistently associated with the speed of a given machine's graphics subsystem, so it makes sense that we don't see a big impact from memory bandwidth or timings. And when it comes to the high frame times that impose perceived choppiness, those appear tied exclusively to graphics performance, not memory throughput or latency.

Getting back to the games that were affected by memory performance, only one title exhibited differences significant enough to be noticeable during real-world play. Even then, the average frame rates were so high that your eyes (and displays) would need to be about twice as fast as ours to realize the real-world benefits of faster RAM.

The game in question, F1 2012, consistently averages more than 100 FPS, yet also scales well with memory improvements. Really, sustaining that is only important if you're using AMD's HD3D and Eyefinity technologies at the same time, which call for frame rates at twice the 60 Hz refresh rate of most monitors. If you don't have a trio of stereo-enabled screens, large performance bumps above and beyond already-high frame rates are really only good for bragging rights.