For historical reasons, Apple used a gamma of 1.8 while Windows uses 2.2. Which one is best? In my opinion, the best gamma is the one closest to the luminance perception of the human eye. If we take Lab to be a good model of human perception, the ideal encoding would distribute the digital values of an 8-bit image uniformly along the L axis of Lab space. The image below shows that a gamma of 2.2 to 2.4 comes closer to this than a gamma of 1.8. So the best gamma for viewing images on a monitor is around 2.2. If your calibration software offers it, calibrating the monitor to a linear L is even better.
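The comparison can be checked numerically. The sketch below (an illustration, not part of any calibration software; the function names are my own) uses the standard CIE 1976 L* formula and measures, in 8-bit code values, how far a pure gamma curve deviates from an L*-uniform encoding:

```python
def cie_lstar(y):
    """CIE 1976 lightness L* (0..100) from relative luminance y (0..1)."""
    eps = (6 / 29) ** 3  # threshold below which L* is linear in y
    if y > eps:
        f = y ** (1 / 3)
    else:
        f = y / (3 * (6 / 29) ** 2) + 4 / 29
    return 116 * f - 16

def max_deviation(gamma, steps=256):
    """Largest difference, in 8-bit code values, between a pure gamma
    encoding and an L*-uniform encoding over the whole tonal range."""
    worst = 0.0
    for i in range(steps):
        y = i / (steps - 1)
        gamma_code = 255 * y ** (1 / gamma)    # gamma-encoded code value
        lstar_code = 255 * cie_lstar(y) / 100  # L*-encoded code value
        worst = max(worst, abs(gamma_code - lstar_code))
    return worst

for g in (1.8, 2.2, 2.4):
    print(f"gamma {g}: max deviation {max_deviation(g):.1f} code values")
```

Running this shows a clearly smaller deviation for gamma 2.2 than for 1.8, which is exactly what the curves in the image illustrate.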
But the question is: are the differences visible in real images? With a calibrated and profiled workflow according to the ICC standard, I have not found a single image where I could really see differences caused by different gamma settings. So in real life it seems to make no significant difference which gamma is used.