I have a circuit which generates an accurate pulse 500ms wide. I use the PicoScope 6 measurement function to measure the high pulse width between rulers. When the timebase is set to 100ms/div, I get sensible results:

Min=500.1ms; Max=500.1ms; Average=500.1ms; sigma=24.46ns

If I change the timebase to 200ms/div (and move the rulers accordingly), I get rubbish results:

Min=483.4ms; Max=498.6ms; Average=497ms; sigma=4.245ms

Am I doing something wrong, or is this a serious bug? Screen shots attached.

It does appear that, in the slow sampling mode, the downsampled data used by the measurements engine introduces inaccuracies compared with the downsampled data retrieved directly from the scope at faster timebases. I will pass this information on to the development team.
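To illustrate why downsampling can skew a pulse-width measurement, here is a minimal sketch (not PicoScope's actual algorithm — the sample rates, decimation factors and threshold here are all hypothetical). It measures a 500 ms pulse from threshold crossings, then repeats the measurement on plainly decimated copies of the trace; each edge quantises to the coarse sample grid, so the error can reach one decimated sample interval per edge:

```python
import numpy as np

def pulse_width(trace, dt, threshold=0.5):
    """High-pulse width from the first rising to the first falling
    threshold crossing, like a scope's 'High pulse width' measurement."""
    high = (trace > threshold).astype(int)
    rises = np.where(np.diff(high) == 1)[0]
    falls = np.where(np.diff(high) == -1)[0]
    return (falls[0] - rises[0]) * dt

fs = 100_000                    # hypothetical full capture rate, 100 kS/s
dt = 1 / fs
signal = np.zeros(200_000)      # 2 s record
signal[40_001:90_000] = 1.0     # ~500 ms high pulse

print(pulse_width(signal, dt))  # ≈ 0.5 s at the full rate

# Keeping every Nth sample shifts each measured edge onto the coarse
# grid, so the reported width drifts as the decimation factor grows.
for n in (10, 10_000, 30_000):
    print(n, pulse_width(signal[::n], dt * n))
```

At a factor of 10 the width is still ~500 ms, but at coarser factors it drops to 400 ms and then 300 ms — errors of the same flavour (width reads short, and scatter grows) as the results reported above at 200 ms/div.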

Could you send the text from Help->About, with the scope attached, so that we know which software, firmware and hardware versions you have?