However, on a 32-bit Windows system the result is very inaccurate: every time I run this script I get either 0.015625 or 0, nothing in between. I suppose this is due to floats having limited precision on 32-bit systems. On a 64-bit Linux production server I get values like 0.0025119781494141.
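For reference, the kind of benchmark being discussed looks something like this (a minimal sketch only; the original script is not included in this excerpt, and the loop body is just illustrative work):

```php
<?php
// Minimal float-based benchmark of the kind described above.
$start = microtime(true);

for ($i = 0; $i < 100000; $i++) {
    $x = $i * 2; // trivial work standing in for the code being measured
}

$duration = microtime(true) - $start;

// On a system where the clock ticks only 64 times per second, $duration
// comes out as 0 or a multiple of 0.015625.
printf("%.13f\n", $duration);
```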

How do I measure duration accurately? I'm trying to find the simplest solution; so far the best I can think of is to use microtime() without true, split and concatenate the string, and use the bc functions for the calculation. Are there any better ways?

An external PEAR library is too much for my needs; occasionally I just want to inject 2-3 lines of simple code to benchmark a piece of code, without any external dependencies. But I looked at the source of this package, and it uses the same method of measuring time as I was contemplating earlier: the textual microtime() result plus bcsub() to calculate the difference. So I tried this method in my code:
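The actual code isn't quoted above, so here is a hedged sketch of the textual-microtime() plus bcsub() approach being described, assuming the bcmath extension is enabled (the helper name microtime_string is mine, not from the thread):

```php
<?php
// microtime() without an argument returns a string like
// "0.01234500 1389042711" ("msec sec"); splice it into a single
// decimal string suitable for the bc functions.
function microtime_string() {
    list($usec, $sec) = explode(' ', microtime());
    return $sec . substr($usec, 1); // e.g. "1389042711.01234500"
}

$start = microtime_string();

for ($i = 0; $i < 100000; $i++) {
    $x = $i * 2; // code being benchmarked
}

$end = microtime_string();

// bcsub() subtracts the two decimal strings with arbitrary precision,
// so no float rounding is involved - but it cannot add precision that
// microtime() itself never reported.
echo bcsub($end, $start, 8), "\n";
```

This rules floats out as the culprit: if the result is still quantized to 0.015625, the limitation is in the timestamps microtime() returns, not in the arithmetic.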

And guess what? The same problem persists on the 32-bit Windows system - I get either 0.015625 or 0. I'm inclined to think that floats are not the problem; more likely, microtime() is reporting time with very low precision in the first place. What can I do then?

Rubble
—
2014-01-06T20:51:51Z —
#4

I have had a look back: I did some speed tests a few years ago using XAMPP, probably on a 32-bit XP setup - I do not think I had Vista at that time, but if I did it would also have been 32-bit - and the results were:

1st run: 0.299100001653 average time
2nd run: 0.298720002174 average time
3rd run: 0.298156666756 average time
4th run: 0.298293328285 average time
5th run: 0.298613333702 average time

All Microsoft Windows versions since Windows 2000 and Windows XP include the Windows Time service ("W32Time"),[13] which has the ability to sync the computer clock to an NTP server. The version in Windows 2000 and Windows XP only implements Simple NTP, and violates several aspects of the NTP version 3 standard.[14] Beginning with Windows Server 2003 and Windows Vista, a compliant implementation of full NTP is included.

EDIT: When I did some code with microtime() a couple of years back there was more processing involved between "start" and "end". Rubble's "exec" would also be more resource-hungry than a simple for loop.

Maybe there's no way to test "fast" code accurately?

Lemon_Juice
—
2014-01-06T21:13:30Z —
#6

Rubble, your case is different because you are measuring image manipulation, which takes almost 1/3 of a second, so it's obvious you wouldn't get 0.015625 or 0: your conversion lasts many times longer than either of these values. I'm measuring a for loop that executes much more quickly, and it looks like its duration is right around the borderline of microtime() accuracy.

Anyway, I've found this bug: Bug #64633 :: microtime regression - resolution reduced to 64 ticks per second. Due to some changes in the PHP code, the accuracy of microtime() was reduced to 1/64 of a second, which is exactly 0.015625. So it looks like there's no way around it on Windows systems, at least on Windows 7. They don't mention anything about 32-bit vs 64-bit versions, so I'm not sure if this makes any difference. They also mention there are different API calls on Windows 8, but it's also uncertain whether the problem is resolved on Win 8.
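One way to check this claim on any given system is to probe how coarsely microtime() actually ticks. This is only an illustrative sketch (the function name clock_step is mine, not from the thread or the bug report):

```php
<?php
// Estimate microtime()'s effective resolution: spin until the reported
// time changes, and record the smallest step observed across a few samples.
function clock_step($samples = 5) {
    $smallest = INF;
    for ($i = 0; $i < $samples; $i++) {
        $t0 = microtime(true);
        do {
            $t1 = microtime(true);
        } while ($t1 === $t0);      // busy-wait for the next clock tick
        $smallest = min($smallest, $t1 - $t0);
    }
    return $smallest;
}

// On an affected PHP build on Windows this would report roughly
// 0.015625 (1/64 s); on Linux it is typically in the microsecond range.
printf("%.9f\n", clock_step());
```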

Mittineague
—
2014-01-06T21:23:55Z —
#7

Good find. Seems 0.015625 isn't very precise for something one would think was. For humans it's not noticeable, but for processors it sure is. What is misleading is why offer float precision when the data it's working with isn't precise. Gads!

Lemon_Juice
—
2014-01-06T21:31:01Z —
#8

Mittineague said:

Maybe there's no way to test "fast" code accurately?

It seems true on Windows systems. Being able to measure such small amounts of time is not critical, but sometimes it's useful - for example, you can take some piece of code out of a large loop and benchmark it on its own without any loss of precision.

But this limitation has its side effects - for example, uniqid() suffers from the same lack of accuracy if not used with the more-entropy option, and on my system I was able to get 313 identical uniqid() values in a loop!
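The uniqid() side effect is easy to demonstrate with a sketch like the following (the loop count is illustrative; on an affected Windows system the plain count would drop dramatically, as in the 313-duplicates observation above):

```php
<?php
// Compare plain uniqid() (derived from the current time) against
// uniqid() with the more-entropy flag, which appends extra
// combined-LCG entropy to every value.
$plain   = [];
$entropy = [];
for ($i = 0; $i < 1000; $i++) {
    $plain[]   = uniqid();          // time-based only
    $entropy[] = uniqid('', true);  // time plus extra entropy
}

$plainUnique   = count(array_unique($plain));
$entropyUnique = count(array_unique($entropy));

echo "plain unique:        $plainUnique / 1000\n";
echo "with more entropy:   $entropyUnique / 1000\n";
```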

Anyway, I doubt it's impossible to get accurate microsecond time on Windows - I'm sure any OS needs far better accuracy than 1/64 s for its proper functioning! How many internal instructions can be executed within 1/64 s? Quite a lot on modern machines, I suppose...

cpradio
—
2014-01-07T12:34:01Z —
#9

Couldn't you use xDebug to profile your code? Create a very simple test.php that does what you want, profile it and look at the call hierarchy/stack?

Lemon_Juice
—
2014-01-07T18:52:02Z —
#10

cpradio said:

Couldn't you use xDebug to profile your code? Create a very simple test.php that does what you want, profile it and look at the call hierarchy/stack?

I suppose that is an option, but here I wanted the simplest method, without any dependencies on xDebug or other extensions, and I wanted it to be portable across different remote installations where I often don't have xDebug.

Apart from that, doesn't xDebug skew the results? I mean, if you turn profiling on, some additional execution must get in the way of the normal script flow to take the measurements. I once had a problem with a piece of PHP code running slowly because the xDebug extension was enabled (unfortunately, I don't remember what this was about).

cpradio
—
2014-01-07T18:58:29Z —
#11

Lemon_Juice said:

I suppose that is an option, but here I wanted the simplest method, without any dependencies on xDebug or other extensions, and I wanted it to be portable across different remote installations where I often don't have xDebug.

That's fair enough. And a very good reason to rule it out.

Lemon_Juice said:

Apart from that, doesn't xDebug skew the results? I mean, if you turn profiling on, some additional execution must get in the way of the normal script flow to take the measurements. I once had a problem with a piece of PHP code running slowly because the xDebug extension was enabled (unfortunately, I don't remember what this was about).

It shouldn't really skew the results, as the injection it does should happen before and after the execution, not during it. You would likely see slower execution from an end-user standpoint because it is collecting the additional data (if you were to leave it enabled). That is why I use extensions that enable me to quickly turn it on or off, so I'm not impacted by it unless I know I want it. Nonetheless, it shouldn't affect the actual time it takes to execute the code block (but again, I'll usually run the same code block a few times, 10 for example, and take the average after removing the first and last execution).
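The run-it-several-times-and-average idea can be sketched roughly like this (the benchmark helper name is my own illustrative choice, not anything from the thread; it drops the first and last run as described):

```php
<?php
// Run a code block several times, discard the first and last runs
// (warm-up and tail effects), and average the remaining timings.
function benchmark(callable $block, $runs = 10) {
    $times = [];
    for ($i = 0; $i < $runs; $i++) {
        $start = microtime(true);
        $block();
        $times[] = microtime(true) - $start;
    }
    // Drop the chronologically first and last executions.
    $trimmed = array_slice($times, 1, $runs - 2);
    return array_sum($trimmed) / count($trimmed);
}

$avg = benchmark(function () {
    for ($i = 0; $i < 100000; $i++) {
        $x = $i * 2; // code under test
    }
});
printf("average: %.6f s\n", $avg);
```

Averaging many runs also helps with the coarse-clock problem discussed above: a longer total measurement pushes the duration well past the timer's resolution.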

As for the problem at hand: I now wish I had a 32-bit machine to try out a few tests... Unfortunately, I only have 64-bit systems. I'll have to see if I can find the time to build a 32-bit virtual machine, but it may be a while. Sorry.

Lemon_Juice
—
2014-01-07T21:24:57Z —
#12

cpradio said:

It shouldn't really skew the results, as the injection it does should happen before and after the execution, not during it. You would likely see slower execution from an end-user standpoint because it is collecting the additional data (if you were to leave it enabled). That is why I use extensions that enable me to quickly turn it on or off, so I'm not impacted by it unless I know I want it. Nonetheless, it shouldn't affect the actual time it takes to execute the code block (but again, I'll usually run the same code block a few times, 10 for example, and take the average after removing the first and last execution).

Thanks, I haven't really used xDebug profiling apart from testing it once or twice, so I don't have much experience with it. Can you set it up so that only lines from A to B are profiled in a given script?

cpradio said:

As for the problem at hand. I now wish I had a 32 bit machine to try out a few tests... Unfortunately, I only have 64 bit systems... I'll have to see if I get the time to build a 32 bit virtual machine, but it may be a while. Sorry.

But if you have a 64-bit Windows machine it would also be interesting to know how it behaves, since from what I have found it's not certain whether OS bitness has anything to do with the problem. Could you run the simple code from my first post here and see what accuracy you get?

Oh, so I'm afraid we have another variable: your PHP version is pretty old, and according to the bug report the problem began after bug #64370 was fixed, which happened in version 5.3.24 of the 5.3.x series. My PHP version is 5.3.26, so you are most probably still immune to this problem. So we still don't know whether a 64-bit OS has anything to do with it...

cpradio
—
2014-01-07T23:43:19Z —
#17

Lemon_Juice said:

Thanks, I haven't really used xDebug profiling apart from testing it once or twice, so I don't have much experience with it. Can you set it up so that only lines from A to B are profiled in a given script?

Not that I know of. I've only used it to test the full execution of a process, though, as I then go back and review it for out-of-place items:

1) High method call times (a method taking over X milliseconds/seconds to run)
2) High method execution counts (a method being executed 100+ times)
3) Several methods being called repeatedly (is there a way to refactor, or to pass along a result instead of calling it each time it is needed?)

Once I get through that list, I then get nit-picky and start to look at method execution sorted by time taken, and I'll look at the top X percent and see if they can be improved any.

Lemon_Juice said:

But if you have a 64-bit Windows machine it would also be interesting to know how it behaves, since from what I have found it's not certain whether OS bitness has anything to do with the problem. Could you run the simple code from my first post here and see what accuracy you get?

Yeah, I can do that. I'll run it from work too tomorrow, so I get two different sets (as my work setup is far different than my home setup).

cpradio
—
2014-01-07T23:44:45Z —
#18

Lemon_Juice said:

Oh, so I'm afraid we have another variable: your PHP version is pretty old, and according to the bug report the problem began after bug #64370 was fixed, which happened in version 5.3.24 of the 5.3.x series. My PHP version is 5.3.26, so you are most probably still immune to this problem. So we still don't know whether a 64-bit OS has anything to do with it...

o.O Interesting... I'll definitely post my PHP version, and I'll take Apache out of the mix by running it through PHP directly.

cpradio
—
2014-01-07T23:52:35Z —
#19

"First Post Duration" is your original code from your initial post; "Second Post Duration" is the code using bcsub().