Duration and CPU Values

I'm getting strange results when I try to analyze data from SQL Server Profiler in SQL Server 2005. It seems like the value for duration is 1,000 times more than what it should be. What's behind this anomaly?

Late in the SQL Server 2005 development cycle, Microsoft changed the unit in which SQL Server Profiler captures the duration value from milliseconds to microseconds. Surprisingly, the CPU time value is still captured in milliseconds. The Profiler GUI automatically converts and displays the duration value in milliseconds; however, SQL Server 2005 stores the duration value in microseconds. If you save the trace to a file or load the trace data into SQL Server for analysis, the stored value will be in microseconds. In the December refresh of SQL Server 2005 Books Online (BOL), the topic "Viewing and Analyzing Traces with SQL Server Profiler" tells us:

In SQL Server 2005, the server reports the duration of an event in microseconds (one millionth, or 10⁻⁶, of a second) and the amount of CPU time used by the event in milliseconds (one thousandth, or 10⁻³, of a second). In SQL Server 2000, the server reported both duration and CPU time in milliseconds. In SQL Server 2005, the SQL Server Profiler graphical user interface displays the Duration column in milliseconds, by default, but when a trace is saved to either a file or a database table, the Duration column value is written in microseconds.
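If you're analyzing saved trace data yourself, you can normalize the Duration column so it lines up with the CPU column. Here's a minimal sketch, assuming the trace was saved to a file at the hypothetical path C:\traces\mytrace.trc and is loaded with the fn_trace_gettable function:

```
-- Load the saved trace file and convert Duration from
-- microseconds to milliseconds to match the CPU column.
SELECT
    TextData,
    CPU,                             -- already in milliseconds
    Duration / 1000 AS DurationMs    -- stored in microseconds; convert
FROM fn_trace_gettable('C:\traces\mytrace.trc', DEFAULT)
WHERE EventClass IN (10, 12);        -- RPC:Completed, SQL:BatchCompleted
```

Note that integer division truncates the fractional milliseconds; use Duration / 1000.0 if you want to keep them.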

For the most part, I like the change. However, I'm not sure why Microsoft didn't make the same change to the CPU time value for consistency. I also wish that Microsoft had provided a way to switch between capturing and displaying time in milliseconds or microseconds. Many administrators have built custom scripts to parse trace data, and this change in the unit of measurement will break many of those scripts.