Comments

I guess it depends on what you want to do, but there's a subtle difference between accuracy and precision. Even using the timing APIs you'll be millisecond-precise but not necessarily millisecond-accurate. That is, the timer reports in units of milliseconds, but that doesn't mean you're reading it at the right moment.
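To make the distinction concrete, here is a minimal Python sketch (function names are my own, not from any standard API) that measures two separate things: the timer's resolution (precision) and how far a requested 1 ms delay can drift (accuracy). Millisecond units in the API do nothing to bound the second number.

```python
import time

def timer_resolution_ns(samples=10_000):
    """Estimate the effective resolution of the high-resolution timer:
    take many back-to-back readings and keep the smallest nonzero gap."""
    smallest = None
    prev = time.perf_counter_ns()
    for _ in range(samples):
        now = time.perf_counter_ns()
        delta = now - prev
        if delta > 0 and (smallest is None or delta < smallest):
            smallest = delta
        prev = now
    return smallest

def sleep_overshoot_ms(requested_ms=1.0, trials=50):
    """Estimate scheduling error: request a 1 ms sleep and measure how
    much longer the OS actually took.  The overshoot is timing error
    that millisecond *units* in the API do not prevent."""
    worst = 0.0
    for _ in range(trials):
        start = time.perf_counter()
        time.sleep(requested_ms / 1000.0)
        elapsed_ms = (time.perf_counter() - start) * 1000.0
        worst = max(worst, elapsed_ms - requested_ms)
    return worst
```

On a typical desktop OS the first number comes back well under a millisecond (good precision), while the second can be several milliseconds (poor accuracy), which is exactly the gap described above.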

Plus if you're trying to time, say, a visual display to the millisecond, you're likely to be out of luck. PCs, Macs and Linux machines all have similar problems because they use the same hardware. Even an RTOS won't help you all that much as soon as you interact with a standard TFT display or keyboard.

For some background on these issues, take a look at http://www.blackboxtoolkit.com and read about our Black Box ToolKit, which helps users achieve millisecond timing accuracy in experimental work.

zroeder... It's likely that millisecond timing is limited only by the clock speed of your computer. Back when computer clocks ran at around 5 MHz, all interpreted programs executed very slowly; programs written in assembly language ran the fastest.

For the fastest TB execution speed, note that a plain-text TB program in the TB Editor runs slower on its first RUN than on subsequent RUNs: the code has to compile before it actually executes.
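The practical consequence, in any language with one-off compile or warm-up cost, is to discard the first timed run before drawing conclusions. A small Python sketch of that benchmarking habit (the workload and helper names are my own, purely illustrative):

```python
import time

def timed_runs(fn, runs=5):
    """Time several consecutive runs of fn.  The first run often
    includes one-off setup cost (compilation, cache warm-up), so
    report every run separately instead of averaging blindly."""
    results = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        results.append(time.perf_counter() - start)
    return results

def workload():
    # Hypothetical stand-in for the program being timed.
    sum(i * i for i in range(100_000))

times = timed_runs(workload)
steady_state = min(times[1:])  # ignore the cold first run
```

Comparing `times[0]` against `steady_state` shows whether first-run overhead matters for the program you are measuring.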

Interaction between your computer and your printer is normally a very slow process; avoid it when speed is an issue.