To answer your question, the times are different because timer start/stop runs the code once and returns the elapsed time between start and stop, whereas timeit runs the code many times and returns an average of the results. The times differ because GPUs are naturally not at their fastest as soon as execution starts; there is a "warm up" period. Hence we take an average with timeit.
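A minimal sketch of the two approaches, assuming ArrayFire's C++ API (`timer::start()`, `timer::stop()`, `timeit()`, `eval()`, `sync()`); the `multiply` function and matrix sizes are just illustrative:

```cpp
#include <arrayfire.h>
#include <cstdio>

using namespace af;

static void multiply() {
    array a = randu(1024, 1024);
    array b = matmul(a, a);
    b.eval();   // force the JIT to execute inside the timed region
}

int main() {
    // Single run: includes one-time warm-up costs
    timer t = timer::start();
    multiply();
    sync();     // wait for the GPU to finish before stopping the clock
    printf("single run: %g s\n", timer::stop(t));

    // timeit runs the function many times and reports an averaged time,
    // so the warm-up cost is amortized away
    printf("timeit:     %g s\n", timeit(multiply));
    return 0;
}
```

You should see the single-run number come out noticeably larger than the timeit average, since the first call pays the warm-up cost.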

A few points you can take note of (based on the code you provided):

* Both timer and timeit return time in seconds. You do not need to multiply by 1000 if you need time in seconds.
* I assume that you are doing a for loop inside the gfor to get an average time. Do not do this purely for the sake of getting average times. Use timeit instead, or timer start-stop around a for loop.
* Use array.eval() to ensure that the variable is evaluated within the section being timed. Since ArrayFire uses Just-In-Time compilation, it is not guaranteed that the code executes within the timer range. The eval() function ensures that the array variable is evaluated before the statement that follows eval(). You can read more about eval at: http://www.accelereyes.com/arrayfire/c/ ... __eval.htm
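Putting those points together, here is one hedged sketch of manual averaging with timer start-stop around a for loop (not inside a gfor), with eval() keeping the work inside the timed region; the array size and run count are arbitrary:

```cpp
#include <arrayfire.h>
#include <cstdio>

using namespace af;

int main() {
    array a = randu(2000, 2000);

    // One untimed warm-up run so JIT compilation is not counted
    array warm = a * a;
    warm.eval();
    sync();

    const int runs = 100;
    timer t = timer::start();
    for (int i = 0; i < runs; ++i) {
        array b = a * a;
        b.eval();   // make sure the work happens inside the timed loop
    }
    sync();         // block until all queued GPU work has completed
    double avg = timer::stop(t) / runs;   // already in seconds, no *1000 needed
    printf("average: %g s\n", avg);
    return 0;
}
```

The sync() before timer::stop() matters: GPU calls are asynchronous, so without it you would only time how long it takes to queue the work, not to run it.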