The definition of $f = O(g)$ is that there exist constants $n_0$ and $M$ such that for every $x > n_0$, $f(x) \le M \cdot g(x)$. That's why the constant doesn't matter: suppose algorithm A runs in $2n$ binary operations while algorithm B runs in $n$ binary operations. Then both have $O(n)$ time complexity.
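Concretely, this is just a choice of constants in the definition above (a minimal worked instance, with $f$ and $g$ standing for the operation counts of algorithms A and B):

$$f(x) = 2x,\quad g(x) = x:\quad \text{choose } M = 2,\ n_0 = 0,\ \text{then } f(x) = 2x \le 2 \cdot g(x) \text{ for all } x > 0,$$

hence $2n = O(n)$; taking $M = 1$ gives $n = O(n)$ as well, so both algorithms land in the same class.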

@Aryabhata: True, but that was roughly what I meant by "roughly" :) I tend to assume that an algorithm will consume all its input, so it will have $\Omega(n)$ time-complexity.
– Johannes Kloos, Apr 19 '12 at 17:36