Re: Determining Big-O

Posted 11 September 2011 - 06:10 PM

OK, here's something tricky, I would say:

If p1(N) = 5N and p2(N) = 6N, why are both p1(N) and p2(N) O(N), since for N >= 1, 6N is larger than 5N?

My answer: I would say that both are linear functions, and an algorithm whose running time is a linear function runs in linear time. In this case, 6n is always exactly 6/5 of 5n, a constant factor, no matter how large n gets, and constant factors are exactly what Big-O ignores. That is why both are O(n)...

Re: Determining Big-O

Posted 11 September 2011 - 06:26 PM

Quote

If p1(N) = 5N and p2(N) = 6N, why are both p1(N) and p2(N) O(N), since for N >= 1, 6N is larger than 5N?

My answer: I would say that both are linear functions, and an algorithm whose running time is a linear function runs in linear time. In this case, 6n is always exactly 6/5 of 5n, a constant factor, no matter how large n gets, and constant factors are exactly what Big-O ignores. That is why both are O(n)...

The short answer is that you can disregard all constants in the highest-order term when doing computational-complexity analysis, and therefore they're both O(n).

E.g., suppose I have f(n) = 7n^3 + 98n^2 + 2343n + 343433.

7n^3 is the highest-order term (the term with the greatest exponent). You can completely disregard all the other terms because it dominates them beyond some n0 (it grows much faster; it's what gives the graph its shape). So you end up with f(n) ≈ 7n^3. You can also drop the constant factor, because it too is swallowed by the n^3 growth: multiplying by a constant doesn't affect how fast the function grows; it may scale the graph, but the shape of the graph doesn't change. Finally, you get f(n) <= c·n^3 for some constant c, i.e. f(n) is O(n^3). That's just the quick-and-dirty way to find the complexity.
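A small sketch (not from the original post, just illustrative) of why the lower-order terms and the constant stop mattering: the ratio f(n) / n^3 settles toward the leading coefficient 7 as n grows.

```java
public class DominantTerm {
    // f(n) = 7n^3 + 98n^2 + 2343n + 343433, the example polynomial above
    static double f(double n) {
        return 7 * n * n * n + 98 * n * n + 2343 * n + 343433;
    }

    public static void main(String[] args) {
        for (double n : new double[] {10, 100, 1000, 100000}) {
            // The ratio approaches 7, so f(n) <= c * n^3 for a constant c
            // once n is large enough -- i.e. f is O(n^3).
            System.out.printf("n = %8.0f   f(n)/n^3 = %.4f%n", n, f(n) / (n * n * n));
        }
    }
}
```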

The more detailed answer can be understood by looking at the formal definition of Big-O: f(n) is O(g(n)) if there exist constants c > 0 and n0 such that f(n) <= c·g(n) for all n >= n0.
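Tying that back to the 5N/6N example earlier in the thread: with g(n) = n, the single choice c = 6, n0 = 1 witnesses both 5n = O(n) and 6n = O(n). A tiny sanity-check sketch (the real proof is just the one-line algebra 5n <= 6n <= 6·n):

```java
public class BigOWitness {
    public static void main(String[] args) {
        long c = 6, n0 = 1;
        boolean bothBounded = true;
        // Spot-check the bound f(n) <= c * g(n) with g(n) = n over a range
        for (long n = n0; n <= 1_000_000; n++) {
            bothBounded &= (5 * n <= c * n) && (6 * n <= c * n);
        }
        System.out.println("c = 6, n0 = 1 works for both 5n and 6n: " + bothBounded);
    }
}
```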

Re: Determining Big-O

while (i > 0)
    System.out.println("Infinity");

Will this have a running time? Maybe O(infinity)? What do you think?

I'm no expert, so take my answer with a grain of salt.

I would say it is O(1).

Big O notation relies on variability. If I write an algorithm that takes 1 minute for 10 customers, I want to know if it will still be efficient when I run the algorithm over 100 customers a few years later. If the algorithm is O(n), then yes, it will take about 10 minutes. If it is O(2^n), then it will take about 2^90 minutes, and I just don't have that kind of patience.
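A quick sketch of that arithmetic (the 10-to-100-customer figures are from the example above; BigInteger is used because 2^90 doesn't fit in a long):

```java
import java.math.BigInteger;

public class ScalingFactors {
    public static void main(String[] args) {
        int before = 10, after = 100;

        // Linear algorithm: time scales with n, so 100 customers take
        // 100/10 = 10 times as long -- about 10 minutes.
        System.out.println("linear factor: " + (after / before));

        // Exponential algorithm: time scales with 2^n, so the factor is
        // 2^100 / 2^10 = 2^90 -- beyond any reasonable patience.
        BigInteger expFactor = BigInteger.valueOf(2).pow(after - before);
        System.out.println("exponential factor: 2^90 = " + expFactor);
    }
}
```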

The algorithm in question doesn't really rely on any variable. It's kind of like asking how long an algorithm that solves chess will take. Since there is only one possible input (the starting position), the answer is O(1), even though such an algorithm would take billions of years. Even though we have a variable i, the loop takes the same amount of time whether i is 1 or 1000000000. No variability means constant run time.

Re: Determining Big-O

Posted 12 September 2011 - 10:52 AM

I disagree. According to the definition of the O notation the algorithm's runtime being O(1) would mean that there are constants c and N, such that for all i>N the algorithm's runtime is less than or equal to c. However since the algorithm's runtime is infinite, there is no such constant, so it's not in O(1).

Since the same logic can be applied to disprove the runtime being in O(f(n)) for any real-valued function f, the algorithm's runtime simply is not in any O class.

I suppose we could say it's in O(infinity) to express this, but that's certainly not covered by the definition (though I'd imagine everyone would understand what it's supposed to mean anyway).

Re: Determining Big-O

Posted 13 September 2011 - 10:30 AM

sepp2k, on 12 September 2011 - 11:52 AM, said:

I disagree. According to the definition of the O notation the algorithm's runtime being O(1) would mean that there are constants c and N, such that for all i>N the algorithm's runtime is less than or equal to c. However since the algorithm's runtime is infinite, there is no such constant, so it's not in O(1).

In order for program 1 to formally qualify as an O(1) algorithm, there must be constants c and N such that for all n > N, the algorithm's runtime is less than c. The problem is there simply is no meaning for n, so it doesn't meet the formal definition. Yet we have no problem saying this algorithm is O(1). Program 2 has the same problem. O(1) has a formal definition, but we also use it to mean "the program's runtime is independent of any data". I think the program in question falls into the latter category.

Re: Determining Big-O

Posted 13 September 2011 - 11:18 AM

I don't quite follow you. Let's say we define print to take 1 step per character in the string, where a step is our measure of an algorithm's runtime. Then print "Hello World" takes 11 steps. So the runtime of the algorithm can be described by the function f(n) = 11. Clearly that function is in O(1).
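A tiny sketch of that step-counting model (one step per printed character is the cost-model assumption above):

```java
public class StepCounting {
    public static void main(String[] args) {
        String msg = "Hello World";
        // Under the "one step per character" cost model, this print costs
        // msg.length() steps regardless of any input size n: f(n) = 11.
        int steps = msg.length();
        System.out.println("steps = " + steps); // 11
    }
}
```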

Now let's consider program 2. Here the runtime can only be represented by the function f(n) = infinity (ignoring for the moment that that's not a real function). This function is definitely not in O(1), for the reasons I explained in my earlier post.

The important thing to note, I think, is that O(1) does not mean that the algorithm's runtime does not depend on the input size, but that it's bounded by a constant which does not depend on the input size (so the algorithm never takes more than X steps, where X does not depend on the input). For an algorithm which runs infinitely, there clearly is no such constant, so it's not in O(1).

Re: Determining Big-O

Posted 14 September 2011 - 10:03 PM

If an "algorithm" runs infinitely and never terminates, it isn't an algorithm by definition (it fails the "finiteness" qualification). So we're arguing over the Big-O of something that isn't an algorithm at all.

This isn't a function, either, because f(x) = infinity is just wrong (infinity is not a value). Loosely speaking, f(x) should give some definite y; instead, for every real number n, f(n) is "infinity", which suggests f(n) is undefined. [Someone with bigger math chops than I have might be able to give a more formal proof.]

In short, don't sweat it, because there is no upper bound for something that isn't an algorithm and isn't a function. You might as well find the Big-O of an apple (I'm guessing O(orange)?).