I am reading the following in a book on algorithms (Cormen's to be specific):

That is, we are concerned with how the running time of an algorithm
increases with the size of the input in the limit, as the size of the
input increases without bound.

I cannot understand the meaning of the phrase: with the size of the input in the limit.
What is "the limit" referring to?

UPDATE:

Please note that at this point in the book (early fundamentals of chapter 2) there has been no formal definition or mention of Big O or other asymptotic notation except a vague reference on Theta notation

That isn't really a formal statement; a formal statement would essentially be big-O, with or without the usual notation.
–
David ThornleyOct 24 '11 at 16:30

Yes. I meant that the way it is phrased is very formal.
–
user10326Oct 24 '11 at 19:23

The author got his PhD in CS around the same time you were born. Please forgive him for not being very good at imagining what it is like to not know any math or computer science. I bet he learned calculus before high school. Calculus is all you need here.
–
JobOct 25 '11 at 3:38

2 Answers
2

Look at the formal definition of Big O notation to see a limit defined within the notation itself. While Big O is specifically about worst-case complexity, there are other notations for the average and best cases.

Usually the idea is to get an upper bound on how badly the algorithm runs. Is the complexity logarithmic, linear, quadratic, or exponential? Binary search, for example, has logarithmic complexity, because each step discards half of the remaining options, assuming the initial set of data is sorted. Quadratic complexity shows up in various poor sorting algorithms like bubble sort.
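To make the halving concrete, here is a minimal binary search sketch that counts its comparisons; for a sorted list of a million elements it needs only about log2(1,000,000) ≈ 20 steps, not a million (the function name and comparison counter are just for illustration):

```python
def binary_search(sorted_data, target):
    """Return (index or None, comparisons). Each step halves the range."""
    lo, hi = 0, len(sorted_data) - 1
    comparisons = 0
    while lo <= hi:
        comparisons += 1
        mid = (lo + hi) // 2
        if sorted_data[mid] == target:
            return mid, comparisons
        elif sorted_data[mid] < target:
            lo = mid + 1   # target must be in the upper half
        else:
            hi = mid - 1   # target must be in the lower half
    return None, comparisons

idx, steps = binary_search(list(range(1_000_000)), 999_999)
# steps stays near log2(1_000_000) ≈ 20, even in this worst-ish case
```

The point is that doubling the input size adds only one more comparison, which is exactly what "logarithmic growth" means.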

An alternative view is to note that "the size of the input increases without bound" is the standard way of describing a limit as a variable tends to infinity. A classic example is f(x) = 1/x: as x grows without bound, the function value approaches zero. The key is to imagine the running time of the algorithm as a function of the size of the input, and to ask how quickly that function grows as the input changes. Is it a straight line or a curve? How steep is the slope at any point? That is the idea behind the phrase.

JB King: OK, I will read this, but in this paragraph the writers have not yet introduced Big O notation. Since it has not been introduced yet, it confuses me that they use this concept here, in the fundamentals chapter.
–
user10326Oct 24 '11 at 16:07