A computer can only perform a limited number of basic operations (additions, multiplications, assigning a value to a variable, and so on) in a given amount of time. Typically, this is around a billion operations per second. Hence, if we want our algorithm to finish executing in a second or so, we need to make sure that the total amount of work we ask it to do is less than about a billion operations.
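As a rough, hands-on illustration, the sketch below times a loop of simple additions. Note that Python is an interpreted language and is much slower than compiled languages like C++, so on a typical machine you should expect it to manage closer to tens of millions of simple operations per second, not a billion; the "billion per second" figure applies to fast compiled code.

```python
import time

def time_additions(n):
    """Perform n additions and return (result, elapsed seconds)."""
    total = 0
    start = time.perf_counter()
    for i in range(n):
        total += i
    elapsed = time.perf_counter() - start
    return total, elapsed

if __name__ == "__main__":
    n = 10**7
    total, elapsed = time_additions(n)
    print(f"{n:,} additions took {elapsed:.3f} seconds")
    print(f"roughly {n / elapsed:,.0f} additions per second")
```

Try increasing `n` by a factor of 10 and watch the running time grow by roughly the same factor; that proportional relationship between input size and work done is exactly what computational complexity makes precise.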

Here is a tutorial on Computational Complexity by Michal Forišek, one of the organizers of the IOI. I love the first couple of sections, titled "Why is it important?" and "What is efficiency?", before he gets into the formal definitions. As he says, the rationale behind the definitions is more important than the definitions themselves. You don't need to continue to Section 2 of the tutorial; Section 1 is sufficient.

Finally, once you understand the rationale behind asymptotic notation and computational complexity, here is a simple video explaining the concept in detail.

If you get stuck, feel free to ask questions. And if you find other great resources, do share them with everyone.