Besides the classical result that comparison sorting requires $\Omega(n \log n)$ comparisons, what other lower bounds do we know for algorithms? I can't seem to dig them up via Google Scholar, yet they must exist.

I'm especially interested in polynomial-time algorithms (i.e., unconditional bounds, not proofs of the form: you can't solve this in polynomial time unless P = NP).

Any one-tape Turing machine that decides the language of palindromes requires $\Omega(n^2)$ time. This can be proved using communication complexity; the proof can be found in "Communication Complexity" by Kushilevitz and Nisan.
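To emphasize that the quadratic bound is specific to the one-tape model: on a RAM (or even a two-tape machine), palindromes are decidable in linear time with two indices. A minimal sketch:

```python
def is_palindrome(s):
    """Linear-time palindrome check with two indices.

    The Omega(n^2) bound applies only to one-tape Turing machines,
    where the head must shuttle back and forth across the input;
    with random access, a single pass from both ends suffices.
    """
    i, j = 0, len(s) - 1
    while i < j:
        if s[i] != s[j]:
            return False
        i += 1
        j -= 1
    return True
```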

$\mathbf{SAT}$ cannot be decided simultaneously in time $O(n^{1.801})$ and space $n^{o(1)}$. This result is due to Ryan Williams.

Mihai Pătrașcu's PhD thesis proves several lower bounds for data structures, though they are more or less logarithmic.

Another nice lower bound is the following. Suppose you have a first-order sentence over the reals with the operations '+', '*', '>' and integer constants. A celebrated result of Tarski and Seidenberg states that you can decide whether such a sentence is true. But there is an unconditional result that this requires exponential time.

No EXPTIME-complete problem can be solved in polynomial time as a consequence of the time hierarchy theorem. Of course, the same holds for harder problems, such as problems complete for exponential space, doubly exponential time, etc. These results are unconditional (i.e., they do not depend on P ≠ NP or other conjectures).

There are some interesting problems which provably lie outside P; one of them concerns the equivalence of regular expressions (see these slides for an introduction), and some kinds of games are also very hard to solve.

Because of the paucity of lower bounds, there has been some work connecting one lower bound
to another, as-yet unknown one. I am thinking here of the 3SUM problem: Given $n$ integers,
do three sum to zero? The only upper bound (in the most general model) is $O(n^2)$.
Jeff Erickson proved a matching lower bound in a restricted linear decision tree model:
"Lower Bounds for Linear Satisfiability Problems." Many problems in computational geometry can be
reduced to 3SUM, and so are 3SUM-hard. For example: Given a set of $n$ points in the plane
(with integer coordinates), find the minimum area triangle with corners in the set.
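The quadratic upper bound mentioned above comes from the classic sort-and-scan approach: sort the input, then for each element sweep the rest with two pointers. A sketch:

```python
def three_sum_zero(a):
    """Return a triple of values from a (at distinct indices) summing
    to zero, or None if no such triple exists.

    The classic O(n^2) algorithm: sort, then for each element a[i],
    scan the remaining suffix with two pointers, moving them inward
    depending on the sign of the current sum.
    """
    a = sorted(a)
    n = len(a)
    for i in range(n - 2):
        lo, hi = i + 1, n - 1
        while lo < hi:
            s = a[i] + a[lo] + a[hi]
            if s == 0:
                return (a[i], a[lo], a[hi])
            elif s < 0:
                lo += 1  # sum too small: need a larger value
            else:
                hi -= 1  # sum too large: need a smaller value
    return None
```

Erickson's result shows that, within a restricted linear decision tree model, no algorithm of this comparison-driven flavor can beat $\Omega(n^2)$.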

There are a number of lower bounds for problems related to sorting, for example element distinctness: are there two identical elements in a set? These lower bounds are interesting because they generalize the comparison lower bound to more algebraic settings: for example, you can show that solving element distinctness in a model that allows algebraic operations still has an $\Omega(n \log n)$ lower bound, via analyzing the Betti numbers of the space induced by the different answers to the problem.
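For concreteness, the matching comparison-based upper bound for element distinctness is just sort-then-scan; a minimal sketch:

```python
def all_distinct(xs):
    """Element distinctness in O(n log n) comparisons:
    sort, then duplicates (if any) must be adjacent."""
    ys = sorted(xs)
    return all(ys[i] != ys[i + 1] for i in range(len(ys) - 1))
```

The algebraic decision tree lower bound says that even with richer primitives than comparisons, this $O(n \log n)$ cannot be improved in that model.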

A very interesting example of an unconditional exponential deterministic lower bound (i.e., not related to P vs NP) is for estimating the volume of a polytope. There is a construction due to Füredi and Bárány showing that the volume of a convex polytope cannot be deterministically approximated even to within an exponential factor in polynomial time, unconditionally. This is striking because there are randomized poly-time algorithms that yield arbitrarily good approximations.

Many $O(n)$ algorithms (sorting an array with values in a fixed range; searching an unsorted array; MST; finding the largest binary-search subtree of a tree) are asymptotically optimal simply because every input element must be examined at least once, giving a trivial $\Omega(n)$ lower bound.
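As an illustration of the first example in that list, counting sort sorts integers from a fixed range in linear time, matching the trivial read-every-element lower bound (for fixed $k$). A sketch:

```python
def counting_sort(xs, k):
    """Sort integers drawn from range(k) in O(n + k) time:
    tally how often each value occurs, then emit each value
    as many times as it was seen.  No comparisons are used,
    so the Omega(n log n) comparison bound does not apply.
    """
    counts = [0] * k
    for x in xs:
        counts[x] += 1
    out = []
    for value, count in enumerate(counts):
        out.extend([value] * count)
    return out
```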

Also, comparison search in a sorted array is bounded below by $\Omega(\log n)$ for the same reason that comparison sort is bounded below by $\Omega(\log n!) = \Omega(n \log n)$: each comparison can at best halve the set of remaining possibilities.
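Binary search attains that $\Omega(\log n)$ bound exactly; a minimal sketch:

```python
def binary_search(a, target):
    """Find target in sorted list a using O(log n) comparisons,
    matching the information-theoretic lower bound: each comparison
    halves the range of candidate positions.  Returns an index of
    target, or -1 if absent.
    """
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```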

For completeness, $O(1)$ algorithms are also trivially asymptotically optimal.

Yiannis Moschovakis has been working on giving explicit lower bounds for algorithms deciding basic number-theoretic questions. For example, he and Lou van den Dries have shown that any algorithm deciding whether integers $a, b$ are coprime must have infinitely many inputs on which it runs for more than $\log \log a$ steps. See their paper "Is the Euclidean algorithm optimal among its peers?" at http://www.math.ucla.edu/~ynm/papers/acnote.ps The Euclidean algorithm requires logarithmic time, so their lower bound does not establish its optimality; this is discussed in the paper.

Here, rather than using the number of steps a Turing machine takes as a gauge of an algorithm's runtime, they count how many times certain primitives are called (such as +, *, <, rem, and so forth). The proofs are carried out in a logical framework (using the language of model theory).
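To make the cost measure concrete, here is the Euclidean algorithm instrumented to count its calls to the `rem` primitive, which is the kind of quantity these lower bounds are stated in terms of (the counter name is my own, purely illustrative):

```python
def gcd_with_rem_count(a, b):
    """Euclid's algorithm, counting uses of the remainder primitive.

    In the framework described above, runtime is measured by counts
    like this rather than Turing-machine steps.  The number of
    remainder operations is O(log(min(a, b))).
    """
    rem_calls = 0
    while b:
        a, b = b, a % b  # one call to the 'rem' primitive
        rem_calls += 1
    return a, rem_calls
```

For instance, `gcd_with_rem_count(48, 18)` reaches gcd 6 after three remainder operations; the van den Dries–Moschovakis result says no algorithm built from such primitives can decide coprimality in fewer than roughly $\log \log a$ of them on all inputs.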

Most of the lower bounds that aren't circuit-based also use the number of primitive operations invoked as a measure of complexity, especially the algebraic decision tree lower bounds.
– Suresh Venkat, Jul 12 '10 at 2:24

There are numerous lower-bound results that are related to distributed (and parallel) algorithms. For a partial list, see the surveys from 1989 and 2003.

Here are some further examples: Linial (1992) shows that 3-colouring a cycle requires $\Omega(\log^* n)$ communication rounds (and this is tight). Similar but more general results include, e.g., Czygrinow et al. (2008), which is a neat application of Ramsey's theorem. For something different, see, e.g., Kuhn et al. (2004).