I've been thinking about this problem for some time. I'm a software engineer who has recently been getting into SEO, so my theory comes from the engineering side of my brain.

So how would I tackle this problem if I were Google?

1. Temporarily increase a smaller site's rank in the SERPs for a competitive keyword — say, for 1% of searches that day.

2. Use the data collected to analyze the return rate (users coming back to the Google results page) of click-throughs to that site.

3. Adjust the site's ranking for that keyword based on the return rate. If users did not come back to Google to check other results, they likely found what they were looking for on that site. If the site's return rate is higher than that of comparable sites, decrease its ranking.

This would apply to all sites/results and would be only one part of the ranking criteria.
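The three steps above could be sketched roughly like this. To be clear, this is purely illustrative — every name here (`SiteStats`, `adjust_rank`, the thresholds) is something I made up, and nothing about it reflects how Google actually works:

```python
# Hypothetical sketch of the return-rate signal described above.
# All names and numbers are invented for illustration only.
from dataclasses import dataclass

@dataclass
class SiteStats:
    clicks: int   # click-throughs from the SERP during the trial boost
    returns: int  # users who came back to the results page afterward

    @property
    def return_rate(self) -> float:
        return self.returns / self.clicks if self.clicks else 0.0

def adjust_rank(site: SiteStats, comparable_avg: float, rank: int) -> int:
    """Step 3: demote the site if its return rate beats comparable
    sites' average, otherwise promote it one position."""
    if site.return_rate > comparable_avg:
        return rank + 1          # larger number = lower SERP position
    return max(1, rank - 1)

# Example: during the 1% trial, 40 of 100 clickers bounced back,
# while comparable sites average a 60% return rate.
trial = SiteStats(clicks=100, returns=40)
print(adjust_rank(trial, comparable_avg=0.6, rank=8))  # promoted: prints 7
```

The point of the sketch is just that the signal is cheap to compute from data Google already has, and that it only nudges rank rather than setting it outright.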

I'm not certain I've seen an example of this in the analytics at the company I work for, but I do notice random traffic jumps for competitive keywords over short periods of time.

I'd love to hear if anyone has some input on this theory.

Edit: My reply was actually meant for interactivevoices and Gaurav Kohli, on the question of how new sites would ever get exposure in the first place. It applies to this discussion as well, though.