Comments

I always assumed this was a major factor in SE rankings at Google. For image search, I would almost guarantee it is a major factor. And I would assume that the appearance of sitelinks for a specific term depends on whether the site met or exceeded a historical CTR threshold for the term in question.

Theres no reason to believe Google measure CTR for organic results and make it a parameter of the consolidated relevance score. There are strong reasons against it: 1. it perpetuates the top 10 listings in the top rankings; 2. it is very difficult to prevent manipulation of their results by fake visits.However, there are some cases in which Google becomes active and may push down some sites to let others make it to the top: 1. when theres not enough variety in the top results (all ten rankings for the name of a disease are occupied by lawyers sites and theres no informative site); 2. when theres strong ambiguity in the search query itself (for instance, a certain Mr. Obama is reported to be dating Brooke Shields). In this case it may use a CTR sample and push down the presidential candidates site for a while, till interest in the other Obama fades away. 3. personalized search

Personalized search IS based on user performance/behavioral metrics... and as stated, Google has already conceded to using query analysis in the RI (reg index)... so what more might be there is the question. It's not a matter of IF but HOW strong the signals are that are being used. As Matt noted in the past, it would be a very weak signal to be sure, but a signal nonetheless, I would say... One thing we do know is that we don't know jack - none of us can truly say definitively what usage metrics and related thresholds may be in play...

Personally, I think Google would be silly not to use the CTR info they have at their fingertips. OK, so it can be gamed, but so can the majority of other ranking factors. And bear in mind that for many sites they have more than just CTR info to judge relevancy by - think bounce rate, for one. Combine CTR with bounce rate and you've got a pretty good yardstick for the relevancy and "usefulness" of a website.
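As a thought experiment only - nothing here reflects what Google actually computes - a "CTR plus bounce rate" yardstick might look something like this, with the hypothetical `expected_ctr` parameter standing in for the position-adjusted baseline a search engine would know for each rank slot:

```python
def usefulness_score(impressions, clicks, bounces, expected_ctr=0.1):
    """Hypothetical blend of the two signals mentioned above: did
    searchers click the listing, and did they stick around after?"""
    if impressions == 0 or clicks == 0:
        return 0.0
    ctr = clicks / impressions
    stick_rate = 1 - bounces / clicks  # fraction of clicks that didn't bounce
    # CTR relative to the positional baseline, capped so that a sudden
    # (possibly gamed) spike can't dominate the score
    ctr_lift = min(ctr / expected_ctr, 2.0)
    return ctr_lift * stick_rate
```

So a listing that earns its expected CTR but loses half its visitors right away would score 0.5, while one that both out-clicks the baseline and keeps people on the page scores higher - the cap on `ctr_lift` is one crude answer to the "it can be gamed" objection.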

I think CTR is bad terminology, as User Behavioral Metrics involve more than mere click-through rates... I prefer UPM = User Performance Metrics or UBM = User Behavioral Metrics (from an MS patent). On a side note, both Yahoo and Microsoft have also published patents in this area... not just Google ;0)

hmmmmm... emanuelh, try reading this then, maybe - http://www.huomah.com/search-engines/algorithm-matters/beware:-google-is-watching-you.html There are many, many reasons, but the true question is more about the reliability of the potential ranking signals...

As a site owner, I can only cite my experience here. The general principles you guys are discussing make a lot of sense. When you ask yourself "Does Google want to serve up sites that people don't use?" it seems that a logical answer would be NO. But that metric is very hard to determine. There are some sites you visit where the information or content is so well presented or spot on that you get what you need and bounce within a few seconds. Believe me, thirty seconds is a lot longer than we think it is. I read through these comments in just a little bit of time (didn't use a stopwatch, sorry).

Watching my cousin, who I've always used as a measuring stick for a typical web user, do a search and start clicking: he'll click on every result in the top ten, stay for about 10-20 seconds while he scans for information, then bounce to the next result. Typically this meant that he didn't find what he was looking for, or all the info he was looking for wasn't there. So did all those clicks mean that each one of those sites gets a vote? How do we know how to qualify the traffic?

Then there is the time spent on a site. As a retailer, I find that users who spend a lot of time on a site (more than 5 minutes) can fall into one of two categories. One is the user who is so enthralled with the site that they click all over the place, saying "Oooh, look at this, and this, and look at this cool thing"; the other is the user WHO CAN'T FIND what they were searching for. It's the latter that scares me for a plethora of reasons, particularly that I'm not serving my website users effectively.

To algorithmically try to determine intent is a scary slope to go down. I've always said that Google Search Technology is by far the most advanced example of computer programming to emerge since the beginning of the nineties, and they just keep working on it.
I wouldnt be suprised if they arent using on site usage or clickthrough rates for some discovery, and if youve seen Google Analytics, they have much more information about your site than you thought they had, so again, no suprise. But intent is so difficult to gage. For instance, my fastest conversion on one of my sites was 22 seconds. New user, straight from the serps, came to my site, saw what they wanted, and bought it. On the detail page, she was there for 15, seconds, added to cart, updated the info cause she wanted more of what she bought, hit checkout, blammo, 22 seconds. Now, that was a succesful conversion, but a quick bounce out. If Google used that, would it look favorable? I dont know how to answer that, cause Im not a Google Engineer and I havent really strapped my mind around this concept of search. But an interesting discussion nonetheless.

Im glad to see that this topic is interesting to you guys too.UBM (User Behavior Metrics) might be a better way to define the scope of what we are talking about here. (mentioned above by theGypsy). For instance, Ive always wondered whether the Google Analytics info is factored into the Organics....And yes, I guess the big question for us SEOs is: "Can we game it"? And as with many of the SEs factors, the answer is a resounding "Yes!"Anybody that would be interested in doing some testing to see if a high CTR on a site will boost its rankings? Ive got some ideas on how it could be tested... and then we can end this debate for the time being.Best, Brad

Well Im pretty plurking busy these days, but have an intimate knowledge of the Google Patents (and others) relating to this area.... so feel free to get in touch and I can try and point you in the right directionIn the end this type of signal would be tough (tho not impossible) to truly isolate for testing IMO.....

I have read in several authoritative sources that CTR is a partial ranking factor. It is not the whole of the algo, but it is a part of it. That doesn't mean a sudden increase (which would appear artificial to G) is a good idea.

@emanuelh - which stats would you go by for relevancy? Number of incoming links and keyword structure, or the actual user's vote - the site they click on and the time they spend there? Obviously these are not mutually exclusive - but you get my drift. @robert.garcia - yes, there is huge fallibility in bounce rates etc. As you say, if the info sought is all neatly served up on the first page, a quick bounce would actually be a vote of confidence by the user. However, Google's wealth of data does not end there either. They also generally know what you do AFTER visiting one search result. Do you quickly bounce and leave the search altogether? - FOUND. Or do you bounce off the site and go straight to another search result? - NOT FOUND. It's not a perfect science and certainly extremely complex, but you can bet that the best brains in the world are working on it 24/7 and us mere mortals can only speculate...
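That FOUND / NOT FOUND distinction is easy to sketch. This is purely illustrative - the event names and session format are made up, not anything Google has described - but it shows how a click followed by a return to the results page could be scored differently from a click that ends the search:

```python
def label_clicks(session):
    """Toy version of the FOUND / NOT FOUND logic described above.
    `session` is an ordered list of events for one query, e.g.
    ['click:siteA', 'back', 'click:siteB', 'leave'].
    A click followed by 'back' (a pogo-stick to another result) counts
    against the site; a click after which the searcher leaves the
    results page altogether counts for it."""
    labels = {}
    for i, event in enumerate(session):
        if not event.startswith('click:'):
            continue
        site = event.split(':', 1)[1]
        # a session that simply ends after a click is treated as a leave
        nxt = session[i + 1] if i + 1 < len(session) else 'leave'
        labels[site] = 'FOUND' if nxt == 'leave' else 'NOT_FOUND'
    return labels
```

Of course, as robert.garcia's 22-second conversion shows, even this tidy rule misreads real behavior - a fast "leave" can mean satisfied or gone shopping elsewhere - which is exactly why the reliability of the signal, not its existence, is the interesting question.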