The Moz Blog

The 2 User Metrics That Matter for SEO

In the wake of Google’s Panda updates, there’s been a lot of fear regarding user metrics and how they impact SEO. Many people are afraid that “bad” signals in analytics data, especially high bounce rates and low time-on-site, could potentially harm their rankings.

I don’t think Google is tapping into analytics data directly (I’ll defend that later), and I don’t think they have to. There are two user metrics that both Google and Bing have direct access to: (1) SERP CTR, and (2) “Dwell time”, and I think those two metrics can tell them a lot about your site.

Google Analytics (GA) & SEO

The official word from Google is that analytics data is not used for ranking. Whether or not you believe that is entirely up to you, and I’m not here to argue about it. I’ll only say that it’s rare to hear Matt Cutts say something that emphatically. I think the arguments against using analytics directly as a ranking factor are much more practical in nature…

(1) Not Everyone Uses GA

Usage stats for GA are tough to pin down, but a large 2009 study placed the adoption rate at about 28%. I’ve seen numbers as high as 40% being quoted, but it’s likely that somewhere around 2/3 of all sites don’t have GA data. It’s tough for Google to penalize or devalue a site based on a factor that only exists on 1/3 of all sites. Worse yet, some of the largest sites don’t have GA data, because those are the sites that can afford traditional, enterprise analytics (WebTrends, Omniture, etc.).

(2) GA Can Be Mis-installed

Even for sites using GA, Google can’t control how it’s installed. I can tell you from consulting and from Q&A here on SEOmoz that GA is often installed badly. This can elevate bounce rates, reduce time-on-site, and generally add a lot of noise to the system.

(3) GA Can Be Manipulated

Of course, there’s a malicious version of (2) – you can mis-install GA on purpose. There are ways to manipulate most user metrics, if you want to, and there’s no scalable way for Google to double-check everyone’s installation and setup. Once the GA tags are in your hands, they’ve lost a lot of control.

To be fair, others disagree and think that Google will use any data they can get their hands on. Some have even produced indirect evidence that bounce rate is in play. I’m going to argue a simple point - that Google and Bing don’t need analytics data or bounce rate. They have all the data they need from their own logs.

The 1 Reason I Don’t Buy

One argument you hear all the time is that Google can’t possibly use something like bounce rate as a ranking signal, because bounce rate is very site-dependent and unreliable by itself. I hear it so often that I wanted to take a moment to say that I don’t buy this argument, for one simple reason. ANY ranking signal, by itself, is unreliable. I don’t know a single SEO who would argue that TITLE tags don’t matter, for example, and yet TITLE tags are incredibly easy to manipulate. On-page factors in general can be spammed – that’s why Google added links to the mix. Links can be spammed – that’s why they’re adding social metrics and user metrics. With over 200 rankings factors (Bing claims over 1,000), no single factor has to be perfect.

Metric #1: SERP CTR

The first metric I think Google makes broad use of is direct Click-Through Rate (CTR) from the SERPs themselves. Whether or not a result gets clicked on is one of Google’s and Bing’s first clues about whether any given result is a good match to a query. We know Google and Bing both have this data, because they directly report it to us.

In Google Webmaster Tools, you can find CTR data under “Your site on the web” > “Search queries”. It looks something like this:

Bing reports similar data – from the “Dashboard”, click on “Traffic Summary”:

Of course, we also know that Google factors CTR heavily into their paid search quality score, and Bing has followed suit over the past year. While the paid search algorithm is very different from organic search, it stands to reason that they value CTR. Relevant results drive more clicks.

Metric #2: Dwell Time

Last year, Bing’s Duane Forrester wrote a post called “How to Build Quality Content”, and in it he referenced something called “dwell time”:

Your goal should be that when a visitor lands on your page, the content answers all of their needs, encouraging their next action to remain with you. If your content does not encourage them to remain with you, they will leave. The search engines can get a sense of this by watching the dwell time. The time between when a user clicks on our search result and when they come back from your website tells a potential story. A minute or two is good as it can easily indicate the visitor consumed your content. Less than a couple of seconds can be viewed as a poor result.

Dwell time, in a sense, is an amalgam of bounce rate and time-on-site metrics – it measures how long it takes for someone to return to a SERP after clicking on a result (and it can be measured directly from the search engine’s own data).
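
As a rough illustration of what "measured directly from the search engine's own data" could mean - the log fields and numbers below are assumptions for illustration, not anything Google or Bing has published - here's a sketch:

    // Hypothetical sketch of deriving dwell time from a search engine's own click logs.
    // The log format and field names are assumptions for illustration only.
    interface SerpClick {
      query: string;
      clickedUrl: string;
      clickedAt: number;    // epoch ms when the user left the SERP for the result
      returnedAt?: number;  // epoch ms when the same user came back to the SERP, if ever
    }

    // Dwell time is just the gap between leaving the SERP and returning to it.
    // A user who never returns only tells the engine that dwell time was "long".
    function dwellTimeSeconds(click: SerpClick): number | null {
      if (click.returnedAt === undefined) return null; // never came back
      return (click.returnedAt - click.clickedAt) / 1000;
    }

    const example: SerpClick = {
      query: "seo guide",
      clickedUrl: "https://example.com/guide",
      clickedAt: 1330000000000,
      returnedAt: 1330000003000, // back on the SERP 3 seconds later
    };

    console.log(dwellTimeSeconds(example)); // 3 - squarely in Forrester's "poor result" zone

Note that no analytics tag is involved anywhere in that calculation; both clicks happen on the engine's own pages.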

Google hasn’t been quite so transparent, but there’s one piece of evidence that suggests strongly to me that they use dwell time as well (or something very similar). Last year, Google tested a feature where, if you clicked a listing and then quickly came back to the SERP (i.e. your dwell time was very low), you would get the option to block that site:

This feature isn’t currently available for all users – Google has temporarily scaled back site blocking with the launch of social personalization. The fact that low dwell time triggered the ability to block a site, though, clearly shows Google is factoring in dwell time as a quality signal.

1 + 2 = A Killer Combo

Where these 2 metrics really shine is as a duo. CTR by itself can easily be manipulated – you can drive up clicks with misleading titles and META descriptions that have little relevance to your landing page. That kind of manipulation will naturally lead to low dwell time, though. If you artificially drive up CTR and then your site doesn’t fulfill the promise of the snippet, people will go back to the SERPs. The combo of CTR and dwell time is much more powerful and, with just 2 metrics, removes a lot of quality issues. If you have both high CTR and high dwell time, you’re almost always going to have a quality, relevant result.
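
To make that combination concrete, here's a toy sketch of reading the two signals together. The thresholds are invented for illustration and aren't anyone's published values:

    // Toy sketch of reading CTR and dwell time together. Thresholds are invented.
    interface ResultStats {
      impressions: number;
      clicks: number;
      medianDwellSeconds: number; // measured from the engine's own logs, as above
    }

    function qualityHint(r: ResultStats): "likely relevant" | "possibly baited" | "inconclusive" {
      const ctr = r.clicks / r.impressions;
      if (ctr > 0.2 && r.medianDwellSeconds > 60) return "likely relevant"; // clicked AND kept people
      if (ctr > 0.2 && r.medianDwellSeconds < 5) return "possibly baited";  // clicked, then pogo-sticked back
      return "inconclusive";
    }

    console.log(qualityHint({ impressions: 1000, clicks: 300, medianDwellSeconds: 90 })); // "likely relevant"
    console.log(qualityHint({ impressions: 1000, clicks: 300, medianDwellSeconds: 3 }));  // "possibly baited"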

Do Other Metrics Matter?

I’m not suggesting that bounce rate and other user metrics don’t matter. As I said, dwell time is connected (and probably well correlated) to both bounce rate and time-on-site. Glenn Gabe had a nice post on “actual bounce rate” and why dwell time may represent an improvement over bounce rate. I’m also sticking to traditional user metrics from analytics and leaving out broader metrics, like site speed and social signals, which clearly tie into user behavior.

What I want you to do is to take a broader view of these user metrics, from the search engine’s perspective, and not get obsessed with the SEO impact of your analytics data. I’ve seen people removing and even manipulating GA tags lately, for fear of SEO issues, and what they usually end up doing is just destroying the reliability of their own data. I don’t think either Google or Bing is using direct analytics data, and even if they do down the road, they’ll probably combine that data with other factors.

So, What Should You Do?

You should create search snippets that drive clicks to relevant pages and build pages that make people stay on your site. At the end of the day, it sounds pretty obvious, and it’s good for both SEO and conversion. Specifically, think about the combo – driving clicks is useless (and probably even detrimental to SEO) if most of the people clicking immediately leave your site. Work to find the balance and to target relevant keywords that drive the right clicks.

66 Comments

An interesting post, but I believe that looking purely at these two metrics from a user point of view is not the key here.

I agree with you when you say that Matt Cutts has never been so clear in his statements, but the reality is that if we start blindly believing whatever Google representatives say, it would be like stabbing ourselves with a sharp-edged knife.

Honestly, if I had to predict one of the ways Google would rank websites based on CTR/user behavior, it would involve at least three basic steps:

Step 1

Google was granted a patent that talks about classifying websites into categories. To give everyone the gist, search engines will be able to classify and categorize websites, which will eventually help them determine relevance, which will in turn (to some extent) help determine rankings. Of course, other factors like backlink profile, social presence, etc. will be taken into consideration. So basically, the way I see it, websites will be classified into specific categories.

Step 2

Bounce rate data is fine; everybody talks about that, but what’s surprising is that no one talks about the exit rate of a particular page. From an analytics perspective, I would definitely look at a page’s exit rate, since that is the page from which a user leaves after navigating through my website.

Who knows, maybe even in-page analytics data can be used. So if I had to predict, exit rate and in-page analytics data should ideally be used more than bounce rate. One could argue, why not site load time? The answer is no, for the simple reason that it’s collected from a sample, and using data based on a sample set would be very stupid.

Step 3

Now we have websites categorized, and we have the required GA data; combine that with Google’s patent on demoting search results. Again, to give you all the gist, this patent states that the search engine might demote results if a user did not click on any of the similar results provided for a previous search made with a similar query.

They said that they don’t crawl Flash or JavaScript, but according to Mike they definitely do.

Similarly, they say that they don’t use GA, but when it’s a matter of search supremacy I find it hard to believe they won’t use it.

Again I am not stating that the two user metrics stated above are not important but I think it’s high time that we stop limiting ourselves to some basic metrics.

My point isn't that analytics don't matter or that you should narrowly focus on 2 metrics - my point is that I think search engines can learn a lot just from those 2 metrics and are probably using those 2 metrics to impact ranking. I definitely think exit rate and other metrics can be useful for analyzing the effectiveness of your site and for CRO.

I'd also say that (1), while perfectly valid, has nothing to do with user metrics. Classification can happen purely from on-page content.

Then perhaps you need a better title for your post. It's not 2 User Metrics, it's 2 Search Engine Metrics or something else. I clicked on this post thinking that it would be about metrics I can present to clients and end users to convey the success of SEO. This post, while interesting, wasn't exactly about useful user metrics.

I think dwell time as simply how long a user stays on a result isn't quite enough; they would also probably look at what the user does next. For instance: query, click first result, hit back, click second result would indicate the first result didn't satisfy; query, click first result, do a totally different query would indicate they were satisfied (regardless of time between actions); query, click some results, refine query would probably have no bearing.

Time on site can be a really misleading metric in Google Analytics, especially for bounces. I've come up with an advanced tracking technique that can give site owners a much better understanding of how people interact with their content.

Using some custom code, we can measure when a visitor starts scrolling, when they hit the bottom of the page content and when they get to the bottom of the HTML. We can also track the time all of these actions take.

This gives a very interesting picture of "dwell" and what content is really popular.
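
A minimal sketch of what that kind of tracking might look like, assuming the classic asynchronous GA snippet (_gaq) is already on the page; the "#content" selector and the event names are placeholders, not part of any standard setup:

    // Minimal scroll-tracking sketch. Assumes the classic async GA snippet (_gaq) is
    // already installed; "#content" and the event names are placeholders.
    declare const _gaq: any[];

    const loadedAt = Date.now();
    let firedScrollStart = false;
    let firedContentBottom = false;

    window.addEventListener("scroll", () => {
      const secondsOnPage = Math.round((Date.now() - loadedAt) / 1000);

      if (!firedScrollStart) {
        firedScrollStart = true;
        // A regular (non-"non-interaction") event also stops GA counting this visit as a bounce.
        _gaq.push(["_trackEvent", "Reading", "ScrollStart", document.location.pathname, secondsOnPage]);
      }

      const content = document.querySelector("#content") as HTMLElement | null;
      if (content && !firedContentBottom &&
          window.scrollY + window.innerHeight >= content.offsetTop + content.offsetHeight) {
        firedContentBottom = true;
        _gaq.push(["_trackEvent", "Reading", "ContentBottom", document.location.pathname, secondsOnPage]);
      }
    });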

I believe him (Cutts, not Dr. Pete) simply because they don't need analytics data - if your dwell time sucks for organic traffic compared to the other SERP results being shown, that's all they care about. Referral data is likely similar, and less relevant to Google's needs.

"One argument you hear all the time is that Google can’t possibly use something like bounce rate as a ranking signal, because bounce rate is very site-dependent and unreliable by itself."

In response to people who believe this, all Google needs to do is compare the bounce rates of 10 SERP entries. The lowest one is probably giving users the best solution. That's pretty reliable in my book, and it's relevant for just about every SERP they show.

"...all Google needs to do is compare the bounce rates of 10 SERP entries. The lowest one is probably giving users the best solution"

A low bounce rate isn't always a good thing -- it's entirely dependent upon the query. If I do a search for very specific information (the Republican debate schedule, for example) and the page I land on doesn't offer the exact information I require but links to it as a secondary page, that site's bounce rate gets artificially deflated. In your eyes, that site would be deemed superior to one that's optimized to efficiently provide the user the exact information they're searching for.

A low bounce rate can actually be an indicator of poor site structure and navigation. University sites are great examples of this -- try searching for tuition information and you may end up on a page that doesn't include actual tuition costs. Given the vast amounts of information and often-confusing language employed on these pages, it's easy for an average searcher to click 3-5 pages deep before finding the content they were actually looking for.

Beyond that, many searches, by their nature, encourage result comparison. If I'm in the market to buy a camera and I search for reviews, I'll likely want to see more than one opinion. Why should I take the first result as gospel truth? I'm much more likely to bounce back to the SERP for that query regardless of the quality of the content on the page I land on.

I do believe that Google's using bounce rate (back to SERPs) and dwell time as quality indicators but I don't think they can be universally applied, as you suggest.

Agreed, but this is where dwell time comes into play - if the site sucks, they'll be back to the original SERP even if it requires 2 or 3 clicks of the back button. My use of the word bounce rate specifically was a poor choice.

"I do believe that Google's using bounce rate (back to SERPs) and dwell time as quality indicators but I don't think they can be universally applied, as you suggest."

My suggestion isn't a universal application - it's very specific, I think. I'm talking about using it in comparison to the other SERP results being shown. If all of them have reviews but you want to see more and click the back button, then that is normal for that SERP. If Amazon clicks don't come back because the reviews rock, but clicks on all of the other entries result in clicks back to find other reviews, then that is an example of one result sticking out, and likely deserving to rank higher.

The following sentence may appear as a cliché too often used as artificial politeness, but I actually mean this: Thank you for sharing these informative and important counter-examples to keep in mind.

I believe something else is slipping below the radar, however. In each of the examples (and most clearly for the "Republican debate schedule" example and the university site), very strong legitimate authority comes into play.

For the university site, they have a monopoly on being the absolute best and most reliable source of information for a search on the university's tuition rate. To push it to an extreme for clarity: if there were just one website in the world that shared a legitimate and proven 5-minute home procedure requiring only 2 teaspoons of car battery acid and a vinyl copy of Miles Davis' "Kind of Blue" (hopefully the former never touches the precious latter!), then doesn't very poor but tolerable usability and design become something that shouldn't be weighted heavily, if at all? If you're searching for something as important as university fees, would you trust any other site more than that university's site? The university site offers such original and important content that the importance of accuracy and credibility far outweighs the inconvenience of having to click a couple more times than on other sites. This is also why a webmaster who is only going to link to one site on a topic will likely link to the university site rather than my homespun WordPress blog that shares the same info in a very pleasing format, serving up what they're searching for without requiring them to think (thanks to Steve Krug).

Now, if usability were poor beyond a given threshold, then the potential usefulness of other sites sourcing the tuition info from the university site would go up. I've seen some very poor university sites, but I wager a relatively small percentage make it so difficult to find such basic info that it's worth going to another site instead and possibly getting inaccurate or dated info that's needed to determine whether one can afford to be a freshman.

I haven't read this anywhere, but I can't come up with any strong reasoning for why it couldn't be the case, given all we know about the relatively high importance of authority as a ranking factor.

How is dwell time affected when using things like iPhones or iPads? A lot of the time with these devices, the app is kept running in the background with the last page you had open still on the screen.

If you let the device just fall into sleep mode, the app is still active and churning away the time spent on the site. There is also the fact that people will have many tabs open on these devices, which also contributes to false statistics.

If you ask me, using dwell time as a contributing factor to define a successful online marketing campaign is just plain backwards.

It would be the same as using HTML errors, grammatical issues, or even raw visitor numbers.

Google will have access to CTRs and bounce rates even without Analytics, and that's already a lot.
At the same time, one of the first thoughts I had when they came up with Chrome was that this is probably the best tool ever for them to see absolutely EVERYTHING.
I believe that NOT using analytics because you don't want Google to see your poor bounce rate, time on site, or pages/visit is just not going to work :) - so use GA!

"Dwell time" is clearly explained here and links to google statements about analytics data are very usefull for the purpose of this post.

I've tried different kinds of analysis mixing Google Analytics and Google Webmaster Tools on different sets of keywords (branded, unbranded, long tail, ...). I don't see any clear link between rankings and engagement metrics in my reports. I've reached the conclusion that Google mixes a bunch of user data, and this mix depends on the type (category) of the website.

Therefore, my suggestion is to analyze bounce rate, exit rate, CTR, and other engagement data through segmentation, because I strongly believe that in engagement analysis, the category of the website matters. Google is always trying to improve search results based on search intent, which corresponds to what the user needs, and each need has a category.

Dwell time is surely an SEO metric, but I would also point to the necessity - hence the possibility, for Google - of calculating it based on the nature of the page being analyzed. Let's take an extreme example: a blog post page... someone searches on Google, finds a post, clicks, enters, reads the post quickly or simply clicks a Read It Later bookmarklet, and goes back to Google to search for some other source on the same topic... and the time used to do this is less than two minutes.

Would this be considered a bad signal? I don't think it should be.

Dwell time, then, is surely important, but it should be considered just one metric and surely only one of many factors.

Would it make sense for Google not to use a standard definition of low dwell time, but one directly correlated with the amount of content on the page? Has anyone ever tested the dwell time required to bring up the option to 'block' results from a site - is it standardised?

Secondly, I think this post should serve as a reminder that meta descriptions remain important to SEO. I think too many people instantly disregarded meta descriptions after Google declared that they were not used for rankings, and it still shocks me how many sites don't utilise them at all. I think it also highlights the importance of having content writers, or SEOs skilled in writing more traditional marketing copy, handle the meta description copy.

Unfortunately, the block feature has been in and out of testing and has probably been adjusted over time, so it's been hard to get a handle on exactly how it works. It would certainly be interesting to know what the delay is - I get the sense it's pretty short.

My data (based on way too many searches and too much time spent with a watch timing it) from google.dk showed a dwell time of 1 min 40 sec (or 100 seconds). The dwell time and the possibility to block a domain remained unchanged, regardless of browser, logged in or not, etc.

I wrote a blog post about it on my Danish blog here. I have thought about writing an English version of my findings, but haven't found the time yet.

But anyway, there you go. You have to keep your visitors from bouncing back to the SERPs within the first 100 seconds of their visit.

Thanks for the follow up. Quick question: did you test this on multiple domains? If so, was the 'dwell time' always 100 seconds i.e. it wasn't related to what/how much content was on that specific web page?

I tested it on a series of different domains, ranging from low end Made-For-Adsense or thin affiliate sites, right through to websites of some of the largest Danish newspapers and national television channels. And the "dwell time" didn't change.

I also ran tests from different computers, IPs, browsers, logged in vs. logged out, whether the site used Google Analytics or not, and tried to visit more pages on-site if they used GA - and none of it made a difference. I also tested whether clicking on more results from the same SERP page changed anything, and the threshold remained the same for the second and third clicks.

This is a great post. I am saddened by my own CTR, and I wonder if Google's rankings take into account that, for businesses in my industry, the consumer may not want a lot of information, just a trustworthy service (I hope I'm right). These customers are mainly looking for a phone number, and they call the number displayed on the first landing page, I feel.

Great article, Pete! I agree that we shouldn't worry about Google using analytics information. Great SEO uses so many different variables, and all too often we become fixated on one thing. While CTR (click-through rate) and bounce rate are important, we can't neglect great content, proper link structure, and a catchy snippet.

Great article; these are the two that I have always watched the most. Plus, if you assign some weight to each one, or to each page, you can really start to get more precise about the behavior of people visiting your website.

Even 3 years old, this post is timely since I don't think many SEOs understand the difference between "bounce rate" as calculated by analytics software, and "dwell time." Bounce rate, as calculated from Google Analytics, is unlikely to be a ranking factor. Dwell time, meanwhile, makes much more sense as a more accurate metric of quality.

I don't think the bounce rate recorded by GA has much, if any, influence on Google SERPs - some of my pages with top SERP positions have a high bounce rate.

From my viewpoint, I am happy if a visitor lands on a page, finds what they are looking for, and leaves. Having said that, time on page tells me if the information sought has been found - if the visitor lands on a page with a lot of content and leaves immediately, it tells me the info is not relevant to their search, not understandable or presented well enough, or possibly too complex for easy reading.

If Google does use analytics data to refine SERPs, that's fine by me - the algorithm considers so many factors that bounce rate and dwell time would only play a small part. I'll stick to concentrating on content factors, and advising my clients to do the same.

(I use several statistics tracking tools, preferring server generated stats when available, Webmaster Tools, WordPress and Drupal tools as well as GA)

I've always been dubious about installing Google Analytics, but I've also found that the sites GA isn't installed on just don't seem to work quite as efficiently at getting new content live and ranking. I'm sure it's just a coincidence, but...

SEO involves so many variables that it's easy to become fixated on any single one you hear a lot about, which is why I think some people have become obsessed with bounce rate. I think "Dwell Time" makes a lot of sense. Besides, if a large percentage of its users click on your site and then come right back to Google, Google the Search Engine doesn't even need to look at Google Analytics to learn that your site has a high bounce rate.

I've been saying CTR and bounce rate, balanced with time on site (aka dwell time), are key metrics - I even tried creating a bot to 'improve' CTR for certain queries, but apparently Google was smarter than the bot :p

Many will argue that bounce rate and CTR are terrible metrics, because what if you are looking up the date and time? You are only going to stay a second or two!

The answer to that is yes, for some queries visitors will only stay a second before they bounce, but search engines take this in context. These metrics are relative and compared against similar queries - so if someone was searching for "SEO guide", the 'acceptable' dwell time would be higher than for a query about time zones.
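
As a rough sketch of that kind of relative comparison (the numbers and the median approach here are purely illustrative, not anyone's actual formula):

    // Illustrative only: judge a result's dwell time against its SERP peers, not absolutely.
    function relativeDwell(dwellSeconds: number, peerDwellSeconds: number[]): number {
      const sorted = [...peerDwellSeconds].sort((a, b) => a - b);
      const peerMedian = sorted[Math.floor(sorted.length / 2)];
      return dwellSeconds / peerMedian; // > 1 means it holds visitors longer than its peers
    }

    console.log(relativeDwell(40, [25, 30, 35]));    // ~1.33 - strong for a quick-answer query like a time zone lookup
    console.log(relativeDwell(40, [120, 150, 180])); // ~0.27 - weak for a long-form query like "SEO guide"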

User metrics like this are the best way to account for quality. Links are useful for indexing content and guessing at which results should appear on page 1, but those results need to be validated by user data to prove that they deserve to stay on page 1. For this reason, I believe that link spam is an irrelevant topic as long as your content is relevant and high quality, thus providing an exceptional user experience.

A page on one of my websites had been ranking between #2 and #3 for a long time (for a not very competitive keyword), but it was getting about 50% CTR, probably due to the title tag and the meta description. After a couple of months floating between positions 2 and 3, it suddenly bounced to #1. I have to admit, links to this page get scraped from time to time, but I haven't noticed any new scraped links for this one in particular, and I haven't been link building for this page in the last few months. Of course, I cannot be 100% sure that the reason for the bump in rankings was CTR, but I suspect so. "Dwell time" could also have been supporting CTR, as the page gives the very specific information described by its title - hence users are unlikely to go back to the SERP, although the bounce rate is really high (confirming that if this metric is used at all, it must be used relative to the average for that specific keyword's results) and time on site is really low, as the information is quick to read.

Following is the Google Analytics data for one site where there was a drastic change in the bounce rate after the site was redesigned. The data shows that a high bounce rate does not necessarily mean low visit quality and an irrelevant landing page, especially when it comes to organic search rankings - and data does not lie.

But if you observe the above case study for organic (SEO) ranking data, this correlation does not hold true. With the old design, the bounce rate was low - in fact, very good - but the time spent on the site was divided across 4 pages: about 155 seconds per visit, with the visitor spending approx. 38 seconds per page.

After the new design, with a more optimized landing page focused on lead generation, the average time spent on the site is 160 seconds and the time spent on each page is approx. 80 seconds. The time spent per page is more - in fact, doubled - which indicates that the content and design appeal to the interests of the majority of visitors. After the redesign, the company got more genuine, targeted, and serious online enquiries, as the landing page was more geared toward lead generation, with contact details available on every page, along with a pleasing design, color combination, and content giving a clean, uncluttered, professional look.

Visits from the search engines show an increasing trend (the majority being from Google – so the rankings are also not affected by the high bounce rate). Hence, coming to a conclusion by observing just one metric in isolation does not convey the right message. Data has to be analyzed in a cohesive way, focusing holistically on the answers being sought.

Interesting data. Your analysis is also interesting. When I first viewed the data, I thought the update was a bust. But what you say makes sense - they're not clicking around because they're finding what they need.

Though Google is not transparent, I agree that SERP CTR really affects a website's rankings. It really matters, because Google's ultimate objective is to land users on the best websites through search.

What are your thoughts on dwell time in terms of how long the user stays on the page? At what point would they be considered 'inactive', and how does this affect what the search engines think of the site?

Personally I'd suggest that staying on a site beyond a certain time is always considered to be positive. It would be very rare for someone to click through to a page from a search engine, consider that page to not be useful but then forget to close it or leave it open whilst they ventured off to do something else. Deciding that something is not useful is, ordinarily, a very quick process.

I suspect they only look at very short dwell times, and consider that a negative factor. It's probably not a gradual scale or a positive ranking factor (in other words, if someone spends 90 seconds on a page, that's not better than 30 seconds). That's just speculation, of course.

There's still debate in SEO circles about CTR as a signal (it's noisy, unreliable, easy to game...yet every signal can be gamed), but to me there's no debate. Any proprietary metric that Google divulges in Webmaster Tools is Google's way of saying "hey, Webmaster, this is really, really important to us and should be to you too!" So IMO anything in WMT is a likely ranking signal. We've seen it with site speed and +1 metrics recently. I imagine the same holds true for CTR.

It's tricky, unfortunately. Bounce rate and time-on-site can be indicators - a site with low bounce and high time-on-site will typically have high (good) dwell time. The big problem with analytics is single-page visits, which impacts blogs and really any site that relies on social media. Time-on-site, for example, can only be calculated between two timestamps, which means two page visits. Google can use the time between the SERP visit and any one page on your site (in either direction). GA and other analytics need two pages from your site to do that math.

It's possible that we'll see more of an AJAX-style analytics approach down the road, where GA and other packages send back data from single pages. We're not there yet, though.
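
One rough way to approximate that today is with a timed event (a sketch, not an official GA feature; again assuming the classic _gaq snippet, and the 30-second threshold is an arbitrary choice):

    // Rough sketch of an engagement beacon for single-page visits, assuming classic _gaq.
    // After 30 seconds (an arbitrary threshold), send an event so GA has a second hit to
    // work with; because it isn't flagged as non-interaction, an engaged one-page visit
    // also stops counting as a bounce.
    declare const _gaq: any[];

    setTimeout(() => {
      _gaq.push(["_trackEvent", "Engagement", "StayedThirtySeconds"]);
    }, 30000);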

I've made the argument about dwell time being used in Google's algo from time to time. Most people don't believe it, maybe because, unlike on-page content and backlinks, it is something that is not as easily manipulated. Others argue that dwell time might be low for certain content, like checking the weather or the status of an arriving flight. However, I argue that dwell time would be computed in the algo relative to the dwell time for other sites coming up for the same search. So if the average dwell time on a weather site is 30 seconds, a dwell time of 40 seconds is better than average - though it could also mean that site is just not as efficient and user-friendly. In general, though, I believe bounce rate is a KEY factor and one of the most reliable. If someone clicks on a site and never comes back to click on any more SERP results or enter a new search, most likely they found what they were looking for. Very powerful signal.

Great post and great way to highlight these important basic factors. It definitely makes sense. It also makes me happy because it seems that these factors will help de-junk the SERPs, which is nice for everyone. It also points to the importance of focusing on UX rather than the bots.

Google Analytics is one of the best tools for analyzing a website: it helps you manage the site and provides information about it, including bounce rate, click-through rate, and so on. If you want to untangle a website's problems, you should be using Google Analytics.

A friend of mine says their company never wanted to use Google Analytics because they don't trust Google not to look into their site traffic data and eventually make Google versions of what they are offering.

Yeah, there are a bunch that might even be better. I reference Piwik because it's self-hosted, so you control all the data, and it's open-source, so you can hack it up however you like. That should calm all possible paranoia.

I definitely don't think the paranoia is warranted, but I should say that I DO think Google uses GA data - especially in aggregate. They may use it to improve how the algorithm works or to test ranking factors. I just don't think they use an individual site's analytics data directly as a ranking factor for that site.