Posted
by
kdawson
on Monday July 09, 2007 @09:01PM
from the bye-bye-page-views dept.

StonyandCher write(s) with news that one of the largest Net measurement companies, Nielsen/NetRatings, is about to abandon page views as its primary metric for comparing sites. Instead the company will use total time spent on a site. The article notes, "This is likely to affect Google's ranking because while users visit the site often, they don't usually spend much time there. 'It is not that page views are irrelevant now, but they are a less accurate gauge of total site traffic and engagement,' said Scott Ross, director of product marketing at Nielsen/NetRatings. 'Total minutes is the most accurate gauge to compare between two sites. If [Web] 1.0 is full page refreshes for content, Web 2.0 is, "How do I minimize page views and deliver content more seamlessly?"'"

More likely, Google doesn't give a shit whether some Web 2.0 metric gives it a pat on the shoulder. Google:

1. Makes its money out of serving ads, not out of being the site where you spend an hour on the same page. If you came, searched and looked at their ads, that's it.

2. Google's secret sauce is its brand name and search algorithm, not its Nielsen rating. People go to Google because they have something to search for, and Google gets new users by word of mouth and through deals with the likes of Mozilla to make its site the default home page. It's not like users start with Nielsen's Top X page and find out about Google there.

In other words, it seems... surrealistic to read the title and summary that Nielsen's ratings will hurt Google. Google doesn't get any income or users out of what rating it has, so the amount of "hurt" will be anywhere between insignificant and none whatsoever.

3. It seems to me like a flawed rating anyway, _especially_ coming from a usability expert. Google's search is a tool. Being able to just do what you needed done, quickly and with a minimum of useless fluff, is what a lot of us would call a good tool.

And the need for such tools won't go away just because some other sites work in a different way. The existence of eBay (as an example of a site where users spend a lot of time at a stretch) didn't make Google obsolete before, so why would it now?

4. Why the heck does it even matter, other than techno-fetishism, in Google's case, whether it's page refreshes or some AJAX kind of thing that fetches the results in the same page? No, seriously. Each search produces a different list, so essentially it _is_ a different "document". The browser is already perfectly able to display a new document. Why would anyone sane want to try to, basically, reinvent the page refresh in Javascript instead of using the browser's existing mechanism?

AJAX and the like make sense when you can actually have most of the data and the processing client-side, and you can actually offer some purely client-side functionality. In Google's case that's not even possible. You can't transfer the whole search database to the browser as XML and let the user tinker with the search expression locally, in the same document. So it's going to involve a round trip to the server and displaying a new result list anyway. So why not just let the browser display the new page?

Nielsen is generally a smart guy, but maybe there is no One True Metric to bring them all and in the darkness bind them. For some things it is a usability advantage to do more client side and not refresh the page, while for other things it makes no sense whatsoever. The focus should be on how well and intuitively the user is served by the site, not on promoting one arbitrary metric like time spent, taken out of context, for everything.

Because on a typical website, half of the page content is exactly the same on every single page: logo, header, footer, navigation rail, etc. The content well is the only part that's different from page to page.

Why should the client and server request and return those page elements over and over again if they never change? AJAX allows only the elements that actually change to be requested and returned.
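That saving can be sketched in a few lines. This is a hypothetical handler, not any real framework's API: the idea is simply that a full refresh re-sends the unchanging chrome, while a fragment request ships only the content well.

```python
# Hypothetical sketch (not any real framework's API): a handler that returns
# the full page on a normal request, but only the changing "content well"
# when the client asks for a fragment, as AJAX front-ends typically do.

HEADER = "<div id='header'>logo, nav rail...</div>"
FOOTER = "<div id='footer'>copyright...</div>"

def render(content, fragment_only=False):
    """Return just the content well for AJAX requests, the whole page otherwise."""
    well = "<div id='content'>%s</div>" % content
    if fragment_only:
        return well                      # AJAX: ship only what changed
    return HEADER + well + FOOTER        # full refresh: ship everything again

full = render("story text")
frag = render("story text", fragment_only=True)
print(len(frag) < len(full))  # the fragment is strictly smaller
```

Multiply that per-request saving by every header, footer, and navigation rail a site would otherwise re-send, and the bandwidth argument for partial updates is clear.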

Exactly. Maybe the fact that a perfectly usable and popular tool scores badly on their new metrics, _and_ there's no imaginable gain whatsoever if it changed itself needlessly to fit the new metric, should only tell them that their new metric needs some more work.

What matters is not page views or page durations but redirects to pay-sites. That's the value of any site from an advertiser's point of view. When I read the NY Times I spend a long time there, but I'm not likely to be shopping, and even if I were in a mood to shop, the probability that they happen to show me an ad for something I'm interested in is close to nil. Plus the ads they tend to show there are deluxe animated Flash ads or big long columns.

Now when I go to Google and type in "blow up dolls" or airline miles or 529 investments or some purchase-worthy topic, I read the ads. Not only that, but the ads are short, so I don't spend much time. But I click through a dozen of them into tabs in a few seconds.

When I watch YouTube, how long do I look at the ads? Probably not at all; I scroll them off screen. But I do see the ads on the leader of the video. That's only a few seconds of a 5-minute video, though. A good and focused 5 seconds, yes. Even subject-worthy 5 seconds. But not 5 minutes.

You bring up some good points for a majority of ads, but I'd like to add something. You are basically saying that ads are only useful if they result in a direct purchase.
The thing about ads is that it's not whether they lead to a direct purchase. It's the fact that you know the product exists and is mainstream. For example, you can't buy food from McDonald's online (yet), so by your logic an internet ad for their product would be useless. It's more of a long-term customer base they are building.

This is the most hilariously worded Slashdot headline I've seen in like a week. The guy was basically describing an algorithm, nothing more. For technical reasons page view statistics are becoming irrelevant, so now they calculate a new metric that supposedly gives weight to longer user session lifetimes. Maybe they just pay more attention to overall HTTP query traffic or something. The effect of this would be, say, to boost a site such as AOL chat (an extreme example of a site with a low page view count and long session lifetime), and de-emphasize a site such as Google (an extreme example of a site with a high page view count and short session lifetime). For purposes of illustration, he just picked two examples that would make sense to people.

The article submission takes the angle that this is a kick in the nuts for Google! As if Google depends on Nielsen's reporting high metrics to advertisers so that they can charge more for banner ads! So Nielsen would report a low metric for Google! Oooh, what intrigue! Nielsen has their balls in a sling now! How will Google retaliate?

But that wasn't the point the guy was making at all; for him, Google was just a clear illustration of one extreme. I would guess that nobody in either company is really concerned about Nielsen's calculated metric for Google. Google acts as its own Nielsen and competes with Nielsen using a not-quite-equivalent business model. It's a sort of integrated content provider/ratings company all on its own. They don't need to have their metrics reported to advertisers. Advertisers are showing up with money already for that AdSense program, and the cost is associated with a metric calculated for a search term, not Google as a content provider itself. The advertiser has already chosen Google (as the content provider), so implicitly of course they also have to agree to the terms of Google's ratings service, since it's part of the package. Nielsen's rating of the Google home page doesn't enter into it! Just ask anyone using AdSense if they gave a crap about Google's Nielsen rating.

Sessions are what they'll use, and sessions are what many analytics packages (Google's included) use for measuring time spent at a site.

Is that why I've been getting page views that take forever to close their connection? They're keeping a download incomplete so that they can measure when the client gives up as time visited per page?

Anyway, they shouldn't just abandon page hits for time spent. Lots of quick impressions should be just as valuable as a few long impressions, maybe even more so depending on the type of ads being sold (static splash vs. animated Flash).

Sure, sessions will work for sites like forums. However, is there going to be anything shown by session length that won't be shown by page views in that case? What about pages that you can really spend days to weeks at a time staring at, such as the glibc [gnu.org] manual or the Coyotos microkernel specification [coyotos.org]? If the user never refreshes the page before the end of the session, information-packed sites aren't going to be measured at all.

This was all the rage about 10 years ago - pages had to become more 'sticky', or so marketing people told everyone. I think this led directly to the demise of the blink tag - no one could bear to look at blinking text for any period of time. You made a page more sticky by providing better and more in-depth content. What actually happened is that sites started splitting up content over 10 or 20 pages, à la ad-view-generating tech sites today. Prepare for unending mazes of content to make you stay much longer on one web site.


This sounds like a needed new Firefox plug-in: Content Re-aggregator. It detects multi-page articles and re-assembles them into one page, or at least pre-loads them all. It doesn't actually have to detect anything: in manual mode you tell it when to re-aggregate, and in ultra-dumb mode maybe you even show it where the "next" button is. Then wham: it re-aggregates the content, strips out the ads and replaces them with Google ads. Cha-ching.
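The core of such a plug-in is trivial to sketch. Everything here is invented for illustration: a real add-on would fetch over HTTP and parse real markup, whereas this toy just walks a chain of already-fetched pages by their "next" links and stitches the bodies together.

```python
# Toy sketch of the "Content Re-aggregator" idea: given already-fetched
# pages and each page's "next" link, stitch a multi-page article into one
# document. All names and data here are made up for illustration.

def reaggregate(pages, start_url):
    """Follow 'next' links from start_url and concatenate the article bodies."""
    parts, url, seen = [], start_url, set()
    while url and url not in seen:       # 'seen' guards against link loops
        seen.add(url)
        body, next_url = pages[url]
        parts.append(body)
        url = next_url
    return "\n".join(parts)

article = {
    "/story?page=1": ("Part one.", "/story?page=2"),
    "/story?page=2": ("Part two.", "/story?page=3"),
    "/story?page=3": ("Part three.", None),
}
print(reaggregate(article, "/story?page=1"))
```

The loop-guard matters in practice: paginated sites sometimes link the last page back to the first, and without it the "ultra-dumb mode" would re-aggregate forever.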

Wow, and I just designed a site that works just like that. Kind of an ADHD navigation system. If you're interested try http://www.worldwakesurf.com [worldwakesurf.com]. Wow, I feel like a visionary. (Oh, and I have the top Google spot after four days)

Use some programmability/flash/whatever to keep pinging back to the host.
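Server-side, those pings have to be turned back into a duration. A minimal sketch, assuming a made-up ping period of 30 seconds: sum the gaps between consecutive pings, discarding any gap much larger than the period (tab closed, machine asleep, user gone).

```python
# Sketch of turning heartbeat pings into "active time": the client pings
# every PERIOD seconds while the page is open, and the server sums the
# gaps, ignoring any gap longer than the period. Purely illustrative;
# the period and threshold are assumptions, not any vendor's values.

PERIOD = 30  # seconds between client pings (assumed)

def active_seconds(ping_times, period=PERIOD):
    """Estimate time the page was open from a sorted list of ping timestamps."""
    total = 0
    for prev, cur in zip(ping_times, ping_times[1:]):
        gap = cur - prev
        if gap <= period * 1.5:   # allow jitter; larger gaps mean the user left
            total += gap
    return total

# pings every 30 s for two minutes, then one stray ping an hour later
print(active_seconds([0, 30, 60, 90, 120, 3720]))  # 120, not 3720
```

Note what this buys compared to raw first-ping-to-last-ping arithmetic: the stray ping an hour later would otherwise inflate the visit to over an hour.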

Right, so the users behind my NAT are going to be measured as one person spending all day on somepopularsite.com, in 8 different places simultaneously? What about the four other tabs currently open in my browser? Am I still visiting those sites? The answer could be 'yes', but I don't see how that adds value for advertisers.
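The NAT problem is easy to show in numbers. A toy comparison (data invented for illustration) of counting distinct IP addresses versus distinct client-side identifiers such as cookies:

```python
# Why per-IP counting undercounts users behind NAT: everyone in the office
# shares one public address. Compare counting distinct IPs with counting
# distinct client-side identifiers (cookies). Data is invented.

hits = [
    {"ip": "203.0.113.7", "cookie": "alice"},   # four users, one NAT gateway
    {"ip": "203.0.113.7", "cookie": "bob"},
    {"ip": "203.0.113.7", "cookie": "carol"},
    {"ip": "203.0.113.7", "cookie": "dave"},
]

by_ip = len({h["ip"] for h in hits})
by_cookie = len({h["cookie"] for h in hits})
print(by_ip, by_cookie)  # 1 vs 4: the IP count collapses the whole office
```

Cookies fix the undercount only for users who accept them, which is exactly the measurement gap the cookie-blocking comment further down points out.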

HTTP is a stateless protocol, which means that it's inherently difficult (i.e. impossible) to consistently get accurate data about the duration of a given visit. It can be argued that you can derive data that's statistically significant. You can argue further that if everyone uses the same metric then they'll be valid for comparison purposes, which is enough for the MBAs in Marketing, I suppose.

I personally think time spent on a website is a silly metric, and will continue to hold that opinion until someone can make the case that staring at an advertisement for a longer period of time actually encourages a person to finally click on it, rather than tune it out completely. (This works well for branding, but for little else.)

There's a lot of nuance that can be brought into this discussion, and this is where the good advertisers and marketers earn their keep. Assuming that either page views or time spent on a site are sufficient to make a solid judgement of the value of a given website is, uh, a little short on nuance.

If [Web] 1.0 is full page refreshes for content, Web 2.0 is, "How do I minimize page views and deliver content more seamlessly?"

Unless of course your site uses advertising agencies that value page views; in which case you'll spread as little content as possible over as many pages as you can. A slightly puzzling trend I've seen more recently is to refresh the whole page every so often regardless of whether this provides any sort of benefit.

Just so I don't get karma-slapped upside the OT head... I've always thought of Nielsen as a mechanism for pricing ads; like all representations of average behaviour, it doesn't say shinola about a particular individual's viewing habits. So, as long as the advertisers think they're getting value out of the metric, that's fine. But I've never talked to anyone who used a Nielsen rating as a TV viewing guide.

Similarly, I've never talked to anyone who uses Nielsen/NetRatings as a measure of the usefulness, quality, level of interest, etc. of a web site. And NetRatings doesn't even have the mindshare of Nielsen the TV dudes. Anyway, in the context of a mechanism for ad pricing, Google is the web equivalent of a TV ad about TV ads, which doesn't make any sense for a NetRatings rating. For that matter, what's the NetRatings measure of http://www.nielsen-netratings.com/ [nielsen-netratings.com]?

Methinks that this announcement of a change in metric is just an attempt to get some profile for NetRatings' existence, and the notion of affecting google.com's measure for ads is plain absurd, because Google *is* the advertiser. Drawing an equivalency between an indexing and search discovery mechanism like Google and a less meta-focused content site is just boneheaded.

Seriously though, it might affect how many people choose to advertise through Google. Advertisers go for websites that they think are popular.

Personally I'd like to know if Nielsen/NetRatings plans to measure the time people spend actually looking at a site, rather than having it open in a background window, or leaving it open while they do something else.
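One conceivable way to do that is to have the page log window focus and blur events client-side and sum only the focused intervals. This sketch does just the server-side arithmetic on such a log; the event-log format is an assumption for illustration, not anything Nielsen has described.

```python
# Measuring time spent actually looking at a page rather than having it
# open in a background tab: given a log of window focus/blur events,
# sum only the intervals the page was in the foreground. The log format
# here is an assumption made up for this sketch.

def focused_seconds(events):
    """events: sorted (timestamp, 'focus'|'blur') pairs; returns seconds in focus."""
    total, focus_start = 0, None
    for ts, kind in events:
        if kind == "focus" and focus_start is None:
            focus_start = ts
        elif kind == "blur" and focus_start is not None:
            total += ts - focus_start
            focus_start = None
    return total

# page open for 5+ minutes, but only in the foreground for 90 s of it
log = [(0, "focus"), (60, "blur"), (300, "focus"), (330, "blur")]
print(focused_seconds(log))  # 90
```

The gap between "open" time (330 seconds here) and "focused" time (90 seconds) is exactly the background-tab inflation the comment is worried about.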

In my experience, most people don't bother to close their browser when they are done browsing. It's even worse for people used to tabbed browsing. How many times do you shut down the computer at night with tabs containing something you looked at with your morning coffee? I know I do as often as not.


That doesn't matter. Assuming you don't have some kind of page refresh every n seconds, most analytics software has timeout values between page loads. If you don't close your browser and then come back the next morning and continue where you left off, the analytics software should see that it's been more than 30 minutes between page loads and consider it a new visit.


That might be true, but what about when I open a link in a new tab from something I am reading but don't get to it for another 20 minutes? After I get to it, I notice that the link is crap and close it right away. Total time spent: 4 seconds. Total time they think I spent: 20 minutes 4 seconds.


There will always be examples like that. However, unless such behavior becomes even remotely normal then statistically speaking I don't think it would make a dent in usage patterns. Certainly not if you consider median usage rather than average.
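The median-versus-average point is worth making concrete. With invented numbers: a handful of pathological sessions (a tab forgotten for twenty minutes) drags the mean way up while barely moving the median.

```python
# The parent's median vs. average point, with invented numbers: one
# forgotten tab counted as a 20-minute "visit" distorts the mean duration
# but leaves the median almost untouched.

import statistics

# nine normal visits of 30-60 seconds, plus one forgotten tab of 1204 s
durations = [30, 35, 40, 45, 50, 52, 55, 58, 60, 1204]

print(round(statistics.mean(durations)))    # 163: pulled way up by the outlier
print(statistics.median(durations))         # 51.0: barely notices it
```

A ratings company reporting means would triple this site's apparent engagement on the strength of one forgotten tab; one reporting medians would not.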

Measuring internet stats is just as crappy as measuring TV stats. When you have kids, you will often find that the TV is on and blathering about something or other, but no one has been watching it for hours. I'm sure that happens in Nielsen homes as well.

1) iGoogle, Gmail, and other AJAX websites do a sort of self-update every so often. I wonder how those would factor into the ratings for people who always keep those open in a tab (I, for example, pretty much always have iGoogle open in a tab).

2) I'm a regular user of Opera, which, in its latest iteration, includes a feature called "Speed Dial [opera.com]." This feature consists of a tab that has previews of nine user-selected web pages. The user can define how often the page preview updates; I have mine set to every 30 minutes.

2) I'm a regular user of Opera...The user can define how often the page preview updates--I have mine set to every 30 minutes.

I love Opera too. If you right-click on any page there is a "Reload Every..." submenu on the context menu. I have 8 tabs in the background, updating at different rates depending on their content (active eBay bids every 20 minutes until close to the end, then every 5 seconds. News every 10 minutes. TV listings every 30 minutes (what a surprise), etc.)

I suppose this is somewhat offtopic from the story, but since you mention it, I do sometimes use that page reload feature in Opera. I have a question about it, though, that reading the story led me to do a little research on, but I couldn't find the answer. As far as I can tell, there is no way to set a page, for example Slashdot, to always reload every n minutes. Unless I have missed some corner, it seems that I would have to check the "enable" setting every time I open Slashdot for it to automatically reload.

As an update to my last post: I just saved a new session with Slashdot in one of the tabs set to reload every 5 minutes. I closed Opera and opened that session, and it did save the setting to reload every five minutes.

So this is half of what I want. Now if only I can figure out a way to have it always automatically reload Slashdot even if I close the saved session's Slashdot tab and open a new Slashdot tab.

For situations like that, I can see several possibilities which would emerge if such features gain traction. The first would be to implement a different user-agent string to identify this specific behavior. Instead of "Opera/X.Y..." it would be great to have "Opera-SpeedDial/X.Y...". This would allow you to filter out those hits, and it gives the added benefit that you can now specifically track how many Speed Dial visits there were and, should these increase sufficiently, perhaps design your pages in such a way as to accommodate them.

> I guess I can just say I'm glad I'm not in the business of calculating ratings for web pages. It seems like a difficult thing to measure, particularly in this day of tabs and self-refreshing web pages, etc.

Actually, it seems quite easy... Just pull some arbitrary metric out of your ass and slap the name "Nielsen Ratings" on it. Hey, it's "Nielsen Ratings" so it must be accurate.

I think the best metric might be 'multiple metrics'. Provide categories such as page views, unique page views, time on page, and so on.

At my workplace, we're forbidden from shutting our computers down at the end of the day. (Each desktop machine is used for distributed builds.) I keep Firefox open for weeks until it starts eating 100% of CPU for opening a link, then I kill-9 it and restart, restoring my tabs. I have 34 tabs open right now, though this one will be closed soon after I hit Submit.

And it doesn't appear to be a memory issue. I free up more memory by killing gnome-panel.

I mean really, it's not like anyone needs Nielsen to tell them that putting your ad on Google is going to generate fairly well targeted views. Personally, IF I were to advertise something, I'd rather it be unobtrusive (not annoying) and well-targeted: something Google does quite well.

Sometimes I visit a site that links to a lot of places (a common one is a Google search) and open every site in a different browser tab, and then I read. Now, the last tabs are likely to sit there for a long time, until I close them, read them, or even click on links there. Doesn't that kind of behaviour give more weight to the sites I opened last?

How does Nielsen account for Google usage that is embedded in other applications (Firefox), or in your own webpage? In those cases, I'm accessing Google via an API rather than surfing over to google.com and typing in my query there.

Yes, and the only reason we're talking about it is because the summary is a dumb, Google-whoring troll. For other types of Web sites, on the other hand, this sounds like a good thing. Judging a content site by page hits is just stupid. And yet that's the metric that everybody's using. What it means is that you have all the so-called news sites scrambling to stuff their pages with crap. They push the story about the world's ugliest dog over the latest story about corruption in the Bush administration, because the fluff is what racks up the page views.

Why would Google care if their Nielsen rating drops? A very low time-on-page, in my view as both a user and an AdWords advertiser, is good. I want a search engine that gives me what I want and lets me get to the content. I want advertisements that are concise and to the point, and that only catch the right person. The more time a person spends on a search results page, the more likely they are to click my ads for no real purpose other than to "see the result", driving up my advertising costs needlessly.

This is going to bite the rating company big-time. First thing, a fair percentage of the userbase does things that severely interfere with time-on-site measurements. Blocking cookies is an obvious one. Another is blocking of various Javascript functions like onunload, which prevents the page from seeing the user leave the site. Unless the site eliminates direct off-site links and always redirects through its own page, which users tend not to like either. And even after resolving all those issues, what constitutes "time on site" anyway?

Alexa gets its data from a toolbar, apparently...(I actually didn't know that, and now no longer trust their information). Where is Nielsen/NetRatings going to get their data from? It actually poses a good question about all these traffic reports for TV and internet. Are they self reported? Who checks the data? Just curious.

They don't get anything from me. Stats scripts, bugs and other junk get filtered and/or blocked, in and out. When I run across one I haven't seen before, it gets added right away, so they'll only log one or two hits and that's it. Stats which one can opt out of or avoid altogether are only slightly more accurate than the RNC surveying only Republicans to see if the "general public" likes the prez or not, or checking only Microsoft's server logs to come up with browser market shares...

PS: Alexa is a good example of why Nielsen would be bad at niche sites. If you get somebody in one of their panels to use your site, hurray: they say .5% of the world uses your site, even if only that one person uses your site at all. Of course Alexa is even worse, since they track a self-selecting audience. I don't have Alexa installed on this machine, but everybody at work does, so on Alexa it looks like 50% of Alexa users that visit our site spend 8 hours a day on it. And we can then guess how meaningful the rest of their numbers are.
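The extrapolation error is easy to quantify. With invented figures (a 200-person panel standing in for a million users), a single fan in the panel inflates a one-person site into thousands of supposed users:

```python
# The panel-extrapolation problem for niche sites, in numbers: with a small
# panel, one panelist who happens to use your niche site extrapolates to a
# wildly overstated share. All figures are invented for illustration.

PANEL_SIZE = 200                     # hypothetical panel
POPULATION = 1_000_000               # hypothetical market it represents

panelists_using_site = 1             # exactly one fan in the panel
estimated_share = panelists_using_site / PANEL_SIZE
estimated_users = estimated_share * POPULATION

print(f"{estimated_share:.1%}")      # 0.5% "of the world"
print(int(estimated_users))          # 5000 estimated users, from one person
```

The smaller the panel relative to the population, the coarser the quantization: with this panel, a site's estimated audience can only ever be a multiple of 5,000, no matter how niche it really is.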

This sort of metric will actually help the 'web 2.0'/AJAX-ey parts of Google. If you measure the amount of time I spend on GMail, it is all day, every day. It's always open in a tab. Same with Google Reader, and I refresh my iGoogle homepage once in a while. I bet this will show that GMail has a much larger market share than was suspected, because it is no longer measured on the basis of page views.

Guys, guys- they aren't going to measure how long your WINDOW is open, they are going to measure how long your session is active for. Your session will timeout eventually. They'll be able to account for that, and voila- problem solved.

They already do it, and will be doing it. Google Analytics delivers it. It's quite informative.

I just can't see how this hurts Google. Sure, entering a search and retrieving the result is generally VERY quick (maybe this is why it's my search engine of choice)...

But for the very reason that I don't need to spend much time there, and more often than not it's 2 clicks to my result (one click on "search" and the next click on one of the first-page search returns), I go there regularly as a starting point, resulting in a massive number of short visits.

If the measure is TOTAL time, google would still be number 1 followed closely by slashdot for me... Because 47 bazillion* one second page views per day is still 47 bazillion seconds of eyeball per day!

*The author realises that, as a complete idiot, he is prone to stupid exaggeration.

err!jak.

This summary takes the article's original title, which compares how this will hurt Google but help YouTube, as an example of how the new ranking method will affect different sites. If the ratings didn't change with a new metric, it wouldn't really be a new metric, would it? Why does Slashdot need to spin this just for the negative side?

Personally, I think this is a good change. Page views are a terrible metric, and encourage sites to make bad design choices, like breaking articles into twenty parts to maximize ad impressions.

I cannot believe anyone would take Nielsen seriously in this day and age, especially in regards to any sort of internet ranking system.
The "total time spent on a site" is a very inaccurate way to rank web pages. Take for instance Fark.com. It is a site that primarily links to other sites. I don't spend very much time on Fark, as I am clicking on the links to other sites. I check fark.com about thirty times each day for new news, but according to the big "N" that does not mean shite.

I predict this change will lead to more sites where all interaction and pacing is under the control of a designer, not the user. I can see it now:

PHB: "How can we get people to stay longer?"
Eager-Beaver Designer: "Let's put everything in Flash, put fewer words per screen and longer pauses between new screens."
PHB: "Great!"

My point is that I am a browser and I use a web browser. That means I want to browse. That means I want to be able to glance at something, make a quick decision, and control the movement to the next chunk of content.

This emphasis on viewing time will cause designers (and their bosses) to try anything they can think of to slow down the user.

~~begin quote~~
PHB: "How can we get people to stay longer?"
Eager-Beaver Designer: "Let's put everything in Flash, put fewer words per screen and longer pauses between new screens."
PHB: "Great!"
~~end quote~~

Hmm, I think they've already done this... it's called Web 2.0

In other news, Amazon has decided to allow worldwide royalty-free use of one click, whilst simultaneously patenting their new 'one hundred click slo-purchase' system.

But surely advertisers don't care how long you stay on a site except insofar as it increases your exposure to their ad. E.g., on Slashdot, you might spend ten minutes reading comments but quickly scroll past the ads in the first 30 seconds, and the rest is all content. However, if you choose to post a comment, an ad is visible on the comment pages and stays visible during the duration of your composition. I'd say the second ad, continuously viewed during the three minutes it takes to write a comment, is more valuable than the first ad, which goes off screen almost immediately.

Can someone please explain the rationale for declaring that a metric change will "hurt Google"? When is the last time someone decided to use a particular site based on a commercial web-rating? I certainly don't use Alexa to decide which news sites interest me, at which banks to do online banking, etc.

Certainly there are a few closet Google employees around here... So tell me, are the higher-ups even remotely concerned with a traffic ranking? I mean, if suddenly MSN Search spikes up over Google in the ratings because it's so goddamned user-hating that it takes 3 minutes to search a single topic... is anyone going to blow a gasket, provided traffic and revenue remain at present expected levels?

Shhhh... don't let on that you have seen through the charade! The sheep who run your competition will create crappy sites that force visitors to stay on them for a long time in order to get worthwhile content. Users will leave these sites, flocking to sites created by people who, like you, realize that ultimately it's about delivering a site people want to actually use.

The Nielsen metrics debate is really about advertising, which contrary to popular belief, does not apply to all sites. Even those sites th

This is likely to affect Google's ranking because while users visit the site often, they don't usually spend much time there. 'It is not that page views are irrelevant now, but they are a less accurate gauge of total site traffic and engagement,' said Scott Ross, director of product marketing at Nielsen/NetRatings

Don't you guys see what's going on here? A creative way to throw "Google" into the mix to get your press release better publicity.

'If [Web] 1.0 is full page refreshes for content, Web 2.0 is, "How do I minimize page views and deliver content more seamlessly?"'

Has anyone explained this to the marketoids? As far as I can tell, "Web 2.0" is a marketing term that means "We're new and improved and you should look at us so you can see the ads we present and make us money." I've found no consistent explanation from any of the supposed Web 2.0 purveyors as to exactly what they mean by it. If the ratings folks have a valid and generalizable definition, I'd love to see it.

If even half of users work like I do, then Google isn't going to suffer... in fact, they might even score higher. Here's why:

I would estimate that for 80% of my day, I have Google open.

Sure, I might not be looking at the page, in fact I'm probably not. I'm probably on one of the 15 tabs that I've opened from the search results. It might take me 5 minutes, or it might take me an hour to work through the results, but eventually I get back to the Google tab, and either search again, or close it.

If I close it, I'm willing to bet not 20 more minutes go by until I'm back there. I also have Google's personal homepage as my homepage, so it already has a head start.

Okay, I'm not the target of most advertising money... but I am of some. And that money can find me because of Google's keyword-based ad system. It sure doesn't find me on /. or the blogs or news sites I visit regularly. On those sites, any advertising that's obtrusive enough to get in the way of what I'm after gets Adblocked, pretty much in direct proportion to how much time I spend on those sites, and thus how annoying the particular advertising becomes to me. On a Google search, though, if it's about something I'm actually looking for, the ads stand a chance of being relevant.

What was the phrase? Oh yes... "it's about the data, stupid." This "2.0" crap generally has nothing to do with data; it's generally related to bullsh*t, and that's why most of us don't "get it" as having a point. And in that context, page hits are an excellent metric for data; time-sink is an excellent metric for "feel-good" crud. I think a lot of us see TFA as pointless because of that difference. The non-data crap has no point, so a metric that measures something pointless is... pointless.

How they are doing the measurement. Where do they get their raw numbers? How many samples do they collect, from whom, and how often?

What the error bands are.

Nielsen just isn't that clueful about the web, either 1.0 or 2.0 (blecch). Google will fall down the ratings? Does it matter? How much cash do they generate with ads that people click through, versus, say, Yahoo or MSN? Nielsen is once again behind the times.

FTFW: "How do I minimize page views and deliver content more seamlessly?"
Last time I checked, these 'seamless pages' use AJAX or technologies like it, which make calls to other pages through HTTP or XML requests in order to update what's on the current page. You still have page loads happening behind the scenes.

Most ads are unfortunately irrelevant to the user's needs, i.e. they try to sell you something you don't want in the first place. As a result, advertisers design their ads to attract attention in any way possible. The ads therefore become annoying, and users generally try to avoid them. It is logical to conclude that if a user spends more time on site A than site B, then they will have trained their memory to remember the position of ads on site A and their eyes to quickly recognise any new ads appearing there.

How popular your site is is now measured by how long people spend on it? Isn't that kind of like rating auto manufacturers based on how many cumulative gallons of gasoline their cars burn, rewarding inefficiency?

Adobe must be in heaven, planning all of the extra sales of Flash...

Please wait for the rest of this response:Loading [--3%----------------]

I don't use Google because it has good Nielsen 'ratings'. In fact, I wasn't even aware there was such a thing. I use Google because it works, it's simple, and it doesn't shove animated advertisements down my throat.

I'd be more interested in what Google's 'rating' of this Nielsen site was than the other way around, if I care about 'ratings' of sites at all.

I wouldn't want to be the client whose agency uses Nielsen ratings as a pricing guide for my electronic media (they're utterly behind the times, and they set the standard rates for "worth" of broadcast placements). Hell, Nielsen ratings are just about obsolete for broadcast TV as well, seeing as how time-shifted viewers don't count [wikipedia.org]. I might know, what, 3 people (?) who don't have some sort of DVR and time-shift absolutely everything they watch.