It has become clear that we are all Google's Mr. Jones. The Google Mothership seems to be leaving, and we can choose to get on board for the data ride (Google Analytics would likely still be able to capture queries and the like) or live in a netherworld of insufficient data. With some sites getting 50% or more of their traffic from Google natural search, not using GA in this scenario relegates them to a sort of third world of data and renders their subscription-based analytics platforms limp.

There is no question in my mind that Google owns this referrer data, though I have heard it argued otherwise by the analytics vendors. The click action takes place in the Google domain, and though the link data is generated from publisher content, publishers are under no obligation to have their sites indexed. There is also no question as to its value: huge. It is worth mentioning that many of the URL shorteners now driving an ever-growing share of web traffic pass even less useful referrer parameters to the linked sites.
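To make concrete what is at stake, here is a minimal sketch of how a publisher-side tool extracts the search query from a classic Google referrer URL. The URL is illustrative, not captured traffic, and the `q` parameter name reflects Google's historical convention.

```python
# Sketch: pulling the search query out of a referrer URL.
# This is the referrer data the post is about; lose it and
# the publisher loses visibility into what visitors searched for.
from urllib.parse import urlparse, parse_qs

def query_from_referrer(referrer):
    """Return the 'q' search term carried in a referrer URL, or None."""
    params = parse_qs(urlparse(referrer).query)
    terms = params.get("q")
    return terms[0] if terms else None

# Classic search referrer: the query is plainly parseable.
print(query_from_referrer("http://www.google.com/search?q=web+analytics&hl=en"))

# A shortened link typically carries no useful parameters at all.
print(query_from_referrer("http://bit.ly/abc123"))
```

The first call yields the search term; the second yields nothing, which is exactly the "less useful referrer parameters" problem with shorteners.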

This is Google's nuclear option in the world of web data. The fallout will be an analytic winter for many. The face of analytics, SEO, online publishing, testing, targeting and even the public markets will change. Does the very fact that Google has so much data leverage mean they are likely to use it to their advantage at some point? Would it be so bad to live in a world of (free) GA? They have made great strides with segmentation and continue to add data visualization. Of course, there are plenty of reasons, ethical, historical and rational, why Google should leave its URLs passable and parseable. That now appears less likely than ever to happen.

I would have imagined you were above hearsay and conspiracy, Mr. Mendez!

Google, like Microsoft, Yahoo!, Ask and others, realizes that the best customer is a data-driven customer. It would be imprudent to do anything that would stand in the way.

I'll leave the ownership issues to pundits (and bloggers : )), but the Search team at Google announced the change well in advance to ensure that all vendors (Google Analytics included) have time to make relevant changes to their data processing if required.

WebTrends, Omniture and other vendors have posted on their blogs that their analytics solutions will be fine with this change.

In fact they have said that they are making minor changes to incorporate *additional* data / goodies that may or may not be in the new referrer string. : )

Derek- I think this amplifies the need for semantic and contextual rules on the publisher side. Lots of targeting can still be done, but it is up to pubs to figure out how. We've got a ways to go there, but the tools are getting better.

Avinash- Thanks for stopping by. Far from being hearsay and conspiracy, the AJAX SERP has been stripping URL parameters on and off for three months now for certain percentages of traffic. I've been following this for the past couple of months, and the latest round of this "testing" is being reported on Google's own support forum.

To date we've only heard Matt's PR-like statements and Brett's comment that he was "seeking clarification from the search team." So quite a bit of data has already been lost for people. If you know of anyone who has been able to capture parameter data stripped from the URLs due to AJAX, please let me know. I've been following this closely and have not heard of that.
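The mechanics of that data loss can be sketched briefly. With the AJAX SERP, the query moves from the query string into the URL fragment (the part after "#"), and browsers omit the fragment when sending the Referer header, so the parameter never reaches the publisher's server. Both URLs below are illustrative examples, not captured traffic.

```python
# Sketch: why a fragment-based query is invisible server-side.
# A browser's Referer header carries the URL *without* its fragment.
from urllib.parse import urlparse, parse_qs, urlunparse

def as_sent_in_referer(url):
    """Approximate what a browser puts in the Referer header:
    the URL with its fragment stripped."""
    return urlunparse(urlparse(url)._replace(fragment=""))

classic = "http://www.google.com/search?q=analytics+winter"  # query string
ajax = "http://www.google.com/#q=analytics+winter"           # fragment

for url in (classic, ajax):
    sent = as_sent_in_referer(url)
    q = parse_qs(urlparse(sent).query).get("q")
    print(sent, "->", q[0] if q else "query lost")
```

Running this shows the classic URL still yields the search term, while the fragment-based URL arrives as a bare `http://www.google.com/` with the query gone, which is the capture problem described above.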

I'm sure the vendors will make statements that things will be fine. I'm not as confident, but still hopeful.

This is really interesting. I've often wondered how people track changes in Google's "tracking system" - and that's when I realised: if you don't use Analytics and instead use a third-party stats package, it becomes much more transparent when changes occur...

As an afterthought, I'm now going to install two tracking packages on every website so I can compare data.