Multiplicity: Succeed Awesomely At Web Analytics 2.0!

Someone highly salaried: "Thanks for agreeing with me, now please help us fix it!"

Me: "No I meant, oh hmmmmmmm."

Someone highly salaried: "What do you mean? You are not going to solve my problem of getting a single source of truth for all my web data?"

Me: "Multiplicity. That's the solution. Not 'singlecity', not on the web, not in Web Analytics 2.0."

Someone highly salaried (and by now you): "You are not making sense."

Ok let me explain. I am sure you are also confronted with the classic quest for the "single source of truth".

But before that let me back up.

My first eMetrics Summit was in June 2003, and as a young, inexperienced person new to the field it was a great learning experience (the eMetrics summits in Santa Barbara were the best!). But there were two concepts, one big and one small, that were key in developing my own thinking about web analytics.

The big concept was Multiplicity, and it was presented by Guy Creese (then Research Director for Aberdeen Group). The concept was brutal in its simplicity: the web is different because of the existence of multiple constituencies, tools, and types of data sources.

Here is the slide from June 2003…

The first point is perhaps slightly obvious, the second less so, but it told me about the challenge ahead as I thought about my own web analytics journey. The last one was the most interesting and delightful.

To make optimal decisions on the web, I was going to have to be comfortable with multiple sources of data, all valuable and all necessary to win.

The strategy, for me, was twofold: go figure out what sources of data, web and non-web, were needed to make decisions; then go identify the best tools to collect and analyze those data sources.

It took the above slide to crush my own dream of having a single source of truth that was God's answer to everything, because on the web there is no such thing. For a person such as myself, who came from the traditional Data Warehouse and Business Intelligence worlds, that was a non-trivial mental model transformation.

It was totally worth it.

Let's fast forward to the present (an amazing four years later!).

I have already presented my mental model for Web Analytics 2.0. It requires that you think radically differently about your web analytics approach.

It also calls for you not to be distracted by, well, distractions about how your online presence is accessed (access matters, but it should not distract from the mental model you need for understanding the performance of your web presence).

It also calls for you not to be distracted by the type of experience your online presence is: static or Flash or video or social or whatever.

Both of those are distractions that will impede your progress towards nirvana.

Regardless of how your site is accessed (mobile or via the ENIAC) or what the experience consists of (static or dynamic or targeted), you need to solve for Web Analytics 2.0:

Five important sources of data that give you a holistic picture about your website performance and empower you to make decisions that help your websites solve for you and your customers.

Now let me reel it back to the present, and to this post.

Notice that the slide above is Multiplicity personified. I don't know if this is how Guy intended it to be, but it is certainly how my vision has evolved from Guy's presentation at the wonderful Biltmore hotel in Santa Barbara.

Five different sources of data that require you to have multiple tools to measure success.

Now do you see why a single source of truth is not possible?

Too many pieces of data needed from too many different tools, for a medium that is evolving and dynamic and amazing (the Web).

Sure, I'll happily give you the problem, but I also have a solution.

Solving the Web Analytics Multiplicity Problem:

(Click on the image below to see a higher resolution version, it is easier on the eyes.)

Every solid web decision making program (call it Web Analytics or Web Metrics or Web Insights or Customer Intelligence or whatever) in a company will need to solve for the Five Pillars: ClickStream, Multiple Outcomes, Experimentation & Testing, Voice of Customer and Competitive Intelligence.

For optimal success you'll need only one tool from each row above to cover each of the five pillars. Multiplicity.

Data from each tool is not meant to be duplicative of the other areas. It is not meant to tie to the other areas. Each individually provides insights that taken together give you the data you need to succeed.

And don't feel overwhelmed:

1) Notice that in each row you have the option to have a free tool.

2) You can execute based on a priority (here is mine: Voice of Customer, Outcomes, ClickStream, Testing, Competitive Intelligence).

Two New, Interesting & Non-Obvious Tools

If you looked carefully you might have noticed two new tools at the bottom of the image above describing Multiplicity; here they are again…

When web analytics is mentioned, Maxamine and Coradiant are not normally first in mind. It is my belief that for large companies, the Fortune 1000 especially, both these tools are almost mandatory. Neither measures what a traditional web analytics tool does, so there is no overlap, but each brings its unique strengths to the business of web data.

Essentially everything you need to know, measure and report around the existence of your website itself. See how it is web data and "web analytics"?

There is one thing about Maxamine reports that is quite delightful:

Not only do you get your scores, 21 and 84 above, for each metric reported (Page Weight and Page Proximity), but you also get a comparative score for the Fortune 100. This is delightful because you now can get your execs to take action. No one wants to look bad by comparison! :)

And I am just scratching the surface with what Maxamine can do.

Coradiant:

You should use Coradiant because it gives you critical data, down to an individual user level, about the "matrix" that powers your website: the bits and bytes and pages and packets and more. Every single thing you can imagine going out from your web servers (anywhere) to your customers. You can find problems on your website quickly and hold yourself and your IT teams to account.

More than that you can understand why, for example, your conversion rates are down. Is it because suddenly your cart and checkout pages were slow and not making it to your customers? Or because of 404 errors on your important pages?

You can not only do all this but you can also, literally, create a funnel of your important site experience and see if the "leakages" in that funnel are related to your website "suckiness". Imagine that funnel next to your web analytics funnel!

There are two things about Coradiant that are quite delightful:

1) You get a box, you plug in power and a network cable, you are in business. Idiot proof fast installation.

2) I am a fan of the "AIM" reporting application (and indeed the mindset that it works on) from Coradiant, because it means I have to think less: it tells me where to look (and, for the important set of things I care about, it tells me "what changed before and during the issue/problem"!!). Now here is the important part: it does that without me telling it.

That last part is critical. Most web data applications that sit in that space tend to give you 100% unstructured data, and all of the data. You, tiny ant, are then immersed in a massive amount of data and asked to find your way around. Most of the time you don't even know what you are looking for, even if the answers to every question are right there. That is what makes Coradiant valuable: it tells you where to start looking, so you'll find your unique problems and solutions.

Coradiant often gets compared to Tealeaf, but I believe that is an imprecise comparison for a number of reasons. Tealeaf solves different problems and is in a space by itself from a solution, applicability, and benefit perspective.

[Taking a deep breath now!]

There you go. Multiplicity. What is it? Why is it important? How can you benefit from it? And as a special bonus, two web data sources you might not have thought about, but sources every mature website probably desperately needs.

Comments

As someone new to large scale analytics, there's lots of great information in your post. Reading through leaves me with two thoughts: 1. adding further layers to the analytics stack is likely to cause ever more information overload (for me anyway), and 2. this needs a really tight framework to manage, relate, and integrate the data from those five sources.

Now if you could offer some insight into number 2 there, this could be a really perfect post.

I love that this is where the discussion is going. For some of us, I'm sure, having to defend our 'multiplicity' is getting tiresome. The success behind changes made on elaborate, multifaceted data is evident, and it's simply no longer an argument. We've taken the position at CableOrganizer.com that analytics MUST be part of a larger data ecosystem and not, in itself, a single packaged set. Certainly there is some overlapping, but even that has value for migration and other aspects of continuity. It's true that singular sources of data are useful. But it is my belief that this is an incomplete picture.

Again, well done in bringing authoritative perspective to this growing chatter.

Excellent summary on a practical approach to web analytics 2.0 challenges. It's going to be interesting to see what industry consolidation does to an integration of these 5+ pillars. (Never mind the integration of two or more different clickstream tool results on the same site lol).

Also, while most of our clients are not in the size category where Coradiant and Maxamine are 'must haves', the insight of these tools would be of benefit to many other organizations beyond Fortune 1000.

A super-informative post — especially the tools as they relate to the various pieces.

On the multiplicity vs. 'singlecity' front, you bet there are lots of 'Someone Highly Paid' folks out there with the dream of a unified source/tool, exerting these [wrong] notions on their subordinates through their HiPPO edicts. I think there are many who are missing the value-add web analytics and online measurement boat because their boss first wants to have a single source of data for all non-web channels, and then one source for the non-web and web channels combined.

On the tools, I really like the way you have laid it out. Additionally, I think it would also benefit many to show how the same tool can overlap both Marketing and IT needs on certain fronts and where the tools are pure Marketing and/or IT.

And yes, I agree that the first step is to have all these sources in place… but you will be stranded at the 'airport' if you don't figure out how best to integrate, synthesize, and leverage the collective information to provide a superior customer experience.

Thanks for highlighting some of these tools. While we are using many of these already, I was wondering if you know of a dashboard software that can work on importing the data into a single interface for those high-salaried people.

Richard : The goal is not to have lots of data and report it all. The goal should be to have access to the right sources of data and then to measure your critical few success metrics from each.

So don't go from twenty bazillion web analytics metrics to sixty bazillion. Rather, go from three "critical few" metrics from web analytics tools to five "critical few" metrics across the Multiplicity ecosystem.

Just get smarter first and please don't try to reconcile Compete with Omniture / Google Analytics.

When you are smarter (you'll know when it is time :)) think of Strategic Integration. Here is an example:

Pass the anonymous persistent web analytics cookie id to your survey tool. This will allow you to look at referring URLs (or content consumed) for people who have rated you as -9 in customer satisfaction.
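A minimal sketch of that integration, assuming both tools can export flat records keyed on the cookie id (the field names `cookie_id`, `referring_url`, and `satisfaction` are my hypothetical placeholders, not any vendor's actual schema):

```python
def low_satisfaction_referrers(clickstream_rows, survey_rows, threshold=-9):
    """Join survey responses to clickstream rows on the shared anonymous
    cookie id, then return referring URLs for unhappy respondents."""
    # Map each anonymous cookie id to the referring URL recorded for it.
    referrer_by_cookie = {row["cookie_id"]: row["referring_url"]
                          for row in clickstream_rows}
    return [referrer_by_cookie[row["cookie_id"]]
            for row in survey_rows
            if int(row["satisfaction"]) <= threshold
            and row["cookie_id"] in referrer_by_cookie]
```

The point is that the join key (the cookie id) is the only integration work required; everything else is ordinary filtering of two ordinary exports.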

It is not important that you integrate on day one. That is a classic way of selling you more expensive software and even more expensive consulting services. Get smart and then do strategic integration, which can be done cheaply.

You are right, there is a post here. :) But I hope this helps a little bit.

Jessica: Here is the best "software" for presenting single-interface dashboards: an intern with Excel.

You can buy Flash-based software, you can buy other pieces of software, but at the end of the day none of these sources will cough up APIs that allow you to "extract and display" automatically. Yes, they all promise, but I assure you it is a non-trivial challenge.

So get an inexpensive intern. Teach her/him how to download data into CSV and upload it into a spreadsheet that powers a Single Page Dashboard for the executives.
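If the intern gets comfortable with a little scripting, the CSV-to-dashboard step can even be semi-automated. Here is a rough sketch; the export layout (a `metric,value` header in each file) is an assumption of mine rather than any tool's real format:

```python
import csv

def build_dashboard(exports, out_path):
    """Fold one 'critical few' metrics CSV per tool into a single
    dashboard CSV with one row per (source, metric, value)."""
    with open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["source", "metric", "value"])
        for source, path in exports.items():
            with open(path, newline="") as f:
                for row in csv.DictReader(f):
                    writer.writerow([source, row["metric"], row["value"]])
```

The output file can then be the data tab behind the one-page Excel dashboard, refreshed by re-running the script on each week's exports.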

It certainly sounds ideal – of course we have to get ourselves into a position where we have access to those key indicators, and indeed choose what those indicators should be. The trials and tribulations of large scale analytics ecosystems, I suppose…

Pardon a digression, but this really taps a similar frustration I've been having with folks in the usability community (of which I'm one). One camp thinks you have to do eye-tracking studies, one camp says it's only usability if you do one-on-one testing, etc. Meanwhile, in their quest to make their own tool of choice the best, each group fights to prove why every other approach is worthless.

I thought of this because I've seen two articles recently where usability people dismissed web analytics as useless, throwing out the baby with the bathwater. Sure, some tools are more useful than others, and metrics are easy to abuse, but I don't understand people who would willfully throw out an entire realm of data just because it's inconvenient for them or they don't fully understand it.

It is an article by Alison Head, who is a Principal at a research firm, and I contributed to it. It was a fun collaboration between the Qualitative world, which Alison is a part of, and the Quantitative world, which I am a part of.

In the end I think you'll agree that the article is richer and better for having both perspectives, and in the end Practitioners are the real winners.

Avinash, I'm really curious about your comment on the comparison between Coradiant vs. Tealeaf being apples and oranges. Could you possibly elaborate a little more on that? I've seen a demo of TeaLeaf and I've just been to Coradiant's site for an overview. They seem to be quite similar – especially in the define-and-track "events" department. What do you see as the key difference?

Avinash, I think you have just described the roadmap that companies can use, one that will enable them to think strategically and make good, sound decisions about their web properties.

We are seeing this now in our client base – people are thinking broader and selecting solutions that are complementary, as shown in the Web Analytics 2.0 diagram.

I think if people adopt your roadmap (and believe that over time they will), the next logical step is a discipline-focused organization that brings together people that know how to use these solutions – individuals, each with different expertise, all coming together as a team. In this post, you have provided a framework for such a team. Regardless of what it ends up being called, this new Web Analytics 2.0 organization will have a much more important seat at the table.

Great post Avinash. I would like to make one organizational comment: the 'owners' of these analytics tools must either fall under the same department or must have a common head (VP). I have been in situations where different departments have owned different parts of the customer experience/analysis and getting them to contribute to a common picture becomes a territorial thing.

I enjoyed your post and thought it was an interesting take on using different sources of information to make valuable decisions. However (small minded I guess I am), is it really wrong to try and capture all of your vendors'/tools' APIs and create one mega tool that gives you a 360 view and integrates, for example, your top ten page ranks along with how many visitors came from Google while testing page C? I still consider myself young in the field, and maybe this want of integration (though I'm not highly salaried :)) stems from teams/departments not working/communicating together or people not using the right KPIs for their departments. I think the concept of Multiplicity works if A) I am the driver of every tool (though I wouldn't want to be) or B) Multiplicity is also an organizational/people structure.

It is not wrong to try and capture all the vendors'/tools' data in one "mega" :) tool. It is a natural human desire.

Unfortunately it is too hard and too expensive and takes too long. In the end the rewards can't justify the effort. Not even remotely.

I have seen so many massive projects kicked off to build God's gift to Web Integration and nothing comes of it for a couple years and then the world has changed too much.

Hence my recommendation.

The web changes every 30 mins. Ensure that the pace at which you are able to get data and make decisions reflects that pace. Even if that means collecting fragments of relevant data and moving on with your life.

I am sure things will settle down in a couple years (more standards, cleaner data, smarter users, off the shelf tools). Then this will be a more productive exercise.

Until then, see my reply to Jessica, above, for what I think is the most optimal integration option! :)

-Avinash.

PS: Diana I promise to reply to your question. It is a longer reply, I might do a post rather than a comment.

I can't help but think that there's still something missing in this group: so little analytics is being done on the marketing of web sites! Any suggestions on tools that offer marketing analytics? So much money is spent on search, email, and display advertising, and it seems like the only thing out there is reporting on spend/return (and that's not really great).

Is anyone currently using Maxamine? I think some of their features are very useful, but are they worth what is a fairly high price?

A lot of what the tool does can be done through a little hard work. There are open-source link checkers. You can easily keep on top of TRUSTe compliance (although I don't know how much that is worth). You can check the accessibility of your site by viewing it through Lynx.

The two features that stand out are the ad tag and analytics tag checker. Using these could have a significant impact on your website's performance.

Still, the software seems expensive. For those who use it, is it worth the money?

@Evan: There are tons of things you can do, once one critical step is done: connecting the media data from banners to the user activity data on your landing page / mini site.
Omniture uses a unique id for each ad position (SAINT code) and creates a session cookie once the user arrives from the banner with such a code. Then all the user activity is tracked back to the actual banner.
With this data you can go and segment the hell out of it. Which channel provides the most depth of visit? Which media brings in the most engaged (however you define that) users? Etc.
Hope that helps
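That sort of channel segmentation is straightforward once every session carries its campaign/channel id. A toy sketch (the session records below are hypothetical, not Omniture's actual data layout):

```python
from collections import defaultdict

def avg_depth_by_channel(sessions):
    """Average pages per visit, grouped by the campaign/channel id
    that the entry banner stamped on the session."""
    totals = defaultdict(lambda: [0, 0])  # channel -> [page_views, visits]
    for s in sessions:
        totals[s["channel"]][0] += s["page_views"]
        totals[s["channel"]][1] += 1
    return {ch: pv / n for ch, (pv, n) in totals.items()}
```

Swap page views for time on site, conversions, or any engagement definition you prefer; the grouping logic stays the same.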

Avinash, I would like to thank you for all your posts about Web Analytics. I think we have so much to do here but, as a really big company where (sadly and prehistorically) no one believes in web commerce, we have one big constraint: budget. All we have to work with is Omniture (which is quite good and still has a lot to "say"). But reaching the other tools is still impossible (no $ available T_T). What can we do? Will we be consumed by Web Analytics 2.0 or do we have a chance to live happily together?

It's kind of frustrating being here (in Chile) where everything related to "the web" is 5+ years delayed.

Now that Google Analytics has hundreds of thousands of accounts and all other client-side applications have maybe 2-3K accounts (when I look at Q2 reports), Maxamine seems like an expensive choice for most sites. After all, everyone can go over to my competitor, Epikone, and use their SiteScan tool for free to scan for missing tags, if you are running GA.

It's true that SiteScan doesn't work behind firewalls, and sometimes I need it desperately for the member part of the site. And there is probably a lot of other fancy stuff that Maxamine does, and I may even use it on a member site. But it is a little like comparing Omniture to Google Analytics – sure, you get fancier stuff with Omniture, but wow, you really need to show your boss a benefits-to-price ratio, usually known as VALUE. (OK, you can convince me that I am wrong. You don't have to be gentle.)

Very true, but what you don't know is whether the metrics you're tying to a banner, search, email, affiliate, etc. are truly related to that tracking code! Omniture has four attribution techniques: first, last, equally, evenly. In each of these scenarios, for users who enter your site through multiple marketing "touches" (e.g. multiple searches, or search plus email), all credit for on-site activity is attributed according to the rule of your prior choosing. But you never know that multi-touches occurred. Nowhere can that be seen (other than in a data warehouse pull, which few people know how to execute well).

The truth is that there is a huge loss in what we don't understand about the organism that is our site marketing. These attribution rules may have worked when search, email, and display were less prominent and there was less inter-activity between them, but if you're anything like me, you see and click on banners, emails, search, Flash games, Facebook links, MySpace pages, iframe'd content, partner channel sales, etc., all for the same brand. That's the world we live in now.

One litmus test for how "off" your campaign tracking is would be to compare a pie chart of referral "instances" (in Omniture) with a pie chart of your campaign-driven visits/click-throughs. Although there are a billion flaws with this test, you'll see that some of your marketing (probably search) is stealing visit/visitor/pv/transaction credit from your other media, if search is closer to the sale for your brand, based on your attribution rules.

Thanks for clarifying. Attribution is certainly a key challenge with this methodology (as with any other method I am aware of).

That being said, I agree with Avinash's point in the article. When analyzing a fast-moving ecosystem like the internet, standard software releases will never be able to solve our problems "out of the box". It takes smart analysts to dig deeper into the data and do the Data Warehouse download.

The key challenge, I guess, is that there is no "right" attribution. Multiple touch points may have contributed to the decision.

It would be interesting to do an analysis of the correlation between the number and quality of touch points and a decision to interact with your landing page or transact business with you. But then again there are privacy issues involved. I am sure you have spent much more time on this than me; I would be happy to learn more about your thinking.
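To make the stakes of the rule choice concrete, here is a simplified sketch of first-touch, last-touch, and even attribution over one multi-touch path. It illustrates the general idea only; it is not how Omniture implements its techniques:

```python
def attribute(touches, revenue, model="last"):
    """Split conversion revenue across the ordered marketing touches
    in a visitor's path, per the chosen attribution rule."""
    if not touches:
        raise ValueError("empty touch path")
    if model == "first":
        return {touches[0]: revenue}
    if model == "last":
        return {touches[-1]: revenue}
    if model == "even":
        share = revenue / len(touches)
        credit = {}
        for t in touches:
            credit[t] = credit.get(t, 0) + share
        return credit
    raise ValueError("unknown model: " + model)
```

For a path like search → email → search, first-touch and last-touch both hand all the credit to search, while even attribution finally gives email visible credit. That swing is exactly why the chosen rule shapes the story your reports tell.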

Robbin: You are wrong. There, I was not gentle, per your instruction. :)

Using Maxamine only to do scans to check for missing tags is like using a Jumbo Jet to drive around downtown San Francisco (or Pittsburgh, in your case). It can do that, but it is built to do so much more.

Please note the features in my post that I value a lot more from Maxamine (compliance, security, SEO, privacy, internal search blind spots, competitive reporting, and on and on). I encourage you, as well as other readers, to call Maxamine and get them to do a free analysis for you.

Tim: The above comment should help you. Maxamine is not for bloggers or small business websites. It is a robust tool for enterprises, well worth its cost.

IMPORTANT: I am not affiliated with Maxamine in any way and do not have any financial relationship with them. Though three years ago at an eMetrics summit they did give me a chocolate orange; it was yummy.

Great post Avinash! Isn't it kind of funny that we are all on almost the same page but "the market" is still not showing signs of data integration? Isn't it ridiculous that, for instance, Google Analytics is not providing an API, or cannot process third-party cookies?

I agree with your comment about Maxamine. Finding broken links is just scratching the surface.

One thing to add: organizations do implement multiple tools for the same purpose across various segments (say, web analytics tools). The challenge is having a consolidated picture and keeping up with multiple technologies. Of course, data warehouse solutions help in that case, but for multi-business-segment and global companies, core integration of data is key. Not to mention the mess with third-party cookies and the overstating of visitor counts.

Thanks again for the wonderful post, and I look forward to the web monitoring tool post :–)

Found you through the graphic posted on Connie's blog. I think it is brilliant how it breaks down the rings of analysis required to get to the true gold: consumer and competitive insight and understanding. I wonder if I might include your graphic in my own blog post. (Our company, MotiveQuest, specializes in the two inner circles of social media analysis – but sadly not for free!)

Sorry for being off track, but will you please explain, or provide a good link on, configuring Google Analytics for various events and goals? Can we also set Goals based on different keywords (maybe for 500 keywords or more) and find better results?

Great post. I am starting to think about how web analysts like us can extract the data that Maxamine does using regular analytics solutions, without investing in the Maxamine solution. I am sure the process is tedious, but is there a reference guide?

Anyway, I have a question about this "Multiplicity" strategy: are there any dependencies between the "Trinity" strategy from your previous book and this "Multiplicity"? Or is the Trinity strategy just a part of the current Multiplicity strategy?

Anthony: The Web Analytics 2.0 framework is an evolution of the Trinity concept. There was one small piece and one very big one that became more important and viable after I wrote Web Analytics: An Hour A Day. That's covered in WA 2.0.

Multiplicity (regardless of whether you use Trinity or WA 2.0 as your framework to think with) is a "how". It (this post :) helps you identify the tools that are available and tells you why you need more than one.

In Web Analytics 2.0 there is also a table that tells you which tools from Multiplicity are a minimum requirement based on the type of your company: small business, medium, or huge.

Franz: I would adapt your analogy a little bit. It is not like going to different general practitioner doctors to get advice. That is suboptimal (like using GA and Omniture and WebTrends on the same site; what's the point?). It is more like going to a heart specialist when you need an analysis of your heart pain, choosing an orthopedist when you need a broken bone reset, and leveraging a dentist when your tooth cracks, etc.

Finding the right place to get the answer to each unique question.

That does not eliminate the need to have a general practitioner, but it does mean your GP can't reset your bone.

The Adobe Analysis Suite is wonderful. But if you really think about it, all it does is give you a supercharged GP, not a heart and bone and teeth specialist. Most components of the Adobe Analysis Suite are simply deeper clickstream analysis options (Discover2, Warehouse, OnPremise, etc.). There are a couple of Multiplicity components in there; Surveys and Test & Target come to mind. But data from the latter two are superficially integrated with the clickstream, still requiring you to do a lot of heavy lifting.

And of course the Adobe Analysis Suite (like all other analysis suites) does not include competitive intelligence or reduce the need to leverage UCD methodologies to understand more of the Why.

If you can afford it, the Adobe Analysis Suite (or competitors from CoreMetrics and WebTrends) will help you with better clickstream analysis (and a little, or a lot, of integration with other company data sources). But it, for now, won't be the be-all and end-all. You'll still need other tools and you'll still need a Multiplicity strategy.

That is the tough part of Web Analytics 2.0, and that is also the fun part of Web Analytics 2.0!

It's always a pleasure reading your blog and interviews. I had a small suggestion for your blog: if I click certain images on your blog they open in the same tab, making me use the back button to continue reading. I think if you code the back end to open them in a new tab it will give a better user experience and increase the stickiness of the article :)

I understand the desire for multiple analysis tools. But as a front-end web developer, and someone who's accountable when the sites I work on are running slowly, there's definitely a big downside to piling on too many 3rd party javascript libraries into the page load, particularly for users with slower connections and older browsers.

While you may be correct in regard to traffic analysis, your recommendations above suggesting more tools can have a significant impact on site performance, which can ultimately mean more frustrated users abandoning a visit sooner than they otherwise would. Those two concerns need to be weighed against each other.

Tony: I think there was a small misunderstanding here. I am not recommending you use multiple JavaScript-driven web analytics tools. As a matter of fact, I've consistently recommended that we all practice monogamy in that area rather than bigamy.

What I am recommending is that if you want to be successful on the web you are going to have to use a web analytics tool, a survey tool, a competitive intelligence tool, usability studies, experimentation platforms, etc. Only some of these will add JavaScript tags.

That said there are two more important things to consider:

1. Tools like Google Analytics now use asynchronous JavaScript code. This means the code adds "no weight" to the page, as it works asynchronously from the page load and hence has no impact on the user experience.

2. In my analytics JavaScript implementation best practices guide, I've recommended that any non-async JavaScript tracking code be placed very close to the </body> tag. That ensures that the customer experience is not interrupted and data is still collected. If the customer clicks too fast and leaves, no biggie. Most of the time we'll still collect data, and 100% of the time we won't interrupt the customer experience.

For these reasons I personally don't buy "downside to piling on too many 3rd party javascript libraries into the page load." Even for slower connections and old browsers. :)

Well, that's good to hear Avinash. From my experience, when running a bunch of tools like Crazy Egg, ForeSee, Omniture, Test & Target, and a bunch of other retargeted marketing and conversion tracking scripts, some end users really do start to feel the load. And while asynchronous requests fired at the end of the page load do help (two things we're doing at my current position), those scripts still need to be loaded and executed at some point.

For users with broadband connections and modern browsers (like those being used in most tech offices) these may be negligible, but for a lot of people with crappy systems the incremental effect of all these third-party scripts may add up to slower loading images, choppy transitions and animation, temporarily frozen screens, etc. Ultimately that can mean lower conversion rates as users move over to smoother-running services.

I guess all I'm saying is that client side performance considerations need to be weighed too.

I like the term "Multiplicity." I have used the phrase, "The American Idol approach to online marketing" to mean something similar — using data from multiple "judges" to zero in on value. I'm going to start using "multiplicity" and quoting Avinash in meetings!

It's interesting how often different metrics, from different sources, point to the same winners and losers – just like judges (on American Idol?) often are in agreement about a performance.

I am particularly interested in phone calls from online advertising – I know you've written about that too!

Trackbacks

[…] You might want to read Multiplicity: Succeed Awesomely At Web Analytics 2.0! over at Occam's Razor. It seems the problem isn't that we've got too many tools… but that most of us don't actually have enough. […]

[…] Posted by dpascoe on November 8th, 2007 Avinash Kaushik has just posted a new article titled Multiplicity: Succeed Awesomely At Web Analytics 2.0! that is a must read by anybody who is involved with web analytics, driving web strategies and participating in web initiatives. […]

The good news is that this is a manageable environment. Data quality, site quality and compliance are three pieces of the same puzzle, and must be treated that way, with an automated solution that frees the web analysts to have confidence in the data and the assumptions, conclusions and recommendations they make as a result.

If you haven't seen it, take the time to read Avinash Kaushik's blog post titled "Multiplicity – Succeed Awesomely at Web Analytics 2.0".
[…]

[…]
Avinash convinced the audience that you can already analyse a lot and optimize your marketing efforts in an efficient way by using FREE solutions!
In his presentation ‘Multiplicity: Be Massively Successful at Web Analytics 2.0′ he showed many very useful tips & tricks to perform the optimization, but the hard part is to execute a Multiplicity strategy, which requires being smart about what tools you need for what job, and then being flexible.
I advise you all to go to Avinash's blog Occam's Razor and read the related post on this topic (which Avinash already started in November 2007).
Or you can also take a look at the video of Avinash’s presentation at the congress made by webanalisten.nl.
[…]

[…]
The web is quite complex, you are going to access multiple sources of data, you are going to have to do a lot of legwork. Blood, sweat and tears. You don't just need tools for that (remember, 85% of the data you get from any tool, free or paid, is essentially the same). You need people!

Hire the best people you can find, tools will never be a limitation for them.

[…]
In BlogAdda's interview with Avinash Kaushik, Google's Analytics evangelist, I had asked about the effect of the 'emotional responses' in social media on the field of analytics. As he explains, there cannot be a single tool that can capture all data, and those who monitor this, will have to get used to the idea of multiplicity. From just deciding where communication will be distributed (and to a certain extent, consumed) to having to track where conversations are happening in an 'everything reviewed' (Transparency, Trendwatching's September trend) world, and then deciding the what-why – that is quite a drastic change. These are obviously not mutually exclusive, but it still is a challenge.
[…]

[…]
A few years back, I remember reading a blog post by Avinash Kaushik on the subject of Multiplicity. In theory, I completely agreed with the idea of multiplicity however in practice, I found myself being a laser beam for all things Omniture. I was the perfect case study for the “single source of truth” model. I found myself sitting in strategy meetings where analytics was being discussed and I found myself sitting on the outside, only to be brought in if there was an “Omniture issue” or a question we needed “Omniture to answer”.
[…]

[…]
Some things are worth repeating. That's why I think it's worth revisiting this two-and-a-half-year-old blog post by Avinash Kaushik. He's a respected Web analytics guru and Analytics Evangelist for Google who has good ideas about his field to offer, even if you aren't a Web analytics professional yourself.

If your job involves sharing information over the Web, whether you’re in marketing, sales, PR or otherwise, you should know about Web Analytics. (It’s that arcane discipline that enables analysts to measure the success of, and establish ways to improve, Web sites.) First and foremost, everyone, including web analysts themselves, needs to accept that it’s a complex discipline.
[…]

[…]
As a starting point, I think a good authority on the topic is Avinash Kaushik: http://www.twitter.com/avinash This graphic from his website is a good way to start thinking about website analytics. Source: http://www.kaushik.net/avinash/2… Sites that fit into the bottom 'foundation' categories (although very expensive)
[…]

[…]
The concept of multiplicity is best explained by Avinash Kaushik, and at its core lie three tenets: multiple inputs of data (qualitative and quantitative), micro and macro analysis, and business outcome/goal orientation. From that mindset we can better understand how the model above works toward driving actionable insights and results:
[…]

[…]
The second problem, perhaps even harder, is how to deal with this multiplicity from a data analysis perspective. Much of this data is missing primary keys, it is often incomplete, and sometimes even incorrect. And it is rich with information we can turn into actionable insight. Yet from a human capital perspective we don't have enough people with the right skills.
[…]

[…]
Look outside the (analytics tool) box – That’s right, the role of analytics in the world of web 2.0 is not limited to clickstream analysis. There’s voice of customer analysis, social media analysis and a whole lot more that you should be looking at. Make sure you are including at least one of these this year! Have a great rest of 2012!
[…]

[…]
I was researching skills for web analytics, and found this website with an article on Multiplicity. When I was in school many moons ago, I had a physical chemistry course that had an associated lab. However, they were not synchronized in any way. So we would have a lab on Chemical Equilibrium and three weeks later would get a lecture on it (or a chapter reading) and go "Ohh! That's what we were doing that for!"
[…]

[…]
Web Analytics 2.0 is the framework established by the Analytics Guru and Google Evangelist Avinash Kaushik. Visit his blog and learn about Multiplicity. I also advise you to get his book Web Analytics 2.0 and make it your bible. Take your time with this great book and don't forget to apply, apply and then re-apply! Good luck and may the Analytics Force be with you!
[…]