Curious web user trying to make the web better. Passionate about details while not losing the big picture. Analytical feedback-ista. Occasional pun-artist.

Blog Bio:

Michael manages organic and paid search and web/funnel analytics for ShoreTel's Cloud Division. When he's not trying to help introduce people to the wonders of cloud phone systems, you'll usually find him running after his two boys, playing a bit of sport, surfing or perhaps looking at how to write a better script to make the web better.

Ecommerce sites have it very easy when trying to tie sales information back to online marketing activities, whether it be organic, social, email, or paid campaigns. Since the purchase happens online, web analytics packages capture the transaction details and can generate detailed analyses on sales, visitor profitability, visits and days to transaction, etc.

I love TAGFEE. Thanks for sharing. It would be great if you could lay out a summary of your short-term ranking changes. For most sites, I've seen it normally take about 3 months to regain all/most of the rankings that you had prior to the domain change.

This is a problem with the social media sites and how they calculate social shares. They should honor 301 redirects as part of their social share counts. Right now they are literal and "dumb". This isn't limited to domain changes; it applies to any URL change.

I am a fan of using a Marketing Automation Platform for many reasons, but the main ones for me are:

Great list management

Solid emailing tools

Easy-to-build, integrated landing pages for one-off campaigns and A/B testing

Workflows/programs for lead nurturing and demand gen programs

There are other features as well, such as lead scoring, social media management, supplemental SEO tools, and blogging. Those are things I tend not to use the MAP for, but they are helpful for many other companies.

In many cases, standalone tools can be used to cover everything that a MAP does, depending on the CMS and CRM you have, as well as any other specialized tools such as MailChimp or Constant Contact for email list management and execution. However, it is very helpful to bring this all together into one platform such as HubSpot to simplify the management and make it easier for marketers (especially those without the time or expertise to do a lot of SQL and/or front-end dev work) to actually get their work done and focus on results.

For example, I find Salesforce to not be a very friendly platform for managing or sending emails to prospects. So, you could use a standalone tool like MailChimp that is integrated into Salesforce, or use a MAP. Same goes for building A/B tests on landing pages or using dynamic calls to action on lead-gen forms: I could use Salesforce.com forms and another 3rd-party tool to add the testing component, or just use a good MAP. Etc., etc. So in the end it just makes sense for us (and a lot of companies) to go the route of using a MAP.

Cheers,
MF

It will be interesting to see how the market evolves. Take a look at Clicky as well. It's not free, but it is very affordable.

In terms of the cookie issue, I'm lucky in not having to deal with the EU cookie law (which, by the way, is a mess IMHO because it's a great way to strangle innovation and is an overly broad "fix" for the problem they were trying to tackle).

Universal Analytics uses more than 1 cookie during the session, but only 1 cookie that persists. That persistent cookie has the client ID and nothing more. That's the primary key for them. However, during the session, the traditional _utmz, _utma etc... cookies are generated and accessible.
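To illustrate with a quick sketch of my own (not something from the GA docs): assuming the persistent cookie is named _ga and follows the usual GA1.2.XXXXXXXXXX.XXXXXXXXXX layout, you can read the client ID like this.

// Sketch: extract the Universal Analytics client ID from the persistent _ga cookie.
// Assumes the common "GA1.2.<random>.<timestamp>" layout; adjust if your cookie differs.
function getGaClientId() {
  var match = document.cookie.match(/(?:^|;\s*)_ga=([^;]+)/);
  if (!match) return null;
  var parts = match[1].split('.');
  // The client ID is the last two fields joined by a dot.
  return parts.slice(-2).join('.');
}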

This whole issue would be a lot easier to handle if the GA API team would open up an API to get the campaign information for the visit from the tracker. Then, no cookie scraping would be needed.

Chisty,

These are great questions, and I'm happy that I can be of some help.

- Google already launched a beta of Universal Analytics, which does not use the cookie system any more. In such a case, how do we track the values?

First, few sites have made the move to Universal Analytics, and it seems to me that it is still evolving. It will be very difficult for Google to force all users to abandon their historical data and force them to create a new profile for UA (today the data cannot be shared). With UA, a lot more of the visit/session campaign information is tracked and filtered on the server side.

That being said, you can still access the campaign info from UA using the session cookie that it creates temporarily. However, the campaign information will only be valid on the landing page, so you should go ahead and store that in a local cookie yourself (just as I store the landing page... see below).

For example, if you go to a page that uses UA, instead of creating the utm.gif call typical of GA, UA now makes a call to /collect? and includes most of the same information that you'd see with the utm.gif call. If you check document.cookie at that point, you'll see that _utmz is still there with the campaign information. The big surprise comes when you click to the next page, check the cookie again, and see that all of the referral data is listed as "referral" since you came from another page on the same site. They take care of mapping that whole session's campaign data to the original campaign information by looking at the CID, UID, and SID. OK, that is not an elegantly worded answer, but I hope it helps clarify your question.
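If you want to grab those values yourself, here is a rough sketch of my own (assuming the classic cookie name __utmz and its usual pipe-delimited utmcsr/utmcmd/utmccn fields):

// Sketch: pull the campaign values out of the classic GA campaign cookie (__utmz)
// while it is still present on the landing page.
// Field names follow the classic GA format: utmcsr = source, utmcmd = medium,
// utmccn = campaign, utmctr = term, utmcct = content.
function getUtmzCampaign() {
  var match = document.cookie.match(/(?:^|;\s*)__utmz=([^;]+)/);
  if (!match) return null;
  // Value looks like: 12345678.1377000000.1.1.utmcsr=google|utmccn=(organic)|utmcmd=organic
  var campaignPart = match[1].split('.').slice(4).join('.');
  var campaign = {};
  campaignPart.split('|').forEach(function (pair) {
    var kv = pair.split('=');
    campaign[kv[0]] = decodeURIComponent(kv[1] || '');
  });
  return campaign; // e.g. { utmcsr: "google", utmcmd: "organic", utmccn: "(organic)" }
}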

- The method mentioned above will only give the details when a visitor fills the form. How can I get the path of the visitor from the first landing page to the last conversion page?

I'd suggest that you store the landing page in a cookie. That is what I do, and when your visitor gets to a page with a form, your jQuery function checks if that landing page cookie is defined and, if so, populates the hidden field with the landing page value.
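As a rough sketch of that idea (my own code; the cookie name and hidden-field name "landing_page" are just illustrative, not anything GA-specific):

// Sketch: remember the landing page in a first-party cookie on every page view,
// then copy it into a hidden form field on pages that have a form.
(function () {
  if (!/(?:^|;\s*)landing_page=/.test(document.cookie)) {
    // Only set once, on the first page of the visit; keep it for 30 days.
    var expires = new Date(Date.now() + 30 * 24 * 60 * 60 * 1000).toUTCString();
    document.cookie = 'landing_page=' + encodeURIComponent(location.pathname) +
      '; path=/; expires=' + expires;
  }
})();

// On pages with a form (jQuery, as mentioned above):
$(function () {
  var match = document.cookie.match(/(?:^|;\s*)landing_page=([^;]+)/);
  if (match) {
    $('input[name="landing_page"]').val(decodeURIComponent(match[1]));
  }
});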

To capture the conversion page in the CRM, not only GA (that should already be defined in your Goal Funnel), just add a new form field to your forms and capture the current page's path when the visitor is completing the form.
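A minimal sketch of that piece (again, "conversion_page" is just an illustrative hidden-field name):

// Sketch: capture the path of the page where the form is actually completed.
$(function () {
  $('input[name="conversion_page"]').val(location.pathname);
});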

- In the case of an AdWords campaign without URL tagging, can we track the source & medium correctly?

No.

You need to have your URLs tagged even if you use AutoTagging, because Google obscures this information when AutoTagging is turned on by passing it on the backend to GA. So, to get this correctly, you'll need to manually tag your AdWords URLs with campaign information. It would be nice if Google would just pass this information on to the site like everyone else is made to do.
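For example (placeholder domain and campaign names, assuming the standard utm_* parameters), a manually tagged AdWords destination URL would look something like:

http://www.example.com/landing-page/?utm_source=google&utm_medium=cpc&utm_campaign=spring-promo&utm_term={keyword}

The {keyword} piece is an AdWords ValueTrack parameter that gets filled in at click time, so the _utmz cookie on the landing page ends up with the real source, medium, campaign, and term.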

Glad you liked it. Although this is such a common problem, I was really surprised to see that there is so little written about the topic. Cutroni has written good posts about it, and even about building GA data into data warehouses, but I hadn't seen much of an argument or explanation out there that would speak to non-web-analysts (although, in my opinion, every inbound marketer should be a web analyst).

What you suggest is not necessarily the case because of negative SEO. Competitors or anyone with a grudge against you could point spammy links to your site in an effort to bring you down via a Penguin-style attack.

Google knows this, and I am sure that they are not blindly assuming that if you have spammy links your site is questionable. That is too simplistic.

The fact that you use the disavow tool (especially after receiving a notice) indicates that you take this seriously and is a signal that you want to comply with Google guidelines. Therefore, I doubt that you carry some permanent "shadow" of suspicion due to a first-time offense.

RE: the Labs tool, I would appreciate your comments on the effect of hidden text and whether this was considered in the original LDA research. I have run a few tests using the LDA tool in SEOmoz Labs, and it looks like it pulls all of the text from a page regardless of its CSS treatment (for example, it is including the keyword-rich text from hidden divs).

Do you plan on controlling for this in future enhancements of the tool?

If Google is dampening the effect of hidden text, taking that into account would likely improve your correlations, no?

Thank you, Dr. Pete. After reading the various sets of posts and comments about LDA on SEOmoz, this is the first comment from an SEOmoz staffer/associate that I have come across in these posts that is objective and not defensive while putting it all in perspective.

I am quite surprised at all of the fire (both positive & negative) about this topic. Danny, in the original LDA post, made a great comment that most people seem to overlook (both in the blind focus on links and in the gee-whiz response to the LDA findings): "Yeah, it always comes back to content, doesn't it."

Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.

In terms of content strategy (this will probably bring a few thumbs down), their guidelines seem pretty straightforward and would naturally imply some sort of topic-modeling algorithm being used (whether it is LDA or not). That should not come as a surprise, but the reactions (both positive and negative) and the language used indicate otherwise.

FYI... your Delve video player is missing time markers. You should update your Delve player template to include time markers (at least video length & current position... bonus points for seek position).

I have found that YQL & XPath together have problems with many pages that are not well-formed (properly written RSS/HTML/XHTML/etc.), so this tool will likely report a decent number of links as false positives for "no-link". Be sure to double-check the pages before locking and loading on the link request.

I tend to write my apps in Java for AppEngine, and I like using a library called htmlcleaner (http://htmlcleaner.sourceforge.net/) to clean up the HTML before parsing it to extract the links or other content.

Rand,
First let me say that I am a big fan of the overall WB Friday series.

It seems like what you are proposing is akin to cloaking, because you are presenting users with a different view/experience of the site than what you are giving the spiders.

It seems that the shoe company wants to position itself with sliced-and-diced views for each possible keyword set.

For example, they may want to position for each of these phrases... all of which are at "category" levels and not product levels:
Leather Urban Boots (there are 10 of this type)
Leather Spike Heel Boots (there are 5 of this type, 3 of which are Urban and 2 are Country)
Red Leather Boots (they have 20 types of these)
where they have perhaps 10 varieties (3 with tall heels, 4 with stub heels, 3 with no heel).

From what you recommend, it appears that they will just have to choose the most popular category to try to position for, and then hope the rest positions at the product level.

Wouldn't a tag-based approach solve a lot of these problems?

Inherently, product-level pages have more specific keywords and tend to have less competition than category-level pages, so they naturally position themselves fairly well.

When people shop, they tend to do their initial searches using category-level keyword terms. They are not brand-specific and definitely not model-specific.

When starting to shop, they want a "slim digital camera", not a "Casio Exilim s270Z".

So if I position my Casio s270Z page for the keywords "slim digital camera", I am also taking away from my 12 other slim cameras. I will include a module for similar products, etc., but that may be too little, too late for the user who wants to see a category page.

It seems that the harder part is positioning the broad category pages well, and a great link-building strategy is to get your category pages as much link juice as possible so it can trickle down to your products equally.

Many sites do have 80/20 rules working, while many others do not. However, I would argue that in a lot of sectors, the 80/20 split is often at a category level, not a product level.

For example, wine.com probably saw a big shift in category volume after the release of the movie Sideways (Pinot Noir way up and Merlot way down).