On the Web

Profile Information

Going back to January of 1995, Alan Bleiweiss was a pioneer in the Internet as a marketing medium. Some of his most notable early clients include Publishers Clearing House, Weight Watchers International, Starkist Tuna, Writers Guild of America, Hill and Knowlton, Porter Novelli, Mechanic's Bank, and Princess Cruises. Alan pioneered online display advertising with the nation's first Business To Business Yellowbook site, where he was the architect of that site's content management system and display ad business model.

The economy is in the tank. Hundreds of thousands of people are losing jobs every month. Unprecedented numbers of people are losing their homes. Yet we, as Internet Marketing Professionals, are blessed to be in the right market at the right time.

Thanks for putting this one together. I think you've done an excellent job communicating how we, as an industry, would best be served in processing what comes from Google reps. It's a keeper for me to send clients to when they get lost in the "Google says" vortex. Greatly appreciate it!

Yeah this is a big concern for me. Even though I have not had the energy or capacity to participate in the community in a while, it's such an integral part of the Moz ecosystem that in many ways, the community IS the heartbeat that drives everything else for many people. :-(

Moz is such an integral part of our industry, it's even painful for us, on the outside of the organization, to know that there's been this level of struggle you've all gone through, and our collective hearts do go out to everyone at the company, especially those who need to transition out. The greater industry is amazing at picking up talent at such times though, so here's hoping it's as smooth as possible for every one of them!

On a final note, while this is a big contraction process, you've got such a strong core group of minds that I'm confident Moz is going to come through this and be stronger than ever.

Yeah I have a call with a major client tomorrow where I will be training people on their team how to do keyword research, and I'm going to advocate they sign up for this tool. Having the grouping is going to take the sting out of it because they have hundreds of phrase sub-groups to evaluate within about twenty primary groups. So this is really appreciated.

More better! I already loved the tool before - now I love it more! Can that even be possible? Greatly appreciate the grouping feature - it was one thing I'd thought about just this morning when I was looking over an export I did last week.

Yet again, I've got a prospective client I connected with just yesterday where they have a 34 million page site, went responsive several months ago, and my very first speed test showed a complete fail on speed. While they have many other problems, the cumulative impact is that they're now down 20% year over year in organic traffic.

People need to wake up to how serious the need is for quality signals, and speed is one of those signals. Now, with "basic" and AMP being driven so intently by Google, it's no longer just the few of us in the industry pushing these topics - the house is now on fire.

There is no requirement for a site to pass with 100% validation regarding CSS or HTML. However, the more errors you have in either of those on any given page / across the entire site, the more likely search engines are to become confused in their understanding of the site's presentation, message and topical focus. This can harm user experience, and search engine quality and trust signals.
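For anyone who wants to quantify that during an audit, here's a minimal sketch that tallies a page's validation messages using the W3C Nu HTML Checker's JSON output. The endpoint and parameters are assumptions based on its public interface, so verify before relying on it:

```python
# A minimal sketch: tally validation messages for one URL via the
# W3C Nu HTML Checker's JSON output (endpoint/params assumed from its
# public interface - verify before relying on this in an audit).
import requests

def count_validation_messages(page_url: str) -> dict:
    """Return a tally of validator message types for a page."""
    resp = requests.get(
        "https://validator.w3.org/nu/",
        params={"doc": page_url, "out": "json"},
        headers={"User-Agent": "audit-sketch/0.1"},
        timeout=30,
    )
    resp.raise_for_status()
    tally = {}
    for message in resp.json().get("messages", []):
        kind = message.get("type", "unknown")  # e.g. "error", "info"
        tally[kind] = tally.get(kind, 0) + 1
    return tally

if __name__ == "__main__":
    print(count_validation_messages("https://example.com/"))
```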

Interesting concept. Can a site see a boost if it gets "Panda" signals right?

First we need to understand that Panda is one of many related scoring sets in the greater Google system. If that scoring set determines a site is very weak on quality and trust signals, that site will likely, at some point, see loss of rankings. That, then, would be considered a Panda "penalty".

Conversely, that same site, if cleaned up properly, should, at some point, see an increase in rankings the next time those Panda specific algorithm factors are evaluated.

So what about a site that hadn't been hit by Panda? Wouldn't it then make sense that such a site should see a boost?

Not really, at least in most situations. For the vast majority of sites, if they're not "penalized" by Panda, it means they are "good enough" (otherwise they would have been hit).

And if a site is good enough to not be penalized, it's not likely to see gains.

There are, however, situations where gains can be had.

First, if the site is in a field where other competitor sites were hit by Panda, that site will naturally rise in rankings, even though technically, its total ranking score remained the same.

There is at least some anecdotal talk around the web about sites that got an actual boost. So I suppose it is possible that a site could see a boost in its ranking score from a Panda update as an isolated change. However without knowing the full spectrum of a given competitive landscape and whether that boost might have come from other sites falling, it's very difficult to know what caused that boost.

Sure, some data has come out, especially with the bigger Panda rollouts where researchers have shown "winners" and "losers". So it is possible.

I just think, however, that the overwhelming majority of sites out there fit in the "just good enough to have not been penalized" column.

I say that because every site I've ever audited, even ones that have not been hit by Panda or any other major algorithmic penalty, has had at least some flaws in its overall SEO, and that translates into at least some weakness / vulnerability.

Just my opinion though. Others may actually have done extensive studies on this concept where they were able to isolate sites in ways that could show their gains weren't triggered by competitor losses.

I find the Panguin tool is helpful when looking at the historic data. However, like anything else in SEO, it is best to not rely entirely on one tool, resource or signal. Having a broad evaluation allows for a more complete understanding.
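For anyone doing that overlay by hand, here's a rough sketch of the same idea with pandas: compare average organic traffic in the windows before and after known update dates. The dates, file name and column names below are illustrative placeholders, not an authoritative update list:

```python
# A rough sketch of the Panguin-style check done by hand: percent
# change in mean daily organic sessions in the window after vs. before
# each update date. Dates, file and column names are illustrative.
import pandas as pd

UPDATE_DATES = ["2011-02-24", "2011-04-11"]  # hypothetical examples

def change_around(sessions: pd.Series, date: str, window: int = 14) -> float:
    """Percent change in mean daily sessions across the update date."""
    d = pd.Timestamp(date)
    before = sessions[d - pd.Timedelta(days=window): d - pd.Timedelta(days=1)]
    after = sessions[d: d + pd.Timedelta(days=window - 1)]
    return 100.0 * (after.mean() - before.mean()) / before.mean()

# sessions.csv assumed to hold columns: date, organic_sessions
df = pd.read_csv("sessions.csv", parse_dates=["date"], index_col="date")
for update in UPDATE_DATES:
    print(update, round(change_around(df["organic_sessions"], update), 1), "%")
```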

That's essentially the boiled down list of broad bucket considerations of an audit as far as the type I perform goes. Each site is unique, so there are often other considerations.

Take for example, a site that has 200 sub-domains associated with it. Or a site that operates in a field where social engagement is vital to the SEO. So sometimes additional off-site evaluations (off of the main site in question) are also included. And that's just two of many other examples.

Responsive design. Ugh. Can't tell you how many responsive design sites I've audited where everyone claimed "it's just the presentation layer, so it doesn't impact SEO" and yet I found critical flaws in its execution that directly harmed SEO. :-)

The initial audit was a one-time paid project. I do, however, offer all my audit clients additional consulting during the implementation phase. If they need me extensively in that consulting beyond a couple hours, it's at an hourly rate. Otherwise I just offer the occasional support at no additional charge since they pay me very good money for the audit and action plan effort.

Thanks Doc! When Edward popped into Twitter last time (one of his rare brief appearances), he was surprised to learn a number of us still got value from it, and had been about to shut it down. Prevented a disaster!

Content organization in URLs needs to model the ideal way it's organized in site navigation (if site navigation is properly set up). In proper site navigation, it IS a funnel. You don't link to 5,000 pages in your main navigation bar do you? Or wow. You really need to NOT be doing that :-)

If you think "this really does relate to both these categories", yes, you can assign it to both. Except what if 90% of your posts fit that? That's 90% duplication. And users, to a certain degree, WILL go WTF? Why do I keep seeing the same posts over and over again?

THAT is why it's bad for SEO. Search engines do NOT want to annoy / frustrate searchers. So it's not JUST for search engines. It's for search engines BECAUSE it's for users.

You're welcome - every post I write is presented with the hope it will help others.

1. GPSI:

50/100 in GPSI. That's tragically low. And it's why I double-check and triple-check those findings with URIValet.com 1.5mbps speed data on a sampling of page template types, as it most often (not always) represents a middle-ground speed that Google sees across the volume of actual site users. And then go to WebPageTest.org - they have a grading system of their own, and you can test speeds there.
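If you want to pull GPSI scores across a sampling of template URLs without clicking through the web UI, here's a minimal sketch against the PageSpeed Insights REST API. The v5 endpoint and response shape are assumptions to verify; an API key is optional for light usage:

```python
# A minimal sketch of pulling PageSpeed Insights scores for a sampling
# of page template URLs. The v5 REST endpoint and response shape are
# assumptions to verify before use.
import requests

def psi_score(url: str, strategy: str = "mobile") -> float:
    resp = requests.get(
        "https://www.googleapis.com/pagespeedonline/v5/runPagespeed",
        params={"url": url, "strategy": strategy},
        timeout=60,
    )
    resp.raise_for_status()
    data = resp.json()
    # Lighthouse reports the performance score on a 0-1 scale.
    return data["lighthouseResult"]["categories"]["performance"]["score"] * 100

if __name__ == "__main__":
    for page in ["https://example.com/", "https://example.com/category/"]:
        print(page, psi_score(page))
```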

It's been shown through many case studies that improving speed and crawl efficiency, even if it doesn't improve SEO dramatically, can still improve conversion rates dramatically.

2. Changing Hosts

Changing hosts should not cause problems if that's all that's done. If the new server system is worse than the old one, then yes, that can cause problems. Or if the URL structure changes, that too can.

How much time it takes if there's a loss depends on the extent of the new problems. I've seen sites drop with a move and, once the new problems were fixed, begin to return to normalcy within a month or two. And I've seen others take several months to recover because the scale of the new problems was big.

In three cases, I saw rebound happen literally overnight once the problem was resolved.

In each case above, the problems were unique.

3. 2015

QUART - my five super-signals. Apply that to every single aspect of work that's done for a site. It's a very easy concept to remember, and with practice, becomes easier to apply across the board. Google is only going to get more restrictive, so the sooner site owners and managers get on board with quality, uniqueness, authority, relevance and trust, the sooner they'll be properly positioned for sustainability.

Like all tools, the Panguin tool has its value. And can be a time saver if things do line up.

You bring up a very important point - even after someone who can do the evaluation / audit work is involved, for many sites, even with client buy-in, there may not be the resources to apply enough effort consistently. Sometimes that just means more time is required (like the site you are talking about).

Sadly however, sometimes it's not enough to just keep plugging away on a small scale depending on the site, competitive landscape and limitation of resources.

In those situations, I sometimes end up needing to help the client understand it might be a losing battle. Painful to say, yet needed in those situations.

Yeah - educating clients to get them to realize the implications / consequences is so important. We can expend countless hours of our time, experience and energy in producing findings, yet if they don't grasp the seriousness from a bigger picture perspective, it's a waste of everyone's time...

UX is "User Experience". Search engines attempt to emulate user experience when they evaluate whether a page, section or site deserves to rank for a phrase the algorithms assess as most relevant for intent.

Regarding your site: I can't assume from a quick look what might be right or wrong for any single site. There are hundreds of factors.

A quick check shows you are ranking low on the 1st page of Google for "clipping path service". So you have a good foundation of SEO.

Where weakness might be could be any number of issues. Since you only have one page for each service, if your site is competing against sites that have multiple pages for each of your primary service phrases, that's a consideration.

I am curious - while I understand the value that can be gained from first paring down to one link per domain, what I have found in my work is that sometimes it helps to check two or three links - because when I do that I can sometimes spot artificial patterns that the first link checked APPEARS to have, yet where the confirmation I get from the 2nd / 3rd link checked is enough to move me from "looks suspicious, yet borderline" to "OMG blatant pattern".

What's your process in cases where a link may appear borderline on first pass? (If that's a trade secret, by all means no need to reply with a detailed answer!)

as someone who routinely refers my site audit clients to Marie, I can state for a fact that she's got the capacity to handle the big projects. Just one example - site has over 20 million pages indexed in Google, and when I sent her that client for her to help them clean up bad links, the site had millions of inbound links.

Here's the reality: One of the keys to my success in business and life is to know what I know, and know who to go to when I don't know something. Another key is "work smart, not hard". I've come a long way over the years, and much of my success is due to those people I go to when something is beyond my then current capacity.

Although I have continually evolved and learned more as each year has passed, there are some things I find hurt my brain. Regex and advanced Excel formulas are in that list. So rather than forcing myself to endure the learning curve (which yes, I admit I COULD do), I find it much more efficient in moments like this to listen to my "brain is about to be crushed by complex topics - bail out now" inner voice.

So while I most definitely do link evaluations as part of my audit services, like every other aspect of my audits, it's strategic level pattern identification only. That way I can leave the trench-work to great people like Marie who have their own path, their own passions and tolerances for such things.

I've gotten my audit process down to a highly efficient system, and because of Marie and others like her, there's no need for me to reverse that.

Okay so my head imploded about a minute into the "how-to" here. So you know this is REALLY good info and WAY over my simpleton analytics mind. Thank the heavens I rely on YOU for link audit work Marie. This post proves why I trust you so much for that work. :-)

As a site auditor who routinely has to task page speed improvement to clients, this post makes me giddy with delight. Thank you for taking the time to communicate your experiences, provide clear and specific recommendations, and include enough caveats to clarify real world factors that need to be considered.

I am so glad you took the time to write this. I've been teaching clients to stop thinking "link building" and start focusing on "brand building" for a while - however I hadn't written an article on the subject. So now I get to point them here - I will be including a link to this in all my audit action plans in the "brand building" section to help them understand that this is how the publisher / guest contributor relationship needs to unfold and what it needs to look like.

great post Pete. I use many of these variations during my audit cycle. Just using the site: operator with the -www can reveal sub-domains the client failed to mention. I also find playing with the site: method invaluable to get a quick look at how many pages are split across various sub-domains.
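As a purely illustrative sketch (these are common query patterns, not an exhaustive list), the kind of site: operator variations I lean on look like this:

```python
# Illustrative site: operator variations for surfacing sub-domains
# during an audit (common patterns only, not an exhaustive list).
def site_queries(domain: str) -> list:
    return [
        "site:%s" % domain,                         # everything indexed
        "site:%s -inurl:www" % domain,              # hosts other than www
        "site:www.%s" % domain,                     # the www host alone
        "site:%s -site:www.%s" % (domain, domain),  # sub-domains by subtraction
    ]

for q in site_queries("example.com"):
    print(q)
```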

if you are in position 1 for the phrase(s) you care about, there is no valid reason to change URL structure. In fact, doing so could trigger other issues, just one of which could be over-optimization. Always take a cautious approach to implementing keyword additions with a site or page you care significantly about.

if you have a 500 page site with 50 pages related to horses, and if you want success in the "horses" category, then there's no confusion of signals, only confirmation. If you want all fifty pages to be considered equal, but not rank for their common higher level topical focus, sure, skip the category layering. Just understand that it's going to take you that much more time to get all fifty to be recognized as being as valuable/important as all the others in that same level, right?

And as you point out in the article, it should be evaluated on a site by site, goal by goal basis.

I think it's also important to note that category infused URLs need to be considered when a primary goal is to help boost the SEO for the category. Having

domain.com/horses/

domain.com/horses/ponies

domain.com/horses/ponies/shetland

is definitely a valid structure if you want to show "I've got 50 pages all highly related to horses, and 30 just related to ponies within that".

Like you point out though, most people take that notion way too far, and end up hurting themselves more than helping. Especially when they don't recognize that it's "just one more signal" and requires its own dedicated off-site effort with inbound link work pointing to each "level".

Totally agree. With Penguin recovery being too rare at this early stage, we need to absorb what's shared, and continue to see what others experience rather than jumping to conclusions. It's a great day if for no other reason than the fact that we have such clear information on what was done to mitigate the drop - regardless of whether the return was manual or automatic, we can then use it as an initial baseline.

You lured me in and I'm glad I was curious enough to find out the REAL intent of the article. :-) Just shows that when you do your work right, you can truly find opportunities out there to "make good content others want to link to". Well done!

Whether people realize it or not, this industry is blessed to have not only the Moz tools, OSE and the Moz community, but more important, the people who make them and run them and give them ever increasingly more value. Every single Mozzer deserves congratulations for this.

Rand, it's been a joy to read your updates as you've taken the time to write them - I've only been following along a few years, yet it's just amazing to have been able to watch to this point, and I can't wait to see where you take the company moving forward.

On a final note, a BIG thanks to Gillian for all you've given to our community - may your future be blessed with even more happiness and joy than you've ever known.

yes - being banned from mech turk is one of the pitfalls of trying to use short-sighted tactics. The other is that an artificial attempt can potentially raise red-flags at Google if an unnatural pattern emerges. The beauty of the approach in the Romanian example is how actual marketing and social media were used to create a natural crowd-sourcing solution :-)

Clearly Romanians are smart! Too many people rely on spammy mechanical means of using cheap labor to artificially influence the Search Suggest box. When people come up with a creative, community gathering solution, it's pure genius!

Once again Rand, I applaud you for sharing actual numbers with us - TAGFEE is a blessed thing - and I'm just as glad to see this article because I've been saying for years how flawed such outside views are. Too many people, investors, advertisers, are lured by the data and quote these sites as if they were golden. It also validates my perspective that the methods used by these companies are unbelievably ridiculous. Their premise is that they get a big enough "sampling" that they can then extrapolate. What baloney! The web is too massive; there are too many mind models using the web.

People really need to wake up to reality and get out of 20th century concepts.

wow - how completely, utterly cynical, without having any facts. I've been performing audits for years and wanted a tool that does almost exactly what this one does. There have been a couple that almost came close over the years but never really got there. Already, in its first version, I'm finding it invaluable in our audit work. Being able to provide a visual reference to our link footprint recommendations is crucial in helping enterprise client VPs, EVPs and CEOs rapidly grasp the health of that footprint.

The fact that he links to the tool, to me, is a pure bonus above and beyond the actual article itself. Kudos to Eppie for having the innovative recognition of its value. And personally, I look forward to the day there's a paid version that provides even richer data visualization.

At Click2Rank we just did a complete makeover of our site. An entirely new design, and all new content. When it came to the services and the umbrella focus for the entire site, I went with my intuition: SEO Consulting services. Inbound marketing may feel like a better, more cutting edge catch-all, however I work and live in the world of serving corporate clients. Corporate clients have always been, and will always be, several years behind any "trend" - both because that's the speed at which corporate adapts, and just as often because they adapt slowly partly to prevent falling into the red herring trend hunt.

SEO, regardless of all the internally applied stigma (read that - stigma that we put on it ourselves, in various ways), and all the "SEO is evil" talk (from within and outside the industry), now has, as Danny so eloquently put it, 15 years of foundational strength. And any corporate executive tasked with online revenue responsibilities will most definitely look to SEO many more times, and with much more clarity of value, than inbound marketing or any other buzz words thrown about.

Okay, I can gladly retract the most inflammatory words related to market share. I will not, however, step back from my perspective. 20% share. Except even you admit it's a narrow band of user types that likely surfs with Chrome. By that admission alone, it validates the perspective that I hold - 20% of the market should not have the ability to directly influence the experience the other 80% have. It's just not a valid basis for making decisions that impact such a large share of the market.

Until Google gets out of the "We're going to give disproportionate weight to G+ because that way we can force people to use it" business, I don't think we'll see such a weighting data set for a long time.

The fact that Google and Bing BOTH use social data is really a bad thing right now. Even though social as a factor is in its infancy, the reality is that at this point, even with as many people as there are using social, the pool of data is corrupt.

Here's an example - Site Z ultimately has the best content, the best products and the best offerings in the market. Except Site Z doesn't have a properly evolved social plan. Maybe they can't afford it, or maybe they don't know better yet. Either way, for all the people in the social sphere, nobody's talking about Site Z there because of it.

That factor alone means that relying on social signals is a flawed concept.

It's no different than the fact that Google actually relies on Chrome data to influence things. Seriously. Chrome. The browser has such a tiny little fraction of just one sub-set of a geek based market share that it's bogus junk data to then extrapolate that "If Chrome users lean this way, then surely the rest of the world does".

Which goes back to the longer standing problem: companies that rely on flawed data will always do a lousy job of coming up with quality results. It's true with all the ranking sites as well. Compete, Alexa... Seriously. Who uses Alexa? And how is that isolated and myopically narrow user base really going to reflect all of society?

No - even when social becomes a bigger factor, it's still going to be a flawed factor. Which means that as soon as X, Y and J become "the top three social signals to optimize for", the search engines will once again realize that's producing flawed results.

So they'll have to come up with yet another hack trick to figure it all out. Yet to get back to my point, we're probably still a couple years away from even seeing what X, Y and J are. And by the time we learn it, things will have changed even more, in ways we can't even comprehend.

It really is rocket science. And as far as Wil's post goes, his biggest point is one I totally agree with - Search engines have to learn how to stop the bullshit.

Personally, I think the latest "we stopped using a signal recently" buzz from Google has to do with anchor text. They're going for proximity more than ever - relevance page to page specific to topical focus. The better they can get at that, the sooner crap links will become worthless.

I am blown away at how comprehensive this post is. Of all the "how to" and "tips" articles I've seen, this is by far among the very few at the top from a quality, depth, and value perspective. While I've followed many of these tips for a long time, and thus have found moderate success in my own efforts, they've almost exclusively been focused and contained within our industry. Though I've been interviewed for business radio, that was years ago. Reading this just turbo-charged my motivation to spread my wings out to the world of journalists and publications outside our industry.

I can't thank you enough for having taken the time to put this together and offer it up to the world.

Obviously Rand can respond to this directly, so I'm not going to put words in his mouth or claim I know his views on your comment.

What I want to speak to is my own opinion on what you said. And since it's just my opinion, please understand it's meant with respect of your own opinion.

Given that SEOmoz is not the size of, nor does it have the funding of, a company such as Google, and while some people, such as yourself, are fixated on the "quality of the results", as far as I am concerned the quality could be 100%, and yet if the data set were just a few hundred pages, SEOmoz would be worthless to me.

While the exact number of pages indexed isn't important to me, knowing that the moz team continue to strive to increase the volume of pages indexed means the world to me. Having an actual number assigned to it allows me to gauge the scale and scope of the progress they make in increasing that volume. It also humbles me to know that even a small company within the search community can manage to gather that much data, let alone do so with the intent, desire and willingness to also ensure as accurate a correlation data-set as possible.

As I communicated with Rand tonight on Twitter, having a much larger data set makes a great deal of difference to me in the work I do at the forensic level, because the larger the data set, the more likely patterns will reveal themselves. Patterns that are at the heart of my forensic analysis during SEO audits.

And sure, while there may come a point when not detailing counts may come across as "mature" to you, I for one hope Rand continues to do his best to maintain the TAGFEE policy.

the "find a ton of great stuff pre-made" concept (like review sites for example) is a great one. Current client had a review site showing 1st page of results for their brand, but they only had a 2 star rating when I took over. They'd never taken the time to actually let real customers know about review sites, so all that had been there was misinformation and generic rants about their industry but targeted at my client. 18 days after implementing my recommendation on informing customers about review sites, that two year old entry magically transformed into a four star rating - dozens of real customers posted reviews. Sure, some were only 3 stars, but most have been 4, and several 5 stars - so now even though it's prominently on 1st page of Google, it's a total win.

Actually when I was researching whether multiple domains would be either practical or offer realistic user value, I found countless big brands that do it. American Express for example, has many web sites. Some not even domain name brand matches. If you have enough of a topical focus to build an entire section of your site around, it's open to consider as a stand-alone site, as long as you have the willingness, resources and time to drive quality content and inbound links that each would need to stand on its own as an authority site.

Glad to see this whiteboard presentation focus on this particular tactic. It's exactly the strategy I've got my team working on for a couple clients. The biggest difference between this and most of what I see out there from people who take shortcuts is in how each of the related domains is done. Taking the time to actually model big brands is critical. Just slapping up a bunch of crappy sites with garbage content might get you owning the 1st page, at least in the short term, but it totally leaves out how negatively that can impact your reputation when people click on and go to those other sites.

So I spend the majority of my planning on that specific issue - what supporting sites can we create that, on their own, offer real information? This is not something that should be taken lightly or given little thought. The better you can do at identifying opportunities, the higher the quality of each of those sites, and that, in the long-run, will ensure sustainable presence, and trust.

And for anyone who says press releases offer little or no value: we currently take up multiple 1st page results for each of the clients we're doing this for, just from press releases alone. It's a total win.

Personally I don't recommend migrating content off of a main domain over to a sub-domain for Panda reasons. If the content moved does have value, what I've found is the only time to move it to a sub-domain for Panda is if it's information that's radically different in topical focus than the entire rest of the site.

What I have been recommending to clients instead, with great success in Panda recovery, is to take the time to properly organize the content for more refined topical focus, consolidate thin content as well as perceived duplicate content (beefing up unique content within individual pages), and remove the confusing over-used "related" content boxes that so many sites have where it's "sort of" related but not enough to be of laser-focus topical relevance. For many of those clients I also help them thin out the over-use of on-page advertising that further wraps around the main content areas, and finally, in some cases, have them improve page speed.

All of these techniques have helped clients rebound, though some have also then needed new social media and inbound link work done to reinforce the "we've cleaned up our act" perspective.

I take this route because of long-term reality.

By moving to sub-domains, you end up having to essentially build their authority up from the ground. Also, unless the content is truly unique, you're not doing site visitors any favors by isolating that content. And finally, if the content really is worthy and can be consolidated as I've described, you retain that much more depth for the main site, which is the most beneficial aspect of choosing this path from an SEO perspective.

Just my experience, involving several web sites comprising millions of pages ranked and millions of visits.

You've definitely hit the critical "thin content" points. While there are other Panda factors, site owners should definitely pay attention to and address any / all of these you've covered that they might have on their sites. Really good work on this article.

By doing something behind the scenes, it misses the opportunity to bring further awareness to a crisis that many around the world would otherwise not give a second thought to. By simply inviting people to click a thumb in participation, not only did it bring the crisis to many people's attention, it prompted several of us to kick in as matching 10% contributions.

Personally I'd already made a private donation, yet when I read this, it gave me an opportunity to look closer at my own finances as a business owner, and allowed me to see I could do even more. And that is a beautiful thing.

This is a great day for the SEO community. Honest. No BS. OSE alone is worth pro membership fees - and for newcomers to the industry and people needing to stay on the cutting edge, the plethora of resources in a Pro membership are beyond invaluable. To now offer a free 30 day trial is beyond great to see, because it'll give an untold number of people who might have been missing out on why this is such a great resource just enough of a motive to check it out. And once they do, they'll see what they'd been missing.

I'm going to chime in on this one. I see comments such as yours posted frequently across all the top search marketing blogs and forums in response to what you, I and many others already consider "common sense". The issue however, is that this industry is filled with people who, regardless of time in the industry, still don't know, consider, or implement true across-the-board best practices.

Some don't because they have mostly learned on their own, and might just now be exploring more information. Others might have explored some information previously and ran with what they learned, though it meant lacking thorough research. And there are countless other reasons.

Just reading the comment threads here, and seeing how many questions, clarifications, and other dialogue this article generated in regard to specifics on various points, shows that it's healthy to have this kind of list for our industry to refer to.

Even if someone thinks they know everything, at the very least it causes us to pause momentarily, and at worst, confirms that we are on the right track.

And on a final note, I'd suggest that maybe articles offering information beyond this would benefit from being labeled "extremely advanced" for someone such as yourself, if you feel you know as much as you do - to help you skip everything else. And in the meantime, the rest of us can continue to discuss these topics, regardless of our own previous understanding.

While link wheels may have some positive impact, it is not due to their being recognized as acceptable link building. It's due to the fact that Google might have some difficulty in weeding out some link wheel schemes.

To be clear, link wheels are not an SEO best practice. I've personally seen several previously well positioned competitor sites drop to the Google vortex after their link wheels were caught by the algorithm this past year.

Just because something may in fact provide ranking value does not make it a best practice.

OMG. I refrained from chiming in on this sub-thread til now for a couple reasons. First, the women of SEO are more than sufficiently capable of standing up to this kind of stuff, and don't need me or any other man to come to their aid. Second, I'm human, and far from perfect. Mostly a wise-guy in fact within our peer community. And I happen to banter back and forth with some of my women friends in ways that would not otherwise be acceptable. We do so knowing that it's a mutually agreed upon thing, all in fun.

Yet because of that, I didn't want to chime in here only to have you or anyone else then just go and take something I have said in those situations, pull it out of context and get all indignant.

Having said all that, I need to say that you are one of the most myopic people I've ever had the displeasure to observe as far as blind cluelessness when it comes to one of the most important issues on this planet.

One thing I need to point out is this - WTH is it that causes you to even refer to the women in our industry as "girls"? You do it over and over. Not once have I read anything in this thread where you referred to women as "women". Do you even realize that by using that word, you immediately and instantly reveal yourself as a sexist bigot?

Seriously. They're NOT children. They're adults.

And have you also noticed a pattern here? Where you feel "ganged up on"? Because I don't think you have any willingness so far to look within. That pattern alone is a massive red flag. It's also a gift. For you to stop typing in defense of your mis-guided belief. And to ask yourself some serious fundamental questions about how you see women, not just in our industry.

@Digital10Media - this past spring/summer, this issue came up regarding whether Google still factors the Meta Description. It came up after Maile Ohye, senior developer programs engineer at Google, had been talking about their relevance.

Someone had misquoted her, with the end result being a claim that Google is now, once again, counting it in relevance evaluation. Except she didn't say that. It was a complete misquote. Lots of people chimed in, and several of us, just to be sure, went and conducted tests, with varying degrees of scientific rigor. Not one single test by anyone participating showed in any clear, verifiable, repeatable way that the Meta Description field is being used for relevance.

If you would like to read more about this, you can visit Search Marketing Wisdom and read my article on it with links to others.

TopSEOs has been written about many times over the past few years. This past summer several of us wrote articles, and I did a multi-article exposé on their tactics, methods and deceptive practices. Their response was to issue a public apology to the SEO community, make several major changes to their deceptive process, then proceed to act like they had resolved everything, when, in fact, they continue, to this day, to push crappy companies to the top of their lists based on those companies paying for the position.

They do now happen to seed legitimate companies within those crappy listings; however, at no time should anyone ever become enamored with the phony "ratings" they offer.

I would sincerely suggest that you understand that no truly reputable company would ever offer the types of services you seem to think "reputable" companies offer. And take heed of this article's list of tactics - because they are truly things to avoid, and actual reputable companies understand this.

I guess what it comes down to is our perspectives on how some people, both within the industry and site owners, read something like this, with the result being that such concepts then get added to the list of "important" SEO methods. My frustration comes from the countless hours spent with new clients having to help re-educate them after they read, or are told by "experts", that such things are considered good SEO.

Yes, these exercises could help people understand how search engines work. No, it's not good for people to read this stuff without also understanding that sometimes Google has a valid, and truly important reason for not wanting to count additional links. Both from a spam prevention perspective as well as a user perspective.

Why have multiple links on one page pointing to one other page? I can sometimes see, at most, two links on a page (one in navigation and one within content), but only if the 2nd link is justified for usability, not just SEO. But three, four, or more? No, I just don't believe there's a valid reason from a user perspective. So yet one more problem that arises with this type of article is the further pollution of content with bogus links purely for SEO.

That then brings up a very common problem I encounter - over-saturation of links on pages. Site after site I audit typically has 300, 400 or more links on a typical page. It's just crazy. Not only does that severely deteriorate SEO, it also confuses the user most of the time.
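A quick way to spot that pattern during an audit - a minimal sketch, assuming requests and BeautifulSoup are available, with a purely illustrative sample list and review threshold:

```python
# A quick sketch for flagging link over-saturation: count anchor tags
# per page across a sample of template URLs. The sample list and the
# 300-link review threshold are illustrative, not a hard rule.
import requests
from bs4 import BeautifulSoup

def link_count(url: str) -> int:
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    return len(soup.find_all("a", href=True))

SAMPLE_PAGES = [
    "https://example.com/",
    "https://example.com/category/widgets/",
    "https://example.com/product/blue-widget/",
]

for page in SAMPLE_PAGES:
    n = link_count(page)
    flag = "  <-- review" if n > 300 else ""
    print("%4d links  %s%s" % (n, page, flag))
```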

So while I'm not advocating the suppression of information as a general rule, I am advocating a much higher responsibility from you and all of us within the industry who know and understand the ramifications, the unmentioned concerns, to speak up. Both to educate people and to ensure the surface magic isn't blinding.

As something that could be researched as a curiosity concept, I think it might possibly be considered an interesting article.

When I read it, the article implies you can get more ranking value through multiple links from a single page. That's a huge red flag.

I routinely deal with enterprise and multi-million page sites that have top navigation polluted with visitor tracking code.

I have no desire to have to combat a client's belief that "it's good for SEO because it means we get extra points due to multiple links", because honestly, that tracking code should not be there. It's bad SEO. We already know that Canonical implementation is purely a "hint", and having to use it is NOT a best practice except out of necessity to combat hack factors.

Articles of this nature DO potentially cause confusion and misguided beliefs about what should be clean best practices policies based on the more valid long term optimization path.

Even beyond the minute potential here, this kind of tactic is a fishing expedition at best, and destined for the Google scrap heap.

It's really also a matter of responsibility to newcomers in our industry who don't have that experience, and see "oh wow - I have to start using this stuff because it was promoted from YOUmoz..."

Such as? Such as the hundreds of factors discussed, reviewed, and confirmed as factors that are not likely to be slapped by Google. Unlike this "method", which is clearly a counter to Google's attempts to weed out overly linked garbage. There are only two reasons to provide a link within a page pointing to another page. User experience or SEO. If the SEO method is NOT tied directly to high quality usability, it's bogus. And doomed from the start.

So even if it has ANY value - and you have NOT shown that it does have any real world value on real web sites in highly competitive markets - such value would have to be so insignificant as to be a waste of professional web site owners' marketing budgets. Better they focus on improving the quality of those factors known to be high value.

It's hack concepts like this that pollute the web with information totally irrelevant for sites in the real business world. Nobody should ever want to expend worthless time implementing such techniques when there are countless other, more long-lasting, valuable methods that too many people ignore or overlook.

see - this is why people like Dana and I love you so much. Personally, I live and die by YOY data. It's even that much more valuable for clients in seasonally based business - trying to compare the summer of 2010 to the fall or winter of 2010 is so misleading - but compare Christmas season 2009 to Christmas season 2010? WIN!
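As a minimal sketch of that comparison (the file and column names are assumptions), the pandas version of "Christmas 2009 vs. Christmas 2010" looks like this:

```python
# A tiny sketch of the year-over-year comparison: the same seasonal
# window across two years, instead of adjacent seasons. File and
# column names are assumptions.
import pandas as pd

df = pd.read_csv("daily_visits.csv", parse_dates=["date"], index_col="date")
xmas_2009 = df.loc["2009-11-25":"2009-12-31", "visits"].sum()
xmas_2010 = df.loc["2010-11-25":"2010-12-31", "visits"].sum()
print("YOY change: %.1f%%" % (100 * (xmas_2010 - xmas_2009) / xmas_2009))
```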

Actually, if Twitter and FB choose to adopt it, it just means they can. They don't care about real SEO. Neither site ever has. And in fact, because they do adopt it, people who don't know better then think exactly what you just stated. So it proves it's reckless.

Advocating anything to do with content in AJAX from an SEO perspective is reckless if you don't also point out the fact that this is still a bad practice. Just because Google can now supposedly index content inside AJAX does not mitigate, in the least, the several flaws with it. Let's separate out the pure usability issues and focus exclusively on the SEO aspects.

Google Only

First, just because Google can, does not mean any other search engines can. So right away, you're excluding all those people who come from other engines.

Topic Optimization Limits

Most AJAX content is presented on a page that has multiple AJAX links, typically tabs. You can NOT properly optimize a page for all the varying content displayed inside AJAX tabs. The topical focus of that page becomes trash.

Designer /Developer Free-For-All

As soon as you tell designers or developers Google can index AJAX by this new method, they completely go off the deep end and overload what should be mission-critical content, into AJAX displays, thus further killing SEO.

The Bottom Line

The bottom line, if you care about proper SEO best practices, is to NOT use the Google protocol for indexing AJAX content.

I can only speak anecdotally, however I do believe that exact match domains carry too much weight. I see it a lot, and it explains why there have been intermittent explosions of exact match domains that also include State or City names followed by or following the keywords.

I do believe that the engines are going to have no choice but to discount them at some point. I just don't anticipate it coming any time in the near future. Which is tragic.

The other pet peeve this reminds me of is sites getting top ranking when they only have one slender page on a keyword phrase but they get the ranking due to the sheer size of the site, even though the other 150,000 or 3,000,000 pages have nothing to do with the topic. Like Wikipedia. Or About.com. Sheer weight of overall domain depth should not be such a significant factor.

I have to agree with Dr. Pete. Entire sections of a site should be sliced out of the equation. Better to expend more energy on those higher level pages in this scenario. The extremely minute PR that such deep pages pass isn't necessarily worth the effort.

This really IS a must-read article... It's this kind of knowledge that separates people who "think" they know SEO from advanced professionals. Now - if I could just do a better job of motivating clients to actually implement such tasks when pointing them out... < sigh >

I think that's a valid concern for people new to our industry; however, it also relates to many people who have been around a long time and don't stay focused on the bigger picture.

My view on the data, as I originally stated in the first article's comments, is essentially that I've personally seen topic model importance in my own work, and that LDA seems to be a valid representation of my own work. Yet I also personally don't ever get caught up in hype, hyperbole or marketing spin even if it's subtle or unintended. I'm not most people though. Most people get totally lost in that, in their search for that golden ring.

I do think Rand does a reasonable job of making disclaimers with these "studies", yet in all honesty, with so much information overload to digest, most people gloss over even those efforts.

Is the answer to then not share this info? Maybe. Maybe it's to share this info with the first paragraph being an entire H1 red colored disclaimer.

People also need to recognize, yet most never do, that the data being provided is coming from a company that is in the business of selling its analysis tools. So whether it's intentional or not, the spin will, as a natural aspect of the business process, drive users / visitors to want to make use of the findings. That's not a terrible, evil thing. It's a business reality.

If there are in fact 200 factors, then each individual factor is going to be a tiny fraction of that. I think people get too lost in any single factor as being the golden ring of SEO. People really need to grasp the visual here. Topics are a factor. That's something we should all be able to accept as reality. Whether it's 0.1 or 0.3 or whatever, is not something ANY of us has the capacity to determine. Trying to do so leads to arguments and rebuttal rants beyond insanity. Really.

And since the LDA tool is not based on multi-word phrases, which Rand already acknowledged, people really need to get over trying to think LDA or the LDA tool are that golden ring. And instead, should just focus on across the board best practices, not getting hung up over and over on any single factor.

I'd have to agree that a much more complex model than has been described is being used by the search engines. I mean, when you've got literally thousands of sites, and in turn tens or hundreds of thousands (or even millions) of pages that "could" show up for any single phrase, LDA type models are going to be required to sort it all out for relevance.

Personally, I think it's got just as much to do with on-site cross-page sectional relevance as it does with individual page and cross-site relevance. And it means that people need to do a much better job at refining focus in these things than most in the SEO industry do. Or they can keep doing what they have, which would just make my work easier. :-)

In any case, I hope more people pay attention to the concepts at the core of this, and that they realize the new Moz tool is only really something to get them thinking as much as anything else. Because how many single word keyword searches take place for sites that most of us optimize for (or should be optimizing for)?

oh right - duh. If a mobile page is what they choose to put in the results instead of a non mobile version, that would be a great opportunity to compare things. Because it could mean that the non-mobile page is too "heavy" and itself might benefit from trimming down.

I'm a bit concerned regarding the duplicate content aspects. While they don't count this as duplicate content in the broader sense, I'm reading this as meaning that they may show a mobile page as opposed to a non-mobile in the regular results - so if I want page A in the SERP and instead, they show the mobile version of page A, that's an issue.

Yes, that's addressed if you use detection and automatically send the user to the version you want. Yet I think it also means you can't just create a stripped down mobile version by removing SEO specific content in that version. Whether it's long-tail search content, or links used for boosting value /authority of other pages, these are things that have to be in both versions.

I know you address this by simply using style sheets and not stripping out content, however I just think it's important to emphasize that there are bigger picture reasons to do so.

I bring it up because I've got a non-profit site I maintain where the mobile version is a lot thinner on content than the non-mobile version.

I really appreciate the details you provide Oli. You did a great job going the extra mile rather than just spewing out one liners.

And uh, for the record, if you were actually IN a 12 step program, you'd know that there are no doctors involved, and instead, it's our sponsors who give us the good orderly direction we need :-) But as a member of such a program, I won't take away any points from you for that little discrepancy :-)

The more I think about it, the more I have to agree with the issue of parsing words / interpreting. I tend to become a stickler for interpretation of word usage being based on my own internal filters rather than through a detached observer view and often forget I do so...

Either way, the data overall is fascinating and something that most of us don't look at on our own.

Thanks for coming out with this additional information from the original research. I think this does a better job at comparing data in a cleaner way than the previous article.

At the same time however, the statement "The ranking correlations suggested that the H1 tag isn't much of a differentiator, yet lots of people still swear by them:" is off the mark.

If the majority of the results you studied are sites that were optimized by people who don't happen to use the H1, that in and of itself is not a legitimate basis to claim that the H1 isn't a differentiator. Sure, your data might IMPLY this, but why even include such a statement here?

You're not comparing sites that use it to sites that don't given all other factors being equal. Since you're not making that comparison, such a claim only pollutes the information provided, and only confuses the matter given all your initial disclaimers.

Article marketing is dead. Anyone still doing it is expending too much time to get any real quality value, or they're pushing garbage out and getting garbage back.

It's already been said here, but guest blogging and fresh content for your own site are so much more valuable. As is working on a dozen or more other current algorithm factors. Like getting mentions in high quality places. Improving existing SEO in on-site ways that are more important now than they were 5 years ago when article marketing was all the rage.

No, spammers and people paying 20 cents an article killed article marketing. It really is time to move on people.