Profile Information

Independent SEO consultant in Portland, Oregon. Recently founded Visual Itineraries, a service that helps travel agents show destinations, hotels, sights, etc. to their clients. In my free time, I enjoy motorcycles (racing, touring, and watching MotoGP), traveling, and photography.

Great WBF, Rand! I'm happy to see that your thoughts on on-topic vs. off-topic sites align with mine (mostly 'cause it makes me feel smarter :-).

When talking about this subject with clients, I usually throw out an example like the NY Times or Huffington Post--sites that Google probably wants to trust THE MOST when it comes to an editorial link, and yet the sites LEAST likely to be seen as on the same topic as any given site they link to, since they write about, well, pretty much everything.

I just can't see Google giving any extra credit for links just because the SITE is on a similar topic--however, I can definitely buy that Google would care whether the PAGE is talking about similar topics.

Let's presume that as part of Google's relevance (or Panda, maybe) algorithm, they use the presence of commonly co-occurring terms as a measure of how relevant a page is for a given term (and/or how likely the page is crappy, over-optimized, keyword-stuffed-till-your-eyes-bleed spam). It would make a ton of sense to use that same bit of logic to decide whether there's a decent intersection between the topics and commonly co-occurring terms of the linking page and the linked page. Think of how good a job that would do at separating real, useful recommendations in places like forums from plain old forum comment spam, for example.
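Just to make that concrete, here's a toy sketch of the kind of check I'm imagining--absolutely NOT what Google actually runs, just the general idea of scoring how much the significant terms on the linking page overlap with the linked page:

```python
# Toy illustration only -- not Google's algorithm, just the general idea:
# score topical overlap between a linking page and the page it links to
# by comparing their significant terms (a simple Jaccard similarity).
import re

STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "for", "on"}

def significant_terms(text):
    """Lowercase, tokenize, and drop stopwords and very short words."""
    words = re.findall(r"[a-z]+", text.lower())
    return {w for w in words if len(w) > 3 and w not in STOPWORDS}

def topical_overlap(linking_page_text, linked_page_text):
    """Jaccard similarity of the two pages' significant-term sets (0.0 to 1.0)."""
    a = significant_terms(linking_page_text)
    b = significant_terms(linked_page_text)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# A genuine forum recommendation about a resort overlaps heavily with the
# resort page it links to; a "cheap pills" spam comment doesn't.
print(topical_overlap("loved this overwater bungalow resort in the Bora Bora lagoon",
                      "Bora Bora resort guide: bungalow types, lagoon views, snorkeling"))
print(topical_overlap("buy cheap pills online best prices fast shipping",
                      "Bora Bora resort guide: bungalow types, lagoon views, snorkeling"))
```

Obviously the real thing would be way more sophisticated (term weighting, phrases, co-occurrence stats across the whole index), but even something this crude separates a genuine recommendation from drive-by comment spam.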

#1 A case where you'd disavow only a link, and not the whole domain: spammy links on a big UGC site, like a forum or Yahoo Groups, for example--you'd want to keep any good links from legitimate users there.

#2 I recommend you check out LinkRisk for backlink toxicity analysis--I've been using it for a month or so now, and am quite impressed--especially with its risk assessment. Seems way more accurate than Link Detox. With LD, I typically had to hand-inspect 90% of the domains, whether they were marked as very risky or not. And no, I have no commercial relationship with LinkRisk (except I'm going to subscribe as soon as my free trial runs out!).

I think a company that's really good at designing and producing a product often isn't also good at selling it themselves. And there are plenty of companies out there who are good at the selling, but don't have the ability and/or interest to make products themselves. So I think it's not necessarily a bad thing for the retailers to outrank the manufacturers in some cases. If the manufacturer wants the whole pie, they'll have to step up their game and be good at every piece of the sales funnel :-)

You nailed it, Paddy. Oh, and a super-awesome link-building tactic you should have mentioned is blog comment spam, like the two folks above just tried :-). Two people who (a) don't understand that a URL in plain text and an actual link aren't the same thing, and (b) don't realize their spammy comments will get nuked in minutes by you or an editor (they'll probably be gone by the time I click Submit) :-).

Yes, keyword research remains very important....so you can figure out exactly what anchor text you need to build links with :-p. I'm pleased at Google's progress in terms of understanding topics instead of keywords, but a bit baffled that their reliance on anchor text remains as strong as it does.

Marc, I'd agree with your point. I tend to prefer more bite-sized, better-organized pages. All I'm doing here is reporting on someone else's study that found that Google has different tastes than ours :-), presumably because THEIR studies told them users want more content per page!

I do too, Chenzo. Google ranking factors number in the hundreds--and virtually no page does well on all of them. It's definitely possible to rank well by doing well on some and pretty much failing on a few others...if the competition similarly only does well on some of them.

In terms of restructuring: it's worth doing some A/B testing. In some studies I've seen, pages condensed into one long page did better; at MozCon, one of the speakers (anybody remember who? He was really good...) ran an A/B test on their home page where they cut it down to almost nothing and saw something like a 30% improvement.

While I agree that great UX ought to make for great rankings, the challenge here for Google is that they have to assess UX algorithmically--the web is too big to have a set of human analysts rate every page. Hence Panda...an algorithm that tries to replicate the ratings given by a set of human analysts who looked at a very small subset of the web.
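That's purely my speculation on the mechanics, but the mental model I use is something like this: have human raters score a small sample of pages, pull measurable signals out of those pages, and train a model that can then score everything else. A toy sketch with made-up features and made-up ratings:

```python
# Purely speculative toy model of the "replicate the human raters" idea --
# hypothetical features and invented ratings, not anything Google has published.
from sklearn.linear_model import LogisticRegression

# Each row: [words_of_unique_content, ad_blocks_above_fold, pct_boilerplate]
pages = [
    [1200, 0, 0.10],   # long original article, few ads
    [ 150, 4, 0.70],   # thin page, mostly template + ads
    [ 800, 1, 0.25],
    [  90, 6, 0.80],
]
human_ratings = [1, 0, 1, 0]   # 1 = raters called it high quality, 0 = low quality

model = LogisticRegression().fit(pages, human_ratings)

# Now the model can score the billions of pages no human rater ever looked at.
print(model.predict_proba([[300, 3, 0.55]])[0][1])  # estimated "quality" probability
```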

#3 OK, but it's a trade-off: a fabulous design that nobody ever sees won't get the links, likes, shares. I suspect this is one of the reasons Flash is pretty much dead (oh, what have I started now....!)

#4 I agree with the "hire a photographer" advice...but let's say you're a travel site covering just the South Pacific--the costs are going to be astronomical. I do have a number of clients seeing nice success with the collage effect.

#5 I'd agree....partially. The reason Google appears to be in love with video (video thumbnails notwithstanding...SIGH) is that users like video in many instances where we USED to just throw them some text and pics. Video doesn't just have to be for "how to" or "tour" applications: it's an awesome way to do a company intro, for instance, and give your company and team a "face" and "voice" that customers can relate to. It's also a great way to do product reviews. I have a client who does smartphone app reviews (thousands of them), and ALL they do is video reviews. It works in more instances than you might think. Having said that, yeah, forcing a video where it just makes no sense to the user is dumb :-)

Hi Tim, thanks. Absolutely, Google can recognize it. As a test, do a search for "Bora Bora Pearl Resort", and take the URL of the awesome aerial photo of the overwater bungalows. Then do a reverse image search on that.

Now do the math on some of the dimensions....a bunch of those are NOT just shrunk, they're cropped.
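If you want a quick way to "do the math," compare aspect ratios: a straight resize keeps the width/height ratio, a crop usually doesn't. Made-up dimensions here, just to illustrate the check:

```python
# Quick aspect-ratio check: a plain resize preserves width/height ratio,
# a crop usually changes it. All dimensions below are made up for illustration.
original = (2000, 1333)   # the aerial photo, hypothetically

candidates = [(800, 533), (640, 640), (1024, 683), (500, 280)]

orig_ratio = original[0] / original[1]
for w, h in candidates:
    ratio = w / h
    verdict = "likely just resized" if abs(ratio - orig_ratio) < 0.02 else "cropped (or padded)"
    print(f"{w}x{h}: ratio {ratio:.3f} vs original {orig_ratio:.3f} -> {verdict}")
```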

That's an easy example...I've seen a number of cases where clients have used a stock photo, superimposed a marketing message or heading on the image, and made a new image...and Google still finds it in reverse image search. In my experience, you have to do a fair bit of modification to NOT have it spotted.

In terms of Panda, you need to forget about whether it's visually appealing or not. The code that's looking at your page has no taste, and no fashion sense :-)

But to be more specific about the header: if it's part of your template, your page might be very beautiful, but a header that appears on every page of your site is highly unlikely to contain the content the user was looking for in their query. So Panda doesn't really want to see much of that header.

What's really interesting and revealing is to do a reverse image search on an image that's NOT all over the net...and see what Google thinks are similar images. I'm curious and fascinated by what they might be doing to recognize images algorithmically.

Thanks Tom. In an ideal world, you can make your UX behave exactly the way you want it to, while structuring/marking up the content in such a way that Panda can recognize the "goodness" you've got on the page.

Thanks David. I strongly suspect that Google Panda will detect embedded video in an iframe--if for no other reason than it's the default embed method for YouTube, so if they DIDN'T handle it specially, they'd miss most embedded videos. Also, you've got to expect that the Panda on-page analysis code is running pretty much separately from the link-harvesting code.

So, while I believe your best odds are to use the embed code, I think the risk of your content not being spotted because it's in an iframe is pretty low (presuming you use one of the big video hosting companies like YouTube, Wistia, Vimeo, etc.).

Right, Scott....Google can see this WITHOUT Analytics getting involved. All of the URLs in the SERP are tracking URLs (look at the source code of a search results page). Via those tracking parameters, Google can watch the same person click on one result from a search, then a few seconds later come back and click on a second result from the same search.
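Just to illustrate why those redirect URLs make this so easy for them (hypothetical log format--obviously not Google's real data or pipeline): once every result click goes through a URL they control, spotting that "pogo-sticking" pattern is trivial:

```python
# Hypothetical click log: (seconds_elapsed, search_id, clicked_result).
# Invented for illustration -- the point is just that redirect tracking URLs
# make "clicked result A, bounced back, clicked result B" easy to detect.
clicks = [
    (0,  "searchA", "result1.example.com"),
    (8,  "searchA", "result3.example.com"),   # back to the SERP, tried another result
    (0,  "searchB", "result2.example.com"),   # clicked once and stayed
]

POGO_WINDOW = 30  # seconds between clicks on the SAME search that looks like a bounce

last_click = {}
for ts, search_id, url in clicks:
    if search_id in last_click:
        prev_ts, prev_url = last_click[search_id]
        if ts - prev_ts <= POGO_WINDOW:
            print(f"pogo-stick on {search_id}: {prev_url} -> {url} after {ts - prev_ts}s")
    last_click[search_id] = (ts, url)
```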

I completely agree with your points about new Maps vs. old. It's interesting that Google takes such pride in search speed (showing the fraction of a second the search took...that's been there forever), yet in products like G+ and Maps speed clearly isn't a priority at all--at least with the new Maps (I'd say the old Maps was pretty fast to react to scrolling/zooming compared to Bing, MapQuest, etc.).

And thanks for the tip on swiping to get to the other listings in the app--I'd hit a wall there a number of times.

It's brutal that you have to resort to knowing the URL parameters for some of these features. Especially the by:expert...that'd be an amazingly useful button to have.