Search Engine Optimization

Everyone knows that Web 2.0 technologies have permanently shaken up the practice of Search Engine Optimization. But when people discuss the confluence of Web 2.0 and SEO, they’re usually talking about blogging. After all, we all know that search engines love blogs because they’re dynamic, link to each other frequently and have well-structured code. Blogs usually beat metatagging and link exchanges on a static website.

But what about Facebook applications? Until recently, search engines weren’t indexing them. But according to Justin Smith of Inside Facebook:

Facebook recently enabled developers to serve XML sitemaps off the apps.facebook.com domain. Sitemaps are used by webmasters to notify search engines of updates to pages and page structure, and are generally a worthwhile exercise in any SEO strategy. Since apps are served from apps.facebook.com, developers get to ride on the back of Facebook’s PageRank – potentially a big leg up on regular web apps.
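As a sketch of what such a sitemap looks like, here is a minimal generator following the sitemaps.org 0.9 protocol. The app URLs are placeholders, not real Facebook application pages:

```python
# Hypothetical sketch: building a minimal XML sitemap per the
# sitemaps.org 0.9 protocol. The URLs below are invented for illustration.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a sitemap XML string listing the given page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "http://apps.facebook.com/example-app/",       # hypothetical app pages
    "http://apps.facebook.com/example-app/about",
])
print(sitemap)
```

A real sitemap would usually also carry `<lastmod>` dates so crawlers know which pages changed.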

As of this writing, the domain www.facebook.com has a Google PageRank of 8. It’s entirely possible that a well-optimized application page could be indexed by Google as being more relevant than a company’s own website. An inbound link from an application page could also make your site more relevant.

If you’re attempting to make the case for developing a Facebook application to reach your audience, don’t forget to mention the SEO benefit to your boss.

It’s a cool and useful breakdown of what you can do with your copy to boost your results in Google. I’d recommend reading the whole series for ideas on how to tailor your blog posts for a better showing.

But it’s a five-part series, and let’s face it, most of us are lazy. So here’s the big not-so-secret secret: almost 90% of what you can do to get good search results comes down to getting linked to.

As Brian Clark puts it:

That’s why any true SEO copywriter is simply a writer who has a knack for tuning in to the needs and desires of the target audience. And due to the pursuit of links, those needs and desires have to be nailed well before you’ll ever show up in the search engines.

…

“Ask yourself what creates value for your users,” sayeth Google. As those brainy engineers continue to diligently create better algorithms, combined with people-powered social media tagging and blog-driven links, copywriters with a flair for prompting link response and conversions will become vital members of any search engine marketing effort.

In other words, good SEO copywriting is linkbait.

I think that it goes a little bit farther than that, though: I’m betting on Google. Google’s entire business is based around providing the best search results to whoever is searching.

According to Gray, one of the biggest issues with WP from an SEO standpoint is that it puts content in a lot of different places:

Main Index

Categories

Date Archives

Author Archives

Duplicate content is a major problem in SEO because Google can’t tell which copy of a page is the authoritative one, and it may rank every copy lower as a result. You want to keep nice little silos for all of your information.
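One common way to build those silos is to leave only the permalink indexable and mark the duplicate archive views "noindex". As a hedged sketch of the idea (the URL prefixes and this helper are invented for illustration; they are not WordPress APIs):

```python
# Illustrative sketch of the "silo" idea: index only one copy of each post
# by marking duplicate archive views "noindex". The path prefixes and this
# helper are hypothetical, not part of WordPress.

ARCHIVE_PREFIXES = ("/category/", "/author/", "/date/")  # assumed URL layout

def robots_meta(path):
    """Return robots meta content: index permalinks, skip duplicate archives."""
    if any(path.startswith(prefix) for prefix in ARCHIVE_PREFIXES):
        return "noindex, follow"  # crawlers may follow links but not index
    return "index, follow"

print(robots_meta("/category/seo/"))           # a duplicate archive view
print(robots_meta("/aircraft-leasing-post/"))  # the canonical permalink
```

The `follow` half matters: you still want crawlers to travel through the archives to discover the permalinks, just not to index the archive pages themselves.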

As many of our conference attendees know, I put a high priority on finding news and relevant content hidden away in traditional HTML so we can introduce it into the RSS ecosystem. Done right, it avoids contributing to the echo chamber and creates what economists call a “Pareto improvement”: everyone involved is made better off and no one is made worse off.

A classic example of this surrounds a post I made today on our bigbusinessjet site. Here’s the chronology:

1) My favorite Firefox plugin, Update Scanner, noticed that one of the better (yet archaic) HTML subject-expert sites we monitor has a new article posted. Aviation gurus Conklin & de Decker have written a piece about aircraft leasing. See below; Update Scanner even highlights the new item on the page.

2) Click through to the article and read.

Notice: At this stage Google has not noticed that the article exists.

Nor has Google indexed anything (yet) with the same string I used for my post headline.

3) Write an overview post, link back, and encourage readers to click through.
We get a nice post, relevant to our readers, Conklin & de Decker gets an inbound link and the resulting traffic. We win, Conklin & de Decker wins, and (see below) readers that previously had no idea this content existed can now find it.

The good news: 5 minutes later Google has indexed my post.

The not-so-good news (which should self-correct in a few hours or days): Google sees us, but not Conklin & de Decker yet, for the article-title search string.

We’ll follow this over the next few days and see what Google picks up on and when.

But while the article mentions blogs here and there, it never states explicitly that blogs cover most of what the experts it interviewed recommend, without a lot of fuss. After the jump, I’ve broken the article down into its basic components and explained why blogs can help you do just about everything the Journal article suggests.

We all know that Alexa ranking is a fuzzy measure of a site’s actual traffic. It is extremely vulnerable to selection bias. But I contend that it’s still a useful tool.

When we were working to determine which bloggers should get press passes to CES, we looked at Alexa ranking as one of many factors in deciding whether a blogger had a large enough audience to qualify as “press.” Combined with a number of other qualitative and quantitative factors, Alexa rank can be a good indicator.
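As a purely hypothetical illustration of weighting Alexa rank alongside other signals (the factors, weights, and numbers here are all invented, not the criteria we actually used):

```python
# Invented example of blending several audience signals into one score.
# Lower Alexa rank means more traffic, so it is inverted before weighting.
def press_score(alexa_rank, inbound_links, posts_per_week):
    """Combine a few audience signals with hypothetical weights."""
    reach = 100_000.0 / alexa_rank          # invert: smaller rank scores higher
    return 0.5 * reach + 0.3 * inbound_links / 100 + 0.2 * posts_per_week

# A hypothetical blog: Alexa rank 50,000, 300 inbound links, 5 posts a week
score = press_score(50_000, 300, 5)
print(score)
```

The point is not the particular formula but that any single metric, Alexa included, only becomes useful when blended with other evidence.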

Norvig’s results should serve as a useful reminder that no one statistic or qualitative assessment — especially one that is susceptible to so much bias — should be used as the definitive indicator of a site’s merit.