A lot of people read blogs, books, articles and other materials from so-called 'gurus' who made it big. Readers hope to do the same, but I know that no one will match that success by simply following a guru’s advice. I am not saying this because I think those gurus are necessarily dishonest or holding back (though most are), but because I've learned that you don't become successful by following a specific formula—especially not a formula that somebody else gave you.

Why? Because success is in many ways all about competition. If you learn from other people's playbooks, they already have the first-mover advantage. You too need to be first somewhere. You need to find unexploited opportunities of your own.

In this post, I want to tell you about my personal experiences with pay-per-click (PPC) and what I’ve learned over the years. I started my first profitable site back in 2002 on borrowed credit with a CAD$3,000 limit. I turned that $3k into $4.5k with PPC and affiliate commissions—a 50% ROI. It was my first time as an online marketer, an online baby step. And now, several years later, I own a 7-figure-per-year business, employ several talented individuals and have time to post on this blog! Read more →

As is well known by now, Google decided to remove the supplemental label from pages it adds to its supplemental index. That is unfortunate, because pages that are labeled this way need some “link lovin’.” How are we going to give those pages the love they need if we can't identify them in the first place?

In this post, I want to take a look at some of the alternatives we have to identify supplemental pages. WebmasterWorld has exposed a new query that displays such results, but nobody knows how long it is going to last. Type this into your Google search box: site:hamletbatista.com/& and you’ll see my supplemental pages. I tested it before and after Google removed the label and I'm getting the same pages. Read more →
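If you want to run this check for your own domain, the query is easy to build programmatically. Here is a minimal sketch, assuming the `/&` suffix trick described above still works (Google could drop support for it at any time):

```python
# Build the undocumented "supplemental results" query URL for a domain.
# The trailing /& is the trick exposed on WebmasterWorld; there is no
# guarantee Google will keep honoring it.
from urllib.parse import quote_plus

def supplemental_query_url(domain):
    """Return a Google search URL for the site:domain/& query."""
    query = f"site:{domain}/&"
    return "https://www.google.com/search?q=" + quote_plus(query)

print(supplemental_query_url("hamletbatista.com"))
# https://www.google.com/search?q=site%3Ahamletbatista.com%2F%26
```

Paste the resulting URL into your browser to see which of your pages Google is treating as supplemental.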

Consider two chess players, Mike and Tom. Mike has never been able to win against Tom. Mike knows all the rules of the game: how to move every piece, when to capture, when to castle; he even knows all the tactical ideas like forks, pins, skewers and discovered attacks.

Mike's problem is that while he knows all the rules and has read a lot of chess books, he still looks only one or two moves ahead. It is very hard to prepare a winning plan thinking so short-term. Tom, on the other hand, thinks at least five moves ahead and moves all his pieces so that they complete his master plan. Tom anticipates all of Mike's moves and prepares for them with a strong counterattack.

The world of search engine optimization is no different than this. As SEOs we need to think ahead of our competitors and, more importantly, we need to be on top of search engine advances. The world of search is moving so fast that any slacker will be left behind, with no time or opportunity to catch up. Check and mate. Read more →

We have discussed before how to control Googlebot via robots.txt and the robots meta tag. Both methods have limitations. With robots.txt you can block the crawling of any page or directory, but you cannot control indexing, caching or snippets. With the robots meta tag you can control indexing, caching and snippets, but only for HTML files, as the tag is embedded in the files themselves. You have no granular control for binary and non-HTML files.

Until now. Google recently introduced another clever solution to this problem. You can now specify robots meta directives via an HTTP header. The new header is the X-Robots-Tag, and it behaves like, and supports the same directives as, the regular robots meta tag: index/noindex, archive/noarchive, snippet/nosnippet and the new unavailable_after directive. This new technique makes it possible to have granular control over indexing, caching, and other functions for any page on your website, no matter the type of content it has—PDF, Word doc, Excel file, zip files, etc. Read more →
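To make the idea concrete, here is a minimal sketch of a server sending the X-Robots-Tag header on a PDF response, using nothing but Python's standard library. The directive combination shown (noindex, noarchive) is just an example; any of the directives listed above would work the same way:

```python
# Sketch: attach an X-Robots-Tag header to a non-HTML (PDF) response,
# something the in-page robots meta tag cannot do.
from http.server import BaseHTTPRequestHandler, HTTPServer

def robots_header(*directives):
    """Join robots directives into an X-Robots-Tag header value."""
    return ", ".join(directives)

class PdfHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "application/pdf")
        # Tell crawlers not to index or cache this binary file
        self.send_header("X-Robots-Tag", robots_header("noindex", "noarchive"))
        self.end_headers()
        self.wfile.write(b"%PDF-1.4")  # placeholder body

# To try it locally, uncomment:
# HTTPServer(("", 8000), PdfHandler).serve_forever()
```

In practice you would more likely set this header in your web server configuration (Apache, for example, can add it per file type), but the effect on crawlers is the same.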

We all know that building solid, natural and authoritative links to your site or blog is the best way to obtain unshakable rankings. There has been a lot of chatter lately about using social networking sites to help build traffic and eventually links. Those links are hard to get, unless of course you have a power user account at a popular site like Digg. Unfortunately, getting a power user account involves a lot of work that most bloggers are not willing to put in.

Power users carry more weight than regular ones; the main benefit is that you need far fewer votes to make it to the Digg homepage. One hundred power users account for around 50% of the stories that make it there. The traffic you get from a Digg homepage link is not itself profitable, but many of those eyes glaring at the screen belong to the linkerati – influencers who will link to and blog about your story, giving you a lot of very valuable natural and authoritative links.

The basic ingredients for success at Digg are: diggable posts/articles, a power user submitting your article, and a digg-friendly landing page. Most stories make it to the home page if they get over 50 votes in less than 24 hours. To help you achieve such numbers, I've seen many blogs directly or indirectly promoting a service called Subvert and Profit (S&P), designed to get all the votes needed to land on the Digg homepage. You basically pay $1 per vote and they pay the Diggers $0.50. This means that if you create a diggable post and a digg-friendly landing page, you only need to invest around fifty bucks to make it to the Digg homepage. And if you can make it there, you can make it anywhere. Right. Read more →
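The economics above are simple enough to sketch in a few lines. The vote price, payout, and 50-vote threshold come straight from the post; everything else follows:

```python
# Back-of-the-envelope sketch of the S&P numbers described above.
VOTE_PRICE = 1.00      # what you pay S&P per vote
DIGGER_PAYOUT = 0.50   # what S&P pays each Digger per vote
VOTES_NEEDED = 50      # rough homepage threshold within 24 hours

campaign_cost = VOTES_NEEDED * VOTE_PRICE
sp_margin = VOTES_NEEDED * (VOTE_PRICE - DIGGER_PAYOUT)

print(f"Cost to you: ${campaign_cost:.2f}")   # Cost to you: $50.00
print(f"S&P keeps:   ${sp_margin:.2f}")       # S&P keeps:   $25.00
```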

My old pal Skitzzo from SEOrefugee revisits what he calls an SEO “myth”: that a competitor can potentially harm a site owner just by pointing links to his or her site.

Judging by the number of Sphinns, it looks like a lot of SEOs agree it’s a myth. That’s understandable, as it would be very unfair for the search engines to allow this type of thing to happen.

Unfortunately the situation is not as simple as it first seems. As has been my practice on this blog, let's dig a little deeper to understand why—although difficult and possibly expensive—it is entirely possible to pull off this exploit. For those concerned, I explained how to counter this type of attack in a previous post about negative SEO. Check it out. Read more →

Getting solid rankings is a lot of work, and properly organizing keywords and landing pages is no trivial task either. Why not make the most out of it once you have started getting the traffic? After beginning a successful PPC or SEO campaign, it’s time to maximize the returns from it.

There are a lot of metrics that search marketers can track, but these are the three that deliver 80% of my results: Bounce Rate, Conversion Rate, and Return on Investment (ROI). Read more →
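All three metrics are simple ratios, so they are easy to compute from the numbers your analytics package already gives you. Here is a minimal sketch; the campaign figures are hypothetical, chosen only to illustrate the formulas:

```python
# The three metrics as simple ratios.
def bounce_rate(single_page_visits, total_visits):
    """Fraction of visits that left after viewing only one page."""
    return single_page_visits / total_visits

def conversion_rate(conversions, total_visits):
    """Fraction of visits that resulted in a sale or desired action."""
    return conversions / total_visits

def roi(revenue, cost):
    """Return on investment: profit as a fraction of what you spent."""
    return (revenue - cost) / cost

# Hypothetical campaign: 1,000 visits, 600 single-page visits,
# 25 conversions, $500 spent, $750 earned.
print(bounce_rate(600, 1000))     # 0.6
print(conversion_rate(25, 1000))  # 0.025
print(roi(750, 500))              # 0.5, i.e. a 50% return
```

Tracked together, these tell you whether a landing page keeps visitors (bounce rate), whether it sells (conversion rate), and whether the whole campaign is worth the spend (ROI).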

Link building is without a doubt the most time consuming—but most rewarding—aspect of search engine optimization. It usually takes more effort to promote your content (build links) than to actually create it. As I have stressed repeatedly before, compelling, useful content should make your link building efforts much easier.

Before I go any further, let me note that I have a slightly different perspective on evaluating link-building tactics than most SEO consultants. I do SEO primarily for my own sites, and my income depends on those sites' ability to make money. That means I try to build links that primarily offer long-term value. I still go after short-term and medium-term links, but I like to build authority for my sites. If you’re working for a client or a boss who wants to see immediate results, your priorities will probably be different.

In situations where I have to pay or put in some serious effort to get a link, the most important criterion is always: will the link send useful, converting traffic?

Why is this my most important criterion? Let's explore three different scenarios to illustrate: Read more →

One of the most important measures of success for a blog is the number of RSS subscribers. There are many blog posts out there about how to increase your number of subscribers. They range from the use of bigger, more prominent and attention-grabbing RSS buttons, to offering bonuses for signing up. While you can use all sorts of tricks, at the end of the day it is really about the value you give to your visitors on an ongoing, consistent basis. Personally, I subscribe to any blog that sparks my interest, but as soon as I see the quality drop I unsubscribe just as quickly. So many blogs, so little time!

Let me introduce another way you can increase your RSS subscribers that I have not seen covered anywhere. It works by identifying your best RSS referral sources and focusing your marketing and networking efforts on those.

I frequently get asked why a particular page is no longer ranking. I wish there were a simple answer to that question. Instead of giving personal responses, I’ve decided to write a detailed post with the possible problems that might cause your ranking to drop, as well as all the solutions I could think of. I also want to present a case study every week of a ranking that dropped and what we did to get it back. If you have a site that is affected I invite you to participate. Send me an email or leave a comment.

There are many reasons why your page or website might not be ranking. Let's go through each of the three steps in the search engine ranking process and examine the potential roadblocks your page might face. We’ll see how to avoid them, how to identify if your page was affected, and most importantly, how to recover. Read more →