Monthly Archives: April 2007

Matt Cutts [post] requested that people use their account in the Webmaster Central console to report sites that are selling links. This kicked off a storm among the webmaster and SEO community, which was bound to happen. Here are a few of my thoughts on this whole paid link issue.

It’s Just The Next Step
Over the past few years we’ve seen all kinds of methods used to try to artificially inflate a site’s PageRank, but every step of the way Google has been making updates to discount these methods, which have included:

1) Link farms
2) Reciprocal links
3) 3 way links
4) Directories

There may be others, but these have been the most obvious methods tackled.

This Isn’t New
Matt warned about paying for links specifically for PageRank purposes back in September 2005. In that post he says: "Google does consider buying text links for PageRank purposes to be outside our quality guidelines."

I talked to my business partner about it, and the only things that really came out of our conversation were questions about how Google would detect paid links and how aggressive the algorithm would become. Our concern is that if they turn the wick up too high, it will catch valid, unpaid links.

Although being caught in the crossfire and wrongly identified as buying or selling text links wouldn’t be fun for me or any of my clients, I do think Google will step very cautiously on this issue.

People Have Been Asking For This
One thing I heard at SES London was that people wanted a way to report paid links specifically.

Redirects Pass PageRank
The suggested guidelines for anyone selling links are to use the rel="nofollow" attribute, which prevents link popularity from being passed on, or to use a redirect script that is disallowed in the robots.txt file. Something some people may not have picked up on: the latter method confirms that redirect scripts do pass on link popularity.

Affiliate Links
Some people are stretching the definition to include affiliate links or other links given through some means of compensation. Remember, Google’s goal is to identify sites that sell links for PageRank. Affiliate links are not put in place to artificially inflate PageRank; in fact, by their very nature they don’t, because most often they aren’t text links with a key phrase as the link text. The link either goes through a third-party tracking script or includes an affiliate ID, neither of which is ideal for inflating PageRank.

But Google Sells Links – It’s Hypocritical!
Yes, they do, but those links are not intended to artificially inflate a site’s PageRank; they’re designed to send traffic.

Doesn’t Google Have the Resources To Do This?
Yes, of course they do, but what better way to gather many examples than from people in the field who are monitoring their competitors? Let’s face it, many people do a lot worse than reporting a bit of paid link spam.

You Democrats Can’t Have It Both Ways!
1) At pubcon 2006 Matt Cutts did a red/blue poll and the overwhelming majority were blue.
2) People complain that Digg.com isn’t being democratic when it pulls certain stories
3) Since Al Gore invented the internet (yeah I know… ) it should be a democratic state right? ;-)

But you complain when Google wants to put a stop to sites buying their way up instead of obtaining links based on their merit?

So You Like What Google Is Doing?
Yes, I love it. I have even sent in my own paid link spam report. It levels the playing field and hopefully gets rid of a lot of junk. I want to see quality content in the SERPs and on sites I visit – all those obvious paid links don’t add any value to me reading the site and I never click on them.

Google’s plans to dominate the online advertising world took another leap forward today with the announcement of its acquisition of DoubleClick.

At Google, we are constantly looking for new, innovative ways to make the information you want more accessible and more relevant—and to deliver it as fast as possible.

Once Google gets firmly established in the other major media outlets – TV, radio, newspapers – you have to wonder what’s next.

In 2034 will I wake up each morning and listen to my Google clock radio? Will I have ads beamed onto the inside of my shower curtain? When I take a number two, will I have TP with Sponsored Sheets? Will advertisers be billed on a PPR (Pay Per Roll) model?

I really wanted to attend so I could listen firsthand to some of the presentations and get the chance to put a few faces to names (and voices). And perhaps along the way I could have reminisced about my apartment on 79th and 1st.

Unfortunately work duties have me tied up a lot these days, so attending conferences has been pushed down the list of priorities.

Fortunately the SEO community does a pretty good job of blogging about the latest trends and techniques in the online marketing world. So thanks to all you attendees and eager bloggers.

Part 1 – Detecting http and https Mode Using Javascript
A while back I came across a scenario where a website (typically an ecommerce site) serves parts of itself in both http and https modes. These sites typically use the same template or footer include file for both modes. This causes a security alert popup in the browser, because the remote JavaScript file is called using an http request. While this isn’t a security threat, it could cause less technically savvy users to be concerned about the site’s security and perhaps not complete the transaction.

Google does offer webmasters the ability to request the urchin.js file using an https call, which works well. But what we really need is a way to detect which mode we’re in and then make the appropriate request for the JavaScript file.

With help from some members on SEORefugee we figured out how it can be done.
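The idea can be sketched along these lines, assuming the urchin.js tracker of the era; the `urchinSrc` helper name is illustrative, not from the original discussion:

```javascript
// Pick the protocol-appropriate host for the urchin.js request, so an https
// page never pulls the script over plain http (which is what triggers the
// mixed-content security alert).
function urchinSrc(protocol) {
  var host = (protocol === "https:") ? "https://ssl." : "http://www.";
  return host + "google-analytics.com/urchin.js";
}

// In the browser, write out a <script> tag for whichever mode the page is in.
if (typeof document !== "undefined") {
  document.write('<scr' + 'ipt src="' + urchinSrc(document.location.protocol) +
                 '" type="text/javascript"></scr' + 'ipt>');
}
```

Because the detection keys off `document.location.protocol`, the same footer include works unchanged on both the http and https sides of the site.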

Part 2 – Only Obtaining External Referrers
Sunday night I was looking through my Top Content report and realized that my hack to obtain the full referrer is fairly indiscriminate: it records all referrers, both internal and external. I already knew about this, but I guess that night I was tired and grumpy and it bugged me just enough to want to fix it.

The whole point of my hack was to obtain the external referrer, so I came up with some more JavaScript to detect whether the referrer is internal or external and write out the urchinTracker call accordingly, so that only external referrers are recorded.
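The internal/external check itself reduces to comparing the referrer against the site's own hostname. A rough sketch; the function name and the placeholder hostname are mine, not from the original hack:

```javascript
// Treat a referrer as external only if it is non-empty and does not point
// at our own hostname (a simple substring check).
function isExternalReferrer(referrer, ownHost) {
  if (!referrer) return false;                      // direct visits: skip
  return referrer.indexOf("://" + ownHost) === -1;  // internal pages: skip
}

// In the browser, only record the referrer when it comes from outside:
if (typeof document !== "undefined" && typeof urchinTracker === "function") {
  if (isExternalReferrer(document.referrer, "www.mywebsite.com")) {
    urchinTracker(document.referrer); // log the full external referrer
  }
}
```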

The Grand Finale
So putting all this together we get this:
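The combined snippet did not survive in this archive; what follows is a hedged reconstruction of the idea, assuming the urchin.js tracker of the era. The helper names and the placeholder hostname are illustrative, not the original code:

```javascript
// Hypothetical reconstruction -- helper names and placeholders are
// illustrative, not the original snippet.

// Part 1: request urchin.js over whichever protocol the page itself uses.
function urchinSrc(protocol) {
  var host = (protocol === "https:") ? "https://ssl." : "http://www.";
  return host + "google-analytics.com/urchin.js";
}

// Part 2: only count referrers that come from outside our own site.
function isExternalReferrer(referrer, ownHost) {
  if (!referrer) return false;
  return referrer.indexOf("://" + ownHost) === -1;
}

// Browser wiring:
if (typeof document !== "undefined") {
  document.write('<scr' + 'ipt src="' + urchinSrc(document.location.protocol) +
                 '" type="text/javascript"></scr' + 'ipt>');
  var _uacct = "UA-XXXXXX-X"; // your Analytics account number
  if (isExternalReferrer(document.referrer, "www.mywebsite.com")) {
    urchinTracker(document.referrer); // log the full external referrer
  }
}
```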

Just replace the XXX’s with your Analytics account number and “www.mywebsite.com” with your website.

The lovely DazzlinDonna tagged me for another one of these meme things. Since I’m on lunch right now I guess I’ll throw out 5 reasons quickly.

Personal Brand Building – Like many other professional search marketers, I had been lurking in the wings and posting in a rather anonymous fashion. But after attending pubcon in November 2006, I realized that I really needed to start building my personal brand.

A Professional Requirement – As a professional SEO with clients who are starting to blog, I also needed more experience with WordPress and blogging in general. I’ve provided help with everything from blogging platform recommendations to style, theme and category selection, blog optimization, and marketing.

Expanding Reach Past Forums – Every now and again, I’ll want to share a little tip or trick I’ve found, and blogging enables me to reach more than the regulars on the forums I frequent. For example, my post about the Google Analytics hack has received ~14,000 visitors since its publication in mid-January.

Fun Factor – With all the various widgets, plugins, chicklets, gizmos, thingies and whatsits, you can link to and share all kinds of information which makes blogging a lot more fun than publishing on a plain old website. I really like the fact that I know exactly who has recently been reading my blog with the mybloglog widget.

SEO Experiments – The previous version of my website had a page about my modified Volkswagen Beetle along with various Beetle related in and outbound links. Because of that, my related:www.reubenyau.com query shows Beetle related websites. Since changing my overall theme of inbound links and topic, I’m monitoring how long it will take Google to figure out my new link neighborhood. I’m also looking at how link popularity filters through my blog, as well as possible duplicate content issues with the category and archive pages.

Google is known for testing out changes within its results across a limited set of datacenters every now and then. It looks like they’re testing the color scheme again in the datacenter (209.85.165.104) which is serving my network.