Google's recent algorithm changes, made in response to the outcry over inbound links contributing excessively to search-engine rankings, have created a lot of repositioning for many high-value keywords. The SEO blogs are abuzz with how the changes affected both white-hat and black-hat rankings, and many websites that relied on inbound links for their primary position saw steep declines in their positioning.

Does Google's constant tweaking of link weighting matter? Yes and no, depending on your SEO strategy. Google is determined to increase its use of inbound links as a measure of page popularity, as it further integrates social media links in its algorithm. If you depend on inbound links for your SERPs, then you are vulnerable to the ongoing juggling of SERPs we've seen since the Caffeine update, which only accelerated with the latest changes.

If, on the other hand, your primary positioning depends on relevance and on-page content, you will be less affected by the "Google Dance", and every SEO worth his salt knows this. "Content is king" has never been truer now that inbound links are under constant scrutiny and tweaking by Google, Yahoo and Bing, and content should continue to be the primary focus of any company trying to build relevance online. Linking will continue to drive ranking, but depending on it is a bad strategy for stable SERP rank. Like the "three L's of real estate", you should focus on "content, content, content", and the links will come as a result.

The New York Times reported on Sunday that J.C. Penney had been exposed for implementing link-bait on an unprecedented scale, skewing search results and leading Google to levy severe penalties on the company's page rank against important keywords.

The article, titled "The Dirty Little Secrets of Search", detailed how Penney had enjoyed first-position results for highly competitive keywords like "dresses", "bedding" and "area rugs", and more valuable terms like "skinny jeans", "home decor" and "comforter sets". What was revealed was a widespread campaign of seeding thousands of links to J.C. Penney on largely irrelevant, unrelated, even obscure websites. This practice, commonly known as "link-farming", is a well-known black-hat technique for gaming Google's search results, even though Google publicly announced several years ago that it was preventing this technique from skewing its results.

Apparently not so. While Penney feigned ignorance about the use of link-farming, it summarily fired its search company, and Google proceeded to take punitive action by de-ranking the company for various keywords.

While Google insists that external links matter less to a web page's SERPs than content, they are a component Google can't ignore, especially as a gauge of momentary popularity. And while Google claims to monitor SERPs for evidence of link-farming, identifying social-media link abuse is a larger problem, as those signals feed Google's "real time search" rankings, which take into account links from Facebook, Twitter et al.

What's chilling for most white-hat SEOs is that black-hat techniques are alive and well, and put white hats at a competitive disadvantage. A black hat interviewed for the article implied that "S.E.O. is a game, and if you’re not paying black hats, you are losing to rivals with fewer compunctions." Even Matt Cutts, Google's top search-spam cop, noted that it's impossible for Google to police every link scam, although they do red-flag suspicious things like rapid growth of inbound links. It shows, however, that any proactive action on Google's part requires manual intervention by an employee; there is no automated process in place yet to deal with this type of exploit.

Microsoft’s latest iteration of its server-side programming language, .NET Framework 4, has new features designed specifically to address SEO shortcomings in previous versions of the Framework. In a recent whitepaper on Microsoft’s ASP.NET website, a number of these features were detailed:

Permanent (“301”) Redirects

Permanent redirects send a visitor from one page to another, but their function in SEO is critical because they tell search-engine crawlers requesting the old page to transfer all its accumulated rank to the new page, and simultaneously drop the old page from the search engine's results.

Returning a “301” response in the header tells the requestor that this redirection is permanent (a “302” response indicates a temporary redirect, which does not help SEO).

Prior to Framework 4, permanent “301” redirects were accomplished in one of two ways: either injecting the status and location headers into the response before it’s sent back to the requestor (using Response.AddHeader), or using IIS 7’s optional URLRewrite module (prior to IIS 7, rewriting was usually performed by a third-party ISAPI module). Now permanent redirects can be handled in a single line of code:

Response.RedirectPermanent("/newpath/foroldcontent.aspx");

If your website uses dynamic pages built from a database, this new syntax makes it incredibly simple to manage site upgrades and URL changes. If you’re trying to permanently redirect static content to new pages, URLRewrite maps are still your best friend (now directly supported by .NET 4’s routing engine).
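For a database-driven site, the lookup-and-redirect pattern might be sketched as follows (the GetNewPathFor helper is hypothetical, standing in for whatever query maps your retired URLs to their replacements):

```csharp
// Global.asax.cs -- a minimal sketch of a database-driven permanent redirect.
// GetNewPathFor() is a hypothetical helper that would query the table
// mapping old URLs to new ones, returning null when no mapping exists.
protected void Application_BeginRequest(object sender, EventArgs e)
{
    string oldPath = Request.Url.AbsolutePath;
    string newPath = GetNewPathFor(oldPath);

    if (newPath != null)
    {
        // Sends the 301 response and ends the request in one call (.NET 4)
        Response.RedirectPermanent(newPath);
    }
}
```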

Setting Keywords and Description META tags

While the keywords META tag has pretty much fallen out of use on major search engines, the META Description tag is still one of the most important elements of SEO, since its content is both used by search engines and displayed in search results. It plays a critical role in click-through rates and needs to correspond closely with the page content. It also needs to be unique to each page it appears on, or it will negatively impact search rank.

.NET Framework 4 provides two new ways to add META tags at runtime (before the page has been sent to the requestor). You can now add them directly to the “@Page” directive at the top of every ASPX page, along with the page TITLE:
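For example (the title, description and keyword values here are illustrative):

```aspx
<%@ Page Language="C#" Title="Widget Catalog"
    MetaDescription="Browse our complete catalog of widgets, updated daily."
    MetaKeywords="widgets, widget catalog" %>
```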

Prior to Framework 4, programmers commonly inserted this content manually using placeholders, or overloaded the “header” declaration in the page object. Now the page object itself has been extended with specific properties, making for a one-line solution:
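In code-behind, the new Page.MetaDescription and Page.MetaKeywords properties reduce it to a line each (the strings here are illustrative):

```csharp
// Page_Load in the code-behind; set per-page META content at runtime
protected void Page_Load(object sender, EventArgs e)
{
    Page.MetaDescription = "Browse our complete catalog of widgets, updated daily.";
    Page.MetaKeywords = "widgets, widget catalog"; // largely ignored by major engines
}
```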

For dynamically generated content, this is a tighter and less error-prone method, and allows for a complete separation of programming and page design.

Improved Browser Capability Providers

Another frustration for .NET developers has been browser compatibility. If you’re designing pages to display properly on mobile devices as well as desktop computers, the “.browser” file tells .NET how to render pages for a specific browser. This process has been streamlined in .NET 4 with the new, cacheable and extensible HttpCapabilitiesProvider.
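A custom provider subclasses the default implementation and is registered once at application start; a minimal sketch (the AcmePhone user-agent check is a made-up example of a capability tweak):

```csharp
using System.Web;
using System.Web.Configuration;

// A minimal sketch of a custom capabilities provider; the mobile-detection
// logic below is illustrative, not a real device rule.
public class SimpleCapabilitiesProvider : HttpCapabilitiesDefaultProvider
{
    public override HttpBrowserCapabilities GetBrowserCapabilities(HttpRequest request)
    {
        HttpBrowserCapabilities caps = base.GetBrowserCapabilities(request);

        // Example tweak: flag a hypothetical in-house device as mobile
        if (request.UserAgent != null && request.UserAgent.Contains("AcmePhone"))
            caps.Capabilities["isMobileDevice"] = "true";

        return caps;
    }
}

// Registered once, e.g. in Global.asax Application_Start:
// HttpCapabilitiesBase.BrowserCapabilitiesProvider = new SimpleCapabilitiesProvider();
```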

Replacing URLRewrite with Routing

Most web developers have a love-hate relationship with Microsoft’s URLRewrite Module, which allows rewriting and redirection rules to be applied before pages are compiled. While IIS 7 provided a simple rewrite-rule generator, it was accessed through the IIS Admin interface, putting it out of reach for developers on shared hosting environments.

Routing support existed in .NET 3.5 sp1, but it has been simplified and improved in .NET 4 with the following features:

The PageRouteHandler class, which is a simple HTTP handler that you use when you define routes. The class passes data to the page that the request is routed to.

The new properties HttpRequest.RequestContext and Page.RouteData (which is a proxy for the HttpRequest.RequestContext.RouteData object). These properties make it easier to access information that is passed from the route.

The following new expression builders, which are defined in System.Web.Compilation.RouteUrlExpressionBuilder and System.Web.Compilation.RouteValueExpressionBuilder:

RouteUrl, which provides a simple way to create a URL that corresponds to a route URL within an ASP.NET server control.

RouteValue, which provides a simple way to extract information from the RouteContext object.

The RouteParameter class, which makes it easier to pass data contained in a RouteContext object to a query for a data source control (similar to FormParameter).

After creating a public class for the routing object, .NET 4’s “MapPageRoute” method simplifies the syntax back down to a one-line statement:

RouteTable.Routes.MapPageRoute("SearchRoute", "search/{searchterm}", "~/search.aspx");

This example sets up a route mapping an SEO-friendly search URL (e.g. “mySite.com/search/widget”) to a physical page, and passes the search term as a route parameter. On your search page, you can capture the requested term in a single line as well, in your code-behind:
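A minimal sketch, assuming the “SearchRoute” route defined above and a search.aspx code-behind:

```csharp
// In search.aspx.cs -- pull the {searchterm} route value in one line
protected void Page_Load(object sender, EventArgs e)
{
    string searchTerm = Page.RouteData.Values["searchterm"] as string;
    // ...run the product search with searchTerm...
}
```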

Google has mentioned in the past that it uses some 200 separate ranking factors to evaluate the relevance of a web page to a keyword (the "Google Algorithm"). But for SEOs, trying to reverse-engineer those factors is like trying to peek at the man behind the curtain with tweezers: through multivariate testing and other empirical methods, we can make changes to websites and monitor the search results, but only broadly guess at which changes carry the most weight or have any effect at all.

Darren Revell of Recruitwise Technology threw down a challenge on the Search Engine Land discussion group on LinkedIn, challenging its members to identify all 200 ranking factors. A heated discussion followed, which identified most, if not all, commonly accepted factors. After a week of discussion, Darren summarized the results:

156. How often the listing is clicked within the SERPs (relative to other listings)

157. Use of Google Checkout on your site

158. Domain name (an important factor for ranking within its particular sector)

159. CSS compression for size: eliminating white space, using shorthand notation, and combining multiple CSS files where appropriate (GZIP can also be used)

160. Use of CSS sprites to help consolidate decorative images

161. No redirection to other URLs on the same server through Flash banner images

[Factors marked with a "?" are subject to dispute as to their importance]

The sheer number of identified factors is overwhelming at first, but it also shows how interrelated many ranking factors are. Improving your title tags, for example, will address many individual ranking factors at once, as would targeting inbound links from websites with good reputations and content relevant to your own. Publishing content for syndication, when done properly, will create many positive factors as well. And of course there are negative factors to avoid, which would reduce relevance.

All in all we came up with approximately 80% of Google's ranking factors...but like the recipe for Coca Cola, the rest are some of the search industry's most tightly guarded secrets. And one can assume many unidentified factors are weightings between groups of individual factors. Will we ever know Google's secrets? All we can continue to do is test and measure, and slowly build up the empirical evidence that will point to the rest.

Thanks everyone for the inspiration and collaboration...knowledge is search-rank power!

With Yahoo close to ceding second place to Microsoft Bing, and now sharing ad revenue with its former competitor, many search analysts predict a swift decline for the former web content powerhouse. In the waning days of September, both Nielsen Media and Comscore reported that Bing has surpassed Yahoo in search share, and it will likely continue to take over Yahoo's share as more Windows 7 computers come online (Bing is the default search engine in IE8, hastening the process).

Bing introduced many innovations in search results last year, placing Google in the improbable position of playing catch-up in its search results pages. Bing's consolidating results onto a single page was quickly copied by Google for its image search, and Google's latest innovation, Google Instant predictive search results, is receiving mixed reviews after producing incoherent, and sometimes offensive, results. The AJAX-enabled search results are also reducing the number of clickthroughs on Google itself, although this has little impact on PPC revenue for Google.

With ad revenue spiraling downward at Yahoo, however, the company has positioned its portal as more community-oriented than Google or Bing, fighting it out with AOL and other classic content portals providing "unique" content. But web communities are coalescing around social networks, not search portals. My prediction: Yahoo will be fully absorbed by Microsoft in less than two years.

Google’s Matt Cutts posted an hour-long video from Google I/O 2010, where he and three SEO experts performed live reviews of websites submitted by webmasters. What was striking was how poorly many websites have been optimized, when the basic rules are public and easy to meet.

The first website, “Phoenician Stone”, a manufacturer of stone mantels, tiles, etc., had no text on the homepage at all and a poorly descriptive two-word title tag (“Phoenician Stone”). The only significant amount of text was in the meta keywords tag, about which Matt made sure to mention, “Google doesn’t index that text”. He went on to emphasize, “We [Google] don’t trust the meta keywords tag”.

SEO Tips To Take To Heart

Tip #1—Put text on your page

Tip #2—Think about what users will type when searching for your services, and put those words on the page.

Cutts recommended using any free keyword research tool to find actual search phrases people use on search engines.

The second example was “rodsbot.com”…as Matt noted, the domain name is not particularly descriptive or intuitive (it displays weird Google Earth images). Like the first website, there was virtually no text on the homepage, but since this was a “community” site where individuals posted images, an easy way to generate lots of search-relevant text would be to include users’ reviews and comments. “Why do the work when you can get someone else to do the work for you, right?” mused Cutts, rhetorically. Another point Cutts made was that the owner of the website had six other websites, and clearly wasn’t devoting enough attention to each site for any of them to rank well.

What’s in a (domain) name?

The next site profiled was a news site about Google’s Android operating system called “androidandme.com”. The homepage was top-loaded with ads and a large logo area, to the point that most of the actual content was pushed below the bottom of the screen. While search engines may return the website because the content’s there, the drop-out rate on the page will probably be higher than it should be, because the content is too hard to find. On the positive side, the website was running the latest version of WordPress, and was configured to use descriptive names in its URLs.

But how do you differentiate your website from others covering the same industry or products? Cutts pointed out that branding “outside the box” would help differentiate your website from the rest of the pack, using as an example the mobile phone website “Boy Genius Report”…the name has nothing to do with mobile phones per se, but it does have a lot of resonance with gadget-hungry geeks and nerds, and it certainly “stands out from the crowd”.

Mal-de-Ware

One of the sites submitted for review actually had been hacked with malware scripts, and the owner evidently was unaware of it. Vanessa Fox pointed out that if you register your website with Google’s Webmaster Central (and who doesn’t?), you will be notified of any malware detected on your website. Panelist Greg Grothaus added that Google has a new tool called “Skipfish” that lets you scan your development website for security holes before you release your code to your live site.

The Mayday Update

Cutts admitted that the radical shift in search rankings around the beginning of May was a deliberate algorithmic change that is here to stay. The ranking algorithm shift caught many SEOs off guard and caused much misery as sites with long-time ranking saw huge shifts in their SERPs. Google claims this update will return “high quality sites” above sites they evaluate as having lower quality.

TLDs Don’t Matter (Maybe)

During the final open Q&A that closed the session, it was asked if the TLD (what your website ends with—.COM, .INFO, etc.) affected search rankings. Cutts was emphatic that there was no ranking preference based on TLD, although he added parenthetically that other websites tend to aggregate links to .GOV, .EDU, etc. Certain TLDs may have a bad reputation that might impact on click-through rates, so I would still recommend staying away from .BIZ, .INFO and other spam-centric TLDs.

The H1 Tag…Still No Consensus

Many SEOs insist that you have to use the <h1> tag on your pages for the headline content…Cutts mentioned that Google will index the page regardless, and that what’s more important is that the page validate as HTML. He did not say that leaving out the H1 tag will penalize your search rank, so use it if you want, but don’t obsess over it!

Anchor Intelligence, in its quarterly traffic report, reports that click-fraud reached an all-time high in the first quarter of 2010 for affiliate marketers using its services.

Unsurprisingly, the highest click-fraud rates are coming from countries with historically lax controls on internet traffic and PC security. Vietnam has the highest rate of detected click-fraud (mainly through botnets installed on trojan-infected computers), at 35.4% of all measured clickthroughs.

What is surprising, however, is that click-fraud in the U.S. is running at 35.0%, which represents the lion’s share of all click traffic by volume, followed closely by Australia, Canada and the UK. Click-fraud in the U.S. is predominantly committed by sophisticated organizations usually hired by competitors to increase a company’s PPC advertising costs.

What is most disturbing is that major PPC providers such as Google and Yahoo clearly have the means to identify concerted click-fraud attacks, which have obvious signatures such as automated high-volume traffic from distinct IP ranges, yet they have little incentive to address it, as cracking down on click-fraud just takes money from their pockets. While Google some time ago published an independent report of its fraud-detection techniques, the conclusion by that researcher was that Google’s effort to filter invalid clicks was “reasonable”, however he adds that the CPC model “is inherently vulnerable to click fraud.”

What to do if you suspect Click Fraud

It’s up to the advertiser to track the clicks, identify fraudulent behavior, and then petition the network to adjust the billing. The only way to do this is to monitor your server logs, identify PPC traffic, and look for patterns in the originating IPs. For websites with high traffic and PPC volume, it can be difficult to separate valid traffic from fraudulent clicks, and even more difficult to “prove” fraud to the network.
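As a starting point, a simple frequency count of PPC clicks per originating IP can surface suspicious patterns. A minimal sketch, assuming a common log format where the IP is the first space-delimited field and paid Google clicks carry a “gclid=” tracking parameter (both assumptions about your particular setup):

```csharp
using System;
using System.IO;
using System.Linq;

// Counts PPC clicks per originating IP in a web server log and flags
// heavy hitters. The "gclid=" filter and the threshold of 20 clicks
// are illustrative; tune both to your own traffic profile.
class ClickAudit
{
    static void Main(string[] args)
    {
        var suspects = File.ReadLines(args[0])
            .Where(line => line.Contains("gclid="))   // keep paid-click traffic only
            .Select(line => line.Split(' ')[0])       // first field = originating IP
            .GroupBy(ip => ip)
            .Where(g => g.Count() > 20)               // arbitrary "suspicious" threshold
            .OrderByDescending(g => g.Count());

        foreach (var g in suspects)
            Console.WriteLine("{0}\t{1} clicks", g.Key, g.Count());
    }
}
```

The output is only circumstantial evidence; repeated high-volume offenders across several days make a far stronger case when you petition the network.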

I’ve started using Windows Live Writer, Microsoft’s free offline blog editor which it distributes under its “Live Essentials” toolkit. What’s nice about this software is that you can compose blog posts in a WYSIWYG environment with a full set of editing and format tools, save and organize your posts before publishing, and much more.

The software mimics what most blogs already have in terms of editing functionality, but provides a more complete feature set and, most importantly, a common editing environment that allows you to manage an unlimited number of blogs simultaneously…a godsend for SEOs working on multiple client accounts. Unlike the latest versions of Microsoft Word, which also allow publishing directly to blogs, Live Writer is free for anyone to use, and works with virtually any blog. The GUI includes useful image editing tools for which you’d normally need additional software, as well as table and charting functions, photo albums, video, and it also supports an extensible plug-in architecture for developers to add their own features.

I will post some how-tos and screen shots to help anyone interested in setting up this software. Whether you’re editing a single blog or manage dozens of different websites, this offline blog-editor really makes for better blogging.

In an effort to compete with Google's dominance in search marketing, Yahoo! and Microsoft announced today that they have received regulatory clearance from the Federal Trade Commission to form a search alliance. The announcement means that the entire webiverse of PPC marketing will now be controlled by two large players, but for advertisers it will be somewhat easier to manage PPC budgeting across two major services instead of three.

Yahoo's revenue from, and share of, the PPC market have been steadily eroding over the past year, mostly due to inroads by Microsoft with the introduction of its new search engine, Bing. In seeking the merger, Yahoo cited increasing losses from its affiliate-marketing sites, from which it generates a great deal of its search-marketing revenue.

Google's share of the PPC market has been flat at about 80% of all PPC revenue over the past year. For Microsoft, the merger will provide a big boost to its relevance in the paid-search market, although the combined PPC revenue for both companies is still only about a fifth of the market.

Yahoo says there will be no immediate changes for its Yahoo! Search Marketing advertisers, and it will probably wait until the all-important 2010 holiday season is over before migrating to a shared platform with Microsoft.

The fatal luge accident that killed Georgian Olympic hopeful Nodar Kumaritashvili during a training run in Vancouver on Feb. 12 was widely reported within hours of its occurrence. Initially, videos showing the entire crash were posted on liveleak.com, and the footage then exploded across YouTube and other media-sharing websites. In the wake of complaints about the visceral footage, news organizations hastily edited out the final milliseconds showing Kumaritashvili's impact against the steel stanchion that killed him, and then, citing copyright, had all instances of the footage pulled from the web within 24 hours.

But was this a copyright issue, or are there other legal issues afoot? Reports are circulating in the blogosphere that the IOC asked that the video be withdrawn pending an internal investigation, amid a flood of negative reporting that warnings from teams, coaches and competitors about the dangers of the track design went unheeded before and during the trials leading up to Kumaritashvili's death. At a hastily convened press conference Friday night, "Olympic officials" (not officially representing the IOC) announced that they would make "minor changes" to the "track configuration" and "ice profile" prior to the first official runs this weekend.

But what is more likely is that the video has been pulled pending legal action brought against the IOC by Kumaritashvili's family, in an attempt to limit the negative publicity the footage has engendered for the IOC. The only thing one can be sure of is that the video will be reproduced, ad infinitum, during the trial.