WHITE PAPER | April 12, 2010

HOW TO IMPROVE YOUR SEARCH ENGINE OPTIMIZATION
REMEMBER THE FOUR C'S: CODE, CONTENT, CONNECTIVITY AND COMMITMENT

SEARCH ENGINE OPTIMIZATION is critical if you want to get your website noticed, and your products in front of consumers searching for your brand online. But SEO is more than just figuring out how your site can perform well in organic search rankings—it can also improve your site's overall performance. How do you know you are taking the best possible approach to ensuring your brand and your site show up in search listings where and when you want them to? True SEO expertise is figuring out how to build websites that appeal to search engine spiders as much as they do to people.

This Ad Age Insights white paper, written by SEO expert C.J. Newton, lays out important aspects to consider before you start rebuilding your website, from making sure HTML code works for you instead of against you to selecting keywords that will attract the right kind of users to your site.

INTRODUCTION

BY C.J.
NEWTON (cnewton@seologic.com)

Search engine optimization is more than just figuring out how your website can perform well in organic search rankings—it also can improve your site's usability, accessibility and overall performance.

Hundreds of well-intentioned, and not so well-intentioned, companies and individuals offer search engine optimization services. Depending on the state of your website, the words or phrases you select as goals and the competitive nature of the battles for visibility you choose to fight, some of those companies may be able to make changes to your site that achieve some degree of success in organic search.

Most will suggest edits to the text on your pages and, to some degree, to parts of the underlying code, usually title elements and meta tags. Some will build new pages designed to win for specific phrases, also known as doorway pages. Some will even perform a technical analysis of your site using software such as Covario Organic Search Insight, WebPosition Gold or Web CEO. Others will submit your site to thousands of other sites to increase your link popularity. A few have even developed their own network of sites that they can use to instantly add links to yours and boost its link popularity. But, of course, if your engagement with them ends, your link popularity drops as well.

While all of these services can be beneficial, true search engine optimization comprises a broader set of skills than simply copywriting or link building. It's not just a matter of editing content, changing keyword density, building link popularity, or adding title elements or meta tags. True SEO expertise is figuring out how to build websites that appeal to search engine spiders as much as they do to people.

A typical SEO project involves an extensive analysis of your business, a review of your online and offline marketing efforts, and research into the actual searching behavior of your potential clients or customers. The goal is to learn how people are thinking about your industry by looking at how they search the
internet. Once an SEO firm knows precisely how people are searching for information on the resources, products or services you have to offer, a detailed analysis of the competition can be started. This analysis is used to determine how much effort it will take to win for each of the search engine phrases identified. Then, a decision can be made with full knowledge of the costs (time, investment in your site, investment in content and online resource development, investment in increasing your link popularity) and benefits (quality and quantity of visitors) associated with choosing any particular keyword phrase goal. At that point, you and the SEO firm can choose your targets wisely, focusing first on phrases that are highly likely to indicate a consumer or potential client is in buying mode.

After your keyword phrase targets are chosen, the SEO firm will do a comprehensive analysis of your existing site by reviewing the code used to generate it. You are most likely not interested in what happens behind the scenes on your site, but for search engines, what happens there is critical. Search engines don't have "eyes" to "see" your site; they simply scan the code. For this reason, it is critical that the SEO experts you hire are also experts in standards-based web and application development.

The next step is to create a plan for rebuilding your site so that it includes the content your visitors are seeking (determined by the keyword research and analysis) and uses optimized, standards-based code. The SEO firm will either create the recommended content and resources, or work with you to make sure that it is created in a search engine-optimized fashion.

After your website has been rehabilitated, the SEO firm will work continuously to get it the recognition it deserves on the internet. By getting other websites to cite or reference yours, you build your site's link popularity (a measure of the quality and quantity of websites that link to your site), and you provide more pathways for search engine spiders to follow that
lead to your site. Also, an SEO firm will typically consult with you on an ongoing basis to ensure that your site is growing in search rankings and constantly updated with new content.

It is critical for any website looking to strengthen its SEO muscle that it creates a plan that leads to the site naturally attracting visitors by winning top ranking on the major search engines for terms and phrases most likely to be searched by customers and potential customers. The plan should take a fully integrated approach to web development by focusing on the Four C's: Code, Content, Connectivity and Commitment.

MORE ON ADAGE.COM
This is one in a series of white papers published by Advertising Age. To see other Ad Age white papers and to obtain additional copies of this one, go to AdAge.com/whitepapers

TABLE OF CONTENTS
INTRODUCTION
CODE
- LEAN, MEANINGFUL AND W3C COMPLIANT
- WYSIWYG DEVELOPMENT
CONTENT
- HOW ARE CONSUMERS SEARCHING FOR YOUR PRODUCT?
- PREMIUM, HIGH-VALUE, MEDIUM-VALUE AND MAGNET KEYWORDS
CONNECTIVITY
- INTERSITE CONNECTIVITY
- INTRASITE CONNECTIVITY
- NAVIGATION AND ARCHITECTURE
COMMITMENT
- PULLING IT ALL TOGETHER
CONCLUSION

CHARTS
CHART 1: 10 DISCONNECTED WEBSITES
INTERSITE LINK POPULARITY
CHART 2: DISCONNECTED WEBSITES
CHART 3: SITE A LINKS TO SITE C
CHART 4: SITE A AND SITE B LINK TO SITE C
CHART 5: SITE A LINKS TO SITES B AND C
INTRASITE LINK POPULARITY
CHART 6: HOME PAGE AND 15 SUBPAGES
CHART 7: HOME PAGE AND 5 SUBPAGES
CHART 8: HOME PAGE, 5 SUBPAGES AND 6 SUB-SUBPAGES
CHART 9: HOME PAGE, 5 SUBPAGES AND 10 SUB-SUBPAGES
CHART 10: NON-OPTIMAL SITE ARCHITECTURE

CODE

There are two key "visitors" to your site: people and search engines. And search engines "visit" and evaluate websites in ways that are very different from people.
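That difference is easy to make concrete in code. The toy "spider" below is a minimal sketch in Python's standard library (the MiniSpider class, the sample page and its URLs are hypothetical illustrations, not any search engine's real crawler): it never renders anything; it simply scans the raw markup and keeps the indexable text and the properly coded links it can follow.

```python
from html.parser import HTMLParser

class MiniSpider(HTMLParser):
    """A toy 'spider': scans raw HTML and records only what the
    markup exposes -- title/heading/paragraph text and real links."""
    def __init__(self):
        super().__init__()
        self.links = []        # href values from properly coded <a> elements
        self.text = []         # indexable text content
        self._capture = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            # Like a search engine, ignore javascript pseudo-links
            if href and not href.startswith("javascript:"):
                self.links.append(href)
        if tag in ("title", "h1", "p"):
            self._capture = True

    def handle_endtag(self, tag):
        if tag in ("title", "h1", "p"):
            self._capture = False

    def handle_data(self, data):
        if self._capture and data.strip():
            self.text.append(data.strip())

page = """<html><head><title>Used Ford Cars</title></head>
<body><h1>Used Ford Cars</h1>
<p>Browse our inventory.</p>
<a href="/used-honda-cars">Used Honda Cars</a>
<a href="javascript:;" id="ContactLink">Contact</a>
</body></html>"""

spider = MiniSpider()
spider.feed(page)
print(spider.text)   # ['Used Ford Cars', 'Used Ford Cars', 'Browse our inventory.']
print(spider.links)  # ['/used-honda-cars']
```

Note that the anchor coded with a javascript: pseudo-URL is invisible to this crawler; only the properly coded link survives, which is the point the Connectivity section returns to later.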
HUMAN VISITORS

When a person "visits" a web page, he is really using a user agent (a human-operated user agent is called a browser; some popular ones are Safari, Internet Explorer, Firefox and various mobile browsers). That user agent sends a request to a server to retrieve a copy of a web page and to render the code in a special way, enabling a person to see the page and interact with it by typing or clicking. Designers and developers focus on human-operated user agents (or browsers). The goal is to create a rich, effective interaction between the user and your website through the browser.

There are thousands of ways to code any given web page so that it looks and acts the way it does when you visit it. The choices designers and developers make are critical not only to the success of the site in human terms, but also in search engine terms.

SEARCH ENGINE VISITORS

When a search engine "visits" a web page, it uses a very different kind of user agent, called a robot, spider, bot or crawler, among other names. When a search engine "spiders" or "visits" a page, it sends requests to your server to retrieve a copy of that web page, but not for display. The spider simply scans the copy and stores some or all of its parts in a database. Spiders have very limited interactive ability. For example, spiders do not fill out web forms, so for the most part, they cannot see the data buried in many databases. Because of the limited interactive abilities of their spiders, the major search engines rely on developers to create web pages in special ways that help their spiders access the information. Unfortunately, most designers and developers focus exclusively on human-operated user agents. So, many sites you would consider to be incredibly useful and valuable are practically impenetrable by search engine spiders.

Of the four C's, Code is most often overlooked and not fully realized in search engine optimization efforts. It also is the most misunderstood. Put simply, optimized code is code that is lean, meaningful and W3C compliant. The
World Wide Web Consortium (W3C) is the standard-setting organization for web developers. As a group, it publishes guidelines used by the likes of Apple, Microsoft, Google and Mozilla in the creation of web browsers (human user agents). Those guidelines enable browser creators and web developers to work together. Search engines also rely on the guidelines. For more information on HTML from the W3C, refer to the W3C HTML home page (http://www.w3.org/MarkUp/) and Web Content Accessibility Guidelines 1.0 (http://www.w3.org/TR/WCAG10/).

Even with the W3C guidelines in place, there are still thousands of ways developers can create code that will produce any given design or experience you see in your web browser. Developers have more choices than ever before, and companies are creating more web-development software every day.

When developers write code, they consider many factors—among them, ease of maintenance, ease of build, time to deploy and a company's existing platform. Developers frequently make decisions about code that are optimal for some factors, but not optimal for search engines. Optimization is about undoing that.

How can you help your developers create code that is lean, meaningful and W3C compliant?
Let's look at what is meant by each.

LEAN, MEANINGFUL AND W3C COMPLIANT

LEAN

Lean code is exactly what is implied: code written with as few characters as possible in order to achieve the desired visual and interactive effect. Keeping your code as lean as possible has several benefits, including improved speed, reduced overhead costs, reduced site-maintenance costs and improved search engine optimization.

Most obviously, lean code results in smaller file sizes, and that improves the download speed of your pages, increasing your visitors' satisfaction with your site. Reduced file sizes that result from lean code also save you money on bandwidth and server storage. Lean code is also easier for a developer to maintain over time. Another benefit is that lean code leads to well-organized code, and the effort to create lean code forces the developer to create better code. Both make maintaining your website (adding pages and editing pages) much easier for your developers.

And what else does lean code do for you?
It leads to improved search engine optimization, thanks to a higher content-to-code ratio, pages that are easier for search engines to "understand" and improved page speed.

A discussion of the benefits of a higher content-to-code ratio, and a further explanation of what is meant by pages that are easier for search engines to "understand," will follow in the discussion of meaningful code below. For now, let's explore the impact of page speed on search engine optimization.

Google recently introduced tools to help webmasters measure page load speed for their sites. Its Page Speed tool (http://code.google.com/speed/pagespeed/docs/using.html) analyzes web pages and measures each page's score against a series of best practices for web performance, ranked by relevance and priority for that page. The scores are developed using a method that weighs a number of different factors, including difficulty of implementation, the "potential impact" of the fix (according to Google's experience, that is) and how badly the page violates the best practice.

Google's search maven Matt Cutts confirmed in an interview at a search engine marketing event in November 2009 that driving improvements in page speed is high on Google's agenda. While Google has not historically used page speed as a ranking signal in its algorithm, Cutts said, "a lot of people at Google feel that the web should be fast"—and the feeling at Google is that if your site provides a good user experience and loads quickly, "maybe you should get a bonus."

Larry Page is on record as saying that the web should be as fast as flipping through a magazine, and that a faster web is good for Google's business. Early in December, Google launched Site Performance, described as "an experimental feature in Webmaster Tools that shows you information about the speed of your site and suggestions for making it faster." This feature gives
webmasters another reason to believe that download speeds will be a significant factor in search engine optimization results in the future. See http://googlewebmastercentral.blogspot.com/2009/12/how-fast-is-your-site.html. How fast is your site?

MEANINGFUL (SEMANTIC) AND W3C COMPLIANT

Lean code starts with meaningful code, or code that follows W3C semantic markup guidelines. For more information on semantic markup, see Semantic Web-W3C (http://www.w3.org/standards/semanticweb/) or visit the W3C Semantic Web Interest Group (http://www.w3.org/2001/sw/interest/). The basic idea is to separate the meaning from the markup, or the content from the design. This directly relates to an improved content-to-code ratio and makes pages more "understandable" to search engines.

FROM THE W3C

Mark up documents with the proper structural elements. Control presentation with style sheets rather than with presentation elements and attributes.

Using markup improperly—not according to specification—hinders accessibility. Misusing markup for a presentation effect (e.g., using a table for layout or a header to change the font size) makes it difficult for users with specialized software to understand the organization of the page or to navigate through it. Furthermore, using presentation markup rather than structural markup to convey structure (e.g., constructing what looks like a table of data with an HTML PRE element) makes it difficult to render a page intelligibly to other devices (refer to the description of differences between content, structure and presentation).

Content developers may be tempted to use (or misuse) constructs that achieve a desired formatting effect on older browsers. They must be aware that these practices cause accessibility problems and must consider whether the formatting effect is so critical as to warrant making the document inaccessible to some users.

At the other extreme, content developers must not sacrifice appropriate markup because a certain browser or assistive technology does not
process it correctly. For example, it is appropriate to use the TABLE element in HTML to mark up tabular information even though some older screen readers may not handle side-by-side text correctly (refer to checkpoint 10.3). Using TABLE correctly and creating tables that transform gracefully (refer to guideline 5) makes it possible for software to render tables other than as two-dimensional grids.

Standard document structure matters. Search engines rely on the guidelines created by the W3C to help them "understand" the different parts of your web page. Search engines rely on webmasters to ensure that content that is marked up corresponds to the semantic meaning of the markup; e.g., an H1 should contain a meaningful header, a P should contain a paragraph (not function as a line break), and a UL should actually contain a hierarchical list. HTML should be used to structure a document, and CSS should be used to style it.

There are even document elements that define attributes but which have no impact on the visual display of the text: they exist only to enable user agents like spiders to make semantic meaning from the text. Common examples include the emerging microformats standards, by which Google and others identify information like addresses and phone numbers by examining the markup. Other examples are the phrase elements, including CITE (for a citation), DFN (indicating a defining instance of an enclosed term), ABBR (indicating an abbreviation) and ACRONYM.

COMMON MICROFORMATS

hCard - contact data
hCalendar - calendars or individual events
hReview - opinions, ratings and reviews
XFN - personal relations (to other bloggers…)
rel-license - link to copyright info
hAtom - news feed
geo - latitude/longitude

These and all other standard coding elements matter especially to search engine spiders, and will matter more to future user agents. Coding only for visual layout ignores all of these standards and inhibits your website.

COMMON CODE ISSUES

Whether a site is static and HTML-only or a full-blown web application driven by dynamic, databased content, there are pitfalls to look out for during development to prevent problems with search engine optimization. While web development at most companies is considered the responsibility of the IT group, it is essential that the development team be fully versed on the overall SEO strategy, understand marketing goals for the site, and be aware of the impact of each technical decision on the performance of the site.

A guiding principle is understanding that just as marketing content has different audiences, the code driving the site has multiple audiences as well, including:

- The development team today, which must be able to work quickly and efficiently;
- The development team in the future, which must be able to understand each piece of code and the part that it plays in the overall site;
- Web browser software, which is responsible for rendering the code and making the content visible; and
- Search engine spiders, which will read the code to try to understand what a given page is about.

Given today's demands for quick turnaround and low overhead, development teams typically focus on their immediate needs, and use whatever tools and tactics are required to make a site render appropriately in the most popular browsers. Unfortunately, this can often lead to brittle code that is difficult to read, difficult to change and difficult for search engines to understand. This is often the case whether a site is hand-coded, built on an existing web application framework, or built using blog or content-management software.

HAND-CODING, OR MANUAL DEVELOPMENT

Manual coding is equal parts art and science, requiring a balance between the elegant touch of a craftsman and the analytical, problem-solving approach of a scientist. When done correctly, hand-coding can create the leanest, most standards-compliant and most semantic code possible.

But manual development is not without potential problems. Developers who are not familiar with the principles of semantic code may create markup that is inscrutable to search engines. They may also choose an application environment, such as Microsoft's .Net framework, that makes development easy but that brings with it a number of side effects impacting optimization, including poor URL design or session-management code that bloats an otherwise lean page.

WYSIWYG DEVELOPMENT

Some sites are created with little or no involvement from web developers, instead using web-design software to transform an illustration of a site into a series of HTML templates that can be populated with content. The basic paradigm of WYSIWYG (What You See Is What You Get) development is a drag-and-drop, visual-layout, document-centric approach to the creation of an interactive site.

Such sites tend to have significant issues with code bloat. Rather than using optimized style sheets to handle the display of elements on a page, styles are applied multiple times in inline code, increasing the size and, therefore, the load time of the page. What's more, these tools introduce general-purpose widgets that can be used to provide functionality for things like forms, drop-down navigation or search features. Such widgets typically contain much more code than is necessary for the task at hand, as they have been created to serve a variety of functions under different circumstances. Essentially, they are a sledgehammer brought to bear on anything that looks like a nail.

Most WYSIWYG editors also produce code that is semantically invalid, abusing tags to achieve cosmetic effects rather than creating a semantically sound document structure.

BLOGS AND CONTENT MANAGEMENT SYSTEMS

Blog software and off-the-shelf
content management systems (CMS's) suffer from many of the same side effects as WYSIWYG editors. Since the software needs to serve the needs of a variety of individuals and organizations, the emphasis is often on providing maximum flexibility and ease of use in creating and editing pages—at the expense of the performance of the overall site, and the size and quality of the code that is generated by the system. For example, blog software typically separates the components of a given web page into multiple parts: user-created content that is unique to the page; the overall "template" for the page, often broken down into several files; and reusable widgets, such as navigation controls, that are shared between pages. While breaking the page down into smaller parts makes it easier to manage, assembling these pieces happens on the fly whenever a visitor requests a page, which takes both time and server processing.

Such systems also rely exclusively on a database to store all user-created content. While this isn't necessarily a problem in and of itself, how the databased content is accessed certainly can be. For example, if page content is indexed by a number, then the system may create a URL such as http://www.example.com/pages?23. This URL is meaningless to a visitor, and to a search engine it may be considered identical to pages that end in 24, 25, 26 and so on.

Blog and CMS software also puts content management in the hands of people who are good at writing content but who do not understand the SEO implications of their actions. Such a system may allow an editor to easily create a page with paragraph tags for line breaks, broken links, links without title attributes, images without alt text and so on—but all of these negatively impact the readability of the page to a search engine spider.

CONTENT

The copy is part of SEO—but a much smaller part than you probably think. Because of the importance of code, if you focus only on optimizing content, you are doing just part of the job. This isn't to
say that content is not important, but without focusing on code, connectivity and commitment, edits to your content, title elements and meta tags won't have their full impact. In many cases, editing content will have no impact at all.

The key to the second C, Content, is to create useful, engaging content and to deploy content using code that is optimal. But the real challenge to building an optimized website is starting with an optimized, hierarchical content plan.

FIRST, THE RESEARCH

There is great misunderstanding about how content impacts optimization. SEO, like a good marketing plan, starts with thinking first about what potential customers want from you and how they will find you. What types of products or services are they seeking? What terms are they using to find businesses that provide them? An optimized content development plan is one that first takes into account how people are searching for your products or services and then evolves into a marketing document (a website) based on a framework created by keyword research.

The process begins with research. WordTracker and Google both offer limited access to data about how people as a group are searching: what specific keyword phrases people type into search boxes. For example, using the Google AdWords Keyword Tool, you can type in a single word ("cars," for example) and Google will return a list of phrases that include that word, along with the search volume for each phrase. For "cars," Google reports, the most frequently searched phrases are "cars," "used cars," "cars for sale," "modified cars," "used car," "new cars," "car sales" and so on. The tool is rather imprecise, but useful for finding out how frequently a specific phrase is searched. The tool is also helpful in generating related terms or synonyms you might not have considered. WordTracker has sold that kind of information for many years and has developed several tools to make the process easier. But WordTracker relies on several second- and third-tier search engines. For that reason, both sources
should be used. By using the tools available from Google and from WordTracker, we are able to find out exactly how potential customers and clients are searching for you: i.e., specifically what language they use. That data is then used to create a plan for developing a site that targets exactly what your potential customers and clients are seeking.

As an example, let us look at a website listing automobiles for sale. A marketing team might suggest that people want to search for autos by make, model, year, mileage, color or any number of other factors, and ask developers to create a home page with a series of drop-down menus for selecting various options. Another team might suggest that the best way to organize vehicles is by type (for example, passenger cars, sports cars, SUVs, trucks and crossovers). Or they could be organized the same way rental car companies tend to (compact, midsize, full-size, luxury and so on).

To maximize SEO potential, one must analyze how consumers are searching for automobiles; tools can provide thousands of keyword phrases ranked by their search frequency. A competitive analysis could then be performed for each phrase, to determine the amount of effort winning each phrase will take. Finally, the stakeholders should identify the phrases that are most valuable to them based on frequency, competitiveness and overall quality.

When identifying the quality of keyword phrases, break them into four categories:

- Premium Keywords are keywords with the highest strategic value to a company. These keywords typically include natural customer language combined with either a commercial or local qualifier. These searches clearly indicate buying intent on the part of the searcher.
- High-Value Keywords use terms that customers would typically use, but may be less commerce-oriented.
- Medium-Value Keywords may be related to the company's core business, but these terms tend to be academic, job-centered or very general in nature.
- Magnet Keywords represent search terms related to delivering useful resources and practicing good internet citizenship. While these keywords generally have low commercial quality, providing content related to them attracts natural organic links from across the internet, as people tend to share links to informative, high-quality, noncommercial content.

In the case of automobile searches, there are some obvious trends. For example, people rarely search using phrases such as "passenger cars," "red cars" or "compact cars." None of the marketing team suggestions above match up with what people are searching for in this case. What we find is that people largely search for cars by brand. The five most frequently searched phrases related to automobiles are "car," "cars," "Ford," "Honda" and "Toyota." There is also a clear trend distinguishing those searching for new cars from those searching for used cars.

The optimal website organization is one that correlates as closely as possible with the actual searching behavior of a site's potential visitors. That way, the visitor finds exactly what he seeks, in the language he uses when searching. So, the optimal organization for an auto website may be:

NEW CARS             USED CARS

New Ford Cars        Used Ford Cars
New Honda Cars       Used Honda Cars
New Toyota Cars      Used Toyota Cars
Etc.                 Etc.

CONNECTIVITY

There are two types of connectivity that matter in search engine optimization: intersite connectivity and intrasite connectivity.

Connecting with other sites through link building (intersite connectivity) has always been critical to getting traffic. This started with the search engine AltaVista, which determined how often it spidered your site based on the number of links to your site. Every time AltaVista found a new link to your site, it returned to spider it. The more interconnected your site was with other sites, the more frequently your site would be visited by AltaVista's spider. But the search engine basically relied on self-reporting to determine which websites were the most relevant to any given search, and webmasters quickly began to manipulate the process by stuffing their pages with keywords and resorting to other SEO hacks.

INTERSITE CONNECTIVITY

When Google launched, it used a new idea about intersite connectivity to improve its results. It ranked a site in its results based on information gathered from the site itself, just like AltaVista did, but introduced a new measure. Instead of relying solely on self-reporting (or spidering the actual page), Google attempted to measure the relative importance of a website by measuring the number and quality of links that point to a site, then factoring that information into its overall ranking algorithm. Google called this measure PageRank, also known as link popularity. Under this model, the more likely a random crawler is to find your site, the better chance your site will have of winning. Basically, the more links you can get from sites that have more links pointing to them, the better. However, it's not just a numbers game, and all links are not created equal. Google also judges the quality of a link based on several factors.

LET'S LOOK AT A TINY UNIVERSE OF 10 WEBSITES

Each of the sites below
is completely independent and not connected to any other site. From a search engine's perspective, the chance of landing on any one site is 100% divided by the number of sites, or 10% (Charts 1 and 2).

Now consider what happens when website A links to website C (Chart 3). At that point, since the chance of a spider visiting A is 10% and the chance of it visiting C if it visits A is 100%, the link from A to C adds all of A's chances of being visited to C's chances of being visited (10% from A plus C's 10%), resulting in website C having a 20% chance of being visited on any single crawl.

If website B also links to site C (Chart 4), then the chance of C being visited on a single crawl increases to 30% (a 10% chance from A and a 10% chance from B).

If instead website A links to both B and C (Chart 5), there is a 10% chance of A being visited, then a 50% chance of a spider visiting B after visiting A and a 50% chance of the spider visiting C after visiting A, so the chance of a spider visiting B or C is 15%.

In overly simplified terms, link popularity is the chance that a web page will be crawled in the very complicated universe of websites. Since link popularity is one of the factors used by all the major search engines to determine the ranking of websites, increasing the number of links to your site will help your site rank better in organic search. But keep in mind that the quality of the site that links to your site will impact the value of the link: both the link popularity of the page linking to your page and the number of
outbound links from that page to your page. In the examples above, site A started with a 10% chance of being visited. If instead it had a 30% chance of being visited, then the links to B or C would have had a much greater impact. By the same token, if A had linked to five or six of the other sites, the value of each link would have decreased.

To further complicate things, there are links, and then there are links. That is to say, when a person looks at a page, he sees any word or image he can click on as a link. But for the search engine spiders that determine rankings, only properly coded links will be seen as true links, bringing us back again to the critical nature of code.

What is a link?

<a title="..." href="...">anchor text</a>

What isn't a link? There are hundreds of ways to make what you think of as a link using code. Here are two examples, neither of which counts as a link that contributes to your overall link popularity with Google:

<a href="/aclk?sa=l&ai=Bi2ivHaGuSqeTNJOW6AbI3oHeDtLdo5EB9KTOigrAjbcBsLePARABGAEgpa2DCCgDOABQ9dbG3v3_____AWDJ1oOI8KPsEqABxJHC_wOyAQ13d3cuYWxleGEuY29tugEKMzAweDI1MF9hc8gBAdoBKmh0dHA6Ly93d3cuYWxleGEuY29tL3NpdGVpbmZvL2VtZXJpdHVzLmNvbYACAagDAegDqgLoA1ToA64C6AOJBfUDAAAABPUDBAAAAA&num=1&sig=AGiWqtxtE6u05mL5X5MikXMfKWe5Bbx9Yg&client=ca-amazon-alexa_js&adurl=http://www.aplaceformom.com/%3Fkw%3D3234-2083579%26distrib%3Dcontent%26sstest%3D1%26kws%3Dassisted" onFocus="ss('go to www.aplaceformom.com','aw0')" onClick="ha('aw0')" onMouseDown="st('aw0')" onMouseOver="return ss('go to www.aplaceformom.com','aw0')" onMouseOut="cs()"><b>Assisted Living Finder</b></a>

<a href="javascript:;" id="ContactLink" title="contact">Contact</a>

In general, the value of a link from another website to your site is determined by the following:

Link popularity or PageRank of the page linking to your site;

Theme of the page linking to your site—if your site is about automobiles, try to get links from pages about automobiles;

Context of the link—links within the body of the page are preferable to links in the navigation, sidebars or footer;

Anchor text of the link—the linked text that points to your site matters;

Title attribute on the link—the title attribute provides a more detailed explanation of your page to the visitor.

INTRASITE CONNECTIVITY

Just as important as building inbound links to improve your link popularity is how you distribute that popularity throughout the pages of your site. That is done by optimizing intrasite connectivity: the connectivity of the pages within your site. Think of PageRank as the chance that a spider will land on a page of your site by randomly following links throughout the internet. A page with lots of inbound links has a high chance of being visited, whereas a page with few inbound links has a lower chance. Of course, you also have to factor in the links that point to the pages that link to your page, so it all gets very complicated. To keep it simple, let's assume a controlled universe: a website made up of a home page and 15 subpages. Also assume that the home page has a 60% chance of being visited in our much larger random universe.

CHART 6: Each of the 15 subpages is linked directly from the home page, and no other links exist.
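The arithmetic used throughout these charts can be sketched as code: a page's visit chance is split evenly across its outbound links and added to each target's chance. The sketch below reproduces Charts 3 through 5, and the same rule drives the intrasite charts that follow. It is a simplified illustration of the model described above, not how any real search engine computes rankings.

```python
# A toy version of the random-crawl model in Charts 1-5: ten sites,
# each with a base 10% chance of being reached; a linking site passes
# its base chance, split evenly across its outbound links, to each
# target. (This mirrors the paper's one-step arithmetic only; real
# PageRank iterates over the whole link graph.)

BASE = {site: 0.10 for site in "ABCDEFGHIJ"}   # Charts 1-2: no links yet

def crawl_chances(links):
    """links maps each linking site to the list of sites it links to."""
    chances = dict(BASE)
    for source, targets in links.items():
        share = BASE[source] / len(targets)    # split evenly across links
        for target in targets:
            chances[target] += share
    return chances

# Chart 3: A links to C, so C's chance becomes 10% + 10% = 20%
print(round(crawl_chances({"A": ["C"]})["C"], 2))                  # 0.2

# Chart 4: A and B both link to C: 10% + 10% + 10% = 30%
print(round(crawl_chances({"A": ["C"], "B": ["C"]})["C"], 2))      # 0.3

# Chart 5: A links to both B and C, so each gets 10% + 5% = 15%
chart5 = crawl_chances({"A": ["B", "C"]})
print(round(chart5["B"], 2), round(chart5["C"], 2))                # 0.15 0.15
```

The site names and the even-split rule are taken directly from the charts above; everything else in the sketch is illustrative scaffolding.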
Given that configuration, each subpage has an equal chance of being visited: 4%.

CHART 7: Instead of linking to all 15 subpages, the home page links to only five subpages, so each has a 12% chance of being visited.

CHART 8: Three of the five subpages, each with a 12% chance of being visited, then link to two more subpages apiece, giving each of those sub-subpages an equal 6% chance of being visited by a random crawler.

CHART 9: The final two subpages, which still have a 12% chance of being visited, then each link to two more subpages, giving all of the sub-subpages an equal 6% chance of being visited by a random crawler.

Taken together, these two scenarios illustrate how decisions about web navigation affect the flow of PageRank, or Link Popularity, throughout a website. By reorganizing the structure, we are able to direct PageRank to the pages that matter most. In an SEO project, the goal is to optimize the flow of PageRank to those pages that face the most difficult battles for rankings.

A practical example might be a financial services company that offers: Life Insurance, Disability Insurance, Long-Term Care Insurance, Annuities, Education Funding, IRAs, Cash Management, Mutual Funds, Employee Medical Insurance, Employee Group Life Insurance, Employee Group Disability Insurance, Employee Multilife Long-Term Care Insurance, Employee Dental and Vision Insurance, Business Retirement Plan Services, Qualified Benefit Plans and Executive Benefit Plans.

A typical approach to navigation these days is to organize all content into categories and create drop-down menus so that all major sections are present at all times in the top-level navigation.
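The intrasite scenarios just described come down to the same even-split arithmetic. A minimal sketch, assuming the paper's 60% home-page figure:

```python
# The intrasite scenarios from Charts 6-9, using the paper's simplified
# model: the home page has a 60% chance of being visited, and a page's
# chance is split evenly across the pages it links to. (An illustration
# of the white paper's arithmetic, not a real PageRank computation.)

HOME = 0.60

def spread(chance, n_links):
    """Evenly split a page's visit chance across its outbound links."""
    return chance / n_links

# Chart 6: home links to all 15 subpages -> 4% each
print(round(spread(HOME, 15), 2))          # 0.04

# Chart 7: home links to only 5 subpages -> 12% each
subpage = spread(HOME, 5)
print(round(subpage, 2))                   # 0.12

# Charts 8-9: a 12% subpage links to two sub-subpages -> 6% each
print(round(spread(subpage, 2), 2))        # 0.06

# The same division yields the navigation figures used in this section:
print(round(spread(HOME, 29) * 100, 2))    # 2.07 (% per page, 29 links)
print(round(spread(HOME, 9) * 100, 2))     # 6.67 (% per page, 9 links)
```

In the charts, a subpage keeps its own 12% while passing a 6% share to each child; the sketch computes only the shares being passed along.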
CHART 10: In that case, the site navigation, and thus the architecture, might look like this:

HOME PAGE | COMPANY OVERVIEW | PRODUCTS AND SERVICES | CONTACT US

COMPANY OVERVIEW
- Company History
- Our CEO
- Management
- Board of Directors
- Board of Advisors
- Annual Report

PRODUCTS AND SERVICES

FOR INDIVIDUALS
- Life Insurance
- Disability Insurance
- Long-Term Care Insurance
- Annuities
- Education Funding
- IRAs
- Cash Management
- Mutual Funds

FOR EMPLOYERS
- Employee Medical
- Employee Group Life
- Employee Group Disability
- Employee Multilife Long-Term Care
- Employee Dental and Vision
- Business Retirement Plan Services
- Qualified Benefit Plans
- Executive Benefit Plans

That model is very much like the one illustrated in CHART 6: all of the link popularity is distributed equally to all pages. Again, assume a very limited universe of websites in which only this site exists and in which the home page has a starting chance of 60%. Then each of the subpages has a 60%/29, or 2.07%, chance of being visited by the spider in one crawl, because each is exactly one step away from the home page.

Since the actual products and services are most likely what the consumer is seeking, a much better organization would be as follows:

Home Page

Our Company

Life Insurance

Employee Health Insurance

Disability Insurance

Long-Term Care Insurance

Dental and Vision Insurance

Retirement Planning

Benefit Planning

Contact Us

In this case, the PageRank is directed primarily to nine major subsections. So, in one crawl, each of the primary subpages has a 60%/9, or 6.67%, chance of being visited. We have reduced the number of outbound links from the home page and qualitatively changed the nature of those links so that they now point to the most important sections. That gives a substantial link popularity boost to the pages that describe the primary services offered.

Then, from the "Our Company" page, you can introduce a second level of navigation, perhaps arranged vertically down the left-hand side of the page, as follows:

Our Company

Company Overview

Company History

Our CEO

Management

Board of Directors

Board of Advisors

Annual Report

Since there are seven subpages in the "Our Company" section, which itself has a 6.67% chance of being visited, each has a 6.67%/7, or 0.95%, chance of being visited.

In the "Life Insurance" section, which has a 6.67% chance of being visited, there are three subsections: Individual Life Insurance, Employee Group Life Insurance and Executive Life Insurance. Each of those three subsections then has a 2.22% chance of being visited.

By rethinking the overall organization of your website and starting with keyword research, you can optimize the flow of link popularity to your pages, focusing on the pages or sections fighting the most competitive search engine battles, and significantly enhance your chances of winning.

Commitment

Successful search engine optimization requires making a commitment to building a site that deserves to win. To do that, you must make SEO an integral part of everything you do online. Also required is a commitment to updating your site regularly and growing it over time. Remember that organic search is, and likely will continue to be, the most effective way to drive highly targeted traffic to your site. Three out of four search engine referrals come from organic search. In our experience, organic search is a powerful business driver and an extremely valuable long-term investment.

PULLING IT ALL TOGETHER

One of the clearest points of intersection between code, content and connectivity lies in the navigation of a site via links between its inner pages, known as intrasite connectivity. Several factors matter. First, the structure of the navigation directly determines the flow of link popularity, or PageRank, throughout your site. Second, from a spider's perspective, the text in these links is of critical importance to "understanding" what information the site holds. In fact, Google has filed for a patent on a process enabling it to tell what a site is about without actually visiting the site. Instead, it analyzes the words in the HREF links that point to its various pages. But perhaps even more important to search engines is the code used to create these links. It is essential that a site be developed using HREF links that a spider can follow.

A spider's path through these links has an enormous impact on how the site is interpreted by the search engine, making the organization of the site key. A hierarchical organization of links will send spiders first to the most important pages of a site, distributing link popularity first to these critical pages. A truly optimized site combines such deliberate organization with optimal anchor text in each link.

Conclusion

Success in search engine optimization is a boon to those companies that can achieve it. Search engines can be the No. 1 referrer to properly built websites, particularly sites that have not yet established a large user base. At this point, organic search engine referrals make up approximately three out of every four referrals from search engines. And for well-optimized sites, that number is even higher. To truly optimize your website requires a fully integrated approach to web development. Remember the Four C's: Code, Content, Connectivity and Commitment, and you will be well on your way to success in online marketing.

C.J. Newton is founder and chief strategy officer at SEO Logic. He has researched and consulted on organic search engine optimization since 1996, working with clients including the American Marketing Association, the ServiceMaster Family of Brands (TruGreen, Rescue Rooter, Terminix and others), The Wharton School, Experian, Heinemann Publishing, Life Fitness, Penske Truck Leasing and Emeritus Senior Living. He is also an executive board member of SEOPros.org, the standard-setting organization of SEO specialists.

This document, and the information contained therein, is the copyrighted property of Crain Communications Inc. and Advertising Age (Copyright 2010) and is for your personal, noncommercial use only.
You may not reproduce, display on a website, distribute, sell or republish this document, or the information contained therein, without the prior written consent of Advertising Age. Copyright 2010 by Crain Communications Inc. All rights reserved.