On the Web

Profile Information

Jenn has over 20 years of resort and 15 years of spa experience, including being part of the opening team and serving as Spa Lead and brand-standards Learning Coach at Ritz-Carlton Dove Mountain. Currently, she continues to work with luxury hotels as both a Concierge and a Massage Therapist while building our site and brand.
Ed has worked for one (and only one) aerospace company since graduating from Ohio State with a BS in physics. This corporate background gives him an understanding of the need for travel and adventure to escape and recharge. The master plan was never to sell your soul to corporate America, but to rent it for enough to provide for the family and the future.

Display Name

colemanconcierge


Job Title

Founder/CEO

Company

colemanconcierge

Type of Work

Business Owner

Location

San Diego, CA

Favorite Thing About SEO

It's like going to the dentist. A good thing to have done and makes you a better person.

You have convinced me that I need to try at least a 30-day trial to really dig in and see what my pages, especially the low-quality pages, are doing. The first content I put up is somewhere between hot mess and flaming dumpster fire, so I have been wanting to re-up it for some time. There are also the "great ideas" I had that went absolutely nowhere. Ugh, where to start. I guess that's where getting a real site assessment and some no-kidding data would be invaluable. So much SEO to do and so little time. I really appreciate Moz writers like you who teach and inspire me. Super cool that you offered a second trial period too.

I have read about this happening before. The Serbian Crown Restaurant in Washington DC went out of business and the owner sued Google over a false listing on Google Maps in 2014. The Maps entry said the restaurant was closed during the weekends (much like your DUI Lawyer example). They suffered a 75% loss of business, which they attributed to the listing.

Ultimately the case was dismissed. Google successfully argued that the Communications Decency Act of 1996, which grants online services like Google immunity from liability for third-party content hosted on their sites, barred the suit. It also appears that the Serbian Crown didn't claim their listing, which made monitoring and amending it even more difficult.

My takeaway is to claim your listing and be vigilant in protecting it. One way would be to use automated change-detection software.

Web crawls are fairly easy to automate, and the data can be compared to a master template. With this technology, a vigilant and tech-savvy listing owner (or their account manager) could build or use an application that actively monitors the listings and detects changes. Google notifications are better than no notifications, but I would sleep better knowing that my crawler is checking for data validity every day.
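A minimal sketch of that change-detection idea in Python (the function names and the whole-page hashing approach are my own illustration, not any particular product's design) — in practice you would fetch the live listing on a schedule and compare it against a saved master copy:

```python
import hashlib

def fingerprint(html: str) -> str:
    # Reduce a fetched listing page to a short, comparable fingerprint
    return hashlib.sha256(html.encode("utf-8")).hexdigest()

def listing_changed(master_html: str, crawled_html: str) -> bool:
    # True when today's crawl no longer matches the master template
    return fingerprint(master_html) != fingerprint(crawled_html)
```

A real monitor would parse out just the fields that matter (hours, phone, address) before hashing, so cosmetic page changes don't trigger false alarms.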

We have been getting requests for republishing articles but have been hesitant to do so because of fear of getting penalized by search engines. It's a good data point that you don't think there is a penalty, so I'll start investigating this more. I have found that most of the people asking for contributed content are willing to include an author bio that contains a dofollow link. In addition to the exposure from the articles, you can get dofollow links from the bio.

I love all the data backing the decision. Thank you for a lot of hard work, Brian. I wonder if the final conclusion, "With a similar percentage of clicks going to paid and organic, your investment in each should be similar," should be modified with some kind of factor to account for the age of your website. It seems like investing in paid advertisement should give the same benefits regardless of the age and rank of your website. If a newer site invests more heavily in paid advertisements, it could benefit SEO (for example, by increasing organic backlinks). Of course, the counterargument could be made that heavy SEO investment up front will yield better long-term results, because it's always easier to build something right than to fix it later. Perhaps that's what you meant by your qualifier "unless, of course, you have some catching up to do with one channel," since a new website will be seriously behind on SEO.

Fascinating question about how a plumber could build a brick-and-mortar business. I think the general blueprint for forming a brick-and-mortar company would look like this:

(1) Keep costs down: Share space and personnel if possible. Perhaps you are really good friends with your locksmith buddy, and he will let you move into his office for a nominal fee while his front-office staff services your calls.

(2) Look for parallel markets: Home Depot has an entire plumbing aisle filled with products like drain cleaners. Besides direct profit, you also get to offer expertise on how to use the drain cleaners, and sooner or later your drain-cleaner customer will need a full drain snaking.

(3) Provide point-of-sale service: Unlike the first two items, which deal with money directly, this is a pure service feature. A lot of people want to speak to a real person when they call, especially with an issue. If plumber A's phone goes to voicemail when the hot water heater blows up and plumber B has his front-office staff take the call, guess who is servicing that water heater. Likewise, if there is any complaint, such as the toilet still running after you cleared the drain, the last thing your customer wants to hear when he calls is voicemail or your distracted conversation as you drive around town. A brick-and-mortar, flesh-and-blood experience might retain a loyal customer who has had a bad day.

Thank you, Zee. I love how well organized this piece is. It seems like this checklist focuses on items you can control much more readily than backlinks and link building. I was wondering what you do with the score after it is generated.

Welcome, Kane, and thanks for bringing us WBF. I'll have to make it to a MozCon and see you speak in person. I'll have to check out the tools you suggested for making video shorts. One thing that I really can't stand about pop-up videos is when the volume is way too loud. Pop-ups are obtrusive enough without being an earful.

Increased tracking always reminds me of the Observer Effect / Hawthorne Effect, a type of reactivity in which individuals modify an aspect of their behavior in response to their awareness of being observed. The original research took place at the Hawthorne Works in Cicero, Illinois, and examined changes to lighting and work structure.

This is a good thing for the local SEO/SEM team to have happen. If the company was not tracking the data from analytics, they wouldn't know how important it was in the first place (which makes you wonder why they would hire an SEO guy). The very first thing you do is educate the consumer on how important SEO really is, with real statistics. Then you go on to show them deltas, which hopefully are increasing and show how good your SEO work is.

The observer effect comes in when they notice things like daily fluctuations in the analytics. Perhaps the site goes down and they make it come back up faster. Another instance: since they are tracking an item, they might want to increase ad budget to see the response, because just tracking it makes it important emotionally. I absolutely believe in the power of the observer effect and have seen it firsthand many times. It's real, and it's a good thing that comes out of tracking data.

Great article. I enjoyed reading it. What I found particularly interesting was "save your butt" item #4: important content was hidden on the mobile version. I wonder what the threshold is for "important content." I make a design choice to modify a significant number of my pages for the mobile reader. First, I figure that they are reading more casually, like they are in line at a coffee shop instead of reading in depth at their desktop with their fuzzy slippers on. There is also significantly more screen space and visual bandwidth on a desktop than on a mobile device. For these reasons, I suppress a certain amount of graphics for mobile devices. I think it adds value for the intended usage of my readers and doesn't change the intent of what I was writing, so it feels "ok" to me. It "feels" to me that the graphics I am suppressing do not qualify as "important content."

This was hilarious. It also spoke to me on a very deep level. Just last night, my partner was saying how we had like 25 random emails in our inbox from some SEO newsletter giving us the "hot tip of the week," and when was I going to get around to reading them. My response was simple. I didn't want 25 random tips. I wanted an organized database of classes and instruction to guide me on my SEO journey. I wanted to be in charge of where I clicked and what I learned, based on my needs and interests. Then, literally the next day, this shows up on the Moz Blog. I think it's a sign that I should browse the SEO learning center and not just read random tidbits of whatever clickbait people send me.

I love the idea of getting the developers in the room early. The last thing you want is to be recommending changes that the developers hate. You need everybody working in the same direction.

It would be really cool if you could combine several tools for an SEO crawl that could report out a prioritized list. In the category sort box, the upper-left corner was URGENT & IMPORTANT, which contained Primary Page Issues and High Volume Page Issues. If you combined the SEO crawl with analytics, you could assign a priority score to issues based upon page volume. That would be the "I" (Importance) in the ICE acronym from Paul Shapiro.

The C (Confidence) and E (Ease) could be addressed by adding a priority score from the crawl itself (which should also include some importance factors). By having a ranked SEO crawl combined with traffic-flow analysis, you can really hone in on what's important. You talk about doing this manually, but it seems like there is an opportunity for automation.
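As a sketch of the automation I have in mind (all names hypothetical; I'm treating ICE as a simple product of 1-10 scores, with page traffic standing in for Importance):

```python
def ice_score(importance: float, confidence: float, ease: float) -> float:
    # ICE priority as a simple product of 1-10 scores
    return importance * confidence * ease

def prioritize(issues, traffic):
    # issues: list of {"url", "confidence", "ease"} dicts from the SEO crawl
    # traffic: analytics visits per URL, rescaled here to a 1-10 Importance score
    max_visits = max(traffic.values()) or 1
    ranked = []
    for issue in issues:
        importance = 10 * traffic.get(issue["url"], 0) / max_visits
        ranked.append((ice_score(importance, issue["confidence"], issue["ease"]),
                       issue["url"]))
    # Highest-priority issues first
    return sorted(ranked, reverse=True)
```

The same crawl issue on a high-traffic page then automatically outranks its twin on a quiet page, which is exactly the manual triage the article describes.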

We are manually working a lot of social media at this time and are considering the options between a VA and a DVA. I keep thinking that a DVA would be the correct path to ensure consistency and scalability. We have been experimenting with "like only" DVAs with some success. The hardest part, in my mind, is the semantics and conversation flow. I really appreciated your outline of conversation flow, complete with flow chart and references. That helps me advance my bots from mute liking to interactive, information-gathering machines.

I love how clearly stated the call to action was in all of the one-click upsell banners. What they were offering and why the reader should click were instantly visible after the redesign. No wonder sales increased dramatically.

I thought there were multiple benefits to an intelligently designed IA, of which link equity is one part. I also thought the linked pages established LSI and helped direct Google's crawling. For these reasons, I have heard it advocated that your linking structure be heavily siloed. That is to say, your parent posts can link to children and the children can link together, but it had better stay pretty close to the family. In your example, the pages appear in a collective circle labeled Mydomain.com. From a paradigm such as this, I could find multiple semantic and intent relationships that have the link structure crisscrossing all of "Mydomain.com". Are there any costs or issues, besides diluted link juice, to building internal links between pages that fit well together?

Great article. I love how straightforward you were in the solutions without bombarding the reader with an enormous number of options. You evaluated traditional HTTPS, Let's Encrypt, and Cloudflare for cost, complexity, and functionality.

You also wrote: "numerous sites (including Moz) have reported experiencing major traffic fluctuations following their migrations." I was wondering if any of these three options affects the fluctuation in traffic differently. I could imagine Cloudflare, with its compatibility issues, being particularly problematic, but that is just a WAG.

Sounds like some great features. It is important to be able to ignore non-critical problems so you can actually see the critical issues when they arise. Question: does the Pixel Length Titles check depend on the font used? A monospace slab font like Courier might space out differently than Arial or Times New Roman.
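Just as a back-of-the-envelope illustration of why the font matters (these width values are made up for the example, not Google's actual font metrics):

```python
# Rough per-character pixel widths for a proportional font (illustrative only)
ARIAL_18PX = {"i": 4, "l": 4, "m": 15, "w": 13}
ARIAL_DEFAULT = 9          # fallback width for characters not in the table
COURIER_18PX_WIDTH = 11    # monospace: every glyph gets the same width

def title_width(title: str, widths=None, default=ARIAL_DEFAULT) -> int:
    # Estimate the rendered pixel width of a title in a proportional font
    widths = widths or ARIAL_18PX
    return sum(widths.get(ch, default) for ch in title.lower())

def monospace_width(title: str, char_width=COURIER_18PX_WIDTH) -> int:
    # In a monospace font like Courier, width is just length x glyph width
    return len(title) * char_width
```

A title full of narrow letters ("ill") measures far shorter in a proportional font than the same character count in Courier, so a single character-count cutoff can't stand in for a pixel cutoff.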

This guide was a little intimidating to me. I was expecting structured data to be cast more generally, with constructors and inheritance of fields like object-oriented programming. I can see this with the Schema.org types and properties, but what I was really hoping for was a turnkey app, or a choice of apps, with a visual construction/display of the types and properties used. Maybe it would even have a pull-down menu of available options. Then I would hope that this app would create a ready-to-use snippet that Google could just find and process. I am still a big fan of structured data conceptually, but I am going to need to do more research on how it applies to me.

I wonder if you can expand the cluster-of-competitors strategy with a look at your competitors' link strategies. The algorithm goes something like this:

(1) Search for your competitors' backlinks with the standard SEO tools.

(2) Find which backlinks are common to all or most of your competitors. (This could be a machine-learning search if there is a significant number of backlinks to process.)

(3) Contact the common backlink sources with a link request.

This is similar to the Google search tactic you describe, but the input data source is different. Different in and of itself could be good, but you can also get more metadata from a backlink list. Examples of useful metadata would be follow/nofollow, link text, the linking page's authority, etc. This could be useful for prioritizing which links you pursue.
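For modest lists, step (2) doesn't even need machine learning — counting shared referring domains across competitors gets you there. A hypothetical sketch (names and data shape are my own):

```python
from collections import Counter

def common_backlinks(competitor_links: dict, min_overlap: int = 2):
    # competitor_links maps competitor domain -> set of referring domains
    counts = Counter()
    for domains in competitor_links.values():
        counts.update(set(domains))
    # Keep domains linking to at least `min_overlap` competitors,
    # most-shared first -- these are the warmest outreach targets
    shared = [(n, d) for d, n in counts.items() if n >= min_overlap]
    return [d for n, d in sorted(shared, reverse=True)]
```

Each entry in the result is a site already linking to multiple players in your niche, which is about the strongest signal you can get that a link request is on-topic.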

So excited for part 2. I am sold on the idea of structured data, and it's all I can do to keep from harassing my IT guy right now about the implementation part. This looks like a great way to organize data for human readers as well.

We run a travel blog and have been concerned with creating geographically specific content. During TBEX Huntsville we attended a lecture from Trivago. Their research showed that 90% of travel was within a 100 mile radius of the person's home. That drives our advertising, site structure and SEO strategies. Being able to see how we rank in the local searches would be a key tool to have. Thank you for some great advice.

There is a KPI for site design and styling. We run a graphics-rich site that just didn't look right at all in our sample AMP styling. Our core color palette was distorted, and we just weren't happy with the aesthetics. Perhaps someday AMP will balance speed and appearance, but until then, we are opting to maximize our strengths with visual design.

There is a tendency in this business to focus on gathering as many loose social connections as possible. I have found time and time again that having a core group of loyal supporters (advocates) can be more valuable than a thousand Facebook friends. These are the people who will tell you the truth (no matter how much it hurts) and always have your back. You need these people for the positive energy they bring and to make this all worthwhile.

Early-bird specials are absolutely the best. We booked a couple of conferences already for next year, and the way we justified our "buy" decision was that the video bundles and VIP contacts alone would be worth it at the presale pricing, even if we had to cancel the trip at the last minute.

We are getting into our second year of blogging, and it is obvious that we need to do a lot of cleanup of old articles. A lot of the stuff that we wrote at the beginning is pretty much no good. I hope that we can refresh the articles with improved writing and on-page SEO so they'll increase in traffic. It sounds like I'll have to prune out some of the deadwood that just isn't going to become quality articles no matter how we spin them, since low-quality pages will bring down our overall ranking. So much to learn in the SEO world. It makes me glad that you guys are here.

I really like the audit format for this post. It poses a real question that is surprisingly hard to answer. I love how you give suggestions to each of the businesses as hypothetical clients, too.

My takeaway is the need to create quality content on high-level pages. This makes some sense, with the PA of your root page having valuable link juice to pass down. I just didn't think of the lower pages' quality passing back up to your root. Fascinating idea, and I'll have to explore it a bit.

I do some freelance writing for a travel site. They emphasize snippets as a core element of their writing guide, so it is something I have worked with. In my own work, I have a hard time with the mechanics of writing a snippet into the article. I'll try your suggestion of asking a question and answering with a snippet and see how it goes.

I love how organized and annotated this article was. I ended up clicking on two or three different links from the text and setting it aside to read later.

I am a numbers nerd by nature, and it's so hard not to bludgeon people with statistics. Recently, we did a campaign for a canyoneering outfitter and guide. The owner was an ex-finance guy, so he could hang with the statistics and asked good questions too. Pretty soon, he came to the point of "What is this going to do for me?" How do rank and reach relate to customers through the door? It was harder for me to answer that question than it should have been. We are tracking his campaign now with a series of before-and-after screenshots instead of just straight-up statistics so he can see the growth.

Brilliant piece. We are just getting started with AdWords, so this article really speaks to me. I have been hearing mixed reviews about AdWords, where some people say how irrelevant and spammy the ads seem on their page and others say the matching algorithms do a great job of providing content. What I hear you saying is that it will absolutely be worth your while to make sure the AdWords configuration is done correctly. I could also see how proper tagging, LSI, and on-page SEO in general would help AdWords (or any other dynamically sourced advertisement) function properly and match the message to the reader.

Net Neutrality is quickly becoming one of my most passionate issues. From yesterday's search volume post, one can see how traffic is flowing through fewer and fewer nodes. If the bandwidth also becomes restricted, small news outlets could completely disappear. I look at net neutrality as a First Amendment issue. In the years and decades to come, I believe it will be the free proliferation of information that does more to fight tyranny and oppression than any other action or organization. This essential freedom cannot be taken for granted.

Thank you for sharing the results of your hard work. There was a lot of data crunching going on here. There probably are human-psychology conclusions you can draw from this data, but I had three takeaways:

1) This data could be used for choosing your LSI links to establish context for your pages

2) Travel is much smaller than I realized, and the subdivisions of travel, like Recreation and Hobbies, are even smaller. I wonder if there is a bias from the way the data was collected. In the opening, you said you focused on the top 1,000 domains and went on to explain that travel sites tended to be smaller and more diversified than other market sectors. Selecting top domains would tend to exclude hits from small, diversified sites.

3) For the love of all things sacred, let's keep Wikipedia a non-profit.

Great article, Garrett. We just buffed our landing page a little toward "the exact pain point your product/service solves." I still think we need to build a better CTA. I really want to hire a video production company to make a promotional video. I think having a video to watch will allow us to engage with people higher in the funnel than demanding that they submit information.

Thanks, Rand. I was so spun around by the 4th of July that I didn't even realize it was Friday. As soon as it clicked, I clicked over to Whiteboard Friday.

What resonated with me this week was your primary solution - "You can try separating your media or your blog or editorial content." I can see this technique forming a reasonable silo structure that is very natural. We are running a series on Indianapolis and are doing a blend between informative pieces and narrative pieces to form our structure. I always get a warm fuzzy feeling when I can squint and imagine that I am following WBF best practices.

It is so essential to keep connections warm, and it seems like writing a timely and professional email is the first step. You never know which connections will be needed or when, but if you have them queued up, you're good to go. I have been closing emails with "Very respectfully" and shying away from "Thanks in advance." I will have to rethink that choice.

I love Whiteboard Friday. It is the one and only article I read Friday mornings during my first cup of coffee. This article is going into my toolbag. Lately, in travel blogging, we have been encountering examples of this, and our current position is to avoid all controversy.

Example 1: Adventurous Kate attacked Travel Blog Expo (TBEX) for planning a conference in Zimbabwe. Her idea was that Zimbabwe is a dictatorship and working with any branch of that government is inherently evil. The counterargument is that bringing western press and establishing a travel industry will liberalize the country. There are valid points on both sides, but I was unimpressed when she called out an adversary as a douche canoe. All this being said, she is winning many keywords for TBEX and Zimbabwe as well as getting a lot of traffic and notoriety.

Personal example: We live in San Diego and cover San Diego. The zoo has a search volume of 375,000, the Safari Park is 175,000, and SeaWorld is about 175,000. There is an obvious appeal to writing to these keywords, but all zoos, and particularly SeaWorld, have the potential to make enemies. My wife/partner and I have been discussing this exact issue at length. Your content-by-making-enemies article is going into this conversation.

I wish I could go to MozCon. For me, I would be playing the long game. I figure it takes at least seven hours of communications to become "friends" with somebody. Last con I attended, I made a point to keep track of about half a dozen fellow attendees in a private Facebook group. We review strategies, share ideas and even personal stories in the group. I think this private group setting was key to phase shifting our friendship. When we left the con, we only knew that we had similarities. Now, after two months in a private group together, we have truly become friends. MozCon is going to rock. I can't wait to hear the stories.

Beautiful post. I am an aerospace engineer and, believe it or not, we make the same mistakes in our systems engineering. I think the most pervasive mistakes are basing decisions on incomplete or insignificant data or, worse yet, no data at all and just a gut feeling. Data-driven engineering (or CRO) isn't practiced as thoroughly as you might expect.

There is a classic social experiment, the Hawthorne effect, that speaks to the interaction between the observer and the experiment. At the Hawthorne Works factory, they ran an experiment to determine whether workers were more productive with brighter lights. They increased the lighting, and production increased. Then they decreased the lighting, and production increased even more. How could this be? It was the observer's paradox: the increase came because people were watching, not because of the change in lighting.

The observer's paradox seems like it would be a factor in two of the CRO stories you told. First, the decision to make a redesign based on very recent data from a very recent change could have been measuring only the effect of any change on CRO: your old readers saw a new box and clicked it. Second, the manager who had a pet strategy could be gathering only data that supports his pet project. These are the most insidious types of data abusers. They have a long list of anecdotal evidence that seems impressive by sheer volume, but each element is essentially useless. By the time you use complex statistics to disprove half the list, the audience has lost interest but not belief in the remaining volume of anecdotal evidence.

Thank you for taking the time to write this piece. It's a great one to keep on hand to challenge the data naysayers and remind yourself to do a diligent job.

Love the pointer about three types of links. That is going into the vault. I think striving for photo rich, "clip-able" content is a great way to get links to your page from people in your niche. Having the best "Rand using hairdryer" picture is great. People might link it into their site. You can also have the best "Rand using hairdryer" graphic that people will want to use and re-use.

I think you hit the managerial nail on the head with how you summed up cost/benefit. With costs, you could apply a methodology and come up with a dollar value of $4,630–$10,230. With rewards, you're left with a vague "$A Lot?". There are people who see potential value and rewards; they probably believe that their company will be the one doing the poaching of high-quality employees, too. There are other people who only see the negatives, particularly with a dollar sign attached. I have met very few managers who could see both sides of this argument. They were either in one camp or the other and just couldn't perceive the counterarguments.

Love your work. There are real gems in the #! story. Perhaps this is my Microsoft bias showing through, but anything that only Bing does is probably a bad idea. Friends don't let friends use Bing.

Do you have a good example of files that should or should not be accessible to search engines? I could follow the methodology of internal linking and the URL structure, but I am still a little fuzzy on the big picture idea of when to use it.

Very useful article. It's great to hear that there is a revision / replacement to Mozscape that will reduce cycle time for metric updates.

One thing I still have questions about is DA/PA for your root domain / landing page. If DA is basically PA for your root domain, would it be fair to assume that your DA would equal your landing page's PA? I could even imagine that your DA would be higher, since your landing page will have a plethora of backlinks from combinations of http, https, www., or just your domain name. However, I have seen many examples of landing page PAs higher than domain DAs. For example, my site's homepage PA is 32 and the site DA is 23. On a logarithmic scale, this is quite a difference. Do you have a gut feel for what causes this difference?

Cool post. It sounds like niche words are an easier way of saying LSI. I have used Google suggestions before to find relevant LSI words that provide SEO context for my keywords. This article takes it one step further and goes straight to niche words. Not only does this save a step for me, I don't have to remember what LSI stands for or explain it to anybody.

You are a brave man indeed for letting an SEO see raw data, and braver still for having him work on the back end personally to fix issues. Perhaps it's all part of a devious plot where you leverage the company into installing your working backup data once the suits crash the whole thing.

I love this article. I can imagine using the Content Grouping feature to form logical groups along the silo structure of a site. That way, I can tell how the silo as a whole is performing from a custom dimension instead of post processing the data. It is always better to let the tool work for you instead of performing laborious and error prone hand manipulation of data.

At the end of the day, if the organic traffic doesn't change, you are left with no harm, no foul. Still, there is a time response for every system. In classical control theory, the time constant (tau) is the time it takes to reach 63% of a step input. You also measure the overshoot and damping coefficients. From your descriptions, it sounds like you have a stable but somewhat underdamped system that reaches a fairly constant steady-state value after a little while.
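For reference, the first-order step response behind that 63% figure, as a quick sketch:

```python
import math

def step_response(t: float, tau: float, final_value: float = 1.0) -> float:
    # Classic first-order response to a step input: y(t) = K * (1 - e^(-t/tau))
    return final_value * (1.0 - math.exp(-t / tau))
```

At t = tau the response sits at about 63.2% of its final value, and by five time constants it is essentially settled — which is the sense in which a traffic curve "reaches a fairly constant steady-state value after a little while."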

I always wondered how much JavaScript is running in the background of some dynamic pages. Excellent idea using an IDE (the Chrome console) and just walking through the debugger. The observed growth in organic traffic is an interesting motivator to understand the purpose of and need for JavaScript applications.

Fascinating. I always heard (and believed) that linked YouTube content would automatically be a boost for Google ranking since they want to boost their products. I would have assumed that they built their own service to handle the SPA handshake. Great information.

I would love to come to this event. I am interested in all of the topics and I would love to meet the fantastic presenters I have been seeing on the Moz forums. I would have a hard time picking only three. Now I really want to go to MozCon. If I can't make it this year, it's going on the calendar for next year.

I am still wondering about the difference in the featured snippet-to-total-question ratio between "how" questions and "how to" questions in your "question phrases by question type" graph. "How" questions have only about a 50% corresponding featured snippet, while "how to" questions have about 90%. It begs for a follow-on article about "how do," "how is," "how come," and all the other variations of how questions, and what their population of featured snippets is. Once the how family is explored, the obvious question is: why? Why does Google see fit to populate featured snippets so differently? Does it consider "how to" questions more important, or is it simply easier for the algorithms to match snippets to how-to questions? Inquiring minds want to know.

Love the use of humor to deliver a complex subject. It's so hard to explain data driven methodology to managers and visually driven customers. Being able to explain why something works, and why you are doing a great job is almost as important as doing the great job in the first place.

Love Whiteboard Friday. Love it. I have been thinking about keyword research a lot lately and really believe it's a great tool to reach your audience. My questions are:

(1) What tools are out there that assess the ease of ranking for a keyword? I know this is available in professional search tools, but are there alternatives for part-time bloggers?

(2) How much lead time would I need to rank on a trending topic? I heard a good rule of thumb is 60 days to index new material, so would I have to guess a keyword that would still be trending in 60 days? What factors go into making this estimate?

I would be so excited to hear Kane Jamison speak. Last week, I was at TBEX (Travel Blog Exchange) in Huntsville, and there was a track for hyper-local blogging. One of the presenters covered only Pittsburgh. The presenters used hyper-focused (geographically) social media marketing and showed significant return for their marketing dollars.

What a timely article for me. We recently had an image shoot up to the fourth position on Google, and we were wondering how that could have possibly happened. After reading your post, I realize that I have been doing most of these suggestions to try to have the image boost the page it was embedded in. I was using the image name and alt attributes as an LSI boost for the keywords in my original post. Of course, I made sure the sizing was reasonable for load speed. It seems like many of your ranking factors for pictures would be good SEO tactics for the page as well. As always, great stuff, and thank you for sharing.

Awesome job, Lydia. As a small digital publisher, I face these problems every day. I have been looking for alternatives like these to extend my reach. BuzzSumo is going to be my first project from your list. Do you have any hints on how to complement the BuzzSumo results with additional data, so I can find trends my DA could rank for that would have some longevity?

Sometimes I feel old-fashioned in the new digital world. Kids today are tech savvy and ready to spin a fantastic web of vague promises wrapped in fleeting uncertainty. It's good to see articles like this tout the value of hard work, honesty, personal commitment, and follow-through. No matter how flashy the sales pitch, in the end it's people talking and working with people.

Great stuff. It seems like everything from search engines to home pages is moving away from a universal presentation for every customer toward a specialized presentation for the individual customer. Certainly, key information such as geographic location and age would help roll out this approach.

I am a physicist and do stochastic modeling on a regular basis. When building a regression model, it's critical to check the goodness of fit. You stated your ML model was nonlinear by design. I refer to the paper titled "An evaluation of R2 as an inadequate measure for nonlinear models in pharmacological and biochemical research: a Monte Carlo approach." The paper opens:

"Background

It is long known within the mathematical literature that the coefficient of determination R2 is an inadequate measure for the goodness of fit in nonlinear models. Nevertheless, it is still frequently used within pharmacological and biochemical literature for the analysis and interpretation of nonlinear fitting to data.

Results

The intensive simulation approach undermines previous observations and emphasizes the extremely low performance of R2 as a basis for model validity and performance when applied to pharmacological/biochemical nonlinear data..."

Although pharmacology is a different field from SEO, the fundamental mathematics of machine learning and model building should still apply. R2 is a simple and useful tool, but it is not a definitive test for nonlinear data.

Using an R-squared value presumes the underlying relationship is linear. I firmly believe that the link-ranking relationship is monotonic and positive, but I could imagine several natural inflection points. Are links 1-10 equally as valuable as links 90-100? Another way to visualize the response space: do you believe positions 1 and 2 have the same separation as positions 9 and 10? Strictly speaking, a low R-squared value indicates that the relationship between the two factors is not linear, which is different from the factors not being correlated.
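To illustrate the point with a toy example (made-up numbers, not Moz's data): a perfectly monotonic but convex relationship gets a mediocre R-squared from a straight-line fit, even though its rank correlation (Spearman's rho) is exactly 1. The data and helper functions below are purely hypothetical.

```python
# Sketch: low linear R-squared does NOT mean "uncorrelated" for
# monotonic, nonlinear data. All numbers here are invented.

def linear_r2(x, y):
    """R-squared of an ordinary least-squares line fit to (x, y)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

def spearman_rho(x, y):
    """Spearman rank correlation for data without ties."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

links = list(range(1, 21))       # hypothetical link counts
value = [x ** 5 for x in links]  # strongly convex but monotonic "ranking value"

print(round(linear_r2(links, value), 3))  # well below 1: poor *linear* fit
print(spearman_rho(links, value))         # 1.0: perfectly monotonic
```

The same monotonic curve that a Spearman test calls a perfect relationship looks weak through a linear R-squared lens, which is exactly the concern with reading R-squared on a nonlinear model.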

Interesting data. What is amazing is how well behaved the trends are. Your article from July 5 of last year, about HTTPS reaching 30%, predicted 16 months until the 50% threshold was crossed. We appear to be a bit ahead of schedule. In that article you stated: "If rewarding HTTPS too heavily when adoption is low is risky and rewarding it when adoption is too high is pointless, then, naturally, the perfect time to strike is somewhere in the middle. At 30% adoption, we're starting to edge into that middle territory. When adoption hits something like 50–60%, I suspect it will make sense for Google to turn up the algorithmic volume on HTTPS."

Do you still feel it makes sense for Google to change their algorithm now?
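As a back-of-the-envelope check (my own linear-growth assumption, not anything from the article), the implied rate and time-to-threshold work out like this:

```python
# Hypothetical linear extrapolation using only the two figures quoted
# above: 30% adoption, with 16 months predicted to reach 50%.

def months_to_threshold(current_pct, threshold_pct, pct_per_month):
    """Months until the threshold is crossed, assuming linear growth."""
    return (threshold_pct - current_pct) / pct_per_month

# The original prediction implies a rate of (50 - 30) / 16 = 1.25 points/month.
rate = (50 - 30) / 16

print(months_to_threshold(30, 50, rate))  # 16.0, by construction
# "Ahead of schedule" -- e.g. if adoption were already at 40%:
print(months_to_threshold(40, 50, rate))  # 8.0
```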

Awesome article. I think the codification, communication, and management of expectations is key to a project like this. A lot of the time, non-technical people will come in with a series of wishes and no concept of the scope or risk involved with each task. Explaining how complexity adds technical, cost, and functional risk is an important job. The enumeration and details of each type of migration are a valuable tool for my toolbox.

Great piece, and it really speaks to me. Popups affect my user experience to the point where I refuse to put them on my site. I felt like I was being reactionary at first, but it seems like services are catching on and rewarding sites for outstanding user experience.

I am bookmarking this page so I can re-read it until I understand everything. The order, flow, and visualizations absolutely help me understand this complex topic. I believe your estimate that it takes nine man-months of work to rank #1 for a high-volume keyword. Of course, all of this hinges on picking a "winnable" project.

I am a big believer in the Pareto Principle, where you can get an 80% solution with 20% of the work. Loose application of those numbers still puts you on the first page of the search results with roughly two man-months of work, a timescale that is still compatible with a cadence of monthly pillar pieces. An example from this article is the July 27th, 2016 update that caused a huge increase in site views.

Great post. I am going to go back to my (small) site and see how well these ideas can be implemented. We run a blog that pulls articles into dynamic pages via different category tags. This sounds like another instance of faceted navigation, where the categories are a facet. Each place an article appears in the crawl is a duplication. What's worse, I have multiple navigation schemes to help my human users find content. I'll send this article to my tech guy and make sure we have the site optimized.

OK, I just got off the phone, and this is what I understand: since I don't do any dynamic link generation, this isn't a problem for me, but I could optimize my data layers and links from the landing page in general so its link authority goes where I want it to go. I am very thankful that this forum exists to keep introducing me to SEO.

Wonderful post on a timely subject, Rand. I absolutely and wholeheartedly believe the foundation of a free, civil society is economic freedom. I think net neutrality (or non-neutrality) is a threat not only to free trade but also to free speech. The internet information revolution has allowed so many non-traditional vectors of information flow to come into existence that it's a moral imperative to keep those lines open. If anybody disagrees, I would suggest you listen to one hour of Fox News and one hour of NPR, then ask yourself how you would ascertain truth if information became more restricted.

I have been weighing the benefits of publishing a series of small articles alongside my regular content, but I am worried about their quality. Would I use XML sitemaps to keep the crawlers focused on the higher-quality content?

What a fantastic list. Anytime I am confronted with a task of this magnitude, I want to apply the Pareto Principle. What do you think the 20% of ranking factors are that really matter? I love how you did this with your 10 factors to wrap up the piece.

Tags are still a bit of a mystery to me. I have seen opinions that vary from "post-Penguin, all tags are useless" to "you had better tag your pages well or the machines will get you." I think your approach has merit, in that you look at the value added by the tag data to evaluate how good a tag is.

I use a professional theme, and I always wonder if some of the theme options are hurting SEO with their internal tagging. The theme allows categories to behave very much like static pages, including a large amount of static content particular to each category. I am still wondering if I need to tag these pages in some way so crawlers know how to index a site with such an unusual combination of static and dynamic content.

We have been live for just about a year now, but this checklist is still very worthwhile. I am definitely putting some of these site-checking tools into my SEO toolbox and regular routine. It's still hard to believe how many tools are out there and how many steps there are in SEO.

Love the graphics, Simon. I keep swiveling the laptop over to my wife, saying, "Look at this." Previously, for my content calendar, I was working off the social media rule of thirds, where content is, more or less, divided into site-conversion, pay, and personality pieces. That paradigm falls apart with your layered approach, and I think you are probably correct. Your Content Matrix shows the high-interval pieces are for advocacy and loyalty, while the upper-right corner holds the big-bang pieces for advocacy. This is a much more informed and useful approach than simply dividing output into thirds.

I am glad that all my Python training can be used at home. Installing Anaconda on a Windows box sounds painful. I want to turn my old XP machine into a pure Linux box that much more after reading this article. I hate dealing with Windows' "clever" ideas about what to do with file extensions.

Wow, I am just getting into SEO marketing, and this post is filled with so many useful nuggets. My gut feeling is that, as a newish site with low DA, I would want to focus on the gap marketing techniques. How much does your current DA/PA affect which of these techniques you would use?