Google's March 2019 Core Update - Full Analysis Of Over 1,700,000 Pages

by Eric Lancheres March 25, 2019

Jan 2nd 2020 Update:

We're still helping people recover from the March core update inside Traffic Research. Want to be another success story? Check out Traffic Research's members club before it's too late!

June 19th 2019 Update:

Since this paper was published, I have helped numerous people recover from the March 2019 Core update inside Traffic Research. In addition, we had numerous members recover from the August 2018 updates by following the Traffic Research coaching.

With the recent June 2019 update, many Traffic Research members have seen significant traffic increases as they are rewarded by Google. If you have been impacted by a Google algorithm update and are looking to recover, then I STRONGLY recommend subscribing to Traffic Research. This is the best way to recover from an algorithm update.

I wasn't the only one feeling the tremors. Before I had time to visit the search engine forums, one of my clients emailed me:

"Our site which had recently dropped during the medic update just experienced a 30% recovery."

Unfortunately, as I dove into the chatter online, I quickly discovered that significant increases AND decreases had occurred around the web. While my inner circle seemed fairly pleased with this update, many webmasters saw decreases of up to 80% in traffic overnight.

Here's a full analysis of what happened during the March 2019 Google Core Update.

Preamble

This analysis combines data from various sources (both external and in-house) and uses multiple tools (internal and external) to process it. If you would like to submit additional data or request data be removed, please contact me.

Any Google Search employees that would like to submit corrections or refute any information here are also welcome to contact me.

While the raw data is irrefutable, it is important to note that correlation does not imply causation.

As the search engine industry matures, we will hopefully move away from bogus claims such as "health sites were affected and therefore the update specifically targeted the medical industry" that have been erroneously echoed in the community over the past year.

Instead, the data will be used to support or disprove theories about the algorithm change so you get a bigger piece of the Google traffic pie. With that said, let's get on with it!

Site Wide Or Page Specific?

One of the first questions I ask with every large update is:

"Was it a page specific update or a site-wide update?"

As I scanned through our data, I saw multiple instances of site-wide changes. In other words, nearly all the pages on the affected sites increased or decreased.

The update officially rolled out on March 12th 2019 while traffic fluctuations were seen on the 13th & 14th.

The community name for the update was "Florida 2" because it occurred during the popular Florida web conference.

While there can be exceptional pages, I believe that the March 2019 core update affects the entire domain.

(Contrastingly, Google's Penguin algorithm is currently said to work on a page specific level)

To illustrate what this means, let's simplify the Google algorithm into one overly simplistic equation (many elements have purposely been omitted):

(Links) X (On-Page) = Ranking Score

Then this update would wrap itself AROUND the equation, acting as a multiplier.

[(Links) X (On-Page)] x (Multiplier) = Ranking Score

Sites that saw a decrease in rankings have had this multiplier reduced. For example, a multiplier of 0.75 would reduce all your ranking scores by 25%.

Sites that experienced ranking increases, on the other hand, would have been assigned a higher multiplier. I'll expand on this multiplier later on.
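To make the multiplier idea concrete, here is a toy sketch in Python. The function, page values and multiplier figures are all hypothetical (Google's real scoring is unknown); the only point is that a single site-wide multiplier scales every page's score at once:

```python
# Toy model of the site-wide multiplier described above.
# All names and numbers are hypothetical illustrations.

def ranking_score(links: float, on_page: float, multiplier: float = 1.0) -> float:
    """Simplified score: (Links x On-Page) scaled by a site-wide multiplier."""
    return links * on_page * multiplier

# Hypothetical (links, on_page) values for three pages on one domain.
pages = {"home": (80, 0.9), "guide": (40, 0.95), "review": (25, 0.7)}

# If the update drops the site's multiplier from 1.0 to 0.75,
# EVERY page loses exactly 25% of its score -- a site-wide effect.
for name, (links, on_page) in pages.items():
    before = ranking_score(links, on_page, 1.0)
    after = ranking_score(links, on_page, 0.75)
    print(f"{name}: {before:.1f} -> {after:.1f} ({1 - after / before:.0%} drop)")
```

Notice that every page drops by the same 25% regardless of its individual link and on-page values, which matches the site-wide pattern observed in the data.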

The March 2019 Google update is a site-wide update that affects all pages on a domain.

Word Buckets & Neural Linguistic Networks

While there has been some chatter on the internet about Google updating their neuro-linguistic algorithm (responsible for understanding the words on the page), I examined specific site patterns to see if there had been any changes.

Ahrefs organic traffic of raccars.co.uk

This graph clearly shows that the update increased the rankings of the entire site while keeping the quantity of organic keywords the same.

While sites will usually rank for more keywords as they increase in overall power (and rank score), it was particularly interesting to find a site that didn't.

The reason it's an important discovery is because this indicates that sites are increasing in traffic because of higher rankings... and NOT because they are explicitly ranking for more keywords.

The fact that the keyword set remained stable reinforces the claim that Google did NOT make any significant modifications to its neuro-linguistic engine.

To support this theory, I compared various metrics such as average word count, exact match in the title tag and related terms used on a page.

The word count remains essentially the same pre and post update.

I saw NO change with how often the exact match was used in title tags.

I saw no significant changes with how often related terms were used in the new March 2019 update.

Therefore, if you have been impacted, it is likely NOT because Google has a new preference for certain words on your page.

The March 2019 Google update does not affect the interpretation of words on a page.

Not Quite A Reversal

Another rumor that started to float around was that this was a reversal of the August 2018 medic update. Proponents pointed to sites that were previously impacted by that update, claiming they had 'recovered'. One such example is draxe.com. Although it is being classified as a 'winner', it is, in fact, just slightly less of a loser:

Source: Ahrefs organic traffic chart for draxe.com

It appears as if draxe.com was hit by the August 2018 ‘medic’ update and now it has bounced back a little from the massive drop.

A similar situation occurs here with medicinenet.com.

Source: Ahrefs organic traffic chart for medicinenet.com

It is still impacted, just slightly less.

The implication is that the March 2019 core algorithm update focused on similar ranking functions previously modified in the August 2018 medic update.

It's important to note that the traffic did not return to its original state; the negative impact is just less severe than before.

In addition, some sites that were not affected by the August 2018 Medic update were impacted by the recent March core update.

Therefore, because a high ratio of sites impacted by the August Medic update saw varying degrees of change (rather than a uniform return to their previous rankings), I can safely conclude that the March Core update was NOT just the Google engineers hitting the "undo button". Instead, it is an evolution of the algorithm.

The March 2019 Google update is an evolution of the previous Medic update of 2018.

Full Recoveries Exist

I did, however, see full recoveries happen with this latest Google core update.

Here is one such example. The site was originally hit by the August 2018 Medic update and has subsequently made a full recovery. I suspect that the site was originally impacted due to over-optimization.

Source: Ahrefs traffic chart of cbdoilusers.com

While the site seems to be highly keyword optimized, the content itself is quite good and the user experience seems promising.

Because I do not have access to the direct bounce rate or specific user metrics, I can only assume that visitors landing on the site would visit multiple pages. The website is well crafted, easy to use and, as a human, it is a resource I would consider bookmarking if I were in the market for CBD oil.

However, from a search engine spider point of view...

There are multiple examples of over-optimization:

- The sidebar navigation contains an excessive list of all the keywords they are trying to rank for on the site. (See below a partial snapshot of review keywords)

- Extensive usage of keywords in articles, content and navigation. (See below a partial snapshot of the homepage)

Source: CBDOilusers.com sidebar links snapshot on March 2019

This is only a partial view of the sidebar... it continues with dozens more reviews.

Source: CBDOilusers.com homepage snapshot on March 2019

This is just a sample of the abnormally high quantity of keywords being used throughout the site. Nearly every single navigational link has the word "CBD" in it!

This might have contributed to the site originally being affected by the August 2018 update. However, with the March 2019 Core Update, it appears as if Google is willing to overlook over-optimization as long as the user experience is satisfactory.

This supports my theory that the site was originally impacted in August 2018 due to on-page over-optimization and that the March 2019 update is focused on metrics provided by user experience.

The weight balance of the major ranking factors seems to have shifted, reducing the impact of links / on-page and increasing the weight of user metrics.

The March 2019 update prioritizes user experience over on-page over-optimization. Sites that provide a good user experience have seen an increase in traffic in spite of having certain on-page issues.

Previously Unaffected Websites

As previously mentioned, some sites that previously weren't affected by the Medic update were hit by the March Core algorithm update.

According to Ahrefs traffic estimates, one of those sites is everydayhealth.com, which grew quickly and then got struck down on March 13th 2019.

Source: Ahrefs organic traffic chart of everydayhealth.com

While a quick analysis with Screaming Frog SEO Spider, Ahrefs and various other tools revealed nothing abnormal with regards to backlinks and on-page optimization, it's only when you load up the page with a browser that you begin to uncover issues.

This is an important clue! When a site seems to be doing well according to the tools but its traffic and rankings drop, it usually indicates something else is happening that cannot be measured directly with third party tools (such as poor user experience, bad click-through-rate metrics, etc).

So what does the site look like?

Source: Snapshot of everydayhealth.com March 2019

There is an overbearing amount of ads, plus a fixed 'sticky' sidebar, that add up to a below-average experience, one that is compounded when you have an ad-blocker.

As a reader seeking health information, I personally wouldn’t stay there long!

It gets worse too. After a little while, I'm attacked by pop-ups. This is the SAME page as above (a few moments after closing the initial pop-up and scrolling down a little)

Source: Snapshot of everydayhealth.com March 2019

While my experience is subjective, I suspect that I'm not the only one that would click the back button to go back to Google after landing on this site.

Considering that this site does everything else 'right', it supports the idea that the user experience is at the center of this latest Google core update.

Excessive advertisements that lead to a poor user experience are one of the main reasons why sites lost traffic during the March 2019 Core update.

ECommerce Traffic Loss

While I have been mostly focused on information sites, they haven't been the only ones affected by the March 2019 update.

The jewelry giant, HSamuel.co.uk, was also affected according to Sistrix & Ahrefs traffic estimates.

Source: Ahrefs organic traffic chart for hsamuel.co.uk

While traditional ranking signals such as backlinks and content seem to follow SEO best practices, the desktop usability is horrendous.

The top menu bar is so big that when you scroll over it, you can barely use the page. To add insult to injury, when I was browsing the site, I kept accidentally hitting it and would have to maneuver my cursor off the screen in order to resume browsing.

Source: Snapshot Hsamuel.co.uk navigation bar

While I don't have access to the direct data, I suspect the usability of this site is worse than its competitors'.

In addition, scrolling through some category pages (such as the wedding rings), I noticed there is hardly any text and the content is hard to consume.

I suspect that the header navigation would not bother robots but would strongly annoy humans! Enough to cause them to quickly leave the site after landing on broad category pages such as wedding rings.

It's important to note that according to traffic estimates, this site was impacted but not devastated (suffering an estimated 25% traffic loss). This supports the theory that the user experience multiplier can vary depending on the severity of the problem.

Affiliate Site Traffic Loss

How are more aggressive, shadier sites doing? One semi-famous 'grey hat' success story, ironjunkies.com, has been ranking quite well using a combination of expired domains, redirects, PBNs and other techniques frowned upon by Google.

So how did they fare during the March 2019 update? It appears as if they dropped by about 50% in traffic... but perhaps not for the obvious reasons.

Source: Ahrefs organic traffic chart of ironjunkies.com

Looking at one of their biggest pages, https://ironjunkies.com/testosterone-boosters, which went from position #4 to #11:

Source: Snapshot of ironjunkies (2 tables of content are better than one... right?)

They have a TON of advertisements & affiliate links on the page. The sub-headlines are over-optimized. They have 2 tables of contents. They aren't using any alt tags on their images, nor relevant filenames.

While it would be easy to point out that poor links are responsible for ironjunkies' recent drop, I highly doubt it.

Instead, I believe this is another example of a site growing too quickly (due to a large amount of "authority" links) while disregarding user experience.

Excessively commercial pages masquerading as informational pages were negatively impacted by the March Google update.

Growing Too Quickly

While there is certainly no limit on how quickly you can grow, webmasters have gotten VERY good at acquiring powerful links that result in rapid growth... regardless of content.

Looking at health24.com, I notice that they have a mix of good and bad links while covering a wide range of health topics.

Source: Ahrefs organic traffic chart of health24.com

Unfortunately, the quality of the content is poor and it makes me wonder if the people linking to the site actually read the content before linking to it! Either that, or many of the links have been manufactured.

Although I do not have direct access to their analytics, I suspect that people are bouncing off the site rather quickly.

Source: Snapshot of health24.com on March 2019

While they continue to receive natural links from various organizations and continue to produce keyword targeted content, it appears as if it's the generic "no real value added" content that is driving visitors away. The site is littered with re-posted content or very low expertise content that was likely outsourced from a content service.

My data clearly indicates that Google has not changed its language interpretation, so I can only assume that health24's traffic loss was a result of visitors not enjoying the content they are being presented.

This is reflected in the overall data, as there seem to be slightly fewer ads on the pages that are ranking.

It appears as if this website grew too quickly (due to an abnormal amount of powerful links) and the poor user experience (stuffing the page with ads) seems to have caught up with them.

Sub-Domain Penalty

While getting high authority links is never a bad thing, getting undeserved links can sometimes lead to sticky situations in the future.

This raises the question: "Why does this website have so many powerful links if the visitors keep bouncing off after a few moments?"

You might have links because you're a known brand, maybe you have a large marketing budget or perhaps you own another high powered property... in this case, it's possible that your main domain is powering up your sub-domain.

This is VERY interesting because it clearly shows that your sub-domains can be affected while your main domain remains intact.

Looking at the blog portion of Bulletproof on the sub-domain blog.bulletproof.com:

Once you're done reading the content, you are left with no real option but to close the window or hit the 'back button' on your browser. This might inadvertently lead Google to believe this site is providing a poor user experience when pitted against its competitors.

This is in sharp contrast with the website's ecommerce portion, www.bulletproof.com that contains ample navigation.

Source: Ahrefs organic traffic chart of www.bulletproof.com

While the blog's traffic sees a significant drop, the ecommerce portion remains relatively stable (although it does rank for fewer keywords after this update).

Because both of these properties share a lot of the same link power due to heavy interlinking, I suspect that Google is assigning different scores to both sub-domains (likely considered as different entities).

Sub-domains are independent from each other and can be affected differently by the March 2019 core update. ie: blog.site.com suffers a penalty while www.site.com remains unscathed.

Keyword Cannibalization

Interestingly, sites that previously suffered from keyword cannibalization issues seem to have seen a slight resurgence with the March Core update. Perhaps this is a coincidence, however many sites that might have previously been artificially held back by cannibalization seem to have increased in rankings.

My theory is that as long as the user experience is good, Google will now overlook site issues such as cannibalization whereas, in the past, it would have prevented you from ranking.

One such example is raccars.co.uk which had a TON of pages competing for the same keyword.

Source: Ahrefs organic keyword graph for Raccars.co.uk

Near the end of 2018, Google seems to have picked one page and, after the recent update, that page increased in rankings. While we aren't sure that this is specifically related to the March update (we know Google deploys multiple updates throughout the year), it might be the result of a combination of updates.

I uncovered that many of the big winners were littered with cannibalization issues:

Source: Ahrefs organic keyword graph for Raccars.co.uk

Another site, waytoosocial.com, saw a significant increase in spite of having keyword cannibalization issues.

Source: Ahrefs organic keyword graph for WayTooSocial

These sites have been following historical best practices of producing keyword-focused content, which might have led them to be overly optimized.

Sites that provide a good user experience have seen improvements in rankings in spite of lingering keyword cannibalization issues.

Links (Off-Page)

At first glance, it was impossible to tell if a site was impacted by the March 2019 algorithm just by looking at the links. Here are a few site graphs, can YOU guess which ones were hit?

Source: Referring domain chart of a "Winning" website

Source: Referring domain chart of a "Losing" website

Source: Referring domain chart of another "Winning" website

While links are still extremely important, they do not seem to have been the main focus of this core update. During my manual review, I did not find any correlation between the sample sites increasing or decreasing in rankings with regards to any new link activity.

Some sites with excellent backlinks saw a traffic decrease while sites with average backlinks saw a traffic increase (and the other way around).

However, while inspecting large sets of data, I discovered that the average quantity of backlinks went DOWN ever-so-slightly.

The average quantity of backlinks on page 1 of my large-data set lowered by 6.1%.

However, in regards to raw power, I found that the number remained EXACTLY the same.

This indicates that while the quantity of links diminished on page 1, the raw power of the links pointing to pages on page 1 stayed the same. In other words, having fewer (but high quality) links is the way to go.
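The quantity-versus-power finding can be sketched as follows. The sample numbers below are invented for illustration (chosen to mirror the observed ~6.1% drop; my actual data set is not reproduced here); the sketch only shows the metric pair being compared: the average referring-domain count can fall while average link power stays flat.

```python
# Hypothetical sketch of the page-1 backlink comparison described above.
from statistics import mean

# Invented (referring_domains, aggregate_link_power) pairs for page-1 results.
pre_update = [(120, 55), (80, 40), (200, 70), (60, 35)]
post_update = [(110, 55), (72, 40), (190, 70), (60, 35)]

count_change = mean(c for c, _ in post_update) / mean(c for c, _ in pre_update) - 1
power_change = mean(p for _, p in post_update) / mean(p for _, p in pre_update) - 1

print(f"avg referring domains: {count_change:+.1%}")  # fewer links on page 1
print(f"avg link power:        {power_change:+.1%}")  # power unchanged
```

In this toy sample, the referring-domain count falls by about 6.1% while the aggregate power is unchanged, which is the pattern suggesting that fewer, higher quality links are winning.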

It appears as if site-wide links are slightly less useful than before. While you can definitely have more, the first link received from a domain will have the biggest benefit.

One big surprise is that the page 1 results contained significantly more government links. While I saw .edu links slightly dropping, .gov links were a particularly big winner.

* The difference is so significant that I'm going to have to revisit the data to make sure there wasn't an outlier that caused a disruption in the data.

Perhaps this is just a coincidence; however, it appears as if the new trusted backlink authorities on the web are now government websites. Compared to .edu links, .gov links are extremely hard to acquire, so it might make sense that Google is classifying government sites as authorities.

However, perhaps this is just an unintended consequence of the algorithm change.

Because the average page power has remained the same over the data set, I conclude that while Google might be changing its preference for the types of links it values, the importance of links remains about equal in this update.

It appears that authoritative links no longer excuse a poor user experience which would explain why sites, regardless of link profile, saw increases and decreases in rankings.

#1 The Core March 2019 update does not seem to penalize sites because of links.
#2 The link power required to rank has remained the same while the quantity of links diminished.
#3 Government links are trending as trusted properties.

On-Page Changes - Search Terms Within The Page Content

While nothing catastrophic has happened with regards to on-page optimization, I discovered some interesting trends during our recent data analysis of over 1,700,000 pages.

First, the data suggests that Google is continuing to move away from requiring the exact match keyword within the content. I have been seeing this trend for quite some time now: the order of the words on the page doesn't seem to matter as much as it did before.

While exact match keywords (written exactly like you would find it in a keyword tool) can still rank, I see them less than ever. Instead, I'm seeing an increase in keyword focused pages that prioritize content quality.

The quantity of exact match keywords within the content ranking on page #1 decreased by approximately 5% according to my data samples.

This might be explained by the fact that Google seems to be increasing the weight of user generated signals over traditional on-page ranking signals. People don't care about finding the exact keyword within the page as long as the content is good.

As I uncovered more and more traffic recoveries, it seems as if Google might be willing to overlook overly optimized websites (for instance, keyword-heavy paths, high keyword counts, etc.) as long as the user signals vouch that the content is good.

While the March core update still relies on keywords, LSI and synonyms to rank pages, the exact match term is found less often than before.
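As a rough illustration of what "exact match within the content" means in these comparisons, here is a hypothetical sketch. A real measurement would run over millions of crawled page bodies; the query and page snippets below are invented, and the sketch only shows the idea of counting pages that contain the phrase verbatim.

```python
# Toy exact-match check; the query and page snippets are invented.
import re

query = "best cbd oil"
pages = [
    "Our guide to the best cbd oil brands of 2019...",
    "Choosing a CBD product: what oil strength is best for you?",
    "Best CBD Oil reviews and comparisons",
]

# Case-insensitive search for the phrase exactly as typed into a keyword tool.
exact = sum(1 for body in pages if re.search(re.escape(query), body, re.IGNORECASE))
print(f"{exact}/{len(pages)} pages contain the exact-match phrase")
```

The second page covers the topic using the same words in a different order, so it would not count as an exact match; that is precisely the kind of page that now seems to rank more often.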

Commerciality Takes A Hit

While it has long been rumored that Google helps big brands rank better on Google, I have discovered some interesting data that might prove that Google is looking out for the little guy.

This might be due to sites containing an excessive amount of ads causing a poor user experience.

This seems to be quite a significant change!

While I don't believe for a second that Google would punish anyone for running outside ads, it hints that sites with a high commercial intent (which are likely to spend quite a bit on advertising) are performing slightly worse than sites with lower commercial intent.

If your content is designed to sell first (and not help the user), it is likely that this will reflect in the user behavior signals captured by Google.

Websites with a high commercial intent are performing slightly worse for commercial queries after the March 2019 core update.

Load Speed

According to my data sample, I noticed that faster servers and load times are performing better after the March Core update.

There are two reasons why this might happen:

1. Google might have increased the weight of server speed as a ranking factor in their algorithm. I highly doubt this.

2. More likely, faster load times lead to a better user experience: users are happier and less likely to bounce back to Google, as they do when a page fails to load quickly.

Quick loading websites are more prominent after the March 2019 update. This is likely because faster pages lead to a better user experience.

User Experience

The Google algorithm uses a mix of both direct factors (things that the Googlebot spider can crawl, such as sub-headlines, word count, images) and indirect factors (such as click-through-rate) to determine how well a page should rank within the search engine.

There is an abundance of evidence that suggests that the latest March 2019 core algorithm update has increased the weight of indirect factors that fall within the category of 'user experience'.

In layman's terms, this means that Google is putting more emphasis on how humans react to content rather than the absolute data returned by the Google spiderbot.

I believe this is a positive change as it allows for more creative freedom and doesn't pigeonhole webmasters into following strict templates in order to rank on Google. The idea is that if you have an appealing title, a well-crafted page and you deliver the information promised, you'll likely have a high ranking website.

While I would love to tell users "Do what you want and as long as you make good content you'll rank", we're not quite there yet. In fact, we seem to be at a point where:

You must deliver content that follows Google's best practices AND provide a good user experience in order to rank.

Let's break down the "user experience" part a little...

While Google has access to an abundance of data (seriously, they know where I am currently typing this from), they have been secretive as to which data they are using for search.

1. One measure we know for sure is click through rate. That is, how often users click on your website per impression on Google.

For example, if your website shows up as the #1 result on Google but most users click on the #2 result instead (perhaps because the title or meta-description is more appealing), you are likely to suffer and quickly drop from the first result. Brand recognition can play a role here as users are more likely to click on a site that they like and recognize.

2. Another metric that we can be confident is used by Google is the "bounce back to Google". That is, how many users click on the "back" button on their browser to return to Google after they clicked on a search result.

For example, when user A clicks on the #1 result on Google's search, he is sent to a certain webpage. If user A only stays on the website for a few seconds and clicks the back button on his browser to return to the Google search engine listings, this "bounce back to Google" signals that the user did not find what he was looking for on the website and is returning to Google to try another result.

If user A then clicks on the #2 search result, finds the information he was seeking, and closes his browser, Google can assume that the informational query was satisfied by that webpage.

This is likely complicated by the fact that modern web users open up multiple tabs, perform multiple searches for similar keywords and have diverse objectives.
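Under the (unconfirmed) assumption that Google tracks signals like these, the two metrics above can be sketched as follows. The log format, field names and 10-second dwell threshold are all invented for illustration; nothing here reflects how Google actually computes them.

```python
# Hypothetical sketch of the two user signals discussed above:
# click-through rate and "bounce back to Google" (pogo-sticking).

def ctr(clicks: int, impressions: int) -> float:
    """Clicks on your result divided by times it was shown in the SERP."""
    return clicks / impressions if impressions else 0.0

def bounce_back_rate(sessions, threshold_s=10.0):
    """Share of visits that returned to the results page almost immediately."""
    bounced = sum(1 for s in sessions
                  if s["returned_to_serp"] and s["dwell_seconds"] < threshold_s)
    return bounced / len(sessions) if sessions else 0.0

sessions = [
    {"dwell_seconds": 4, "returned_to_serp": True},    # pogo-stick
    {"dwell_seconds": 180, "returned_to_serp": False}, # satisfied, closed browser
    {"dwell_seconds": 7, "returned_to_serp": True},    # pogo-stick
    {"dwell_seconds": 95, "returned_to_serp": True},   # read, then kept searching
]

print(f"CTR: {ctr(40, 1000):.1%}")                      # 40 clicks / 1,000 impressions
print(f"Bounce-back rate: {bounce_back_rate(sessions):.0%}")
```

Note that the fourth visitor also returned to the results page, but only after a long dwell; a sensible metric would distinguish that from an immediate bounce, which is exactly why multi-tab, multi-search behavior complicates the measurement.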

While Google won't tell us exactly how they measure user experience (likely because of privacy reasons and spam reasons), we can safely assume that there are more metrics and data accumulated that contribute to the overall quality score of a website.

Chrome warns users that it collects anonymous browsing data, and Google PageSpeed Insights presents comparative user load times for different websites, to name just a few examples...

The bottom line is that if you're producing enough volume for Google to accurately measure user behavior and that data tells Google that users are NOT enjoying your website, it is likely to assign the entire site a lower score.

Website Score

One thing that's clear is that with every major core update, Google recalculates a website score based on the newly updated algorithm. This is why we see drastic changes across the web on the same day.

This new score affects all your pages and rankings on your website.

While many websites have received the same score as before (noticing little-to-no impact), the sites in the winners and losers category have seen a different score applied to their site.

Using an overly simplified equation (many factors were omitted):

[(Links) X (On-Page)] x (Website Score) = Ranking Score

Sites that were previously ranking well due to a high amount of authority links were toned down (slightly) while sites with good content that were previously under-performing have been rewarded.

Fortunately, it has previously been stated by Google that the score is recalculated in real time (as long as it accumulates enough data) so you don't have to wait until the next big algorithm change to recover from it if you have been impacted.

Recoveries

While it's too early to outline a recovery procedure that is proven to work on the March 2019 update (as far as I can tell, no one has dropped and then recovered from this update yet), I can share recovery procedures that have worked with previous updates (such as medic), along with the steps taken by some webmasters who experienced a 100%+ traffic increase during the March 2019 update.

Let's look at the 3 major trends:

#1 Sites that were positively impacted during the March 2019 update seem to have been over-optimized while providing a good experience. These sites were likely under-performing before this update.

#2 Sites that were NOT impacted during the March 2019 update seem to be well-optimized and provide a good user experience. (Or perhaps are not well optimized and continue to provide a poor experience. In other words, nothing changed.)

#3 Sites that were negatively impacted during the March 2019 update seem to suffer from a poor user experience.

Recovery Procedure

The procedure that has worked in the past to recover from core quality updates is as follows:

1. Improve, merge or delete the problematic pages that are hindering the overall site's performance.

2. Identify the elements repeated throughout most of the pages that may contribute to a poor user experience and cause users to bounce back to Google / leave the site prematurely. (This might be excessive ads, poor navigation, slow loading webpages, etc.)

3. Split-test improvements on the page in order to improve the engagement of users. (Increasing time on page, time on site, page views, reducing bounce rate, etc.)

4. Perform a site-wide analysis of the title tags & meta descriptions of the site. Address outdated title tags that might be contributing to a lower click through rate (for example, title tags that contain an old date).

5. Perform a complete backlink analysis, specifically looking for site-wide links that might have previously been helping. Focus on increasing page power while minimizing the quantity of links. (Get fewer, but more powerful, backlinks from authority sources.)

Big Takeaways

Takeaway #1
Overall, it appears as if Google is focusing less on direct factors and more on indirect factors.

Google didn’t go in and tweak things like “we need more headlines” or “we need bigger pages with more words”. That’s not how they operate. Instead, they are focusing on the outcome more than the actual process. So webmasters have more freedom to be creative as long as the outcome (happy users) is satisfied.

Takeaway #2
Sites that don’t advertise have seen an increase in visibility. Typically larger sites with commercial intent advertise on Adwords. This means that Google is actually helping ‘the little guy’ that has a site with less commercial intent (or at the very least, doesn’t outright advertise as much.) I personally believe that sites with too many ads and pop-ups were impacted by the latest update... it's getting ridiculous how many pop-ups there are in 2019!

Takeaway #3
The quantity of backlinks is now slightly less important than before while the power of backlinks pointing to pages ranking remains the same. This indicates that Google is rewarding pages with fewer, but higher quality, links.

Gov links appear to be big winners. Perhaps .edu links are too easy to get… but getting .gov links is still quite hard so they appear to convey more trust.

Takeaway #4
A bias towards slightly faster servers and load times has been observed. This MIGHT be because faster load times lead to a better user experience and not necessarily because Google is rewarding load time directly. (It might be an indirect benefit because users are happier)

Closing thoughts

The March 2019 Core Google Update is a sign that Google is listening to users and continuing its effort to produce better search results. Sites that contain an excessive amount of ads, pop-ups and distracting elements were negatively impacted (as they should be) while sites that focused on helping users saw an increase in traffic.

I suspect that many of the recoveries were sites that were previously operating with a lower than optimal website score. The latest Google update seems to be a little more lenient on sites that might be a little over-optimized.

This might be Google's subtle answer to all the webmasters in the forums shouting: "My content is good! Users LOVE my content! Why am I not ranking?"

In addition, we see that the trend of getting fewer, but higher quality, backlinks continues to rule the search engine landscape. While the overall power (and impact) of powerful links has not changed, it's nice to know you'll do just fine with a handful of high quality authority links.


About The Author

Eric Lancheres is credited with being the first SEO professional to discover the solution to the Panda & Penguin Updates.

In his coaching, Eric has helped dozens of entrepreneurs recover from Google penalties through his SEO consulting and regularly helps businesses raise their profits by 20-75%. Additionally, on his own, he’s created affiliate websites that sometimes generate up to 50,000 in a month, entirely on organic traffic.

These achievements have made him a featured speaker at Traffic & Conversion Summit, SEO Rockstars, & Internet Marketing Party. Although he is no longer taking on clients, he currently coaches and helps people with SEO challenges inside Traffic Research.

We have absolutely no control over Google, Bing, Yahoo, Facebook or any other entity. Therefore, we cannot control any of the changes that might occur by your actions. We will not be held responsible for any modifications, penalties, ranking and/or traffic fluctuations that may result from following the content on this website. Even though we always try to provide the best information, you understand that you are responsible for all changes involving your website. Proceed at your own risk.