Google's Top Warnings & Tips to SEOs in 2017

Let's recall Google's latest warnings and consider how we should (or shouldn't?) react to them.

By: Inessa Bokhan

September 5, 2017

Google often shares some tips, news, and warnings with the SEO community, but is the company's advice the best answer to the burning SEO issues? In this post, I wanted to analyze the loudest announcements and see whether we should follow the company's instructions in each case.

Google's warnings & tips on content & on-page optimization

The first portion of warnings I'm going to look at deals with content and on-page optimization. These activities are the foundation of any search engine optimization campaign, so it's better to stay aware of the nuances.

1. User-generated content: good or bad for SEO? It depends.

In 2017, Google expressed some thoughts on user-generated content and spam. At the very beginning of the year, you could have seen this post on Google's Webmaster Central blog. The post explains how users can protect websites from user-generated spam. In case you missed it, here are the key techniques:

Keep your forum software updated and patched.

Add a CAPTCHA.

Block suspicious behavior.

Check your forum's top posters on a daily basis.

Consider disabling some types of comments.

If possible, change your settings so that you disallow anonymous posting and make posts from new users require approval before they're publicly visible.
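Two of these items, blocking suspicious behavior and tightening posting rules, can start out as simple heuristics before you reach for dedicated anti-spam software. Here's a minimal Python sketch of one such heuristic; the thresholds and the link-counting rule are made-up assumptions for illustration, not anything Google prescribes:

```python
import re

LINK_PATTERN = re.compile(r"https?://", re.IGNORECASE)
MAX_LINKS = 2          # made-up threshold: more than 2 links looks suspicious
MAX_LINK_RATIO = 0.5   # made-up threshold: flag if half the words are links

def looks_spammy(comment: str) -> bool:
    """Naive heuristic: flag comments with too many links,
    or comments where links dominate the text."""
    words = comment.split()
    links = sum(1 for w in words if LINK_PATTERN.search(w))
    if links > MAX_LINKS:
        return True
    return bool(words) and links / len(words) > MAX_LINK_RATIO

print(looks_spammy("Great post, thanks!"))  # False
print(looks_spammy("http://a.example http://b.example buy now http://c.example"))  # True
```

In a real forum you'd route flagged comments into a moderation queue rather than deleting them outright, since heuristics like this produce false positives.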

After reading a post like this, an SEO might start thinking: "Why should I bother with user-generated content at all? If it may cause serious issues for my website, wouldn't it be better to get rid of UGC once and for all?" It seems not.

The words "quality" and "signal" work like magic in the SEO community, and folks started debating whether they should bring comments back. Leaving aside the emotions and guesses, here's what you should keep in mind:

1. Poor comments can be indexed just like any other type of content, and yes, they can impact your search engine ranking.

2. To check whether your comments do count, go to your Search Console account and use the Fetch as Google tool. If the comments show up in the code, then they do count.

3. If you don't have resources to moderate and manage user-generated content on a regular basis, it's better to disable it.

2. Read out loud your site's content. If it sounds strange, it may not rank well.

As you've probably noticed by now, Google's Gary Illyes enjoys his did-you-know tweets a lot. And that's great, because it's always handy to get some additional tips from the official guys.

Gary suggested we should read page content out loud, and if it sounds weird, it may not rank well.

DYK if you read out loud the text on your page and it doesn't sound natural, that piece of text may weigh much less during ranking pic.twitter.com/IfXMKB1GFg

Or that could be just another way to identify low-quality content: if a text was spun or machine-translated, it'll be really hard to read it out loud without stumbling.

Another interesting point is that this recommendation applies to any language, which makes me think Google has made good progress with the localized versions of its search engine.

So, if your landing page reads like: "Buy black shoes, black shoes are #1 trend, as only black shoes make you look your blackest" — you have to do something about it ASAP, and this content quality audit guide might come in handy!

3. Review site architecture to stay away from penalties.

If you're as curious about SEO tips from Google as I am, you must have watched or reviewed some of the official Google Webmaster Central hangouts with John Mueller and co. After watching this episode, I've learned that Google Panda looks at your site's architecture to assess the site's quality:

If you'd rather not watch the video, here's the direct quote (the question asked was: "Does Panda take site architecture into account when doing Panda score or would fixing those categories make no difference at all?"):

"When we look at Panda we see that as something that is more like a general kind of quality evaluation of the web site and it takes into account everything around the site. So that is something where if we find issues across the site where we think this is essentially affects the quality of the web site overall, then that is something that might be taken into account there.
So if you are saying that your category pages are really bad and that is something you really can improve then that is something I'd work on, I'd work to improve."

As usual, the answer's a "bit" vague, but on the whole, it looks like "yes, poor site architecture can cause a Panda penalty." How do you check your site's architecture? Fire up WebSite Auditor and let the tool analyze your site's skeleton. In a minute, you'll see your site's pages analyzed in detail under Site Structure, Pages. For a more convenient view of the structure, you can switch to the tree view.

Things to check at this point include:

1. Crawlability. If your site's not crawlable, you won't achieve high rankings no matter how hard you try. In your WebSite Auditor project, go to Site Audit, Crawlability and check the HTTP status codes of the pages. Also, make sure these pages are not blocked from indexing by your robots.txt.

2. URL structure. Work on a simple and consistent URL structure. When you group pages logically in categories and subcategories, you help both users and search engines reach your content.

3. Mobile-friendliness. Although Google's mobile-first index hasn't rolled out yet, it's better to make sure your website looks good on mobile devices. You can check this in the Page Audit module (switch to the Technical factors tab and look through the Page usability (Mobile) factor issues).

4. Site speed. Google has indicated many times that site speed is one of the signals used for ranking pages. Speeding up your website is important for user experience, too. To check the factors that can impact page speed, go to the Content Analysis module, Page Audit, and switch to the Technical factors tab. Review all factors under the Page speed (Desktop) section.
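The robots.txt part of the crawlability check is easy to sanity-test outside any SEO tool with Python's standard library. A minimal sketch (the robots.txt content and the paths are made-up examples):

```python
from urllib.robotparser import RobotFileParser

# A made-up robots.txt for illustration.
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /tmp/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Pages you expect Google to crawl should not be disallowed.
for path in ("/", "/category/black-shoes/", "/admin/login"):
    allowed = parser.can_fetch("Googlebot", f"https://example.com{path}")
    print(path, "->", "crawlable" if allowed else "blocked")
```

Running this prints "blocked" for /admin/login and "crawlable" for the other two paths, so you can quickly confirm that important landing pages aren't accidentally disallowed.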

If your site's architecture is top-notch, make sure you also check the layout of your main landing pages. Earlier this year, Gary confirmed that Google's page layout penalty "is still important". So, it won't hurt to check how big the ads on your website are and where they are placed.

4. Use ALT texts to boost Google Images rankings.

Have you ever wondered why your images aren't showing up (or not ranking high) in Google Images search results? Google gives us a hint at what may help with that problem:

"Anchor text (and image alt text) helps us quite a bit in understanding context, so I wouldn't leave it out if you can avoid it." — John Mueller

Many webmasters just don't bother with image optimization or don't do it regularly, though it's not that complicated. If you've decided it's time to brush up your site's images and update their meta content, WebSite Auditor will be a handy tool.

Launch the software, create or open a project and let the software scan your website for errors and warnings. After the check is complete, locate the Images section under Site structure > Site audit. There, you'll see a list of pages with images that have missing ALT texts.

Here are some basic rules to remember when creating image ALT texts:

1. Describe images in plain English.

2. If you optimize images for an e-commerce website with product variations, use model or serial numbers to make tags unique.

3. Avoid keyword stuffing ALT tags.
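If you'd like a quick check without any SEO tool, finding images with missing or empty ALT texts takes only a few lines of standard-library Python. A minimal sketch (the HTML fragment is a made-up example):

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects the src of every <img> with a missing or empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):  # missing or empty alt text
                self.missing.append(attrs.get("src", "(no src)"))

# A made-up page fragment for illustration.
html = """
<img src="/img/shoe-123.jpg" alt="Black leather oxford shoe, model SR-123">
<img src="/img/banner.png" alt="">
<img src="/img/logo.png">
"""

checker = MissingAltChecker()
checker.feed(html)
print(checker.missing)  # ['/img/banner.png', '/img/logo.png']
```

In practice you'd feed it the HTML of each crawled page and collect the offending image URLs into a to-do list.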

After you optimize the site's images and search engines recrawl and reindex them, don't be surprised to see more of your pages ranking in universal search results in your Rank Tracker project:

5. Optimize for non-English head queries.

If you have a non-English website, here's your lucky chance — Google says it needs much more quality content for lots of non-English head queries.

DYK there are many languages that don't have enough content even for head queries. If you are fluent in more languages, there's your chance pic.twitter.com/4U0tETTr3n

Unfortunately, Gary didn't mention which languages are most under-represented and which "head queries" desperately need more content.

So, what can we really do with this suggestion?

1. First and foremost, consider localizing content. Here at SEO PowerSuite, we often hear people would love to get more localized guides and posts about online marketing and software use. Grab your chance ;)

2. Analyze your top target keywords translated into different languages. For instance, suppose your website won't rank on the first page for the "how to create a website" query. Let's launch Rank Tracker and check the difficulty score for this term and its French equivalent.

At first sight, the difference doesn't seem significant (66.5 vs. 54.8). But if we look at the websites in the chart below, we'll spot that the individual difficulty score for the query in English starts from 55.1. Now let's click on the French phrase:

Here, you can see a domain with a difficulty score as low as 22.1 ranking third! Obviously, this term has lower competition and will be easier to rank for.

A word of caution: do not try to quickly gain traffic by adding machine-translated page copies to your website. Not only will it bring poor results, it may also cause a penalty for your website.

Google's warnings & tips on link building

Ah, link building! The most debatable SEO topic ever. Obviously, Google can't but tease us with some tips on this issue. So, what did Google officials mention in 2017 with regard to links?

1. You can get unnatural links from good sites and natural links from spammy sites.

Sounds confusing, but nonetheless that's exactly what John Mueller stated on his Twitter account:

You can get an unnatural link from a good site & you can get a natural link from a spammy site.

Does that mean the naturalness of a backlink doesn't have much to do with the quality of the page it comes from? I think Google does consider both, but the algorithm seems to be more focused on detecting whether a backlink is unnatural.

How to check which of your site's backlinks are unnatural?

Check your Google Search Console account for any alerts. If you don't see any manual action alerts, you're good. But if you have some, that's a problem you just can't ignore. Download your site's backlinks from Search Console for further analysis.

Fire up SEO SpyGlass, create a project for your website, and let the tool analyze your site's backlink profile. If some of your site's links have not been found in the project, copy and paste the links downloaded from your Search Console account.

Make sure that all of the backlink pages return the 200 HTTP status code. You don't want to waste time on pages that are no longer live. Also, filter out backlink pages that have zero external links, as they no longer link out to you. These steps will save you lots of time when working with large backlink profiles.

Take a look at the Penalty risk column. The links marked red pose danger to your rankings and most probably they have caused you trouble.

Surely, some links may have a normal Penalty risk score and can still be viewed as unnatural by Google. While doing the backlink profile analysis, pay attention to:

Links that have commercial anchor text (e.g., "best wedding suits").

Links that were on any SEO company reports.

Sites that have no reason to link to you.

Sites that are off-topic.

Sites that are in another language.

Sites that have experienced a major drop in rankings.

Spotted such links? Select them, right-click, and choose Disavow. After that, you can go to Preferences > Disavow/Blacklist backlinks and export the rules to a .txt file.

Before you rush to Google to submit the backlinks using the Disavow tool, remember that Google wants to see you've done some work, too. If there are links over which you have some control, take them down first. It's only natural that some (not all!) links can't be easily removed; those are the ones to disavow.
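For reference, the file Google's Disavow tool expects is plain text: one full URL or one domain: rule per line, with optional # comments. The domains below are made-up examples:

```text
# Contacted the site owner on 7/1/2017 to ask for link removal, no reply
domain:spammy-directory.example

# Disavowing a single bad page rather than the whole domain
https://bad-articles.example/wedding-suits-post/
```

The domain: form disavows every link from that domain, so use it only when the whole site is toxic; otherwise list individual URLs.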

Depending on how severe your case is, it can take from 3 weeks to months to see full or partial recovery for your rankings.

2. Google may frown upon links built via articles, guest posts, and contests.

If you've looked through the results of our recent link building survey, you should know that SEOs have become much more cautious while building links. No one wants to risk their site's traffic and conversions, as it's not always easy (and fast) to recover from a penalty.

Google's universal advice to SEOs hasn't changed much — create quality content and you'll rank — this is all we've heard for many years. At times, however, the Google gurus take a look at what real SEOs are doing and share their thoughts and tips.

For instance, in February, John Mueller warned SEOs about using contests for links:

"I think the situation where you're doing kind of a contest is something to watch out for. So, mostly with regards to whether or not these are actually clean links or not. In the sense of are you perhaps exchanging something for these links like do you require a link in order to take part of the contest."

So, if you plan to run a contest, make sure you do not require people to link back to your site. If your contest is just awesome and hundreds of people link back of their own accord to spread the word, you should be alright. Seems pretty straightforward, which I can't say about Google's stance toward guest posts and articles.

Again, Google keeps an eye on guest posts.

For years, bloggers, SEOs, and webmasters have been successfully engaged in guest blogging, when all of a sudden Matt Cutts told us to "stick a fork in it" because "guest blogging is done". The truth is guest blogging has never been dead; it has remained an effective technique, but only when done the right way. I think what Matt really wanted to say back in 2014 was that it wouldn't be that easy anymore to use low-quality guest posts to manipulate Google's search results.

Three years later, Google issued another warning, this time about "links in large-scale article campaigns." How do people at Google tell whether you've been involved in unethical guest posting?

They're looking at the following factors:

Stuffing keyword-rich links to your site in your articles

Having the articles published across many different sites; alternatively, having a large number of articles on a few large, different sites

Using or hiring article writers that aren't knowledgeable about the topics they're writing on

Using the same or similar content across these articles; alternatively, duplicating the full content of articles found on your own site (in which case use of rel="canonical", in addition to rel="nofollow", is advised)

So, it seems logical that we can safely keep on publishing guest posts that:

Have links that are useful for the content of the post.

Have links without commercial anchor texts.

Have links that point to multiple authoritative sites.

Are published on high-quality websites.

From all I've heard, guest posting is far from dead, but as with any other link building technique, it's become more challenging and time-consuming than ever before.

3. Google on the use of disavow files.

Even if you've never used Google's disavow tool, you probably know what it's for: telling Google which of your site's backlinks to ignore. When webmasters suspect they have poor and unnatural backlinks, that's the number one tool to fix the problem. Is there anything new we should know about it? I've discovered a few bits.

Do some disavow link pruning.

Imagine a situation: you get an unnatural backlink alert in your Search Console account and go on a frantic hunt for those bad links. You scrutinize your backlink profile and, when the job's complete, add all the toxic links you've found to the disavow file. But, for some reason, your rankings drop even lower!

That means you might have overdone the check and disavowed the links that were part of your ranking success. In a Twitter discussion, Gary Illyes mentions:

^ that, and if you see your rankings dropped after a disavow, just remove the less shady links from the file. You have total control

This is something new. What Google says is we can experiment with the disavow tool to get better rankings, we have "total control", according to Gary.

If you've suffered a ranking drop after disavowing lots of backlinks, you should roll back and take another look at your backlink profile. Again, you can easily manage it with SEO SpyGlass. A pro tip: you can add comments to the links you add to or remove from the disavow file, so you won't get confused after a couple of reviews and edits:

And the second interesting quote (coming this time from John Mueller) mentions that you can use Google's disavow file to tackle negative SEO attacks. If you're curious, here's Bill Hartzer's story behind the tweet. In short, an online marketing agency received an email with a negative SEO threat and Bill asked Google if he could simply ignore it all. John Mueller advised using a disavow tool:

Sorry to hear about the hassle -- this comes up from time to time. I'd disavow (& maybe send me the list). https://t.co/T0GkkI4p5j

Ranking "for a competitive query" usually takes lots of good backlinks, but for some reason, Gary doesn't want to say it out loud. Instead, he suggests there's nothing else you need to do but add some optimized images and videos to your website.

And the icing on the cake was this reply to one of the questions in the thread:

Possibly it's just me and I don't fully get it? Maybe he replied positively because he meant: yes, for sure you can easily rank without any backlinks. On the 9th page of search results. Logged in to your Google account. Or he was in an "everything's possible" mood that day, I don't know.

I think most of you would agree with me that we can't cross out links once and for all from the SEO strategy. Links have been the most precious SEO asset for many years (72% of SEOs believe backlinks are a significant ranking factor), but for some reason, people at Google seem to be severely allergic to the term "link building".

The bottom line

In February, Maile Ohye of Google posted a "How to hire an SEO" video. The best part for me was the moment when Maile said: "In most cases, SEOs need four months to a year to help your business first implement improvements and then see potential benefit."

Why am I so enthusiastic about it? Well, if Google starts "protecting" SEOs from clients, that means the company has become more favorably disposed towards them. Secondly, that's probably the first time Googlers have said how long it really takes to see ranking results after optimizing your website. And the mentioned timeframe is absolutely realistic to me; that's what I've seen myself over the course of my SEO career.

I hope we'll see Google being even more open and cooperative in the coming years. In the long run, it'll benefit everyone — the company, users, and search engine specialists.

Did I miss any important SEO warning or tip issued by Google this year? And what do you think about these warnings — do you always take them seriously or just keep working the usual way? I'd love to hear your thoughts in the comments!