Thursday, March 19, 2015

Search engine optimization has changed significantly in recent years, driven in large part by updates to Google's guidelines. As you move into 2015 and look to grow your website's organic traffic, which on-page SEO techniques will matter most? Here's a list of on-page techniques you are probably already using that remain important in 2015, along with some newer developments in SEO you should consider in your strategy.

Include these On-Page SEO Tips in Your 2015 Strategy:

1. Optimize Your Site Page Around One Keyword or Topic

The days of keyword "stuffing" are over, but you still need to keep your site pages optimized around one central idea and keyword. Keywords should appear in important on-page elements like the page title, heading, image alt text, and naturally throughout the page copy, but you should still craft each of these items for humans, not search engines.
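As a rough illustration of that checklist, a small stdlib-only sketch can report where a target keyword appears on a page. The class and function names here are my own invention, not any standard tool, and it only covers the elements named above:

```python
from html.parser import HTMLParser

class KeywordAudit(HTMLParser):
    """Collects the page title, headings, and image alt text."""
    def __init__(self):
        super().__init__()
        self._current = None
        self.title = ""
        self.headings = []
        self.alt_texts = []

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1", "h2"):
            self._current = tag
        elif tag == "img":
            alt = dict(attrs).get("alt")
            if alt:
                self.alt_texts.append(alt)

    def handle_endtag(self, tag):
        self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current in ("h1", "h2"):
            self.headings.append(data.strip())

def keyword_report(html, keyword):
    """Report which on-page elements contain the keyword."""
    audit = KeywordAudit()
    audit.feed(html)
    kw = keyword.lower()
    return {
        "title": kw in audit.title.lower(),
        "heading": any(kw in h.lower() for h in audit.headings),
        "alt_text": any(kw in a.lower() for a in audit.alt_texts),
    }
```

Running `keyword_report(page_html, "inbound marketing tactics")` flags any element still missing the keyword; the copy itself should, of course, still be written for humans first.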

2. Remember that Keywords Are Important But Not Verbatim

Google announced in 2014 that its paid search service, AdWords, would no longer rely solely on exact match keywords but would also match close variants of a keyword, so it is likely that the same holds true for organic search, although Google has not explicitly said so. Keywords no longer need to match the exact variation shown in your keyword tool. For example, under AdWords' new targeting, the plural keyword "inbound marketing tactics" is treated as equivalent to the singular "inbound marketing tactic". Likewise, even if a searcher misspells a query, Google will still help them find your website despite the variation from the keyword your site is optimized for.

3. URL Structure Should Be Short, Descriptive and Help to Categorize Your Website

A URL is one of the first things a search engine uses to evaluate a page, which is why it is important to make your URLs easy to crawl. You can do this by keeping URLs short (which is also beneficial for UX), aligning them with the page's topic and keyword, and ensuring that your URLs help categorize your site pages.
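A minimal sketch of that advice: build short, descriptive slugs from page titles and prefix them with a category path. The helper names and the five-word cap are my own illustrative choices, not a standard convention:

```python
import re

def slugify(title, max_words=5):
    """Build a short, descriptive URL slug from a page title.
    max_words is an arbitrary cap to keep the URL brief."""
    words = re.sub(r"[^a-z0-9\s-]", "", title.lower()).split()
    return "-".join(words[:max_words])

def page_url(category, title):
    """Prefix the slug with a category so URLs group related pages."""
    return f"/{category}/{slugify(title)}"
```

For example, `page_url("seo", "On-Page SEO Techniques!")` yields `/seo/on-page-seo-techniques`, a URL that is short, matches the page topic, and files the page under a site section.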

4. Optimize Page Titles

A title tag is what search engines display for a page in search results, and it also appears at the top of your browser tab. Title tags tell search engines and searchers what the page is about. Since Google will only display roughly 50-60 characters of a title tag, keep yours under 55 characters and try to drive people to click with compelling copy. Put keywords or topics toward the front of the title.
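Those guidelines are easy to lint automatically. Here is a small sketch (function name and thresholds are my own, following the 55-character rule and keyword-up-front advice above):

```python
MAX_TITLE_CHARS = 55  # conservative cutoff, per the ~50-60 character display range

def title_issues(title, keyword):
    """Return a list of problems with a proposed title tag."""
    issues = []
    if len(title) > MAX_TITLE_CHARS:
        issues.append(f"too long: {len(title)} characters")
    pos = title.lower().find(keyword.lower())
    if pos == -1:
        issues.append("keyword missing")
    elif pos > len(title) // 2:
        issues.append("keyword appears late in the title")
    return issues
```

An empty list means the title is within the display budget and leads with its keyword; anything else points at which guideline was broken.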

Wednesday, March 18, 2015

No SEO means no visitors from search engines. If you don’t do it then search engines can’t categorise and rank your site for keywords relevant to your business.

Both on-site SEO and off-site SEO are required. You can’t achieve good results doing one without the other.

Start doing SEO now. The longer you leave it to start, the further ahead your competitors will be, and the harder it becomes to rank higher than them.

Know your competition. Find out what the sites ranking on the 1st page for the keywords that you want to rank for have done, on-site and off-site, to get there.

No two websites are the same. An SEO strategy that worked for someone else’s site isn’t guaranteed to work for yours because there are so many variables.

SEO doesn’t have to be expensive. You can get big results on a small budget if you invest time in creating good content and building online relationships.

SEO results aren’t instant. The results of SEO work done today might not become apparent, and might not be credited by search engines, for weeks, or even months.

The newer your website is, the more patient you will need to be. It takes time to build authority and trust, and until you’ve developed both, you shouldn’t expect to outrank older, more established sites.

Never consider your website to be finished. If you want your site to continue to rank higher, attract more visitors and make more sales, then you should always be adding to and improving it.

Adapt to algorithm updates. To attain and retain good rankings you need to adapt your SEO strategy as search engines evolve over time.

Tuesday, April 1, 2014

One trend we've seen in the past year or so is that link builders are now extremely busy with link removals. Getting links to your website has never been easy; it takes real skill to get webmasters to even read your link requests. But getting a link removed is often harder.

Why is it harder? Well, asking for a link is a positive thing, and people like positive things: you tell them they have a great site, that they are wonderful, and so on. Link removal requests are negative, and people do not like negative things. You are asking a webmaster to remove a link to your site because Google thinks it is low quality. How dare you tell a webmaster their site is low quality!

That being said, some webmasters and SEOs are getting creative with link removal requests.

As I covered at Search Engine Land, the diamond shop Brilliance did just that. They sent emails with a pirate theme and hoped webmasters would take the bait. Some did. Here is the email they sent out:

Shai Barel of Brilliance said it worked well and even shared three email responses from webmasters who complied with the link removal request. Some of those replies were fun as well:

Ahoy Ye Matey website removed.

I appreciate the way you approached bloggers about this so I will waive my typical fee and remove the link.

The other day, Steve Plunkett asked Matt Cutts of Google what percentage of penalties/filters are based on violations of Google's Quality Guidelines versus technology issues or bugs on the webmaster side.

Matt Cutts responded that they are "almost all for quality violations." Matt added that "violations of tech guidelines typically just result in those pages being pruned" instead.

I don't think this comes as a surprise to anyone, but it is good to have in writing.

Google has a new design, and with it the displayed length of your title tag, the blue clickable link in Google's search results, may be affected.

Pete Meyers from Moz posted New Title Tag Guidelines & Preview Tool. He says the new title tag cutoff is not fixed at 55 or any specific length, but ranges between 42 and 68 characters depending on Google's algorithm.

Here is the distribution chart showing how likely a specific character length is to display in full:

I should note, this has no impact on rankings. Just because Google cuts off your title tag doesn't mean it isn't used in its entirety for ranking. But it does mean that your title tags may be less click-friendly.
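The character range varies because the cutoff depends on how wide the title renders, not on a character count, so narrow letters buy you more room than wide ones. A rough sketch of that idea follows; the per-character pixel widths and the ~512px budget are illustrative estimates of my own, not measured values:

```python
# Very rough per-character width classes for Google's enlarged title font.
# These numbers are illustrative estimates, not measured values.
NARROW = set("iljtf ")   # narrow glyphs and spaces
WIDE = set("mwMW")       # wide glyphs
NARROW_PX, DEFAULT_PX, WIDE_PX = 5, 9, 15
PIXEL_BUDGET = 512       # assumed title container width

def estimated_width(title):
    """Estimate the rendered pixel width of a title."""
    return sum(
        NARROW_PX if c in NARROW else WIDE_PX if c in WIDE else DEFAULT_PX
        for c in title
    )

def likely_truncated(title):
    """True if the title probably exceeds the display container."""
    return estimated_width(title) > PIXEL_BUDGET
```

This is why a title full of i's and l's can survive at 68 characters while one full of m's and w's gets cut near 42: the budget is spatial, not textual.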

A WebmasterWorld thread is seeing a lot of questions about the title tag changes.

As martinibuster put it:

This is just the display of the Title tag, not the consumption of it by the algorithm. So, as you already noted, a shorter title may be more general. I'm not changing anything for Google. If a slightly longer title makes sense then that's what's going in the title.

This doesn't really change anything for me, though. I don't do exact match longtail titles. Prefer to match it generally in the title and more exactly in the text of the content.

Google announced the Index Status reports within Google Webmaster Tools now lets you differentiate between HTTP, HTTPS and subdirectories.

Google's John Mueller said "If you're a data-driven SEO (or just love to see how your site's indexed), you'll love this change :). In Webmaster Tools, we've now made it possible to differentiate the "index status" information for http / https as well as for subdirectories." Zineb explained "you can now see index status data for your HTTPS sites and subdirectories."

You will see on the report an "update" line that will convey when the reporting changed to handle this.

As of March 9, 2014, the Index Status reflects the data of your specific protocol and site combination as it is verified in Webmaster Tools (i.e. distinguishing www and https variations).

Here are the technical details:

We do not show aggregate data for all versions of your site. While Google crawls and indexes content from your site regardless of whether you have verified the site in Webmaster Tools, the number of indexed URLs reported in Index status are specific to those associated with your site version.

For example, suppose you have a site with 10 URLs that people can view without signing in, and 100 URLs that people can only see once they sign into your site. If you have added only one version of your site to Webmaster Tools (e.g. http://www.example.com), you would see Index status totals only for the non-secure portion of your site, which would be a much lower number than for all URLs on your site.

Therefore, in order to see the index count for your secure site, you will need to add it to Webmaster Tools (e.g. https://www.example.com) and then select it from the Site Selector.

Similarly, you can verify a subdirectory of your site with Webmaster Tools, and only data for that subdirectory will be shown in its Index status (www.example.com/blog/). However, the top-level domain will continue to reflect the total count of URLs indexed for that domain.

Google is now testing product images within the free/organic listings. Google has had product images in the AdWords and Google Product Listing Ads for a long time but it is rare to see Google test showing product images within the free listings.

The picture above is from a Moz Help thread, where the user noticed it and asked "Has Google put anything out about this?" Not that I know of.

The listing is actually from an advertiser, and the image matches the product images used in the AdWords PLAs shown above. Do you think Google is using that ad data to power the image shown in the organic listings? Or is it more of a schema thing, where Google is testing a new rich snippet?

Google's Matt Cutts, depicted here in an April Fools style animated GIF, posted a video on how Google goes about evaluating new search algorithms.

I summarized the three basic steps at Search Engine Land including (1) quality raters metrics, (2) live test metrics and (3) search quality launch team final review. You can watch the full video or read my summary there to learn more.

The interesting part, to me at least, was when he talked about how more clicks on a specific search result set typically mean higher quality results, except when it comes to webspam. Results with spam typically see a higher click-through rate.

So that does make evaluating some algorithm changes harder, but according to Matt, Google is pretty good at separating the spam from the good results, and they weed those outliers out pretty fast.

Wednesday, March 26, 2014

The truth is, for an experienced SEO, this video sheds no new light on the question of determining whether your site was hit by an algorithm.

In short, the best way to tell if you were hit by a Google algorithm such as Panda or Penguin is to check your analytics for a major dive in traffic from Google on a specific day. If there is one, write down that date, go to our Google updates section, and see if the date corresponds with anything reported there. If not, then you are out of luck. Well, not exactly.

(1) Manual actions show a notification in Google Webmaster Tools, so it is clear cut, Matt said.

(2) Crawl errors also are likely to show in Google Webmaster Tools, often clear cut also.

(3) Algorithmic actions are not thought of by Google as penalties; they are ranking algorithms. General quality and the algorithms determine rankings, so it is hard to tell if an algorithm is hurting you. But Google will communicate large-scale algorithm changes, such as Panda or Penguin, and tell you the date they run; this way you can check the date and see if that algorithm had an impact on your site.

But as you improve your site and the algorithms run, your rankings can improve.

People (especially newbies) having trouble making money online should remember that most things are interconnected. For example, if you publish poor content, it will lead to weak link development, because no one likes linking to poor content. There are ripple effects when working on different parts of your site.

Thursday, March 20, 2014

Google's recent SERP redesign may not seem like a big deal to the casual observer, but at least one change could have a real impact on SEOs. This post will explore the impact of the redesign on title tags, and define a new, data-driven length limit, but first, a new tool...

Title tag preview tool (2014 edition)

Pardon the reverse order of this post, but we wanted to put the tool first for repeat visitors. Just enter your title and the search query keywords (for highlighting) below to preview your result in the redesign:

[Interactive preview widget: enter your full title text and an optional search phrase to see a simulated result snippet. The font and size of the description have not changed in the latest redesign; descriptions still get cut off after roughly 160 characters.]

Note: Enter keyword phrases as natural queries, without commas. This preview tool only highlights exact-match text (not related concepts) and is only intended as an approximation of actual Google results.

How the redesign impacts titles

Google's redesign increased the font size of result titles, while keeping the overall container the same size. Look at the following search result both before and after the redesign:

The title on the top (old design) has a small amount of room to spare. After the redesign (bottom), it's lost six full characters. The old guidelines no longer apply, and so the rest of this post is an attempt to create a new set of guidelines for title tag length based on data from real SERPs.

Tuesday, March 18, 2014

Over the past week or two there have been some people suggesting that Google does not manually review every single reconsideration request.

A new Google Webmaster Help thread has one such complaint, but the truth is, at least from what we are told, Google employees review 100% of all reconsideration requests.

I have been told that directly by Googlers, and Matt Cutts did a video on the topic a couple of years ago. That was before Google moved reconsideration requests into the manual action viewer; now all reconsideration requests have to be submitted via the manual action section, and thus all are reviewed by humans.

They might have some templated responses they use, but humans do click on, read, and paste the responses.

A WebmasterWorld thread links to a story that says Google has recently begun defaulting to Google's secure, encrypted search worldwide.

Here is a statement from Niki Christoff, Google's Director of Corporate Communications:

The revelations of this past summer underscored our need to strengthen our networks. Among the many improvements we've made in recent months is to encrypt Google Search by default around the world. This builds on our work over the past few years to increase the number of our services that are encrypted by default and encourage the industry to adopt stronger security standards.

Honestly, I thought Google's secure search was default globally already based on my 93% not provided count. But I guess, it will soon be 100%.

Yes, as Google defaults all search to SSL, it will strip out the referral query data, and marketers will lose out as well. Most don't care if it goes global because most have already lost 90%+.

Google penalized a few more international link networks on Friday afternoon. Google went after, as promised, Italian and Spanish networks and those who participated in them, as well as a couple more German link networks.

Earlier in the day on Friday, Matt Cutts, Google's head of search spam, tweeted that Google has "taken action" on another German link network, this one named efamous, plus a German agency network. This is the second time in almost two months that Google booted a German link agency and network. The thing is, efamous looks pretty legit, but I guess behind the scenes, in Google's mind, it was not?

Later in the day, and this may be somewhat historic because Matt Cutts didn't announce it first, Google penalized an Italian and a Spanish link network. Giacomo Gnecchi, a Google search quality analyst who has been with Google for about four years, tweeted it in Italian; much later, Matt Cutts retweeted it and then posted a translated version on Twitter. This shouldn't be a surprise, because Matt warned about it days before.

I am pretty sure we have covered this before, but the message is not getting out: if someone puts your site in their disavow link file, it will NOT have a negative impact on your rankings.

There are many link spammers trying to get their links removed from sites, and some are using very threatening emails and messages to pressure those sites into removing the links. One example is in a thread at Google Webmaster Help.

Here is part of the message:

We would like to bring your notice that failure to remove these links would require us to file a "Disavow Links" report with Google. Once we submit this report to Google, they may "flag" your site as "spammy" or otherwise if anything is not in compliance with their guidelines. The last thing we want is to have another web master go through this grief!

John Mueller from Google responds to the concern saying:

They are wrong. Having URLs from your website submitted in their disavow file will not cause any problems for your website. One might assume that they are just trying to pressure you. If the comment links they pointed to you are comment-spam that was left by them (or by someone working in their name) on your website, perhaps they are willing to help cover the work involved in cleaning their spam up?

I love how he outright calls them wrong and then goes on and suggests they pay up to remove the link they placed on their site. It is like throwing it back in their face and using a tactic the spammer would have used.

A common question large or e-commerce sites have to ask themselves is what to do about product pages that are either temporarily or permanently out of stock. It is a question we have asked a couple of times; in fact, Google's John Mueller gave his two cents in 2008.

Well, now, Google's Matt Cutts created a short video answering what you should do in three different situations.

Friday, March 7, 2014

Google is upgrading more and more businesses to the new Google local business dashboard, and during that process it is running into issues with old listings being duplicates.

Google is sending emails to affected business owners with instructions on how to repair the duplicate issue and move forward with the upgrade.

Jade Wang from Google posted a snippet of the email in the Google Business Help forums and then described the issues and how to address them in detail.

The email reads:

We'd like to inform you that Google Places no longer accommodates more than one authorized owner per business location. Your account contains one or more listings that have been identified as duplicates of other listings and as a result, some of the information you provide will not be shown to Google users anymore...

There are two situations that can trigger this:

(1) Your account and another account that you don't control became verified for the same business using the old Places dashboard.

(2) You may have verified the page multiple times using accounts you control.

In each case, Jade describes how to repair the issue so you can continue with your upgrade.

Google Webmaster Tools often shows an image preview of your site's home page in the dashboard listing view of your site profiles.

What if the image of your site is wrong? Should you be concerned? Google's John Mueller implies it is really not something to be too concerned about.

In a Google Webmaster Help thread, one person was concerned because his mobile site was displayed in the preview and not his main site. John said, don't "give too much weight to these preview images."

I wouldn't give too much weight to these preview images -- they're not meant to be a representation of how Googlebot crawls the page. If you do see something wrong there, I'd still follow up to see where it came from, but it's not something that would be affecting how we index & rank your website. From what I can tell here, we do recognize your website appropriately, and can pick up the smartphone version. Since the homepage has a very different response size depending on the version that you serve, one way to dig into the details could be to check your server logs, comparing size with the user-agent that made the request, to double-check that your real users get the correct version.

These previews don't represent what Googlebot crawls or your indexing and ranking. It is likely just a style thing for Webmaster Tools.

He said he has used this cheat sheet for "some time," and that if you take it step by step in a careful and detailed manner, "this information should work for you as well."

Here are the instructions; for the cheat sheet itself, go to WebmasterWorld.

First, complete the following questionnaire BEFORE you begin to submit your website to the search engines, review websites, and other citation sites. Some of these questions may not apply to your business; if that is the case, just leave them blank.

Be accurate and thorough. The most important thing to remember is to BE CONSISTENT! All of your submissions must have identical information, or you will not get the search engine rankings you need to improve your business.

After you have completed the questionnaire, use it and the information it asked you to gather to submit your business to the 25 sites listed below it.