A few months back, I was trying to figure out just how to replicate the Olive Garden’s insanely amazing salad dressing, and the first thing I did was look at a copycat recipes cookbook given to me by my mother. The second thing I did was wonder why, as a person who spends so much time online, I didn’t think to just look it up online.

When I did, I was quite surprised to learn that Olive Garden puts recipes for their products on their site. I decided to see if some other big chains did the same thing, so I chose one of the most popular chains in town, P.F. Chang’s, and took a look. No recipes. Interesting, right?

Why Should You Care?

Let’s look at the searches for recipes so you’ll see why this was such a clever move.

A search for “Olive Garden recipes” returns over 6 million results, and guess who’s number one? That’s right: Olive Garden.

A search for “PF Changs recipes” gives you 244k results, and guess where P.F. Chang’s ranks? Number 6.

However, let’s look at just one more URL ranked above the P.F. Chang’s site so you can see why this matters:

The fifth result, a page from Damn Delicious, has 94 linking domains — 94 unique linking domains! Heck, some sites that do well online barely have half that number.

Wouldn’t you rather have those links than not have them?

For Olive Garden’s recipes page, there are 119 unique linking domains. Those are 119 domains that link to the site but wouldn’t if that recipes page didn’t exist. Olive Garden is simply putting resources it already has to work, earning the links it deserves.

Now, if you offered a product, and a search query for it returned a copycat version on someone else’s site, wouldn’t you think, “Maybe I should have this info on my own site!”?

Sure, you can argue that giving away that info might make customers less likely to frequent a restaurant. I could point out that knowing how to make their fettuccine might mean I don’t go there for it – but then I’d remember that I am a busy person, and I really don’t feel like giving up all my guilty pleasures.

An Industry Spawned From Another One

Copycat recipes are big — big enough that they spawn cookbooks and entire sites devoted to them. Why ignore the chance to grab that market share?

California Pizza Kitchen doesn’t list its recipes online, but it does have its own cookbooks on Amazon (and a landing page designed to take you there). With that, they’re ranking number one for the term “California Pizza Kitchen recipes,” as you can see in the image above.

(They don’t have many unique linking domains pointing to that page, but they’re still number one in Google!)

How Does This Apply To You?

You probably aren’t the Olive Garden or P.F. Chang’s (or California Pizza Kitchen), so you may be wondering how all this applies to you.

The point here is that, no matter what your industry, you could be failing to capture traffic that is rightfully yours. Try asking yourself these questions:

What do I have?

What do people want from me?

How can I give it to them?

Use the keyword tool of your choice to see which keywords people are searching that mention your brand, product or service. From there, figure out which ones you’re not addressing with existing content. (Make sure you get an idea of their potential by looking at search volume, too.)

Check what topics related to your business are being talked about across the web. I love IceRocket for this – you can search blogs, Twitter, Facebook, or all of those at once.

Twitter trends can be useful if you happen to follow a lot of people interested in the same things you offer. Otherwise, you probably won’t see much that’s helpful. You can see trends on your Twitter homepage. There are also loads of services that surface Twitter trends (too many to mention, since there’s always a new one).

Google Trends can also be very useful, and you can subscribe to get updates about your searches.

Have alerts set up for your brand name and major products/services. Talkwalker Alerts is my favorite, but I also use Google Alerts.

Look at what people are asking questions about. FAQ Fox is amazing (you can download a spreadsheet of data!), but you can also look at Quora, Yahoo Answers, and many others.

Look at queries that lead to your site. Look at your rankings for those queries. If you have decent traffic and rankings from queries for which you don’t yet have entirely relevant content, then create it!

What I love best about Olive Garden’s move is that they didn’t view this as taking away from their main offering, which is in-house and takeout dining at their restaurants. From my point of view as a link builder, it’s absolutely brilliant as they’re owning the traffic (and links) for a phrase that did not previously come to them — and doing so in a 100% organic and relevant way.

Moz Open Site Explorer Adds Spam Analysis For Risky Links

Moz has announced a new feature in its Open Site Explorer link analysis tool: paid subscribers can now view a new section called “Spam Analysis.”

Spam Analysis within Open Site Explorer aims to document and uncover the riskier, penalty-prone links pointing to your website. It even goes as far as helping you build a disavow file to upload to Google.

Here are some screenshots shared on the Moz blog showing the reports:

The tool uses 17 different spam flag classifications; the more flags associated with a site, the higher Moz classifies its risk. What are the flags? Here are some, as currently defined by Moz:

Low mozTrust to mozRank ratio: Sites with low mozTrust compared to mozRank are likely to be spam.

Large site with few links: Large sites with many pages tend to also have many links and large sites without a corresponding large number of links are likely to be spam.

Site link diversity is low: If a large percentage of links to a site are from a few domains it is likely to be spam.

Ratio of followed to nofollowed subdomains/domains (two separate flags): Sites with a large number of followed links relative to nofollowed are likely to be spam.

Small proportion of branded links (anchor text): Organically occurring links tend to contain a disproportionate amount of branded keywords. If a site does not have a lot of branded anchor text, it’s a signal the links are not organic.

Thin content: If a site has a relatively small ratio of content to navigation chrome it’s likely to be spam.

Site mark-up is abnormally small: Non-spam sites tend to invest in rich user experiences with CSS, Javascript and extensive mark-up. Accordingly, a large ratio of text to mark-up is a spam signal.

Large number of external links: A site with a large number of external links may look spammy.

Low number of internal links: Real sites tend to link heavily to themselves via internal navigation and a relative lack of internal links is a spam signal.

Anchor text-heavy page: Sites with a lot of anchor text are more likely to be spam than those with more content and fewer links.

External links in navigation: Spam sites may hide external links in the sidebar or footer.

No contact info: Real sites prominently display their social and other contact information.

Low number of pages found: A site with only one or a few pages is more likely to be spam than one with many pages.
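Moz hasn’t published its exact scoring formula, but the flag-counting idea behind the list above can be sketched in a few lines. The flag names and the unweighted scoring here are purely hypothetical, just to illustrate how “more flags = more risk” might work:

```python
# Toy illustration of flag-based spam scoring (NOT Moz's actual model):
# each triggered flag raises the score; more flags = higher assumed risk.

SPAM_FLAGS = {
    "low_trust_to_rank_ratio": "Low mozTrust to mozRank ratio",
    "large_site_few_links": "Large site with few links",
    "low_link_diversity": "Site link diversity is low",
    "thin_content": "Thin content",
    "no_contact_info": "No contact info",
}

def spam_score(triggered: set) -> float:
    """Return the fraction of known flags a site triggers, 0.0 to 1.0."""
    return len(triggered & SPAM_FLAGS.keys()) / len(SPAM_FLAGS)

site_flags = {"thin_content", "no_contact_info"}
print(f"{spam_score(site_flags):.0%}")  # 40%
```

A real model would weight flags differently (a low mozTrust ratio presumably matters more than missing contact info), but the monotone idea is the same: the more flags, the riskier the classification.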

Authorship Is Dead; Long Live Authorship

March 30, 2015 – Google’s authorship program may be over, but columnist Eric Enge notes it’s still important to work on building authority.

On August 28, 2014, Google ended support for authorship markup (aka rel=author) and announced that they were no longer going to look at that data. Mark Traphagen and I helped break the news right here on Search Engine Land. This marked the end of a three-year experiment that Google announced in June 2011.

Does this mean that author authority no longer matters? Not at all! In today’s post, I will explain why you should still be actively cultivating that authority and discuss the benefits you can get from it.

Why Did Google Kill Support For Rel=Author?

If Google was hoping for mass-scale implementation of rel=author tags all over the web, it didn’t come close to happening. Not only did the great majority of authors and sites not participate, but even among those who tried to implement the tags, a significant percentage did so incorrectly.

As a consequence, the impact of the authorship tags ended up being quite limited. This is important, as Google had to implement special algorithms to scan for these tags and to support special search features, such as author photos. The benefit they were getting from the program was not enough to cover the expense of supporting it.

But is that all there was to it? Or did Google use the three years of authorship tagging data to help tune its own algorithms for recognizing and tracking authors? Then, once those algos were tuned, they simply shut the feature off?

To explain that in more detail, Google could have used authorship tagging as a way to train algorithms for identifying authors. This could work by running two algos in parallel: one that read the rel=author tagging, and a second that tried to identify the authors without use of the tagging. Then they could compare the results of the two algorithms and use the rel=author based one to tune the other.

My guess is that this is not what happened, for the following reason: If Google really wanted to train such an algorithm, it would not be a difficult project for them to go through and manually identify thousands of authors and then use that data to test and train their algo to better recognize authors automatically. This would save them from having to launch a public program around Authorship, and it would provide more accurate info than relying on third parties to implement rel=author correctly.

The bottom line for me on this “debate” is that I think Google wanted author tagging to work, and thought it would benefit users, but it did not succeed on either score.

So Why Build Your Author Authority?

There are many reasons for doing so; they just may not be Google-related. Here are my top eight:

1. It’s an awesome way to build reputation and visibility: As my colleague Mark Traphagen likes to say, “A Personal Brand is Very Powerful.” Indeed it is. People like to connect with other people, and a personal brand opens the door to many unique opportunities.

2. Your Content Will Attract More Links: Yes, when someone with a personal brand publishes something, some of the links come automatically. In addition, other higher-profile people are more likely to take the time to read what you have written.

3. Your Content Will Attract More Social Shares: This helps get the content in front of more eyeballs and helps you further accelerate the growth of your brand.

4. It Will Help Accelerate Your Content Marketing Program: High-end content marketing programs are all about getting exposure in the right places, much like traditional marketing and PR campaigns.

5. It’s Easy to Pitch Known Authors: Media and bloggers will be faster to accept pitches from known authors. It’s an instant credibility builder when you can point someone to articles on other prominent sites, or that have high volumes of legit social shares.

6. Fan Base Growth Accelerates as Your Following Gets Larger: This is easy to see in the social media world. People with larger social followings tend to add more new followers per day (provided they are active). Why? See points 2 and 3 above. People are more likely to share their stuff and get them exposure to new people.

7. New Opportunities Present Themselves More Often: People will be more likely to ask you to speak at conferences, or accept your speaking pitches. Media/bloggers may contact you and ask to interview you. They may ask you for quotes on news, and so forth.

8. Personalization Via Social Media Connections: Currently, this only works via Google+, but if someone is connected with you there, content you write about is more likely to show up higher in the Google SERPs than otherwise. Will the new Google-Twitter deal extend personalization to Twitter as well? Nothing has been said about this by either party, but in my view, it’s a possibility.

That’s eight rock solid reasons right there, but…

What About Author Authority As A Ranking Factor?

Well, I don’t know, and I can only speculate. I am not convinced that Google would ever support the concept of Author Authority as a generic ranking factor except in the following scenarios:

The top two or three authors in a given market space may get a generic ranking boost for their content. In the search marketing space, examples would be Danny Sullivan and Rand Fishkin.

They could implement an extended form of personalization. For example, if I regularly visit articles by Bill Slawski and AJ Kohn, perhaps they will start to show other articles from those authors higher in the results for me.

The reason for my thinking here is that it’s very hard to classify authority based solely on identifying authors. For example, in the political arena, one person might think that Stephen Colbert is an authority, while someone else may think it’s Rush Limbaugh. Those are pretty different people, and whether someone accepts each as an authority is quite subjective.

Summary

There are tons of reasons to think about building Author Authority. Google and Bing do not have to be on the list. The direct impact on your business is already quite powerful. In addition, the indirect impact on your SEO strategy is awesome too. Here is an example of what happened to our site traffic last fall:

Traffic has stayed up ever since, and much of that traffic is going to pages unrelated to the content that we published that caused those two spikes on the chart.

Much of the SEO game is now about building content and a user experience that’s so good that people will get upset if Google no longer shows it in the SERPs. At the same time, you should build your online presence to a point where Google is not your sole significant source of traffic. Ironically, if you do these two things, it will probably do wonders for your SEO at the same time.

Since a dramatic peak in 2012, the number of update announcements has been dropping. (That low number for 2015 is my own projection, based on what we’ve seen so far and on what you’re about to read.)

Why?

Two Reasons For Fewer Google Algo Update Announcements

I don’t think that Google will ever kick back, put up their collective feet, and stop updating. So why the drop?

1. Mind Control: Google Wants To Change SEO Behavior

First, Google doesn’t make those big announcements just to ensure that all their friends in the SEO community are on the same happy page. They announce algorithm updates because they want to change SEO behavior.

The war on links is a great example. Google doesn’t want its bots fooled by spammy backlinks, so along comes Penguin. They could have released the algorithm change, and all its subsequent updates, without a word; but, in addition to actually devaluing bad links, they also want black hat SEOs to just stop it already.

In that regard, fewer updates means Google sees less of a need to change our behavior. Have they beaten the SEO world into submission? Whipped us into shape? Or just stopped caring if we’re on board?

Google search results are always getting smarter. The changes in results are becoming deliberately more subtle and more intuitive, such that most users hardly notice them at all. If Google’s algorithms are in the early stages of learning, they can easily make small, undercover changes on the fly.

The number of unnamed and unconfirmed updates bears this out. Just last month, SERP-trackers and webmasters noticed some major shifts in search results, but Google has yet to officially confirm the update. Are those relatively minor updates unannounced because they’re not worth the PR effort, or because Google didn’t intend for us to notice?

If it weren’t for tools like MozCast and other SERP-tracking utilities, many of those updates might go unnoticed. On the other hand, MozCast might make these sneaky updates even easier for Mountain View engineers – they can watch MozCast themselves to make sure their changes stay just below the radar.

This raises the question: How many changes do go unnoticed? Are we catching them all? Is Google’s future in AI learning machines that can make quiet, invisible updates day in and day out?

Google Is Investing Heavily in AI

It’s no secret that Google has a lot of skin in the game when it comes to artificial intelligence. Google has been slowly collecting researchers and developers, scattering its purchases of various teams across continents and years… probably in hopes that no one would notice. Oh, we noticed.

March 2013: Google acquires DNNresearch, a neural network startup out of the University of Toronto, and gets the team refocused on expanding traditional search algorithms.

January 2014: Google acquires DeepMind and sets up the artificial intelligence team to work directly with the Knowledge team on Google’s search algorithms. (They also, almost immediately, set up an AI ethics board – presumably, to save the human race from AI-wrought extinction.)

September 2014: Google expanded research surrounding quantum computing by hiring John Martinis and his research team out of UCSB.

October 2014: Google acqui-hires two teams of AI researchers from Oxford (and announces a partnership with the University) to “enable machines to better understand what users are saying to them.”

Google has put up a pretty penny to build a team of researchers that can push machines to the very edge of artificial intelligence.

Google Is Already Smarter Than We Realize

These new developments may not impose dramatic changes on Google search right away, but not because they aren’t applicable.

Is it possible that the reason we have only seen one major update so far in 2015 (and an unnamed update at that) is that Google no longer needs to launch major algorithm changes? If its AI bots are busily making small, under-the-radar updates every minute of the day, Google might never have to launch (and therefore announce) another update ever again.

The Future Could Get Even Weirder

That the future of algorithm updates is up in the air is somewhat unsettling, but it may only be the beginning.

Google already curates its own medical information, in part to protect us from misinformation I’m sure. (Has anyone else noticed that medical keywords are some of the most expensive keywords AdWords has to offer?)

Next, it might be checking facts for us to determine rankings. No matter where you land on the political spectrum, there is a valid concern that this type of algo could stifle bold new voices in the scientific community.

It could make it more difficult for bright young people to bring about the next revolution in science. After all, most of today’s established science came about because someone challenged the herd mentality of yesterday.

– Jim Purtilo, Associate Professor of Computer Science, University of Maryland

What’s A Marketer to Do?

Even though Google’s tactics are changing, their endgame is not. That means the intersection of SEO and content marketing is still the sweet spot for marketers to focus on:

Identify specific, measurable attributes of high-quality content, and put them into practice. Google’s algorithm updates may be more like ninjas than Goliaths from here on out, but the goal is the same. Keep your eye (and your SEO) on the prize.

Think user intent, not just keywords. The latter are still valuable, but only in the context of the former. Keep improving how you talk to your audience, not how you talk to search engines.

The Certainty Of Change

The future of search is simultaneously settled and uncertain. Google wants to deliver the best user experience possible, but how they do it and what it looks like on SERPs is (still) constantly changing.

And until you can beat its high score on Space Invaders, it doesn’t have to tell you about updates if it doesn’t want to.

What’s more important than writing great content? Making sure people can find it!

While search engine algorithms are getting better at discovering and ranking new content on the web, there are still a lot of things you can do to influence that process. Publishing content to a blog is only the first step in the promotion process. Doing only that doesn’t guarantee anyone will see it. You have to go the extra mile to get your content in front of eyeballs!

Balance Great Content With Search Engine Optimization

Search engine optimization (SEO) is more than just adding keywords to content, though that is certainly an important part of it. SEO is about seeking out and fixing any roadblocks that might be preventing your content from being found. This includes “hidden” site architectural issues, navigation issues, usability issues and, yes, keyword/content issues.

Whether you are engaging an outside SEO consultant or firm, or you’re handling the SEO yourself, you need to make sure you invest appropriate resources (time and money) into uncovering content findability issues. A good SEO practitioner will run your site through a number of tools on a regular basis, looking for things that might cause the search engines to either bypass your content or rate it lower than it deserves.

As you invest the time into proper keyword research and optimization, you will discover that your content is being found more often by searchers looking for what you offer.

Write Compelling, Optimized Title Tags

One of the key elements of optimization is your blog post title tag. The title tag usually supplies the clickable link that appears in the search results, and you only have a limited number of characters (roughly 50-70) to convey your message. Search engines give these tags significant weight when determining how pages should rank, making the title tag the single most important piece of optimizable real estate on a web page.

The Importance Of Title Tags

That’s not to say it’s a magic pill to top rankings. Many other page elements can weigh a good title tag down. But on well-optimized pages, a strategic change to a title tag can often have a near-immediate impact on the page’s ranking.

The title tag is also often used, by default, as the headline when sharing a page socially. These social titles can usually be edited independently of the title tag, and many times it makes sense to do so. However, most people who share your content won’t edit the title before sharing, making the title tag an even more important part of the optimization arsenal.

Ultimately, the title tag has to work for both search engines and visitors. When crafting your title tag, don’t just think in terms of keywords. Think in terms of reader value.
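The rough 50-70 character window mentioned above can be checked automatically before publishing. A minimal sketch, with the caveat that these thresholds are guidelines rather than an official limit (search engines actually truncate by pixel width, not character count):

```python
def check_title_length(title: str, lo: int = 50, hi: int = 70) -> str:
    """Flag title tags outside the rough 50-70 character guideline.

    The lo/hi thresholds are assumptions based on the common advice
    above; real-world truncation depends on pixel width, not characters.
    """
    n = len(title)
    if n < lo:
        return f"{n} chars: room to add descriptive keywords"
    if n > hi:
        return f"{n} chars: likely truncated in search results"
    return f"{n} chars: within the typical display window"

# A hypothetical copycat-recipe post title, 60 characters long:
print(check_title_length(
    "10 Copycat Olive Garden Recipes You Can Make At Home Tonight"))
```

A check like this is easy to wire into a publishing workflow so every post title gets a sanity check before it goes live.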

Make Sure Your Content Uses Keywords Properly

When it comes to keyword integration, the content of your blog posts must align with the optimized title tag. If you title a blog post to target a specific keyword, your content should focus on that topic. As such, using the keyword throughout the content would seem natural. But it’s not just about working your keywords into the content. It’s about producing content that is authoritative to the topic you are addressing.

Yes, by creating authoritative content, you’ll use your keyword(s) – but that doesn’t mean you repeat them ad nauseam. Search engines want more than keyword repetition. They want to see that you have a firm grasp of the topic. That means liberally incorporating other words that are frequently found in authoritative documents on the same topic.

If you want to write an authoritative piece on dog food, you’ll do more than repeat the words “dog food” throughout your text. Obviously, the words “food” and “dog” will be used throughout, as will “dog food.” But so will the words “pet,” “nutrition,” “health,” etc.

Look beyond a single keyword phrase when optimizing and think about the topic as a whole. Then work all of this into your content and headings, and even into your blog categories and tags. This will give you more authority points and more opportunity for top exposure.
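The “topic as a whole” idea can be illustrated with a toy coverage check. The related-term list here is hand-picked to match the dog-food example, not output from a real keyword tool:

```python
def topic_coverage(text: str, related_terms: set) -> float:
    """Fraction of topically related terms that appear in the text.

    A deliberately crude sketch: exact word matching only, so "pet's"
    does not count as "pet". Real tools use stemming and co-occurrence
    data mined from authoritative documents on the topic.
    """
    words = set(text.lower().replace(",", " ").split())
    return len(words & related_terms) / len(related_terms)

related = {"dog", "food", "pet", "nutrition", "health"}
post = "Choosing a dog food means weighing your pet's nutrition and health needs."
print(f"{topic_coverage(post, related):.0%}")  # 80%
```

Even a rough score like this can flag a draft that hammers one keyword while ignoring the surrounding vocabulary an authoritative piece would naturally use.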

Optimize Your Images & Alt Attributes

Don’t forget about images! These can be an important avenue for both drawing in traffic (via image search and social shares) and keeping the reader engaged.

The most common way to optimize your images is to add alternate text (aka “alt text” or the “alt attribute”). The alt attribute is contained within the HTML code of the image and is displayed if the image doesn’t load (some browsers also show it when you mouse over the image). The search engines use this alt text as a signal for the image’s content.

You can also add an image title, a caption, and even a description, as well as using appropriate keywords for the image file name. All of these elements can give both visitors and search engines more information about what your image is. Of course, it makes sense for the images to visually reinforce your content. As such, the content you use in these areas should do the same.

Often neglected, images can be a powerful reinforcement for your on-page optimization. The same goes for any video or other content that you add to the page. Look for opportunities to optimize this embedded content to provide a richer experience for both the visitor and the search engine.
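As a quick illustration of auditing a page for missing alt text, here’s a crude regex-based sketch. A production audit would use a real HTML parser rather than regular expressions:

```python
import re

def images_missing_alt(html: str) -> list:
    """Return the <img> tags in an HTML string that lack an alt attribute.

    Regex matching of HTML is fragile (it won't handle every edge case),
    but it's enough to demonstrate the audit idea.
    """
    imgs = re.findall(r"<img\b[^>]*>", html, re.IGNORECASE)
    return [tag for tag in imgs
            if not re.search(r"\balt\s*=", tag, re.IGNORECASE)]

html = '<img src="dog-food.jpg" alt="bowl of dry dog food"><img src="logo.png">'
print(images_missing_alt(html))  # ['<img src="logo.png">']
```

Run against a full page, a check like this surfaces the images that give search engines no signal at all about their content.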

By following these simple techniques, you can help ensure that the content you work so hard to create actually gets seen.

Note: This information was part of a larger presentation. You can view the entire slide deck here:

Google News Ranking Won’t Be Impacted By The New Mobile-Friendly Algorithm

March 26, 2015 – The Google News ranking algorithm won’t factor in whether a publisher is mobile-friendly, at least not yet.

Everyone is preparing for the mobile-friendly Google algorithm which will touch down April 21st. But Google News publishers don’t have to worry just yet, at least in terms of the ranking in the Google News search results.

Google News community manager Stacie Chan said in a Google+ hangout yesterday that the Google News ranking team has no plans to implement the mobile-friendly algorithm in the Google News results, at least not yet.

As of now, Google News is not committed to making that change yet. We are always exploring because we think it is awesome that sites are trying to be more mobile friendly. I love that. And Google News is very well aware that search is doing that… We are definitely exploring that option as well but don’t have an exact timeline for that or even whether or not we will implement that.

Here is the video:

What I am unclear about is whether the In The News box found in Google’s web search results, which includes both news publishers and non-news publishers, will be covered by this mobile ranking algorithm. That is unclear, but since it is not news-specific (although it sounds like it is), I suspect it will be included.

Google Clarifies The Mobile-Friendly Algorithm Will Roll Out Over A Week, Be A Yes/No Response & More

March 25, 2015 – As we approach the April 21st release date of Google’s mobile-friendly algorithm, we uncover more SEO facts about what to expect.

Roll Out Will Be A Few Days To A Week

We are expecting it (the mobile friendly algorithm) to roll out on April 21st, we don’t have a set time period because it is going to take a couple of days to roll out. Maybe even a week or so.

Your Page Is Mobile-Friendly Or Not

The mobile-friendly algorithm is on-or-off on a page-by-page basis: it is not about how mobile-friendly your pages are, but simply whether each page is mobile-friendly or not. I transcribed this one as well:

As we mentioned in this particular change, you either have a mobile friendly page or not. It is based on the criteria we mentioned earlier, which are small font sizes, tap targets/links to your buttons that are too close together, readable content and your viewport. So if you address all of those and your site is mobile friendly, then you benefit from the ranking change.

But as we mentioned earlier, there are over 200 different factors that determine ranking, so we can’t just give you a yes or no answer with this. It depends on all the other attributes of your site, whether it is providing a great user experience or not. That is the same with desktop search; it’s not isolated to mobile search.
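One of the criteria Google mentions, a properly configured viewport, is easy to pre-check in a crude way before relying on Google’s own tools. This regex sketch only tests for the tag’s presence; it is nothing like Google’s full page evaluation:

```python
import re

def has_viewport_meta(html: str) -> bool:
    """Rudimentary check for one mobile-friendly criterion: a viewport
    meta tag, which tells browsers to adapt the page to the device width.

    Presence of the tag alone doesn't make a page mobile-friendly --
    font sizes, tap targets and readable content matter too.
    """
    return bool(re.search(
        r'<meta[^>]+name=["\']viewport["\'][^>]*>', html, re.IGNORECASE))

page = '<head><meta name="viewport" content="width=device-width, initial-scale=1"></head>'
print(has_viewport_meta(page))  # True
```

For a definitive answer, Google’s own mobile-friendly testing tool (mentioned below) remains the authoritative check.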

How Do You Know You Are Mobile-Friendly

How do you know if your web pages will be mobile-friendly or not? There are a few ways, but Google said the easiest way is to see if your current pages have the mobile-friendly label in the live mobile search results now. If so, the mobile-friendly testing tool should also confirm this. Keep in mind, the mobile usability reports in Webmaster Tools can be delayed by crawl time and general webmaster tools reporting delays.

Take out your phone and look up your web site. See if there is a gray mobile friendly label in your description snippet. If you see it in the search results, that means Google understands that your site is mobile friendly; if you don’t see it, then either we don’t see that your site is mobile friendly or your site is not mobile friendly.

Ready To Go Multilingual Or Multinational With Your Online Business? Ask Yourself These 10 Questions First

March 25, 2015 – For businesses looking to expand into new regions, columnist John Lincoln has created this checklist of things to consider when developing your strategy.

Offering your products, services and information in different languages or in new countries can provide a great deal of benefit. It allows you to reach entirely new audiences, boost traffic, increase revenue, and expand your overall business.

According to a 2006 study by Common Sense Advisory, 72.4% of consumers say they are more inclined to make a purchase when information is in their own language. In addition, 56.2% of customers report that information in their native language is more important than a product’s price.

The benefits of going multilingual/multinational are clear, but the challenge can be great. Before making the move, there are some things to consider first.

1. Have You Accounted For Shipping Rates?

If your website ships something, shipping rates can be a big deal for international business. Make sure you account for the cost of getting your product, people or just your general business offering to the location you are about to launch in.

You need to run the numbers before investing the time in the market. A search engine optimization (SEO) company might be able to get you ranked in another country, but does it make business sense to sell a product there? Expanding your business on an international level can mean a lucrative new avenue, but it may not make financial sense due to high shipping costs.
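Running those numbers can be as simple as a per-unit landed-margin check. A back-of-the-envelope sketch (every figure below is hypothetical, for illustration only):

```python
def landed_margin(price, cost_of_goods, shipping, duties=0.0):
    """Per-unit profit after getting the product to the customer.

    All values in the same currency; the numbers used below are
    made up to show how high shipping can erase a healthy margin."""
    return price - cost_of_goods - shipping - duties

# Hypothetical product: sells for $50, costs $20 to produce.
domestic = landed_margin(50.00, 20.00, shipping=5.00)
international = landed_margin(50.00, 20.00, shipping=22.00, duties=6.00)

print(domestic)       # healthy margin at home
print(international)  # nearly wiped out by shipping and duties
```

If the international margin comes out near zero before you have spent anything on marketing or translation, the market may not be worth entering at that price point.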

2. Will You Offer Backend Support & Customer Service Specifically For The Language Or Country?

When you go into a new market, expect that you will need to offer a comparable customer service experience in that language or region. In some cases, you may even need to offer an even more dedicated experience.

This can mean an onsite knowledge base in that language, along with customer service and sales staff who speak it. In a survey conducted by Rosetta Stone, 89 percent of customers felt that customer satisfaction and loyalty improve when a company serves them in their native language.

3. Who Will Manage Inventory & Keep The New Sections Up-To-Date?

On an e-commerce site, you generally segment your product offering by region. For example, you have all the United States products in the U.S. section, the U.K. in another section, etc. In order to make this happen, you need to have a team manage the inventory.

Furthering this idea, you will want to offer products in specific regions that are profitable after expenses, shipping, etc. Because of this, you need to make sure that the person organizing the products online understands the business model, profit margins and overall company big picture goals.

When moving into other countries and languages, this can get more complex. You need to have the right products, in the right language, in the right country location, at the right price.
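One hypothetical way to model that requirement is a catalog segmented by region, where each listing carries its localized name, local price, and region-specific costs, so unprofitable products can be filtered out per market. All product names and numbers below are invented for illustration:

```python
# Hypothetical region-segmented catalog: product -> per-region listing.
catalog = {
    "widget": {
        "us": {"name": "Widget", "price": 30.00, "unit_cost": 12.00, "shipping": 4.00},
        "uk": {"name": "Widget", "price": 24.00, "unit_cost": 12.00, "shipping": 11.00},
    },
}

def sellable(region, min_margin=5.00):
    """Products worth listing in a region: only those whose margin
    after unit cost and shipping clears a minimum threshold."""
    result = []
    for product, regions in catalog.items():
        listing = regions.get(region)
        if listing is None:
            continue  # product not offered in this region at all
        margin = listing["price"] - listing["unit_cost"] - listing["shipping"]
        if margin >= min_margin:
            result.append(product)
    return result
```

Whoever maintains this data needs to understand the profit margins behind it, not just the product names, which is exactly why this job cannot be handed off blindly.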

From a service offering perspective, these same rules apply. However, they can be a little less complex, depending on the business offering.

4. Do You Have An SEO Who Can Manage The Onsite Optimization Requirements?

Multilingual and multiregional SEO is fairly complex. Outside of doing keyword assignments and optimization and then translating that into different languages, you also need to make sure the content speaks to the demographic.

In addition, there is hreflang tag implementation, the decision between TLDs vs. subdomains vs. directories, and ensuring your current rankings are not lost should you alter your site’s URL structure. It can get even more complicated if you are dealing with certain types of mobile optimization, secure shopping carts (and other areas that are non-secure) and multilingual checkout processes.
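To make the hreflang piece concrete: each language/region version of a page should carry a full set of alternate annotations in its head, including one pointing at itself. A small sketch that generates them, with hypothetical URLs and locale codes:

```python
def hreflang_tags(alternates, x_default=None):
    """Build <link rel="alternate" hreflang=...> tags from a
    {locale: url} mapping. Per Google's hreflang guidance, every
    alternate (including the current page) should carry the full
    set, and an x-default can catch unmatched users."""
    tags = [
        '<link rel="alternate" hreflang="{}" href="{}" />'.format(locale, url)
        for locale, url in sorted(alternates.items())
    ]
    if x_default:
        tags.append(
            '<link rel="alternate" hreflang="x-default" href="{}" />'.format(x_default)
        )
    return "\n".join(tags)

# Hypothetical site with US English, UK English and Spanish versions:
print(hreflang_tags(
    {"en-us": "https://example.com/",
     "en-gb": "https://example.com/uk/",
     "es": "https://example.com/es/"},
    x_default="https://example.com/",
))
```

Getting these annotations wrong (one-way references, bad locale codes) is one of the most common multilingual SEO mistakes, which is part of why an experienced team matters here.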

SEO on a global level requires extensive strategic planning throughout the entire rollout process to promote efficiency and effectiveness. It’s important to have an experienced SEO team in place that is capable of handling multilingual/multinational website requirements.

5. How Will You Get Links In The New Country?

Unless you are a massive brand with high domain authority and the type of presence that will always be linked to, you might run into the issue of creating a great, well-optimized website that has no inbound links from within the new region.

This can be problematic. Even if you perform thorough on-site optimization, your rankings could remain low without inbound links — and thus, your site might never generate the business you were hoping for in the region. Sure, you can use the pages for PPC, but with a few links you could also be ranking higher in organic.

Make sure you have a strategy in each region to build links in a white hat way.

6. Will Moving Into These New Markets Hurt Your Ability To Manage Your Existing Clients?

You never want to spread yourself too thin. It can hurt your existing business, lead to high levels of stress, and leave you with a disorganized business. If you are going to make a big push into another language or region, make sure you will not be losing sight of your existing user base.

As we all know, search engine optimization and other forms of digital marketing are only becoming more competitive. You don’t want to spend a ton of time moving into another country and then, next thing you know, your current operation has fallen behind.

7. Who Will Handle Ongoing Translation & Regional Needs?

Finding a good translation company and website managers who understand regional issues can be tough. It is important to have a partner or staff who really gets it and will continue to be there. Make sure you have some type of plan in place to help maintain your lingual and regional goals.

8. What Will Your Marketing Be Outside Of SEO?

All the things you need in your home region will be needed in any new region you expand to. Multinational/multilingual SEO is important, but it is best if you complement it with digital and offline marketing in the region. This is why you generally see large companies implementing comprehensive global strategies as opposed to just launching new sites.

I have the pleasure of working with many global clients, and often they have radio, TV, social media, email, PR, online ads, you name it, in each of the regions we are targeting. All of that can really complement your SEO strategy and help boost online visibility.

Now, don’t get me wrong — you can be successful with just a general SEO build. I have a client right now with 70% year-over-year traffic growth largely due to their international optimization, but they are a rare case because the product has good buzz.

9. Do You Need A Person On The Ground?

This question is pretty clear and must be answered before making the push. Although sending a company representative to another country may seem costly, the move is beneficial as you’ll have someone directly overseeing business relationships to improve overall functionality and performance.

Should you decide to take this route, have a general idea of how you will make that happen before you launch a service in some area that’s thousands of miles away.

10. Will You Be Able To Offer A Quality Experience Overall?

This is probably the biggest question you must answer, and the one I will leave you with. According to Oracle’s 2011 Customer Experience Impact Report, 89 percent of customers admit that they will stop doing business with a company based on poor customer service. Do you have what it takes to successfully offer your products and services to multiple countries?

Final Thoughts

There are a lot of emerging markets out there in the world. If you can gain better visibility for your website in those markets, you can beat competitors to the location and be the first to establish a presence and sales.

No matter what your online goals are, you should consider multilingual and regional SEO. If you have solid answers to all of these questions, you may be ready to make the move.

Google has quietly added support for moving subdomains, i.e. subdomain.domain.com, to a new domain in the Change of Address tool within Google Webmaster Tools.

If you review the documentation, you will see that as of last night it was updated to include “or subdomain” in the instructions. The new documentation reads:

Use the Change of address tool when your site move entails a domain or subdomain change, such as changing from http://fish.example-petstore.com to http://example.com or http://example-petstore.com.

Note: The tool does not currently support the following kinds of site moves: subdomain name changes, protocol changes (from HTTP to HTTPS), or path-only changes.

Previously, you were unable to communicate to Google in Google Webmaster Tools that you were moving a subdomain to a new domain. Now you can, which is a great addition to the change of address tool. Of course, many still want the change of address tool to support HTTP to HTTPS moves, which it currently does not. In either case, setting up 301 redirects remains an essential step in communicating domain moves.
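The 301 mapping itself is conceptually simple: each old URL should redirect to the same path and query string on the new domain. A hypothetical sketch of that mapping (the actual redirects would live in your server configuration, not application code; the hostnames are the examples from Google's documentation):

```python
from urllib.parse import urlsplit, urlunsplit

def moved_url(old_url, new_host, scheme="http"):
    """Target of the 301 redirect for a domain/subdomain move:
    same path and query string, new domain. Illustrative only --
    real moves are handled by server-level redirect rules."""
    parts = urlsplit(old_url)
    return urlunsplit((scheme, new_host, parts.path, parts.query, ""))

print(moved_url(
    "http://fish.example-petstore.com/care/goldfish?size=small",
    "example.com",
))
```

Pairing server-side redirects like this with the Change of Address tool gives Google two consistent signals about the move.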

Menashe Avramov created a screenshot of the document change, which is embedded in the original post as a PDF.
