FEATURE: 2013 Internet Marketing Forecast by Bruce Clay

2013 will be a year with a faster pace and more innovations in search and social than we've seen yet. For my predictions this year I've focused on three areas of the Internet marketing industry: new spam, new search engine optimization tactics and new tools.

Under Google's crackdown on scraped ranking data, the SEO tools market will see a shakeup and ultimately shrink.

Companies will increase their budget for SEO to cover the rising cost of SEO tools that report ranking data.

Long live rankings. SEO tools that report unbiased rankings will prompt a renewed focus on rankings as a KPI.

In the area of ranking monitoring, we're going to see a continuation of Google making sure people adhere to the terms of service. There will be further spam crackdown on scrapers and we'll see a shift in the tool usage out there because a lot of the tools on the market are utilizing scraping in order to keep the cost down.

The position I've always had on ranking is that ranking was highly valuable before personalization. If I could put in a query and see whether I went up or down, that information was useful. With personalization, every individual searcher has different search results.

It is biased by geographical location. It is biased by Web history. The advantage of an API is that it takes out all the personalization, effectively taking you back to the 10 blue links. As a result, an SEO can see that when a change has been made, the rankings go up or down. It may not correspond to what a user may see, but its support for SEO is great because you can see the cause and effect of SEO changes.

The thing about personalization is that the biasing improves a site's performance in search by filtering out results for those outside of your target market. I may not show up for those outside my target market, but they weren't going to buy from me anyhow. And if I'm showing up as #20 in the API, I may be #10 for someone in my target market, making the API the worst-case scenario for my site's rankings.

Personalization eliminates ambiguity. An API answers the question, "If there is no bias for location or Web history, how do you rank?" And that is a number I need.

Google charges a high rate, making the cost of entry to use its API very high. Google will encourage people to use the API rather than scrape, both because API revenue is a money-maker and because the API gives Google more flexibility to enforce its terms of service. I think Google will find the API to be an anti-spam tactic.

We already know that Google is cracking down on scraping. What we don't know is how big the crackdown is. I think it's going to get bigger, and it's going to force people to use the API if they want ranking data. We're going to see a split in schools of thought. On one side there are going to be people who say ranking doesn't matter — not because it doesn't matter but because they don't want to pay for it. And then there are going to be people who are willing to pay for it and are able to make a difference. We're going to see a shift from free tools toward paid tools because that's the only way you can get access to rankings now.

More companies are going to recognize the value of tools and are going to be willing to pay for them because the APIs will require payment. There will be a shakeout in the tool market. Last year, many of the tools out there reported rankings with scraped data. This year we're going to see more SaaS implementations. And I predict that overall there will be a net decrease in the number of free ranking tools available. The cost of monitoring rankings will increase, but a good tool is worth it.
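At its core, an API-based rank report boils down to fetching the unpersonalized result list for a query and locating your domain's position in it. The sketch below shows only the rank-extraction step; the result list is assumed to come from whatever licensed ranking API you pay for (the fetch itself, endpoint and key are omitted, and `find_rank` is an illustrative helper, not part of any real API).

```python
from urllib.parse import urlparse

def find_rank(results, domain):
    """Return the 1-based position of `domain` in an ordered result list,
    or None if it does not appear.

    `results` is a list of result URLs in ranked order -- the same
    "10 blue links" every caller of the API would see, with no bias
    from location or Web history.
    """
    for position, url in enumerate(results, start=1):
        # Normalize the host: drop a leading "www." and match either the
        # domain itself or any of its subdomains (dot-boundary check).
        netloc = urlparse(url).netloc.removeprefix("www.")
        if netloc == domain or netloc.endswith("." + domain):
            return position
    return None

results = [
    "https://www.example.com/page",
    "https://other.com/",
    "https://blog.bruceclay.com/post",
]
print(find_rank(results, "bruceclay.com"))  # position in the unbiased list
```

Run before and after an SEO change, the same query against the same API yields directly comparable positions, which is exactly the cause-and-effect visibility personalization takes away.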

SEO's Hot New Thing: Knowledge Graph Optimization

By spring, Knowledge Graph will be the SEO buzzword.

Knowledge Graph optimization will be the most popular new strategy and businesses will rush to implement Schema markup on websites.

Structured markup will be considered in tandem with the emphasis on a website's content quality.

Knowledge Graph optimization will be the SEO buzzword of 2013. The Knowledge Graph intersects content at an unprecedented level. It will be a major talking point for Google at the first search marketing conferences of the year. By midyear, most SEOs will be promoting Knowledge Graph optimization tactics and services.

As with many new website markups, people will implement Knowledge Graph Schemas incorrectly at first.

People are going to drill down too deep. Objects on a site should be defined in a siloed structure from the top down, rather than from the bottom up.

What we need to do in order to have a true Knowledge Graph optimization is to understand the things that we need to describe instead of the things we can describe.
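The top-down, siloed approach can be pictured with schema.org markup: describe the page's primary entity first and nest supporting objects beneath it, rather than marking up every leaf item independently. A minimal sketch, assuming the page's main entity is a business with a product it sells (the business name, URL and product are all made up, and JSON-LD is used here as one convenient serialization of schema.org vocabulary):

```python
import json

# Top-down structure: the page's primary entity (the Organization) is the
# root, and supporting objects (a Product it offers) are nested beneath it,
# instead of being described bottom-up as unrelated leaf items.
organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Widgets Inc.",   # hypothetical business
    "url": "https://www.example.com/",
    "makesOffer": {
        "@type": "Offer",
        "itemOffered": {
            "@type": "Product",
            "name": "Deluxe Widget",  # hypothetical product
            "description": "A product described in context, not in isolation.",
        },
    },
}

# Serialized, this is the structured-data block the page would carry.
json_ld = json.dumps(organization, indent=2)
print(json_ld)
```

The nesting itself encodes the silo: a crawler reading this markup learns not just that a Product exists, but whose product it is and why it belongs on this page.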

On the search engine side, the Knowledge Graph is going to be used by the search engines to eliminate ambiguity within the Web page. That gives the search engines a better opportunity to match the Web page to the query. Add to that, the query ambiguity is being eliminated by Web history, making search results more relevant than ever before.

Social Surge: Spam, Tools and Quality Standards

Google will address social as the next area for eradicating spam.

Social media users' tolerance for mediocre content will drop drastically. Content will be elevated to a higher caliber by necessity.

The social analytics and measurement tools market will grow and social platforms will offer higher granularity of metrics reported.

What's the next area of spam Google will try to wipe out? It has to be social since that's the new area of spam. It's the new area of spam because it's Google's next frontier for relevancy signals.

Look at links; as soon as links were important to Google, overnight there were thousands of different ways to spam them. As soon as social is clearly important to Google rankings, there will be thousands of ways to spam social. Google has to find them one at a time, and it could take a year or two for them to find them all. I think social is going to follow the same path as links, and there will be a rapid increase in social spam.

Concurrently, social media users will demand content of a high caliber. In part, this is due to fatigue with promotional and commercial content in social streams. In part, it is due to the influx of social spam. Regardless of the cause, users will reward only the most interesting, unique, entertaining and relevant content with views, likes and shares.

Bounce rates from social sources will rise rapidly. The bounce rate of social media traffic will be so high that it will be a catalyst for businesses to revamp their content. Due to a decreasing tolerance for mediocre content, social communities will compel people and businesses to create higher quality content.

Improvements to social reporting tools from Facebook, Google and Twitter, as well as third-party analytics, make it possible to better track the ROI of social media marketing and participation for brands. Just as budgets for robust and accurate SEO tools grow, so do budgets for social reporting. Paying customers drive demand for ROI reporting and meaningful data, spurring the availability of more granular views of social traffic.

Social analytics brings privacy issues to a head, but marketers demanding attribution will ultimately win. A "global cookie" identification system that can maintain anonymity while tracking a single user, possibly even across devices, will be developed. The system will be able to report a user's sequence of events online while maintaining a level of anonymity that satisfies privacy advocates and users.

Bonus prediction: Someone will turn over the Mayan calendar and discover another 5000 years on the back.
