I think our team has done a pretty good job on this blog over the years of evaluating internet marketing tactics according to their practical usefulness. Our goal: to help marketers put things in perspective and make informed choices. One particular area of focus has been content marketing.

Rather than just pontificate about the latest trend, our approach is to experiment, optimize and share what we learn. For example, we’ve implemented content marketing projects for organizations that range from several Fortune 50 companies to Content Marketing World, the Mac Daddy of content marketing events. That experience has inspired many popular and useful posts during the first half of 2013.

The rise of ecommerce websites has heralded a new era of convenience for consumers, as well as the most powerful tool for retailers since electricity. However, ecommerce sites bring with them a host of SEO challenges that can quash any hopes of being found in organic search.

Fortunately, with a little foresight and a bit of technical knowledge, ecommerce retailers can maximize their SEO potential and literally rise above the competition (in the search results), resulting in more customers and increased sales.

Duplicate Content

The single biggest SEO challenge for ecommerce sites is duplicate content. While nearly every modern website can be subject to this issue, most ecommerce websites have nearly every attribute that typically results in duplicate content.

Social Stereotypes: You Are What You Share

This recent infographic from WIX focuses on social media and the types of things people share and read. Findings include:

71% of tweets are ignored

23% of tweets get a reply

5% of Pinterest users are under the age of 18

97% of executives surveyed used LinkedIn in 2010

23 Tips on How to A/B Test Like A Badass
With all of the tools now available, it has become easy to run an effective A/B test. However, many marketers are still scratching their heads trying to determine how to take their testing to the next level. Via Search Engine Watch.
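For marketers who want to sanity-check a result themselves, the core verdict of a simple A/B test can be sketched as a two-proportion z-test. This is a minimal illustration, not taken from the article above; the visit and conversion numbers are hypothetical, and it uses only the Python standard library:

```python
# A minimal sketch of judging an A/B test with a two-proportion z-test.
# The traffic and conversion numbers below are hypothetical examples.
import math

def z_test(conv_a, n_a, conv_b, n_b):
    """Return the z statistic and two-sided p-value for two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # combined conversion rate
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the normal CDF, via the error function
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical test: variation B converts 5.2% vs. 4.0% for control A.
z, p = z_test(conv_a=200, n_a=5000, conv_b=260, n_b=5000)
print(z, p)
```

With these made-up numbers the p-value comes out well under 0.05, so the lift would be statistically significant; with smaller samples, the same observed lift often is not, which is one reason "next level" testing starts with planning sample sizes.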

7 Effortless Ways to Find New Ideas for Your Blog
Coming up with new and exciting topics for blog content is no easy undertaking. This article shares 7 easy ways to come up with new blog content regularly. Read on to find out more. Via Social Media Examiner.

The way that search engines like Google and Bing handle duplicate content when crawling your sites is not always easy to understand. What exactly are the rules, and what are the ramifications of not following them?

This presentation by a group of industry experts focused on how search engines read your content, as well as the steps you can take to avoid being penalized for duplicate content.

Peter van der Graaf: Redirecting Duplicate Content

We always assume that Google knows best, but is that necessarily true? According to Peter, finding a formula that works 100% of the time is no easy task. He opened with a great example: “If you have a 301 redirect and you tell Google to go left, they’ll probably go right.” You could implement a 301 redirect and, when you request the old location, a cached version of the page may still appear instead of the redirect you intended.
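Peter's point about verifying redirects can be illustrated with a small, self-contained sketch. This is not code from the session: the paths and redirect targets are hypothetical, and in practice you would configure 301s in your web server rather than in application code. It spins up a local server that issues 301s and then requests a URL without following the redirect, which is exactly the check you want when confirming a redirect actually fires:

```python
# A minimal sketch of serving and verifying a 301 redirect with only
# the Python standard library. Paths and targets are hypothetical.
import http.server
import threading
import urllib.error
import urllib.request

# Hypothetical mapping of old (duplicate) URLs to their canonical versions.
REDIRECTS = {
    "/old-product-page": "/products/widget",
}

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in REDIRECTS:
            # 301 tells crawlers the move is permanent, so ranking
            # signals can be consolidated onto the new URL.
            self.send_response(301)
            self.send_header("Location", REDIRECTS[self.path])
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"canonical page")

    def log_message(self, *args):  # keep output quiet
        pass

def check_redirect(port, path):
    """Request a path WITHOUT following redirects; return (status, location)."""
    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, *args, **kwargs):
            return None  # refuse to follow, so we see the raw 301
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(f"http://127.0.0.1:{port}{path}")
        return resp.getcode(), None
    except urllib.error.HTTPError as e:
        return e.code, e.headers.get("Location")

if __name__ == "__main__":
    server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
    port = server.server_address[1]
    threading.Thread(target=server.serve_forever, daemon=True).start()
    print(check_redirect(port, "/old-product-page"))
    server.shutdown()
```

The same "request without following redirects" trick is how you would audit a live site for Peter's cached-version problem: if the old URL returns 200 instead of 301, the redirect is not doing its job.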

The second session on my liveblogging hit list for SES Chicago is “Duplicate Content and Multiple Sites,” moderated by Adam Audette. Unfortunately, Michael Gray was not able to be there, so things started with Susan Moskwa, Webmaster Trends Analyst from Google, and Shari Thurow was added on the fly.

What is duplicate content? Identical or substantially similar content. Also multiple URLs with the same content. Google realizes that duplication can be deliberate or accidental.

Basically, duplicate content in the context of a search engine means publishing different URLs that present the same content.

Why does Google care about duplicate content? Users don’t like to see 10 nearly identical results. Also, there’s no benefit in crawling multiple URLs with the same content. It’s a waste of resources for Googlebot to do that.

More and more website owners are concerned that they might get penalized accidentally or overtly because of duplicate content. For example, if you run mirror sites, will search engines ban you? If you have listings that are similar in nature, is that an issue?

What happens if you syndicate content through RSS? Will other sites be considered the “real” site and rob you of a rightful place in the search results? This Search Engine Strategies San Francisco session looks at the issues and explores solutions.

Did you realize that search engines have come full circle on variables in URLs? It used to be considered something to avoid; now search engines are saying variables in URLs are fine, as long as you use the canonical link tag. Google is pushing them with FeedBurner, and if webmasters aren’t careful, they could fall victim to a new onslaught of duplicate content issues.
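To see what that tag looks like in practice, here is a minimal sketch of pulling the rel="canonical" link element out of a page with the Python standard library, the kind of check you might run when auditing your own pages. The sample HTML and URL are hypothetical:

```python
# A minimal sketch of extracting the rel="canonical" link element,
# which tells search engines which URL is the preferred version of a
# page. Uses only the standard library; the sample HTML is hypothetical.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Record the href of the first <link rel="canonical"> encountered."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical" and self.canonical is None:
            self.canonical = attrs.get("href")

SAMPLE = """
<html><head>
  <title>Widgets</title>
  <link rel="canonical" href="https://www.example.com/widgets/" />
</head><body><h1>Widgets</h1></body></html>
"""

finder = CanonicalFinder()
finder.feed(SAMPLE)
print(finder.canonical)  # the preferred URL declared by the page
```

A page with URL variables can declare the clean, variable-free URL in this tag, which is what lets search engines tolerate the variables in the first place.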

One of the biggest issues in SEO is duplicate content. If search engines can’t tell which version of a document is the original or canonical version, there can be consequences in the form of less than ideal search visibility. For example, several different URLs might all point to the same web page, creating the illusion that they are copies of the same thing. But in reality, it’s just one web page.
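As an illustration of how those variations collapse onto one page, common duplicate-URL forms can be normalized to a single canonical URL. The domain and the normalization rules below are hypothetical examples, not a complete canonicalization policy:

```python
# A minimal sketch of collapsing common duplicate-URL variations onto
# one preferred form, using only the Python standard library.
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Map common variations of a URL onto one canonical form."""
    parts = urlsplit(url)
    netloc = parts.netloc.lower()
    if netloc.startswith("www."):        # hypothetical rule: prefer bare domain
        netloc = netloc[4:]
    path = parts.path
    if path in ("", "/index.html"):      # default document -> site root
        path = "/"
    # force https and drop query strings (e.g. session or tracking ids)
    return urlunsplit(("https", netloc, path, "", ""))

# Three URL variations that would all serve the same page.
variants = [
    "http://example.com",
    "https://www.example.com/index.html",
    "https://example.com/?sessionid=123",
]
print({canonicalize(u) for u in variants})  # all collapse to one URL
```

This is the mapping a 301 redirect or canonical tag communicates to search engines; doing it consistently is what keeps one web page from looking like several.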

On day one of SES London I was able to catch up with Google Search Evangelist Adam Lasnik to record a short (10 min) video on several topics important to webmasters looking for better results on Google. Adam starts with a description of his responsibilities at Google and then answers questions about Google-compliant Flash and JavaScript, duplicate content (especially with press releases), and suggested uses of internal site nofollow other than for “PageRank sculpting”.

This session was moderated by Chris Sherman and included the following speakers: Mikkel deMib Svendsen, Shari Thurow, Adam Lasnik (Google), Tim Converse (Yahoo) and Jon Glick (Become.com), who is filling in for Anne Kennedy, whose husband unfortunately passed away over the weekend.

Duplicate content has resurfaced as an issue due to the increased use of syndication.

First up is Jon Glick.

What is duplicate content? Multiple domains with the same home page. Duplicate content is a problem because while search engines want your content, they want unique content. Search engine webmaster guidelines offer specifics.

Example problems:
– Dynamic URLs, where different URLs can access the same content
– Multiple domains serving one website
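One way to picture the problem Jon describes: if you hash the body of every page you crawl, duplicate content shows up as different URLs sharing an identical hash. The URLs and page bodies below are hypothetical examples, and real detectors use fuzzier comparisons than an exact hash, but the sketch shows the basic idea:

```python
# A minimal sketch of spotting exact-duplicate content across URLs by
# hashing page bodies. URLs and contents are hypothetical examples.
import hashlib
from collections import defaultdict

# Pretend crawl results: a dynamic URL and a clean URL serve the
# same page, a classic ecommerce duplicate-content problem.
pages = {
    "https://example.com/item?id=42": "<html>Widget page</html>",
    "https://example.com/products/widget": "<html>Widget page</html>",
    "https://example.com/about": "<html>About us</html>",
}

by_hash = defaultdict(list)
for url, body in pages.items():
    digest = hashlib.sha256(body.encode("utf-8")).hexdigest()
    by_hash[digest].append(url)

# Any hash with more than one URL is a duplicate-content cluster.
duplicates = [urls for urls in by_hash.values() if len(urls) > 1]
print(duplicates)
```

Each cluster found this way is a candidate for a 301 redirect or a canonical tag pointing at the one URL you want search engines to keep.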