Tuesday, January 31, 2017

There are a handful of data sources relied upon by nearly every search engine optimizer. Google Search Console (formerly Google Webmaster Tools) has perhaps become the most ubiquitous. There are simply some things you can do with GSC, like disavowing links, that cannot be accomplished anywhere else, so we are in some ways forced to rely upon it. But, like all sources of knowledge, we must put it to the test to determine its trustworthiness — can we stake our craft on its recommendations? Let's see if we can pull back the curtain on GSC data and determine, once and for all, how skeptical we should be of the data it provides.

Testing data sources

Before we dive in, I think it is worth having a quick discussion about how we might address this problem. There are basically two concepts that I want to introduce for the sake of this analysis: internal validity and external validity.

These two concepts are extremely important for our discussion. Depending upon the problem we are addressing as SEOs, we may care more about one or another. For example, let's assume that page speed was an incredibly important ranking factor and we wanted to help a customer. We would likely be concerned with the internal validity of GSC's "time spent downloading a page" metric because, regardless of what happens to a real user, if Google thinks the page is slow, we will lose rankings. We would rely on this metric insofar as we were confident it represented what Google believes about the customer's site. On the other hand, if we are trying to prevent Google from finding bad links, we would be concerned about the external validity of the "links to your site" section because, while Google might already know about some bad links, we want to make sure there aren't any others that Google could stumble upon. Thus, depending on how well GSC's sample links comprehensively describe the links across the web, we might reject that metric and use a combination of other sources (like Open Site Explorer, Majestic, and Ahrefs) which will give us greater coverage.

The point of this exercise is simply to say that we can judge GSC's data from multiple perspectives, and it is important to tease these out so we know when it is reasonable to rely upon GSC.

GSC Section 1: HTML Improvements

Of the many useful features in GSC, Google provides a list of some common HTML errors it discovered in the course of crawling your site. This section, located at Search Appearance > HTML Improvements, lists off several potential errors including Duplicate Titles, Duplicate Descriptions, and other actionable recommendations. Fortunately, this first example gives us an opportunity to outline methods for testing both the internal and external validity of the data. As you can see in the screenshot below, GSC has found duplicate meta descriptions because a website has case insensitive URLs and no canonical tag or redirect to fix it. Essentially, you can reach the page from either /Page.aspx or /page.aspx, and this is apparent as Googlebot had found the URL both with and without capitalization. Let's test Google's recommendation to see if it is externally and internally valid.

External Validity: In this case, the external validity is simply whether the data accurately reflects pages as they appear on the Internet. As one can imagine, the list of HTML improvements can be woefully out of date dependent upon the crawl rate of your site. In this case, the site had previously repaired the issue with a 301 redirect.

This really isn't terribly surprising. Google shouldn't be expected to update this section of GSC every time you apply a correction to your website. However, it does illustrate a common problem with GSC. Many of the issues GSC alerts you to may have already been fixed by you or your web developer. I don't think this is a fault with GSC by any stretch of the imagination, just a limitation that can only be addressed by more frequent, deliberate crawls like Moz Pro's Crawl Audit or a standalone tool like Screaming Frog.

Internal Validity: This is where things start to get interesting. While it is unsurprising that Google doesn't crawl your site so frequently as to capture updates to your site in real-time, it is reasonable to expect that what Google has crawled would be reflected accurately in GSC. This doesn't appear to be the case.

By executing an info:http://concerning-url query in Google with upper-case letters, we can determine what Google knows about the URL. Google returns results for the lower-case version of the URL! This indicates that Google both knows about the 301 redirect correcting the problem and has already applied it in their search index. As you can imagine, this presents us with quite a problem: HTML Improvement recommendations in GSC may not only lag behind changes you have made to your site; they may not even reflect corrections Google is already aware of. Given this discrepancy, it almost always makes sense to crawl your site for these types of issues in addition to using GSC.
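Checks like this are easy to automate from your own crawl data. The sketch below is a minimal example, assuming a hypothetical crawl export of (url, status, redirect location) tuples; it flags case-variant URLs that both still answer 200 instead of being consolidated by a redirect:

```python
# Sketch: flag case-variant duplicate URLs in crawl results.
# The (url, status, location) tuple format is an assumption; adapt it
# to whatever your crawler actually exports.
from collections import defaultdict

def find_case_duplicates(crawl_results):
    """Group URLs by lowercased form and report groups where more than
    one case variant answers 200 (i.e., no redirect consolidates them)."""
    groups = defaultdict(list)
    for url, status, location in crawl_results:
        groups[url.lower()].append((url, status, location))
    duplicates = {}
    for key, variants in groups.items():
        live = [u for u, status, _ in variants if status == 200]
        if len(live) > 1:
            duplicates[key] = live
    return duplicates

crawl = [
    ("https://example.com/Page.aspx", 200, None),
    ("https://example.com/page.aspx", 200, None),
    ("https://example.com/About.aspx", 301, "https://example.com/about.aspx"),
    ("https://example.com/about.aspx", 200, None),
]
# /Page.aspx and /page.aspx both answer 200, so they are flagged;
# /About.aspx redirects to its lower-case twin, so that pair is fine.
print(find_case_duplicates(crawl))
```

Run against a fresh crawl, a report like this tells you which duplicates are genuinely unresolved, regardless of how stale the GSC panel is.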

GSC Section 2: Index Status

The next metric we are going to tackle is Google's Index Status, which is supposed to provide you with an accurate number of pages Google has indexed from your site. This section is located at Google Index > Index Status. This particular metric can only be tested for internal validity since it is specifically providing us with information about Google itself. There are a couple of ways we could address this...

We could compare the number provided in GSC to site: commands

We could compare the number provided in GSC to the number of internal links to the homepage in the internal links section (assuming 1 link to homepage from every page on the site)

We opted for both. The biggest problem with this particular metric is being certain what it is measuring. Because GSC allows you to authorize the http, https, www, and non-www version of your site independently, it can be confusing as to what is included in the Index Status metric.

We found that when carefully applied to ensure no crossover of varying types (https vs http, www vs non-www), the Index Status metric seemed to be quite well correlated with the site:site.com query in Google, especially on smaller sites. The larger the site, the more fluctuation we saw in these numbers, but this could be accounted for by approximations performed by the site: command.

We found the link count method to be difficult to use, though. Consider the graphic above. The site in question has 1,587 pages indexed according to GSC, but the home page to that site has 7,080 internal links. This seems highly unrealistic, as we were unable to find a single page, much less the majority of pages, with 4 or more links back to the home page. However, given the consistency with the site: command and GSC's Index Status, I believe this is more of a problem with the way internal links are represented than with the Index Status metric.
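The arithmetic behind that judgment is simple enough to sketch, using the figures from the example above:

```python
# Sanity check: if every indexed page links to the home page once,
# internal links to the home page should roughly equal the indexed
# page count. Figures are from the example discussed in the text.
indexed_pages = 1587   # GSC Index Status
links_to_home = 7080   # GSC internal links reported for the home page

implied_links_per_page = links_to_home / indexed_pages
print(f"{implied_links_per_page:.1f} links to home per indexed page")
# ~4.5 links to home from every page is implausible for most templates,
# which suggests the internal-link figure, not Index Status, is the outlier.
```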

I think it is safe to conclude that the Index Status metric is probably the most reliable one available to us in regards to the number of pages actually included in Google's index.

GSC Section 3: Internal Links

The Internal Links section found under Search Traffic > Internal Links seems to be rarely used, but can be quite insightful. If External Links tells Google what others think is important on your site, then Internal Links tell Google what you think is important on your site. This section once again serves as a useful example of knowing the difference between what Google believes about your site and what is actually true of your site.

Testing this metric was fairly straightforward. We took the internal links numbers provided by GSC and compared them to full site crawls. We could then determine whether Google's crawl was fairly representative of the actual site.

Generally speaking, the two were modestly correlated with some fairly significant deviation. As an SEO, I find this incredibly important. Google does not start at your home page and crawl your site in the same way that your standard site crawlers do (like the one included in Moz Pro). Googlebot approaches your site via a combination of external links, internal links, sitemaps, redirects, etc. that can give a very different picture. In fact, we found several examples where a full site crawl unearthed hundreds of internal links that Googlebot had missed. Navigational pages, like category pages in the blog, were crawled less frequently, so certain pages didn't accumulate nearly as many links in GSC as one would have expected having looked only at a traditional crawl.
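A rough sketch of that comparison, assuming you can export both sources as URL-to-link-count maps (the URLs and counts below are made up for illustration):

```python
# Sketch: compare GSC's internal-link counts against your own crawl to
# find pages Googlebot appears to be under-crawling. Both inputs are
# hypothetical exports mapping URL -> internal links pointing at it.
gsc_links = {"/": 1500, "/products": 820, "/blog": 45, "/blog/category/news": 2}
crawl_links = {"/": 1580, "/products": 860, "/blog": 790, "/blog/category/news": 310}

def underrepresented(gsc, crawl, ratio=0.25):
    """Return pages where GSC reports fewer than `ratio` of the links
    the crawl found: candidates for content Google is under-weighting."""
    flagged = {}
    for url, crawl_count in crawl.items():
        gsc_count = gsc.get(url, 0)
        if crawl_count > 0 and gsc_count / crawl_count < ratio:
            flagged[url] = (gsc_count, crawl_count)
    return flagged

print(underrepresented(gsc_links, crawl_links))
# {'/blog': (45, 790), '/blog/category/news': (2, 310)}
```

The threshold is arbitrary; the point is to surface the large gaps, like the navigational pages described above, rather than to chase small deviations.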

As search marketers, in this case we must be concerned with internal validity, or what Google believes about our site. I highly recommend comparing Google's numbers to your own site crawl to determine if there is important content which Google determines you have ignored in your internal linking.

GSC Section 4: Links to Your Site

Link data is always one of the most sought-after metrics in our industry, and rightly so. External links continue to be the strongest predictive factor for rankings and Google has admitted as much time and time again. So how does GSC's link data measure up?

In this analysis, we compared the links presented to us by GSC to those presented by Ahrefs, Majestic, and Moz for whether those links are still live. To be fair to GSC, which provides only a sampling of links, we only used sites that had fewer than 1,000 total backlinks, increasing the likelihood that we get a full picture (or at least close to it) from GSC. The results are startling. GSC's lists, both "sample links" and "latest links," were the lowest-performing in terms of "live links" for every site we tested, never once beating out Moz, Majestic, or Ahrefs.
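The core of the "still live" test can be sketched with the standard library alone. In practice you would fetch each reported source page over HTTP and handle timeouts and redirects; the page content and domains below are hypothetical:

```python
# Sketch: confirm a reported backlink source still contains an anchor
# pointing at your site. html.parser keeps the sketch dependency-free.
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    def __init__(self, target_domain):
        super().__init__()
        self.target = target_domain
        self.found = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href") or ""
            if self.target in href:
                self.found = True

def link_still_live(page_html, target_domain):
    finder = LinkFinder(target_domain)
    finder.feed(page_html)
    return finder.found

page = '<p>See <a href="https://example.com/tools">this tool</a>.</p>'
print(link_still_live(page, "example.com"))    # True
print(link_still_live(page, "other-site.com")) # False
```

Running every "sample link" and "latest link" from each index through a check like this is all the methodology requires.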

I do want to be clear and upfront about Moz's performance in this particular test. Because Moz has a smaller total index, it is likely we only surface higher-quality, long-lasting links. Our out-performing Majestic and Ahrefs by just a couple of percentage points is likely a side effect of index size and not reflective of a substantial difference. However, the several percentage points which separate GSC from all 3 link indexes cannot be ignored. In terms of external validity — that is to say, how well this data reflects what is actually happening on the web — GSC is out-performed by third-party indexes.

But what about internal validity? Does GSC give us a fresh look at Google's actual backlink index? It does appear that the two are consistent insofar as rarely reporting links that Google is already aware are no longer in the index. We randomly selected hundreds of URLs which were "no longer found" according to our test to determine if Googlebot still had old versions cached and, uniformly, that was the case. While we can't be certain that it shows a complete set of Google's link index relative to your site, we can be confident that Google tends to show only results that are in accord with their latest data.

GSC Section 5: Search Analytics

Search Analytics is probably the most important and heavily utilized feature within Google Search Console, as it gives us some insight into the data lost with Google's "Not Provided" updates to Google Analytics. Many have rightfully questioned the accuracy of the data, so we decided to take a closer look.

Experimental analysis

The Search Analytics section gave us a unique opportunity to utilize an experimental design to determine the reliability of the data. Unlike some of the other metrics we tested, we could control reality by delivering clicks under certain circumstances to individual pages on a site. We developed a study that worked something like this:

Create a series of nonsensical text pages.

Link to them from internal sources to encourage indexation.

Use volunteers to perform searches for the nonsensical terms, which inevitably reveal the exact-match nonsensical content we created.

Vary the circumstances under which those volunteers search to determine if GSC tracks clicks and impressions only in certain environments.

Use volunteers to click on those results.

Record their actions.

Compare to the data provided by GSC.

We decided to check 5 different environments for their reliability:

User performs search logged into Google in Chrome

User performs search logged out, incognito in Chrome

User performs search from mobile

User performs search logged out in Firefox

User performs the same search 5 times over the course of a day

We hoped these variants would answer specific questions about the methods Google used to collect data for GSC. We were sorely and uniformly disappointed.

Experimental results

Method                Delivered    GSC Impressions    GSC Clicks
Logged In Chrome          11              0               0
Incognito                 11              0               0
Mobile                    11              0               0
Logged Out Firefox        11              0               0
5 Searches Each           40              2               0

GSC recorded only 2 impressions out of 84, and absolutely 0 clicks. Given these results, I was immediately concerned about the experimental design. Perhaps Google wasn't recording data for these pages? Perhaps we didn't hit a minimum number necessary for recording data, only barely eclipsing that in the last study of 5 searches per person?

Unfortunately, neither of those explanations made much sense. In fact, several of the test pages picked up impressions by the hundreds for bizarre, low-ranking keywords that just happened to occur at random in the nonsensical tests. Moreover, many pages on the site recorded very low impressions and clicks, and when compared with Google Analytics data, did indeed have very few clicks. It is quite evident that GSC cannot be relied upon, regardless of user circumstance, for lightly searched terms. It is, by this account, not externally valid — that is to say, impressions and clicks in GSC do not reliably reflect impressions and clicks performed on Google.

As you can imagine, I was not satisfied with this result. Perhaps the experimental design had some unforeseen limitations which a standard comparative analysis would uncover.

Comparative analysis

The next step I undertook was comparing GSC data to other sources to see if we could find some relationship between the data presented and secondary measurements which might shed light on why the initial GSC experiment had reflected so poorly on the quality of data. The most straightforward comparison was that of GSC to Google Analytics. In theory, GSC's reporting of clicks should mirror Google Analytics's recording of organic clicks from Google, if not identically, at least proportionally. Because of concerns related to the scale of the experimental project, I decided to first try a set of larger sites.

Unfortunately, the results were wildly different. The first example site received around 6,000 clicks per day from Google Organic Search according to GA. Dozens of pages with hundreds of organic clicks per month, according to GA, received 0 clicks according to GSC. But, in this case, I was able to uncover a culprit, and it has to do with the way clicks are tracked.

GSC tracks a click based on the URL in the search results (let's say you click on /pageA.html). However, let's assume that /pageA.html redirects to /pagea.html because you were smart and decided to fix the casing issue discussed at the top of the page. If Googlebot hasn't picked up that fix, then Google Search will still have the old URL, but the click will be recorded in Google Analytics on the corrected URL, since that is the page where GA's code fires. It just so happened that enough cleanup had taken place recently on the first site I tested that GA and GSC had a correlation coefficient of just .52!

So, I went in search of other properties that might provide a clearer picture. After analyzing several properties without similar problems as the first, we identified a range of approximately .94 to .99 correlation between GSC and Google Analytics reporting on organic landing pages. This seems pretty strong.
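If you want to run the same check on your own property, the comparison reduces to a Pearson correlation over landing pages present in both exports. Below is a minimal sketch with made-up numbers; in practice, normalize known redirects before joining the two datasets, or the coefficient will be artificially depressed as in the .52 case above:

```python
# Sketch: correlate GSC clicks with GA organic sessions per landing page.
# The URLs and counts are hypothetical placeholders.
from math import sqrt

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# landing page -> (GSC clicks, GA organic sessions)
pages = {"/a": (120, 130), "/b": (45, 50), "/c": (300, 310), "/d": (10, 12)}
gsc = [g for g, _ in pages.values()]
ga = [a for _, a in pages.values()]
print(round(pearson(gsc, ga), 3))  # a value near 1.0 means proportional agreement
```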

Finally, we did one more comparative analysis to determine the trustworthiness of GSC's ranking data. In general, the number of clicks received by a site should be a function of the number of impressions it received and its position in the SERP. While this is obviously an incomplete view of all the factors, it seems fair to say that we could compare the quality of two ranking sets if we know the number of impressions and the number of clicks. In theory, the rank-tracking method which better predicts the clicks given the impressions is the better of the two.

Call me unsurprised, but this wasn't even close. Standard rank tracking methods performed far better at predicting the actual number of clicks than the rank as presented in Google Search Console. We know that GSC's rank data is an average position which almost certainly presents a false picture. There are many scenarios where this is true, but let me just explain one. Imagine you add new content and your keyword starts at position 80, then moves to 70, then 60, and eventually to #1. Now, imagine you create a different piece of content and it sits at position 40, never wavering. GSC will report both as having an average position of 40. The first, though, will receive considerable traffic for the time that it is in position 1, and the latter will never receive any. GSC's averaging method based on impression data obscures the underlying features too much to provide relevant projections. Until something changes explicitly in Google's method for collecting rank data for GSC, it will not be sufficient for getting at the truth of your site's current position.
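A toy calculation makes the averaging problem concrete. The click-through rates below are illustrative assumptions, not Google's actual CTR curve, and the rank trajectory is chosen so the average comes out to exactly 40:

```python
# Two keywords with the same average position but wildly different traffic.
# CTR-by-position values are illustrative assumptions only.
ctr = {1: 0.30, 19: 0.01, 40: 0.002, 60: 0.0005, 80: 0.0002}

moving = [80, 60, 19, 1]     # climbs from position 80 to #1 over four periods
steady = [40, 40, 40, 40]    # parked at position 40 the whole time
impressions_per_period = 1000

avg_moving = sum(moving) / len(moving)   # 40.0
avg_steady = sum(steady) / len(steady)   # 40.0

clicks_moving = sum(impressions_per_period * ctr[p] for p in moving)
clicks_steady = sum(impressions_per_period * ctr[p] for p in steady)

print(avg_moving, avg_steady)        # identical average positions...
print(clicks_moving, clicks_steady)  # ...completely different click totals
```

Both keywords report an average position of 40, yet the climbing keyword collects the overwhelming majority of the clicks, which is exactly why average position is such a poor input for traffic projections.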

Reconciliation

So, how do we reconcile the experimental results with the comparative results, both the positives and negatives of GSC Search Analytics? Well, I think there are a couple of clear takeaways.

Impression data is misleading at best, and simply false at worst: We can be certain that all impressions are not captured and are not accurately reflected in the GSC data.

Click data is proportionally accurate: Clicks can be trusted as a proportional metric (ie: correlates with reality) but not as a specific data point.

Click data is useful for telling you which URLs rank, but not which pages visitors actually land on.

Understanding this reconciliation can be quite valuable. For example, if you find your click data in GSC is not proportional to your Google Analytics data, there is a high probability that your site is utilizing redirects in a way that Googlebot has not yet discovered or applied. This could be indicative of an underlying problem which needs to be addressed.

Final thoughts

Google Search Console provides a great deal of invaluable data which smart webmasters rely upon to make data-driven marketing decisions. However, we should remain skeptical of this data, like any data source, and continue to test it for both internal and external validity. We should also pay careful attention to the appropriate ways in which we use the data, so as not to draw conclusions that are unsafe or unreliable where the data is weak. Perhaps most importantly: verify, verify, verify. If you have the means, use different tools and services to verify the data you find in Google Search Console, ensuring you and your team are working with reliable data. Also, there are lots of folks to thank here: Michael Cottam, Everett Sizemore, Marshall Simmonds, David Sottimano, Britney Muller, Rand Fishkin, Dr. Pete, and so many more. If I forgot you, let me know!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Monday, January 30, 2017

During the holidays, my youngest daughter apparently had cabin fever after being in the house for a couple of days. While exiting the bedroom, my wife found the note below on the floor after our daughter had slyly slid it under the door.

Though tired and not really feeling like leaving the house, we had to reward the youngster for her industriousness. And her charm.

Her effective "outreach" earned plaudits from my wife.

"At least she did it the right way," she remarked. "She cleaned her room, washed dishes, and read books all day, obviously part of an attempt to make it hard for us to say no. After all she did, though, she earned it."

Hmm...

She earned it.

Can you say as much about your outreach?

We're missing out on a great opportunity

Over the last few months, I've been thinking a lot about outreach, specifically email outreach.

It initially got my attention because I see it so badly handled, even by savvy marketers.

But I didn't fully appreciate the significance of the problem until I started thinking about the resulting impact of bad outreach, particularly since it remains one of the best, most viable means of garnering attention, traffic, and links to our websites.

What I see most commonly is a disregard of the needs of the person on the other end of the email.

Too often, it's all about the "heavy ask" as opposed to the warm touch.

You're likely saying to yourself, "But Ronell, the second person didn't get anything in return."

I beg to differ. The first person likely lost me, or whomever else they reach out to using similar tactics; the second person will remain on my radar.

Outreach is too important to be left to chance or poor etiquette. A few tweaks here and there can help our teams perform optimally.

#1: Build rapport: Be there in a personal way before you need them

The first no-no of effective outreach comes right out of PR 101: Don't let the first time I learn of you or your brand be when you need me. If the brand you hope to attain a link from is worthwhile, you should be on their radar well in advance of the ask.

Do your research to find out who the relevant parties are at the brand, then make it your business to learn about them, via social media and any shared contacts you might have.

Then reach out to them... to say hello. Nothing more.

This isn't the time to ask for anything. You're simply making yourself known, relevant, and potentially valuable down the road.

Notice how, in the example below, the person emailing me NEVER asks for anything?

The author did three things that played big. She...

Mentioned my work, which means she'd done her homework

Highlighted work she'd done to create a post

Didn't assume I would be interested in her content (we'll discuss in greater detail below)

Hiring managers like to say, "A person should never be surprised at getting fired," meaning they should have some prior warning.

Similarly, for outreach to be most effective, the person you're asking for a link from should know of you/your brand in advance.

Bonus: Always, always, always use "thank you" instead of "thanks." The former is far more thoughtful and sincere, while the latter can seem too casual and unfeeling.

#2: Be brief, be bold, be gone

One of my favorite lines from the Greek tragedy Antigone, by Sophocles, is "Tell me briefly — not in some lengthy speech."

If your pitch is more than three paragraphs, go back to the drawing board.

You're trying to pique their interest, to give them enough to comfortably go on, not bore them with every detail.

The best outreach messages steal a page from the PR playbook:

They respect the person's time

They show a knowledge of the person's brand, content, and interests with regard to coverage

They make the person's job easier (i.e., something the person would deem useful but not necessarily easily accessible)

We must do the same.

Be brief in highlighting the usefulness of what you offer and how it helps them in some meaningful way

Be bold in declaring your willingness to help their brand as much as your own

Be gone by exiting without spilling every single needless detail

Bonus: Be personal by using the person's name at least once in the text, since it fosters a greater level of personalization and thoughtfulness (most people enjoy hearing their names):

"I read your blog frequently, Jennifer."

#3: Understand that it's not about you

During my time as a newspaper business reporter and book reviewer, nothing irked me more than having people assume that because they valued what their brand offered, I must feel the same way.

They were wrong 99 percent of the time.

Outreach in our industry is rife with this if-it's-good-for-me-it's-good-for-you logic.

Instead of approaching a potential link opportunity from the perspective of "How do I influence this party to grant me a link," a better approach is to consider "What's obviously in it for them?"

(I emphasize "obviously" because we often pretend the benefit is obvious when it's typically anything but.)

Step back and consider all the things that'll be in play as they consider a link from you:

Relationship - Do they know you/know of you?

Brand - Is your brand reputable?

Content - Does your company create and share quality content?

This content - Is the content you're hoping for a link for high-quality and relevant?

In the best-case scenario, you should pass this test with flying colors. But at the very least you should be able to successfully counter any of these potential objections.

#4: Don't assume anything

Things never go well when an outreach email begins "I knew you'd be interested in this."

Odds suggest you aren't prescient, which can only mean you're wrong.

What's more, if you did know I was interested in it, I should not be learning about the content after it was created. You should have involved me from the beginning.

Therefore, instead of assuming they'll find your content valuable, ensure that you're correct by enlisting their help during the content creation process:

Topic - Find out what they're working on or see as the biggest issues that deserve attention

Contribution - Ask if they'd like to be part of the content you create

Ask about others - Enlist their help to find other potential influencers for your content. Doing so gives your content and your outreach legs (we discuss in greater detail below)

#5: Build a network

Michael Michalowicz, via his 2012 book The Pumpkin Plan, shared an outreach tactic I've been using for years in my own work. Instead of asking customers to recommend other customers for a computer service company he formerly owned, he asked his customers to recommend other non-competing vendors.

Genius!

Whereas a customer is likely to recommend another customer or two, a vendor is likely able to recommend many dozens of customers who could use his service.

This is instructive for outreach.

Rather than asking the person you're outreaching to for recommendations of other marketers who could be involved in the project, a better idea might be to ask them "What are some of the other publications or blogs you've worked with?"

You could then conduct a site search, peruse the content they've been a part of, then use this information as a future guide for the types and quality of content you should be producing to get on the radar of influencers and brands.

After all, for outreach to be sustainable and most effective, it must be scalable in an easy-to-replicate (by the internal team, at least) fashion.

Bonus: Optimally, your outreach should not be scalable — for anyone but you/your team. That is, it's best to have a unique-to-your-brand process that's tough to achieve or acquire, which means it's far less likely others will know about, copy or use it or one like it.

Awaken your inner child, er, PR person

Elements of the five tips shared above have been, singularly, on my mind for the better part of two years. However, they only coalesced after I read the note my daughter shared, primarily because her message delivered on each point so effectively.

She didn't wait until she needed something to get on our radar; never over-sold the message; was selfless in realizing we all likely needed to get out of the house; didn't assume we were on the same page; and activated her network by sharing the note first with her sister and, through her mom, with me.

Now, the question we must all ask ourselves is whether the methods we employ are as effective.


At least once or twice per month, I talk to a small e-commerce store owner who wants to invest in content marketing. Oftentimes, I have to break it to them that they’re not ready for content marketing.

You see, before you spend a bunch of time generating traffic from your target audience, it’s important to make sure those visitors get the best experience possible while browsing your store.

So, in this post, I want to give store owners and e-commerce newbies a clear idea of where they can invest their time before investing in more paid and organic traffic to their sites. Many of these can be accomplished for less than $1,000 or a few hours of your time.

With a few small-scale investments you can help drive performance on conversions, SEO, and more.

So what are they?

Rewrite Your Weak Product Descriptions

Take Better Product Photography

Build Lookbooks & Product Collections

Start Adding Product Videos

Upgrade Your Review Software & Process

Let’s look at these opportunities in detail, and better yet, show you some actual examples of what your site could look like.

Rewrite your weak product descriptions

From product details to features and benefits, product descriptions must pack a lot of information in a short format. You may have overlooked some missed opportunities.

If you answer “yes” to any of the following questions, consider investing in improved product descriptions.

1 - Does your copy speak to your buyer personas?

If you’ve built buyer personas for your brand, make sure the copy addresses the appropriate persona’s unique pain points and concerns. Bland descriptions meant to appeal to everyone — or just bots — aren’t as effective.

This high chair example from 4moms.com focuses on the three things that matter to their audience: single-handed adjustments, spilt-food prevention, and easy cleanup.

2 - Does your copy focus on benefits rather than features?

You can list features all day long, but customers really want to know how your product will make their life better.

The Amazon Echo sales page does a great job of focusing less on the technical features of the product, and more on the cool things you can do with it.

3 - Are you describing your product with the same words that your customers use?

Using the same language that your customers do will help you better communicate with your target audience in a way that sounds natural for them and touches on their pain points.

A simple way to find these words is to do some reverse engineering. Start by looking at customer reviews and feedback you’ve collected (and those of your main competitors as well) to pick out common words and phrases that satisfied customers are using. From here, you can tie that customer language back into your own descriptions.

I was shopping for a new tent last week and saw this awesome reviewer on Amazon drive home a point that the copywriters had missed. If you read that entire review, the phrase “family tent” is mentioned about 13 times.

But if you read the product description, "family tent" only shows up once. The description fails to mention many of the benefits covered by the reviewer: lots of pockets, sleeping arrangements, ability to catch a breeze but keep the doors closed, etc.

There’s an opportunity here for a competitor in the tent or outdoor space to improve their own product descriptions for the same tent (or even put together a larger guide to family tents).

Take better product photography

Good product photos do more than just show shoppers what you’re selling — they provide context and help customers visualize using your products. Plus, high-quality photos will reduce product returns that happen due to misleading images.

So what can you do to upgrade your product photos?

Smartphones aren't going to cut it

Use a DSLR camera, not your smartphone. Although modern smartphone cameras take higher-resolution photos than ever before, you'll still get better results from a DSLR. Lower-end models start at around $500 — try finding a used body online and putting the savings toward a quality, cost-effective fixed lens that can handle video, too.

Build a cheap lightbox

Create a lightbox for well-lit photos with a solid white background. For less than $10, you can build your own lightbox that will vastly improve the quality of your product images.

Use creative angles

Shoot products from multiple angles. Be sure to include several images on every product page. The more perspectives and viewpoints you have, the better customers will be able to judge your product.

It's OK to tweak & process your images to make them pop

Process your images with filters that enhance color and overall image quality. Photo filters resolve poor lighting or color issues and vastly improve your product photos. Just try not to get carried away with dramatic filters that distort the color of your products, as this can be misleading for the buyer. Here’s a good example from ABeautifulMess.com showing the difference before and after image edits:

If you don’t have the time or inclination to take your own photos, outsource the work to a professional. No matter which route you go, know that upgrading your product page photography is well worth the investment.

Build lookbooks & product collections

You can also provide more context for your products through lookbooks, which showcase your products in use. The term “lookbook” comes from the fashion industry, but the concept extends to a variety of industries.

The photos in the lookbook for Fitbit’s Alta model of fitness tracker help shoppers envision themselves wearing them. Fitbit’s lookbook also establishes a brand lifestyle promise — impossible with product photos alone. Even better? The various photos are clickable and take you to the product page for that color/style of wristband:

Product collections are another great variation on this strategy. In this “Mediterranean Collection” page on Coastal.com, shoppers get an opportunity to shop by “style,” and to see examples of the glasses on actual faces instead of just a white background:

As I alluded to before, this isn’t just an opportunity for fashion sites. The trick is to make sure you're showing your products in action.

Heck, even office furniture is easier to purchase when you see how it looks in a workspace.

Start adding product videos

Adding video to product pages is another relatively low-budget improvement you can make, yet it delivers outsized value for shoppers and your bottom line.

Why? Because video’s ability to quickly educate shoppers is a powerful conversion tool. Eyeview Digital reported that including video on landing pages can improve conversions by as much as 80%, and ComScore indicated that online shoppers are 64% more likely to buy after watching a video.

Video also allows you to give shoppers a more complete overview of the product and to go beyond static pictures with a story element. These engaging visuals can help shoppers envision themselves using your products in a way that photography alone simply can’t.

Zappos is well known for including videos on what seems like every listing, but what’s more impressive to me is how much personality and brand voice they show off. While shopping for boots recently, I found Joe to be my favorite video personality:

Add customer reviews

Customer reviews make your product pages more effective: they let shoppers evaluate the product based on real customer opinions, and they can help you spot product issues.

I’m listing a few common platforms here, but you should really check out Everett Sizemore’s guide to product review software, which has some great insights on the performance of the entire marketplace of product review software options, including technical SEO concerns:

Traditional product reviews may not be right for all stores...

The best option for you will depend on the tool’s ability to integrate with your store, your preferred functionality, and your budget. Sometimes, traditional product reviews won’t be the best choice for your product or store.

In this example from ThinkGeek, they’ve opted to let people leave Facebook comments rather than formal product reviews. That makes sense, because they’re Star Trek garden gnomes, and it’s not like shoppers need to be told whether they were the right size. Even better than Facebook comments, they also solicit product photos via their #geekfamous hashtag.

There are many different tools for showcasing elements of social proof like ratings and reviews, so take your time comparing options to find the best fit for your needs and budget. And if traditional product reviews aren’t the right fit, feel free to take a different approach.

Make enough of these small investments and you should see big improvements over the long term.

Tackling these small investments — as your schedule and budget allow — will dramatically improve the overall user experience and the effectiveness of your e-commerce store.

Consider which aspects are the most important to complete first, and then start doing your research and put together a strategy for how you’ll prioritize these site upgrades. With a well-thought-out plan of action, you can focus on the projects that will drive the best results for your business, rather than trying too many different tactics all at once.
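One simple way to build that plan is to score each upgrade on expected impact versus effort and tackle the quick wins first. A rough sketch in Python — the projects and 1–5 scores below are hypothetical examples, not benchmarks; plug in your own estimates:

```python
# Hypothetical impact/effort scores (1-5 scale) for the upgrades discussed above
projects = {
    "rewrite product copy":  {"impact": 4, "effort": 2},
    "product photography":   {"impact": 5, "effort": 3},
    "product videos":        {"impact": 4, "effort": 4},
    "add review platform":   {"impact": 3, "effort": 2},
}

# Rank by impact-to-effort ratio, so quick wins come first
ranked = sorted(
    projects,
    key=lambda p: projects[p]["impact"] / projects[p]["effort"],
    reverse=True,
)

for name in ranked:
    print(name)
```

Even a back-of-the-napkin version of this exercise keeps you focused on the projects most likely to move the needle instead of spreading your budget across everything at once.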

This is by no means the complete guide to investing in your e-commerce store, so in the discussion below, I’d like to hear from you. What creative ways have you improved your e-commerce site content in the past that boosted conversions or organic search?
