This infographic shows a fun path-based look at Conversion Rate Optimization (CRO) – essentially the right and wrong way to do it. To get started, let’s look at each path to see whether you’re here to chew bubblegum and kick some ass, or whether you’re screwing up your optimization efforts.

Make sure you scroll past the graphic to see my dissection of each step.

Now Let’s Dissect it Piece by Piece

First up is the wrong way. For the record, you don’t want to be doing it like this…

The Wrong Way to Do CRO

Or is it? Read my often argumentative interpretation of what the infographic says.

Step

What the Infographic Says is Wrong

What I Say

1

Green buttons convert better than red ones

Agree – this is wrong. Color rarely has an impact on conversion on its own (there are some case studies that show otherwise, but there are better ways to make your buttons more effective). If you really want your button to stand out, you need to focus more on design concepts like contrast, whitespace and directional cues that draw attention to it. In the case study shown, it’s my opinion that the conversion lift had more to do with contrast than color. I’d be more interested in a test that involved other highly contrasting colors – to really uncover the key differentiator.

2

Desaturate your logo colors to improve conversion rate

Agree – this is wrong. In fact it’s total baloney. It’s another contrast issue. If you were to knock back the saturation or simplify your page entirely, your CTA would stand out more. But just fading the logo a bit won’t do anything.

3

Having an arrow next to your CTA will increase conversions

Disagree – this is not wrong. Directional cues (points 7 and 8) help visitors understand what you’re asking them to do.

4

Short pages convert better than long pages

Disagree – this is not wrong. It depends entirely on the customer’s needs. Sure, for the most part it’s true, but sometimes simplicity needs to give way to the needs of the visitor. If they need to be told a long story, or flooded with details, in order to make a purchasing decision, then a long page can be better. In fact, a significant number of pages employ this technique, including a “smooth scroll” interaction from a top navigation that points people to the area of the page they’re looking for – sort of like a microsite on a single page.

5

Pinks don’t work on the web when defining shades

Whatever. Sure. Let’s be pink haters. It can definitely be ghastly, but there’s a place for every color if it’s appropriate for the design.

6

A British flag in your logo acquires more leads

Agree completely – this is wrong. Yikes, who would even think of doing this? If this is all you can come up with, you have no business being in the CRO business.

7

Let’s not do any testing and see what happens

Agree – this is wrong. Without testing, you’re not doing optimization. You’re just throwing darts at a donkey in the dark.

8

Best practices are better than testing

Agree – this is wrong. Best practices are something you should incorporate into your first page, but they’re not enough to establish a high-converting page. To do a good CRO job, you need to develop a hypothesis for how your control page could be improved (often via user feedback) and then test it.

The Right Way to Do CRO


Step

What the Infographic Says is Right

What I Say

1

Set up funnels

Agree. Once you can ascertain the fall-off points in your flow, you can target them with optimization efforts.
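The “fall-off points” idea is easy to make concrete. Here’s a minimal sketch (the step names and visitor counts are hypothetical, not from the infographic) that computes how much of your traffic survives each step of a funnel:

```python
# Minimal funnel drop-off sketch. The step names and visitor counts
# below are made-up examples -- plug in your own analytics data.
funnel = [
    ("Landing page", 10000),
    ("Pricing page", 4000),
    ("Signup form", 1200),
    ("Confirmed signup", 600),
]

def drop_off_report(steps):
    """Return (step_name, fraction_of_previous_step_that_continued) pairs."""
    report = []
    for (_, prev_count), (name, count) in zip(steps, steps[1:]):
        report.append((name, count / prev_count))
    return report

for name, rate in drop_off_report(funnel):
    print(f"{name}: {rate:.0%} of the previous step continued")
```

The step with the lowest continuation rate is where you’d aim your optimization efforts first.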

2

Analytics (or more accurately – user observation)

Agree. Using tools that let you understand click density and user behavior can illuminate areas on your pages that are not being used or seen.

3

Barriers – Find out why people don’t convert with point of conversion tools

Agree. If you position a survey tool such as Qualaroo, or a live chat tool like Olark, you can find out what’s stopping your fence-sitting visitors from becoming customers.

4

Go offline and talk to front line employees such as customer service.

Agree. No one understands the frequency of complaints and problems faced by your customers better than these teams. Their insight can help with prioritizing feature changes and bug fixes.

5

Use testimonials and expert reviews.

Agree. Getting an outside opinion can remove the subjective nature of an internal perspective (one based on being too close to the product or brand). At the end of the day, you still need to test any feedback gathered.

6

Strengthen your advertising: Reward customers with loyalty initiatives, use competitions for engagement, and use free trials and gifts to add value.

Partially agree: Don’t get caught up in cannibalizing your revenue by giving too much away. You’ll be surprised by the impact of being seen as a product that doesn’t believe in its own worth. In my experience, raising prices can help the perception that your product or service is superior.

7

Wireframe solutions.

Agree: Use visual mockup tools to quickly turn your test hypothesis into a new page layout/concept. Then you can use simple in-house testing methods such as the 6ft test and the 5-second rule to back up your new ideas. For more detail on these methods, read The Ultimate Guide to Landing Page Optimization (pages 24 and 25).

8

Testing – One accurate measurement is worth more than a thousand expert opinions

9

Review – try applying your winning results to other areas of your site

Agree: A good way to do this is to optimize a landing page, then use your findings to make changes to the areas of your site representative of the messaging and design you’ve been testing. At Unbounce we did some headline testing during a PPC campaign and, after finding a clear winner, we’ll be rolling it out in the next iteration of our homepage.

10

Add, rinse, repeat.

The lesson here is that no page is perfect, and you should continue testing until your page is optimized to the point where you feel it’s flatlined and you should be focusing on another page.

Tweetables

Share these rad testing tips with your followers. And don’t worry, you can change the tweet text before it goes out.

1/4 of people use online reviews before paying for a service » Tweet This «

44% of companies use split testing software. It should be 100% » Tweet This «

“One accurate measurement is worth more than a thousand expert opinions” » Tweet This «

“Your tests will only be as good as the ideas you put into them.” » Tweet This «

If people don’t trust your site then changing the color of your buttons is like waiting for your cat to bark » Tweet This «

In Summary

Looks like they got it right for the most part, aside from a few kooky ideas in the “wrong path”. What do you think about the ideas presented in the infographic?

I think they’re calling it out as incorrect methodology to take a series of changes you’ve heard were successful for others, and then apply them to your site.

It’s fine to get ideas for things to test from other people (ie. “test headlines” and “test button colours” and “test button text” etc.) but if you’re not doing it as part of a research framework within which you’re setting yourself up to learn as much as you can from each change, then you’re no better off than you were before (ie. you’re still just guessing).

The only thing in their “wrong path” that I’d disagree with is “let’s not do any a/b testing”.

If you’re working with very very low traffic volumes, A/B testing is basically meaningless. There was a great article which talked about this on marketingexperiments.com the other day:

If you launch a “radical redesign” (see: Tip #4: Test radical redesigns) as an A/B test when you have very low traffic volume, you’ll wait months and months to get statistical significance.

However if you have basically 0 conversions now, and you launch a radical redesign and start getting some conversions, well great. Now you at least have something you can work with.

Even Optimizely suggests in their product to wait for 100 conversions *per variation* before you can count a result as statistically significant.

If you’re a wedding photographer who gets 1–2 leads per week on an ad spend of $70, then you have a profitable funnel, but very little likelihood of being able to run a meaningful A/B test until you’ve got a tonne of traffic (ie. organic traffic, social traffic, a huge email list, a blog audience, or a dramatically increased ad spend).
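To put a rough number on that, here’s a sketch of the standard two-proportion sample-size estimate (normal approximation) for an A/B test. The 3% baseline conversion rate, 20% relative lift, and 50 visitors a week are made-up numbers for illustration, not figures from the article:

```python
import math

def sample_size_per_variation(p_base, lift, z_alpha=1.96, z_beta=0.84):
    """Rough per-variation sample size for a two-proportion A/B test
    (normal approximation; defaults: 95% significance, 80% power).
    p_base (baseline rate) and lift (relative lift to detect) are
    assumptions you supply."""
    p_var = p_base * (1 + lift)
    p_bar = (p_base + p_var) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p_base * (1 - p_base)
                                      + p_var * (1 - p_var))) ** 2
    return math.ceil(numerator / (p_var - p_base) ** 2)

# Hypothetical low-traffic scenario: 3% baseline conversion rate,
# hoping to detect a 20% relative lift, ~50 visitors per week.
n = sample_size_per_variation(0.03, 0.20)
weeks = (2 * n) / 50
print(f"~{n} visitors per variation, roughly {weeks:.0f} weeks at 50 visitors/week")
```

At those assumed numbers the answer comes out in the tens of thousands of visitors per variation – which is the point: with a trickle of traffic, a conventional A/B test simply isn’t going to reach significance in any useful timeframe.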

Oli Gardner says:

Have to disagree there. If it’s presented as the wrong way then that’s how I’ll take it. I’m merely responding to what I’m seeing (not a mind reader). I have to say though that everything should be open to useful critique – and it’s worth noting that I agreed with 90% of what was said. So not really sure where you’re going.

I also agree with your statements about A/B testing (of course – that’s what we do – in part).

Anyway – nice to have a good banter about this, always makes it more interesting.

Definitely something to watch out for when creating infographics I guess (ie. you’re “compressing information”, you want to make sure the compression format isn’t “lossy” ;)
