I’m a UI designer with a deep focus on data and conversion rate optimization. I've been running a consulting firm at Linowski where we perform various A/B tests for clients. We share tons of UI advice for simpler, higher-converting pages over at www.goodui.org, and some of our testing advice at www.goodui.org/betterdata. I can advise on ways to optimize for more signups, more purchases, more leads, etc.

What I know nothing about:
SEO, PPC, channelling traffic.

What can you ask? Here are a few sample ideas:
- How to run tests on low traffic sites?
- How to set up your first test?
- How to know that your results are real?
- What to measure?
- How many variations?
- How long to run the test for?
- Top conversion ideas?
- When to stop a test?

James: And I’m James. This week we have a great guest Jakub Linowski from GoodUI.

Alaister: So Jakub is the founder of GoodUI.org. He specializes in conversion rate optimization with a design-centric approach. Welcome Jakub.

[00:33]

Jakub: Nice to meet you.

[00:37]

James: It is really nice to have you on the show Jakub. Usually we start by asking people how they got into user interface design and conversion rate optimization, how they got to where they are today, and what they do day to day?

[00:52]

Jakub: Sure. So we are a consulting company that focuses precisely on conversion optimization, and we have been doing this for just over a year. So like many viewers here we are still learning, and we have been mostly focusing on running AB tests full time over that period. Before that, however, we did quite a bit of UI design. And I guess there was a point, about a year and a half ago, where basically we wanted to get rid of… there are moments where there is quite a bit of uncertainty, and we wanted to tackle that precisely through design experimentation.

[01:42]

Alaister: Yeah so obviously within your experience you have run like hundreds of different AB tests and we run a lot of AB tests here within the Freelancer and the Warrior Forum team. What are some of the biggest mistakes you see with people starting off with AB testing and conversion rate optimization?

[01:59]

Jakub: Sure, some mistakes. So, mistakes that we have run into, or mistakes that other people make, or...?

[02:06]

Alaister: Yeah just mistakes in general that you sort of experience whether it be first time clients or first time conversion rate optimizers and things like that.

[02:16]

Jakub: Sure, so one of the things that can go wrong pretty quickly is not measuring the sample size. Whenever we started off testing, and this includes us as well, we didn’t really understand the current state of the project or the pages we set up. What happens is we generate too many variations, or we don’t know the current traffic, and so we start a test, run it for a week or two, and it becomes pretty obvious within that timeframe that in order to actually reach significance, the test will have to run for half a year or a year, right. So not really understanding the current state is one example.
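The arithmetic behind that trap can be sketched with the standard two-proportion sample-size formula. The 3% signup rate, +10% target lift, and 300 visitors a day below are hypothetical numbers for illustration, not figures from the interview:

```python
from statistics import NormalDist

def visitors_per_variation(base_rate, lift, alpha=0.05, power=0.8):
    """Approximate visitors needed per variation to detect a relative
    lift over base_rate with a two-sided two-proportion z-test."""
    p1 = base_rate
    p2 = base_rate * (1 + lift)
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # significance threshold
    z_b = NormalDist().inv_cdf(power)          # desired statistical power
    p_bar = (p1 + p2) / 2
    numerator = (z_a * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_b * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return numerator / (p2 - p1) ** 2

# Hypothetical page: 3% signup rate, 300 visitors/day split two ways.
n = visitors_per_variation(0.03, 0.10)   # hoping to detect a +10% lift
days = 2 * n / 300                       # days until both arms fill up
```

With these made-up numbers, detecting a +10% relative lift needs roughly 50,000 visitors per variation, close to a year of traffic, which is exactly the half-a-year-or-a-year surprise described above. Aiming for a +50% lift instead shrinks the requirement by more than an order of magnitude.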

Other pitfalls could include stopping a test too early. Typically, tied to that same scenario, what might happen is you see an effect that is plus 70% or so in the first days, and people get excited about that. But the results are kind of chaotic in that early timeframe. There is this thing called regression to the mean: over time those high peaks stabilize, and if people pull the trigger too soon they might apply a variation, but with time they actually won’t see any benefit from that supposedly winning variation.
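That early chaos can be illustrated with a toy simulation (an illustration of the statistics only, not a tool from the interview): both arms below share the exact same true conversion rate, yet the cumulative observed lift swings in the first days before settling toward zero.

```python
import random

def observed_lift_by_day(true_rate=0.05, daily_visitors=200, days=30, seed=7):
    """Simulate an A/B test where BOTH arms have the same true conversion
    rate, and record the cumulative observed relative lift each day."""
    rng = random.Random(seed)
    conv_a = conv_b = 0
    lifts = []
    for _ in range(days):
        conv_a += sum(rng.random() < true_rate for _ in range(daily_visitors))
        conv_b += sum(rng.random() < true_rate for _ in range(daily_visitors))
        # relative lift of B over A so far (guard against zero conversions)
        lifts.append((conv_b - conv_a) / max(conv_a, 1))
    return lifts

lifts = observed_lift_by_day()
final_lift = lifts[-1]  # after a month, near the true lift of zero
```

Pulling the trigger in the first few days here would "detect" a lift that does not exist; run over many seeds, the early estimates swing far more widely than the day-30 estimate, which is the regression-to-the-mean effect described above.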

[04:32]

Alaister: You mentioned earlier one of the biggest pitfalls is not being able to run an experiment with a large enough sample size. So you may be running an AB test and you may have to run it for months to achieve any level of statistical significance. I have a question here that has come in from Angus and he is asking, “How do I go about running a test for a site with low traffic when perhaps my goal may be a high price conversion, however I get a very low amount of traffic?”

[05:07]

Jakub: Mhm, and that’s something we have dealt with as well. One of the things you can do in that case is run fewer variations. Like I mentioned before, one pitfall was that we did too many variations, and you can fall back on traditional AB testing where you just have an A and a B; with that you can reach significance a bit quicker. Another thing that we’ve done was actually move away from traditional AB testing with small changes towards making larger changes. So on these low traffic sites we focus on grouping multiple changes together, because it is easier to detect a plus 50% improvement than a plus 5% improvement, and typically the smaller adjustments will result in smaller effects. So grouping multiple changes together can be one way of dealing with that.

And finally I think another thing you could potentially do is you could throw more traffic at it, whether it is paid traffic, whether it is rechanneling through some sort of ad or some sort of promotion on your website or an email blast, email campaign. So yeah.

[06:44]

James: How do you typically go about tracking the actual conversions on the page? Do you use any tools for that or how would you advise people to track their conversions?

[06:57]

Jakub: So we use Visual Website Optimizer for most of our projects. We’ve used Optimizely once or twice, which is I think one of the more popular ones; they are probably competitors of each other. I think there is also something from Google that does some sort of AB testing, but we haven’t tried that one. From what I’ve heard, Visual Website Optimizer and Optimizely provide a lot more flexibility, and yeah, that is what we use.

[07:36]

James: Interesting.

Alaister: Yeah we will provide links for everyone to access both Visual Website Optimizer and Optimizely.com in the forum.

You mentioned when you’re looking to create different variations for low traffic sites you create larger changes and dramatic changes that potentially have much higher impacts on conversions. How do you go about creating hypotheses and actually coming up with a change?

[08:07]

Jakub: So I think that raises a question. Sometimes, yes, with the smaller changes it is easier to create a hypothesis, and I think when you have enough traffic, when you have large sites such as Freelancer.com, you can run these small changes and ask yourself, “If we do this, the effect will be that,” and test that. So in some form or another you are maximizing for the clarity of the cause that generates the effect.

So you are aiming for a clear cause, a true experiment, right; you are applying the scientific method, so to say. I think when you are dealing with the larger variation tests, and this is still kind of emerging based on what we are doing, sometimes it becomes more difficult to create those hypotheses. However, we maximize for the largest effect, and I think in certain cases it is okay to sacrifice some clarity: you might not really know which one of these five things caused the big improvement, or maybe they are interacting in some form or another. But the point being, if the business or client has a plus 40% or plus 50% effect as a result, then at least that new baseline, that new improvement, has been captured, and maybe some time in the future it could be followed up more granularly with a more hypothesis-driven test. So yeah.

[10:05]

Alaister: What’s great about all the tests that you run is that I can see you are really passionate about sharing your experiences and learnings from the tests. Both James and I have gone through a lot of your GoodUI data stories and read about the different experiments you’ve run and the learnings you have had, and a lot of the information is extremely valuable. What are some examples of learnings you didn’t expect, or big wins that you didn’t expect, that came from your clients’ AB tests or from your own?

[10:38]

Jakub: So there could be two tests. One of the biggest effects that we generated was around a 230% lift in sign up rates. And I think that gave us an understanding that… I mean typically we are very skeptical of anything of that sort, but sometimes the baseline, the control, could be in such a state that it can be improved so dramatically, it has so much potential. In that case, for example, what was missing was a large, clear headline, and the call to action, the second fundamental landing page element, was kind of in the corner on the side. So if the control is missing some of the fundamentals, if it is in a poor, weaker state, then it is a lot easier to generate these larger effects.

With that very same test, in one of the variations we also applied this thing called gradual engagement. And that is a learning that we now try to reuse on many future tests as well. Basically, as opposed to pushing someone to sign up, “Hey you will get this, sign up, sign up, sign up,” it is a little bit counterintuitive, but you postpone the call to action. You get the person to do something first, to invest a split second, to segment themselves, to make a subtle choice, maybe one of three options. That is then followed by a confirmation: yes, we have that, we can provide that thing you chose, and by the way, here is the sign up. So that was one of those tests.

Another test that we were a little bit shocked about: a couple of months earlier we noticed that all the variations had a plus 40% or more improvement. We were doing button tests, so imagine small button label changes. And at first we were like, oh yeah, we are onto something; we were so happy to provide this to the client. But the longer the test ran and the more we thought about it, and I think sometimes you have to be a little bit skeptical about yourself, what turned out was that the measurement was wrong. There was some sort of problem. We tried five retests and tried to nail down what really caused it, but the point being, there was an artificial lift. Sometimes these things just enter your measurements, the experiment you are running, and it has to be something artificial.

[14:09]

James: Just going back to your first point, we have got a question here from Jonathan D. He is asking, “How would you go about approaching converting users differently to converting the sales?”

[14:25]

Jakub: Users as in signups?

[14:26]

James: Yeah registrations, you know just for a more registration based service versus a sales sort of…

[14:35]

Jakub: Right, so would we do it differently? I’m not sure. I know one thing: it is probably easier to optimize for sign ups as opposed to purchases. Falling back on typical conversion rates, which always differ for each website, sign ups could potentially convert at 10, 15, 20%, whereas purchases, again depending on how warm your traffic is and what you are selling, are probably closer to 2% plus or minus.

How would it differ? I think with the larger conversion rate for sign ups you could potentially take more risks. Going back to the sample size calculation advice, with a larger current conversion rate you could potentially have more variations or reach significance faster.

[15:58]

James: I guess this goes back to your very first point about really understanding where people are at and where their data already is, and then you can make a better decision about where you will go next if you already have that data.

[16:13]

Jakub: I think so yeah.

James: Yeah.

[16:15]

Alaister: A lot of warriors in the community, their core businesses revolve around building email lists, so they spend a lot of time working on email marketing and trying to grow their list as big as possible so they are able to market to a large audience. So being able to optimize and increase sign ups and email registrations is a huge part of what the Warriors do and what they are trying to achieve.

It is really interesting that you mentioned in your experiment that achieved an over 200% lift in sign ups, you basically got the visitor to invest a little bit of time so that they were already somewhat invested in the action of providing an email address. A lot of the Warriors have a system called double opt in, where they basically hide a registration form behind a button. The theory is, as opposed to showing the registration form up front, they show a button which says something like, “Claim your eBook.” Because the visitor has pressed the button, they are showing intent that they want to claim the eBook; they are somewhat invested. And then you ask them for the email address straight after that, while they are already invested in the action.

So we are seeing that increase conversions. Is that sort of similar to what you were talking about in regards to that client that you worked with?

[17:38]

Jakub: I think that is the right pattern, that is very similar. So yes, like you mentioned, the form was hidden up front. Maybe it was slightly different in one respect: what you are talking about is basically expressing a single action, pressing a single button, whereas what we had on top of that was a bit of a choice, making a selection out of one of four or five things.

So on top of the smaller commitment, one of the results is that you get some degree of stronger segmentation, and you are tailoring the content, the results, to that choice.

[18:29]

Alaister: With a lot of the tests you mentioned running traditional AB tests where you have got two variations, maybe because you have got a small sample size. Would you recommend running AAB tests where you are actually testing the same variation as well as a new variation?

[18:48]

Jakub: For sure, and I think that is something we did more of especially in the beginning. When you pick up a tool your natural instinct is not to trust it, right? Like, how do I know that this ruler is measuring properly? So AAB tests are one way of, in some form or another, building trust in the tool. Of course with time, as you trust the tool more and more, I think the need for that disappears.

One thing we found while running AAB tests, for example, was that with some of these tools, especially when you are in the AB testing mode that uses inline editors and inline changes, your variation loads and there is a split second or half a second before the change comes in. So you see somewhat of a flicker effect, and that in itself could potentially skew some of the results. It has been some time ago, but I think there are ways to deal with that, in how you preload your variations and whether you run them as split tests or not. That is something we discovered through AAB testing.
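An AAB comparison like this boils down to checking that the two identical arms are statistically indistinguishable. A minimal sketch of that check, with invented counts and not tied to any particular testing tool:

```python
from statistics import NormalDist

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test: are two observed conversion rates consistent
    with a single underlying rate?"""
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (conv_a / n_a - conv_b / n_b) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# A/A check: both arms saw the same page, so we expect a large p-value.
# Hypothetical counts: 48/1000 vs 52/1000 conversions.
p = two_proportion_p_value(48, 1000, 52, 1000)
```

A very small p-value between two identical arms would point at a measurement problem, such as the flicker effect, rather than a real difference.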

[20:21]

James: I have got a question here from Paul Z. He has been asking, “Is there anything you see time and time again that has just worked every time, be it a piece of copy or a certain design layout, something that every time you’ve tried it just seems to win?”

[20:38]

Jakub: There are definitely some ideas which we feel more and more confident in as we try them and see that they work. Can we say with 100% certainty? Probably not, but one thing is fewer form fields. And I think that is an example of decreasing friction: let’s say someone wants something, but to actually get it they have to go through fifteen form fields or something like that. It is painful. They are repeating passwords, being asked their mother’s maiden name, confirming their email address. So we basically found that having fewer form fields is one easy way to remove that friction.

Other things, like I mentioned a little bit with the previous test, there are some fundamentals: you need a benefit oriented headline, something that offers something to the reader, as well as a clear call to action, if you are optimizing for some sort of action to be taken. So if you have a little button that is faded out in the right corner or somewhere at the bottom, then yeah.

[22:02]

Alaister: Yeah there have been countless times that I have tried to do something online and have been turned off by too many form fields or too many options or something like that, it is crazy. But I mean from a very basic perspective, when you are trying to get a user to achieve something being able to reduce the friction and making it easier for them to do it is always a good first step. And that is what good user interface and good user experience is all about.

So I can see a whole bunch of people actually coming into the event right now which is great. For those of you guys that have just joined now and missed the first half of this Warrior Ask Me Anything event you can access all of the recordings within our war room. You can also access the previous events and all the future events. I encourage you guys to check out the war room. So we will just have a look at this right now.

[22:53]

Join the hottest membership in the internet marketing community and gain a huge wealth of knowledge for a tiny yearly fee of just $97. Access secrets shared by top internet marketers, hours of Q & A video recordings from the best experts. Get amazed with awesome war room special deals. Join the War Room.

[23:16]

James: Welcome back to Warrior T.V. we are with Jakub Linowski, he is a conversion rate specialist, founder of GoodUI and we have been having a good conversation here. Jakub I would just like to ask you for the people who are interested in getting going with UI changes and AB testing like this, would you recommend that they have some sort of UI background or how do you get into starting off?

[23:44]

Jakub: Starting off as in running your first test?

[23:47]

James: Yeah getting into this sort of field that you are doing, user interface design and conversion rate optimization stuff.

[23:57]

Jakub: I think it is good to aim for something where you will see the benefit. One of the first things you can experience when running a test that will motivate you to keep going is reaching a winning test. You will see, oh wow, I have done something and it improved something by whatever percent. At the same time I think it is also good to experience the flip side: the kind of test where you think you’ve achieved that, but you run it longer and you see it trickle down, now it is plus 5%, oh it’s zero, oh it’s actually minus 5%. That experience is also valuable. Do you need a UI background? Sure, that will help. There is definitely something to the way you treat your landing page, whether it has enough contrast, whether the typography is right. But if we are talking about sign ups and potential lead and customer conversions, I think the most important thing that will help people improve their landing pages is knowing their customer. So if over the phone you hear five or ten repetitive issues that people are frustrated about, you can turn that into an AB test without any graphic design knowledge. Maybe people are wondering, is there free shipping, or do you ship to whatever country? You can just answer that with a variation, or just implement it right away.

[25:50]

James: So your advice would basically be to just jump in there and give it a go then?

[25:55]

Jakub: There are definitely some best practices, and some other things to watch out for. We have this growing list of best practices that we learn about and capture as we go along, over at GoodUI.org/betterdata, and in the first part of the interview we mentioned some of them, like measuring sample size. I would say, in order to get to a result where you can actually claim you have provided an improvement, try to identify the things you are somewhat certain will improve. Maybe it is a test that someone else has run and you can repeat it; maybe there is a blog post about some x, y, z test that you can take, basically copy, and try yourself. So yeah, jump right in and aim for that kind of improvement.

[27:15]

Alaister: Yeah so being a usability expert and UI guy you are obviously interested in behavioral psychology, human behavior and things like that. For your websites and for your clients’ websites do you use usability tools? Maybe things like heat maps or usertesting.com, to try and understand what people are actually trying to achieve and what they are actually doing on the website?

[27:42]

Jakub: We have used heat maps in a test or two, provided by Visual Website Optimizer. Usertesting.com I have heard about; I haven’t used it yet. I had some clients, some potential leads, where we were going to use that to drive the way we designed the test. Qualitative feedback is definitely valuable; typically we haven’t done much of that. I think there is a point where pages are so optimized that you run out of ideas to test, or you run out of the obvious things you can improve. Whenever you hit that point, I think qualitative and deeper customer insights become more and more important.

[28:38]

Alaister: With those deeper insights, do you find that using the language your customers and potential clients use on the landing page works well for increasing conversions?

[28:50]

Jakub: So using customer’s language?

Alaister: Yeah.

Jakub: I can’t think of a test we actually ran where we tried to understand the customer’s language and apply it in a very concrete AB test, so I don’t know if I can answer that. We did have a test where we ran so-called natural language forms for one client. Traditionally we see form fields that are top aligned, ordered from top to bottom, with a couple of labels. Instead, what we did on the test, and I think we picked this up from some other companies that tested it, was intertwine words and sentences with form fields to create a bit of a narrative, a story that the person can participate in while filling out those fields. When you formulate forms in such a way you can also use the sentences, the wording, the language to reinforce the benefits of filling out those fields. And we had a significant improvement on that.

[30:21]

James: Okay. When you are starting off on a site what is typically the first type of test you might run? When you first meet a client and you want to just see sort of how things are going, what would be your first test?

[30:38]

Jakub: Sure. So even before we run a test, if there are five things that we are 95, 99% sure about and can just improve, we might decide not to run an AB test at all and just update the control, just implement the changes. The less certain we are, some of the first tests we run are again the things that we think will have the biggest impact. We might also pick the places that involve changes, so not just the things we can change but also the location on the page that can be tested. So pertaining to a checkout, let’s say it is a funnel of three pages: we might focus on the last bit. If we feel it has the most problems we might pick that as a first test and work our way backwards.

Another thing you can do is potentially run the lowest risk tests first. If someone is afraid of running experiments, you can do a less risky variation first to get them more comfortable.

[32:15]

Alaister: Working on conversion rate optimization for clients, I can imagine there will be times when you or your team are confident in a variation, however the client may be unsure as to the impact that variation will have. How do you go about trying to sell that variation, or trying to convince them that it is a good idea to run a test with this change?

[32:41]

Jakub: So dealing with risk. I think every idea has a degree of risk, and one way to mitigate it is that sometimes we set drop rules. Almost like in the stock market or trading, you can say, let’s say a week passes and the variation is performing below minus 40%; we have in the past taken variations out of a test, or stopped a test completely, if we really feel, okay, this is going nowhere, this is just hurting the business, we have got to pull out. So yeah, you can establish drop rules and stop tests that way.

Other things that reassure people: is this a test that has been run by someone else? Sometimes you can look up articles or blog posts or data stories and fall back on trying to repeat someone else’s success, right; that kind of alleviates some risk.

[33:59]

James: I have got a great one here from Ben H. and it is a bit crazy but I want to ask you anyway. “Do you have any color schemes or specific colors that you see that manage to convert consistently better than other colors? Whether it be for buttons or whole schemes.”

[34:19]

Jakub: So I think some people might expect, okay, if I change green to red, all of a sudden it is going to improve conversions by 100%. Be careful about stuff like that, because it is a small change and most likely you are dealing with a minor improvement, if any.

One thing to fall back on, however, is complementary colors in order to increase contrast. If you have a purple background page and you use a purple button, you are going to make it pretty hard to find that button, right? So anything that has contrast, and again there are different degrees of how far away from purple you can move. On the complementary color wheel I think the opposite is orange, so there are these opposite colors that you can use.

Another thing in terms of conversion that we typically apply, and it is more of an ease of use thing but can also help conversions: a common mistake people make is using the same colors for copy, for anything that can be clicked on (buttons, links), and for selected states. So we look at those three types of elements and try to establish a color scheme that has unique colors for all three: copy, clickables and selected items.

[36:01]

Alaister: I’m guessing one place where color changes may have a larger impact is the high visibility areas of your website, so maybe things like your navigation or the header image. Those potentially may have larger impacts. And I guess it really comes down to, as we’ve been talking about, just running the AB test and seeing what the results are.

I’ve got a really interesting question coming in here from Jeremy M. and he basically wants to talk about payment. So he runs a service and he wants to ask, “From a best practice perspective is it better to ask for the payment first and then the sign up and the registration or vice versa and ask for the registration first and the payment at the end?”

[36:51]

Jakub: Yeah, that question I’m not really sure about. I’m just making this up because I haven’t actually tested it, but to me logically it makes sense to offer someone something and then ask for payment at the very end, once there is some sort of motivation or intent to purchase, intent to act.

To ask for payment details up front, assuming that means asking for a credit card number, feels like forcing the choice first. I think it is easier to start off with lower commitment entries, such as your name or the plan selection or the product selection, and then, once that is built up, finally ask for payment. So that would be my gut feel.

[37:55]

James: I’ve just got a question come in from Chris and he is asking, “What is the longest AB test you have ever run that hasn’t sort of reached significance? So whether it be a couple of months and it just never got there and why do you think that was?”

[38:10]

Jakub: Right, so just recently we were running a test for close to three months. Did we reach significance? We did generate some… I mean reaching significance is not absolutely true or false, it is not black and white; there are different degrees, you establish your own standard, and it is up to the person to determine what is good enough. But to answer why that happened: in that particular case, which I actually covered in data story issue ten, it was a book launch. When we were estimating the sample size up front we were looking at, I don’t know, 5,000 uniques a month, and we were like, okay, cool, we can work with that. Based on those numbers we thought we would be done in a month and a half, two months tops. However, the book launch fizzled a little bit, winter came, and traffic started going down, so we had to deal with that.

One way we dealt with it was something called a [39:32 _ test]. Because there were multiple product pages that were very similar in type and structure and in the way they sold those products, we duplicated the test onto multiple other products and then basically summed up the results. And that is how we dealt with that test.
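Summing the duplicated tests can be sketched like this; the per-page counts are made up for illustration, since the interview doesn't give the actual numbers:

```python
from statistics import NormalDist

# Hypothetical per-page results for the same variation duplicated
# across similar product pages.
pages = [
    # (control_conv, control_n, variation_conv, variation_n)
    (12, 400, 18, 410),
    (9, 350, 15, 340),
    (14, 500, 21, 490),
]

# Pool by summing conversions and visitors across the duplicated tests.
ca = sum(p[0] for p in pages); na = sum(p[1] for p in pages)
cb = sum(p[2] for p in pages); nb = sum(p[3] for p in pages)

# Two-sided two-proportion z-test on the pooled counts.
p_pool = (ca + cb) / (na + nb)
se = (p_pool * (1 - p_pool) * (1 / na + 1 / nb)) ** 0.5
z = (cb / nb - ca / na) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))
```

Individually each page may be too small to reach significance, but the pooled counts can cross the threshold, which is how a test with fizzling traffic can still be concluded. This assumes the pages really do behave alike; pooling genuinely different pages can mislead.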

[39:55]

Alaister: We have spoken a lot about optimizing conversions for things like sign ups, email registrations, sale and things like that. Have you had any experience or seen other people optimize for other things? Maybe things like reducing bounce rate and exit and things like that?

[40:15]

Jakub: So bounce rate I am still not really sure about in terms of what value it brings to a business. The way I understand bounce rate, once you come to a page and then visit another page, whatever that second page is, the bounce rate will drop. Whether someone visits one or two or three pages of a website, will that really be good or bad for the business? I’m not sure. Maybe there are some businesses where it matters; if you are in the advertising area, or you just want people to come back and spend as much time as possible on your portal, I don’t know, maybe it is good for Facebook. But typically we focus on other metrics such as purchases, lead generation, getting people through a couple of steps, signing up. Those are the most common.

I had one person ask me once as well about comprehension of content for kind of an educational site, but no we have never measured or figured out a way to measure something like that yet.

[41:40]

James: If you have multiple ideas for a page and you are not sure which one to run, how do you decide which idea to go with?

[41:54]

Jakub: Right. So typically we organize them by the ones that we feel will have the most effect. And I think the benefit of doing that, assuming you are going to be doing multiple cycles of testing, is that if you can improve your baseline…let's say you are struggling with purchases. Maybe you are only getting twenty or fifty purchases a month, and if you can get that baseline up higher, then your future tests will be easier. So that is one way to look at it. And by that I mean maybe the checkout is in a weak state that could be improved, which again will make the next steps easier.

Another way to organize your ideas is…I mean we spoke a little bit about risk, so maybe less risky ones first. We have also started with the easiest to implement ideas as well. So let's say you have one test that will take two days to set up versus another variation that is complicated and might take three weeks or so. Knowing that, I think it is good to avoid the state where you are not testing anything, where you are not using the current traffic to generate anything. So with that in mind, you can start off with the easiest to implement idea, followed by something more complex. And then during the time that first idea is being tested you can develop your second or third variation that is more complicated.

[43:42]

Alaister: I know a lot of people when they first start out with AB testing they begin testing on the headline, changing the headline and trying to get more people to engage with their content. You mentioned earlier about trying to construct a benefit oriented headline. What are some examples of headlines you found worked really well for your clients?

[44:05]

Jakub: Headlines that worked. I think one of the tips is to try to focus on “you” language as opposed to “I” language. Sometimes when you build your website and your business you can say, “Oh hi, we build websites.” You can transform that, reformulate it into something that resonates more with the readers, with potential leads and prospects. You can write something like, “You can expect to receive some sort of amazing benefit x, y, z here if you read on,” or something like that. So the “you” type of language.

Benefit focused headlines, tied to the previous point, carry some sort of value to the readers. And sometimes we sprinkle in anchors, or we try to quantify the value. It is difficult for people to assess how valuable something is, and I think showing numbers helps…maybe it is a book with twenty chapters and seventy handouts, or seventy solutions to your problem, or something like that. Throwing in numbers quantifies the value. At the same time it can anchor; it can be used as a price anchor to overshadow the price. Let's say you have seventy handouts and the book costs $19. People subconsciously compare that value to the other value. So that can be used as a bit of a tactic as well.

[46:03]

James: You must have so many ideas buzzing around in your head, and I would love to quiz you all day about them, but where do you go to get inspiration for new tests and ideas and stuff like that?

[46:17]

Jakub: Articles, blog posts, podcasts. I like to listen to business podcasts, basically entrepreneurs talking, any kind of small startups which are doing quite a bit of testing and are open about it; things like Entrepreneur on Fire, Eventual Millionaire, the Smart Passive Income podcast, stuff like that. I think those have a lot of stories about people measuring and trying to lift their business off the ground. And sometimes they share insights.

At the same time we have our very own listing that you mentioned, GoodUI.org. We try to collect and kind of curate some ideas for ease of use and for higher conversion rates, so we kind of list those out there as well for others to make use of.

[47:16]

Alaister: With GoodUI.org and your data stories I mean we’ve spent a lot of time going through all of the data story issues and learning from a lot of your experiences. I know a lot of the team in the Warrior team as well as the Freelancer team has gained a lot of value and used a lot of the learning that you presented for our own tests. Tell us a little bit more about GoodUI data stories. And I know we’ve got a deal and we’ve been fortunate enough to get a deal specifically for the Warriors. So Jakub tell us a little bit about GoodUI data stories and the deal we have today for the Warriors out there.

[47:52]

Jakub: So Data Stories is a monthly publication where we write about our best tests, and like I mentioned at the very beginning, we are still learning. We have been doing this for a year and a half, so there is still lots of room to get better at testing. So as we take that journey, we write about our successes, what works, and our failures as well, and we try to show the effects and results from tests in a way that can be replicated, or the insights can be used by others. AB testing takes time; a test can take a week or two, or it can require two or three months, especially if you have lower traffic sites. So by sharing the results with others, we try to help people who are interested in either getting better at AB testing or who basically just want to use the results to implement and improve their conversion rates on various pages.

And the deal that we are running together is basically we have bundled six of our best issues for the price of one. So hopefully that will come in handy for some of you.

[49:18]

James: That will be amazing. I know myself and Alaister have both learned a great deal. So anyone out there watching this should really try and snap up some of those data stories, because they are really great for learning about great ideas and actually seeing the data behind these tests.

[49:36]

Alaister: Yeah so you have compiled six issues for the deal and I know they retail usually for $210 however just for the next two weeks we have got a deal where we are offering all six issues for just $35 so like you said that is the price of just one issue. So that is a really good deal. I mean with these data stories what I love about them is they allow people to really take a peek behind the curtains and really see exactly what you have done in regards to running these AB tests and the actual results. So it really saves people time in terms of running tests and really gives them actionable ideas and concepts that they can implement in the other tests. So I think we have gained a lot of value from them and I think a lot of Warriors are going to get a lot of value. So I definitely encourage you guys to check it out and take advantage of the deal.

So I think we are running out of time right now, but we have got a few questions coming in so we will try to go through a couple more and then we will have to wrap up.

So I have got a question here from Julie W. and she is saying she is always trying to increase conversion rates, whether it be through AB tests, running usability tests and things like that. Have you had any experience with using live chat software to speak with customers and people interacting with your website to try to increase conversions?

[50:59]

Jakub: So no, unfortunately I cannot say we have run a test with something like that. I did recently read something on Quora about it, and I think there are multiple people who have had some experience, and there are lots of mixed results; maybe that can be attributed to the fact that there are so many variables. Some people claim that maybe it depends on how well the person chatting on the support side is trained. It could also depend on whether they are present 24/7 or not. Maybe it is also attributable to the state of the page. If, let's say, you have a very successful page that answers a lot of questions, the effect of chat will be smaller, if there is any at all, than on a page that barely answers any questions.

So yeah I’m not sure about that one.

[52:06]

James: I’ve heard a lot of folks say that if you want to get more conversions, just put more CTAs on the page. And I have got a question from Angus B. here. “How many CTAs is just too many on a page?”

[52:20]

Jakub: 257. That is a hard question I mean…

[52:29]

James: Have you found that having multiple CTAs has increased conversions or anything like that?

[52:35]

Jakub: We have. I mean, we had a test where we had a shorter page; basically we transformed two calls to action into a single one on a very concise, very packed page, and we had success, a plus 10 or 12 percent effect with that. But having said that, we do typically repeat the message and the call to action every now and then, especially the longer the page is. And I think there is also an important distinction to make, because a CTA or a call to action implies there are possibly two or three steps along the way that are deeper, whether toward a true sign up or a true purchase.

And one interesting thing we found: you can make the buttons larger, you can sprinkle quite a few of them, you can make it almost impossible not to click, in some form or another. But that first step, that first action, that first click does not necessarily mean you are well off as a business, because what really matters is the conversion depth. As you measure stuff like that, I think it is important to also look at the steps that are deeper in the sequence.
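Measuring conversion depth rather than just first clicks can be sketched as tracking visitor counts down the funnel and computing both step-to-step and end-to-end rates. The function name and the counts below are purely illustrative.

```python
def funnel_rates(step_counts):
    """Step-to-step and end-to-end conversion rates for a funnel,
    given visitor counts at each step from first click to final goal."""
    overall = step_counts[-1] / step_counts[0]
    stepwise = [later / earlier
                for earlier, later in zip(step_counts, step_counts[1:])]
    return overall, stepwise

# Hypothetical funnel: CTA clicks -> sign-up form starts -> completed purchases
overall, stepwise = funnel_rates([1000, 220, 35])
```

A page tweak that inflates the first number without moving the last one shows up immediately here: the overall rate stays flat even though clicks went up.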

[54:14]

Alaister: A lot of Warriors use a lot of different types of landing pages whether it be landing pages with long form sales copy, short form sales copy or even a video landing page. Have you tested all of these sorts of landing pages and is there a certain type that you have found successful or not successful?

[54:34]

Jakub: No, I cannot say as an absolute whether long form, short form, video, or no video is better or worse. I think it all depends. Long form copy could have lots of garbage in there, or it could have great content, maybe a strong narrative tied in with some social proof of some sort that really builds trust and credibility in the thing that is offered. So I think it is more dependent on each case.

With video we have run some tests, actually, covered on Data Stories; however we haven't reached significant results.

[55:20]

James: Do you have any thoughts in a similar mind whether you have tested them or not about parallax design or multiple page sites for sort of more simple landing pages and stuff?

[55:33]

Jakub: So parallax, I've never tested that. I guess the question is whether it is parallax for parallax's sake; I'm not sure whether that is good or bad. So to answer the question: no, we have not tested it in any form. But there is one case where we might be implementing it. One use we find for parallax is avoiding double scrollbars. When we have two panels with different heights, different amounts of content, displayed on the same page, and at the same time we want to draw some sort of comparison between those two panels, maybe parallax is one solution to that problem, making both panels appear without using double scroll bars.

[56:37]

Alaister: Yeah it is a great solution. I never thought of using parallax as a solution for double scroll bars and different content on the same page.

So unfortunately we are actually running out of time right now. Before we wrap up Jakub what are sort of the top few things that you would, what is the top advice that you would give to people looking to start out in conversion rate optimization AB testing?

[56:57]

Jakub: Top few things to start off with conversion rate optimization? I think an easy test, just a simple AB test where B is something you are confident in, just to try to obtain some sort of result. A good habit to build is calculating your sample size up front, just to get an understanding of the fact that data will fluctuate and will be very chaotic the first day or two, or three, or even a couple of weeks depending on your traffic, and it will change. Calculating the sample size is a way of fighting that off a little bit.
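Once a test has run to its planned sample size, a common way to sanity-check whether the observed difference is real is a two-proportion z-test. This is a minimal sketch, not a method from the episode; the conversion counts are invented, and the normal approximation is just one possible standard of "good enough".

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two observed
    conversion rates (pooled two-proportion z-test)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # rate under "no difference"
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Invented counts: 120/4000 conversions for A versus 156/4000 for B
p = two_proportion_p_value(120, 4000, 156, 4000)
```

A small p-value (conventionally below 0.05) suggests the lift is unlikely to be noise alone, though, as noted above, the threshold is a standard each team sets for itself.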

And the last tip is actually 30-plus tips: kind of repeating something from earlier, have a look at GoodUI.org/betterdata. There is quite a bit of advice there on running your own tests.

[58:10]

Alaister: Great so that is GoodUI.org/betterdata is that correct?

Jakub: Yep.

Alaister: Excellent so we will have the link in the forum for everyone to access as well.

So thanks very much for everyone joining this Warrior T.V. episode and this Warrior Ask Me Anything event. It has been a great pleasure speaking with you Jakub and we really appreciate you giving up your time for the community and also offering that deal on data stories.

So just to recap, we have got six issues that normally retail for $210, and right now there is a special deal for $35 for all six, which is the price of one. So I definitely recommend that you guys check it out.
