If software is eating the world, Eric Ries's Lean Startup is the menu, offering developers a blueprint by which to change the world of startups, corporations, and even churches. Lean Analytics, a new book by Alistair Croll and Ben Yoskovitz, tackles the "measure" part of Lean Startup's holy trinity of build, measure, and learn. It defines the five stages of a company (empathy, stickiness, virality, revenue, and scale), the most important metric to use at each stage, and the benchmarks against which you can measure your performance.

Fast.CoLabs talked to Lean Analytics co-author Ben Yoskovitz about stickiness, reality distortion fields, and the one metric that really matters.

Why does the world need Lean Analytics?

When you watch companies go through the simple lean startup cycle of build-measure-learn, you realize that companies are very good at building, but the whole process breaks down when you get to the measure and, by extension, learn part. It's so easy to collect data now that it's hard to figure out what to track. What should I measure? When should I measure? What's a good benchmark? As complicated and big as the book is, it's actually intended to help you simplify.

Is there always just one metric that really matters?

The one metric that matters comes down to the number one problem that you are solving now. Surprisingly, startups often don't know what their number one problem is; they are distracted by a host of different things: "I can't get on TechCrunch" or "I can't get enough users," when they already have 1,000 users and none of them are using the product.

The one metric that matters might change very quickly over time. You might only be focused on one thing for a period of a week. You need to have a target for that metric, a goal which says, "If I reach this level, I am confident enough to move along to the next thing."

As the business moves along there will generally be more numbers that you do genuinely care about. There may be one metric that matters for a particular department.

What makes a good metric?

Metrics have to be easy to understand. You are looking for numbers which are ratios or rates because they are easier to compare. In most scenarios that comparison is over time: Month over month what's my percentage of daily active users? The hardest thing is that a good metric has to change how you behave.
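The ratio Yoskovitz describes is often expressed as a DAU/MAU "stickiness" ratio compared month over month. A minimal sketch of that arithmetic, where the function name and all sample figures are illustrative assumptions, not from the book:

```python
# Stickiness as a ratio: daily active users / monthly active users.
# Ratios and rates are easier to compare over time than raw counts.

def stickiness(dau: int, mau: int) -> float:
    """Fraction of monthly actives who show up on a given day."""
    return dau / mau if mau else 0.0

# Hypothetical month-over-month comparison (invented numbers).
last_month = stickiness(dau=1_200, mau=10_000)
this_month = stickiness(dau=1_800, mau=10_500)

change = this_month - last_month
print(f"last month: {last_month:.1%}, this month: {this_month:.1%}, change: {change:+.1%}")
```

The point of the ratio form is that 12% vs. 17% is meaningful on its face, while "1,200 daily actives" is not, because it says nothing about the size of the base.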

The quantitative data isn't going to tell you why things are happening; it's just going to tell you what's happening. You need some qualitative insight from talking to your users. Developers often shy away from that. It's not always obvious to them that they should talk to other humans. You need all of that. Just knowing the data isn't enough.

What are good metrics to use at the early stages of a company?

At the empathy stage, which asks "Am I solving a problem which is actually worth solving?", it's going to be largely qualitative: looking for the thread of commonality among the people that you speak to. In my experience, once you have talked to ten people you start to see patterns.

When you get to stickiness, it's really about usage and engagement. It will depend on the type of product, but you are looking for daily use, weekly use, and percentage of daily or weekly active users.

You'll know pretty quickly if it's somewhat sticky, within days or weeks. Within a couple of months, you'll know if they continue to use the thing. You are probably not going to get past stickiness on the first go. You'll be iterating a number of times on that MVP before you'll feel like you've nailed it.

You also want to keep talking to these people throughout the process to get a feeling for the value you are creating, which will give you a sense for whether they will stick around or not. That might not be obvious just from the data. A lot of people use surveys as a way of not having to talk to people. Surveys are useful, but only when you have talked to enough people to know what kind of questions you should be asking.

The book contains benchmarks for various stickiness metrics like daily usage. Where do those figures come from?

When you talk about numbers almost everyone will ask: What's the threshold? When do I move on from that? When have I nailed it? It's very hard to find these benchmarks. We collected them through research, but they all come with a massive asterisk: each one is particular to a specific type of business and may or may not apply to yours. You have to use them carefully. Let's say you are building a Pinterest-like site and you know you want people to spend a lot of time on the site every single day, multiple times a day. If that's your goal and you are at 5 minutes (for media and user-generated-content sites, 17 minutes of daily usage serves as the threshold for stickiness), that's probably not good enough. You're a long way off.
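The benchmark check described above, comparing your own engagement figure against a published threshold, can be sketched in a few lines. The 17-minute figure is the book's media/UGC benchmark; the function name and sample values are illustrative assumptions:

```python
# Compare average daily time-on-site against a stickiness benchmark.
# 17 minutes/day is the book's benchmark for media and UGC sites;
# as the authors stress, it may not apply to other business types.

MEDIA_UGC_THRESHOLD_MIN = 17.0

def is_sticky(avg_daily_minutes: float,
              threshold: float = MEDIA_UGC_THRESHOLD_MIN) -> bool:
    """True if average daily usage meets or exceeds the benchmark."""
    return avg_daily_minutes >= threshold

print(is_sticky(5.0))    # the interview's 5-minute example falls well short
print(is_sticky(21.0))
```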

What are the most common bad or vanity metrics you have seen founders use?

Users, which is pretty common, is a bad one. That number is almost always going up and to the right. It completely ignores obvious questions, like whether anybody actually uses the product.
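One way to see why a raw user count misleads: cumulative sign-ups can only grow, while the share of them who are active can fall. A toy illustration with invented figures:

```python
# Cumulative sign-ups vs. active users, by month (invented figures).
# Total users rises every month even while the active share collapses.
signups_per_month = [500, 400, 300, 200]
active_per_month = [450, 500, 420, 300]

total = 0
for month, (new, active) in enumerate(zip(signups_per_month, active_per_month), start=1):
    total += new
    print(f"month {month}: total users {total:>5}, active {active:>4} ({active / total:.0%})")
```

The vanity metric (total users) goes from 500 to 1,400; the share of them who are active drops from 90% to about 21%.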

In the social and media space the bad ones are followers and fans. Advertising agencies will push back and say, "I get paid by companies—brands—to increase the fans." I get it that agencies get paid to do that, and the truth is that you need fans before you can get them to do anything, but we shouldn't be celebrating just because we have accumulated a lot of fans. There's always a line that has to be drawn back to the business model: I'm doing something way over here, like accumulating fans, because I want them to perform some behavior which will eventually drive my business model. Alistair (Yoskovitz's co-author) says, "I don't want fans, I want minions. I need people who do what I need them to do, when I need them to do it."

What analytics tools do you recommend?

We get asked a lot about tools but I am largely tool agnostic. I would encourage startups to track early, to instrument right away. Once your company is rolling there will always be a million things to do; you will never get to it, and you'll miss something that really matters.

We use Geckoboard internally for dashboards. What I will often tell startups is to use the tools that work out of the box—Google Analytics, KISSmetrics, Mixpanel, Keen IO for mobile. Those tools are pretty good, but unfortunately they often don't track everything you need. When you start to understand your business better, you may want to build your own tools for tracking things.
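A bare-bones version of "instrument right away" is an append-only event log you can layer dashboards and out-of-the-box tools on top of later. The schema and function below are an assumed sketch, not anything from the book or the tools named above:

```python
import json
import time

def track(event: str, user_id: str, path: str = "events.log", **props) -> dict:
    """Append one analytics event as a JSON line; analyze it later."""
    record = {"ts": time.time(), "event": event, "user": user_id, **props}
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")
    return record

# Instrument the moments you care about as they happen.
track("signup", user_id="u42", plan="free")
track("session_start", user_id="u42")
```

The design choice here is deliberate crudeness: a flat JSON-lines file captures everything now, and you can decide which metrics matter when you understand the business better.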

At what stage in your process are startups most likely to fail?

Most early stage startups jump ahead of themselves too quickly. Most of them will skip empathy or say "Oh yeah, I talked to five people who said it was awesome." Most startups fail at stickiness. They don't build something which people actually want. All the metrics in the world and all the process in the world won't matter if you don't build something people really care about.

They will attempt to move very quickly into virality, which is all about user acquisition and scale, and start talking about press and doing all these things to make themselves look or feel really good. They never get to revenue.

How do you avoid having founders think that Lean Analytics is a formula for success?

A) There is no formula for success. B) You should never say "I am a lean startup." I don't know what that means anymore; I'm not sure I ever knew what it meant. What I really want to know is: Have you identified what matters to your business today? What are you tracking? How many experiments are you running? How often are you iterating? I want to know the practical details of what that actually means.

Entrepreneurs believe that sheer willpower alone will make something happen. That's great. You need that to fight the fight, but if you can't back it up with anything practical you are in serious trouble. Entrepreneurs have to ask themselves, "Why do I believe this is true? Why do I think this will work? If there's a way I can measure it to some degree, I owe it to myself to do that and not just go exclusively with my gut." On the other hand, if you go all data you will end up in analysis paralysis. For me it's always about trying to balance gut and data.

The way I describe it is as poking holes in the entrepreneur's reality distortion field. We can't quite tear the whole reality distortion field down, nor do I think we necessarily should. People who start their own businesses have to be a little bit crazy I think.