Interview with Eric Ries Part 2

Man 1: Is as follows. We add a fourth state to the system which is, okay, the code is done, but have we validated whether this task was worth doing in the first place or not? Then we capacity constrain that bucket too. So what happens is let’s say we have five people. We do five tasks in flight. We complete those five tasks, and then we go get the next five. But while we’re working on the next five, somebody needs to be figuring out whether customers, for example, actually liked the five features that we previously did. Of course, we don’t do them in batches of five. They flow through one at a time.

If, by the time you’re ready to deploy the next feature, you still haven’t figured out whether the previous feature was any good, there’s no room to move it to the next bucket, so the capacity constraint immediately brings things to a halt. It starts to back things up and forces the team to get together and say, “Wait a minute. Wait a minute. How do we know that that feature we shipped last week was actually any good?”
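The discipline described here can be sketched as a tiny kanban model: a “done” column that is capacity-constrained just like “in progress,” so new work stalls until shipped work has been validated. This is a hypothetical illustration of the idea, not code from the interview; all names are invented.

```python
# Sketch of a capacity-constrained kanban board with a fourth state:
# in progress -> done (code complete) -> validated (learning complete).
# The key rule: "done" has a WIP limit too, so unvalidated work backs up
# the whole pipeline and forces the team to go measure.

class KanbanBoard:
    def __init__(self, capacity):
        self.capacity = capacity          # e.g. team of five -> five tasks
        self.in_progress = []
        self.done = []                    # code done, impact not yet known
        self.validated = []

    def start(self, task):
        if len(self.in_progress) >= self.capacity:
            raise RuntimeError("WIP limit reached: finish something first")
        self.in_progress.append(task)

    def finish(self, task):
        # The capacity constraint on the "done" bucket is what halts the
        # team when nobody has checked whether shipped features were good.
        if len(self.done) >= self.capacity:
            raise RuntimeError("done bucket full: validate shipped work first")
        self.in_progress.remove(task)
        self.done.append(task)

    def validate(self, task):
        # In practice: run a split test, look at metrics, talk to customers.
        self.done.remove(task)
        self.validated.append(task)

board = KanbanBoard(capacity=2)
board.start("feature-a"); board.start("feature-b")
board.finish("feature-a"); board.finish("feature-b")
# The "done" bucket is now full; trying to finish more work would raise,
# so someone has to validate before the next feature can flow through.
board.validate("feature-a")
```

The interesting design choice is that the constraint is structural rather than motivational: the board itself refuses to accept more finished work, which is what forces the “wait a minute, did that feature actually work?” conversation.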

Man 2: Yeah.

Man 1: Did we bother to measure? Did we ask anybody? Did we do any kind of testing? We had this realization: “Oh no. Actually, we didn’t.” It’s like, “Okay. Why did we do that? Why would you go on to the next thing when you haven’t learned whether the previous thing actually had a positive impact?” Because whether you mean to or not, each feature, each thing you do is informed by what you’ve learned in the past. So when you decline an opportunity to learn, you’re actually very likely to be working on stuff that doesn’t matter.

Man 2: Yeah.

Man 1: So for teams that apply that very simple discipline, it’s a very innocent, simple technique, but it has a very important long-term impact, which is that it reorients us from thinking of code being written as our unit of progress to thinking of having learned something as our unit of progress.

Man 2: Right.

Man 1: When I work with teams, I have them start practicing, say, split testing. That’s the usual way. When in doubt, split test.

Man 2: What does that mean?

Man 1: Oh, sorry. In an A/B test, say, 50% of your customers see the new feature and 50% don’t, and we see if that changes their behavior in any way. There are a lot of other ways to validate an idea, but that’s the easiest one.
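A minimal sketch of that 50/50 assignment, under the common assumption that you hash the user ID so each user consistently sees the same variant on every visit. The function and experiment names here are hypothetical, not from the interview.

```python
# Deterministic 50/50 split-test assignment: hash (experiment, user) so a
# given user always lands in the same bucket for a given experiment.
import hashlib

def variant(user_id: str, experiment: str) -> str:
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100        # pseudo-uniform in 0..99
    return "new_feature" if bucket < 50 else "control"

# Same user, same experiment -> same variant across visits:
assert variant("user-42", "checkout-button") == variant("user-42", "checkout-button")
```

Hashing rather than random assignment is the usual choice because it needs no stored state and keeps a user’s experience stable for the life of the experiment.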

Man 2: Yeah.

Man 1: When in doubt, test. Most of the teams I work with are not doing that kind of testing in product development; they might be doing it in direct marketing or something like that. But when they start doing it in product development, they discover that between 60 and 80% of the features they’re shipping have no impact on customer behavior. None. That’s really demoralizing. A lot of them are like, “We’ve got to stop doing this split testing. It’s driving us crazy.” It’s like, “Hold on, hold on!”

Your new features are so great, having such impact, that they affect customer behavior not at all? Customers are indifferent to whether they’re there or not? Why are we doing them again? At first, you get a lot of excuses: “Well, it looks so much better,” because products that don’t have split testing tend to get prettier over time, whereas products with split testing get more effective over time.

Man 2: Yeah.

Man 1: So it’s like, was our goal to have a pretty product or one that actually helps customers accomplish their goals? Oh, right. Our goal is to change how customers behave, because that is ultimately what our product is supposed to be about.

Man 2: This sounds like Google.

Man 1: Yeah. They do a lot of split testing.

Man 2: They try to. It drives designers nuts.

Man 1: It absolutely does.

Man 2: That’s why Doug Bowman left and went to Twitter. Now he’s being driven by Twitter.

Man 1: That’s right. There’s one important distinction to be made. Big companies that do split testing are generally using split testing to optimize, right? The classic 41 shades of blue. It’s like, well, we wanna squeeze out every tenth of a percent improvement in conversion rates. That is very useful in certain circumstances.

For startups, what I try to do is get them to measure only the macro. That is, judge their split tests only by the top five to ten key metrics of customer behavior. So even if you’re testing a minor thing like a button change, we don’t measure the clicks on the button.

Man 2: Right.

Man 1: We don’t care. We weren’t doing it for that reason. We were changing this page because we want customers to be more likely to download our app or pay us money or be retained or invite a friend. If the micro thing doesn’t have an impact on those macro metrics, why are we doing it? So that’s a way of getting people to focus not on the micro-optimizations, most of which are classic premature optimization.

Most startups that haven’t achieved product/market fit yet are still trying to squeeze every point of conversion out of a page. It’s like, no, no, no. We need to learn which of our fundamental hypotheses about customers are wrong. Let’s run those experiments. So it’s a way of up-leveling people’s thinking to, “Oh, right. Focus on the experiments that actually matter.” Not matter to customers; matter to our learning about what our product really needs to become.
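One way to picture “only measure the macro” is as a scoring rule that judges any experiment, even a button tweak, purely by top-level customer behaviors. The metric names, numbers, and threshold below are illustrative assumptions, not data from the interview.

```python
# Judge a split test only by macro customer behaviors (download, pay,
# retain, refer), ignoring micro signals like clicks on the changed button.

MACRO_METRICS = ["downloaded", "paid", "retained", "invited_friend"]

def macro_impact(control, treatment, min_lift=0.02):
    """Return the macro metrics whose conversion rate moved by at least
    min_lift (absolute). Each argument maps metric -> (conversions, users)."""
    moved = {}
    for metric in MACRO_METRICS:
        c_conv, c_n = control[metric]
        t_conv, t_n = treatment[metric]
        lift = t_conv / t_n - c_conv / c_n
        if abs(lift) >= min_lift:
            moved[metric] = round(lift, 4)
    return moved

control   = {"downloaded": (120, 1000), "paid": (30, 1000),
             "retained": (400, 1000), "invited_friend": (50, 1000)}
treatment = {"downloaded": (125, 1000), "paid": (31, 1000),
             "retained": (405, 1000), "invited_friend": (52, 1000)}

# A button tweak that moves no macro metric: the common 60-80% case.
print(macro_impact(control, treatment))   # -> {}
```

A real implementation would add a significance test on each lift; the point of the sketch is only that the micro metric never appears in the decision.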

Man 2: Wow. That’s pretty good stuff, because most companies don’t think about testing. Do customers really want these things? I spent the last half hour talking about that.

Man 1: Right. I mean, we kind of would rather not know. I mean, that’s a very human thing. It’s fun to stay in stealth mode. It’s fun to stay in planning mode. That’s where you’re living in a fantasy and that’s very entertaining. But if our goal is really to change the world, then we have to be really in the world through the whole process of building a new product. It’s very difficult to maintain your vision and also be dealing with feedback. That’s what these techniques are really about. It’s giving you the routine discipline to be testing as a reflex rather than as a special one-time thing, like, “See? It proves that I’m right.”

Man 2: On the other hand, in the last 10 years, doing this kind of testing has gotten dramatically…

Man 1: Dramatically easier.

Man 2: Right. We can spin up a new… I was talking with a guy who builds databases. I said, “Don’t you have to have a database to test all this stuff?” He goes, “Not really. You just go to Rackspace or go to Amazon, slide your credit card in, start up a server, and start up another one and another one, and then find some data somewhere, shove it in there, and test it out.”

Man 1: Yeah. One of the lean startup success stories, I know you’re very familiar with Aardvark. Those guys did something that was really smart. They would use Mechanical Turk to fake features that they hadn’t built yet. So there were a lot of situations where they’d expose a test of a new feature to a customer, a feature they planned one day to back up with a really amazing computational algorithm. But in the meantime, they’d just have a human being do the task and then present the results back to the user.
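The pattern here is often called “Wizard of Oz” testing: the feature looks algorithmic to the user, but a human produces the result behind the scenes. The sketch below is a hypothetical illustration of that routing idea, with stubbed names; it is not Aardvark’s actual architecture.

```python
# Wizard-of-Oz feature fake: route requests to a human worker until the
# real algorithm is proven worth building. All names are illustrative.

def answer_question(question, use_algorithm=False):
    if use_algorithm:
        return run_fancy_algorithm(question)   # the hard thing, built later
    return ask_human_worker(question)          # fake it with a person first

def ask_human_worker(question):
    # In practice: post a task to a crowdsourcing service (e.g. Mechanical
    # Turk) and wait for the answer. Stubbed so the sketch is self-contained.
    return f"[human-written answer to: {question}]"

def run_fancy_algorithm(question):
    raise NotImplementedError("build this only if customers want the feature")
```

The user-facing interface is identical either way, which is the whole point: you learn whether customers want the behavior before paying for the breakthrough that automates it.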

Man 2: Yeah.

Man 1: Just to make sure that customers would actually want that incredible algorithm if they ever had the breakthrough science necessary to build it. You see a lot of that.

Man 2: Actually, do they just keep that Mechanical Turk? Because that’s what we all do on Aardvark, right? I mean, I get questions all the time during the day. “Would you like to answer this question about Silicon Valley?” “Sure, why not?”

Man 1: That’s right. It absolutely is. They were able to iterate their way into a system that actually made it fun for people to wanna engage with those questions and that actually worked, because they were very focused on having a great design and building great technology, but doing that in the service of actually helping people get the information they needed. Whenever they had to choose between their precious whatever and the effective choice, which would be to help people, they chose to help people. That’s why they’ve got a lot to celebrate. That’s the general pattern that we see. When we’re reality-based, then it actually frees us.

Man 2: Now, they got bought by Google, right?

Man 1: Yeah.

Man 2: How do you think that’s gonna go? Do you think they’re going to continue lean startup methods when they get into Google?

Man 1: I’m very interested to see. I have high hopes, because a number of the founders had worked at Google before, so they really knew what they were getting into. I think with some of the other acquisitions, the classic thing with a big company is people don’t exactly understand the culture of the new place they’re going, and the integration doesn’t always happen seamlessly, as you know. But I’m hopeful. We’ll see. I think of all the big companies that I have had the chance to work with in my consulting practice, Google was one of the most open to this kind of thinking.

Man 2: It’s interesting talking with engineers who worked there. They have infrastructure constraints that make doing this type of iteration tough. You know, Google Buzz was developed over three years, so I keep joking, “Well, it’s actually a copy of FriendFeed.” Well, no, it was actually developed two or three years ago. In fact, it was partly developed by the guys that started FriendFeed. They left, did FriendFeed in the same amount of time, and shipped it before Google Buzz could ship.

Man 1: That’s right. I mean, when we were doing IMVU, we had a point where we had to decide if we wanted to sell it to Google, and we decided not to, to take venture financing and go our own path. Then subsequently, Google came out with their own 3D environment, Google Lively. It was kind of a Microsoft-y thing to do, to say, “Hey, you should let us buy you or we’re going to compete with you.” We were like, “Hey, Google, really?” But that happened to be the dynamic of that situation.

Man 2: Yup.

Man 1: Google Lively didn’t come out for another two years. It wound up copying a lot of elements from IMVU. But in the meantime, we had been learning and iterating over that whole time. So when they finally came out, they made all the classic big-company mistakes. They waited too long to launch. They tried to make the product too elaborate, and then they launched it way too publicly.

Most startups don’t realize that one of the big advantages you have is that you have a pathetically small number of customers. The obscurity that we languish in, where we’re always like, “Oh, why can’t we get more customers?”, is a gift. The gift is you can experiment wildly and nobody cares. Use that to perfect the thing at micro-scale and then launch big. Some companies really forget that.

Man 2: It’s interesting. I’ve been following this company called Pipeo, which is doing a new kind of social network. I almost didn’t want to talk about them, because after talking about them got them customers, or got them users, and users put demands on them, I’m scared now that they’ll never get to where they need to be to really build a good system. Google Buzz is the same way. They’re hearing feedback. I mean, they built a database to handle the feedback that they’re getting, right? So now they have a stack-ranked version of feedback. They could have just shown it to two or three people and it would have given them the same feedback.

Man 1: I think that’s right. It’s proving the model at micro-scale before you scale it up, and prematurely launching is really dangerous. It’s too bad. I know that experience you’re having, where you have a team and you’re like, “Boy, I don’t want to expose them to too much feedback because they might lose their way.” But if you think about it, if the company is at all successful, the majority of its life is going to be spent in public getting feedback. So instead of being afraid of that phase and trying to delay it, what I try to teach companies to do is develop a discipline for coping with that feedback early and often. In fact, try to eliminate the stealth period, or minimize it as much as practical, so that the company doesn’t get confused. Like, yeah, you get all kinds of feedback from everywhere.

But if the company is systematically testing its assumptions, then it knows what to do with that feedback. It knows how to slot it in. It knows when to test it, when to ignore it, and when to say, “Wait a minute. That’s actually an unexpectedly good idea. Let’s go test it out.” If you’re in that kind of lightweight testing mentality, it’s okay. Nobody can knock you off your game, because, hey, that’s an interesting idea. Maybe we’ll test it. You don’t have to commit to it. It’s not all or nothing. You go all in very rarely.

Man 2: Did you work with Aardvark?

Man 1: Yeah.

Man 2: What did they learn by trying this methodology? Because they probably weren’t all that familiar with it before they started their company, right?

Man 1: Yeah.

Man 2: Did they learn that at Aardvark, or did they learn it before? Did they end up bringing it in?

Man 1: I don’t want to take credit for the success they’ve had by any means. I think the founders there came in with a lot of really good ideas about how to test ideas that I can’t take credit for. We worked together because we had really compatible ideas, not because I taught them everything they know. Now that they’re successful, of course. Of course, I taught them everything they know.

I mean, it’s so tempting to take credit, but it’s totally inappropriate. What I will say is that what I admired about them from the first time I met them was that they had an unswerving commitment to finding out the truth. From their point of view, they had kind of a philosopher’s background, which was: our goal is to build something that really will matter to customers, and that’s our highest priority. So even though we have a lot of cool ideas, every cool idea, no matter how cool it seems, is going to have to be subjected to some testing.

Man 2: Give me an example of a cool idea that they had, and how did they test it?

Man 1: I don’t want to speak out of turn. They spent a number of months. They incorporated the company without a firm idea of exactly what the product was going to be. Instead of just picking the first good idea that they found, they spent months building prototype after prototype of different kinds of products that could lead to different businesses, all kind of in their area of interest, this thing called social search. But Aardvark, the thing we know and love, was one of the later concepts that they tested, because they didn’t want to get prematurely committed to something that wasn’t right.

I thought it was a really smart way to go about founding the company, to say, “Look, we have a vision of an outcome we want to see, but the path is confusing. We want to really try some different routes.” In each one, not just build a concept, not just a paper prototype, but actually build some version of semi-working software, what we call a minimum viable product, and actually expose some customers to it and see, “Does their behavior conform to what we thought?” And also, does the technology work? Sometimes we’ve got the perfect idea, but the technology is not right.

Man 2: So you said that… Let’s back up 20 seconds just to make sure I got it.

Man 1: No, no. That’s fine. One of the things that impressed me about Aardvark was they started the company and then were really vigorous about testing a number of different product concepts, all of which were compatible with their overall vision, but none of which they were specifically wedded to until they had gotten a sense that not only did this thing work for customers but also that the technology was actually gonna be good enough to do what they were promising to customers. That’s why we talk about having this problem team/solution team parallel process. If you just take customer development, the idea that you should, before you build something, go out and talk to customers and figure out what their problem is to make sure you’re solving an important problem. If you don’t have that coupled solution/minimum-viable-product process, I always say you wind up building teleportation, because you’re asking customers, “What’s the most important thing we could solve for you?”

If you really walk up that hierarchy of pain, you’re like, “Boy, customers would really like infinite money, the ability to teleport, extreme longevity.” You start to come up with problems that are so big that you actually don’t know how to solve them. So the role of the vision is to constrain a certain search space; then the problem team is looking for customer pain in that space, and the solution team is asking what we can do in that space. Where they overlap is when we find, okay, here are the areas where early adopters can be found who will buy a product that’s not quite ready. The kind of people, the Robert Scobles of the world, who will use an 80% solution and give you feedback about it instead of saying, “This is stupid,” or people who will tell their friends about it instead of waiting for their friends to tell them.

Those are the people you have to start with. You take their feedback in. Then again, you don’t just do what they say because Robert Scoble is not exactly the mainstream customer, right? He’s not exactly like other people. He’s special.

Man 2: No. Actually, that’s a big mistake a lot of people make: “We can discount Scoble because he’s an outlier, he’s a weirdo.” I remember hearing that when I had 1,000 followers on Twitter, and now a lot of people have 1,000 followers on Twitter. So if you had said that his feedback was wrong, you would have been saying a future group of people were wrong.

Man 1: That’s the proven technology adoption life cycle curve, seriously. Early adopters are your customers. Also, mainstream customers are your customers, just at different times, and they have different needs. So we always say feedback is not about you, it’s about them. When someone gives you feedback, it doesn’t tell you a thing about your product. It tells you about the person speaking. You have to understand that feedback in context. Figure out how it affects your choice of vision. Then, going back to the Aardvark story, understand how the information you’re getting interacts with the path you’re trying to go on.

Is it time to double down on this product or to pivot to a different product? That’s a judgment call. There’s no spreadsheet that will tell you for sure whether it’s time to pivot or not. But if you make solid predictions and you make data-based decisions, my experience is your judgment gets better over time, because you’re actually using your brain hardware. We’re basically associative maps at the end of the day. That’s what we’re good at. We start to develop an intuitive feel for what really works with customers and what doesn’t. That allows us to pivot and iterate faster over time, not slower. That’s the essence of the art of entrepreneurship in the era of the lean startup.

Man 2: Interesting. So what’s this conference you’re doing?

Man 1: Oh well, Aardvark is the perfect segue, because they are going to be one of our case studies. For people who are interested in hearing the story straight from the horse’s mouth, they’ll be there. So, April 23 in San Francisco. It’s called Startup Lessons Learned, sllconf.com. My goal with that conference is to build the most coherent startup conference of all time. Coherent in the sense that we’re not just going to have people standing on stage giving you random advice. When I was an entrepreneur, this is what drove me crazy. People want to give you startup advice, and it’s always in the form, “Well, I did X and then I made a lot of money. So if you do X, you too will make a lot of money.”

How do I know that that’s going to work for me? How do I know that you really did that? Hey, how do I know you actually made any money? So what we’re going to do is…

Man 2: Even if you did, it was a different time.

Man 1: It was a different time, so how do I know it will work now?

Man 2: It won’t.

Man 1: Right. Almost by definition.

Man 2: Well yeah, because YouTube was built five years ago in a different time. So if you try to copy YouTube and what they did, you’re gonna fail.

Man 1: I think that’s right.

Man 2: Our expectations are different. So I love the fact that you test your theories out. Oh, you wanna do a YouTube? Well, build a little prototype and show it to your friends and see if they like it. They’ll probably tell you, “Why would I use this? I’m on YouTube already.”

Man 1: Correct. This doesn’t solve a problem for me anymore.

Man 2: Yep. And so Chatroulette came out, right?

Man 1: That’s right.

Man 2: So there is opportunity to do another video startup, right? If you want to be in video, you’ve got to try some new things, not copy what the old guys did.

Man 1: That’s right. So the only enduring advice is the principles of innovation, not the practices and the tactics. That’s really what lean startup is all about. So the conference is going to be organized around this build-measure-learn feedback loop that we talk about in the lean startup. Our goal is to minimize total time through that feedback loop. So the conference is going to be one day divided into three sections around each of those stages in the feedback loop.

Man 2: Build, measure, learn.

Man 1: Build, measure, learn. Our goal is not to do any one of those things well; our goal is to do the whole loop as fast as possible. So in each stage, we’re going to have a keynote address. In the build phase, we have Kent Beck, the father of Extreme Programming and a signatory to the Agile Manifesto.

In the measure phase, our keynote is Randy Komisar, who wrote the book Getting to Plan B and a really great business book called The Monk and the Riddle. He’s a partner at Kleiner Perkins. He’s going to be talking about pivots and why it is that successful startups almost never succeed with Plan A. Why are they always on Plan B, or more likely Plan C?

Man 2: Well, you know. Hopefully, you never figured that one out. My best advice to people getting married, marry 2.0 first.

Man 1: That is the Getting to Plan B story. In the third phase, our closing keynote is going to be from Steve Blank, the creator of customer development. In each of those modules, we’re going to have the keynote address followed by case studies from actual entrepreneurs. That’s going to be entrepreneurs like Aardvark who have already exited. We’re going to have entrepreneurs who are bootstrappers, who are relatively early on, who are later on, people who came out of big companies to do startups, venture-backed, not venture-backed. A really diverse range. Aardvark, Grockit, Dropbox, companies that are really grappling with these issues now, so that the experience they’re having is, A, relevant. It’s a contemporary experience. It’s based in the internet of 2010, and it’s also informed by this theory, so they’re not just gonna say, “Randomly, here’s the things that we did,” but, “Here’s how it fits into the overall methodology.”

Man 2: Yeah.

Man 1: Then in each module, the third part is going to be where we focus on the dilemmas. That is: okay, let’s say you buy into the theory, but what about…? These questions are based on the really difficult questions people have been asking me over the past year as I’ve been traveling around the world evangelizing these ideas. Does this idea of rapid deployment, 50 deployments a day, does that scale? Sure, I can see how you’d do that on a five-person team, but on a 75-person team, really? So we’ll have a case study from entrepreneurs who have actually done that.

Man 2: Who’s done that?

Man 1: Well, we’re gonna bring IMVU to talk about that. Now they’re a 75-person company. How does it work now? What did they have to change? How did they stay true to those principles over time? Then I’ve asked every presenter to tell the truth. What didn’t work? What about the model was hard? What were some of the unexpected problems they had? If they had to throw stuff out, what did they have to throw out? Then the question is, what’s the role of design? It’s your question about the famous case of Google, right? When you’re split testing everything, when you’re building a minimum viable product, is there still a role for good design? What about the supposed Steve Jobs effect of just designing…

Man 2: I think design is gonna be how companies differentiate themselves. Gowalla is a good example of that. Gowalla actually, in some ways, has fewer features than FourSquare but I still am attracted to Gowalla because of its…

Man 1: Superior design. There is this tension: how do you get great designers to work in an environment where their ideas are going to be tested against objective reality? My experience is that the best designers love that, because they’re like, “Wait a minute. I am not going to be subject to the whim of some random product manager. If my ideas are better, they’re truly going to win out.” That’s very exciting. But my experience is that for mediocre designers, it’s terrifying. A lot of people are used to a world where you can win a design award for designing a product that never gets built. That’s a relic from the age of waterfall. We’re exiting that age; as design becomes important, it has to be integrated into product development at the core.

Man 2: It’s interesting because even CEOs don’t do this, right? I’ve been in failed startups, unfortunately. I’ve never had that win, but I can see the failures frequently. One of them is they don’t manage by who actually has the better idea.

Man 1: We live or die by the extent to which we’re a meritocracy. I mean, that’s the Silicon Valley way. Yet we talk a really good game about that, but in a lot of cases, we do not live up to that value. You can see it in the lack of diversity in the people that we hire sometimes. You can see it in companies that make decisions by what it says on your business card instead of who’s got the data to support it. That’s not about bad people. It’s about bad systems. So part of lean startup is learning to see entrepreneurship at a systems level, not getting caught up in the individual personalities and egos of “Well, that person says” or “This great CEO says.” This is not fortune-cookie entrepreneurship. We’re trying to actually figure out what really works and what doesn’t.

So, that’s the conference. It’s April 23. It’s going to be in San Francisco.

Man 2: Do you talk about infrastructure at all?

Man 1: We will. In the build phase, we’ll be really getting into the details of how you build a continuous deployment system, how you actually do that Agile thing I was talking about of constraining what you’re working on based on what you’ve learned, and there are really clear infrastructure components to that. But the conference is designed to be experienced by cross-functional teams. My experience teaching this for a while now is that it’s really key to have the CEO and the CTO in the same room at the same time. Otherwise, when I meet with MBAs and CEOs and business people, they’re always like, “This sounds great, but my technical guys will never go for it.” But when I meet with the technical teams by themselves, they’re like, “This is great, but my CEO will never go for it.”

But if they’re sitting in the same room at the same time, they kind of turn their heads and say, “Wait. Oh, could we actually do this?” and have a serious conversation. So the conference is actually designed optimally for a team. There’s a team ticket, and we’re going to treat teams like VIPs. They’ll have their own special seating. We’ve got a bunch of really high-quality mentors who’ve agreed to come to the conference and sit with the teams and actually help them take in what they’re hearing and figure out how to apply it to their company.

So yeah, we want to have technical people there. We want to have non-technical people. We’re trying to walk that line to make the content substantive and actionable for both groups, which is a hard thing to do, but I’ve had something like a year’s practice doing that.

Man 2: This is really aimed at startups, right?

Man 1: Yes.

Man 2: Are you going to have a separate conference for big companies who need to learn these techniques?

Man 1: No. In fact, big-company entrepreneurs are welcome. Here’s what I mean by that. We’re used to thinking of startups as two guys in a garage. If you’re two guys in a garage, you’re a startup. But I think that definition is way wrong. The first problem is that a lot of two-guys-in-a-garage teams are actually just building a small business. They’re not actually trying to create something new. They’re perfectly happy to find some interesting way to make a little bit of money, and they’re fine with that.

My definition of a startup is a human institution designed to create something new under conditions of extreme uncertainty. Think about the three-part definition. “Human institution” says our goal is not just to create a great product but to create a company that builds great products. So it’s fundamentally a management challenge, which is very weird given the two-guys-in-a-garage image. You don’t think of them as managers. Entrepreneurial management is very different from general management, what you learn when you get your MBA. That’s fine. I don’t mean to say that that kind of management is bad and this is good. They’re just two different things.

We’re trying to create something new. So we’re actually trying to change the world through the creation of a new product or service, what Christensen calls disruptive innovation. “Under conditions of extreme uncertainty” is the most important part. That means we don’t even know who the customer is yet. We don’t even know what the market will be. We don’t know what the price point should be. Our goal is to be on a search for those answers. If you think about that definition, human institution, new product, extreme uncertainty, it doesn’t say anything about the size of the company. It doesn’t matter.

I meet what I call involuntary entrepreneurs all the time. People who went to work for big companies. They thought they were taking a general management job but they discovered, “Oops, my safe industry is being disrupted. So now what?” It’s like, listen, you’re an entrepreneur whether you meant to be or not.

Anybody who’s facing those challenges is welcome at the conference. It is for entrepreneurs, so we’re asking that service vendors and other such people not just come and try to sell to the entrepreneurs, unless they wanna come and sponsor. We really want to have a substantive conversation among people who are actually grappling with these problems. My aspiration is to take the lean startup movement to the next level.

Some people are trying to deliberately misunderstand it. You see this whole lean-versus-fat startup controversy, which is about how much money you raise, which to me is like asking how tall you should be to be an entrepreneur. It’s an orthogonal question. It’s secondary to what process you use to build companies. So we’re starting to get people confusing it, people trying to get into flame wars about who really coined the term lean startup and who gets credit for this and that.

It’s like, first of all, congratulations. Being misunderstood is way better than being ignored. It’s a huge step up. Yet we’re still at the very, very, very earliest phase. It’s just starting to dawn in the awareness of some people. Now is the chance for us to really make sure that we actually have the ideas right. So ignore who’s getting credit, ignore all of the ephemera around what is trying to coalesce into a movement, and actually get all the people who care and who want to get at the truth in the same room at the same time to have a conversation about what the future of entrepreneurship is really gonna look like. I think it’s gonna be really fun.
