SARAH GREEN: Welcome to the HBR IdeaCast from Harvard Business Review. I’m Sarah Green. I’m here today with Nate Silver, who’s the founder of the New York Times political blog FiveThirtyEight.

He’s a statistician and a writer. And before getting into political forecasting, he developed the PECOTA projection system for evaluating baseball players. He’s the author of the new book, The Signal and the Noise: Why So Many Predictions Fail but Some Don’t. Nate, thanks so much for coming in.

NATE SILVER: Thank you, Sarah.

SARAH GREEN: So I’m curious. You have this very well-read blog. Why write a book? And why write this book?

NATE SILVER: Well, there are a couple of reasons why. Number one is I think that when you do a blog, you sometimes are so wrapped up in the news cycle and the day-to-day of things that you don’t have time to step back and breathe and think about the bigger picture. I think we have maybe more need now to read books.

And the thing about a book is that when you write a book, you rethink it and revise it many times over. And it involves, I think, certain parts of your brain that you don’t trigger as much when you’re doing a blog. So it was important to me. I think one theme in the book is that people sometimes are overly impressed by the newest information and the volume of information and don’t spend enough time considering how to put it all together.

But the other thing too is on the blog I mostly focus on politics. And I think elections are a really interesting thing to study and to try to predict. But I don’t particularly like politics. I find some of the people involved in politics, I don’t think they’re the most well-rounded or pleasant people necessarily, right?

So I want to broaden my focus a little bit and say, look, by being data-driven and looking at how predictions go, doing analysis from statistics and everything else, we can look at business or sports or a lot of different fields or science. And there’s nothing about politics in particular that my interests and skill set are uniquely suited to.

SARAH GREEN: So it’s interesting when you talk about things like politics or other subjects like baseball, I think part of the benefit to a statistician or a commentator or a writer is there’s a lot of information out there.

NATE SILVER: Right.

SARAH GREEN: So what about some of these other subjects where there’s maybe not as much publicly available data, and someone whether they’re in a business in that field, that’s affected by that field, or they’re just curious wants to explore some of that data?

NATE SILVER: Well, first of all, if you’re a business that can collect its own data, and it’s good data, then that’s hugely useful. I mean, one thing that makes Google so good– and they’re a case I talk about in the book a little bit– is that any time they want, they can conduct an experiment on the whole world, literally, since so many people use their products. And so that’s what they do.

Instead of saying, oh, here’s some theoretical approach to how to improve our search, they’ll just say, OK. Well, let’s try a different change in how the algorithm works and run it on 1% of our users, or 5%, for a couple days and collect data on how they’re responding. And if it seems to be favorable, then we’ll make it the new Google search. If not, then very little cost to us.

But they collect their own information, and they also know that there’s no substitute for testing your ideas on real customers. A lot of things that look great in a computer model or in PowerPoint don’t really pan out. And to some extent, the results even reverse.
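The experimentation loop Silver describes, trying a change on a small slice of users and comparing outcomes, can be sketched as a simulation. The traffic share, click rates, and function names below are illustrative assumptions, not Google's actual numbers or methods:

```python
import random

random.seed(1)

def simulate_clicks(rate, n_users):
    """Simulate how many of n_users click, given a true click rate."""
    return sum(random.random() < rate for _ in range(n_users))

n = 50_000  # e.g. a small slice of traffic

# Control: the current algorithm. Variant: the proposed change.
control_rate = simulate_clicks(0.050, n) / n
variant_rate = simulate_clicks(0.053, n) / n

# Ship the change only if the measured lift is clearly positive;
# if not, the cost of running the experiment was small.
lift = variant_rate - control_rate
```

The point of the pattern is that the decision rests on measured behavior of real users, not on how promising the change looked beforehand.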

SARAH GREEN: I’m glad you brought that up, because I wanted to ask you about how much of finding the answer, for instance, in hearing the signal in the noise, is about developing that better algorithm, the algorithm that would, say, pick out the terrorist from the 100 guys?

NATE SILVER: Yeah. Technique is part of it, but I think not as important maybe as people think. If you have fancy tools but you’re using them with the wrong goal in mind, then you can actually get yourself in trouble sometimes. And people talk a lot about data mining and so forth. But it’s important to keep in mind that discovering a past relationship and describing a past relationship in statistical terms does not guarantee it will perpetuate itself into the future.

And what you really want to find is not just a description of what happened in the past, but the structural and the causal relationships in the pattern. I think sometimes people think, oh, we have so much data, so much information, that the answers will emerge from that just on its own. And it doesn’t really. It can get you into trouble.

There are 45,000 economic statistics, for example, that the government publishes every year. And so you have some people who will put 8 or 10 of them into a soup and say, oh, a recession’s coming, or the next boom is coming. And it’s usually just looking at noise and mistaking it for signal, for a real meaningful pattern.

SARAH GREEN: Well, and that’s interesting, because in the book you really have this tension between the human being as pattern recognizer, the human being as false pattern recognizer, and human beings as sort of biased creatures. You mentioned government statistics and economic statistics. Any economist on either side of the spectrum can bring his own statistics to bear to prove his point.

NATE SILVER: Sure.

SARAH GREEN: So how do you think about in what you do this sort of quest for truth or an objective reality? Or is there no objective reality?

NATE SILVER: Oh, no. I absolutely believe that there is an objective truth. And I think there are some people in politics who are so jaded that they don’t believe that. But I also think that we perceive this truth imperfectly, so you’re making incremental improvements. One thing you can do, I think, is describe things probabilistically; that’s the way you almost have to go, unless you’re God and you know exactly what’s going to happen.

SARAH GREEN: So I’m interested in when you say you have to think about it probabilistically, because I’m not a statistician. I am sort of a recovering English major. So what does that really mean?

NATE SILVER: It means that you have to think about how much error there is, right? And you have to, instead of just making the prediction of, oh, who will win, you also have to kind of predict how wrong you might be. And people forget that part. They think, oh, I’ve just kind of issued my prediction, and that’s it.

Whereas I’m more interested in, how broad is the range? A lot of the time I might think, well, I think this event has a 20% chance of happening, and the conventional wisdom is that it has a 5% chance of happening. And so I’m arguing to calibrate those estimates a little bit better.

But in the fields in the book where we found people have had success at prediction– so for example, weather forecasters. You always hear, oh, there’s a 20% chance of rain or a 70% chance of rain, right? That frustrates a lot of people who want kind of metaphysical certainty about things.

But there’s no benefit to pretending you’re better at prediction than you really are. So another benefit is that you’re providing more truth in advertising. And the test is, over the long run, if you say something is a 60/40 chance, you should get 6 out of 10 of those predictions right over the long term. Not 10 out of 10, by the way, but you should get more of them right than not. And so that’s how we tend to think about things.
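The calibration test Silver states, that "60/40" events should come true about 6 times in 10, can be checked with a quick simulation. This is just an illustrative sketch; the event count is arbitrary:

```python
import random

random.seed(42)

def observed_frequency(stated_p, n_events):
    """Simulate n_events that each truly occur with probability stated_p,
    and return the fraction that actually happened."""
    hits = sum(random.random() < stated_p for _ in range(n_events))
    return hits / n_events

# A well-calibrated forecaster's "60% chance" events should occur
# roughly 60% of the time over the long run -- not 100% of the time.
freq = observed_frequency(0.60, 10_000)
```

In practice you would bin a forecaster's real predictions by stated probability and compare each bin to its observed hit rate in exactly this way.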

SARAH GREEN: I was interested when in the book you talked about the importance of humility, because it seems like we’re talking about very mathematical things.

NATE SILVER: Sure.

SARAH GREEN: But then we started talking about sort of a human value like humility. It was interesting to me to see you bringing in this almost sort of philosophical or ethical quality to it.

NATE SILVER: Yeah. You know, we come into the world with a point of view, which means that we have probably a biased point of view. If you flip a coin and chart the heads and the tails and assume they’re equivalent to the stock market going up or down for a day, then it’ll look like a real stock market chart. And economists have tried that experiment and shown it to people, to investors. The investors are trying to read deep meaning into these charts that were literally generated randomly by flipping coins.
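The coin-flip chart Silver mentions is easy to reproduce: a cumulative sum of random up/down steps yields a path that looks remarkably like a price series. The starting level and number of "days" below are arbitrary choices:

```python
import random

random.seed(0)

# One fair coin flip per "trading day": heads = up a point, tails = down.
n_days = 250
level = 100.0
path = []
for _ in range(n_days):
    level += 1 if random.random() < 0.5 else -1
    path.append(level)

# Despite being pure noise, `path` tends to show "trends", "support
# levels", and "reversals" that invite spurious interpretation.
```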

And that comes from our biological instincts, in part, where we tend to have kind of our very Neanderthal fight-or-flight instincts when we perceive a signal. If you see some rustling in the wind, you have to decide: is that a lion coming toward my cave? Is my family safe, or is it just the wind? And so we react very quickly to things.

But when we have so much information, we have to do what Daniel Kahneman would call thinking slow instead, where our first instincts aren’t always so good. The first instinct of most people when the stock market’s been going up is, oh, I want to put more money in, and the reverse when it’s going down. Of course, what that means is that you’re buying high and selling low, which is exactly what you don’t want to do, right?

So by the way, when the average American in the Gallup poll says it’s a good time to invest, the stock market gets a much worse return on average than when people are terrified. And that’s not that hard to figure out if you slow down. But if you just kind of watch CNBC and you see all the green arrows going up and everyone talks about the market as they did during the dot com boom kind of at the supermarket or in kind of ordinary social settings, you can be deluded by that. And you become part of a bubble that will later burst.

SARAH GREEN: Yes. And I’m glad you brought up the stock market, actually, because you had a few lines in the book that I think are sort of alarming, but at the same time very sadly not surprising. In various parts of the book you had talked about how, with a few exceptions, economic policymakers are pretty much, quote, “flying blind.” And then in the chapter on the 2008 financial meltdown, you wrote that the rating agencies’ predictions were basically so bad that, and again this is a quote, “it was as if the weather forecasts had been 86 degrees and sunny and instead there was a blizzard.”

NATE SILVER: Yeah.

SARAH GREEN: Which is exactly right, sadly. But it’s not terribly surprising, I think, to a lot of people in business, because they’re sort of living with this ambiguity every day. So what do we do about this? Do you see a way to develop better models? Or do you think we’re kind of just stuck with what we’ve got?

NATE SILVER: Well, the book tries to work on both sides of the problem. So it asserts that there’s a gap between how good we think we are at prediction and how good we really are. So you can do both. You can both say, OK, let’s think more realistically about areas where this analysis can accomplish a lot, areas where it can give you some guiding light, but not tell you everything, and areas where it’s probably futile, distinguishing those. And then once you have that more realistic conception of what you can accomplish, then you can start, I think, to move toward the realm of the achievable.

So it’s not really a pessimistic book. I think of it as ultimately being an optimistic book. And it talks about the areas where there has been a lot of progress. But it does require us to be realistic about what we can achieve and what has worked and what hasn’t.

SARAH GREEN: So I’m going to ask now about patterns. And you may have seen a pattern and you may have only seen a false pattern. I’m going to ask anyway. When you were really looking into the people who are really good at distinguishing signal from noise, did you notice a pattern among the people who can do it and the people who seem terrible at it?

NATE SILVER: There are characteristics. One of them is they tend to be multi-disciplinary. So instead of having one big theory, kind of capital-T, that explains everything, they throw a lot of different methods at a problem. They look for the consensus of information and where that might be.

Another thing is one I mentioned earlier, which is that they think in terms of probabilities. So if you’re someone who’s played poker, for example, as I did for several years, then that comes very naturally to you. And you know that you can play the hand well and your opponent catches a flush, which there’s a one in five chance of, and you’ll know you lose those hands sometimes. By the way, you can also play a hand badly and you can be lucky on the other end. So those kind of qualities seem to work a lot.
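The "one in five" flush figure matches a standard calculation, assuming he means a four-flush after the turn in Texas hold'em with one card to come. The arithmetic below is that textbook case, not anything taken from the book:

```python
from fractions import Fraction

# After the turn you have seen 6 cards (2 hole cards + 4 on the board),
# leaving 46 unseen. With four cards of one suit, 9 of the remaining
# suited cards ("outs") complete the flush on the river.
outs = 13 - 4
unseen = 52 - 6
p_flush = Fraction(outs, unseen)  # 9/46, roughly one in five
```

The probabilistic mindset he describes is exactly this: knowing the hand was played well even when the 9-in-46 card arrives and you lose.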

And they’re also willing to change their mind. It seems like a fairly obvious thing. But as the circumstances evolve and as the information evolves, then you have to be willing to adapt to it and not become too stubborn and dug in. Frankly, a lot of good prediction is avoiding the politics of appearances. When we’re predicting on the basis of trying to preserve our reputation– you see this problem in journalism in different ways– that’s different than when you actually have money on the line.

SARAH GREEN: That’s a good point. And actually I want to go back to something you said also when we were talking about changing minds, because I think in the book you have this interesting discussion of sort of the role of information, signal and noise, and partisanship or sectarianism and this kind of weird feedback loop between the two of them. Tell me about that.

NATE SILVER: So it is a fairly easy example with political polls, where on an average day now, we’re getting about 20 polls every day. And some of them show Obama ahead. Some show Romney ahead. Some show the trend toward Obama, some toward Romney.

If you pick out the three of those polls that you like the best and pretend that they represent the truth, then you’ll always have a good spin on the day for Obama or for Romney. And some people were literally doing that if they’re partisans– I was going to say hacks, but that’s a little impolite. But if you’re someone who doesn’t, I guess, believe in there being any objective truth, or loses track of it because you’ve become so jaded, then you really can focus on the outliers instead of the broader trend.

And when you have more information, then you’re more selective about the information that you consider, because none of us can consciously process all the information we see even in the news on a given day. I have a computer program I designed that can be very fast, right. And so that program can consider all the polls. And that’s helpful.

That’s one thing that makes us objective: when we set up a set of rules that are applied consistently to things. And so with that model, I decided back in 2008 when I designed it, here are what I think are good rules for handling polls, before I see any polls. Whereas instead, people kind of adapt their theses and their ideas in ways that confirm their biases as the data’s coming in.
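The difference between cherry-picking polls and applying a fixed rule decided in advance can be shown with a toy example. The margins below are made up for illustration, not real 2012 polls:

```python
# Hypothetical daily poll margins (candidate A minus candidate B, points).
polls = [1.5, -0.5, 3.0, 0.8, -1.2, 2.1, 0.3, -0.9, 1.1, 0.6]

# A partisan quotes only the three most favorable results...
cherry_picked = sorted(polls, reverse=True)[:3]
spin = sum(cherry_picked) / 3

# ...while a fixed rule -- average every poll, chosen before any polls
# arrive -- gives a less flattering but more honest estimate.
average = sum(polls) / len(polls)
```

Here the cherry-picked "spin" shows a 2.2-point lead while the full average is well under a point, even though both summaries draw on the same day's data.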

SARAH GREEN: So what’s interesting to me about this is that in the book– we’re sort of talking about algorithms and computers and polling and these sort of very modern things. But in the book, you actually start this discussion by talking about the printing press and movable type.

NATE SILVER: Right.

SARAH GREEN: In some ways, the technology is new but the problem is old.

NATE SILVER: Yeah. So there are periods in the past where you have an exponential increase in the amount of information available to people. And one of them was in the 15th century, when the printing press was invented. And we went from a book costing literally about $25,000 to produce in today’s figures, to $50 or $100. So it was actually a little bit expensive still, but affordable.

And so people finally said, well, maybe we should actually write down some of the ideas that we have, and could mass produce them. And so the Protestant Reformation was very much tied in to the printing press, where Luther’s theses were reproduced 250,000 times, which was an amazing number relative to the population of Europe back then. That would be a bestseller even today.

So people have something down on paper. They can testify, now this idea is right, not this other idea that I learned. People tend to get very dug in, certainly about religion, but also about a lot of things. And so you had– basically, this is obviously a gross simplification, but the printing press helped to perpetuate 100 years of holy war in Europe, indirectly at least. And then we kind of realized all the benefits to science and commerce.

But usually, when the amount of information available to people increases much faster than our techniques for processing it, then we can get ourselves in a lot of trouble. And now you see political partisanship has risen a lot in the past 30 years or so. And that might have to do with the increasing availability of cable news channels, where you can now get your news kind of flavored in 10 or 12 different types of ways. You no longer have to pay attention to the unpleasant things that might cause you to actually rethink your world view, potentially. And that, I think, of course encourages voters to be more partisan.

SARAH GREEN: Well, and because there’s so much information out there that you can only consume a small stream of it.

NATE SILVER: That’s true. And it is impossible to consume even really a fraction of the information out there. And so within this realm of polling, I have my shortcut which is I have a computer program do part of it for me. But that’s still, even relative to the whole election, a very narrow amount of the information that might be pertinent potentially.

And so, yeah, you do have to pick and choose to some extent. And to some extent it requires almost a kind of practice, I think, in how you consume your information. But it does start out with the goal.

Do I want the truth or do I want to make myself feel more pleasant when I read news that confirms what I already know? And of course everyone will say, oh, I want the truth. But I think deep down some people maybe don’t care as much.

SARAH GREEN: Well, and I think that even if you look at other subjects– like, for instance, I follow a lot of people on Twitter who are constantly tweeting about the future of the media industry. That’s something I’m materially interested in. But it may be that I’m just following the people who confirm my own biases, because reading that stuff makes me feel smart. And I could be painting myself into a corner.

NATE SILVER: Twitter is a very interesting application of this, where you’re just kind of blasted with new information. And you do have this real barrage of information thrown at you. And kind of picking the signal out of that can be difficult sometimes. But I think there’s also a danger in the age of social media of kind of group-think kicking in and having less diversity.

So one thing I talked about before were the people who are good at prediction tend to apply different methods and kind of look for the consensus or the average between them. Part of the brilliance of that and part of the brilliance of markets in general is that if you have a lot of independently-minded people looking at a problem in different ways, then you add a lot of value. But if people start thinking about things in the same way, you get these kind of feedback loops. And that can be very dangerous. And that’s part of why you have bubbles and panics occur.

And it’s, I think, part of why, for example, in politics now, the debates in this cycle, both in the Republican primary and the first one in Denver, had much bigger effects on the polls than average, even though other things haven’t moved the polls as much. I think part of it is because there can be a consensus that develops over social media very early on. Oh, Obama lost this debate in Denver. And that gets echoed and echoed and kind of blasted out instantly, instead of having people write about it independently.

SARAH GREEN: I think that is a real challenge. And I think that’s something a lot of businesspeople I hear from really struggle with, because they want to know, can I use judgment without letting bias filter in? How many people do I need to talk to before we can make this decision? Because you don’t want to get paralyzed by a search for consensus. If you were making a big decision, what would you do?

NATE SILVER: Well, some studies have found that anonymity works best when you remove reputational concerns. So for example, one thing it can be very hard to do if you’re in any kind of business application is to plan around deadlines that you know are uncertain. And you can always apply rules, like well, the real deadline’s always twice as long as the one that is claimed. But some businesses have experimented with actually having markets where people bet on when the project is done. And they do this anonymously, and that tends to elicit kind of more honest answers.

So I think a lot is just kind of making people comfortable to give you honest feedback. And frankly, look, a lot of consulting businesses serve that role, because they can go in and say something politically incorrect and no one at the business loses any reputation, which is a good business for consultants, but also means that you’re spending money usually on those consultants to avoid basically hurt feelings. And that’s probably not the best spent money in the long run.

SARAH GREEN: I was wondering, in the beginning of the book, in the introduction, you start talking about– you sort of refer to some mistakes that you made earlier in your career as a statistician. What, if you don’t mind, would be one of those mistakes? And maybe what could we learn from your example?

NATE SILVER: You know, I think when I was younger I was more inclined to say, OK, well, just the data the model spits out is kind of the truth, and not thinking as much about my own influence on it. I did a chapter for a book called Baseball Between The Numbers, that was about kind of what–

SARAH GREEN: It’s a great book.

NATE SILVER: Thank you. Thank you. That was like what things correlate with success in the post-season. And there were some interesting conclusions that pitching and defense, for example, really does matter more. But the technique I used to find that was a little bit too data-mining-y for my tastes now. And so I’m less confident that it’s really something meaningful as much as just kind of random and interesting patterns that happened to be true in the past but might not have that much kind of structural significance. So it’s minor things like that.

But look, I know that I’ve been fortunate over the long run to have not had some of these predictions that were high profile go bad. Like people give me a lot of credit for calling 49 out of the 50 states right in 2008. But look, unless you were a moron, you were going to get about 46 or 47 of those right. It was a pretty clear verdict by Election Day. And so the other three come down to some combination of luck and skill, and are basically 50-50.

I know that, look, whoever we have winning on Election Day– we have Obama as a slight favorite right now, but it’s a very close race. And we know that whoever we have ahead could easily lose. We might even say it’s 51% to 49% chance on Election Day. But based on where you end up on an outcome, there will be a huge amount of reputational concern.

You know, I play poker. And in poker, I made a lot of money for a while, then lost some of that money. So you know that’s kind of how it goes.

SARAH GREEN: Regression to the mean.

NATE SILVER: Exactly. You just try and make the mean a little higher, right?

SARAH GREEN: Yeah. And I think maybe a lesson here for all of us is to just get a little more comfortable with uncertainty. And despite all the numbers, there’s still a lot of uncertainty out there.

NATE SILVER: No, you have to. So I guess one of the people I talk to in the book is a guy who coaches high-stakes poker players. And he encourages some of them to meditate, as strange as it might sound. You think of poker as this– I think of poker as actually being a very kind of left-brain, mathematical kind of game, especially at the stakes these guys are playing. But at some point you have to become comfortable with knowing that you can play your best poker and you still won’t win sometimes.

SARAH GREEN: Nate, thanks so much for coming in today.

NATE SILVER: Of course. Thank you.

SARAH GREEN: That was Nate Silver. The book is The Signal and the Noise. For more, visit hbr.org.