The expert on experts is Philip Tetlock, a professor at the University of California, Berkeley. His 2005 book, “Expert Political Judgment,” is based on two decades of tracking some 82,000 predictions by 284 experts. The experts’ forecasts were tracked both on the subjects of their specialties and on subjects that they knew little about.

The result? The predictions of experts were, on average, only a tiny bit better than random guesses — the equivalent of a chimpanzee throwing darts at a board.

“It made virtually no difference whether participants had doctorates, whether they were economists, political scientists, journalists or historians, whether they had policy experience or access to classified information, or whether they had logged many or few years of experience,” Mr. Tetlock wrote.

Indeed, the only consistent predictor was fame, and the relationship was inverse: the more famous the expert, the worse the predictions. That has a lot to do with a fault in the media. Talent bookers for television shows and reporters tend to call up experts who provide strong, coherent points of view, who see things in black and white. People who shout. Like, yes, Jim Cramer!

And a great quote from the film producer Samuel Goldwyn: "If I had said 'yes' to all the projects I said 'no' to, and 'no' to all the projects I said 'yes' to, it would have probably come out the same." Via "The Drunkard's Walk."