from the collaboration-is-key dept

If there is a single place where the sports and geek worlds collide, it is undoubtedly in statistics. It's long been said that baseball is a thinking man's game, partly because of the chess match built into its very skeleton, but also because of the role math plays in the decisions teams make in individual situations. By now, only those who work really hard at avoiding baseball will fail to recognize names like Bill James or Billy Beane. The people most responsible for constructing teams and their strategies today hold advanced degrees in fields like economics and statistics. What's interesting is how quickly advanced metrics, or sabermetrics, have exploded in use and depth over the past ten years after being almost universally derided by the major league clubs. Advanced stats are everywhere in baseball now, from the early focus on OPS (On-Base Plus Slugging) to WAR (Wins Above Replacement) to wRC+ (Weighted Runs Created Plus) and so on. What's amazing is how far behind other sports appear to be in developing their own advanced statistical systems. Take basketball, for instance. It would be easy to conclude that nothing resembling the development of baseball statistics has happened in professional basketball; otherwise we'd have heard about it, and knowledge of it would have spread as widely as it has in baseball, right?
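For readers who haven't run into these metrics, the earliest one mentioned above, OPS, is nothing more exotic than simple arithmetic: on-base percentage plus slugging percentage. A minimal sketch in Python (the stat line below is hypothetical, not any real player's):

```python
def obp(hits, walks, hbp, at_bats, sac_flies):
    """On-base percentage: (H + BB + HBP) / (AB + BB + HBP + SF)."""
    return (hits + walks + hbp) / (at_bats + walks + hbp + sac_flies)

def slg(singles, doubles, triples, home_runs, at_bats):
    """Slugging percentage: total bases per at-bat."""
    total_bases = singles + 2 * doubles + 3 * triples + 4 * home_runs
    return total_bases / at_bats

def ops(hits, walks, hbp, at_bats, sac_flies,
        singles, doubles, triples, home_runs):
    """On-Base Plus Slugging: simply OBP + SLG."""
    return (obp(hits, walks, hbp, at_bats, sac_flies)
            + slg(singles, doubles, triples, home_runs, at_bats))

# Hypothetical season: 158 hits (100 1B, 30 2B, 3 3B, 25 HR),
# 60 walks, 5 hit-by-pitch, 4 sac flies in 500 at-bats.
season_ops = ops(158, 60, 5, 500, 4, 100, 30, 3, 25)
```

WAR and wRC+ are far more involved (they fold in park factors, league averages, and positional adjustments), which is exactly why they took a community of collaborators, rather than a lone hobbyist, to develop.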

Well, no, actually, and the reason why is a lesson in how collaboration, open development, and building off the ideas of others produce the most advanced outcome. Such is Jason Schwartz's conclusion in his lead-up to the Sloan Sports Analytics Conference, where at least some discussion of basketball metrics is occurring. That conference, now an ESPN-sponsored event, grew out of what was once a simple Yahoo message board started in 2001 by basketball stats geeks. Early on, as was the case with baseball metrics, the forum was open for discussion, peer review, and the exchange of ideas. Unlike baseball, however, the NBA knew all about Moneyball by 2003, and teams were extremely interested in the potential of advanced metrics.

The NBA establishment quickly took notice. [Dean] Oliver, who published the seminal Basketball on Paper in 2003, seven months after Moneyball hit stores, was hired full-time by the Seattle SuperSonics in 2004. Another frequenter of the board, John Hollinger, was hired the following year by ESPN, and recently became a vice president of basketball operations for the Memphis Grizzlies. Hollinger's ESPN gig was filled by Pelton, who, after making his name at Basketball Prospectus, did a consulting stint with the Indiana Pacers' front office. Roland Beech, who created the popular website 82games, was hired by the Dallas Mavericks in 2009 as director of basketball analytics. (His boss, Mark Cuban, is regularly one of the biggest names at the Sloan conference.)

So you're probably thinking, "Great! The teams took notice in the early stages, unlike what happened in baseball, meaning that the knowledge was embraced!", right? Well, that's true, but the result was severely stunted growth in basketball statistics. Why? Well, if you know anything about how patents and intellectual property often function today, you've probably already guessed.

As soon as each statistician joined an NBA squad, sharing in public became off-limits, and so, gradually, the think tank closed shop. What were the teams paying for, after all, if their new stat gurus were just posting their ideas online for the other 29 franchises to read? This has had a paradoxical result: Because NBA teams embraced advanced stats so quickly, progress on basketball analytics has actually slowed down. The top minds are now all working in silos, not only unable to collaborate but actually competing against each other.

This is, again, the exact opposite of what occurred in baseball. Because baseball teams were not impressed by the idea of advanced metrics, favoring instead old-timey scouts on the ground, the best minds were free to collaborate with one another, forming what are now some of the most prestigious sports stats think tanks in history, like Baseball Prospectus and FanGraphs.

Major League Baseball teams were hidebound enough to ignore Bill James and sabermetrics for a full quarter century; as a result, he and others hashed out ideas in open, public forums. By the time MLB executives finally embraced advanced baseball statistics, the movement was fully formed.

If you want to draw the obvious analogy, baseball statistics were developed on an open source model, while basketball's have mostly been proprietary. As Schwartz notes, the resulting problem isn't necessarily a lack of knowledge; it's that the knowledge is segmented across individual teams, and nobody has the collective manpower to use it to its full potential.

Many, including Oliver, believe the killer app is hiding in there somewhere. The challenge is that there's so much information, it's easy to get lost. "It's like saying you're going to Wal-Mart or Ikea to get something," offers Tommy Sheppard, the Washington Wizards vice president of basketball administration. "You better know what you want, or you're going to walk out with a ton of s***." That each franchise is working alone, and against each other, compounds the problem. [Kirk] Goldsberry describes it as 30 "micro-CIAs," all racing against each other to "procure actionable intelligence out of these haystacks of vast data."

Sound familiar? Now, here's where it gets really fun for the purposes of our analogy. The quality of team construction in baseball is leaps and bounds ahead of where it was 20 years ago, in large part because of the explosion of advanced statistics and the resulting understanding of the game. Think about that for a moment. Even as these teams compete with one another, because of this open source statistical model for knowledge of the game, every team is better off for it. The game has universally advanced. Basketball, however, under the proprietary model, has not. While there have been rule changes that have influenced how the game is played, player evaluation is still essentially the same game it was 20, or even 40, years ago, and thus you still end up with teams that look good on paper based on the old stats but fail to perform well as a team. Why? Well, perhaps because the best minds aren't collaborating to advance the game through knowledge, and thus they're measuring the wrong things (and optimizing for the wrong things as well).

Thinking of each league as a microcosm of society and industry, the implications for intellectual property in general, and patents in particular, are somewhat breathtaking.