Saturday, September 27, 2008

Relative League Strength

How much better is the American League than the National League?

To answer, I looked at all pitchers and all hitters who appeared in each league within a decade and figured the RC/game for hitters in each league, and RA/game for pitchers, weighting each player by the lesser of his two plate appearance totals. I've done this before looking only at players who played in both leagues in the same season, but in earlier eras there was much less player movement, which killed my sample size.

Grouping by decade instead of year gives me a large enough sample to look at relative league strength before 1970. I use the difference in pitcher performance and combine it with the difference for hitters to estimate the won-lost record an average team in one league would have in the other league.
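The matching step can be sketched in a few lines. This is a minimal illustration, not the actual study code; the field names (pa_al, rcg_al, and so on) are hypothetical, and the real comparison covered every two-league player over a full decade, with RA/game for pitchers handled the same way.

```python
# Sketch of the matched-pairs comparison: each player who appeared in both
# leagues is weighted by the lesser of his two playing-time totals, so a
# brief stint in one league can't dominate the estimate.
# All field names here are hypothetical, for illustration only.

def league_gap(players):
    """Weighted average RC/game in each league for two-league hitters."""
    total_al = total_nl = weight = 0.0
    for p in players:
        w = min(p["pa_al"], p["pa_nl"])  # lesser plate-appearance total
        total_al += w * p["rcg_al"]      # RC/game while in the AL
        total_nl += w * p["rcg_nl"]      # RC/game while in the NL
        weight += w
    return total_al / weight, total_nl / weight

# Toy data: two hitters who appeared in both leagues within the decade.
sample = [
    {"pa_al": 400, "pa_nl": 200, "rcg_al": 5.0, "rcg_nl": 5.5},
    {"pa_al": 100, "pa_nl": 300, "rcg_al": 4.0, "rcg_nl": 4.4},
]
al, nl = league_gap(sample)
print(round(al, 2), round(nl, 2))  # same hitters, higher output in the NL
```

The min() weighting is the key design choice: it keeps a player who barely appeared in one league from swinging the league-wide average.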

For the 40's, 50's, and 60's, the National League dominated to about the same extent that the AL does today. What does this mean for player stats? Surprisingly little. Bill James used a .82 league strength modifier, or M, for AAA when he first published minor league equivalencies. To get an 88-74 record for the stronger league you only need an M of about .95. That would mean a slugger who hits .300 with 35 homers in one league would hit about .290 with 33 homers in the stronger league. The difference is barely noticeable, and only aggregate analysis with a large sample (and the introduction of interleague play) lets us quantify it.

Using an M of .82, by the way, would indicate an average record of 53-109 for a typical AAA team, about the same as the worst team we'd see in an average five-year stretch in the majors. The other levels, assuming M values of .7, .6, and .5 as I move from AA down to low A:

AA 35-127
A+ 22-140
A 12-150
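These records are consistent with a simple model: scale an average team's runs scored by M and its runs allowed by 1/M, then run the ratio through the Pythagorean formula with an exponent near 1.83 (the exponent often used in place of James's original 2). This is my reconstruction of the conversion, not necessarily the exact method behind the numbers above:

```python
# Reconstructed sketch: convert a league strength modifier M into an
# expected record for an average team promoted to the stronger league.
# Assumptions (not stated in the source): M scales runs scored down and
# runs allowed up, and the Pythagorean exponent is 1.83.

def record(m, exponent=1.83, games=162):
    ratio = m * m  # (runs scored * M) / (runs allowed / M)
    wpct = ratio**exponent / (ratio**exponent + 1)
    wins = round(wpct * games)
    return wins, games - wins

for level, m in [("AAA", 0.82), ("AA", 0.70), ("A+", 0.60),
                 ("A", 0.50), ("college", 0.25)]:
    w, losses = record(m)
    print(f"{level:7s} M={m:.2f} -> {w}-{losses}")
# Reproduces 53-109, 35-127, 22-140, 12-150, and 1-161 respectively.
```

With a league average of about 4 runs per game, the same assumptions also give the college scoring line: 4 x .25 = 1 run scored and 4 / .25 = 16 runs allowed per game.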

If I used M = .25 for college ball, then a typical college team would win one game per year, scoring one run per game and allowing 16. I have no idea how accurate that is, but to me it passes the smell test: 16-1 strikes me as a typical score when big league teams play college teams in spring training.