
Reader Comments and Retorts

Statements posted here are those of our readers and do not represent the BaseballThinkFactory. Names are provided by the poster and are not verified. We ask that posters follow our submission policy. Please report any inappropriate comments.

Great article; I feel like all of this is true (better fielders; better fielder placement). Also, it seems like every team has 3 or 4 guys that throw mid 90's from the bullpen. I am not sure there is an immense advantage getting into the bull-pen any more; teams have their 7th, 8th and 9th inning guys that throw the snot out of the ball.

What if players are getting more athletic, resulting in faster fielders and more covered ground? Is it possible that, much like we see in the NBA, the playing space is seemingly shrinking because of the size and speed differences of players today compared to players of older generations? Just imagine how many fly balls would be dropped if your outfield consisted of Starling Marte, Carlos Gomez, and Gerardo Parra.

You could always look to history for an answer, such as "how many flyballs fell between the outfield of Vince Coleman, Willie McGee, and Andy Van Slyke?"

Only read the excerpt, but strongly disagree with the approach and conclusion presented. Yes, there are a lot of factors that influence the outcome of a PA. And if you only look at ERA, RA or whatever, you aren't going to be able to tell which one(s) are responsible for the shift.

Here's the thing though: We have a bunch of component breakdowns of what is happening. And basically most of the difference we have seen over the last few years can be explained by increasing K-rates. Which means we can toss out basically all the fielding and game strategy arguments. It means either pitchers are getting better, or hitters are getting worse. Since we also know that average FB velocity is up a chunk, it strongly points to the former.

And basically most of the difference we have seen over the last few years can be explained by increasing k-rates. Which means we can toss out basically all the fielding and game strategy arguments.

This is not true at all. If pitchers were exactly the same but fielding improved, you could still see K-rates jump up if hitters decided that they should only swing at pitches they could really drive, because if they can't drive it, it will be caught. I've noticed that you don't hear many "that ball had eyes" or "dying quail base hit" during broadcasts anymore. The problem is that the cause is basically non-identifiable, even with respect to:

a) Pitchers are throwing harder/better, so hitters can't get good wood on balls out of the sweet spot of the strike zone;
b) Hitters have decided that they can still get the same wood on balls out of the sweet spot of the strike zone, but that this wood is no longer good enough because those hits aren't hits anymore.

As with all things in life, it is almost certainly both, but deciding the fraction is extremely difficult.

bfan Posted: February 10, 2014 at 06:57 AM (#4654076)
Great article; I feel like all of this is true (better fielders; better fielder placement). Also, it seems like every team has 3 or 4 guys that throw mid 90's from the bullpen. I am not sure there is an immense advantage getting into the bull-pen any more; teams have their 7th, 8th and 9th inning guys that throw the snot out of the ball.

There may not be a short-term effect, but there's a long-term effect - if you're able to get into the bullpen consistently, you eventually hit their not-so-good relievers, if not in THIS game, in the next few games. Forcing teams to burn out their best relievers early and often eventually leads to, well, being Proctored.

This article does go beyond run scoring and shows the increasing strikeout rate, declining walk rates, and declining HBP rates. The last part was something I learned; I hadn't paid much attention to that. Could be a result of the ban on hitters bringing body armor to the plate like Biggio and Bonds used to do. Though somebody better start paying attention to Victorino and enforce the rule that he only gets a HBP if he actually tries to get out of the way, and definitely not if the pitch is in the strike zone.

The author doesn't go any deeper than this, not looking into pitch fx. In another recent article it was shown that the strike zone has been expanding in the pitch fx years, and this is responsible for at least a portion of the K-W changes.

There may not be a short-term effect, but there's a long-term effect - if you're able to get into the bullpen consistently, you eventually hit their not-so-good relievers, if not in THIS game, in the next few games. Forcing teams to burn out their best relievers early and often eventually leads to, well, being Proctored.

Teams mostly play 3 game series, so the benefit of your work will go to the next team your opponent plays. Well, at least the "wearing out pitchers" benefit. Teams that work the pitchers well take more walks and work their way into better hitting counts. That's the primary benefit to the team that takes a lot of pitches and hits a lot of fouls.

Though somebody better start paying attention to Victorino and enforce the rule that he only gets a HBP if he actually tries to get out of the way, and definitely not if the pitch is in the strike zone.

Absolutely on the latter, no on the former. I never understood why it's up to the hitter to bail the pitcher out when the pitcher throws a pitch where it shouldn't be thrown (in the batter's box). If you're willing to absorb the pain that comes with getting smacked by a 90 MPH pitch, then damn right you ought to get your base without having to go through some charade of trying to avoid it.

And I suspect Victorino's astounding HBP rate after the switch will start to drop. He was seeing same-side pitching for the first time in more than a decade, so it wasn't surprising that he got hit so frequently (particularly considering his stance). Of course, if he's not afraid to stand in there and take it, more power to him.

But yes, if he gets hit outside the box, by all means he shouldn't be rewarded.

The problem is that the cause is basically non-identifiable, even with respect to:

a) Pitchers are throwing harder/better, so hitters can't get good wood on balls out of the sweet spot of the strike zone;
b) Hitters have decided that they can still get the same wood on balls out of the sweet spot of the strike zone, but that this wood is no longer good enough because those hits aren't hits anymore.

It is probably closer to (a) than to (b), from what I can tell. Not that I want to say there's a trend in BABIP, since year-to-year variance is generally larger than any systematic effect, but to the extent there is a change, it has affected in-play power more than it has BABIP.

There was a spike upward in both BABIP and ISO on BIP between 2005 and 2006; in 2006 BABIP for non-pitchers went from .296 to .303, ISO on BIP went from .083 to .086. In 2010, the spike went back down; BABIP went from .300 in 2009 to .297, and ISO on BIP went from .085 to .082. In 2013, BABIP was at .298, but ISO on BIP was at .079, the first time it had been below .080 since before 2000 (I only checked 2000-2013).

Repeating a point - my numbers might be a little different from those you see elsewhere. When I compute BIP I don't include pitchers, nor do I include bunts. I do include inside-the-park home runs.
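For what it's worth, those conventions can be sketched in a few lines of Python. The function name and the sample counts below are invented for illustration, not taken from actual 2013 data:

```python
def bip_rates(singles, doubles, triples, itp_hr, outs_in_play):
    """BABIP and ISO on balls in play, per the conventions above:
    pitchers and bunts are already excluded from the inputs, and
    inside-the-park home runs count as balls in play."""
    hits = singles + doubles + triples + itp_hr
    bip = hits + outs_in_play
    babip = hits / bip
    # ISO on BIP: extra bases per ball in play
    iso_on_bip = (doubles + 2 * triples + 3 * itp_hr) / bip
    return babip, iso_on_bip

# Invented sample counts, just to show the shape of the calculation.
babip, iso = bip_rates(singles=1000, doubles=300, triples=30,
                       itp_hr=5, outs_in_play=3100)
print(round(babip, 3), round(iso, 3))  # → 0.301 0.085
```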

if you're able to get into the bullpen consistently, you eventually hit their not-so-good relievers, if not in THIS game, in the next few games. Forcing teams to burn out their best relievers early and often

Something in this statement doesn't quite square with current managerial doctrine. Virtually every manager today leaves a starter in for a predictable chunk of the game unless they're getting unmercifully hammered, or are hurt, or both. Then, if they're behind after five or six, they go directly to a string of not-so-good relievers. If ahead, they go to a pre-set sequence of good relievers. (That's a huge generalization of my impressions, but it's recognizable.)

In that scenario, a string of wins that go according to plan sometimes necessitates some shifts to prevent burnout. Your 7-8-9 good relievers may be able to go for three wins in a row and then need some rest in the fourth game; but that's the way it was supposed to go, not a problem.

IOW teams have by and large stopped burning out their best relievers. If you happen to need some relief in the 4th or 5th, these days, you go directly to the not-so-good pitchers and have them absorb their beating in a game that's probably lost anyway. In that situation, you can only be pleasantly surprised by a fireman stint that leads to a big comeback.

I'm not convinced that this is quite true; we will have to see how the long-term trends play out, but there are still a lot of short relievers who have 2-3 good years and then implode to a point where they are no longer useful.

They are definitely trying. Bullpens in effect are actually bigger than the 7 or 8 man bullpens you have at any point in time. If you have a doubleheader, usually the starter for one of the games is a guy you call up and then option right back to the minors after the game. And often if a mop-up reliever pitches 4-5 innings after a starter is knocked out early, his reward will be an option to AAA so a fresh arm can come up and take his place.

As to the actual article....
1. A trend from 2004 to now isn't really notable beyond normal random fluctuation.
2. It completely ignores other changes made (the going away of steroids, or other likely influences such as the change in bat standards).
3. As Arom points out in post 7, the body armor thing has probably made as big an impact as anything else.
The conclusion is somewhat funny, as it lists causes the article never actually discussed. Faster fastballs? What evidence of that? I wouldn't be surprised if it were true, but nothing in the article supports it. And I'm sorry, saying "nastier curveballs" is a joke; nobody throws a curve like they did in the 70's. And "more deceiving changeups"? Was the writer even alive in the 80's?

This is not true at all. If pitchers were exactly the same but fielding improved, you could still see K-rates jump up if hitters decided that they should only swing at pitches they could really drive, because if they can't drive it, it will be caught. I've noticed that you don't hear many "that ball had eyes" or "dying quail base hit" during broadcasts anymore. The problem is that the cause is basically non-identifiable, even with respect to:

I hate K/9 ever being used as an analysis tool. It's completely useless, same with anything that uses a per-9-innings denominator. In this case it's even worse, because using K% instead of K/9 would actually help their case more, since in a higher scoring environment pitchers face more batters per nine innings.
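The distinction is easy to show with a toy example (all numbers invented): two pitchers with an identical 20% strikeout rate per batter, where the one in the higher-scoring environment faces more batters per inning and so posts a better-looking K/9:

```python
def k_per_9(strikeouts, innings):
    """Strikeouts per nine innings pitched."""
    return 9 * strikeouts / innings

def k_pct(strikeouts, batters_faced):
    """Strikeouts per batter faced."""
    return strikeouts / batters_faced

# Same skill (20% of batters struck out), different run environments.
high_offense = dict(k=40, ip=45, bf=200)  # ~4.4 batters per inning
low_offense  = dict(k=34, ip=45, bf=170)  # ~3.8 batters per inning

for p in (high_offense, low_offense):
    print(k_per_9(p["k"], p["ip"]), k_pct(p["k"], p["bf"]))
# → 8.0 0.2
# → 6.8 0.2
```

K/9 differs by more than a full strikeout even though the underlying per-batter skill is identical; K% reports the same 20% for both.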

Or, what if coaches are becoming more strategic because of the advanced data available to them? Things like defensive shifts and situational pitchers (lefty specialists, for example) have certainly increased over this time period.

Wouldn't defensive efficiency support or disprove that?
2013: .692
Going all the way back to the selected year of 2006:
2006: .687 (mind you, 2005 was .693)
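Defensive efficiency is the share of balls in play converted into outs (roughly 1 minus opponents' BABIP). A minimal sketch of one common formulation, with invented team totals; the handling of reached-on-error and sacrifices varies by source:

```python
def defensive_efficiency(pa, h, hr, so, bb, hbp):
    """Fraction of balls in play turned into outs by the defense."""
    balls_in_play = pa - so - bb - hbp - hr
    hits_in_play = h - hr
    return 1 - hits_in_play / balls_in_play

# Invented season totals for a team's opponents.
print(round(defensive_efficiency(pa=6200, h=1400, hr=160,
                                 so=1200, bb=500, hbp=60), 3))  # → 0.71
```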

I am not sure there is an immense advantage getting into the bull-pen any more

I tend to agree with this. I lean toward the larger strike zone as the root cause for the increasing strikeouts and decreasing walks. If pitchers are getting the calls at the margins it is a huge advantage for the pitchers.

Hitters are still being trained to go deep into counts to wear out pitchers but I agree that the benefit is fading with larger bullpens. Better to swing at the first good pitch you see.

I am not sure there is an immense advantage getting into the bull-pen any more

Disagree with that. As pointed out on this thread, getting to the "lower" levels of relievers, the ones generally used in the 6th and 7th innings, helps the offense out.

Roughly speaking, relievers' performance over the course of the season is better than starters' (roughly half a run better), but that improvement is being carried by 1) the closer, who will never, ever, ever pitch in the 6th/7th inning; 2) the setup man, who is being saved for the 8th inning; and 3) the LOOGY, who is being saved for the biggest left-handed bat in a crucial situation. The rest of the relievers are roughly equivalent to the fourth starter on the team. You absolutely want to get away from the ace to face those guys (regardless of what Tango and MGL say).

Disagree with that. As pointed out on this thread, getting to the "lower" levels of relievers, the ones generally used in the 6th and 7th innings, helps the offense out.

There's always been an advantage in getting into the bullpens early. That advantage still exists. Many teams have built pens where there's little advantage in getting "into the pen" in the 8th or 9th, because you're just going to get "O'Ventbrel" or something similar.

Of course, this is hardly new. There was no real advantage in getting into the Reds' "Nasty Boys" pen after the 7th inning. There is always an advantage in getting into the pen early, because it means 1) you've beaten up the starter well enough to get him out of the game, which means you're probably already on the board and likely leading, and 2) you're getting the soft middle of failed fifth starters just trying to hold on another year, not a flamethrowing 20-something throwing it past you.

Disagree with that. As pointed out on this thread, getting to the "lower" levels of relievers, the ones generally used in the 6th and 7th innings, helps the offense out.

Roughly speaking, relievers' performance over the course of the season is better than starters' (roughly half a run better), but that improvement is being carried by 1) the closer, who will never, ever, ever pitch in the 6th/7th inning; 2) the setup man, who is being saved for the 8th inning; and 3) the LOOGY, who is being saved for the biggest left-handed bat in a crucial situation. The rest of the relievers are roughly equivalent to the fourth starter on the team. You absolutely want to get away from the ace to face those guys (regardless of what Tango and MGL say).

Get away from the ace, sure, but there are 4 other starters on that team, and the ace is the one least likely to be knocked out early. Besides, the question isn't whether the starter is better than the middle reliever in a vacuum, it's whether he's better facing the lineup a third time than the middle reliever is facing the lineup for the first time.

Perhaps the sports medicine field is helping out here as well. Pitchers are throwing fewer stressful late-inning pitches thanks to the research done, extending the careers of the most talented pitchers, and when they do get hurt, they get fixed (and get back sooner). This really shows up when comparing this generation with previous generations, but it might have some value relative to 10 years ago too, especially the speed of recovery. Specialized pitching (so that when a starter comes out late in the game, the hitters are still facing effective pitchers) and the added career time the best pitchers are enjoying (not the top 1% - the Seavers and Ryans - but the top 25-30%) both help.

One last thought, I believe younger pitchers are being given a chance earlier and more often than 10 years ago. Calling up younger soon-to-be stars (but taking care of their arms) and letting go of the over-paid, wrong-side of the career vets might also account for better results.

1/3 of current pitching slots in MLB would be filled by pitchers currently in the minor leagues if not for Tommy John surgery. It’s hard to believe that this has had no effect. It’s not like the pitchers with TJ surgery are exclusively in the bottom third either.

I stopped reading when the second sentence of TFA referred to Clayton Kershaw as a "finesse lefty." Not to go all Joe Morgan here, but why the hell should I waste my time reading an article by someone who's clearly never watched a baseball game?

This is not true at all. If pitchers were exactly the same but fielding improved, you could still see K-rates jump up if hitters decided that they should only swing at pitches they could really drive, because if they can't drive it, it will be caught.

Yep. Also, with fielders becoming better and better at turning double plays, batters no longer have an incentive to simply avoid striking out with a runner on base.

faster fastballs? what evidence of that? I wouldn't be surprised if that was true, but nothing in the article supports that

Fastball velocity is up a bit over the last half decade, which can be quantified with pitch fx data. But yeah, this article doesn't go there.

Fangraphs' earliest season for reporting velocity is 2002, and MLB average velocity was 89.9 mph that season. The MLB average velocity in 2013 was 91.7 mph. With a couple of exceptions, the velocity increases progressively each year. That is a fairly significant increase in velocity over the course of a decade. I thought it was fairly well accepted that average pitcher velocity today is the highest it's ever been; I recall reading quotes from scouts and pitchers in the 70's, 80's, and 90's saying that average pitcher velocity was in the high 80's then, compared to above 90 now.

C'mon, let's stop exaggerating. BABIP has barely moved at all and it looks like random variation. The great defensive positioning revolution is, at best, having an effect around the margins. I'm really not sure it's accomplishing all that much other than messing up our zone-based fielding statistics that don't adjust for the defender's positioning within those zones at the start of the play.

To the extent we hear the phrase "seeing eye grounder" less often it's because of the booming K rates (lower contact rates). Even so, just some recent numbers for the AL 2009-13 (basically keeps pitchers out of the equation):

Those numbers are extremely stable. The obvious first guess for a cause is a larger strike zone. Some pitches that used to be balls are now strikes which, if batters made no change, would result in pretty much exactly what we see -- higher Ks, lower BBs and everything else pretty much the same.

At best you can argue that BABIP is down 7 points ... or 7 per 1000 BIP. That is not a world where batters have to make some radical adjustment to respond to better defensive positioning.

Now baseball is a game of very small differences -- an extra 2 K per 100 PA isn't an obviously "big" change either. But swapping a non-K for a K must be worth something on the order of half a run. That would be an average reduction of about 62 runs a year or about .4 runs per game ... actual scoring difference between 2009 and 2013 is .49 runs. (Clarify -- I guessed at half a run; if somebody has the actual estimate, please plug it in.)
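That back-of-the-envelope arithmetic, spelled out (the half-run value per strikeout swap is the commenter's own guess, and the season PA total below is a rough assumption):

```python
# Rough cost of the AL's strikeout increase, per the comment above.
run_value_per_k = 0.5        # assumed runs lost when a non-K becomes a K
extra_k_rate = 0.02          # +2 K per 100 PA
team_pa_per_season = 6200    # rough team total (~38 PA/game * 162 games)
games = 162

runs_per_season = run_value_per_k * extra_k_rate * team_pa_per_season
print(runs_per_season)                    # → 62.0
print(round(runs_per_season / games, 2))  # → 0.38
```

That lands close to the observed .49 runs-per-game drop between 2009 and 2013, which is the comment's point: the strikeout increase alone accounts for most of the scoring change.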

HBT or Fangraphs or one of those had that nice article a couple weeks ago looking at pitch/FX over the last few years, demonstrating a larger K zone, especially below the knees.

And don't get too worked up about the G/F ratios either. A .82 G/F ratio translates to 45% of contact resulting in a GB; .76 is 43%. Under an 8% HR/FB ratio, that's 1.6 fewer HR per 1000 contacts.
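The ratio-to-share conversion in that comment, as a quick check. The 8% HR/FB rate is the comment's assumption; exact arithmetic gives about 1.5 HR per 1000 contacts, with the quoted 1.6 coming from the rounded 45%/43% shares:

```python
def gb_share(gf_ratio):
    """Ground-ball share of contact implied by a GB/FB ratio."""
    return gf_ratio / (1 + gf_ratio)

hr_per_fb = 0.08  # assumed HR per fly ball
# HR per 1000 contacts at each G/F ratio.
hr_per_1000 = {gf: (1 - gb_share(gf)) * hr_per_fb * 1000
               for gf in (0.82, 0.76)}

print(round(gb_share(0.82), 2), round(gb_share(0.76), 2))  # → 0.45 0.43
print(round(hr_per_1000[0.76] - hr_per_1000[0.82], 1))     # → 1.5
```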

Those ratios have been pretty stable since 1994 except for K-rates. 2013 K% is up 4% relative to 2000 AL and 1994 AL while walks are down about 1.5% compared to each. It's when you comp to 1992, when the world was pure, that you can see some possible trade-offs by the hitters -- K% up then way up, walk rate stable then down, IP% down in exchange for 2+% on HR/FB and a "big" jump in BABIP.

24. zachtoma Posted: February 10, 2014 at 04:50 PM (#4654452)
This is a lot simpler than any of that - for a long, long time, umpires didn't call the rulebook strike zone. Then pitch f/x data was developed, and now they more or less do.

Not true, though. Pitch f/x has shown the strike zone has expanded significantly since its inception.

Also, as pointed out above, BABIP has not changed much at all, and is still well above historical norms dating back to 1930, so defense and poorer contact is not explaining it.

My vote's for a slightly deader ball and the expanded strike zone being responsible for the advantage pitchers have. No evidence for the former, but there is evidence for the latter.

Fangraphs' earliest season for reporting velocity is 2002, and MLB average velocity was 89.9 mph that season. The MLB average velocity in 2013 was 91.7 mph. With a couple of exceptions, the velocity increases progressively each year. That is a fairly significant increase in velocity over the course of a decade. I thought it was fairly well accepted that average pitcher velocity today is the highest it's ever been; I recall reading quotes from scouts and pitchers in the 70's, 80's, and 90's saying that average pitcher velocity was in the high 80's then, compared to above 90 now.

Velocities in the pitch f/x era are higher partly because of where they are measured: pitch f/x estimates velocity at the release point, while radar gun readings were taken well after release, where the ball has already slowed (velocity falls off quite quickly on the way to the plate). So I would not put much stock in the 2002 figure. This is similar to the issue with Japanese pitchers' velocities in NPB supposedly being a couple of miles per hour lower than in the US, since they don't use pitch f/x there; that's from an article comparing Darvish's velocity in Japan with his velocity in MLB. The increase in velocity since the advent of pitch f/x in 2007 is rather slight, and could possibly be attributed to tweaking of the system, as in the early days of pitch f/x there were significant variances from park to park. From 2009, when pitch f/x had its first full year in every park under its belt, the average velocity has only increased 0.5 mph, and I have no idea whether that's real or due to changes in calibrations and/or the algorithms used to estimate velocity at the release point.

Pitchers are obviously bigger and stronger than in the 60's and 70's, but so are hitters, and I don't think there is that much difference in pitchers over the last 5-10 yrs. I think the strike zone is a much more important factor.