Last week, I posted a graph showing the history of MLB run scoring since 1946. The graph showed a steady decline in run scoring since it peaked in 2000, including a fairly precipitous drop over the last 4 years.
Other than the exit of PEDs, the most common explanation I've heard from commentators trying to account for the drop is the growth of 'power bullpens' or 'specialized bullpens.' Generally, they'll point to relievers throwing 96 mph coming out of the bullpen in the 7th inning, and talk about how that just didn't happen a few years ago. Until I started looking at the evidence, I believed this was a big part of the decline in run scoring.
But not so much. Let’s start with RP usage:
This chart could be a post on its own. It’s pretty cool to watch relief usage grow over time. After four decades of growth, the starter/reliever shares of innings stabilized. 2014’s usage patterns are not significantly different from 2000’s.
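The starter/reliever innings split behind a chart like this is straightforward to compute. Here's a minimal sketch, assuming a hypothetical per-appearance table with `season`, `role`, and `ip` columns (the data and numbers are illustrative, not the actual MLB figures):

```python
# Sketch: relievers' share of innings by season, from hypothetical data.
import pandas as pd

appearances = pd.DataFrame({
    "season": [2000, 2000, 2000, 2014, 2014, 2014],
    "role":   ["SP", "RP", "RP", "SP", "RP", "RP"],
    # Plain decimal innings, not baseball's x.1/x.2 partial-inning notation.
    "ip":     [6.0, 2.0, 1.0, 5.2, 2.1, 1.0],
})

# Total innings per season and role, then relievers' share of each season.
by_role = appearances.groupby(["season", "role"])["ip"].sum().unstack()
rp_share = by_role["RP"] / (by_role["SP"] + by_role["RP"])
print(rp_share)
```

The "stabilized" claim is just this share flattening out: run the same aggregation over every season and the curve stops climbing around 2000.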
But are relief pitchers just getting better? Second chart:
Another cool chart. Starters commanded the vast majority of innings prior to 1973 because they were consistently as good as, or much better than, relievers at preventing runs. That relationship is clearer here:
Relief ERA relative to Starter ERA actually peaked in the 1980s, but has shown no clear trend in recent years.
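For concreteness, the relative-ERA metric is just the ratio of the two groups' ERAs. A minimal sketch, using hypothetical season totals of earned runs and innings by role (the numbers are made up for illustration):

```python
# Sketch: relief ERA relative to starter ERA, from hypothetical totals.
def era(er, ip):
    """Earned run average: earned runs allowed per nine innings."""
    return 9.0 * er / ip

starter_era = era(er=700, ip=1500.0)   # hypothetical league starter totals
reliever_era = era(er=280, ip=700.0)   # hypothetical league reliever totals

# A relative ERA below 1.0 means relievers outperform starters.
relative = reliever_era / starter_era
print(round(relative, 3))  # 3.6 / 4.2 ≈ 0.857
```

Plotting this ratio by season is what separates "relievers are getting better" from "all pitchers are getting better": if the ratio is flat while both ERAs fall, the improvement isn't specific to the bullpen.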
Here’s what the data show: MLB relief pitchers have gotten better since 2000, but their ERAs have improved in tandem with those of MLB starting pitchers, while their usage has remained constant. Whatever is making pitchers better at their jobs is not unique to relief pitching.