AL run scoring was way down in 2013. Batters scored just 4.33 runs per game, their lowest level since 1992 [chart updated for added lusciousness]:
I find this graph amazing. Except for a brief period in the late 80s, run scoring hasn’t been this low since Reggie Jackson and Ron Guidry were winning World Series. A .275/.339/.437 hitter in 2006 was equivalent to a .256/.320/.404 hitter in 2013.
That’s crazy! I don’t think I’m the only baseball fan who looks at a stat line and makes an intuitive judgment. When I first started paying attention to OBP, a .300 OBP was unacceptable, even for a middle infielder. Now, it’s just barely below average. A player with the same batting line as an average AL hitter in 2006 is now very valuable. We all need to adjust our expectations.
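The era-to-era equivalence above can be sketched with simple arithmetic: scale each component of a slash line by the ratio of league averages between the two seasons. This is a rough illustration, not the actual neutralization method (real translations account for park factors and more); the league-average slash lines in the code are assumptions for the example, not figures from the post.

```python
# Hedged sketch: translate a (AVG, OBP, SLG) slash line from one run
# environment to another by scaling each component by the ratio of
# league averages. A simplification -- proper "neutralized" stats use
# park and league adjustments beyond this.

def translate_slash_line(line, lg_from, lg_to):
    """Scale each slash-line component by the league-average ratio."""
    return tuple(round(stat * to / frm, 3)
                 for stat, frm, to in zip(line, lg_from, lg_to))

# Illustrative league-average slash lines (assumed for this example):
AL_2006 = (0.275, 0.339, 0.437)
AL_2013 = (0.256, 0.320, 0.404)

# A league-average 2006 hitter maps onto the 2013 league average:
print(translate_slash_line((0.275, 0.339, 0.437), AL_2006, AL_2013))
```

By construction, a hitter at the 2006 league average comes out at the 2013 league average, which is exactly the equivalence the post describes.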
Why has run scoring decreased? A lot of people have theories. BABIPs haven’t changed much. HR/Fly is down from its peak in the late 90s, but is still significantly elevated from the late 80s. But two key indicators are sky-high: infield fly ball rates and strikeout rates.
A lot of people like to attribute the decline in run scoring to performance enhancing drugs. I think the real answer is more complicated than that. Run scoring peaked in the 90s due to a combination of PEDs, new small ballparks, new roles in the bullpen, new hitting technology, and policing of the inside pitch. Run scoring has decreased as managers have gotten smarter about bullpens, teams have used data to model defense, and PEDs have exited the game.
My favorite explanation for the decrease in run scoring: the growth of the cutter as a major league pitch. Only 1% of pitches thrown in 2004 were cutters. In 2013, 5.7% of pitches were cutters. The difference has come almost entirely out of fastballs. Rates of curveballs, sliders, and changeups thrown have been pretty flat. A cutter is a pretty powerful pitch – there’s not a lot you can do with a well-thrown one. The bottom line is that pitchers are finally a step ahead of batters again.