And what about home runs? Do they require double the power? No. The difference between some doubles and home runs is mere feet. Why does a home run hit 405 feet receive twice the points of a 395-foot double? If we’re measuring power, the ratios don’t seem to work out. Yes, a home run requires more power—250-foot line drives can end up in gaps, but they don’t end up over the fence—but does SLG accurately reflect the difference? Again, I think not.
If we’re looking at measuring power, total bases seem to be a poor way to do it. Sure, it makes some sense, but total bases are dependent on defense, park, etc. in many of the ways that wins are dependent on other things. Hitters are still primarily responsible, but there are factors besides a hitter’s power that lead to extra bases. Instead, it might be more accurate to figure out the average distance a ball is hit—total feet covered by batted balls divided by the number of batted balls. We have all sorts of spray charts, so I figure that, while it will take more time, we do have the technology.
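Expressed as a back-of-the-envelope sketch, the calculation is nothing fancy (the distances below are invented for illustration; real numbers would have to come from spray-chart data):

```python
# Sketch of the average-batted-ball-distance idea: total feet covered
# by batted balls divided by the number of batted balls.
# The distances below are made-up illustration values, not real data.

def average_batted_distance(distances_ft):
    """Mean distance of a batter's batted balls, in feet."""
    if not distances_ft:
        raise ValueError("need at least one batted ball")
    return sum(distances_ft) / len(distances_ft)

# Hypothetical spray-chart readings for one batter (feet):
batted_balls = [310, 145, 402, 88, 255, 370]
print(round(average_batted_distance(batted_balls), 1))  # → 261.7
```

The point isn’t the arithmetic, of course—it’s getting reliable distance data for every batted ball, outs included.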
But is SLG completely useless? What about measuring how likely a hitter is to bring in runs? According to The Book website, singles have a run expectancy of 0.47, doubles 0.77, triples 1.09, and home runs 1.39. If we use this model, we don’t have the same problems SLG did with power—namely, that doubles and triples are essentially the same batted ball, just hit toward different spots and with different-speed runners—but there are still problems. SLG gives doubles double the points of a single, but doubles don’t carry twice the run expectancy. Home runs don’t carry twice the run expectancy of doubles. It’s a lot closer than simply measuring power, but again, it falls short.
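To see how far apart the two weighting schemes are, here’s a quick comparison of SLG’s total-base weights against the run values quoted above:

```python
# Compare SLG's total-base weights with the run values quoted above
# (singles 0.47, doubles 0.77, triples 1.09, home runs 1.39).
slg_weights = {"1B": 1, "2B": 2, "3B": 3, "HR": 4}
run_values = {"1B": 0.47, "2B": 0.77, "3B": 1.09, "HR": 1.39}

# SLG says a double is worth 2.0 singles; the run values say ~1.64.
for hit in slg_weights:
    slg_ratio = slg_weights[hit] / slg_weights["1B"]
    run_ratio = run_values[hit] / run_values["1B"]
    print(f"{hit}: SLG ratio {slg_ratio:.1f}, run-value ratio {run_ratio:.2f}")
```

By the run values, a double is worth about 1.64 singles (not 2), and a home run about 1.81 doubles (not 2)—which is the mismatch described above in numbers.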
I’m not the first to realize some of these problems. ISO (isolated power) recognizes that all SLGs are not created equal. A .220 hitter with a .420 SLG has more power than a .320 hitter with the same SLG. ISO, then, takes SLG minus BA, and the resulting difference gives us a better look at how many more extra-base hits one hitter has than another. Unfortunately, it uses SLG. wOBA (and other similar statistics) avoids using SLG, but it’s an overall batting statistic that doesn’t directly reflect power.
SLG is a pretty quick and easy statistic to calculate, but I’m not entirely sure what it is calculating. Directly, it calculates the average number of total bases a batter gets per at-bat, but I don’t know that it directly reflects a batter’s power. And if ISO essentially just uses SLG and BA, I’m not sure it does either. Sure, it gives us a general idea, but I don’t know if it’s an accurate answer to the question of who the most powerful hitter in baseball is.
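For reference, the calculation itself really is quick and easy—total bases over at-bats (the stat line below is invented for illustration, not a real player):

```python
# SLG as actually defined: total bases per at-bat.
def slg(singles, doubles, triples, homers, at_bats):
    total_bases = singles + 2 * doubles + 3 * triples + 4 * homers
    return total_bases / at_bats

# Hypothetical season line: 100 1B, 30 2B, 5 3B, 20 HR in 500 AB.
print(round(slg(100, 30, 5, 20, 500), 3))  # → 0.51
```

So it’s a clean formula; the question is just what the number it produces actually means.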
This started from an interesting article I read at Wahoo Blues. The idea was that ISO doesn’t accurately reflect power because not all ISOs are created equal. For instance, a .200 ISO isn’t always the same. A .220 hitter with a .420 SLG has more power than someone with a .320 average and .520 SLG because the hitter with a .320 average has quite a few singles boosting his SLG, and thus his ISO. Lewie Pollis has an excellent idea on how to solve it, but I wonder if it’s not based on a flawed premise. If SLG is flawed, then anything even tangentially affiliated with it, no matter how many times removed, is flawed, right?
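The two hypothetical hitters above really do come out identical by ISO, which is easy to check (the numbers are the ones from the example, not real players):

```python
# ISO = SLG - BA, applied to the two hitters from the example above.
# Both come out to .200 even though their hit profiles differ.
def iso(slg_val, avg):
    return slg_val - avg

low_avg = iso(0.420, 0.220)   # .220 hitter with a .420 SLG
high_avg = iso(0.520, 0.320)  # .320 hitter with a .520 SLG
print(round(low_avg, 3), round(high_avg, 3))  # → 0.2 0.2
```

Identical ISOs, different hitters—which is exactly the premise the Wahoo Blues piece starts from.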
So what now? Power is about more than just hits. Home runs and doubles are nice, but guys also hit really long outs, and they have really short hits. Yet SLG and other statistics treat all singles the same, all doubles the same, etc., when some obviously demonstrate more power—an infield hit is treated the same way as a line shot that lands just in front of the center fielder, but they demonstrate very different levels of power. Every batted ball should be counted when considering power because they all demonstrate an aspect of power. A well-struck ball right at the left fielder demonstrates the same power as a ball hit similarly well toward the gap, but they are counted differently. As for how to solve this problem, I have options but not an absolute answer.
The average batted distance seems to be the simplest way to do it, and I think we have the technology for it. But how exactly is the distance measured? For instance, a line shot at Jeter will be caught a certain distance from the plate, but it obviously could have gone farther. And does a line drive that travels 320 feet require more or less power than a fly ball that travels the same distance? And where exactly (I really want to know the answer to this, by the way) does a home run’s distance come from—where the ball actually stops, or where it would have landed had stands and fans not gotten in the way?
We could also use the batter’s average batted-ball speed. Hit f/x, I think, records such measurements, but I don’t know if that information has been released. I also wonder whether a batter’s pitch mix would throw things off. For instance, would a batter who hits and/or faces an inordinate number of fastballs have a higher batted-ball speed simply because the pitch arrives faster, and would a batter who faces more breaking pitches see a batted-ball speed lower than he should? Maybe we could just use batted-ball speeds off fastballs. Again, I’m better at giving options than answers.
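A sketch of the fastball-only idea, assuming we had batted-ball speeds tagged by pitch type (all values below are invented; this isn’t real Hit f/x data):

```python
# Sketch of the pitch-type worry: average batted-ball speed computed
# per pitch type, so a fastball-heavy pitch diet can be separated out.
# The (speed, pitch_type) pairs below are invented illustration values.
from collections import defaultdict

def avg_speed_by_pitch(batted_balls):
    """Map each pitch type to the mean batted-ball speed against it."""
    sums, counts = defaultdict(float), defaultdict(int)
    for speed, pitch in batted_balls:
        sums[pitch] += speed
        counts[pitch] += 1
    return {p: sums[p] / counts[p] for p in sums}

data = [(98.0, "FB"), (92.0, "FB"), (85.0, "CB"), (95.0, "FB"), (83.0, "CB")]
by_pitch = avg_speed_by_pitch(data)
print(round(by_pitch["FB"], 1))  # fastball-only average → 95.0
```

Grouping by pitch type like this would at least let us compare batters on the same pitch, rather than letting a fastball-heavy diet inflate the overall average.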
Those are a couple of my answers, but feel free to leave more in the comments section. Also, if I’m missing something completely obvious that throws all of this under a bus, I’ll take that as well.