Perhaps because of its comparatively leisurely pace, baseball--more than any other sport I can think of--has been the object of obsessive statistical fascination. Over the years, baseball fans have continually devised ever-more complex formulae to measure and compare players' performance. Even a casual fan is familiar with the traditional statistics: batting average for hitters (number of base hits divided by number of at-bats) and earned run average (ERA) for pitchers (average number of earned runs surrendered per nine innings). Along the way, however, additional statistics of greater or lesser value have been developed.
One of the more newfangled statistics for pitchers is "WHIP," which stands for "walks plus hits per inning pitched." This stat has a nice intuitive value. I remember once, years ago, listening to Tim McCarver, back when he was a Mets announcer, explaining that one way to measure the effectiveness of relief pitchers--who often have misleading earned run averages--was to look at how many walks and hits they had given up: If the sum of walks and hits was lower than the number of innings pitched, that suggested an overall effective pitcher. Basically, WHIP formalizes this observation: The closer a pitcher's WHIP is to 1.00, the more successful that pitcher is likely to be. (A WHIP below 1.00 is terrific.)
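For the arithmetic-minded, the calculation is as simple as it sounds; here is a minimal sketch in Python (the reliever's numbers are hypothetical, chosen just to illustrate):

```python
def whip(walks: int, hits: int, innings_pitched: float) -> float:
    """Walks plus hits per inning pitched: (BB + H) / IP."""
    return (walks + hits) / innings_pitched

# A hypothetical reliever: 20 walks and 55 hits allowed over 80 innings.
# Walks plus hits (75) is below innings pitched (80), so by McCarver's
# rule of thumb he is effective -- and indeed his WHIP is below 1.00.
print(round(whip(20, 55, 80), 2))  # 0.94
```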
On the offensive side, though, things are less clear. Two traditional measurements--albeit less familiar to the casual fan than batting average--are on-base percentage and slugging percentage. The former is similar to batting average, only it also counts things like walks and hit-by-pitch, so it will be somewhat higher than batting average. For example, if a hitter comes to the plate 10 times and gets three hits, with no walks, he will have a .300 average; if, instead, he reaches base in six of those 10 trips--three hits, two walks, and a hit-by-pitch--he will have an on-base percentage of .600. (Because walks and hit-by-pitch are not official at-bats, his batting average in that case would actually be three-for-seven, or .429.) Slugging percentage is a power measurement. To figure it out, you divide total bases by at-bats; thus, if a batter comes to the plate once and hits a home run, his slugging percentage will be 4.000 (four total bases divided by one at-bat). Slugging percentage, too, will generally be a bit higher than batting average: A hitter who comes up four times and gets one base hit will have an average of .250; if, however, that one hit is a double, his slugging percentage will be .500 (two total bases divided by four at-bats). Get it?
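To make the two calculations concrete, here is a rough Python sketch of both (simplified: the official on-base formula also counts sacrifice flies in the denominator, which I ignore here):

```python
def on_base_pct(hits: int, walks: int, hbp: int, at_bats: int) -> float:
    """Times reached base divided by plate appearances.
    Simplified: ignores sacrifice flies, which the official
    formula also adds to the denominator."""
    return (hits + walks + hbp) / (at_bats + walks + hbp)

def slugging_pct(singles: int, doubles: int, triples: int, homers: int,
                 at_bats: int) -> float:
    """Total bases divided by at-bats."""
    total_bases = singles + 2 * doubles + 3 * triples + 4 * homers
    return total_bases / at_bats

# The examples from the text:
print(on_base_pct(hits=3, walks=2, hbp=1, at_bats=7))  # 0.6
print(slugging_pct(0, 1, 0, 0, at_bats=4))             # 0.5 -- one double, four at-bats
```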
Now, on-base percentage does give one an idea of a player's ability to get on base--a necessary prerequisite to scoring. And slugging percentage does provide a decent snapshot of power. The latest vogue statistic, though--on-base-plus-slugging percentage (OPS), which is touted by many sabermetricians (baseball's statistical cognoscenti--yes, they have a name for themselves) as a true measure of a player's offensive skills--just makes no sense to me. As the name implies, OPS is derived by adding the two percentages, on-base and slugging. But I have never understood the rationale behind adding two averages--unless the idea is just to produce a number that sounds mind-bogglingly impressive to people familiar only with batting averages. After all, if you know that a lifetime average of .300 is Hall-of-Fame worthy, then when you hear an OPS number like .650 bandied about, you will be left slack-jawed with amazement--even though an OPS of .650 is not overly impressive.
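For what it's worth, the addition itself is exactly as blunt as the name implies; a sketch with made-up numbers:

```python
# OPS is literally the sum of the two rates -- nothing more.
def ops(on_base: float, slugging: float) -> float:
    return on_base + slugging

# A weak .310 on-base percentage plus a weak .340 slugging percentage
# produces the .650 OPS mentioned above -- a big-sounding number for
# a fairly feeble offensive line.
print(round(ops(0.310, 0.340), 3))  # 0.65
```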
When it comes to statistics, too, I've always been under the impression that adding two averages is something of a no-no. The choice of which averages to add seems arbitrary--why not add batting average as well? Or why not add batting average to slugging percentage? Why not add a pitcher's ERA and WHIP? A perfect quarterback passer rating in the NFL is an inexplicable 158.3. While a perfect OPS is easier to grasp--5.000, a perfect 1.000 on-base percentage plus a perfect 4.000 slugging percentage--the logic behind it escapes me.
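My statistical unease can be made concrete: the two rates being summed don't even share a denominator. A sketch with a hypothetical season line (all numbers invented):

```python
# The two percentages being summed are rates over different denominators:
#   on-base pct  = (H + BB + HBP) / plate appearances
#   slugging pct = total bases / at-bats
# so OPS adds a per-plate-appearance rate to a per-at-bat rate.
# Hypothetical season: 120 hits (90 singles, 20 doubles, 5 triples,
# 5 homers), 60 walks, 5 hit-by-pitch, 450 at-bats.
hits, walks, hbp, at_bats = 120, 60, 5, 450
total_bases = 90 + 2 * 20 + 3 * 5 + 4 * 5             # 165
obp = (hits + walks + hbp) / (at_bats + walks + hbp)  # 185/515, about .359
slg = total_bases / at_bats                           # 165/450, about .367
print(round(obp + slg, 3))  # 0.726 -- one number, two mismatched units
```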