The idea is to reward balance. I used the Baseball Reference Play Index in the post mentioned above.
In that study, I pulled all the seasons with at least 0.1 in each of the following stats: fielding runs (Rfield), base running runs (Rbaser), and batting runs (Rbat), so all three were above average. (No seasons came up from 1876-1885, possibly because SB data is not available for those years.)
I then multiplied the three stats together and took the cube root to get the geometric mean. That means any season with a zero or a negative number in any of the three stats was excluded.
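A minimal sketch of that seasonal calculation (the run values here are hypothetical examples, not actual Play Index data):

```python
def geometric_mean_runs(rbat, rbaser, rfield):
    """Cube root of the product of batting, base running, and fielding runs.
    Only meaningful when all three are positive, which is why seasons with
    a zero or negative value were excluded."""
    if min(rbat, rbaser, rfield) <= 0:
        raise ValueError("all three run values must be positive")
    return (rbat * rbaser * rfield) ** (1 / 3)

# Hypothetical season: +20 batting, +5 base running, +10 fielding runs
print(round(geometric_mean_runs(20, 5, 10), 6))  # -> 10.0
```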
With career totals, though, there can still be negatives and zeroes, so using the cube root probably does not make sense. So here I converted each stat into a rate stat (the explanation is in the technical notes at the end). This involved OPS+ and turning the running and fielding stats into something like OPS+. OPS+ itself was also adjusted to be an "above replacement" stat; the other two were not, since it is probably easy to find average runners and fielders in the minors.
That meant OPS+ went up for most players (if not all), since they got credit for more runs, which increased their estimated OPS+. If a player had negative fielding runs, his fielding "OPS+" came out below 100; if positive, above 100 (100 is average on the OPS+ scale). Then I took the geometric mean of the three stats. I used career stats for all players with 5000+ PAs.
Here are the top 25. I don't know if this is better than the other method; it is just different. If you look at the complete lists, some players move up quite a bit.
|17|Shoeless Joe Jackson|187.26|98.55|100.93|122.98|
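As a sanity check, the rightmost figure in the Jackson row is the geometric mean of the three component values, which can be recomputed directly (it agrees with the listed 122.98 to within rounding of the published components):

```python
# Cube root of the product of the three OPS+-scale components in the row above
score = (187.26 * 98.55 * 100.93) ** (1 / 3)
print(round(score, 1))  # close to the listed 122.98
```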
Click here to see the complete rankings.
Technical notes: I ran a regression with OPS+ as the dependent variable and batting runs above average per PA as the independent variable. Here is the equation:
OPS+ = 832.14*BattingRuns/PA + 99.32
For each of the three stats, runs above average per PA was plugged into this equation to put it on the OPS+ scale.
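Putting the technical notes together, the conversion can be sketched as follows (a simplified version: the "above replacement" adjustment applied to the batting OPS+ is glossed over, and the input values are hypothetical):

```python
def ops_plus_scale(runs_above_avg, pa):
    """Map runs above average per PA onto the OPS+ scale using the
    regression fit above (slope 832.14, intercept 99.32)."""
    return 832.14 * (runs_above_avg / pa) + 99.32

def balance_score(rbat, rbaser, rfield, pa):
    """Geometric mean of the three OPS+-scaled components."""
    b = ops_plus_scale(rbat, pa)
    r = ops_plus_scale(rbaser, pa)
    f = ops_plus_scale(rfield, pa)
    return (b * r * f) ** (1 / 3)

# A player exactly average in all three stats scores the regression intercept
print(round(balance_score(0, 0, 0, 5000), 2))  # -> 99.32
```

Because every component is pushed near 100 on this scale, the geometric mean is always defined, even for players with negative career run totals in one of the stats.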