Friday, January 22, 2016

How the error rate can affect the run value of OBP & SLG

This is something I did several years ago and I think originally I just mentioned it on the SABR list.

Here is an example of how the error rate can affect things. The error rate is 1 minus fielding percentage. The regression below shows runs per game as a function of OBP & SLG for each season of the NL from 1920-2012 (I used the whole league rather than individual teams).

R/G = 22.77*OBP + 6.7*SLG - 5.68
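
For anyone who wants to try this themselves, here is a rough sketch of how a regression like this could be run in Python with pandas and statsmodels. The file name and column names are just placeholders, not my actual data.

```python
# A minimal sketch of the league-season regression described above.
# The CSV name and column names (R_per_G, OBP, SLG) are hypothetical;
# the data would be one row per NL season, 1920-2012.
import pandas as pd
import statsmodels.api as sm

league = pd.read_csv("nl_league_seasons_1920_2012.csv")

# Intercept plus the two rate stats
X = sm.add_constant(league[["OBP", "SLG"]])
fit = sm.OLS(league["R_per_G"], X).fit()

# With similar data this should roughly reproduce the coefficients quoted above
print(fit.params)
```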

Now what if we add in the error rate? The regression becomes

R/G = 15.75*OBP + 9.7*SLG + 15.69*ERATE - 4.94
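
Adding the error rate is just one more column in the same kind of regression. Here is a sketch, again with placeholder file and column names (FieldingPct is assumed to be in the data).

```python
# Sketch of the same regression with the error rate added.
import pandas as pd
import statsmodels.api as sm

league = pd.read_csv("nl_league_seasons_1920_2012.csv")
league["ERATE"] = 1 - league["FieldingPct"]  # error rate = 1 - fielding pct

X = sm.add_constant(league[["OBP", "SLG", "ERATE"]])
fit = sm.OLS(league["R_per_G"], X).fit()

print(fit.params)  # compare with the coefficients quoted above
```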

The relative value of OBP & SLG changed quite a bit. But this is for a whole league, so the ERATE applies to the whole league.

I have used the ERATE and applied it to teams. That assumes that the rate of errors made against each team is the same. Not totally realistic, but that is what I have. So if the ERATE was .02 one year in a league, every team got that rate.
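
In code, that assignment could look something like this. The Lahman-style yearID/lgID keys and the file names are just placeholders.

```python
# Sketch of assigning each league-season's ERATE to every team in that league.
import pandas as pd

teams = pd.read_csv("teams_1920_1998.csv")            # one row per team-season
league = pd.read_csv("league_seasons_1920_1998.csv")  # one row per league-season

league["ERATE"] = 1 - league["FieldingPct"]

# Every team in a given league-season gets that league's ERATE,
# i.e. if the league ERATE was .02 one year, every team gets .02.
teams = teams.merge(league[["yearID", "lgID", "ERATE"]], on=["yearID", "lgID"])
```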

I did all teams from 1920-1998. Here are the two regressions:

R/G = 19.89*OBP + 9.79*SLG - 5.95

R/G = 17.63*OBP + 10.7*SLG + 13.51*ERATE - 5.87

So again the relative values of OBP & SLG change when ERATE is included.
