Here is the regression for runs scored per game:

R/G = 9.8*SLG + 17.17*OBP - 0.308*GDP - 0.394*CS + 0.143*SB + 0.54*ROE - 5.09

Now the regression for runs allowed per game:

RA/G = 9.4*SLG + 17.57*OBP - 0.188*GDP - 0.446*CS + 0.302*SB + 0.86*ROE - 5.35
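The two equations above can be written as simple functions (coefficients copied from the regressions; the inputs in the example call are hypothetical, just to show the scale of the output):

```python
def runs_scored_per_game(slg, obp, gdp, cs, sb, roe):
    """R/G regression: team per-game rates for GDP, CS, SB, ROE."""
    return 9.8*slg + 17.17*obp - 0.308*gdp - 0.394*cs + 0.143*sb + 0.54*roe - 5.09

def runs_allowed_per_game(slg, obp, gdp, cs, sb, roe):
    """RA/G regression: opponents' per-game rates."""
    return 9.4*slg + 17.57*obp - 0.188*gdp - 0.446*cs + 0.302*sb + 0.86*roe - 5.35

# Hypothetical, roughly league-average inputs:
print(round(runs_scored_per_game(0.400, 0.330, 0.8, 0.3, 0.6, 0.4), 2))
print(round(runs_allowed_per_game(0.400, 0.330, 0.8, 0.3, 0.6, 0.4), 2))
```

Note that with identical inputs the two equations give nearly the same total; the disagreement is concentrated in the SB, GDP, and ROE coefficients.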

So the SB coefficient is 0.143 runs per game in the runs-scored regression but 0.302 in the runs-allowed regression (there are big differences in the GDP and ROE coefficients as well). I can't think of any reason for a gap that large.

I started looking at this because when I added variables like SB differential to my regressions estimating winning percentage from OPS differential, the coefficient on SB differential seemed too high.

If we look at a team like the 2010 Red Sox, they allowed 1.04 SBs per game while recording 0.259 CSs per game. Using the RA/G coefficients, stolen-base attempts cost them about .2 runs per game net. That would be about 32 runs per season, or about 3 wins (at roughly 10 runs per win).

If I use the values from the R/G regression instead, they would have allowed about .047 runs per game from stealing, or roughly 7.6 runs per season. That is not even one win. So the difference between the two methods is about 2.5 wins.
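The 2010 Red Sox arithmetic can be reproduced directly from the SB and CS coefficients (162 games and ~10 runs per win are the usual rules of thumb):

```python
sb_per_g, cs_per_g = 1.04, 0.259  # 2010 Red Sox: opponents' SBs and CSs per game

# Net run impact of the opponents' running game under each set of coefficients
ra_view = 0.302 * sb_per_g - 0.446 * cs_per_g  # RA/G regression coefficients
rg_view = 0.143 * sb_per_g - 0.394 * cs_per_g  # R/G regression coefficients

print(round(ra_view, 3), round(ra_view * 162, 1))  # runs/game and runs/season
print(round(rg_view, 3), round(rg_view * 162, 1))
print(round((ra_view - rg_view) * 162 / 10, 1))    # gap in wins at ~10 runs/win
```

The first line works out to about 0.199 runs per game (≈32 runs per season), the second to about 0.047 (≈7.6), and the gap to about 2.5 wins, matching the figures above.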

In a recent post, I found that over the last 5 years the Red Sox seemed to underperform relative to their OPS differential. See

**The Relationship Between OPS Differential And Winning Percentage Using 5 Year Averages**

They won 5.23 fewer games on average each season than their OPS differential would predict (only the Rockies were worse, at 5.4 fewer wins).

The 2011 Red Sox were similar, allowing 0.963 SBs per game with 0.309 CSs per game. The Red Sox have four team-seasons in the top 21 in SBs allowed per game from 2010-14. The Rockies, however, don't seem to have been that bad at allowing SBs, with only one season in the top 50 in SBs allowed per game. So their underperformance must be due to something else.