The first table shows the number of seasons from each decade that are in the top 100 (actually the top 106 due to ties). 1900 means the decade of 1900-09. The 1950s had only one such season, Ernie Banks's 3.5 in 1959, which is tied for 62nd place. From 1945 to 1958, there was not a single season in the top 106.
The percent of the total that each decade accounts for is listed next, then the total defensive WAR from the players in that decade and that decade's percentage of all the WAR from the top 100 (actually 106).
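Here is a minimal sketch of that aggregation, assuming a hypothetical list of (decade, dWAR) pairs for the qualifying seasons; the variable names and sample values are placeholders, not the actual source data.

```python
from collections import defaultdict

# Hypothetical input: one (decade, dWAR) pair per top-106 season, e.g.
# (1950, 3.5) for Ernie Banks in 1959. The real list would come from a
# single-season defensive WAR leaderboard.
seasons = [(1900, 4.2), (1950, 3.5), (2010, 4.0)]  # ... 106 entries in full

counts = defaultdict(int)
war_totals = defaultdict(float)
for decade, dwar in seasons:
    counts[decade] += 1
    war_totals[decade] += dwar

n_seasons = len(seasons)              # 106 for the first table
total_war = sum(war_totals.values())  # sum of the Defensive WAR column

for decade in sorted(counts):
    print(decade,
          counts[decade],
          round(counts[decade] / n_seasons, 3),      # percent of seasons
          round(war_totals[decade], 1),              # defensive WAR
          round(war_totals[decade] / total_war, 3))  # percent of WAR
```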
Decade | Count | Percent of Seasons | Defensive WAR | Percent of WAR
1900 | 11 | 0.104 | 40.5 | 0.102
1910 | 6 | 0.057 | 23.7 | 0.060
1920 | 5 | 0.047 | 19.8 | 0.050
1930 | 2 | 0.019 | 7.9 | 0.020
1940 | 9 | 0.085 | 31.6 | 0.080
1950 | 1 | 0.009 | 3.5 | 0.009
1960 | 15 | 0.142 | 55.4 | 0.140
1970 | 11 | 0.104 | 42.0 | 0.106
1980 | 11 | 0.104 | 40.0 | 0.101
1990 | 9 | 0.085 | 32.9 | 0.083
2000 | 11 | 0.104 | 40.2 | 0.102
2010 | 15 | 0.142 | 58.2 | 0.147
The 1930s also had only 2 seasons, while the 2010s have already had 15, the most since the 1960s. Some of this has to do with how many games get played and how many teams there are, but the 2010s are still doing very well.
Here is the same table for the top 500 seasons (actually the top 546 due to ties):
Decade | Count | Percent of Seasons | Defensive WAR | Percent of WAR
1880 | 13 | 0.024 | 31.8 | 0.020
1890 | 8 | 0.015 | 20.5 | 0.013
1900 | 40 | 0.073 | 117.5 | 0.076
1910 | 33 | 0.060 | 97.8 | 0.063
1920 | 23 | 0.042 | 68.7 | 0.044
1930 | 28 | 0.051 | 76.2 | 0.049
1940 | 31 | 0.057 | 89.5 | 0.058
1950 | 26 | 0.048 | 70.0 | 0.045
1960 | 48 | 0.088 | 140.4 | 0.090
1970 | 69 | 0.126 | 194.9 | 0.126
1980 | 62 | 0.114 | 172.3 | 0.111
1990 | 54 | 0.099 | 149.4 | 0.096
2000 | 58 | 0.106 | 165.5 | 0.107
2010 | 53 | 0.097 | 157.9 | 0.102
6 comments:
A Twitter comment said, "also, your charts don't account for more players/teams in expansion era, so more opportunities in recent seasons."
True, but at most we are talking about double the number of games. We only have 6 seasons of the 2010s so far (and 2015 is not yet complete). Four more to go. We could easily have 25 top 100 seasons when it is over, and that would be more than double any 154-game decade and 60% more than the 1960s with only about 50% more games (rough game counts are sketched below).
Then look at 1900-09. Why was that such a good decade if progress is steady? Why are the 2010s already doing better than the 1990s and the 2000s? The number of games has not changed that much since the 1990s.
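A rough back-of-the-envelope check of that games comparison, using assumed decade-level team counts and schedule lengths; these are approximations, not exact year-by-year totals.

```python
# Approximate games per season by decade: teams * schedule length / 2.
# Team counts and schedule lengths are rough decade-level assumptions.
games_per_season = {
    "1950s": 16 * 154 / 2,  # 16 teams, 154-game schedule -> 1232 games
    "1960s": 20 * 160 / 2,  # expansion decade; roughly 20 teams, ~160 games -> ~1600
    "2010s": 30 * 162 / 2,  # 30 teams, 162-game schedule -> 2430 games
}

print(games_per_season["2010s"] / games_per_season["1950s"])  # ~1.97, about double
print(games_per_season["2010s"] / games_per_season["1960s"])  # ~1.52, about 50% more
```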
Thanks for the reply. I'm not sure if single-season WAR leaders are the best gauge of overall fielding excellence. Since defensive runs saved is zero-sum, any player in the top 100/500 is just that much better than whatever their league average is.
So wouldn't your charts be more of a reflection of the variance in fielding ability during each decade? They might be telling us that the distribution of talent in the 1950s is much tighter, with fewer outliers, than in the other decades.
If your last point is true, that would be interesting. What made the distribution of fielding talent tighter in the 1950s? Why would it be more varied now?
Wouldn't a higher variance/standard deviation indicate a thinning out of the talent pool? I've always felt that the early 1900s had high SDs in statistics for this reason (fewer players to choose from).
The 1950s were the first (full) decade with African American players. A better talent pool would presumably improve the league-average player by weeding out the worst players.
The 1960s brought 4 expansion teams, or a 25% increase in players, presumably thinning the talent pool. The same can be said for the following decades with more expansion teams. More players/games and a thinner talent pool would probably result in more top 100/500 seasons in these decades.
So it's my hypothesis that the talent pool gradually got better each decade before expansion, but took a hit in the 1940s due to WWII. Then expansion started thinning it out a little while creating more opportunities for top 100/500 finishes with more players.
As for the current decade (your subject in the post), I'm honestly not sure. I think the talent pool is currently as good as it has ever been, plus we haven't had expansion in almost 20 years. This would make me think the SD should be lower than ever, although it doesn't appear to be.
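A minimal sketch of how one might check that SD point, assuming a hypothetical table of (decade, dWAR) values for every qualifying player-season; the names and sample values are placeholders.

```python
import statistics
from collections import defaultdict

# Hypothetical rows: (decade, dWAR) for every qualifying player-season,
# not just the top 100/500. Field names and sample values are placeholders.
player_seasons = [(1950, 0.4), (1950, 1.1), (2010, -0.6), (2010, 2.3)]  # ...

by_decade = defaultdict(list)
for decade, dwar in player_seasons:
    by_decade[decade].append(dwar)

# A tighter distribution (lower standard deviation) would mean fewer outlier
# seasons clearing the top-100/500 cutoffs, per the discussion above.
for decade in sorted(by_decade):
    vals = by_decade[decade]
    print(decade, round(statistics.pstdev(vals), 2), round(max(vals), 1))
```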
So I go back to my original point on Twitter about how defensive runs saved are calculated: by humans/scouts watching games at Baseball Info Solutions. Of course, this is much different from the Total Zone (play-by-play) calculation in the pre-BIS era. I have no evidence of this being correct, but I think this may be a reason for more top 100/500 seasons.
Another possible theory would be the increase in defensive shifts. With some teams shifting more than others this decade, could this cause more outlier seasons?
I think you've got a lot of plausible theories, and it may be a combination of things that makes it hard to nail down just one trend, like the spread in talent or the way things are calculated. I also wonder if managerial philosophies affect things. In the 1960s it seemed like teams were quite willing to have very weak-hitting SS and 2B who could field very well.
6 of the 9 top 100 seasons in the 1940s came during the war, all in 1943-44. Then the next top 100 season did not come until 1959.
14 of the 31 top 500 seasons from the 1940s came during 1943-45.
So what you said about WWII might be true.