strength of schedule
In the national playoff discussions now, we hear so much about W-L records and large scoring margins. Yet, as every chess enthusiast, baseball player and 10-year-old video gamer knows, it does not mean much to win games, or even win big, unless you consider the quality of the opponents.
This post will mainly discuss Ohio's SOS, since it is the one now most at issue nationally. As Wojo recently pointed out, "strength of schedule matters a lot… and Ohio State doesn't measure up" this year. In fact, the graph below shows that it hasn't measured up for a long time, throughout the Urban Meyer years from 2012-14.
(Graph: based on the standardized Sagarin SOS ratings (stdzSOS) for the past 3 years, one each for RTG and MD.)
Ohio has had the second-worst SOS WITHIN the B1G, nearly tied for last with PSU and far lower than all the other teams. Because their SOS is low within the B1G itself, it is clear that Ohio cannot simply blame their SOS on a poor conference.
Just how bad has Ohio's schedule been? It's about three standard deviations worse than the team with the top SOS in the B1G, which will be briefly noted later. More concretely, Ohio's average SOS rank during the past 3 years was actually 12 spots worse than the rank of the best FCS team, ND State (#52 vs. #40).
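For anyone wondering what "standardized" means here: a stdzSOS number is just each raw SOS rating converted to a z-score, so gaps read directly in standard deviations. A minimal Python sketch; the `raw` numbers below are made up for illustration, not actual Sagarin values:

```python
def z_scores(values):
    """Standardize raw ratings so gaps read in standard deviations."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / n) ** 0.5  # population SD
    return [(v - mean) / sd for v in values]

# Hypothetical raw SOS ratings for a 14-team conference, best to worst
raw = [68.2, 66.5, 65.9, 65.1, 64.8, 64.0, 63.7,
       63.2, 62.5, 61.8, 61.0, 60.3, 57.9, 57.5]
z = z_scores(raw)
gap = z[0] - z[-1]  # top-vs-bottom gap, in standard deviations
```

A "three standard deviations worse" claim is exactly this kind of `gap` between the top team and the bottom one.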
It shouldn't be so surprising. This year, Ohio faced just one team in Sagarin's current top 15. They faced only two in the top 30, tied with Nebraska and Louisville for the fewest among all top-30 Sagarin teams. By contrast, Alabama faced 10 top-30 teams. Ohio even lost at home to a team that finished tied for last in the Coastal Division and ranked #12 overall in the ACC, the only conference with a lower Sagarin rank than the Big Ten.
In addition, last year Ohio beat no one in the top 15; they played two top-15 teams and lost both games. OOC that year, they played Buffalo, San Diego State, and Florida A&M at home, and the supposedly toughest opponent was to be Cal, which went 1-11 that year and 0-9 in the Pac-12. The year before (when UM played away games vs. the two national title participants), Ohio did not play anybody in the top 19 all year. They did not play in a bowl or the B1G title game. Rather, they played all their OOC games at home against powerful foes like Miami (OH), UCF, and UAB, and yes, you guessed it: they beat a 3-9 Cal team by only one score at home.
BTW: UM had the toughest SOS in the B1G for the entire past 3 years. The SOS gap between Ohio/PSU and UM is, in fact, staggering. That discrepancy may be worth discussing, even though it clearly does not explain everything that has happened to UM. It is worth discussing because the media, eager to prop up some teams and pound on others, have entirely ignored this issue.
The guys at Football Outsiders have projected strength of schedule based on projected FEI rankings for the 2014 season. You can read about it here and here, and they have a fancy visualization of it here, but for those who don't want to read through the articles, here's the most relevant info:
SOS = "the likelihood that an elite team (two standard deviations better than average) would go undefeated against the given team's entire schedule. Stronger schedules have smaller SOS ratings."
Additionally, they have some other interesting statistics. Against Michigan's schedule,
- 60.7% is the likelihood that an elite team would lose one game or fewer.
- 88.3% is the likelihood that an elite team would lose two games or fewer.
- 0.2% is the likelihood that a good team (one standard deviation better than average) would go undefeated.
- 4.9% is the likelihood that a good team would lose one game or fewer.
- 20.6% is the likelihood that a good team would lose two games or fewer.
- 50% is the likelihood that a good team would lose three games or fewer.
- 75% is the likelihood that a good team would lose four games or fewer.
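For the curious, the FEI-style SOS number is just the probability that a team wins every game on the schedule, and the "lose k or fewer" figures fall out of the same loss distribution. Here's a quick Python sketch, assuming independent games; the per-game win probabilities for the elite team are made up, not FEI's actual numbers:

```python
def loss_distribution(win_probs):
    """P(exactly k losses) across a schedule, assuming independent games."""
    dist = [1.0]  # before any games: P(0 losses) = 1
    for p in win_probs:
        new = [0.0] * (len(dist) + 1)
        for k, pk in enumerate(dist):
            new[k] += pk * p            # win: loss count unchanged
            new[k + 1] += pk * (1 - p)  # loss: one more loss
        dist = new
    return dist

# Hypothetical per-game win probabilities for an elite team over 12 games
win_probs = [0.95, 0.90, 0.85, 0.97, 0.80, 0.99,
             0.90, 0.88, 0.75, 0.93, 0.90, 0.85]
dist = loss_distribution(win_probs)
sos = dist[0]                     # P(undefeated) = the SOS number above
p_two_or_fewer = sum(dist[:3])    # P(lose two games or fewer)
```

Note that a tougher schedule drags `sos` down, which is why "stronger schedules have smaller SOS ratings."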
No, I don't believe Sagarin rigged his schedule ratings to help Oregon and prevent TCU from miraculously slipping by Oregon. But it is interesting to note that while I have heard plenty of talk about TCU and Boise St. lacking schedule strength, I hadn't really heard much regarding Oregon's.
Enter an unnamed MGoBlogger* (**edit**: named Drakeep), who pointed out that the Big Ten teams' schedules included an average of 7 winning opponents (while each SEC team faced an average of 5.8, and the Pac-10 something like 4). This savvy blogger also pointed out that Oregon had faced only 3 teams with a winning record. I could barely believe it and checked the stats myself. It's true.
So I head over to Sagarin to see where exactly a schedule with 3 winning teams and a very much non-winning FCS school would rank. 20th. What about U of M's, with 7 winning teams and a winning FCS school? 40th. Hmmm....
Next, I give Sagarin the benefit of the doubt and assume that although Oregon's opponents didn't all win a lot of games, the games they did win must have been meaningful. (In other words, Oregon's opponents must have combined to beat a lot of winning teams, since beating crappy teams and losing to good ones should not build a team's own strength.)
Oregon - Played 3 teams with winning records (out of 11 FBS opponents, plus one losing FCS team). The 12 teams Oregon played combined for 12 victories over "winning FBS opponents" and 7 victories over "winning FCS opponents." That equates to Oregon's opponents each beating ONE winning FBS team.
Mich - Played 7 teams with winning records (out of 11 FBS opponents, plus one winning FCS team). The 12 teams Michigan played combined for 32 victories over "winning FBS opponents" and 7 victories over "winning FCS opponents." That equates to Michigan's opponents each beating 2.67 winning FBS teams.
These statistics are not even close, on either the primary or secondary level. Yet, there it is: Oregon's SOS at 20 and Michigan's SOS at 40.
For another reference point: Mich St. played 5 teams with a winning record, and MSU's opponents combined to haul in 19 wins against "winning FBS opponents." They lie between Michigan and Oregon on both the primary and secondary levels, yet have an SOS rated 65th.
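The primary and secondary counts above are easy to tally yourself. Here's a small Python sketch of the bookkeeping; the `opponents` tuples are a made-up example schedule, not any of these teams' actual records:

```python
def schedule_strength(opponents):
    """opponents: list of (wins, losses, wins_over_winning_teams) tuples.
    Primary metric: how many opponents had winning records.
    Secondary metric: opponents' combined wins over winning teams."""
    winning = sum(1 for w, l, _ in opponents if w > l)
    secondary = sum(s for _, _, s in opponents)
    return winning, secondary, secondary / len(opponents)

# Hypothetical opponent records: (wins, losses, wins over winning FBS teams)
opponents = [
    (9, 3, 4), (7, 5, 3), (8, 4, 3), (3, 9, 1),
    (5, 7, 2), (10, 2, 5), (6, 6, 2), (2, 10, 0),
]
winning, secondary, per_opp = schedule_strength(opponents)
# winning = 4 opponents with winning records; secondary = 20 combined wins
# over winning teams; per_opp = 2.5 such wins per opponent
```

Run that over Oregon's, Michigan's, and MSU's real schedules and you get the 1.0 / 2.67 / ~1.6 per-opponent figures discussed above.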
In conclusion, based on the rankings of the Michigan and MSU schedules, Oregon's schedule should probably rate somewhere between 70 and 80. This has placed me in the odd position of questioning the legitimacy of Sagarin's rankings... if any mathematician out there can point out how a strength-of-schedule metric might use something more meaningful and direct than opponents' wins and opponents' wins against winning teams, let me know. Until then, I'm going to have to believe that Sagarin is off his rocker.
*Unnamed MGoBlogger - my apologies, but I went in search of your forum and could no longer find it. If you (or anyone else) would care to link to your post, I will gladly edit the above content to include your name and a link.
The BCS-caliber-team thread below got me wondering: is there a 2010 end-of-season strength of schedule ranking?
I suppose it doesn't really matter, but I think Michigan faced some pretty good teams this year. Truly, our defense lost several games for us; in particular, our decimated defensive secondary caused several losses. If you add Warren & Woolfolk to this year's team, we end up at least 9-3, and there aren't so many calls for RR & Gerg's heads on a platter.
In a way, I'm fine with 7-5, given the circumstances. Not happy, but fine. I'm just a broken record, but 2011 is the year that will tell us a lot about RR & Michigan.
There’s been a debate on this board whether UM had a tougher schedule during the second part of the season.
No, based on the Sagarin PREDICTOR ratings for B10-plus-ND games only. Our strength of schedule (SOS) has NOT been tougher in the second part of the season, even when the games are corrected for the 3-point home-road advantage/disadvantage.* Our schedule during the second half of the season was about 2 points easier on average.
However, if we include ALL of the games, the second half of the season WAS tougher overall, though only mildly so. Statisticians often seek to improve the reliability of such data by throwing out the highest and lowest ratings (here, IA and DSU). If you do that, there is a significant trend toward tougher games throughout the season (r = .47), making each successive game almost 2 points tougher than the last, on average.
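The trim-and-correlate step is simple to reproduce: drop the highest- and lowest-rated opponents, then correlate game number with (home-road-adjusted) opponent rating. A Python sketch follows; the schedule ordering of the ratings is my own guess for illustration, so don't expect it to land exactly on r = .47:

```python
def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient, no libraries needed."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

def trim_extremes(pairs):
    """Drop the single highest- and lowest-rated opponents (e.g. IA and DSU)
    while keeping the remaining games in schedule order."""
    hi = max(pairs, key=lambda p: p[1])
    lo = min(pairs, key=lambda p: p[1])
    return [p for p in pairs if p is not hi and p is not lo]

# (game number, adjusted opponent rating); the ORDER here is hypothetical
games = list(enumerate([56.3, 35.6, 53.5, 83.8, 76.9, 79.6,
                        61.9, 80.2, 80.7, 66.0, 66.9], start=1))
trimmed = trim_extremes(games)
r = pearson_r([g for g, _ in trimmed], [rt for _, rt in trimmed])
```

A positive `r` here means opponents got tougher as the season went on; the slope of the same fit gives the "almost 2 points tougher per game" figure.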
When we add OSU (87.1) next week, our second-half season ratings will seem even tougher.
*Summary (for comparison purposes, note that UM's current rating is 69.0).
- IA 80.8 < PSU 83.2; with home-road (+3, -3), actually 83.8 > 80.2
- ND 79.8 > WI 77.7; with H-R (-3, +3), actually 76.9 < 80.7
- MSU 76.6 > Purdue 69.0; with H-R (+3, -3), actually 79.6 > 66.0 (even more so)
- Ind 64.9 > IL 63.9; with H-R (-3, +3), actually 61.9 < 66.9
- Total ratings, 1st vs. 2nd half of B10-and-ND games only: 302.1 > 293.8; same gap with the corrected ratings (H-R balances out)
- WMU 59.3, actually 56.3
- EMU 56.5, actually 53.5
- DSU 38.6, actually 35.6
**** Based on the current ratings, UM should have lost to ND and beaten Purdue and IL. Instead, we beat ND and lost to Purdue and IL. The other five major games (Ind, IA, PSU, WI, MSU) went as expected once you adjust for home-road. Note that to make a really fair test, however, we should recompute the Sagarin rating for all the games except the one under consideration (i.e., to determine whether we should have won or lost it). I don't have the software to make those corrections, however.
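For reference, the 3-point home-road correction used throughout the summary is just a flat bump in the opponent's effective rating, something like this sketch (the `site` labels are from UM's perspective, and I'm only reproducing the +3/-3 cases from the numbers above, not asserting where each 2010 game was actually played):

```python
HOME_EDGE = 3.0  # the post's assumed Sagarin-style home advantage, in points

def effective_rating(opponent_rating, site):
    """Opponent strength from UM's perspective: an opponent is ~3 points
    tougher at their place ("away" for UM) and ~3 points easier when
    they come to Michigan Stadium ("home" for UM)."""
    if site == "away":
        return opponent_rating + HOME_EDGE
    return opponent_rating - HOME_EDGE

# The (+3) case: a raw 80.8 becomes 83.8; the (-3) case: 83.2 becomes 80.2
away_case = effective_rating(80.8, "away")
home_case = effective_rating(83.2, "home")
```

Comparing `effective_rating` for each pair of first-half and second-half opponents reproduces the "actually" columns in the summary.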