The question was whether Positional Rating meant more than star (or Rivals' sliding scale) ratings in determining the value of a player toward success of his program.
To do this, I compared averages of recruits who entered college during the period of 2005 to 2008, and stacked that up against real results. In other words, I wanted to see if there was a greater correlation between the current BCS standings and any of several factors:
1. Star Rating
2. Rivals Rating
3. Positional rank in class
4. Number of ranked recruits
5. Team speed (40-yard dash)
6. Team size (height (in) times weight)
Since I can't figure out how to post the entire sheet, I've written out the Top 15 for each category.
For our purposes, I used only recruits who were ranked by Rivals.com. Rivals generally ranks only those recruits it rated higher than 5.5 (which is a low 3-star).
For some positions there were typically 90 or more ranked recruits each year, whereas for others there were only 15 or so. I therefore considered converting recruiting rank into a percentile, since under raw ranks a team that recruited a 4-star receiver ranked 70th (of 92) in his class would be penalized more than a team that recruited the 15th (and last) fullback or kicker. However, I felt this -- the ability to recruit all positions -- was an important part of the difference between Star Rating and Positional Ranking, and a successful team would be responsible for recruiting all of those positions anyway. So instead of a percentile, I simply dropped from the list any team that did not bring in 20 or more ranked recruits in this period.
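The percentile idea can be made concrete with a quick sketch. This is a hypothetical illustration in Python -- all team names and counts here are invented, not pulled from the actual spreadsheet -- showing how a percentile would rescue the 70th-of-92 receiver relative to the last-ranked fullback, and how the 20-recruit cutoff works instead:

```python
# Hypothetical illustration of the two options discussed above;
# "Team A" / "Team B" and their counts are made up.

# (team, position, positional_rank, ranked_recruits_at_position)
recruits = [
    ("Team A", "WR", 70, 92),   # 4-star receiver ranked 70th of 92
    ("Team B", "FB", 15, 15),   # the 15th (and last) ranked fullback
]

for team, pos, rank, pool in recruits:
    # Percentile: fraction of the position pool this recruit outranks.
    percentile = 1 - (rank - 1) / pool
    print(f"{team} {pos}: raw rank {rank}, percentile {percentile:.2f}")
# By raw rank the receiver (70th) looks far worse than the fullback (15th);
# by percentile the receiver (0.25) actually beats the fullback (~0.07).

# The cutoff used instead: drop teams with fewer than 20 ranked recruits
# over the whole 2005-08 window (the three counts are the real cut teams).
team_counts = {"Team A": 45, "Cincinnati": 7, "Indiana": 13, "Connecticut": 15}
kept = sorted(t for t, n in team_counts.items() if n >= 20)
print(kept)  # only the team with 20+ ranked recruits survives
```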
This eliminated a lot of mid-majors, but the only BCS-conference teams to be cut were Cincy (7), Indiana (13) and Connecticut (15). I assume that these teams, which could not average even five 3-star-or-higher recruits per year over four years, would perform below the teams included.
I also did not account for attrition (transfers, early departures, etc.). Instead, I simply omitted the 2004 class altogether, figuring the good done by holdovers roughly cancels out the attrition from the four following classes. I understand that this is very inexact, but I feel the results are still useful for our purposes.
Results Skewed by Southern Bias in Data
Particularly interesting were the SEC schools and the higher number of ranked recruits they brought in. Among teams that had more than 85 recruits in this period, we get Florida (91), Alabama (95), Georgia (90), Florida State (88), LSU (88), Oklahoma (88), Tennessee (86) and Auburn (86). In other words, six of the eight over-recruiters were SEC schools, and all were in the South. Over-recruiting did seem to correlate with winning, but not as strongly as class rank or star rating. I posit that this represents not so much the SEC's ability to recruit better talent as Rivals.com's tendency to overrate recruits in the South. It's notable that teams that recruit nationally, like Notre Dame and Michigan, were the most overrated by talent influx, while Northern teams whose recruits were mostly Northern were largely underrated. If you look at the cutoff line where Rivals stops its rankings, the picture becomes even clearer: the last few 5.6-rated players included in the rankings are normally from Southern states, while the alphabetical list of unranked, 5.5-rated players is disproportionately Northern.
Top 15 -- Current BCS Standings
6. Texas Tech
7. Penn State
8. Boise State
9. Ohio State
12. Oklahoma State
13. Georgia Tech
Top 15 -- Star Ranking Only
3. Ohio State
4. Florida State
11. Notre Dame
12. Penn State
Top 15 -- Rivals Ranking Only
4. Florida State
7. Ohio State
10. Notre Dame
13. Penn State
Top 15 -- Avg Positional Rank of Recruit in Year
4. Florida State
6. Ohio State
10. Notre Dame
15. Penn State
Top 15 -- Most Ranked Players
4. Florida State
9. Notre Dame
13. South Carolina
Top 15 -- Avg. 40-Yard Dash Time
4. Texas A&M
5. Southern Miss
7. Ohio State
10. Georgia Tech
13. Kansas State
14. South Carolina
(Little brother is 16th!)
Top 15 -- Avg. size of recruits
2. Penn State
8. Mississippi State
9. Texas Tech
13. Virginia Tech
15. Boston College
Recruiting as a whole was shown to be a fair but imperfect predictor of overall success.
Team speed was the most overrated statistic; even after removing teams without recruits at every position, the speed ratings made little mark on anything. In fact, Texas Tech (4.78) and Missouri (4.81), both of which employ speed-based offenses, were near the bottom in total team speed. This means that either speed makes little to no difference in a team's ability, or 40-yard-dash times are arbitrary (or, in Brian's terms, "FAKE").
No recruiting statistic stood out as a definitive predictor of team ability, but the positional rank of recruits did, in fact, prove to be a slightly better metric than star rating, and a much better predictor than other factors.
To look at Michigan in particular, it does in fact seem that positional ratings mattered a great deal.
Top 15 positional ratings of M recruits 2005-08:
1. Justin Boren (#1 Center, 2006)
2. Marques Slocum (#1 Guard, 2005)
3. Brandon Minor (#1 Fullback, 2006)
4. Zoltan Mesko (#2 Kicker/Punter 2005)
5. Brandon Graham (#2 Mack Linebacker, 2006)
6. Stephen Schilling (#2 Guard, 2006)
7. Ryan Mallett (#2 QB-Pro Style, 2007)
8. Donovan Warren (#3 Cornerback, 2007)
9. Jonas Mouton (#3 Safety, 2006)
10. Boubacar Cissoko (#4 Cornerback, 2008)
11. Kevin Grady (#4 Running Back, 2005)
12. Kevin Koger (#4 Tight End, 2008)
13. Antonio Bass (#5 Athlete, 2005)
14. David Molk (#5 Center, 2007)
15. Carlos Brown (#6 Running Back, 2006)
Look at all that attrition!
Other top recruits in high percentiles include Mario Manningham, Stevie Brown, Greg Mathews, Michael Shaw and Terrance Taylor.
Then again, here's our lowest positional ratings:
1. Martavious Odoms (Receiver 81)
2. Mark Huyge (Tackle 76)
3. James Rogers (Athlete 71)
4. Chris Richards (Athlete 63)
5. Troy Woolfolk (Cornerback 59)
6. Obi Ezeh (Running Back 58)
7. Mark Ortmann (Tackle 55)
8. John Ferrara (Strongside DE 55)
9. Carson Butler (Strongside DE 55)
10. Perry Dorrestein (Tackle 53)
11. Quinton Woods (Strongside DE 53)
12. Greg Banks (Strongside DE 51)
13. Artis Chambers (Safety 51)
14. Roy Roundtree (Receiver 50)
15. Zion Babb (Receiver 49)
As you can see, these are obviously positions in which there were a lot of recruits. Percentile-wise, the lowest guys were Odoms, Huyge, Rogers, David Cone, Chambers, Brandon Logan, Quinton Patilla, Steve Watson, Woolfolk and Ezeh.
So it does seem that positional ranking means a lot, even if the player was projected at a different position. And if so, it is even more devastating that Michigan lost Bass, Slocum, Mallett and Boren before they could become top contributors. Equally devastating is that Stevie Brown and Kevin Grady have been busts, at least up until now.
In the future, as a fan, this tells me that my projections for upcoming seasons need another look, and that when evaluating recruits, their percentile means more than their raw ranking.
Other than that, plus the already well-established fact that the SEC badly needs to get over itself (i.e. Rivals scouts need to get over their fear of crossing the Mason-Dixon line), this analysis proves little.
Hope you were entertained.
Very interesting, thank you. You must have a lot of free time!
Link to current positional rankings of our 2009 class. Eight are below the top 30 at their position, while seven are in the top 10.
I know the recruiting season isn't over yet, but maybe you can get a start now on the current recruiting year. This way we can keep an eye on how we rank in different categories.
I knew star rankings don't carry much clout. Look at the difference between M's rankings on Rivals compared to Scout. Who's best just boils down to an educated guess. The big unknown in the rankings is how a player will fit into a coach's system. That might give a +/- of a star in appearance. Mallett, for example, could go from looking like a 5* with Carr to a 4* with RR.
Your post is very interesting. I would be interested to see the success of teams based on how much $$ is spent on the program... I wonder how we would rank in that department.
Very interesting stuff. I always thought that recruiting for the position/scheme, as opposed to the star ranking, made more sense, and this kind of reinforces it. It is startling, though, to see how many top-ranked kids in a class were lost by UM.
Two questions - One, did you notice any correlation between the relative strength of a particular position between years and the success the recruits had? In other words, if 2006 was a "down" year for safeties (few top stars), did the positional rankings mean as much as in a highly competitive class/position?
Second question - I know you mentioned UM and ND as overrated programs, but did you notice any pattern in schools like Miami being so highly ranked yet having abysmal teams? I was shocked to see them in the top 15 for virtually every ranking except actual results/BCS.
Actually, the biggest shift was simply that more recruits were included each year. In 2005, 943 recruits (total) were included in the rankings. In 2006 it was 998, then 1,107 in 2007 and 1,196 in 2008.
In 2005 there were 57 safeties; in 2006, 62. 2007 and 2008 had 83 and 84, respectively. In 2006 Rivals gave 5 stars to just one safety, Reshad Jones, who went to Georgia. Second was Jonas Mouton, then Tervaris Johnson, Darian Hagan, DeAndre McDaniel, then Stevie Brown. Though Brown has been eeuuuh, and Mouton moved to linebacker, all six were starters in their third years. 2005, with a similar crop, has produced similar results, with one 5-star and a bunch of starters. In 2007 and 2008 there were no 5-stars, even though the classes were bigger. I guess what that means is there isn't a correlation.
I ran different numbers as I was putting it together, and the more recruits that were input, the more things like that evened out. We're essentially a big enough country with enough football players that except for maybe extreme cases at the top, the classes are going to be pretty even.
I wasn't able to track individual success, except for pulling out Michigan players and using my base knowledge concerning their progression.
For Miami, and Florida State, too, I was pretty shocked at how high they got. I think that's part of the Rivals effect; Southern Schools were WAY overrepresented in the rankings. California (608) and Texas (573) had the most ranked players, followed closely by Florida (470). The next was Georgia (240), then Virginia (196), Ohio (180), Pennsylvania (150), Mississippi (135)(!), Louisiana (121), North Carolina (119), Alabama (114) and Illinois (103).
For anyone who's worked in presidential polling, you should recognize part of what those numbers mean (for those who don't, you're looking at per-state U.S. African American population density). But that's only half the story; I think the other half is simply that Rivals spends more time scouring the South, and gets overexcited about the 45th-best player from Florida while ignoring the 4th-best in New York. Miami and FSU have more players from Florida than any other team, whereas Florida does more recruiting nationally and across the South.
Great point about the South. I've always been driven crazy by the extreme exuberance shown by the recruiters toward players from the South, though I do understand that there are some socio-economic and demographic factors that play a role (how's THAT for PC!). What really surprises me, having seen your numbers, is how bad both Ole Miss and Miss St. have been considering how fertile their backyards appear to be. I actually read a book called Meat Market about college recruiting with Ed Orgeron (Orge!) down at Ole Miss, and it continuously shocked me how few top-notch kids they kept from in-state.
Still, great stuff, and I hope you do another one after another year or so.
May I ask what kind of analysis you did (regression, simple correlation, etc.) and how you arrived at the conclusion that position rankings differ in predictive ability from star rankings? Is the difference significant?
Thanks so much for embarking on this project--I've always thought that we could learn quite a bit about the reliability of recruiting rankings if we just did some simple analyses of the predictive value of each recruiting variable on success in college. The hard part, I'm sure, is figuring out how to assign a numerical value to success in college.
I call it "comparative" analysis, i.e., I stuck 'em next to each other and drew lots of lines on Photoshop, then measured how long my lines were.
Yeah, I'm sure there's 20 better ways to do this without turning on an abacus, but considering I learned about 30 features of Microsoft Excel this week, I was hoping I'd get cut some slack.
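For what it's worth, the "lines in Photoshop" comparison has a one-function formalization: Spearman rank correlation, which just measures how closely two orderings of the same teams agree. A minimal sketch in plain Python -- the team ranks below are invented for illustration, not from the diary's actual data:

```python
def spearman(rank_a, rank_b):
    """Spearman rho between two rank orderings of the same teams (no ties)."""
    n = len(rank_a)
    d_sq = sum((a - b) ** 2 for a, b in zip(rank_a, rank_b))
    return 1 - 6 * d_sq / (n * (n ** 2 - 1))

# Invented example: final standings vs. where two recruiting metrics
# would have placed the same five teams.
bcs_rank        = [1, 2, 3, 4, 5]
star_rank       = [2, 1, 5, 3, 4]
positional_rank = [1, 2, 4, 3, 5]

print(spearman(bcs_rank, star_rank))         # 0.6
print(spearman(bcs_rank, positional_rank))   # 0.9 -- closer to the standings
```

A rho near 1 means the metric's ordering closely matches the standings; computing rho for each of the six factors against the BCS standings would put a number on "slightly better metric" without an abacus.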
Was just wondering. Thanks for the diary.
I need to clarify something real quick.
This doesn't mean you should look at a recruit's ranking before their star rating.
It does mean, however, that the average star rating of a CLASS is not as good a predictor as its AVERAGE POSITIONAL RANKING.
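A toy example of that distinction (all numbers invented): two classes with identical average star ratings can still look very different by average positional rank.

```python
# Invented classes: (stars, positional_rank) for each recruit.
class_a = [(4, 2), (4, 30)]   # same stars, one recruit ranked far down
class_b = [(4, 3), (4, 5)]    # same stars, both near the top of the position

def class_averages(cls):
    """Return (average star rating, average positional rank) for a class."""
    avg_stars = sum(s for s, _ in cls) / len(cls)
    avg_pos = sum(p for _, p in cls) / len(cls)
    return avg_stars, avg_pos

print(class_averages(class_a))  # (4.0, 16.0)
print(class_averages(class_b))  # (4.0, 4.0) -- identical by stars, not by position
```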
This is really interesting! Thanks very kindly for the hard work -- I am thinking differently about recruiting now -- your treatment by percentile washes out the false precision of the absolute rank, and reflects both the number of players in a position and the relative strength of the player in his group (at least as judged by people evaluating). Just as interesting to me is the southern bias you picked up in your analysis. I think the observation that some schools with loads of southern talent underperform their talent and some with loads of northern talent overperform, does indicate that there is an upward bias toward southern players. Would an alternate hypothesis be that some northern-laden schools have superior coaching? Hmmmm.
I think the problem was mostly in the middle ground. After the 5 and 4 stars -- for whom some scouts have been known to use a slightly homoerotic horse racing metaphor -- there's a big group of players you can basically class as "Definite D-IA talents," or "3-stars." This group is huge, and they are where scouting organizations seem to spend a comparatively smaller amount of time.
They have to put their resources where they will do the most good, which means that if you're going to pick an arbitrary sample of these (perhaps 1,000 to 2,000 per year), it might as well be the ones that are easiest to find: on top national programs, at major schools' camps, and in the town you pass on I-75 on the way to Miami. For this reason, the bottom of the rankings has tended to fill up with a disproportionate number of these 2.5- to 3-star guys from the South, while players at the same level in the North are included in the database but not given the kind of attention required to rank them against each other.
To avoid spending MUCH more time inputting players, and having half the list be guys who didn't even end up playing D-IA football -- and because I didn't want to come up with a way to normalize the rankings when Michigan picks up the 385th split end (B-Easy?) -- I cut off my own listings where Rivals' rankings end. There were definite disadvantages: e.g., Cincy was discarded as recruiting no better than Indiana, when in truth they built a solid program on a big number of overlooked Ohio Valley 3-stars. And it exacerbated a problem in my rankings that was not such a big deal for Rivals, whose readership doesn't really care that much whether their guy was really the 65th or 85th inside linebacker.
By the way, Rivals is doing a much more extensive ranking for 2009. Every position is up, and I'm projecting perhaps as many as 1,400 recruits will be included in my signing day update.
If this is the case, it could be argued that Sparty's recruiting class is stronger than ours, couldn't it?
Check out Rivals' position rankings and see how many MSU recruits are in the top 10 vs. how many of ours are.
I agree, with the caveat that most of MSU's top kids are on offense, specifically RB and OL, and most of them are from in-state. By comparison, UM's top recruits are spread more evenly between offense and defense, and have a more national feel. I think that with UM already having 6 redshirts on the OL and a substantial stable of RBs, MSU was able to sweep in and nab some kids who I'm not sure would have been as available if UM had recruited them harder. Still, MSU's past two classes have been solid, and I hope it results in a more competitive series between the two schools.
The fact that there are more Southern players in the rankings does not necessarily reflect bias. High school football is a bigger part of the culture of the South than elsewhere (many Southern high schools have spring football) and the participation rates are significantly higher. In fact, some analysts have made the exact opposite claim: that Northern players are often overrated to increase their representation in the number of top 100 players.
Whenever culture is cited as the reason behind X, my eyebrows start making altitude adjustments.
Not that I entirely disagree. There are only so many top athletes that can be produced by American alleles and American coaching each year. That's where the cultural difference lies, in my opinion.
I like to tell a tale of when I was umpiring the U-M rec softball league. The football team had two squads in the top level. Now I know baseball and I know baseball talent, and the greatest baseball talent I ever saw in my life was playing centerfield for one of those football squads. There are guys who say they saw Ken Griffey Jr. in high school, and I imagine it was like that. He could hit the ball wherever he liked, with power or for average -- the sweetest natural swing you ever saw. But it was the field where he really shone. This guy would have won every gold glove in centerfield in his career. His instincts were uncanny. The fluidity of motion made me gape at the speed with which he was reaching the ball.
That guy was Braylon Edwards.
Braylon grew up in football, and loved football, and learned football technique, and shone in football. But his tools would have made him a Hall of Fame baseball player if he'd been born into a baseball culture instead.
The Midwest, particularly Michigan and Indiana, does this with basketball. The Northeast still siphons off tons of talent with baseball (particularly New York). Many of the northernmost states, particularly Massachusetts, Minnesota, Maine and the Dakotas, shuttle a good portion of in-state athletes into hockey.
However, if your kid is enrolled at a high school in suburban Detroit, and he's got the kind of obvious talent that these kids who end up in D-IA have, then he'll almost assuredly end up on the football squad here.
If there's a cultural effect, then, I would suggest it happens, but happens below the marker I set for this analysis. A cultural emphasis in some towns on football over education could well turn what would in Birmingham end up being just another offensive lineman into a 1.5-star with a couple low offers. But that wouldn't appear on Rivals' rankings. Rather, I maintain that the reason for the Southern Bias is simply the Rivals scouts spending more time there, and thus filling out the bottom of their rankings with the 3-stars there while just listing the equal talent from Northern schools.
People who follow recruiting for a living have admitted that Northern recruits are often overrated. I've got to take them at their word. As for the Braylon story, you do realize that there are a million other stories like that, all over the country, right?