Michigan 12th Best University (Worldwide) By Reputation

Submitted by Everyone Murders on

Different university rankings abound, but this one's methodology has some merit (anonymous survey of 17,000+ academics worldwide).

The Alumni Association states:

U-M has been ranked 12th in the annual Times Higher Education World Reputation Rankings. The 2012 Reputation Rankings are based on results from an invitation-only survey of scholars from around the world. The University is one of only three public institutions in the top 15—the others are the University of California, Berkeley (5), and the University of California, Los Angeles (9).

(See official Michigan release here: http://www.ur.umich.edu/update/archives/120315/rankings .)

The survey results are here: http://www.timeshighereducation.co.uk/world-university-rankings/ .

Other B1G representation (based on reputation) includes Northwestern (26th), Wisconsin (27th), Illinois (31st), Minnesota (42nd), Ohio State (57th), MSU (96th) and Purdue (98th). Good showing for Michigan and the B1G!

panthera leo fututio

March 28th, 2012 at 10:44 AM ^

For all their fan- and athlete-based shenanigans, OSU is actually a really good school. 57th in the world, ahead of Vanderbilt, Rice, Tufts, Rutgers, Notre Dame, etc., is IMHO pretty impressive.

(That said, arbitrary rankings are arbitrary rankings. These appear to be much less open to gaming than, e.g., US News, but their chosen method for aggregating opinions over diverse fields and geographies is bound to have a huge influence on results, and there's no reason to think that any one method is the most appropriate.)

jtmc33

March 28th, 2012 at 10:12 AM ^

So we all can understand what ranked by "reputation" truly means:

It's like the University of Michigan is a 5-star All Purpose Back to Rivals, and a 5-star RB to Scout.   ESPN would have us as a 4-star Athlete.

Dezzy

March 28th, 2012 at 10:44 AM ^

Just want to give props to the OP on being happy to see other B1G schools on the list and not ripping them for being lower than Michigan.

Space Coyote

March 28th, 2012 at 10:54 AM ^

The B1G as a whole is filled with great universities.  The biggest thing that separates Michigan from the other B1G schools is the number of great programs it has.  The other B1G schools all have some great programs, but not nearly as many.

Still, half the B1G is in the top 50 of engineering schools in the world based on this ranking.  In my opinion, if you know what you want to go into, there are multiple B1G universities that will get you a very well-respected degree.  And in engineering there really isn't a B1G school that won't represent you well once you finish school.  It's impressive that so many schools have come together in a region to become both academic and athletic powers, and it makes me proud that the school I root for and the schools I've attended are a part of that tradition.

Everyone Murders

March 28th, 2012 at 11:03 AM ^

Much appreciated.  The overall academic reputation of the B1G is one of its greatest assets, and while it's always fun to mock various "Brah!!" fans of some of our rivals and some of their dopey administrative figureheads (I'm looking at you in particular, Gordon Gee), each B1G school offers strong programs.  (I don't really know much about Nebraska, other than Big Red haunts my dreams.)  Sports are an important element of the student, alumni, and fan experience, but academics should remain in the spotlight.

I've also always appreciated that with the exception of Northwestern, all of our schools are public institutions.  That makes the high rankings across the board all the more satisfying.

michelin

March 28th, 2012 at 11:24 AM ^

The OP's cited method for ranking universities--in which Ohio was ranked in the 50-60 range--is based on the subjective opinion of so-called "leading experts," and the choice of that relatively small group can bias the results.

By contrast, the U.S. News and World Report rankings--in which Ohio fails to make the top 110--are based on many different objective criteria generally accepted by educators as a whole.  The U.S. News and World Report is a more widely used and respected method.

In any case, note that UM makes the top 15 in both sets of rankings.

http://www.usnews.com/education/best-colleges/articles/2011/09/12/how-u…

panthera leo fututio

March 28th, 2012 at 12:19 PM ^

The US News rankings are probably the most influential amongst prospective students, but they're deeply flawed, both in their "objectivity" and their impact on the actual project of, you know, teaching people stuff.

One aspect of US News being so "objective" is their susceptibility to gaming. Schools do things like hiring recent graduates to pull books off shelves and then put them back on, encouraging already-admitted students to retake entrance tests, encouraging applications from students who have very little chance of admission, making sham adjunct hires at ridiculously high salaries that are then "donated" back to the program, falsifying test scores, etc. All of these practices are in direct response to the measurement tools employed by US News, and beyond their dishonesty and their ability to skew the "objective" final rankings, many of these tactics result in concrete harm for actual college entrants.

There's also the issue of aggregating measures. Is there really an "objective" way to combine scores of prestige, student-teacher ratio, selectivity, alumni donations, etc. into a single mark of quality? Given some thought, the answer clearly is no. Finally, the bogus nature of the rankings shows up in their instability. If the rankings were effective at ascertaining some enduringly true quality of schools, one would expect much less variation from year to year. (Of course, if you're US News, this volatility is a feature, not a bug -- it drives sales/page views. Who's going to be top 25 this year?!?)

michelin

March 28th, 2012 at 1:03 PM ^

I trust more in measures, like the US News report, whose influences are transparent--and can be easily criticized--than in "subjective" measures, whose influences are muddy.  The latter subjective ratings cited by the OP come from Thomson Reuters, which derives most of its income from the financial community.  So, I question whether its biases influence the selection of those who rate the schools.

It is true that objective methods, like the US News report, can be subject to "gaming", but if gaming is the reason for rating discrepancies, then the lower rating of Ohio (on USNWR vs. TR) would imply that it is less likely to engage in deceptive practices.  Do you really believe that?  What concrete evidence do you have that a school like Ohio--widely known for its cheating, and whose trustees admit no guilt--is less likely to "game" the ratings?  And even if one accepts this ridiculous idea, how do you know that the greater "honesty" of Ohio would have a major impact on the results (i.e., enough to drop them from a rank in the 50s to one greater than 110)?

Technical note

Another reason to favor the US News report is that aggregate ratings have advantages.  Many studies suggest that it is preferable to decompose ratings into their components and then aggregate them, rather than rely on a single, holistic subjective rating.  The sum of the decomposed ratings usually is not very sensitive to the weights placed on the individual items (that are later aggregated).  Most studies have shown that decomposed ratings outperform holistic ones.


panthera leo fututio

March 28th, 2012 at 1:25 PM ^

I wouldn't point to gaming to explain the particular rank of any given school (and I agree that you certainly can't infer anything about the honesty of, say, OSU's admissions practices by looking at their position on the list). What I think the existence of many forms of gaming indicates, though, is both the accuracy limits of any sort of comprehensive ranking scheme, as well as the potential perverse consequences of these schemes.

I don't think the perverse consequences are necessarily a knock on US News itself; any list that has as much of an impact as theirs does will generate a lot of incentives to skew the results. Maybe the broader point should be that we all pay too much attention to ranked lists generally, and it would be better for everyone if more college appraisals/attendance decisions were based on more detailed personal inspection of programs of interest. Of course, this would seriously limit the ability to engage in dick-measuring contests, but there's always football.

Finally, wrt the virtues of aggregated scores of objective measures, I guess I'd have to look through the research you mention on score composition. It seems intuitive, though, that a great deal would ride on the relative weights that score components receive, e.g. deciding whether alumni donation levels are worth 2% or 15% of the final score would seem to have a great deal of impact on the ratings of many schools, as well as on the average ratings of public vs private schools. Given the extreme arbitrariness of any scheme, I actually prefer ratings that basically just say: "Here's the average of what a bunch of more or less informed people think. Take it fwiw."
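The weight-sensitivity point above can be sketched with a toy example (the schools, component scores, and weightings below are all made up for illustration and reflect no real methodology): moving alumni donations from 2% to 15% of the composite is enough to reorder the list.

```python
# Hypothetical component scores (0-100) for three made-up schools.
schools = {
    "School A": {"prestige": 90, "selectivity": 85, "donations": 40},
    "School B": {"prestige": 80, "selectivity": 80, "donations": 95},
    "School C": {"prestige": 88, "selectivity": 90, "donations": 30},
}

def composite(scores, weights):
    """Weighted sum of component scores; weights should sum to 1."""
    return sum(scores[name] * w for name, w in weights.items())

def ranking(weights):
    """Schools ordered best-first under the given weighting."""
    return sorted(schools, key=lambda s: composite(schools[s], weights),
                  reverse=True)

# Donations worth 2% of the final score...
low = {"prestige": 0.58, "selectivity": 0.40, "donations": 0.02}
# ...versus 15%, taken proportionally out of the other components.
high = {"prestige": 0.50, "selectivity": 0.35, "donations": 0.15}

print(ranking(low))   # School C edges out School A; School B is last
print(ranking(high))  # School B's donations now vault it to first
```

Neither weighting is more "objective" than the other, which is the point: the ordering is an artifact of an arbitrary choice.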

michelin

March 28th, 2012 at 4:08 PM ^

I can only shake my head and wonder why.

In any case, it is now hard for most readers to follow the discussion.

PS: the insensitivity of decomposed models to weights is discussed partly in studies showing that models with "equal weights" perform as well as (or sometimes better than) those with so-called optimal weights derived from regressions.  I agree that the finding is counterintuitive.

snackyx

March 28th, 2012 at 11:35 AM ^

"The US News and World Report is a more widely used and respected method."  Widely used? Yes.  Respected? You may want to do some reading on this.

superman26

March 28th, 2012 at 11:42 AM ^

I'll take 127 for Arizona State! Makes me feel a little bit better about attending ASU. Surprised to see Texas A&M, Georgetown, and U of Virginia below ASU though. It sucks U of A is ahead of ASU but certainly not surprised.

Everyone Murders

March 28th, 2012 at 3:30 PM ^

There have been a few criticisms of the reliability of the poll, followed by a couple of folks slamming university rankings as a whole.  I think the criticisms of the Times' poll miss the mark.  And I think it's because people are conflating "Reputation" (i.e., what the survey tries to measure) and "Best" (a separate topic altogether, and much harder to accurately measure).

The Times Higher Education World Reputation Rankings focus on "Reputation" and the results were determined by a broad survey of academics.  It isn't asking academics "(w)hich universities are the best?" - it's asking "(i)n your opinion, which universities do you think have the best academic reputation?". 

Based on 17,000+ respondents, they came up with their rankings.  And they chose a pretty reasonable sample set - people in academia.  Maybe it would be better to caption their findings "Reputation Among Academics".  But there does not seem to be a whiff of intellectual dishonesty in the Times survey.

Can rankings be gamed?  Of course.  The best example I've ever seen is Cooley Law School's in-house ranking system that has them ranked ... second in the U.S.!  Beneath Harvard, but above Yale, Boalt, Penn, Michigan, UVa, NYU, and everyone else.  See this: http://www.cooley.edu/news/2011/020411_judging_the_law_schools.html .    

michelin

March 28th, 2012 at 5:30 PM ^

All academics went to school somewhere, and if you choose a group of only 50, their attitudes may be influenced by that experience.  They may show favoritism toward their own schools--or types of schools--or other factors, like location (South, Midwest, East Coast, or West Coast).

Also, among the chosen raters, there will be different types of academics, who tend to have much different social attitudes and values.  Thomson Reuters, which commissioned your rating study, gets a lot of its money from stockbrokers, so I remain concerned about possible bias in selecting certain types of academics.

You are correct that asking about "reputation" may differ semantically from asking about the "best" school--or the factors that make a school "better".  However, to the extent that is true, the academic "reputation" of a school is more subject to distortion than are objective measures, which predict agreed-upon indicators of post-graduate success.

lonewolf371

March 28th, 2012 at 2:48 PM ^

The OP listed the rankings from the 2011-2012 Times rankings, while the article that has Michigan 12th is the 2012 reputation rankings. In the 2012 reputation rankings, Illinois is 23rd, Wisconsin is 27th, Northwestern is 35th, Purdue and Minnesota are tied for 47th, Ohio State is 51-60, and Michigan State is 71-80.