Rankings of business schools generally fail to evaluate the inherent quality of an institution, instead ranking the people who choose to attend it.

UPDATE: If you came here from a source that did not make clear the wry, tongue-in-cheek nature of my rankings, also read the clarification.

An MBA student from UC-Davis will graduate, on average, with a starting salary $30,000 lower than that of a graduate from nearby UC-Berkeley. Can we conclude that two similarly credentialed students at the two schools would differ that much in market value? That reasoning ignores selection bias: a student accepted by both schools is quite likely to choose the one regularly ranked in the top 10.

As long as top candidates choose to go to top programs, a higher ranking confounds the quality of students and the quality of a school. The proper interpretation of Business Week’s rankings, for example, is not that Harvard is a better school than Blah College, but that the type of students who go to Harvard do better after graduating from Harvard than the type of students who go to Blah College after graduating from Blah. Thus, a high starting salary for Harvard graduates might imply that Harvard’s professors can polish rough stone into beautiful rubies, but it is also possible that Harvard has the benefit of students who could have very well succeeded anywhere. (Note: I pick on Harvard because it actually does quite well in my rankings, supporting the rubies theory).

Which schools do best with the students they have? Traditional rankings fail to tell a given student with a given skill set which schools are most likely to increase his market value. That is the goal of these rankings, which also highlight how much a change in methodology can alter the results. The full rankings are below the jump; the methodological disclaimers (and there are many) are at the very bottom.

Ranking of Business Schools by Efficacy

Rank   School   Market Value   GMAT   GPA   Adj. Salary ($K)
1 Cornell (Johnson) $14K 682 3.31 118
2 Indiana–Bloomington (Kelley) $13K 656 3.37 104
3 University of Virginia (Darden) $12K 688 3.33 120
4 Texas–Austin (McCombs) $12K 673 3.38 109
5 Harvard $8K 713 3.63 134
6 Vanderbilt (Owen) $7K 644 3.27 101
7 Rice (Jones) $6K 642 3.25 100
8 Minnesota–Twin Cities (Carlson) $7K 661 3.37 99
9 MIT (Sloan) $6K 705 3.5 126
10 Maryland–College Park (Smith) $6K 650 3.34 98
11 Georgetown (McDonough) $6K 677 3.26 108
12 Ohio State (Fisher) $6K 661 3.41 97
13 NYU (Stern) $6K 700 3.4 123
14 Duke (Fuqua) $6K 690 3.38 114
15 UNC–Chapel Hill (Kenan-Flagler) $6K 681 3.27 110
16 Brigham Young (Marriott) $5K 661 3.53 93
17 Rochester (Simon) $5K 673 3.52 98
18 Texas A&M (Mays) $5K 665 3.4 97
19 Northwestern (Kellogg) $3K 704 3.5 122
20 Boston College (Carroll) $3K 651 3.35 94
21 Univ. of Pennsylvania (Wharton) $2K 712 3.53 130
22 Columbia $2K 707 3.4 128
23 Southern Methodist (Cox) $2K 640 3.3 95
24 Arizona State (Carey) $2K 675 3.44 98
25 Wisconsin–Madison $1K 656 3.37 92
26 Michigan State (Broad) $1K 633 3.22 97
27 Chicago $1K 709 3.5 126
28 Yale $1K 700 3.47 116
29 Purdue (Krannert) $0K 662 3.32 94
30 Penn. State (Smeal) $-1K 650 3.3 92
31 Emory (Goizueta) $-1K 685 3.3 106
32 Washington Univ, St. Louis (Olin) $-2K 674 3.38 95
33 Michigan–Ann Arbor (Ross) $-2K 700 3.3 118
34 Illinois–Urbana-Champaign $-2K 627 3.4 92
35 UCLA (Anderson) $-2K 704 3.6 114
36 Boston University $-3K 668 3.38 92
37 Dartmouth (Tuck) $-4K 713 3.46 127
38 Carnegie Mellon (Tepper) $-5K 696 3.32 111
39 Georgia Institute of Technology $-5K 665 3.4 88
40 Babson College (Olin) $-5K 631 3.21 93
41 Stanford $-5K 721 3.61 133
42 Notre Dame (Mendoza) $-7K 673 3.2 95
43 Univ. of Southern Cal. (Marshall) $-8K 689 3.3 102
44 Univ. of Washington (Foster) $-8K 679 3.38 92
45 U. California–Berkeley (Haas) $-9K 710 3.57 115
46 Univ. of California–Davis $-11K 674 3.37 87
47 Univ. of Iowa (Tippie) $-15K 652 3.34 76
48 U. California–Irvine (Merage) $-16K 667 3.34 79
49 Univ. of Georgia (Terry) $-16K 653 3.4 74
50 Univ. of Florida (Hough) $-30K 680 3.4 70

 

These rankings consider only the 50 schools in the most recent U.S. News rankings. Specific methodological details are available at the bottom.

Adjusted salary (in thousands of dollars) reflects both the starting salary of graduates employed within three months of graduation and a downward adjustment for those who are not.

Market value denotes the difference between a school’s adjusted salary and what that school’s students would be expected to earn, given their qualifications at admission, at an average business school (for a loose definition of an "average" school in this context, see no. 29). A student with a high GMAT score and an exceptional undergraduate GPA is likely to receive higher offers than one with lower scores regardless of the MBA program he attends (not because of the undergraduate GPA itself, but because of what it reveals about the person). Market value indicates how much a school improves on that baseline with the students it actually has.

The notion of market value is akin to the distinction between two corporate tasks: recruiting the best talent, and guiding that talent to its potential. Most of the popular business school rankings are biased toward achievements in recruiting, while the rankings above measure efficacy with the talent at hand. Of the U.S. News top ten, only Harvard and MIT are also in the top ten in efficacy. Conversely, Stanford and Berkeley, also top-ten U.S. News schools, land in the bottom ten here, suggesting that the members of their admissions staffs deserve sizable bonuses.

By way of example, UT-Austin, Washington University in St. Louis, and UC-Davis admit nearly identical student bodies, quantitatively speaking, yet the market values those students very differently two years later. Conversely, Yale and Cornell report nearly identical starting salaries and therefore end up only one spot apart in U.S. News. Yet Yale’s class is superior in GMAT, GPA, and selectivity, so Yale should do better with it; in efficacy Yale ranks 27 spots below Cornell, which takes the number one spot.

So, what’s the goal of this? Perhaps there’s a deep philosophical point about the purpose of education. I adopt market salary as the measure of value purely because the data are available and it is the most common quality measure in business rankings (or perhaps because of my unfaltering adherence to the social philosophy underlying classical economics). There’s also a mundane point: rankings are not difficult to generate, are easy to game, and are even easier to tailor toward mass hysteria and overreaction. To that end, extra credit ("adjustment factors") will be applied to next year’s rankings for posting a comment below ("brand management and awareness index"), sending me money ("investment index"), or publishing my papers ("scholarship discovery index").

 

METHODOLOGY

Disclaimer: This was done in great haste and, in keeping with the tradition of business school rankings, without too much regard for the appropriateness of statistical procedures.

Overview: The rankings are based on the residuals from a regression of adjusted starting salary on average GPA and average GMAT score. That is, a school’s score is the difference between its adjusted starting salary and the salary predicted by an ordinary least squares regression.
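
In symbols, using the quantities defined below and Cornell’s row from the table above as a worked case:

     market value = adjusted salary − predicted adjusted salary

Cornell reports an adjusted salary of $118K and a market value of $14K, which implies (up to rounding) a predicted adjusted salary of roughly $104K.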

Data: Obtained from the 2009 U.S. News rankings of the top 50 business schools, which compiled data on the 2007 graduating class.

Adjusted salary: All students not employed within three months of graduation are (pessimistically) assumed to earn a salary equal to 80% of the average salary of employed students at their institution. This biases the results against schools with low placement rates. If S is the average salary of employed graduates and e is the fraction employed within three months, then
     adjusted salary = S·e + 0.8·S·(1 − e)
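
As a concrete sketch, the adjustment can be computed as below. This is a minimal Python illustration; the function name and the sample figures are placeholders, and the 0.8 penalty factor is the one stated above.

    def adjusted_salary(avg_salary, pct_employed, penalty=0.8):
        """Blend the average salary of employed graduates with the penalized
        salary assumed for graduates not employed within three months."""
        return avg_salary * pct_employed + penalty * avg_salary * (1 - pct_employed)

    # Hypothetical example: $100K average salary, 90% employed within three months.
    print(adjusted_salary(100_000, 0.90))  # 98000.0, i.e. an adjusted salary of $98K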

Model selection: The only available predictors of average earning potential are average GPA and average GMAT, though as proxies for ability they are not bad. A model linear in GPA and/or GMAT badly fails specification tests, so various transformations of average GPA and average GMAT were tried. The final model is
     adjusted salary = β₀ + β₁·GPA + β₂·GMAT + β₃·GMAT² + ε
The model has an adjusted R² of 0.70. Both GMAT terms are highly significant (p < .001) and GPA is marginally significant (p = .084).

Residuals: From the estimated model, we obtain the studentized residuals by which the schools are ranked. The "market value" reported in the table is simply the difference between the actual and the predicted adjusted salary (which, together with rounding, is why a school can occasionally rank just below another despite a slightly larger reported value).
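
For anyone who wants to reproduce the exercise end to end, here is a minimal Python sketch using pandas and statsmodels. The file name and column names (school, gpa, gmat, adj_salary) are hypothetical placeholders for the U.S. News data described above, and this is only one way to implement the fit and the ranking.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical input: one row per school with columns school, gpa, gmat,
    # and adj_salary (the adjusted salary in $K, computed as above).
    df = pd.read_csv("usnews_top50.csv")

    # Final specification: linear in GPA, linear and quadratic in GMAT.
    fit = smf.ols("adj_salary ~ gpa + gmat + I(gmat ** 2)", data=df).fit()
    print(fit.summary())  # adjusted R^2, coefficients, p-values

    # "Market value" is the raw residual: actual minus predicted adjusted salary.
    df["market_value"] = fit.resid

    # Schools are ordered by internally studentized residuals.
    df["studentized_resid"] = fit.get_influence().resid_studentized_internal
    ranking = df.sort_values("studentized_resid", ascending=False)
    print(ranking[["school", "market_value", "gmat", "gpa", "adj_salary"]].head(10))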

Diagnostics: I ran all of the diagnostic tests I could think of in three minutes. Heteroskedasticity is not a problem. Specification tests are mixed. The test for normality: not even close. GMAT and GPA are not significantly correlated with the final rankings. Overall, some tests looked good and some did not. After all, if the methodology were completely sound, how could I tweak it next year to produce an entirely different ranking despite very little change in the schools?

Bias: All of the above methodology was performed before matching the rankings to the identities of the schools. If I have a bias, it is this: those who rank schools based on whether their faculty’s books make your journal’s best-seller list, or on the number of downloads from your proprietary system that is antithetical to the open-access spirit of working papers, or who threaten to drop schools from the rankings for adhering to ethical privacy standards, are cynical freebooting knaves.

13 Responses to “Where I take a turn at ranking business schools”

  1. Would adjusting starting salaries by cost of living and tax level change your results?

  2. Have you looked at where the grads go? And how about some statistical measures like variance, etc.?

    Could it be, considering the focus on activism and social responsibility, that more Stanford grads than Harvard grads opt for political and socially conscious groups?

  3. Anon,

    A great question, and one that I received several emails about today. I do not buy the premise that salaries equate to school quality. My post meant only to suggest that IF we buy into this most common of metrics, there are still some issues inadequately addressed by the common rankings.

    If some students care about social, career, and life issues beyond maximizing the net present value of their future salary stream, then starting salaries say more about students’ priorities than about the quality of the school.

  4. Interesting. Have you tried applying the same methodology to prior years? Might low variation across years support the methodology (or vice versa)?

  5. [...] Where I take a turn at ranking business schools [...]

  6. Nice try, Mike, but you’ve got to be kidding! Most MBA students come in with some experience. If you haven’t controlled for (average) previous salary then you can’t assess the incremental value. — MBA ’72

  7. MBA ’72, if you are proposing that this analysis is invalid because the author didn’t include work experience, I think you are missing the point. The point is that the rankings are flawed and lead many young twentysomethings into life-altering decisions because they believe the ranking methodology is without flaw.

    In the end, I think this is a very interesting article and analysis. It would benefit from more variables, but the premise that the rankings are greatly flawed has merit.

    MBA ’02 (from a top 20 business school from both rankings)

  8. I have to say I’m amazed by some of these comments. I understand the author’s goal to be to question the assumptions of all rankings, not to perfect the methodology for these assumptions.

    I agree that the value of a school for a specific student is a better way to think about rankings. I don’t think, though, that one ranking can achieve that. A top student probably has more to gain from a top school. Also, a less qualified or less able student might be best served by a lower-tier school and might not gain much from a better one.

    This would mean that Business Week would better serve students if it had only their in-depth profiles, and did away with the rankings. But then people wouldn’t read it. The same probably goes for this post. That’s unfortunate.

  9. By not adjusting salary for regional differences in cost of living, your methodology is fatally flawed. I am a Florida MBA student and I assure you, a starting salary of $70K here goes at least as far as $90K+ in the Northeast. With that tweak, the list would look radically different, with UF moving past the middle. Ugh, I fail to see why so many (US News included) overlook this fundamental step.

  10. I REALLY like the concept, but the one thing that such a ranking ignores (not on purpose)… but a ranking like US News ends up including (again… they didn’t mean to include it)… is that one of the biggest factors that makes an MBA good or bad… is not just who is studying in the classes… but also who that person is surrounded by.

    Top dudes going to top schools does not show that the school is “better” (and thus should be higher in ranking)????????

    I can try to explain my view by looking at what you are saying from the other side.

    You say: an average candidate would have a greater “value add” in your rank 1/2/3 school than in Stanford.

    I am saying: a top candidate would have a lower “value add” in your rank 1/2/3 school than in Stanford, because the “value add” in an MBA is governed a whole lot by those other folks in your class.

    Unfortunately(?)… they are in your class because US News has been telling everyone for the last 50 yrs that Stanford is better than your rank 1/2/3.

    An MBA at its core is an experience… the people you hate doing your homework with but who might get you the job or the client 20 yrs later.

    Take a top candidate (you choose how to define the word “top”)… and put him with a group of “bottom” candidates… and 2 yrs later… the top candidate isn’t what he could have been had he been surrounded by other “top” candidates.

    THAT is why, for example, Stanford is what it is today… because over the years… the top candidates have gone there… and been surrounded by other top candidates.

    Being able to get a job offer or a client 20 yrs later… from the guy you hated in school… “IS” part of the value that the school adds to a student.

    One of the best real examples that comes to mind for this purpose… the Indian Institute of Management. They don’t have the best faculty or curriculum… but the fact that everyone who gets in was selected ahead of 120 other applicants… the toughest selection at any B-school in the world… makes the “group” and the final result… and thus the school… a great place to do your MBA.

    In summary… I think you have ignored the fact that the value added to a student by an MBA today is decided a lot by the US News rankings of yesterday… which pulled all the top candidates of yester-years to a single location. Those value adds are real and intangible parts of the MBA experience that decide where the school should be placed in a ranking. (Of course… it’s the same reason why “wrongly” ranked Stanford gets a higher pay packet.)

  11. Perhaps another explanation of the effect is that schools at the top of your ranking generally do a better job of marketing/selling their students to employers, despite the students’ apparently lower quality when they arrive? There is no evidence that the stone is actually polished; perhaps the shiny bit is simply well sold?

  12. [...] publication outlets. There are more debates over the methodology of journal rankings than of ranking business schools. There may be no universal agreement on the right method but there certainly is a wrong [...]

  13. [...] the many issues with ranking schools, one of the most glaring is incorporating the input of those who are impacted by the result. [...]
