MBA rankings are an interesting animal. What makes one school “better” than another? Does the better school teach you more, or does it position you for the highest-paying job? Is it the one whose admissions are most competitive, or the one whose classes are most competitive? The bottom line, when it comes to rankings, is that there is no objective way of answering these questions. Your responsibility, as a potential applicant, is to determine what factors are important to you. Having done this, it isn’t difficult to explore popular rankings (or create your own) in order to find out which schools are “best” as you understand the word.
However, if you’re like most applicants, the “best” school is the one that opens the most (and most lucrative) doors. This is where things get tricky. While the authors of various rankings attempt to develop methodologies that best reflect reality, the fact is that rankings are self-perpetuating. They create reality. Because the objective of many MBA students is not necessarily to gain the most knowledge, but to best position themselves for competitive, high-paying jobs, it’s often less important how good a school is than how good people think it is. In a sense, how good people think a school is *is* how good it is. If employers think a particular school is top-notch, they’ll want to hire its graduates. Knowing the school offers competitive employment prospects, top students will want to attend. As these students enroll, with their high GMAT scores and quality work experience, and as they graduate to top jobs, the rankings will look favorably on the school. This, of course, will influence the opinions of employers, and the cycle will perpetuate itself.
It’s worth noting that this cycle can potentially operate independently of any consideration of the actual quality of classroom education at the schools in question. To some extent, employers aren’t hiring the knowledge you’ve gained at B-school so much as the selection process you went through to get there. If you’re accepted to Wharton, for example, chances are you’re a highly talented, intelligent individual who’ll make a successful employee; companies know you’re a pretty safe bet. (On the other hand, if Wharton’s classroom education weren’t as good as it is, employers might eventually begin to notice the effects of this on the quality of their Wharton hires, which could affect overall perception of Wharton in the long run.)
By this standard, it’s difficult to dispute the assertion that Harvard, Stanford, and Wharton are the top business schools around. The Goldmans and McKinseys of the world hire in greatest quantity at these schools, and they have by far the largest applicant pools, the lowest acceptance rates, and the highest yields (percentage of admitted students enrolling). When students are admitted to Harvard or Stanford, they go. When they’re admitted to Wharton, they go—unless they’re admitted to Harvard or Stanford. I recently heard an admissions officer at HBS quoted as saying that the school tends to lose admits en masse to only two places—the Graduate School of Business at Stanford, and Yale Law School. Even then, the yield rate at HBS has historically approached 90%.
Still, the fact that one school is generally more highly regarded than another doesn’t necessarily mean it’s the better school for you. If, for example, your goal is to work on the business side of the film industry, you’re likely to do better at UCLA Anderson than at a traditional East Coast juggernaut. Or, if you have a young family in North Carolina, it could be that no amount of money could make Stanford a better option for you than Duke.
The bottom line is this: rankings are serious business, and they’re worth paying attention to if your goal is a high-paying job with a highly reputable company. At the same time, magazines cannot publish rankings based on a “felt sense” of which schools are best, and even the most revered rankings are necessarily somewhat arbitrary in their methodologies. For this reason, I think it’s particularly important to understand the methodologies, and to be able to compare them against your personal standards and goals. The remainder of this blog will focus on the 6 (or 7, depending on whether Fortune counts) most highly regarded, commonly referenced rankings:
- US News
- Business Week
- Financial Times
- Forbes
- The Wall Street Journal
- The Economist
- Fortune
The gang’s all here, in one place, for your reading enjoyment! In each case, I’ve included an abbreviated list of the most recent rankings, along with a high-level breakdown of methodology. Then, I’ve included some reflections on each. My critiques are by no means exhaustive, but they’re meant to trigger thinking and help you understand some of the limitations of these hallowed rankings.
Afterward, you’ll find a few additional goodies, including:
- JayMaven’s Ranking of the Rankings
- Some thoughts on how to construct an MBA ranking of your own
- JayMaven’s MBA Rankings, 2007
So, without further ado, let’s bring out our first victim, err, example—the US News and World Report Rankings:
Released: Annually
Month of Release: April
Grade: A-
April 1, 2007 Rankings:
1. Harvard
2. Stanford
3. Penn (Wharton)
4. MIT (Sloan)
5. (tie) Northwestern (Kellogg)
5. (tie) Chicago
7. Dartmouth (Tuck)
8. Berkeley (Haas)
9. Columbia
10. NYU (Stern)
11. Michigan (Ross)
12. (tie) Duke (Fuqua)
12. (tie) UVA (Darden)
14. (tie) Cornell (Johnson)
14. (tie) Yale
16. UCLA (Anderson)
17. Carnegie Mellon (Tepper)
18. (tie) UT Austin (McCombs)
18. (tie) UNC (Kenan-Flagler)
20. Emory (Goizueta)
21. USC (Marshall)
22. (tie) Ohio State (Fisher)
22. (tie) Purdue (Krannert)
24. Indiana (Kelley)
25. (tie) Georgetown (McDonough)
25. (tie) Georgia Institute of Technology
25. (tie) Maryland (Smith)
25. (tie) Minnesota (Carlson)
Methodology
- Quality Assessment—40%
  - Peer Assessment Score—62.5% (25% of total) (business school deans and directors of accredited master’s programs in business)
  - Recruiter Assessment Score—37.5% (15% of total)
- Placement Success—35%
  - Mean Starting Salary and Bonus—40% (14% of total)
  - Employment Rates
    - At graduation—20% (7% of total)
    - Three months after graduation—40% (14% of total)
- Student Selectivity—25%
  - Mean GMAT Scores—65% (16.25% of total)
  - Mean Undergraduate GPA—30% (7.5% of total)
  - Acceptance Rate—5% (1.25% of total)
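If the two-level weighting looks confusing, each criterion’s overall share is just its sub-weight times its category weight. Here’s a quick sanity check of the numbers above (a throwaway Python sketch; the dictionary labels are my shorthand, not official US News terminology):

```python
# Effective weight of each US News criterion = category weight x sub-weight.
# All percentages are taken straight from the methodology above.
categories = {
    "Quality Assessment": (0.40, {"Peer Assessment": 0.625,
                                  "Recruiter Assessment": 0.375}),
    "Placement Success": (0.35, {"Salary and Bonus": 0.40,
                                 "Employed at Graduation": 0.20,
                                 "Employed at 3 Months": 0.40}),
    "Student Selectivity": (0.25, {"Mean GMAT": 0.65,
                                   "Mean GPA": 0.30,
                                   "Acceptance Rate": 0.05}),
}

total = 0.0
for cat_weight, subs in categories.values():
    for name, sub_weight in subs.items():
        effective = cat_weight * sub_weight
        total += effective
        print(f"{name}: {effective:.2%} of the overall score")
print(f"Total: {total:.0%}")  # prints 100%, confirming the weights are complete
```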
Commentary
This is the ranking that the greatest number of people seem to take most seriously, and I personally find it to be closest to “reality” as I perceive it. No doubt these two observations are not unrelated.
Pros:
- Accounts for many of the factors reflective of “the word on the street” about which schools perform well and which don’t. Pro or con? I see it as a pro, though I could understand the other perspective…
- The “Peer Assessment” score is a novel approach, not utilized in any of the other rankings. Frankly, I like it. While it may be subject to a minor reporting bias, as deans try to knock their competitors down a rung, I suspect it evens out, assuming deans are not allowed to rank their own programs.
Cons:
- The criticism I hear most commonly of the US News ranking is that the company fails to stick to a consistent, objective standard, constantly tweaking its methodology to make its rankings a continued reflection of what it feels, subjectively, to be true. I’ve never seen any real evidence to this effect, and I certainly don’t think it can be proven. Intriguing, though, isn’t it? The irony, given the popularity of the ranking, is that the second the publication hits the shelves, its assertions are correct. Again, pro or con?
- Oh, and I do think it’s a bit funny that US News has managed to cram 28 schools into the top 25.
- To focus so heavily on employment rates (21%) seems a little silly to me. All graduates of top MBA programs are more than capable of landing quality jobs, and their reasons for not doing so immediately might have little or nothing to do with the quality of the school they attended.
- There are problems with focusing on recruiter assessment, which will be addressed fully in the Wall Street Journal section below.
Next up, the Business Week Rankings:
Released: Biennially
Month of Release: Sep/Oct
Grade: B-
Sep. 2006 Rankings:
1. Chicago
2. Penn (Wharton)
3. Northwestern (Kellogg)
4. Harvard
5. Michigan (Ross)
6. Stanford
7. MIT (Sloan)
8. Berkeley (Haas)
9. Duke (Fuqua)
10. Columbia
11. Dartmouth (Tuck)
12. UCLA (Anderson)
13. Cornell (Johnson)
14. NYU (Stern)
15. UVA (Darden)
16. Carnegie-Mellon (Tepper)
17. UNC (Kenan-Flagler)
18. Indiana (Kelley)
19. Yale
20. UT Austin (McCombs)
21. USC (Marshall)
22. Georgetown (McDonough)
23. Emory (Goizueta)
24. Purdue (Krannert)
25. Maryland (Smith)
Methodology
- Alumni Satisfaction Ranking—45%
  - Class of 2006—50% (22.5% of total)
  - Class of 2004—25% (11.25% of total)
  - Class of 2002—25% (11.25% of total)
- Recruiter Satisfaction Ranking—45%
- Intellectual Capital Rating—10% (based on a tally of faculty members’ academic journal entries in 20 publications)
Commentary
This ranking gets a lot of press, and in a few circles it’s regarded as rivaling the US News ranking in authoritativeness. The usual suspects are all present, though the order at the top looks a little suspect, and the methodology is not without major flaws.
Pros:
- A relatively simple ranking.
- Doesn’t depend entirely on one year’s data, though it’s weighted to the most recent year. I like the idea that a school must be consistently solid to perform well in the rankings, though I agree that “current” performance should be emphasized.
- The journal entries piece is objective and bias-proof.
Cons:
- First and foremost, the B-week rankings are mired in reporting bias. If rankings create reality, and alumni know that having their school ranked highly is good for their careers, why would they not give a biased account of their “satisfaction”? The higher-ranked their school is, the more satisfied they are likely to be. What’s unclear is whether the alumni of some schools do this more than others, or whether this bias averages out across all schools.
- As is the case with the Wall Street Journal Ranking (described below), there are problems with relying on recruiter satisfaction, as well. I’ll save the detail for WSJ, which relies entirely on recruiter perceptions, but suffice it to say here that the ranking fails to differentiate between recruiters—and not all recruiters (or the companies they work for) have the same pull with MBA students.
- As in the Financial Times ranking below, 10% is assigned to the quantity of faculty members’ academic journal entries. Some might see this as a pro, as I mentioned above, because of its objectivity. Others, however, might see it as a con, claiming they’d rather their profs focus on teaching than writing!
Now for the Financial Times Rankings:
Released: Annually
Month of Release: Jan/Feb
Grade: C+
Jan 29, 2007 Rankings:
1. Penn (Wharton)
2. Columbia
3. Stanford
3. Harvard
5. London Business School (U.K.)
6. Chicago
7. INSEAD (France/Singapore)
8. NYU (Stern)
9. Dartmouth (Tuck)
10. Yale
11. CEIBS (China)
11. Instituto de Empresa (Spain)
13. IMD (Switzerland)
14. MIT (Sloan)
15. Cambridge (Judge) (U.K.)
16. IESE (Spain)
17. UCLA (Anderson)
18. HEC Paris (France)
19. Oxford (Saïd) (U.K.)
19. Northwestern (Kellogg)
19. Michigan (Ross)
22. Manchester (U.K.)
23. Duke (Fuqua)
24. ESADE (Spain)
25. Berkeley (Haas)
Methodology
This ranking uses 20 criteria, including:
- 8 from a survey of alumni 3-5 years removed from graduation, worth a total of 50%. This section emphasizes most heavily the salary and salary-increase factors, worth 20% each. Each criterion is weighted across graduating classes as follows:
  - Class of 2003—50%
  - Class of 2002—25%
  - Class of 2001—25%
- 11 based on data from a survey that each business school is asked to complete, including employment statistics, data on the gender and nationality of the students, faculty, and board of the school, and information on doctoral qualifications. This section counts for a total of 40%.
- The final criterion, “Research Rank” (papers published in 40 academic and practitioner journals during the past three years), counts for 10%.
Commentary
20 criteria? Really? Do I even have that many chromosomes?
This is the first of the international rankings on the list, and as if it weren’t difficult enough to determine what makes one U.S. b-school “better” than another, the question becomes even more subjective and convoluted when the rankings go global.
Pros:
- There’s something appealing about looking at salary growth instead of just starting salary, because it rewards schools for how well they teach, rather than just how successful they are at attracting the top students. Again, the open question is which of these two makes a school better. In any case, this method is not without problems, some of which are addressed below.
- Like Business Week, this ranking incorporates several years’ worth of data.
Cons:
- First I must remind you of the importance of considering which school is best for you. In the context of a global comparison, it becomes particularly important to consider where you hope to work. While it might be possible to argue that an East Coast school like Wharton or Harvard is superior to a UCLA or USC even for a person hoping to work in L.A., I don’t think it’s possible to make this argument for most (if not all) non-U.S. schools. While IESE appears ahead of UCLA in this ranking, an applicant choosing between the two, and hoping to work in L.A., would be well advised to become a Bruin. Conversely, if the applicant wants to work in Spain following graduation, there’s no question IESE would be the superior option.
- While 8 of the top 10 in the ranking are U.S. schools, 9 of the next 15 are not. For 11 of the top 25 to be non-U.S. schools seems a little high to me, and it may be that there is something of an international bias inherent to the rankings. Given that European schools would naturally enroll a greater percentage of international students (from across the EU), to base comparisons on this statistic seems to favor non-U.S. schools. This isn’t to say that diversity isn’t a great thing—it is. Still, being at a diverse school in Europe probably won’t help you land that PE/HF job in NYC. If you want to work in London, that’s another thing altogether. What makes me question this ranking is the fact that the vast majority of total MBA applicants still hope to work in the U.S.—not the E.U.
- I can appreciate the value of tracking alumni salary growth, but looking back 3-5 years creates two problems:
- First, it seems possible, if not likely, that salary growth for many alums a few years out has more to do with other factors than with the quality of the MBA they received. True stars won’t stop learning once they re-enter the working world, and it may be that it’s what they learn on the job that gets them that next big break.
- More immediately pressing, though, is the fact that schools can change in 3-5 years. This seems more like a retroactive 2003 ranking than an accurate 2007 ranking.
- As with Business Week: Academic Journals, 10%. Pro or con?
On to the Forbes Rankings:
Released: Biennially
Month of Release: August
Grade: C
August 18, 2005 Rankings:
1. Dartmouth (Tuck)
2. Pennsylvania (Wharton)
3. Chicago
4. Columbia
5. Yale
6. Stanford
7. Harvard
8. Virginia (Darden)
9. Cornell (Johnson)
10. Northwestern (Kellogg)
11. UT Austin (McCombs)
12. Iowa (Tippie)
13. NYU (Stern)
14. UNC (Kenan-Flagler)
15. Berkeley (Haas)
16. Carnegie Mellon (Tepper)
17. Brigham Young (Marriott)
18. MIT (Sloan)
19. UCLA (Anderson)
20. Duke (Fuqua)
21. Emory (Goizueta)
22. Indiana (Kelley)
23. Penn State (Smeal)
24. Texas A&M (Mays)
25. Vanderbilt (Owen)
Methodology
Forbes keeps it simple: schools are ranked on return on investment, based on a survey of alumni five years out comparing their gains in compensation against the cost of attending (tuition plus forgone salary).
Commentary
Hmm, interesting. In a landscape where rankings are inherently subjective (because of the fact that rankings create reality), this is a noble, if flawed, attempt at objectivity.
Pros:
- The simplicity of the ranking is attractive.
- Again, there’s something compelling about the “salary growth” factor.
- I like that this methodology takes into account the opportunity cost of going to b-school, which I referenced in my last blog entry, MBA Admissions 101: The Value of an MBA.
- I could see this ranking being extremely valuable for a certain cross-section of the population. If you don’t like your chances of being accepted to one of the 20 or so most competitive schools, but you want to go to a school that will help you increase your salary, this list might provide you with some valuable ideas. Iowa, BYU, Penn State, Texas A&M, and Vanderbilt, for example, could be great options. But no way are any of those schools better than Michigan Ross. Average GMAT at Vandy? 622. At Ross? 695. Enough said.
Cons:
- Unfortunately, there’s nothing here to standardize for student quality. If School XYZ accepted 500 students with current annual salaries of $1, and those students landed jobs averaging $35k upon graduation, the school would appear to do better than one that accepts students with current annual salaries of $95k who then move into positions with top companies pulling $120k. While Tippie grads may well have experienced greater salary gains than MIT Sloan grads, I’m quite certain that Sloan accepted a more competitive crop of students. If the same students that attended Tippie had attended Sloan, would their salaries have increased even more? I suspect so. If Sloan’s students had attended Tippie, would their salaries have increased as much as they did? I doubt it.
- As with the Financial Times ranking, this seems more like a retroactive look at the schools’ quality in 2000 than a collection of data on which to base current decisions. If Forbes holds true to their timing, they should have a new ranking out very soon. Will be interesting to see.
Next, the Economist Rankings:
Released: Annually
Month of Release: October
Grade: C-
October 13, 2006 Rankings:
1. IESE
2. Dartmouth (Tuck)
3. Stanford
4. Chicago
5. IMD
6. Northwestern (Kellogg)
7. HBS
8. NYU (Stern)
9. Michigan (Ross)
10. Berkeley (Haas)
11. Cambridge
12. Columbia
13. UVA (Darden)
14. Henley Management College
15. UCLA (Anderson)
16. IE
17. Penn (Wharton)
18. MIT (Sloan)
19. Cranfield
20. LBS
21. Ashridge
22. Insead
23. Cornell (Johnson)
24. Yale
25. Emory (Goizueta)
Methodology
- Open new career opportunities—35%
  - Diversity of recruiters (number of industry sectors)—25% (8.75% of total)
  - Assessment of career services (percentage of graduates in jobs three months after graduation)—25% (8.75% of total)
  - Jobs found through the careers service (percentage of graduates finding jobs through the careers service)—25% (8.75% of total)
  - Student assessment (meeting expectations and needs)—25% (8.75% of total)
- Personal development/educational experience—35%
  - Faculty quality
    - Ratio of faculty to students—8.3% (2.9% of total)
    - Percentage of faculty with PhD (full-time only)—8.3% (2.9% of total)
    - Faculty rating by students—8.3% (2.9% of total)
  - Student quality and diversity
    - Percentage of foreign students—8.3% (2.9% of total)
    - Percentage of women students—8.3% (2.9% of total)
    - Student rating of culture and classmates—8.3% (2.9% of total)
  - Education experience
    - Student rating of program content and range of electives—6.25% (2.2% of total)
    - Range of overseas exchange programs—6.25% (2.2% of total)
    - Number of languages on offer—6.25% (2.2% of total)
    - Student assessment of facilities and other services—6.25% (2.2% of total)
- Increase salary—20%
- Potential to network—10%
  - Breadth of alumni network (ratio of registered alumni to current students)—33.3% (3.3% of total)
  - Internationalism of alumni (ratio of students to overseas alumni branches)—33.3% (3.3% of total)
  - Alumni effectiveness (student assessment of alumni network)—33.3% (3.3% of total)
Commentary
Well, there’s a lot of stuff here. Some of it seems fine, and some of it seems a bit absurd. It almost seems like the Economist decided that the more data they had, the safer they were. I don’t know if this is true or not.
Pros:
- Salary growth is included here, too.
- Maybe the wide range of factors minimizes bias, as the different biases inherent to each of the factors cancel each other out. Is that a stretch?
Cons:
- “Number of industry sectors”—Personally, there aren’t that many sectors that interest me, and I’d be surprised if that weren’t the case for the majority of MBA students. I’d rather have more good opportunities within a few interesting sectors than a bunch of opportunities spread across several lame, err…less interesting sectors.
- “Average length of work experience”—This seems almost counter-intuitive, as many of the most talented, determined students are those that seek MBAs at an earlier stage in their careers.
- “Range of overseas exchange programs,” “Number of languages on offer,” and “Internationalism of alumni”—These are all wonderful things, and they potentially add value to an MBA experience, but they don’t necessarily add value for a particular individual. If you live in the U.S., and want to work in finance in NYC, these factors are far less meaningful. The fact that they account for 8% of the ranking seems to give an advantage to European schools like IESE.
- I’m sorry, but I’ll take Wharton over Henley Management College any day. Duke too. Duke is omitted completely, and Wharton at 17 is just plain too low.
Now, the Wall Street Journal Rankings:
Released: Annually
Month of Release: September
Grade: D-
September 19, 2006 Rankings:
1. Michigan (Ross)
2. Dartmouth (Tuck)
3. Carnegie-Mellon (Tepper)
4. Columbia
5. Berkeley (Haas)
6. Northwestern (Kellogg)
7. Penn (Wharton)
8. UNC (Kenan-Flagler)
9. Yale
10. MIT (Sloan)
11. Chicago
12. Duke (Fuqua)
13. UVA (Darden)
14. Harvard
15. USC (Marshall)
16. Cornell (Johnson)
17. NYU (Stern)
18. Stanford
19. UCLA (Anderson)
Methodology
The WSJ ranks schools in three categories: “National,” “Regional,” and “International.”
Listed above is the 19-school “National” ranking. WSJ describes them as follows: “These schools enjoy a national reputation and tend to draw recruiters from many of the same companies, usually large national and multinational firms that pay high starting salaries.”
The rankings are based entirely on feedback provided by recruiters from companies that tend to recruit MBA students.
Each school ranking is based on three components, each accounting for one-third of the overall current-year results:
- Perception—33%: judges how well the students of each school “meet the needs” of recruiters, using 21 different attributes
- Supportive Behavior—33%: likelihood of returning to the school in the next two years and likelihood of making an offer to a student at the school in the next two years
- Mass Appeal—33%: for National and Regional schools, a school’s mass-appeal score is the total number of participating recruiters who indicated they recruit from that school
Commentary
In my opinion, this is perhaps the silliest of the major rankings. I’ve never heard of anyone taking this poll seriously, except of course (a) the people who benefit from its unusual assertions and (b) people that don’t know anything about b-schools and assume that because it’s the WSJ, it must be spot-on.
Pros:
Uhh…Hmm…Well, I guess the concept is somewhat intriguing—that the people best qualified to rank MBA programs are the ones MBA students most need to impress upon completing their degrees—the representatives of the companies that hire them. Unfortunately, this methodology—of relying entirely on recruiters—begins to break down when you consider the following:
Cons:
- Nothing here ranks the employers, the quality of their decision-making in choosing to recruit at particular schools, or the quality of the positions for which they’re recruiting. If Joe Bob Jones from Joe Bob’s Plumbing comes to Stanford to try to recruit an MBA for a $25k a year job, he’s going to leave disappointed, and will rate Stanford accordingly. Obviously, this is an extreme example, but I think the point is sound. I’ve often heard it said that Harvard and Stanford perform poorly in this ranking precisely because firms sometimes find it so difficult to recruit there. Graduates of these two programs have almost unlimited opportunities, and competition for their services is stiff. Some recruiters may even find them to be “cocky.” But does this mean these are lesser schools? One could easily argue exactly the opposite…
- I find it interesting that the WSJ groups schools into “national” and “regional” categories through a completely subjective process of evaluating what “tends” to be true, making an implied assertion that these schools are better than others because of these tendencies, but then ignores such considerations in doing the actual ranking. WSJ’s “national 19” are the same schools as U.S. News’ top 19, with only one exception (WSJ includes USC instead of Texas). But if these schools are the best because they “enjoy a national reputation and tend to draw recruiters from many of the same companies, usually large national and multinational firms that pay high starting salaries,” then why aren’t these factors considered in differentiating the 19 from one another? Ask yourself which school is most successful at attracting the most (and most prominent) multinational firms, and which school places students in the highest-paying jobs. Here’s a hint: it’s not Ross. My intention isn’t to dog Ross—it’s a great school, but…you get my point. Essentially, what WSJ has done is:
- Suggested they know intuitively what makes one school better than another, and…
- Ignored that knowledge entirely to produce something unique and controversial, in an attempt (I suppose) to generate buzz and attract people to their site
And finally, the Fortune Rankings:
Released: Once
Month of Release: February
Grade: F-
February 22, 2007 Rankings:
1. Penn (Wharton)
2. Harvard
3. MIT (Sloan)
4. Stanford
5. Northwestern (Kellogg)
6. Columbia
7. Chicago
8. Duke (Fuqua)
9. Dartmouth (Tuck)
10. NYU (Stern)
11. Michigan (Ross)
12. Berkeley (Haas)
12. Cornell (Johnson)
12. UVA (Darden)
12. Yale
16. Georgetown (McDonough)
17. UCLA (Anderson)
17. Thunderbird School of Global Management
19. UT Austin (McCombs)
20. CMU (Tepper)
Methodology
(The list, along with any methodology detail, has been pulled from Fortune’s site; the best remaining description comes from Clear Admit, quoted in the Cons below.)
Commentary
My job was easy on this one. Fortune released the following statement on February 27, 2007:
“Last week, CNNMoney.com published "Top 50 Business Schools for Getting Hired."
The data for the list was provided by an outside vendor, Quacquarelli Symonds Ltd. Upon our publication of the feature, we were alerted to potential flaws in the provided data and the data survey methodology.
These flaws in methodology may have resulted in University of North Carolina's Kenan-Flagler Business School and Boston University being omitted from the list.
CNNMoney.com regrets the error, and apologizes to its readers and the business schools involved. The list has been removed from the site.”

So far as I can tell, this was Fortune’s first attempt at an MBA ranking, and they botched it.
Pros:
Cons:
- According to Clear Admit, “Fortune’s methodology resembles a blend of the Wall Street Journal’s approach of surveying corporate recruiters and Forbes’s focus on post-graduation salary as a measure of return on investment.” First lesson of methodology development: combining two flawed methodologies doesn’t make for one good one.
- I don’t think I would put so much emphasis on number of job offers per student. I’d rather have one or two offers from companies that really excited me (and would focus my efforts to that end) than a bunch of offers that didn’t.
- As with the U.S. News ranking, I don’t think it makes sense to put so much emphasis on the timing of job placements.
- All in all, it just strikes me as kind of a lazy ranking—as if the boss said “Get me a ranking,” and the worker bees, who didn’t know anything about MBA rankings, took a cursory look at the internet and slapped something together.
JayMaven’s Ranking of the Rankings
Again, here are my rankings of the rankings:
1. US News (A-)
2. Business Week (B-)
3. Financial Times (C+)
4. Forbes (C)
5. Economist (C-)
6. WSJ (D-)
7. Fortune (F-)
I’d put WSJ last, but Fortune loses points for screwing up an already lazy ranking. Speaking of lazy rankings, this one is lazy. I’m not going to provide a methodology, and that’s about all I’ve got to say about that!
But if I’m going to trash everybody else’s rankings, I guess it’s only fair that I should be required to submit my own for public dissection. I’ve done two things here.
First, I’ve compiled my “felt sense” rankings, based entirely on intuition and countless hours of poring over MBA books, forums, blogs, rankings, employment reports, and other sources. In this ranking, I’ve ranked by group, based on the belief that some schools are essentially equal in their ability to attract top students and place them in top jobs.
Second, I’ve developed a methodology of my own, and used it to produce a second set of rankings.
I must admit to subscribing, at least to an extent, to the belief that the best school is the one the powers-that-be think is best. I know it’s problematic, self-fulfilling, and all that other stuff, but I can’t help it.
Here, then, are my “felt sense” rankings:
1-2. Harvard and Stanford
3. Penn (Wharton)
4-6. Chicago, Northwestern (Kellogg), MIT (Sloan)
(Any of these 6 can be considered “Top 5”)
7-9. Columbia, Berkeley (Haas) and Dartmouth (Tuck)
(Columbia and the previous 6 form the so-called “Magnificent 7,” or “M7.” It is often theorized that the Deans of these schools conspire to keep their schools at the top, and this group has become a kind of de facto “Ivy League” of B-schools. However, Columbia is commonly seen as the 7th of the M7, and has fallen below Tuck and Haas in this year’s US News Rankings.)
10-16. Michigan (Ross), NYU (Stern), Duke (Fuqua), Virginia (Darden), UCLA (Anderson), Yale SOM, Cornell (Johnson)
(Any of these 16 can be considered “Top 15,” and I would say any of the top 9 plus Ross, Stern, and Fuqua could make legitimate claims at “Top 10.” Anderson is the oddball here, since it came in at #10 in the 2007 US News Rankings but dropped all the way to #16 for 2008.)
17-22. UNC (Kenan-Flagler), Carnegie-Mellon (Tepper), Texas (McCombs), Emory (Goizueta), USC (Marshall), Indiana (Kelley)
(Any of these 22 can make claims at “Top 20.” USC has traditionally been lower, but seems to have risen in the rankings.)
Constructing your own rankings:
If there’s one thing to take away from all this, it’s that any ranking is necessarily arbitrary in its methodology. Even if the methodology is rigorous (unlike my intuited rankings above), it must still be a product of a subjective process of evaluating the importance of various criteria. What this means, of course, is that you can create your own rankings, and that they’ll be as “correct” as any of the big magazines’ rankings. Simply determine which criteria are important to you, assign percentage values to each, and gather the appropriate data from other rankings and/or school websites.
One challenge to be aware of is that of processing your data. For example, if you use both average GMAT and yield, you may not want to treat both in the same way. If you were to simply divide each school’s numbers in these areas by the highest numbers in each area, the resultant spread in yield would be MUCH wider than the spread in GMAT, so that the impact of yield would outstrip the impact of GMAT, even if GMAT were assigned a higher percentage value in the rankings.
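To make that concrete, here’s a toy illustration in Python. The school names and numbers are invented purely for demonstration: GMAT averages might span roughly 620-720, while yields might span 30%-90%, so dividing by the column maximum produces very different spreads. Min-max scaling (mapping each column’s low to 0 and high to 1) is one simple fix:

```python
# Toy demonstration of the scaling problem. All figures below are made up.
gmat = {"School A": 720, "School B": 670, "School C": 620}
yld = {"School A": 0.90, "School B": 0.60, "School C": 0.30}

def divide_by_max(values):
    """Naive scaling: divide each value by the column maximum."""
    top = max(values.values())
    return {k: round(v / top, 3) for k, v in values.items()}

def min_max(values):
    """Min-max scaling: lowest value maps to 0.0, highest to 1.0."""
    lo, hi = min(values.values()), max(values.values())
    return {k: round((v - lo) / (hi - lo), 3) for k, v in values.items()}

print(divide_by_max(gmat))  # {'School A': 1.0, 'School B': 0.931, 'School C': 0.861}
print(divide_by_max(yld))   # {'School A': 1.0, 'School B': 0.667, 'School C': 0.333}
print(min_max(gmat))        # both criteria now span the same 0.0-1.0 range
print(min_max(yld))
```

With divide-by-max, yield’s spread (0.333 to 1.0) dwarfs GMAT’s (0.861 to 1.0), so yield would dominate the composite even at a lower weight. After min-max scaling, the weights you assign actually control the influence of each criterion.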
Here’s the methodology I’ll use for my second set of rankings:
Average Starting Salary—25%
US News Peer Ranking (B-school Deans & Program Directors)—20%
Average GMAT Score—20%
Acceptance Rate—15%
Undergrad GPA—10%
Yield—10%
You can find an Excel spreadsheet containing the results, and the process I used to get there, HERE.
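For those who’d rather see the logic than open Excel, here’s roughly how such a weighted composite can be computed in Python. To be clear, this is my sketch of the general technique, not a claim about exactly how the spreadsheet handles scaling, and the input figures are placeholders rather than real school data. Note that acceptance rate is inverted, since a lower rate should score higher:

```python
# Sketch of a weighted composite ranking. Placeholder data, not real figures.
WEIGHTS = {
    "salary": 0.25, "peer": 0.20, "gmat": 0.20,
    "accept_rate": 0.15, "gpa": 0.10, "yield": 0.10,
}

# criterion -> {school: raw value}
RAW = {
    "salary":      {"School A": 135_000, "School B": 120_000, "School C": 110_000},
    "peer":        {"School A": 4.8, "School B": 4.4, "School C": 4.1},
    "gmat":        {"School A": 715, "School B": 700, "School C": 680},
    "accept_rate": {"School A": 0.12, "School B": 0.20, "School C": 0.28},
    "gpa":         {"School A": 3.6, "School B": 3.5, "School C": 3.4},
    "yield":       {"School A": 0.88, "School B": 0.65, "School C": 0.55},
}

def min_max(values, lower_is_better=False):
    """Scale a column to 0.0-1.0; optionally flip so lower raw values win."""
    lo, hi = min(values.values()), max(values.values())
    scaled = {k: (v - lo) / (hi - lo) for k, v in values.items()}
    return {k: 1.0 - s for k, s in scaled.items()} if lower_is_better else scaled

scores = {school: 0.0 for school in RAW["salary"]}
for criterion, weight in WEIGHTS.items():
    column = min_max(RAW[criterion], lower_is_better=(criterion == "accept_rate"))
    for school, value in column.items():
        scores[school] += weight * value

for school, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{school}: {score:.4f}")
```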
Here’s the bottom line:
1. Stanford—.9933
2. Harvard—.9867
3. Penn (Wharton)—.9623
4. MIT (Sloan)—.9468
5. Chicago—.9418
6. NW (Kellogg)—.9347
7. Berkeley (Haas)—.9275
8. Columbia—.9235
9. Dartmouth (Tuck)—.9231
10. Michigan (Ross)—.9051
11. Duke (Fuqua)—.8962
12. NYU (Stern)—.8792
13. Yale—.8711
14. UCLA (Anderson)—.8685
15. Cornell (Johnson)—.8650
16. UVA (Darden)—.8642
17. CMU (Tepper)—.8562
18. Texas (Austin)—.8325
19. UNC (Kenan-Flagler)—.8286
20. USC (Marshall)—.8255
21. Indiana (Kelley)—.8198
22. Emory (Goizueta)—.8060
23. Ohio State (Fisher)—.8026
24. Purdue (Krannert)—.7992
25. Wash-St.Louis—.7967
I was shocked when I saw this list. And no, I didn’t tweak my methodology until I had something that matched the “felt sense” ranking. In fact, I kind of wish there had been more discrepancies, because I’m afraid people will think that’s exactly what I did! Truth is, I simply listed the criteria I thought were most relevant, ranked them by importance, and cranked the numbers. Compare the two lists, and you may be as astounded as I was.
So what does this mean? Well, as proud of myself as I was when I first saw the results, the answer is probably “not much.” Basically, I guess it means that I truly believe these categories are the ones that determine the quality of business schools. The open-ended question is how I came to believe this. Did I develop the belief that these are the qualities that set b-schools apart, and then develop my “felt sense” of the schools accordingly? Or did I develop the “felt sense” first, and then come to appreciate these qualities because they supported it? Or perhaps the two developed in unison—or entirely independently (yeah, right!).
In any case, the bottom line is that I’ve reached an understanding of what schools are “best” that works for me. Hopefully, this article has provided you with the tools you’ll need to do the same!
JayMaven
DealMaven Inc.
http://www.dealmaven.com/