Unveiling the secrets behind B-school rankings reveals why the annual lists are, at best, a flawed exercise.
Google “best business schools” and you’ll see links to rankings from various publications, including U.S. News & World Report, Bloomberg Businessweek, and the Financial Times. These publications and others position themselves as arbiters of educational prowess. And like it or not, their results matter. The annual rankings are used by prospective students to gauge prestige, felt as a point of pride by alumni, and scrutinized by administrators concerned with enrollment rates.
Despite their air of authority, however, business school rankings are facing a crisis of confidence. Within the last few years, both The Economist and Forbes have shuttered their rankings amid criticism of their methodologies. Other lists have been similarly called into question as schools jump or fall multiple spots from year to year without any apparent rationale, or else appear arbitrarily sorted based on minute differences among them.
Last year, a trio of top business school deans, including Haas’ own Ann Harrison, spoke out publicly in a Financial Times op-ed arguing that rankings for undergrad business school programs—despite Haas ranking #2 in U.S. News—fail to account for improved social mobility: “The rankings concentrate too much on the prior accomplishments of students and too little on how much schools help to enhance their skills and improve their opportunities by the time they graduate,” they wrote.
To be fair, rankings aren’t all bad, says Ute Frey, Haas’ chief marketing officer, who has overseen the school’s participation in the rankings for decades. “They hold schools accountable when it comes to the student experience and career outcomes,” she says. “On the flip side, though, they’ve standardized the MBA over the past 40 years and stymied innovation.” Schools find themselves having to decide between teaching to the test, so to speak, or risking a drop in the rankings.
At the same time, schools can’t afford not to be listed. As prospective students compile their lists of places to apply, many use the rankings to cull their options. In some ways, schools have little choice about participation. When Harvard and Wharton backed out of certain rankings in 2004, the publications ranked them anyway, prompting the schools to quietly rejoin so they could at least control the information provided.
“The rankings may not always get it right, but we need to remember that the visibility they bring is valuable,” says Courtney Chandler, MBA 96, senior vice dean and chief operating and strategy officer of Berkeley Haas. “In our case, the rankings and the accompanying stories have allowed Haas to advance its mission to develop principled, inclusive, and innovative leaders—Berkeley leaders.”
Caveats and pitfalls
The business degree ranked most frequently is the full-time MBA, as it’s more easily comparable across schools than part-time, specialty, or undergraduate programs. However, each publication has its own criteria for what makes a good MBA program, with their own potential caveats and pitfalls.
Take the U.S. News full-time MBA ranking: 25% is based on student selectivity, including incoming students’ GPAs and test scores; 25% on rankings by recruiters and peer institutions; and 50% on placement after graduation, including employment rates and starting salaries. Bloomberg Businessweek, on the other hand, bases most of its FTMBA rankings on surveys of alumni, students, and recruiters to measure compensation, learning, networking, entrepreneurship, and diversity. The London-based Financial Times relies on FTMBA alumni surveys of salary and satisfaction three years post-graduation, along with international criteria, such as diversity of country of origin of faculty and students, and newly added social factors, such as ESG (environmental, social, and governance) courses and carbon footprint.
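The U.S. News weighting scheme above can be sketched as a simple composite score. Everything here is illustrative: the school names and criterion scores are invented, and real rankings normalize and collect their data far more elaborately. The point is how tiny differences in a weighted sum translate directly into rank order.

```python
# Weights mirror the U.S. News full-time MBA breakdown described above:
# 25% selectivity, 25% reputation surveys, 50% placement outcomes.
WEIGHTS = {"selectivity": 0.25, "reputation": 0.25, "placement": 0.50}

# Hypothetical normalized scores (0-100) per criterion; all figures invented.
schools = {
    "School A": {"selectivity": 91, "reputation": 88, "placement": 90},
    "School B": {"selectivity": 89, "reputation": 89, "placement": 90},
    "School C": {"selectivity": 95, "reputation": 85, "placement": 88},
}

def composite(scores):
    """Weighted sum of a school's criterion scores."""
    return sum(WEIGHTS[k] * v for k, v in scores.items())

# Sort schools by composite score, highest first.
ranked = sorted(schools.items(), key=lambda kv: composite(kv[1]), reverse=True)
for rank, (name, scores) in enumerate(ranked, start=1):
    print(f"{rank}. {name}: {composite(scores):.2f}")
```

Note how fractions of a point separate the three invented schools (89.75, 89.50, 89.00): a one-point shift in any single criterion reshuffles the order, which is one reason year-to-year jumps can look arbitrary.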
The compensation measures themselves can be fraught. While graduate compensation metrics are standardized by the MBA Career Services and Employer Alliance, they only consider salary and signing bonuses but not stock options or other forms of equity for those in the tech industry or entrepreneurs who join startups after graduation. “Over 40% of our graduates get some form of equity that rankings do not consider in their post-MBA income,” says Abby Scott, assistant dean of career management and corporate relations. “We know that stock grants have real short-term value and options can become quite valuable over time, significantly bolstering our graduates’ lifelong wealth accumulation.”
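To make the gap concrete, here is a toy calculation with invented numbers showing how much of a hypothetical graduate’s first-year package disappears when equity is excluded, as the standardized reporting described above does.

```python
# Toy first-year compensation for a hypothetical MBA graduate joining a
# tech firm; all figures are invented for illustration.
base_salary = 150_000
signing_bonus = 30_000
equity_grant = 60_000  # grant-date value of stock; excluded from rankings data

reported_comp = base_salary + signing_bonus  # what rankings methodologies count
actual_comp = reported_comp + equity_grant   # what the graduate actually receives

# Share of total first-year pay that the rankings never see.
unreported_share = equity_grant / actual_comp
print(f"Reported: ${reported_comp:,}  Actual: ${actual_comp:,}  "
      f"Unreported: {unreported_share:.0%}")
```

With these invented figures, a quarter of the graduate’s actual first-year package is invisible to the rankings, which helps explain why schools sending many graduates to tech and startups look weaker on salary-based measures than they are.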
While surveys, which allow for more qualitative information, might seem like a better way to go than statistics, the rub lies in who answers them. Often, fewer than 50 alumni respond to surveys that can represent as much as 40% of the ranking.
“We hear it all the time: our alumni may think we’re a Top 5 school and are disappointed when the rankings don’t reflect this,” says Chandler. “But their experience doesn’t translate into rankings unless they complete the surveys.”
Some schools experience wild fluctuations from year to year, and participation rates account for some of those. So do changes in rankings methodologies and other factors. Stanford ranked #1 in Financial Times’ global MBA rankings in 2019; in 2024 it was #23. For the past three years, Columbia has consistently been in the Top 3 for the FT. Meanwhile, in U.S. News’ 2024 MBA rankings, Columbia fell out of the Top 10 to #12 while tied for #1 was—you guessed it, Stanford.
Another issue is that much of the data that publications use is self-reported by schools and alumni. Auditing is sporadic or nonexistent, leaving room for discrepancies in how items are measured, whether intentional or not. For example, the Financial Times’ MBA ranking is based in part on two measures of full-time faculty: research publications and the share holding PhDs. Schools had long interpreted “full-time” to mean tenured and tenure-track faculty, a reasonable reading given that 10% of the ranking is based on academic research publications by those faculty. But when the FT specified that full-time means everyone teaching on a full-time basis, schools were slow to catch on. When Haas began reporting its full-time professional faculty in its rankings data—while some schools continued to report 100% of faculty holding PhDs—it dropped in the rankings.
Of course, prospective students and other interested parties just see that a school dropped out of the vaunted Top 10, says Erika Walker, senior vice dean for instruction. “Readers often accept rankings at face value without delving into the nuances, even if they review the methodologies behind them,” she explains. Unique methodologies mean different surveys will have varied results—but readers opt for simplicity when viewing them. “They often take whichever publication they hold in higher regard as truth,” Walker says. “This does not fully serve prospective students who have a keen interest in particular programs versus the overall reputation of the school.”
Surveying peer institutions as to the quality of a program—as U.S. News does for its business undergraduate and EMBA rankings (its sole criterion)—has its own pitfalls. “It comes down to how much a school is in the news or how well regarded it’s been historically. A school with a new, unknown dean might not rank as highly in the minds of peer leaders,” says Mariana Corzo, director of brand marketing and strategic initiatives who currently oversees Haas’ rankings participation. “What are your peers going to say about you—that you’re better than them?” she asks rhetorically.
Surveying recruiters, as U.S. News and Businessweek do for their MBA rankings, can also be fraught. Rankings define recruiters broadly and may include HR contacts, hiring managers, or department heads from organizations of all sizes and industries, according to Scott of Haas’ Career Management Group. Those surveyed may not have data or experience with all schools they are asked to rank. Some may also have a natural bias toward their own alma maters.
Measuring what matters
School administrators aren’t the only ones taking business school rankings to task. Among the critics is, surprisingly, John Byrne, a former Businessweek editor who created the publication’s b-school rankings (one of the nation’s first) in 1988 to measure schools based on input from their main customers: graduates and employers. Those rankings, he says, were created with the best of intentions to counter persistent questions among employers about the value of an MBA and to find a way to keep business schools accountable. Byrne, now editor of Poets&Quants, an online publication for business school news that creates its own composite ranking, still believes the rankings have value for that reason today.
“Regardless of where your school ranks—whether it’s 3 or 10 or 20—these major publications are pushing it in front of people’s faces that business schools are important enough to be ranked,” he says. “Which means if you’re really serious about a career in business, you should consider graduate management education.” However, he agrees that much about the rankings is arbitrary.
The only way to legitimately read business school rankings, he says, is to “not put a whole lot of weight on a single ranking in a single year” but rather to consider a school’s overall ranking on multiple lists over time. Byrne notes, as others have, that among top schools, the differences are pretty slim, making the relative order of the Top 25 or so business schools essentially little better than a coin flip. The editor of the Financial Times, in fact, acknowledged last year that its full-time MBA rankings essentially broke down into four tiers, with 18 schools, including Haas, in the first tier.
Some tinkering with methodologies can make rankings marginally better. Last year, U.S. News added a factor accounting for differences in pay scales across industries as a fairer measure than graduates’ overall average salary, rewarding schools that prepare their students to land top compensation in their desired fields. (The change, incidentally, corresponded with a jump by Haas from #11 to #7—putting it back into the Top 10.)
For now, it’s essential to read rankings with a wider lens, understanding what they tell us and what they don’t. Methodologies evolve, but not always in the way business programs are evolving. And Haas’ standing as a top business school is solid—despite minor fluctuations from one year to the next. Because what distinguishes Haas from other schools, says Frey, is not a ranking. “It’s our community and our culture. It’s the spirit of Berkeley and the Bay Area. It’s business with purpose.”