A Small Change Leads to Big Questions
Because I'm a student at the University of St. Thomas, I regularly receive news items that UST would like me to know about. Something that caught my eye this week was an item about UST's ranking in the annual BloombergBusiness rankings of MBA programs. It wasn't the news of yet another round of rankings by yet another news agency that caught my attention. It was the methodology used in the rankings, or more specifically, a change in that methodology from 2014 to 2015, that intrigued me.
In past years, BloombergBusiness has said the purpose of its rankings research is to determine which business schools offer the strongest education and best prepare MBAs for their careers. Prior to 2015, the rankings were based on three weighted measures: a survey of student satisfaction (45% of the ranking), a survey of employers who hire those graduates (45%), and the expertise of each school's faculty, measured by faculty research in esteemed journals (10%).
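To make the arithmetic behind that old weighting scheme concrete, here is a minimal sketch of how a weighted composite score could be computed from the three 2014 measures. The component scores, the 0-100 scale, and the example school are purely illustrative assumptions; Bloomberg has not published its exact normalization here, so this only shows the shape of the calculation.

```python
# Hedged sketch of a 2014-style weighted composite (45% student survey,
# 45% employer survey, 10% faculty research). Scores and scale are hypothetical.

WEIGHTS_2014 = {
    "student_survey": 0.45,
    "employer_survey": 0.45,
    "faculty_research": 0.10,
}

def composite_score(component_scores: dict[str, float],
                    weights: dict[str, float]) -> float:
    """Weighted sum of component scores (assumed to be on a 0-100 scale)."""
    return sum(weights[name] * component_scores[name] for name in weights)

# Hypothetical example: a school that surveys well but publishes little.
example_school = {
    "student_survey": 85.0,
    "employer_survey": 78.0,
    "faculty_research": 40.0,
}

print(composite_score(example_school, WEIGHTS_2014))
# 0.45*85 + 0.45*78 + 0.10*40 = 77.35
```

Notice that with only a 10% weight, even a large swing in the research component barely moves the composite, which makes the decision to drop it entirely in 2015 all the more interesting.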
This year (2015), BloombergBusiness says that its annual ranking of full-time MBA programs now focuses on "what most people hope to get after business school: a satisfying well-paying job".
With this change in focus, the methodology was revised to expand the base of survey subjects (alumni are now included in addition to employers and students), and additional measures were added: job placement rates now account for 10% of the ranking and starting salaries for another 10%. Notably and unceremoniously, the measure of faculty research was dropped from the methodology:
"Older elements of our ranking, including a tally of faculty research, have been scrapped because they don't get at our fundamental question: how well does this business school channel its graduates into good jobs?"
Methodologically speaking, based upon the research question being posed, this could be a justifiable decision. But I wonder whether the new methodology produces rankings that are any more meaningful than those produced by the previous methodology. And who is being served (or dis-served) by this change? For instance, in 2014 UC Berkeley (Haas) and NYU (Stern) were ranked #19 and #22 respectively. In 2015, Haas is #9 (up ten spots) and Stern is #24 (down two spots). Really? What is this saying (other than that rankings are bullshit publicity tools)? What is a prospective MBA student to surmise from this?
But mostly, I can't help but think about the unintended consequences of dropping faculty research from the rankings. This affects both schools of business and the students who attend them. I'll be elaborating on what I think are potential consequences in future blog entries. But what do you think? Was this a good decision? Is faculty research no longer important to an MBA degree? Do you think dropping faculty research from the methodology of MBA program rankings affects the debate surrounding the relevance of research in business schools?
If you want to read more, here is a link to the article about the 2015 rankings: http://www.bloomberg.com/features/2015-best-business-schools/
Hi Lisa,
This is an interesting post. First, I am not sure where I fall on this, but I can share some of my initial thoughts and some of my recent experiences with graduate students at the U of MN. My initial thought is that maybe the change in focus is grounded more in the reading we are doing about Useful Research. The earlier criteria, "strongest education" and "best prepare MBAs for their careers" (prior to 2015), seem to be somewhat subjective. Beauty is in the eye of the beholder. I went to (supposedly) one of the top programs in the country for museum studies MA degrees, and I don't feel like it truly prepared me for my career. It seems that a switch to actual outcomes, did you get a job or not, is a little clearer indicator of success. My experience with the Office of Equity and Diversity staff who work closely with graduate students at the U of MN is that there is somewhat of a trend for graduate students to get a job rather than enter academia, which are two very different worlds requiring different preparation. The students I work with are more concerned with getting a job than anything else. It seems that the change in the methodology may reflect the questions that need to be answered. Marketing your program as the one most likely to get you a job may be more attractive than the other metrics used in the past. The programs may have shifted ranking order due to the emphasis on preparation for jobs vs. other types of focus. Just speculation, of course....
Hi Chris, thanks for pointing out that a rankings methodology like this can only produce relative positions among things that are seemingly the same. It doesn't measure whether, or how well, these things are actually accomplishing what they say they are doing.
Lisa
Lisa,
First, I am not surprised by this shift; it seems that the "Business" of the "Business College" is becoming the optimal measure of "customer value." So instead of looking at what is good for the long term of the school, the student, and the field, the short-term goal of getting a job has taken precedence.
Just yesterday I heard an undergraduate in our B school being counseled,
Professor:
"Have you done any major research papers, or worked with a faculty on research?"
Student:
"No"
Professor:
"That's just as well, no one gets hired for doing research."
REALLY????
So your questions are prescient ones, and they relate back to more than just higher ed. It's the "Teach to the Test" mentality, driving critical thinking and mastery out of mainstream focus.
And not just for MBAs. One possible reason for this could be the extreme competition for well-paid jobs since the Great Recession: more overeducated and underpaid workers than the US can figure out what to do with.
Troubling times indeed.
So I guess I am not really surprised by the change in metrics; it hails from an external, customer-seeking point of view. The rub is that what becomes clearly embedded in the MBA culture is the rising importance of "maximum utility of shareholder profits," while critical thinking, long-term learning, and curiosity slip down in rank.
Now these are my thoughts, but what do you think are the probable ramifications of such a shift? What, if anything, is there to do about it?
Thanks for the thoughtful reply. Your 'teach to the test' analogy is probably appropriate but paradoxical in the sense that critical thinking skills are something employers want from their MBAs.
Lisa
Hi Lisa,
Your post about ranking and methodology rings a bell. The only caveat to your assertion is the fact that people may say something more positive or favorable about a school they already prefer. Sometimes a ranking is not quantitative or representative, depending on who is doing the ranking. However, you have a point about the methodology of the BloombergBusiness report; that, to me, is critical.
Great post Lisa
Thank you
Jesse
Hi Jesse, I'm glad you pointed out the problem of relying heavily on ratings from students and alumni for MBA school rankings. Why would a student or alumnus say something detrimental to the reputation of their MBA school if the effect would be to devalue their own resume?
Thanks for the reply.
Lisa
Not surprising. They are trying to get more objective outcome data; however, sample bias still exists: who answers these surveys? The research piece shouldn't be left out, but using citation measures is very biased and academic. So how do we know if research is valuable? How do we know if an education is valuable? What criteria are possible?