Future Shock

New Rules for Playing the Rankings Game

It's time for institutional leaders to get active in rankings reform.
University Business, Apr 2007

So there we were, stuck in another endless focus group, listening to faculty colleagues go on and on about how unfair life is. Their proof: virtually all institutions have to play the rankings game or they'll be left out in the cold when fall enrollment and orientation roll around.

For too many years, high school seniors have wandered through hapless conversations about their college-bound options with guidance counselors or perhaps a favorite teacher. Having fallen through the inevitable cracks of the guidance network, some parents now engage a college counselor to listen and pull it all together, after, of course, perusing the typical panoply of college guides.

Nowadays, however, it gets a bit more complex than just picking a favorite ranking guide. Indeed, the problem for rankings these days is that the average American student and parent are far more sophisticated than those of us in the baby-boomer generation ever were. After all, it doesn't take much sophistication to ask whether Billy or Sally will graduate from a college or get accepted into a graduate school. Nor does it take a higher ed insider's knowledge to inquire about the prospective institution's career placement and alumni networking resources.

Beyond timeless questions about faculty-student ratio, library resources, student amenities, and intercollegiate athletics, our fickle economy compels parents to ask new questions about pricing, financial aid discount rates, and return on tuition investment. Dad already gets Kiplinger's, Money, Washington Monthly, Newsweek, and The London Times. Uncle Bob works in high tech and watches Florida's Top American Research Universities and the Small Times nanotech indices, and Virginia, the oldest sister, just finished filling out a Princeton Review evaluation form.

After years of blissful ignorance, the modern world is aware that the U.S. News & World Report guide may not be the only authoritative source of useful information about learning and living experiences on campus. On a practical level, spiraling tuition pricing, fee escalation, and declining student aid are increasingly squeezing middle-income families. Beyond price hikes, families are now putting second mortgages on primary residences to fund the four-year baccalaureate experience.

Prodded by government recommendations (the Spellings Commission) calling for the disclosure of commonly assessed outcomes rather than garden-variety inputs (such as faculty resources, endowment, and net assets), today's institutions of higher learning are hard pressed to depend on any of the traditional "Good Housekeeping" seals of endorsement, a.k.a. the traditional ranking guides.

Critics of traditional rankings have fueled the growing popularity of new-age alternatives like the National Survey of Student Engagement, a survey that provides data on teaching and learning outcomes for about 1,000 participating colleges. Most recently, the Education Sector report entitled "College Rankings Reformed: The Case for a New Order in Higher Education" called for a more academically responsible and accountable approach to measuring institutional quality.

It doesn't take much common sense to figure out that rankings can no longer rely on the escalating war of peer perceptions and high-end campus amenities. For many of us, the time is right for rankings reform because we finally have the independently verifiable, objective assessment tools to measure milestones on the way to student learning and career success.

Under the rules of traditional ranking, rich institutions get richer in perceived value, reputation, and market position. Money still makes the difference. The cruel irony of American higher education is that colleges and universities with the most resources are the least inclined to change the traditional ranking game rules. Those that are most fragile must venture out on the entrepreneurial edges-managing risk and reward in the nontraditional ranking sphere. Yet even major research universities must deal with the same entry-level freshmen statistics as other institutions in terms of the implications of rankings.

We say nontraditional ranking sphere because a number of IHEs have begun to think for themselves and fashion institution-friendly rankings and indices. Perhaps the best recognized is TheCenter's Top American Research Universities project, established at the University of Florida. TheCenter looks at research funding, endowment assets, academic and faculty awards, doctorates granted, postdoctoral activity, and SAT scores.

Washington Monthly has a new set of ranking criteria, based on institutions' positive impact on society, placing value on scientific and humanistic research, social mobility, and community service. And new town/gown indices such as "Savior of Our Cities" (from the New England Board of Higher Education) and "Beyond Grey Pinstripes" have become counterintuitive trendsetters even among more traditional liberal arts colleges and business schools.

There's also the new Faculty Scholarly Productivity Index, available from Academic Analytics. It rates faculty output based on published books and journal articles, research citations, and awards. The index ranked the University of Georgia at No. 2, while seemingly more prestigious universities such as Columbia, Cornell, Duke, Harvard, Yale, the University of Pennsylvania, and the University of Virginia did not even rank in the top 10.

Or, consider the top 10 nanotech universities index from Small Times magazine. University at Albany ranks first in nanotech facilities and industrial outreach, ahead of places like MIT; the University of California, Berkeley; Purdue University (Ind.); the University of Michigan; Cornell; Caltech; and Carnegie Mellon University (Pa.).

Don't think for a minute that the rankings have escaped international debate. Last year, a blue ribbon group of global higher ed leaders and publishers met to come up with ranking best practices. The Berlin Principles on Ranking of Higher Education Institutions have provided a starting point for future conversations on global ranking. Unlike other rankings, the Berlin Principles recognize a wide diversity of institutional missions, with emphasis clearly placed on measured outcomes.

Last December, The New York Times reported that the University of Florida is pursuing a high-stakes ranking strategy with the same fervor as its national football championship aspirations. Incredible though it may sound, the university just slapped on a $1,000 tuition surcharge to reduce its student-faculty ratio; no wonder Florida is now ranked 13th among public universities in the United States. And the list doesn't end there.

In a Times interview, President Mark Emmert of the University of Washington chimed in: "When we think about our peers now, we don't just think about publics, we throw in Stanford and the Ivies." For its part, the University System of Maryland has recently carried out a successful systemwide branding (and ranking) campaign, giving credence to the old adage that a rising tide lifts all ships.

Consider the aspirations of the University of Massachusetts. President Jack Wilson says: "We no longer think of ourselves as a public, land grant, peer university. We want to be measured with and evaluated alongside the best private and public research university systems in the world. And why not? We just won a Nobel Prize at UMass Medical, and we're winning new rankings and 'Best of' categories in the arts, sciences, and technology. ... Over time, our rising university-wide research grants and contracts volume and impressive faculty scholarship productivity will place us among the most competitive ... research institutions in the global marketplace."

Perhaps Manny Fernandez, Florida trustee board chair, offered the most telling comment: "I want to be on the cocktail party list of schools that people talk about. ... I don't apologize for trying to get the rankings because rankings are a catalyst for changes that improve the school."

Not everyone in the Florida higher ed system, however, is convinced that the emphasis on high-end research is the right route to go. As a recent Community College Week article reported, "Florida's public universities have concentrated too much on glitzy research and professional programs while slighting undergraduates, according to a highly critical study" (commissioned by the Florida Board of Governors).

Both critics and supporters of rankings are taking notice of new and intriguing ways to measure, document, and validate learning outcomes for an increasingly fussy educational consumer marketplace.

We proffer this closing thought: If you can't win playing by the house rules, then play in a house that is willing to play by new rules. It's time to change the rules on playing the rankings game.

James Martin is a professor at Mount Ida College (Mass.). James E. Samels is president and CEO of The Education Alliance. Their book is Presidential Transition in Higher Education: Managing Leadership Change (Johns Hopkins University Press, 2004).
