illustration showing an enormous piece of paper with tiny people looking at it through magnifying glasses

What’s Important (or not) About College Rankings

In a world that emphasizes outcomes, culture is key
Denton Ketels

U.S. News & World Report first published its list of “Best Colleges” in 1983. Today, a profusion of commercially published rankings enthrall and mystify consumers every fall. For high school students and their parents, rankings by venerable sources like U.S. News, Money, and Forbes ostensibly provide objective guidance about academic reputation, affordability, and other supposed measures of college quality and accessibility.

For the colleges themselves, rankings can be both a headache and a useful measuring stick. Some institutions have noisily boycotted the annual ritual of lobbying for positive peer assessments, which play a big part in reputation ratings. An infamous handful have been caught trying to game the system by cherry-picking data or just plain lying. 

The great majority — Grinnell included — take a more reasoned approach, because even though the data are sometimes questionable and the methodologies imperfect, the lists are not going away. “Like it or not, rankings matter,” says Randy Stiles, Grinnell’s associate vice president for analytic support and institutional research. 

How students perceive rankings

Stiles points to research by the Art and Science Group that says 72 percent of traditional students pay at least some attention to rankings. Seven out of 10 students report that they discuss rankings in person or on social media, mostly with parents and friends. 

What’s more, students’ test scores are predictive of their attitudes and behaviors with respect to rankings. The 2016 poll reveals that students with ACT scores of 28 and higher are apt to care more about the prestige associated with higher rankings, while students whose scores are 21 and lower are likely to give rankings more weight in choosing a college. 

Incoming Grinnell students seem to bear out that research. While their average ACT score of 30 may indicate awareness of the status that a lofty ranking commands, it also appears to signal greater discernment about the importance of rankings relative to other factors. Stiles says annual surveys of first-year students place rankings in national magazines about halfway down their list of top 20 reasons for enrolling. 

“Year after year students report the main reason for coming to Grinnell is the College’s academic reputation,” Stiles says. “Number two on the list is financial aid, which is not surprising because there is very generous aid given here.” 

Rounding out the top five reasons are the size of the College, the ability of graduates to gain admission to top graduate programs, and graduates’ prospects for getting good jobs. 

illustration showing a huge bar chart with tiny people on platforms and a spiral staircase examining its details

Perception versus reality

It could be that student perceptions are formed at least in part from rankings, and Stiles emphasizes that Grinnell does exceptionally well in systems that give considerable weight to academic quality and reputation. The challenge for data analysts is to balance those perceptions with what rankings are really saying about college quality, given that each system calculates performance differently.

Stiles says Grinnell’s approach to making sense of the complexities of college quality is to use “multiple lenses” in comparing and benchmarking performance against similar institutions, or what are referred to as the “peer 16.” That includes a review and in-depth analysis of seven different systems plus Princeton Review every year. 

“Our philosophy is not to manage to these systems,” Stiles says, “but to be informed by them, to be able to answer questions about them, and educate anybody who has an interest in what rankings have to do with the whole world of higher education.”

Stiles’ team studies not only Grinnell’s rankings within each of those systems but also the rankings of those peer liberal arts colleges in the Midwest and on the East and West Coasts. A daunting task, it requires knowing how each system works and what makes some more meaningful than others. 

How rankings actually work

“What these ranking systems all do in one way or another,” Stiles says, “is put together some collection of schools — liberal arts colleges, research universities, publics, privates — and rank them all on one long list. Then there is some collection of measures that are given some collection of weights. All of that gets added up into an overall score, from which is produced an ordered list.”
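The mechanism Stiles describes is a weighted sum. A minimal sketch (the measure names, weights, and scores below are invented for illustration and don’t correspond to any real ranking system):

```python
# Hypothetical ranking sketch: each school gets a score on several
# measures; each measure carries a weight; the weighted sum is the
# overall score, and sorting by it produces the ordered list.

weights = {"reputation": 0.40, "graduation_rate": 0.35, "resources": 0.25}

schools = {
    "College A": {"reputation": 90, "graduation_rate": 95, "resources": 80},
    "College B": {"reputation": 85, "graduation_rate": 97, "resources": 88},
}

def overall_score(measures, weights):
    # Multiply each measure by its weight, then sum into one number.
    return sum(weights[m] * value for m, value in measures.items())

# Sort schools by overall score, highest first, to get the ranked list.
ranked = sorted(schools, key=lambda s: overall_score(schools[s], weights),
                reverse=True)
```

Because the overall score depends entirely on which measures are chosen and how they are weighted, the same college can land in very different places on different lists.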

What makes one set of rankings more influential than another depends to a large extent on commercial reach. “Readership matters a lot,” Stiles says. “Some of these things have a lot of readership, and people give greater credibility to them. Others, not so much.” 

Making sense of college rankings would be easier if all of the ranking systems produced similar results. In many cases they don’t, and Grinnell is a perfect case in point. Last year, Grinnell was No. 19 among national liberal arts colleges in the U.S. News rankings. Among the other systems Stiles tracks, the College also came in at No. 19 on Washington Monthly’s list, but at Nos. 73, 9, 156, 31, and 54 elsewhere. 

“U.S. News puts a big emphasis on reputation and resources,” Stiles explains. “Forbes claims to emphasize outcomes or ‘output over input.’ Kiplinger’s is about best value. Money, not surprisingly, is about affordability and the salaries of graduates. The New York Times Access Index emphasizes the percentage of Pell students and economic diversity. College Factual is outcomes-focused. 

“These days, there is more and more talk about outcomes,” Stiles continues, “and when people say outcomes in these systems, they’re talking about graduation rates and salaries more than anything else.” That’s an important distinction, he says, because rankings that weigh earnings heavily can skew unfavorably against colleges whose alumni pursue graduate degrees and whose higher incomes materialize on a longer timeline. 

Also, not all systems are equal in terms of their own development. Stiles says Forbes’ ranking of Grinnell since 2008 has fluctuated by “an incredible variation” of 80 points. “I can guarantee you Grinnell didn’t change that much between 2008 and 2012. But the system changed, as did the way people were using it and the way it was managed. So it’s important to remember that the systems themselves need time to mature and achieve stability.” 

illustration of several people reading over the shoulder of another reading a newspaper called College Rankings

Rankings that resonate

Among all of the annual rankings, U.S. News’ “Best Colleges” remains the source that students use most widely to compare academic quality among 1,800 U.S.-based schools. “People pay most attention to U.S. News,” Stiles says. “It gets a lot of readership.”

In the U.S. News system, the categories given the most weight are reputational assessments by counselors and peers (22.5 percent), graduation/retention rate (22.5 percent), and faculty resources (20 percent). Student selectivity rank is next (12.5 percent), followed by financial resources (10 percent), graduation rate performance (7.5 percent), and alumni giving rank (5 percent).
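Those seven percentages account for the entire overall score, so the overall score is in effect a weighted average of the category results. A small sketch using the weights cited above (the per-category scores for the example school are invented for illustration; U.S. News’ actual per-category scoring is more involved):

```python
# The U.S. News category weights cited above, as percentages of the
# overall score. Together they sum to 100.
usnews_weights = {
    "reputation": 22.5,
    "graduation_retention": 22.5,
    "faculty_resources": 20.0,
    "student_selectivity": 12.5,
    "financial_resources": 10.0,
    "graduation_rate_performance": 7.5,
    "alumni_giving": 5.0,
}

def composite(category_scores, weights=usnews_weights):
    # Weighted average of per-category scores, each on a 0-100 scale.
    return sum(weights[c] * s for c, s in category_scores.items()) / 100.0

# Invented category scores for a hypothetical school, for illustration.
example = {
    "reputation": 90,
    "graduation_retention": 88,
    "faculty_resources": 80,
    "student_selectivity": 85,
    "financial_resources": 75,
    "graduation_rate_performance": 82,
    "alumni_giving": 60,
}
```

Note how the two heaviest categories, reputation and graduation/retention, together drive 45 percent of the result, which is why Stiles says Grinnell does well in systems that emphasize academic quality and reputation.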

Stiles says the key takeaway from the 2017 U.S. News rankings (released in 2016) is that Grinnell’s overall rank of 19th is stable. “In fact we’re improving lately in overall rank. We have a great academic reputation,” Stiles says. 

“We’ve also become much more selective,” he says. “Just a few years ago Grinnell was 38th among liberal arts colleges for selectivity. Now we’re ninth in that category.”

To illustrate the seven-year data lag that can occur in published rankings, Stiles points to a blip in attrition among the student group that came to Grinnell in the fall of 2012. “That cohort will have a negative impact on our graduation rate when U.S. News rankings are published in 2019,” Stiles says. “We know that’s going to happen, and we’re working hard on graduation/retention as part of the quality initiative that’s connected to the upcoming accreditation review.”

Still, not all ranking systems use the same measures, and a large readership does not necessarily make U.S. News the last word in college quality. Stiles says one of the more discriminating ways to view rankings is to ask how well a system resonates with a college’s core values. One that gives particular weight to criteria consistent with Grinnell’s values is Washington Monthly’s ranking of “Best Liberal Arts Colleges.” 

“Washington Monthly’s primary factors are social mobility, research, and service, each of which counts for one third,” Stiles says. “There are lots of details beneath those major categories, but the point is that different systems attribute different weights to measures that are relevant for what’s going on at a college.”

Breaking into the top 10

Everybody wants to be No. 1, or close to it. Human nature dictates that college graduates who encounter a list of best colleges will almost certainly want to know how their own alma mater stacks up. The question is, should Grinnell be content with being No. 19? What exactly would it take to be No. 15, or No. 12, or even No. 3? Stiles explains:

“When you add everything up in the U.S. News system, schools will score numbers in their overall tally of about 70 to 100. The top 10 schools — and there is a lot of variation in the top 10 — go from a score of 100 down to about 87. Among the next 10, there’s a variation of only two points.” Grinnell’s overall score in last year’s U.S. News rankings was 85, tied with the U.S. Military Academy at West Point.

“If this were a road race, you would see the first 10 runners spread out,” Stiles says. “Right behind them would be another 10 in a clump, which is where Grinnell is. In the one-to-10 range it takes a big change to make a move, but even a small move in the overall score in the 10-20 range could move us six or seven places.” 

It’s about culture

While Stiles’ job is to analyze rankings from an institutional perspective, his insights are just as valuable for parents and prospective students who are staying up nights trying to decode the latest list of “best colleges.”

Stiles prefaces his advice with sociologist William Bruce Cameron’s famous quote, which he says is applicable to any consideration of college rankings: “Not everything that can be counted counts, and not everything that counts can be counted.” 

“Remember that the data lags,” Stiles says. “Also, cumulative earnings matter; rankings based on graduate salaries five years out do not tell the full story for a college like Grinnell that produces a lot of graduate-degree candidates. 

“Look at a variety of rankings as a first filter in choosing a school,” Stiles says. “Culture and fit are so important that you’ve got to do a campus visit and check out several institutions to really know. The peer 16 are all fine colleges, and Grinnell is very highly regarded in that mix. You almost can’t go wrong with a liberal arts education at any one of these schools, but it is culture and context that really matter.”
