
The Rankings Effect

By Christopher Connell    //    Volume 33, Number 5    //    September/October 2025

In recent decades, college rankings have become a prominent part of the college admissions process. Several consumer media companies have developed their own rankings, which have put institutions in the spotlight. Many higher education leaders are critical of rankings because they do not account for an individual student’s fit with a particular college. Still, rankings are part of the higher education landscape, and boards should understand how they may influence public perception of their institutions, which in turn can affect student enrollment.

When Ray Rodrigues entered Berry College in Georgia in 1990, all freshmen got T-shirts “proclaiming the fact that it was recognized by U.S. News & World Report (USNWR) as one of the top liberal arts colleges in the South.” Rodrigues wore his with pride, and to this day he places great faith in the magazine’s ability to determine which colleges are best.

He does so all the more because the former Florida state senator is today chancellor of the State University System of Florida, which USNWR has ranked as the number one public university system in the nation every year since 2017.

It holds that perch principally because of its affordability (Florida college students pay just $2,400 in net tuition, less than a third of what students pay on average in other states), along with graduation and retention rates and students’ indebtedness after college. Wyoming, another low-tuition state, is rated number two for higher education. Florida’s success has helped its universities secure more state funding, “which is the reason these rankings are so important,” Rodrigues said.

Other higher education leaders remain highly critical of the validity of rankings by USNWR and its news media rivals. The rankings have been controversial ever since the magazine published its first list of 40 “best colleges” in 1983 based solely on a survey of college presidents’ opinions. Today their methodologies weigh harder evidence, including graduation rates and starting salaries for those who receive federal loans and Pell Grants.

But USNWR still surveys college presidents, deans, and admissions directors, and their opinions count for 20 percent of its showcase list of the best individual colleges, which is dominated by Ivy League and other private institutions with hefty endowments and lofty reputations. Among public universities, only UCLA, UC Berkeley, Michigan, and Virginia crack the top 25. Florida’s flagship, the University of Florida, is ranked 30th among all universities and 7th among public universities. Florida’s status as the best public system comes not from that best colleges list but from a separate “Best States” ranking that includes crime, health care, infrastructure, and other factors.

Rankings have become the tail that wags the dog for USNWR, which in 2010 ceased publication of its print weekly that competed against Time and Newsweek and instead counts on a raft of Consumer Reports-like lists to drive traffic to its website and generate ad revenues. They include best hospitals, best cars, best cruises, best elementary schools, and more. Its website gets millions of visitors, including 2.4 million to the national universities page alone in the first six months of 2025, according to USNWR spokeswoman Shamari White.

Although the best college list is freely available, USNWR makes money selling a guide and services to help students choose the right college. Colleges can also buy badges with the USNWR logo to trumpet their placement on their own websites.

A retired president whose university ranks in the magazine’s top 100, speaking on condition of anonymity, complains, “U.S. News and World Report has made life a living hell for all but the top 70 schools in the country. They are never going to change. They dance around, but it’s always going to be the same top 70. It has increased competition in unhealthy ways for schools with meager means to invest in things that are very expensive and they really shouldn’t be chasing after.”

Vanderbilt University Chancellor Daniel Diermeier is a fierce critic, claiming these lists mislead students and only exist to sell ads. Vanderbilt enjoys a lofty ranking—18th in USNWR—but Diermeier lambasted the magazine for disregarding such measures of academic quality as class sizes, the percentage of faculty with PhDs, how many freshmen were in the top 10 percent of their high school class, and other factors.

“Some rankings are harmless trivia,” Diermeier wrote in Forbes magazine—itself a publisher of college rankings—but “for many students and their families, the choice of where to go to college is among the most important decisions of their lives. University leaders owe them our best possible effort in providing useful and validated information.”

What is needed, he argues, are ratings, not rankings, that provide “a way to quantify key measures of academic quality and accessibility that is stable, data driven, and transparent.”

Vanderbilt underwrote a study of college rankings by the research organization NORC at the University of Chicago, which faulted the methodologies used by USNWR, the Wall Street Journal, Forbes, the Times Higher Education World University Ranking, and the QS World University Ranking. The study concluded that there is no “clear or stable set of concepts underlying the college ranking systems,” and that they rely too much on self-reported data from the institutions and mislead consumers about the degree of differences between institutions.1

“Everybody’s got the right to do what they do,” said Bruce Evans, chair of the Vanderbilt University Board of Trustees, but “as we’ve dug into it, we’ve come to the conclusion that many of those criteria are either subjective or based on data that’s incomplete or flawed.”

USNWR no longer relies solely on college presidents’ impressions of which colleges are best, but peer assessment still counts for 20 percent of a college’s ranking in various categories. In 2024, USNWR sent its peer assessment survey to 4,665 academics, including college presidents, provosts, and deans of admission; almost a third of them filled it out.

One who never did was Mitch Daniels, president emeritus of Purdue University. “I could not pretend I knew whether the history department at Carnegie Mellon was better or worse than the one at Ohio State,” said Daniels, who led Purdue from 2013 to 2022. (Purdue is currently ranked 46th overall and 18th among public universities, with engineering and other programs near the top.)

Despite changes by these media organizations to bolster their criteria, “they are still hopelessly arbitrary, and they generally produce wildly disparate lists of ‘top’ schools,” said Daniels, a former governor of Indiana. “Some of the grad school rankings may be of reasonable validity, and the undergrad ones that factor in cost would be worth a family’s attention. But in general (they) are dubious at best, misleading at worst.”

“I never discussed these things with our trustees, nor countenanced any particular attention to them,” he added.

Notwithstanding how small the shifts in most rankings may be from year to year, changes in methodology can turn things topsy-turvy, as happened when the Wall Street Journal altered its formula last year. Twenty-five schools were new to its top 50. Arizona State University (ASU) tumbled 192 places to number 252 among 500 colleges and universities. The newspaper, which partners with the survey organization College Pulse and the data provider Statista to produce its rankings, based its grades mostly on graduates’ salaries, how many students earned degrees, and how long it took for their investment to pay off. Its formula placed Babson College, an undergraduate business school, in second place among national universities, behind only Princeton, a perennial front-runner. Babson didn’t make the USNWR national list.

An Arizona State University spokesman said only of ASU’s precipitous fall that “the positions of many universities changed due to alterations in the publications’ methodology.” The university, one of the nation’s largest, boasts on its website of scores of top laurels from USNWR and other organizations.

The newspaper, asked for an explanation, sent a response from its Statista partner acknowledging that “some colleges experienced larger drops than otherwise expected” from the addition of new data sources such as the calculation of earnings and graduation rates. “These fluctuations cannot be entirely avoided as we strive to improve upon the methodology year over year,” it said.

The rankings can have real-world impacts, not just in influencing students’ and parents’ decisions about where to attend college but also in affecting institutions’ funding. In Florida, they are a linchpin of the performance-based funding system that state lawmakers embraced more than a decade ago to allocate resources to public colleges and universities.

The Board of Governors of the State University System of Florida recently approved a new Accountability Plan that will reward institutions for achieving various milestones, including a top 50 ranking “on at least two well-known and highly respected national public university rankings” such as those of USNWR, the Wall Street Journal, QS World University Ranking, and Times Higher Education World University Ranking.

A Skeptical View of Rankings

David A. Hawkins, chief education and policy officer for the National Association for College Admission Counseling, believes the news media rankings “play a minimal, if any, role in students’ decisions” about which college to attend, with the possible exception of international students, whose families may be under the impression the rankings carry “an official endorsement from governmental bodies.”

“Our survey research found that students and counselors were much more likely to find the reference information about college deadlines, application options, requirements, etc., to be much more relevant than the ranked order of colleges,” he said.

“The rankings continue to rely heavily on what we call ‘input measures,’ such as standardized admission test scores, which say nothing of the quality of the institution,” said Hawkins.

“Yes, there is a ‘methodology,’ which sounds scientific, but in reality, anyone can assemble data, weight it, and create lists that put colleges in one order or another,” said Hawkins. He likens them “to Consumer Reports for household products. Unfortunately, choosing a college is far more complex and individualized than buying a washer/dryer.”

And Hawkins argues that the federal government’s College Scorecard and College Navigator, which collect data from every institution about graduation rates, student federal loan amounts, and the number of students qualifying for Pell Grants, are more powerful and accurate tools for students, offering “a depth and objectivity that other sources and publications lack.” (USNWR incorporates this official data into its rankings.)

“Our guidance for trustees is that rankings competition is an unhealthy game and that the efforts of governance bodies are better focused on the educational, financial, and cultural strength of the institution than moving up on a list that everyone seems to hate yet also seems helpless to change,” said Hawkins.

A college does not have to be at the top end of a ranking to crow about its stature. Simply making a list and not being overlooked may be cause for celebration for presidents and trustees alike. Most trustees are not academics and come instead from the business world, which looks to metrics to measure performance.

When Marymount University in Arlington, Virginia, jumped 32 places to number 288 in the USNWR report last fall, President Irma Becerra put out a press release saying, “We are incredibly proud of the momentum that Marymount continues to build in the national landscape of higher education.” And by many measures, her scrappy institution does have a lot to celebrate. USNWR ranked it second among private universities in Virginia. “My board is incredibly happy because we’ve had four years of double-digit enrollment growth,” said Becerra, now in her eighth year as president of the school, which sits close to the nation’s capital and enrolls 2,500 undergraduates and nearly 1,600 graduate students. Marymount, traditionally a bachelor’s- and master’s-degree-granting institution, has begun awarding some doctorates and recently received a new Carnegie classification for colleges that spend at least $2.5 million on research.

When USNWR, then a print weekly trailing behind Time and Newsweek in circulation, first compiled a best colleges list, it was based on that survey of presidents about colleges’ reputations. That changed in 1988 when the magazine moved an in-house expert on economic data, Robert Morse, into the project with a team of number crunchers, separate from the magazine’s journalists who write its stories about the best colleges list. Those education writers do not themselves make judgments about which schools are better than others.

Morse, who retired in July after 49 and a half years, became the face, chief architect, and defender of the rankings, as well as “one of the most powerful wonks in the country,” as the Washington Post dubbed him in 2011. An understated, mild-mannered data scientist, Morse achieved a position of power and influence in the world of higher education, to the dismay of some and the delight of others. He and his team of seven data analysts have given institutions a way to market themselves and given families tools to help make one of their most important financial decisions.

Morse, on the 25th anniversary of his magazine’s rankings, boasted it had become “the 800-pound gorilla of American higher education, important enough to be the subject of doctoral dissertations, academic papers and conferences, endless debate, and constant media coverage. What began with little fanfare has spawned imitation college rankings in at least 21 countries, including Canada, China, Britain, Germany, Poland, Russia, Spain, and Taiwan.”

In an interview days before his retirement, Morse said, “The primary audience is prospective students and their parents. It’s about what’s the best school for them.” But for institutions, he added, it may be a way to raise funds, compare themselves to peers, or perhaps elevate their status as Marymount has done.

Morse believes “that if you gave colleges a vote, the majority would say they’d prefer not to be ranked.” Morse has heard in meetings with college presidents that trustees know rankings affect public perceptions. Some may “want an explanation of why the school ranked this way, and why it’s rising or why it’s falling,” he said, while “some probably don’t think it’s worth worrying about.”

He is passing the baton to two veterans of his team of data analysts, Eric Brooks, director of education data analysis, and Kenneth Hines, senior director of education data analysis. While the rankings still evoke criticism, they believe it is considerably less than in the early years, when the rankings were based largely on “word of mouth,” as Brooks put it.

“There was a lot more criticism then than there is now,” said Hines. “We are also still talking to the higher education community and working to improve things that they have questions about.”

“Whether you’re ranking colleges or writing a restaurant review, there’s always going to be people that disagree with your conclusions or take issue with your processes,” said Hines. “What I tell people about the rankings is we are attempting to measure things that students and their families say are important when selecting a college” such as academic quality, graduation rates, and return on investment.

The U.S. Department of Education’s official data on graduation rates and debt—which every college must divulge to participate in student aid programs—give people hard facts rather than presidents’ and deans’ impressions of what the best schools are. While college prices have leveled off after years of steep inflation, 40 million American adults still owe $1.7 trillion in student debt.

In 2013, President Barack Obama cited those rising costs in setting out to create an official college ratings system. He fell short, but his initiative produced the College Scorecard in 2015 and what National Public Radio termed a “torrent” of useful information about how much schools’ graduates earn and owe.

Skeptics also note that some schools have tried to game the rankings systems with misleading information.

After a Columbia University math professor published a paper picking apart data that Columbia reported, the university acknowledged mistakes and USNWR dropped it from number 2 to number 18 on its 2023 list. (It rebounded to number 13 in the 2025 rankings). The university also agreed to pay $9 million to settle a class action lawsuit against its board of trustees brought by former undergraduates who claimed they suffered financial harm due to Columbia’s misrepresentation of data to USNWR.2

A former longtime Temple University business dean was found guilty of federal fraud charges, sentenced in 2022 to 14 months in prison, and fined $250,000 for inflating the credentials and success of Temple’s online and part-time MBA students. (The Supreme Court in June 2025 refused to hear the dean’s appeal.) In 2022, the University of Southern California (USC) acknowledged that its Rossier School of Education had inaccurately reported data about the selectivity of its doctoral programs to USNWR. The publisher responded by requiring USC’s president and the chair of its board of trustees to certify the accuracy of the Rossier School’s data for three years.

Dozens of top law schools and some medical schools refused in 2022 and 2023 to answer USNWR’s request for information about their students and practices. New York University School of Law Dean Troy A. McKenzie told students that “the methodology used by U.S. News can give applicants a distorted view of the opportunities for successful professional paths available at law schools.” But USNWR says it got equally good law school data from the American Bar Association that served its purposes.

Although a small number of schools, including several top law schools, refuse to cooperate with the magazine’s request for data, the overwhelming majority do. For the 2025 rankings, Brooks said, “Ninety-nine of the top 100 ranked universities and 97 of the top 100 national liberal arts colleges submitted data to us.”

Still, the critics are not convinced, and the debate goes on.

“The rankings bring out the worst in colleges and harm the higher education landscape. Public colleges suffer because the rankings are tilted to favor wealthier, smaller, private colleges. Colleges that choose to focus on excelling in areas outside of U.S. News’s criteria, or that don’t want to participate in the rankings at all, suffer because uninformed students decide where to enroll based partly on rankings,” Akil Bello, then-director of advocacy and advancement for the National Center for Fair and Open Testing, wrote in the Chronicle of Higher Education.3

The criticisms hold no weight with Rodrigues, the Florida chancellor.

“The purely objective data doesn’t lie, and that’s why our system is rated number one,” he said.

Christopher Connell is a higher education writer based in Washington, D.C., and a frequent contributor to Trusteeship.


1. NORC at the University of Chicago, College Ranking Systems: A Methodological Review, September 24, 2024, https://www.norc.org/content/dam/no-search/vu-college-rankings-review.pdf.

2. Emily Pickering and Alexandra Sepe, “Columbia to Pay $9 Million Settlement in U.S. News Data Misrepresentation Lawsuit,” Columbia Spectator, July 1, 2025, https://www.columbiaspectator.com/news/2025/07/01/columbia-to-pay-9-million-settlement-in-us-news-data-misrepresentation-lawsuit/.

3. Akil Bello, “The Reckless Rankings Game,” Chronicle of Higher Education, October 4, 2022, https://www.chronicle.com/article/the-reckless-rankings-game.
