Queen’s absent from world rankings

University administration decided not to participate in Times Higher Education’s top 200 because of concerns over the survey’s new research methods

Rankings from Times Higher Education, Maclean’s and the Globe and Mail are the most renowned surveys of Canadian universities.

This year, Queen’s chose to forgo the opportunity to appear on a list of the top universities in the world.

The Times Higher Education’s World University Rankings were released last month and the Globe and Mail’s Canadian University Report was released on Monday. Queen’s did not participate in the Times international study and was subsequently omitted from the list of the 200 top universities in the world.

Principal Daniel Woolf said the Times Higher Education Survey (THES) requires the consent and cooperation of the universities involved, and Queen’s chose not to participate.

“[Institutional] analysis staff felt very strongly that there were serious methodological problems with the way the data [was] being collected and the questions being asked, and I have to defer to their advice on that,” he said, adding that Queen’s was not alone with these concerns and the University of Waterloo and the University of Western Ontario also chose not to participate.

“Given the sharp reaction to the THES rankings by a whole bunch of British vice-chancellors ... and by a couple of Canadian university presidents, I think [Queen’s staff] were quite right to make that call,” he said. “That doesn’t mean we wouldn’t participate in this particular ranking exercise in another year ... But we’d have to be satisfied that they were using appropriate methodology.”

The Times rankings have been published since 2004 and are often considered the premier international university rankings report. Office of Institutional Research and Planning Manager Roger Healey said the THES methodology was simplistic until recently.

“They used very simple, fixed indicator measures to come up with their ranking numbers,” he said. “Many of those numbers they were able to get publicly, without our input, because they’re on our website.”

Healey said the new methodology implemented in March required help from the research and planning department. He said the 2010 methodology is a radical departure from years past.

“They virtually reinvented it,” he said. “They started to involve some indicators that are quite controversial in the international rankings community.”

The THES methodology previously used six factors to develop the rankings; this year, that number increased to 13. Healey said the research and planning department had concerns with the weighting of some of the new factors.

“We looked at it all very carefully and had some issues with some of the measures they were talking about using, [so] we declined to participate,” he said, adding that one heavily weighted factor in the rankings is the number of times the university’s research is cited.

“We [would have] had to do some work to help them compile,” he said.

Healey said the changes made to the survey this year compromised its reliability as a legitimate indicator of international university rankings.

“The jury’s still out as to whether this stuff is reliable and valid or not,” he said. “There’s a lot of controversy about how they’re used. We just wanted to wait and see how they were going to use them.”

Healey said the University could afford to forgo the survey because it still takes part in domestic ranking systems.

“We have to be mindful of what’s going on from a Canada point-of-view,” he said. “We’ve been involved right from day one with the Maclean’s ranking, which came out in 1991. We were heavily involved in making sure that survey was as accurate as it could be.”

Still, he said there is concern over how the University is seen on an international stage.

“We do know that some international students and faculty look at this stuff,” he said. “I hope they don’t put all their weight on it, but if it’s something they look at, we have to take it seriously.”

In his January 2009 vision document, Where Next?, Principal Woolf made similar claims about the value of these rankings for internationalization—one of his four key pillars for development.

“We should pay attention to ranking exercises such as the Globe and Mail, Maclean’s and the Times Higher Education Supplement to the extent that they often guide many international universities’ decisions on potential partnerships,” he wrote.

Kutay Ulkuer of Undergraduate Admissions said THES and international rankings have more impact on international students than Canadian high school students.

“Within Canada, I find that rankings are not as important to students,” he said. “But when we look at things internationally, rankings sometimes do have a higher impact on how we recruit and how our recruitment is formulated.”

Ulkuer said the international confidence in the Canadian educational brand is high.

“[International] students look into several different things when deciding where to study,” he said. “First of all, it starts with the country. Canada enjoys a very good reputation [as] an education destination.” He said the confidence comes from a high standard of average quality.

“The United States houses the top universities, some of the best universities in the world, but there are also universities of very low quality,” he said. “Canada doesn’t have that.”

In an op-ed piece last month in the Guardian titled “University world rankings are pointless,” President and Provost of University College London, Malcolm Grant, voiced his opinion on the practice.

“Imagine a newspaper decided to create a table ranking the world’s cities. Is Moscow better than Sydney? Would Hong Kong squeeze in above Manchester? Or Bangkok above Brighton?” Grant wrote. “It would be a nonsensical exercise.”

University College London was ranked 22nd among the top 200 in the Times rankings.

“Global rankings have afforded annual light entertainment but they are now seriously overreaching themselves,” Grant wrote. “They do a disservice if they influence student choice, or come to be treated as a performance measure by the leaders of hugely diverse institutions.”

The Globe and Mail’s Canadian University report on Monday surveyed over 35,000 current undergraduates, asking over 100 questions. The mark Queen’s received in the Physical Fitness, Sports and Recreational Facilities category improved from a C+ in 2010 to an A+ in the recent 2011 report.

Sean Cameron, a Burlington high school student, is in the process of applying to business schools at Ontario universities. He said rankings are only one factor in choosing a university.

“Queen’s has had a reputation as a good school in a lot of programs,” he said. “Just because they’re not participating in a certain survey, they’re not going to be hurt that much.”

He said his visit to campus mattered more than the university rankings he read.

“Once I visited, it was really great to see what the campus was like, and I think that was a lot more important than reading something in the Globe or reading something in Maclean’s.”

Cameron listed small class sizes, an excellent reputation among employers and an attractive campus as selling points of the Queen’s Business program, but said individual students look for different things.

“To some of my friends, the most important thing is athletic complexes, or scholarships,” he said. “It’s all about what the individual person feels is their best fit.”

He said rankings can be an important initial tool, but more research must be done in choosing the right fit.

“It’s good to see Queen’s on the list,” he said. “It’s not a bad thing to participate in surveys, but it’s not the [most important] thing.”

His mother, Sara Cameron, said rankings can be contradictory, and schools often use them arbitrarily.

“We have been to so many campuses,” she said. “Everybody can call themselves number one in something.”

She said the administrators at her son’s school advised them not to get lost in the numbers, and not to focus on the ‘best’ school, but the ‘right’ school.

“Some people are very analytical, and get very caught up in the rankings,” she said. “But when you get on campus, that’s where you make the decisions.”

She said rankings can sometimes make people lose sight of what’s really important.

“Rankings drive people to try and find the best, instead of finding the right fit,” she said. “And getting on campus is how you find the right fit.”

With files from Jake Edmiston
