Survey says

The National Survey of Student Engagement is a far more reliable gauge of what’s happening on campus than rankings released by the Globe and Mail and Maclean’s, says the director of institutional research and planning

The University consistently scores high on survey questions measuring its campus environment.

This graph details how first- and fourth-year students at Queen’s responded in the NSSE benchmark category of “student-faculty interaction.” The purple shade represents Queen’s, the pink shade the average of the Canadian universities, the orange the American doctoral-extensive universities and the blue a sub-set of the American doctoral-extensive universities. Source: the 2004 National Survey of Student Engagement.

This graph details how first- and fourth-year students at Queen’s responded in the NSSE benchmark category of “supportive campus environment.” The bar colours are the same as in the previous graph. Source: the 2004 National Survey of Student Engagement.

Next year, the results of a U.S.-based survey will help determine—in part—how the provincial government funds Queen’s and the rest of Ontario’s universities.

But Patrick Deane, vice-principal (academic), said it’s too early to determine what exactly the McGuinty government meant when it said new funding packages for Ontario’s universities will include “results-based” funding.

“The government wants to see results of its investments,” Deane said. “And I think the [National Survey of Student Engagement] will be indirectly important to that.”

Students won’t find the National Survey of Student Engagement (NSSE) in the Globe and Mail, which released its annual University Report Card on Wednesday, nor in Maclean’s, which will release its annual rankings of Canadian universities on Monday.

In fact, the 2004 NSSE results have been available for almost a year. But due to a disclosure agreement signed by the participating Canadian universities—including Queen’s—the complete results have only been circulated internally.

“We decided to explore the data and certainly not use it for ranking and promotion [issues],” said Chris Conway, director of institutional research and planning at Queen’s. “We [were] cautiously [using] it until we were able to discover more about the data and [more about] how well we understood it and how comfortable we felt with it—so you don’t see any big reports out among participating institutions in great depth and detail,” he said.

The NSSE is an enormous project carried out by Indiana University’s Centre for Postsecondary Research and Centre for Survey Research. For the first time, Queen’s and seven other Canadian universities—the University of British Columbia, the University of Alberta, the University of Western Ontario, McMaster University, the University of Toronto, the University of Waterloo and McGill University—were among the 472 institutions participating in 2004.

Conway said since the Canadian institutions were participating for the first time in 2004, they met to “Canadianize” the survey—by changing American terms such as freshman and senior year—without losing its main goals. The survey’s 2004 version featured responses from more than 160,000 first- and fourth-year students in Canada and the United States.

Conway said next year all universities in Ontario will participate in the 2006 NSSE, which will help determine in part how they receive funding from the McGuinty government. “By about this time next year, we’ll be that much more comfortable and that much more visible in our reporting,” Conway said.

Conway said Queen’s has the authority to deal internally with its own NSSE results in whatever manner it pleases, but it can’t release the specific results of the other Canadian universities.

“The immediate interest within the media in the NSSE results was who won,” Conway said. “It’s not about who won. In fact, the measures used are such that it’s almost impossible to figure out what winning even is.”

The National Post published an article in late August about how the Canadian universities fared in two of the 120 questions, which related to a student’s overall satisfaction with his or her university.

Conway said he only gave the National Post the Queen’s results for those two questions.

After an article appeared in the Journal’s Sept. 20 issue citing the information from the National Post’s article, the Journal learned the NSSE had much more of a story to tell than overall student satisfaction.

Conway gave the Journal the University’s complete results in comparison to the American universities, but couldn’t release how Queen’s compared in specific questions to the other Canadian universities. Conway said this would violate the disclosure agreement.

“We’re all annoyed at rankings,” Conway said. “We’re all frustrated that simplistic measures end up being reported in various places.”

Conway said widely broadcast media rankings weren’t the way to create a culture of improvement for the universities participating in the 2004 NSSE survey.

“There is something almost intrinsically troublesome about simplified rankings in a climate in which you’re supposed to create improvement,” Conway said. “Which is not to say [the NSSE results] can’t be reported on—of course they can. With proper context and [an] understanding [of] what the data is attempting to do.”

The NSSE quantifies items such as how often students see faculty outside of class, whether they receive prompt feedback from faculty, to what extent tests reflect course material and to what extent course material challenges them.

The NSSE randomly selected first- and fourth-year students. At Queen’s, 519 first-year students and 469 fourth-year students participated.

The survey categorized 40 of the 120 questions into what the NSSE refers to as “benchmark scores,” expressed on 100-point scales.

According to a 2004 NSSE report, the survey created the benchmark scores “in an effort to make it easier for people on and off campus to talk productively about student engagement and its importance to student learning, collegiate quality, and institutional improvement.”

The five benchmarks are:

• Level of academic challenge

• Active and collaborative learning

• Student-faculty interaction

• Enriching educational experiences

• Supportive campus environment

“We’re very pleased we did as well as we did in some areas, and we’re disappointed that we didn’t do better in others,” Conway said. “I don’t want to get into this ‘Queen’s came in first on that question.’ That’s nonsense, that’s not what this is about.”

Conway said in the 2004 survey Queen’s didn’t do so well overall in the “student-faculty interaction” and “active and collaborative learning” categories when compared to American universities. “If there were ever two items that are clearly most linked to resources available, those would be them,” Conway said.

“If you look at the funding situation of the U.S. versus the Canadian institutions, they are on a per student basis funded about 50 to 70 per cent higher than we are. It’s not terribly surprising Canadian scores would be lower.”

According to the 2004 NSSE Institutional Benchmark report, Queen’s ranked in the 10th percentile for the “student-faculty interaction” and “active and collaborative learning” benchmarks.

A question within the “active and collaborative learning” benchmark asked: “How often have you asked questions/contributed to class discussion?” Twenty-five per cent of Queen’s first-year students said they never asked a question or contributed to class discussion. Ten per cent of Queen’s fourth-year students said the same. Within this question, Conway said, the differences in results across Canadian universities weren’t significant enough to be useful in any practical way.

But a significant difference could be seen when compared to American universities.

Only five per cent of first-year students in American doctoral-extensive universities said they had never asked a question or participated in class discussion. Three per cent of fourth-year students said the same.

“What’s happened? Increased class sizes in first year [and] a desperate desire to maintain the nature of the senior experience,” Conway said. “So we still have seminars in fourth year, we still have relatively small class sizes [in fourth year], we still have honours programs—but first year has taken a hit.”

“Now to the extent that this is about resources, you’re beginning to see a pattern here about what may be happening in Canada,” Conway said.

But the University’s result for “supportive campus environment” was much higher. In that overall benchmark score, the University ranked in the 90th percentile.

“If we hadn’t done well in that, we would have to throw away a lot of myths,” Conway said. “Queen’s is widely perceived to be an environment in which students interact more with one another—students are clustered fairly closely either on campus or near campus.

“There are all these things that you’d think Queen’s should do well in—well, we do, but it’s not based on resources.”

He said the NSSE doesn’t tell a single specific story about what is happening at Queen’s.

“We do very well in three areas, but there are two that we don’t,” he said.

Conway said unlike the Globe and Mail and Maclean’s university rankings, the NSSE tries to find tangible measurements that can be specifically addressed.

“The kind of information you get from looking at survey results that are much more detailed is way more useful than the wringing of hands that occurs when you say, ‘oh gosh, our class sizes are so big’ or ‘they’ve gone up x per cent in the last year,’ ” Conway said.

He said by looking at a specific question, such as what percentage of first-year students ask questions or contribute to class discussions, the NSSE provides items that can be specifically addressed in the short term and long term.

“We’ve got a very silent group, to a much greater extent than what appears in the U.S.,” Conway said. “What does that mean? Well, you’ve got to start dissecting and looking at what that says about class sizes, about teaching styles, about how well teaching style has adapted to larger class sizes, about the nature of the delivery.”

Conway said the University’s NSSE results indicate it can be a challenge for the University to attract the best students in the world—something Principal Karen Hitchcock has outlined as a priority in her vision statement, “Engaging the World”—when there’s a lack of resources.

“Certainly our financial picture [relative to] institutions outside of Canada is a big deal,” Conway said. “To the extent it appears that our NSSE results are bearing that out, are showing that funding actually makes a difference, certainly it places us at a kind of a disadvantage.”

Deane said Hitchcock is concerned with collecting information to find out what can be improved at Queen’s on the micro and macro level.

“You can’t engage the world if you are not engaging your own students,” Deane said.

AMS Academics Commissioner Pat Welsh said he’s happy to see the NSSE surveying both first-year students and fourth-year students. Often, Welsh said, a student’s problems with respect to class sizes and interacting with professors are rectified by fourth year.

“[The NSSE] draws a great magnifying glass to some problems of the first-year experience we need to acknowledge,” Welsh said.

One of the reasons the University participated in the NSSE, Conway said, was because it was getting frustrated with the manner in which external university rankings—like those from the Globe and Mail and Maclean’s—were conducted.

Deane said the NSSE provides a more helpful and more specific range of responses.

“I think [the NSSE] is very well regarded for its reliability of information,” Deane said.

Conway said although not all of the Maclean’s rankings are wrong, he lacks faith in the rankings’ reliability. “It’s just a pretty indirect, reasonably inaccurate way of getting the information about things like student-faculty interaction,” Conway said.

“If Maclean’s is to be believed, student-faculty interaction and quality of relationships with students can be adequately measured by how many courses [have] between one and 25 [students], how many courses [have] between 26 and 50 [students]—which, when you get right down to it, is nonsense.”

Conway said since the Globe and Mail bases its University Report Card solely on student satisfaction, the results ultimately boil down to how happy a student feels.

Like the Maclean’s rankings, the information from the Globe and Mail’s university rankings isn’t meaningless, Conway said.

But when a survey only asks how satisfied students are with class sizes, for example, he said the only constructive response amounts to more money.

“You have to take satisfaction surveys with a grain of salt,” Conway said. “They are limited.”

He said satisfaction surveys within an institution—like the University exit poll students complete in fourth year—are useful over a longer period of time because the results are reasonably consistent.

“It’s much less likely to provide a reliable measure of the difference between Queen’s and U of T,” Conway said. “Because who knows what the expectations are of students going into U of T, who knows what set of criteria they’re using to assess class size?”
