USAT revision process has a long way to go

Queen’s broken instructor evaluation system up for review

The USAT process allows students the chance to evaluate their professors.
Illustration: Amelia Rankine

With the end of the semester in sight, students and faculty alike are preparing for the annual 15 minutes of class time where the students do the evaluating. 

As a means of evaluating teaching practices at Queen’s, the Queen’s University Faculty Association (QUFA) collective agreement has mandated the use of University Surveys of Student Assessment of Teaching (USATs) since 1995.

While these student assessments are mandatory for instructors with class sizes greater than 10 students, the changes implemented as a result of these evaluations have been difficult to quantify. 

According to the Office of the University Registrar’s website, each USAT evaluation consists of four university-wide questions, up to seven selected by the department and as many as 10 selected by the instructor from a bank of 200 possible questions. 

Once completed, the surveys are returned to the instructor so they can read the handwritten comments. The surveys are then scanned and the data compiled into a straightforward graphical report for department heads or the dean of the faculty. 

In 2015, then-QUFA president Lynne Hanson signed a Memorandum of Agreement (MOA) with the University to review teaching assessments by appointing a Joint Committee on the Administration of the Agreement (JCAA). 

The deadline for the review detailing JCAA’s recommendations was originally set for June 30, 2016. 

According to QUFA, the JCAA was granted an extension until 2017, as the revision task has proven much more time-consuming than originally anticipated. 

Historically, students and faculty have raised concerns around the tangible change brought about through the use of USATs. In March 2016, a Journal editorial said the revision of the process was a welcome step only if the results of USATs were taken seriously. 

A 2016 Journal article delved into the lengthy and controversial dismissal of Queen’s professor Morteza Shirkhanzadeh, due in part to his concerning USAT results.

Shirkhanzadeh’s three-day suspension from Queen’s University that year was attributed in part to his refusal to meet with the dean after his USAT results were reviewed. 

Shirkhanzadeh’s USAT evaluations were passed on to higher administration by the dean even though the response rate for one of the questions was 16 per cent — much too low for statistical reliability. To date, there’s no faculty policy in place requiring instructors to meet with the dean under such circumstances. 

The incident raised questions about bias in USAT assessments and the barriers department heads, instructors and students face in addressing it.

“Students themselves show a lot of biases when it comes to a variety of aspects of a professor. We don’t want to ask misleading questions,” AMS Academic Affairs Commissioner Victoria Lewarne told The Journal in an interview. 

According to Lewarne, revision of the current instructor evaluation system has been held back entirely by time constraints. AMS representatives, faculty members and representatives from QUFA’s JCAA have been sorting through a large pool of possible USAT questions, altering them to best suit the academic climate at Queen’s.

The questions were taken from evaluation systems at universities across Canada such as Western, Dalhousie and the University of Alberta. 

Recently, nine final questions were chosen. “There were a lot of questions and we wanted to find the ones that had the best value,” Lewarne said. 

The next proposed steps are to run pilot projects with the final drafted list of appropriate questions. These projects will go to classrooms from all faculties to determine if the questions chosen accurately represent both the instructor and course. 

“Looking forward we’re looking into those deeper questions. For example, what system should we be using? Should we shift away from paper and into more online [forums]?” Lewarne said. 

While the extension has lengthened the revision process, the changes are expected to appear in the next academic year.

In an email to The Journal, QUFA’s president and media spokesperson Kayll Lake and QUFA’s JCAA co-chair Michael White said they’re working together to draft an alternate student survey. Conducted through an online form, it could be tested as soon as 2018.

“At the moment we’re looking at asking better questions. There’s nothing ‘standing in the way.’ People are working very hard to move forward on what is a very important and complex task,” White and Lake continued.  

Meanwhile, faculty attitudes towards USATs indicate the system is broken.

Dr. Michael Mombourquette, a CHEM 112 professor for the last 15 years, said USATs are “extremely inaccurate.” 

“I think they’re influenced by things that aren’t related to a professor’s teaching,” he said. “The ratings are not reflective of the dedication or work the professor has put in.”

According to Mombourquette, his ratings used to be among the highest in the department. He said this changed when he began sharing the course with another, more personable, professor. “That year my rating went from way above average to literally the worst in the department.”

“I was in tears when I read those USATs. The negativity was overwhelming,” Mombourquette said. Despite the dramatic differences in his USAT scores between those years, he said his teaching methods had remained almost exactly the same. 

“I was seriously wondering whether I was cut out to be a professor. I had been teaching for 15 years at that point. I still wondered: What the hell happened?” Mombourquette said. 

One little-known aspect of the USAT evaluation process is the effect the ranking has on a professor’s career and salary. According to Mombourquette, department heads assign professors a ranking from 1-20 based on perceived teaching ability, among other factors. 

“My salary has been hurt by USATs. I don’t think a student’s personal opinion should have an impact on [that] like it does. It’s wrong,” Mombourquette said, citing his average rating as a barrier to any possible promotion.

Mombourquette also stressed that factors like where a student went to high school and what level of difficulty they’re used to tend to impact scores. “It’s pretty obvious that some students are well prepared and some have no idea what they’re doing. You’re going to get a different review from both of those students as well,” he said.

While the JCAA has received an extension until 2017, its plan to simply re-evaluate the questions asked may address only one small part of a larger issue. 

“The way it is right now, the questions have limited usefulness. [The revision] could be a good thing,” Rob Campbell, an upper-year biochemistry professor, said in an interview with The Journal. 

“As of right now, we’re only able to respond [to USAT comments] after the course is done.”

Campbell said that, ideally, USATs would be an accurate reflection of instructors’ teaching abilities. But the system has a long way to go before all involved parties are represented in a way that’s conducive to change.

“Students should know that we really do look at them. And those that care about their teaching do pay attention to them,” Campbell said. 
