Funding formula attacked

Unemployed graduates could be costing Queen’s additional public funding under the provincial government’s Key Performance Indicator (KPI) operating grants.

The Ontario government recently announced it is increasing operating grants for post-secondary institutions by $72 million over last year, bringing the total budgeted to $1.72 billion.

Of the additional $72 million, $23.3 million will be tied to KPI surveys conducted by the Ontario Universities’ Application Centre and reviewed by the Ministry of Education.

The three indicators used to determine the distribution of funds are: graduate employment six months after graduation, graduate employment two years after graduation, and the university’s graduation rate.

Using these indicators, the government will evaluate each individual program at a given university. Universities that surpass the funding threshold, which is determined by the average score of all universities, in any of the three indicators are eligible for the funds; schools that score below the threshold receive no additional funding.

The Tory government hopes the KPI operating grants will reward quality programs and strengthen accountability in post-secondary institutions. There has, however, been criticism from faculty and students concerning what the indicators actually mean.

“The KPIs don’t measure performance in any meaningful sense,” stressed Henry Mandelbaum, Executive Director of the Ontario Confederation of University Faculty Associations (OCUFA).

“Province-wide indicators are convenient tools, just not good ones.”

Joel Duff, Chairperson of the Canadian Federation of Students (CFS), told The Journal the KPIs are problematic because they don’t reflect a university’s own mandate, which could lead to an inaccurate evaluation of a program’s quality.

“Is there a crisis with accountability? No. Accountability is used [as a] disguise so that the government can micromanage our public institutions,” said Duff.

The AMS echoed Duff’s concerns.

“Key Performance Indicators should not be tied to funding,” said Christopher Lee, AMS Academic Affairs Commissioner.

“While Queen’s has fared relatively well under the current indicators, there is the concern that if future indicators were to be established they may not be congruent with [Queen’s] mandate.”

The KPI operating grants have also stirred up the age-old debate over the purpose of attending a university.

“Students are going to university so that they can have a successful career afterwards,” said Dave Ross, Senior Media Relations Coordinator for the Ontario Ministry of Education.

That is why graduate employment rates are central to the KPI operating grants, Ross explained.

Some students disagree with the idea that employability is central in measuring a program’s quality.

“[The KPI] de-emphasizes the value of education... which is the most effective device for equalizing society,” Duff said.

“Our government has failed to show a competence in determining what social priorities are.”

“Certain conditions in the community [surrounding a university] that conspire against employment are not taken into account.”

Duff said tying public funding to employability indicators will eventually result in an overproduction of graduates in one area, as students will naturally migrate to programs that are better funded.

“Talk about brain-drain,” Duff said. “This will drive people out of the province. There will be no jobs left in those areas.”

The majority of Ontario universities did very well in the KPI surveys, with most schools scoring over 90 per cent in all three indicators.

“Universities did well on the criteria the government is using...[and they are still] going to punish someone,” said Mandelbaum.

Duff believes this is simply indicative of the provincial government’s mandate.

“It is the government’s agenda to cast a critical eye on all public institutions which are seen as hotbeds of waste.”

Paul Lomic, president of the Society of Professional and Graduate Students (SPGS) at Queen’s, expressed concerns about KPI operating grants.

“It is very important to evaluate university programs, but it then has to be asked, ‘what’s the best way to respond to the data?’,” said Lomic.

“A poor performing program is an argument for more funding, not less.”

Lomic pointed to PhD programs at Queen’s that have an extremely low completion rate as an example of a need for additional public funding.

“[A PhD thesis] is a big commitment, both in time and financially, and students get into them and find that they have very low funding.”

Ministry spokesman Dave Ross questioned Lomic’s claim.

“I don’t buy the argument that these programs aren’t getting adequate funding to run quality programs,” Ross said.

The CFS, OCUFA and AMS all said that instead of using province-wide indicators to determine a program’s quality, benchmarks should be established by individual post-secondary institutions to reflect their own mandates.

“While province-wide indicators are problematic, use of indicators within the institution can be beneficial,” said Lee.
