Category Archives: Quality

One of the main struggles with measuring performance in higher education – whether of departments, faculties, or institutions – is how to measure the quality of teaching.

Teaching does not go entirely unmeasured in higher education. Students rate individual courses through end-of-semester course evaluation surveys, and the results do have some bearing on hiring, pay, and promotion (though how much bearing varies significantly from place to place). But these data are never aggregated to allow comparisons of quality of instruction across departments or institutions. That’s partly because faculty unions are wary of using individual professors’ performance data as an input for anything other than pay and promotion decisions, but it also suits the interests of research-intensive universities, which do not wish to see the creation of a metric that would put them at a disadvantage vis-à-vis their less research-intensive brethren (which is also why course evaluations differ from one institution to the next).

Some people try to get around the comparability issue by asking students about teaching generally at their institution. In European rankings (and Canada’s old Globe and Mail rankings), many of which have a survey component, students are simply asked questions about the quality of the courses they are in. This gets around the issue of using course evaluation data, but it doesn’t address a more fundamental problem: a large proportion of academic staff essentially believes the whole process is inherently flawed because students are incapable of recognizing quality teaching when they see it. There is a grain of truth here: it has been established, for instance, that teachers who grade more leniently tend to get better course-satisfaction scores. But this is hardly a lethal argument. Just control for average class grade before reporting the score.
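To make the “just control for average class grade” point concrete, here is a minimal sketch of one way such an adjustment could work: regress satisfaction scores on average class grade and report grade-adjusted scores. The data and the specific adjustment method are invented for illustration; they are not drawn from any real evaluation system.

```python
# Toy sketch: adjust course-satisfaction scores for grading leniency by
# removing the linear effect of average class grade (simple OLS).
# All numbers below are made up for illustration.

def leniency_adjusted_scores(scores, avg_grades):
    """Return scores with the linear effect of class grade removed."""
    n = len(scores)
    mean_s = sum(scores) / n
    mean_g = sum(avg_grades) / n
    # Ordinary least-squares slope of score on average grade
    cov = sum((g - mean_g) * (s - mean_s) for g, s in zip(avg_grades, scores))
    var = sum((g - mean_g) ** 2 for g in avg_grades)
    slope = cov / var if var else 0.0
    # Remove the grade effect, keeping scores centred on the overall mean
    return [s - slope * (g - mean_g) for s, g in zip(scores, avg_grades)]

# Two hypothetical instructors, identical teaching, different grading:
scores = [4.6, 4.0]    # raw satisfaction (out of 5)
grades = [85.0, 70.0]  # average class grade (%)
adjusted = leniency_adjusted_scores(scores, grades)
# After adjustment, both instructors land on the same score
```

With only two points the adjustment is trivially exact; the same logic applied across many courses would shrink, rather than eliminate, the leniency gap.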

It’s not as though there isn’t a broad consensus on what makes for good teaching. Is the teacher clear about goals and expectations? Does he or she communicate ideas effectively? Is he or she available to students when needed? Are students challenged to learn new material and apply this knowledge effectively? Ask students those kinds of questions and you can get valid, comparable responses. The results are more complicated to report than a simple satisfaction score, sure – but it’s not impossible to do so. And because of that, it’s worth doing.

And even the simple questions like “was this a good course” might be more indicative than we think. The typical push-back is “but you can’t really judge effectiveness until years later”. Well, OK – let’s test a proposition. Why not just ask students about a course they took a few years ago, and compare it with the answers they gave in a course evaluation at the time? If they’re completely different, we can indeed start ignoring satisfaction types of questions. But we might find that a good result today is in fact a pretty good proxy for results in a few years, and therefore we would be perfectly justified in using it as a measure of teaching quality.
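The proposed test above is simple to state statistically: correlate the rating students gave a course at the time with the rating they give it in retrospect. The sketch below uses invented numbers purely to show the shape of the comparison; nothing here comes from real survey data.

```python
# Toy sketch of the proposed test: compare end-of-term course ratings
# with retrospective ratings of the same courses years later.
# All ratings are invented for illustration.

def pearson(xs, ys):
    """Pearson correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

at_the_time = [4.5, 3.0, 4.0, 2.5, 5.0, 3.5]  # end-of-term ratings
years_later = [4.0, 3.5, 4.5, 2.0, 4.5, 3.0]  # retrospective ratings
r = pearson(at_the_time, years_later)
# A high r would suggest end-of-term scores are a usable proxy;
# an r near zero would vindicate the sceptics.
```

Python 3.10+ ships `statistics.correlation` for exactly this; the hand-rolled version just keeps the arithmetic visible.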

Students may be inexperienced, but they’re not dumb. We should keep that in mind when dismissing the results of teaching quality surveys.

Well, the first thing to note is that the system hasn’t finished growing. While the major metropolitan areas of the north and centre have PSE attendance rates that approach Canada’s, outside those very small areas the average is less than half that. Even in fairly prosperous coastal provinces like Zhejiang and Guangdong, participation rates are less than half of what they are in the rich metropolitan zones. Now, some of the growth in participation rates will be taken care of by demography: China’s 20-24 age cohort will shrink by about a third between 2011 and 2016 thanks to the one-child policy (demographically, think of China as just a really big Cape Breton), so participation rates will rise significantly even if enrolments simply hold steady. But it’s still a safe bet that in most of the country we can expect more universities to open, and existing second-tier institutions to be expanded.
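The demographic arithmetic in the paragraph above is worth making explicit: a participation rate is just enrolment divided by cohort size, so a cohort that shrinks by a third raises the rate by half even with zero enrolment growth. The absolute figures below are hypothetical, chosen only to illustrate the mechanism.

```python
# Back-of-envelope illustration: hold enrolment flat while the 20-24
# cohort shrinks by a third, and the participation rate rises by 50%.
# Cohort and enrolment figures are purely illustrative.

def participation_rate(enrolment, cohort):
    return enrolment / cohort

cohort_2011 = 120_000_000          # hypothetical 20-24 population
cohort_2016 = cohort_2011 * 2 / 3  # a third smaller
enrolment = 30_000_000             # held flat across the period

rate_2011 = participation_rate(enrolment, cohort_2011)  # 0.25
rate_2016 = participation_rate(enrolment, cohort_2016)  # 0.375
# rate_2016 / rate_2011 == 1.5: a 50% jump with no new capacity at all
```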

Participation rates by Region, 2008

So with demand for education set to rise, and the country still struggling to absorb all the graduates from the past few years, I suspect the bigger issue going forward for China is going to be quality. China’s worked out how to expand its system. What it hasn’t quite worked out yet is how to spread excellence beyond its top research schools (the Chinese equivalent of the Ivy League is the C-9: Peking, Tsinghua, Fudan, Zhejiang, Nanjing, Harbin Tech, University of Science and Technology of China, and the Jiao Tongs at Shanghai and Xi’an).

And even at these schools, some of the excellence is only skin deep. They might be able to butt into world league tables on publication counts – by doing things like requiring all graduate students at 985 universities to get two publications in Thomson ISI-indexed journals (seriously… can you imagine doing that here? The system would totally collapse) – but those articles’ citation counts are much lower than at large western universities, which suggests the rest of the scientific world doesn’t think they’re up to much. Changing that means changing academic cultures, some of which have become sclerotic and corrupt (this stinging editorial in Science magazine [link is to an ungated copy] by Shi Yigong and Rao Yi, two Chinese academics who returned home from academic careers in the US, is one of many pieces of evidence that could be cited here). As we know here in North America, there is very little that is harder than changing academic cultures. If China works out how to fix that problem, then there’s genuinely no reason it won’t lead the world at pretty much everything. But I have my doubts.

The outlook in China then is pretty simple – mostly continuations of recent trends, with a greater emphasis on quality and employability. And until they get that sorted out, there will continue to be opportunities for western institutions seeking to poach those who can’t get into 985 universities.

Everyone should check out this story from the Guardian on Tuesday, which nicely encapsulates the way universities have rhetorically boxed themselves in on the student experience.

Some background: in late 2010, the UK government decided to cut operating grants to universities by 41%, and to allow tuition fees at universities in England and Wales to rise to £9,000 (+150% or so). Even though the policy change hasn’t had a huge effect on access, students are clearly now paying a lot more for essentially the same experience. Last week, one student consumer group published a report showing exactly how little has changed: despite the massive fee rise, students are only getting an extra 18 minutes a week of contact time with professors.
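The numbers in that paragraph imply a striking price for those 18 extra minutes. The sketch below works it through, under two stated assumptions of mine: the pre-reform fee is inferred from the “+150% or so” figure, and a 30-week teaching year is an illustrative guess, not a figure from the report.

```python
# Rough arithmetic on the figures above. Assumptions (mine, not the
# report's): the old fee is backed out of the "+150%" rise, and the
# teaching year is taken to be 30 weeks for illustration.

new_fee, rise = 9_000, 1.50
old_fee = new_fee / (1 + rise)     # implied pre-reform fee: £3,600
extra_fee = new_fee - old_fee      # £5,400 more per year

extra_minutes = 18 * 30            # 18 min/week over a 30-week year
cost_per_extra_minute = extra_fee / extra_minutes  # £10 per minute
```

Whatever teaching-year length one assumes, the implied marginal price of contact time is eye-watering, which is presumably why the consumer group led with it.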

This brings us to the story I’ve linked to at the beginning of this post, in which Universities UK CEO, Nicola Dandridge, dismissed the findings about the 18 extra minutes by saying, “It is misleading to make a crude assumption that time spent in lectures and seminars can be equated with university course quality”.

Hmm.

Hmmmmmm.

I get the point Dandridge is trying to make – it’s not just course hours that matter, but also teacher quality, infrastructure, curriculum, etc. But would she have said the same thing if the journalist had been asking about MOOCs vs. traditional universities? Not in a million years. Unless you’re employed by a university that owns shares in EdX, the standard response is that “MOOCs are a great product for some, but most students still want the irreplaceable experience of being in a classroom with a great teacher”.

Universities can’t simultaneously say, “the in-class experience is brilliant and irreplaceable”, and “it really doesn’t matter how many contact hours you have”. That’s called “trying to have it both ways”.

It’s absolutely true, of course, that the way learning occurs at universities is only partly dependent on what happens in classrooms; a lot of the benefits of university learning come from the serendipity that occurs when you cram lots of young, curious people into the same physical space for four years, and let them rip. And MOOCs are low on serendipity.

The problem is that we don’t know much about measuring serendipity (which is why we fall back on measures like class time and contact hours), and where we do have an inkling, universities often avoid presenting this information (at least in a format accessible to outsiders). And yet, when it comes to undergraduate education, this very serendipity provides universities their genuinely unique value proposition – it’s what they can do that no one else does.

If universities genuinely want to prove value, they need to focus on measuring serendipity, and working relentlessly to increase it. That’s their UVP. That’s the ballgame.

This is a key question confronting every university administration. Our PSE institutions are businesses – complex organizations that require enormous amounts of money, from diverse sources, in order to succeed. For many reasons, it is a blessing that they are not oriented towards profit. But without a clear bottom line, how do you actually know when to spend, and when not to spend? What replaces the discipline of the market, come budget time?

At one level, the answer is the same as it is for every other non-profit: priorities are governed partly by mission, and partly by the availability of funds. But universities’ mandates are unbelievably broad – almost limitless, actually. Take “meeting students’ needs”, which is a common enough mandate. What could that encompass? Better seating in the cafeteria? A new gym, or hockey arena? What about an international airport? Similarly, consider a university’s mission to, “advance the frontiers of knowledge”. Does that just entail upgrading computer labs, or could it also include, say, purchasing a space flight simulator? What about an actual space station?

Here’s the problem. The only bar universities set for themselves is “quality”, and virtually anything can be justified in the name of “quality”, so long as the money is available to pay for it. So if you give universities money – any amount of money at all – they will spend it, because they have no instinct to tell themselves not to. The polite term for this is “the revenue theory of expenditures”. However, another way to think about this is to consider that universities – like beagles and hamsters – lack a brain mechanism that tells them to stop eating.

Basically, universities are good at growing, but terrible at shrinking. The answer to every problem is always “more”. If more comes from government, great. If it comes from alumni or business, great. If it comes from students, great. Universities are agnostic about sources of money, but fundamentalist about aggregate incomes.

But we seem to be entering a phase in which governments aren’t in a position to provide more money, and students and parents seem more reluctant than usual to pay or donate more. And if more revenue is out of the question, then diets are the order of the day. How will universities react? Nick Rowe has painted one particularly nasty, but plausible, scenario, which bears pondering. His specifics are a bit different than mine would be, but the basic point – that universities on a diet are incredibly difficult to manage – deserves a lot more attention than it currently receives.

University income has been flying upward for most of the last decade; when these beagles land, it will be with a heck of a thud.

I see that a number of foundations – including the Knight, McCormick, and Scripps Howard foundations – have written an open letter to American university presidents, urging them to make journalism schools “more like medical schools” by teaching students through immersion in “clinical, hands-on, real-life experience”. From a historical perspective, this is a deeply weird development.

Foundations have played a significant role in changing the course of professional education on a couple of occasions. In 1910, the American Medical Association and the Carnegie Foundation teamed up to pay Abraham Flexner to report on the state of medical education in North America. His finding – that medical schools were too vocational and insufficiently grounded in scientific disciplines such as biology – was a key development in the history of medicine. It was only after Flexner that university-based medical schools decisively ousted the proprietary medical schools as the primary locus of training future doctors, and turned the medical profession into one which mixed practice with research.

Forty-five years later, widespread dissatisfaction with American business schools led the Carnegie and Ford Foundations to instigate reports and programs designed to transform business schools into more research-oriented units with intimate links to various branches of the social sciences such as sociology, anthropology and economics. These had substantial short- and medium-term impacts, if more ambiguous long-term ones.

In both cases professional education was to be improved by making it more “disciplinary” and more concerned with “fundamental knowledge”. It’s therefore more than passing strange that Foundations are now telling universities to make their J-schools less concerned with fundamental knowledge and more concerned with day-to-day experience.

Partly, it’s a difference in the nature of the Foundations involved (Carnegie and Ford were considerably more removed from the worlds of medicine and business than the Scripps Howard Foundation is from journalism). Part of it, too, might be the nature of journalism itself; its practitioners may simply not need fundamental knowledge in order to be effective in the way doctors need biology and business-folks need econ/finance. (By extension, maybe journalism shouldn’t be taught in a university setting at all.)

What is unmistakable, though – and more than slightly worrying – is the flat-out threatening tone the Foundations take in their letter, telling presidents that institutions which don’t get with the program “will find it difficult to raise funds from Foundations concerned with the future of news”. Quite apart from being classless, this kind of pronouncement could use a touch more humility about proclaiming any given educational model as the One True Way.

It’s a big day at HESA: it’s release day for our final report on the Consultation on the Expansion of Degree-granting in Saskatchewan, which we’ve been working on for a few months (available here). I won’t rehash the whole document here, but I would like to talk about one of the key themes of the report: trust.

If you issue degrees, people need to be able to trust that the degree means something. In particular, students need to know that a degree from a given institution will be seen as a mark of quality by employers; otherwise, the degree is worthless. Worldwide, the function of quality assurance agencies – third-parties giving seals of approval either to individual programs or to institutions generally (either by looking directly at quality or by looking at an institution’s internal quality control mechanisms) – is to assure the public that degrees are trustworthy.

In Canada, many people have looked askance at these bodies, seeing them as unnecessary bureaucratic intrusions. “We never needed these before,” people grumble. “Why do we need them now?”

To an extent, the grumblers have a point. Trust is usually earned through relationships. People in, say, Fredericton, trust UNB not because some agency tells them to, but because it’s been granting degrees for going on 200 years now; they’ve seen the graduates and can gauge the quality for themselves. This is true of most universities in Canada: they’re old, solid, and hardly fly-by-night, and people know who they are. And since there tend not to be more than four in any given urban area, pretty much everyone knows someone who went to school “X”, and can thus gauge an institution’s quality directly.

But what happens when you let new players, like private universities or community colleges, into the degree-granting game? What happens when universities start having to look abroad for students? How can employers in Canada trust new players? How can employers in Turkey or Vietnam trust any Canadian university they’ve never heard of?

Canada was able to get away without quality assurance for so long mainly because our system of giving a relatively small number of large public universities a monopoly over degree-granting was well-suited to engendering trust – especially when 90% of their students were local. But open up degree-provision, or widen the scope of your student base, and suddenly trust isn’t automatic anymore. You need a third-party to give a seal of approval to replace the trust that used to come naturally.

Quality assurance isn’t anyone’s idea of fun. But it isn’t the frivolous, make-work bureaucracy the grumblers make it out to be, either. Rather, it’s a rational response to changing patterns in the provision and consumption of higher education.