Tuesday, 19 November 2013

The growing cost of administration caused another, really quite disturbing, tension. Sadly, a culture of mistrust unfolded between academic and general staff. I have been both. Since the 1990s I have had general staff positions at every level from two to eight (for those who are unfamiliar with this remnant of public service culture, this range is from very low to quite high on a scale of one to ten) and academic staff positions that include the full spectrum of casual teaching, part-time research-only work and permanent full-time teaching-research combinations, based in academic Faculties (spanning science, social science and the humanities) and central university units. What this means is that I have not only seen but also felt the tensions from both sides of the fence.

For academics, a great deal has changed quickly and it all seems to have resulted in more work for them, but more resources for administration. Many academic colleagues from a variety of institutions have pointed out that their student numbers have increased significantly, administrative staff even more so, but the number of academics has not kept pace with student growth at all. Exact figures exist – between 2008 and 2012 at the Australian Catholic University, student numbers increased by 56 per cent, but academic staff by only 24 per cent. In that time frame, by contrast, administrative staff increased by 67 per cent. This seems compelling, but numbers, as we know, can be complicated things. These figures, for example, do not really tell us what it was like before (whether administration was severely short-handed, for example) and do not indicate any shift of administrative tasks from academics to general staff that may have taken place.

But the sentiment it expresses is a trope in the ‘Jeremiad’ literature too. The university seems to have been taken over by administrators. A loss of esteem and power is associated with this, with a corresponding gain by managers, marketing professionals and accountants. Academics have not made this up; it has happened. One symbol, a senior and very esteemed professor told me, was that when searching vainly for a car parking spot on campus he passes many spaces reverently reserved for non-academic managers. Academics, according to the literature on ‘academic capitalism’, have become plug-and-play ‘content specialists’, now seemingly ancillary to the real purpose of the university, which is to run itself. This is the ‘university of excellence’ that American scholar Bill Readings critiqued in the famous 1990s book The University in Ruins – a kind of non-university, where its purpose (in alignment with the postmodern condition) has disappeared from its core, and where it does not in fact do anything but be ‘excellent’.

The casualization of academic labour seems a symptom of this and also contributes to the tension between academics and general staff. More scholars are employed as casuals in Australia than have part-time or permanent jobs. They look at the relative security of staff who have not had the additional strain, time and financial cost of completing a PhD and feel that the university system has betrayed them. Far from a community of scholars, collectively responsible for the quality and value of university knowledge, scholars (both casual and not) feel themselves to be treated as a highly trained proletariat fulfilling every (sometimes ill-informed) desire of a set of less-qualified administrators.

While many academics deserve more respect and understanding than they feel they are currently getting, in other regards the present situation is an improvement on the past. The situation I saw, even in the 1990s, was one where demure secretaries called every academic by their full title, while the secretaries themselves were ‘girls’ known only by their first names. When I was a young twenty-something administrator, senior professors expected me not only to stop my work, no matter how urgent, to make them a coffee, but also to entertain them with (often slightly sexualised) chit-chat for as long as they wished. A more professional respect for administrative staff was a long time coming, I think.

Nevertheless, the literature recounting academics’ litany of complaints, as I’ve mentioned several times, is significant. I shall turn to the feelings of those on the other side, the now-highly professional and often substantially educated general staff. A somewhat caricatured version of their commonly voiced view of academics is that scholars are arrogant divas: always claiming that some policy should not apply to them, never wanting to do anything that every other employee in the country is required to do (like undergo performance management or use company branding), unprepared to participate in training that would make them better teachers or more sensitive to Aboriginal students, and unavailable whenever the university asks them to help with something. Then, these whining recalcitrants believe they should be running the place, even though their lack of administrative skill prevents them from filling in a basic form without assistance and they are often incapable of using equipment as simple as a photocopier. ‘Collegial’ management is chaotic, wasteful and often alarmingly self-interested. It is high time, some maintain, that universities clamped down on these arrogant academics and made them toe the line.

On the question of professional behaviour among academics, we must, somewhat reflectively, admit that they have a point. Nevertheless, we should also acknowledge that there were benefits to the structural ideal of the community of scholars in protecting university knowledge. It may have encouraged patterns of behaviour that appeared crazy to outsiders, but to many academics those patterns seem so obvious that any alternative suggestion provokes outrage.

This is why the universities are in such a mess. The structures that academics have been trained to value completely contradict the structures in which they work. It is time to consider what these structures are.

[from here I talk about three fascinating interacting economies in the university that have emerged since the 1980s…but to read those bits you'll have to wait for the book. Hahahaha *evil laugh*]

--- This is a short segment of the first draft of Chapter 7: The DVC Epidemic in the book I'm writing, Knowing Australia: a history of the modern university, for UNSW Press ---

Friday, 8 November 2013

A major target for criticism by academics has been the cost of compliance with regulatory frameworks imposed by government. This is a result of longer historical developments than scholars often suppose. Back in the old days (from the 1957 Murray review onwards) the Universities Commission decided on everything. Salaries were fixed (so the unions only negotiated with one ‘boss’, the government) and so was everything else. The Universities Commission told universities which buildings they could build, how much equipment they could purchase, which courses they could run (calculated on ‘manpower’ projections), even the precise size of the offices allowed for professorial and sub-professorial academics. In their history of Monash University, Graeme Davison and Kate Murphy tell of staff attempts to get around the Universities Commission office size specifications, so that tutorial groups could fit into staff offices. They were so creative that there remain, in certain buildings, some otherwise-unexplainable architectural oddities, a result of compliance with bureaucracy at its maddest.

This type of regulation, as stories like this one at Monash show, must often have been irksome. But it was cheap. Deregulation of many aspects of running universities gave institutions the ability to be more flexible under a range of different circumstances, but it also meant that universities needed the people and information with which to make such decisions themselves. And with deregulation, the decisions and tasks only expanded.

While the flexibility to make decisions that made sense locally must have been freeing, it also meant that, for government, monitoring universities became more important. With so much at stake, governments needed to know what universities were doing, constantly. This reflects the behaviour of what political scientists call the ‘regulatory state’. The regulatory state describes a shift in the character of governments as they responded to global economic change. Rather than governing broadly at their own discretion, using taxation to steer the economy and making decisions in an ongoing way, the regulatory state created rules and institutions within which everything (‘everything’ meaning the market) must act. In many states, a key example is the independent central bank. Kanishka Jayasuriya argues that this marks a shift in the character of organisations like universities away from being welfare-like bodies, providing an educational service, to institutions that regulate an educational market for citizens. These citizens were then defined by their capacity to participate in the market. An upside to this market-citizenship is that it led to moves to include more citizens in universities, for excluding people was also to deny them citizenship. The effect (as will be discussed in chapter ten) was widening participation in higher education, extending its benefits to more members of society.

This transformation, however, came with lots of rules. While the Reserve Bank of Australia became responsible for regulating many aspects of the economy that used to be the task of government, universities were now largely responsible for the health, safety, civility and prosperity of the nation. Rules and measures to assure that banks and universities fulfilled their hefty responsibilities were needed: and the creation and monitoring of those rules became the task of the state. Within universities, this led to changes in their structure. Someone in each institution needed to be held to account for maintaining the rules and for performing the tasks within them. These people were the DVCs. This is one of the main reasons for the DVC epidemic.

It began with DVCs (or PVCs) for research. Generic DVCs to support vice-chancellors had been in place at some of the larger institutions since the 1960s and 1970s, but now more specialised labour was needed. At Macquarie University, a PVC for research was created in 1987. The university newspaper argued that the position was now needed, in part because of the new imperative to commercialise research (more about that in the next chapter) and due to the new complexity of relations with Canberra in research funding – the ARC was in the offing. Bit by bit, beginning with the research portfolios, the universities created a top layer of DVC-types to be the face of each issue.

No longer was a DVC in fact just a deputy to the vice-chancellor. Their responsibilities began to reflect a new specialisation of labour required at the interface between universities and society, especially government. Their task within the university was to implement, monitor and assure compliance with the myriad rules applied by the regulatory state. So, exorbitant salaries aside, they seemed to become the ‘enemy’ of rank and file university staff.

---- This is a first draft of a segment of Chapter 7: The DVC Epidemic for the book I am writing, called Knowing Australia: a history of the modern university, with UNSW Press ----

Tuesday, 5 November 2013

In August 1991, British computer scientist Tim Berners-Lee published details of the idea he had been developing for some time – the World Wide Web – and created the world’s first web pages. Networked forms of communication and data sharing (known as the internet) had been in use among specialised groups of scientists, government organisations and a small number of educators for some time. But the World Wide Web changed the ways we would access and use information, redefining many aspects of our social and working life.

The web revolutionised businesses and the global economy and, before the end of the 1990s, many people in universities worldwide were persuaded it would transform them, too. Some believed it might cause the end of the university; some thought it marked a new beginning. Yet others found such revolutionary language, in either direction, an overstatement: adapting new technologies to facilitate the human interaction so central to the university’s educational mission was fundamentally just a continuation of past habits, especially among distance educators. For higher education, the web represented a possible democratisation of educational opportunity, giving access to segments of society who had never been able to enter higher education in the past. In the very same moment, however, the web was also the universities’ latest get-rich-quick scheme, representing, for some scholars, all that went wrong in universities at the end of the twentieth century.

Simultaneously expressing hope and despair, the debates around eLearning remind me of a classic essay by German scholar Walter Benjamin, ‘The Work of Art in the Age of Mechanical Reproduction’, published in 1936. Benjamin argued that when art could be endlessly reproduced, it would no longer be elite. To him, this was a good thing – art was more accessible, it could be owned and accessed by everyone. At the same time, though, art would also become commodified in a new way – it would become a consumer product for a mass market. This would change the value of the work of art in a really important way. The value of art would move away from its aesthetics, or its place in the history of artistic thinking – all the reasons art had been appreciated. Instead, its value would lie in its price, its market value.

In exactly the same way, university engagement with digital technology has significant implications, but it is not clear yet just how positive they all are. Australian scholar Gerard Goggin, author of a history of the Internet in Australia, argues:

We are in the grip of a powerful social imaginary of the university in which digital technology is a cardinal element. The history of technology and social relations, and their critiques, show us that this can be a very heady thing indeed.

This chapter explores the development of digital technologies in Australian universities from the perspective of these two powerful ‘social imaginaries’. The first represents a kind of redemption for the university, undermining some of its old habits of producing and reinforcing elite Australia, just as the photographic reproduction of art did for Walter Benjamin. The second is the way eLearning also represented a commodification of higher education, knowledge and information. It may also, just as mass-produced art was valued (by its producers) for its price, represent a shift from an ideal for university knowledge to a crude online education market. Leaping from the ivory tower, so to speak, into the waiting arms of late capitalism.

--- This is a rather rough and totally unedited segment of my first draft of Chapter 9: Knowledge in the Age of Digital Reproduction, from the book I am writing Knowing Australia: a history of the modern university, for UNSW Press. ----

Monday, 4 November 2013

The idea of studying anywhere had particular appeal for a nation as large, but as sparsely populated, as Australia. Indeed, Australia and Canada had a particular advantage in eLearning: they each had strong traditions in distance education. The distance and isolation of many communities in Australia made distance education a useful tool for economic and community development. As a result, distance education in Australia was initially more coordinated and targeted, with more substantial government oversight, than in most other countries.

The School of the Air, established at the Royal Flying Doctor Service’s Alice Springs base as early as 1951, highlighted to the public the difficulty of offering even primary school education in remote areas of Australia. The University of New England (UNE) began teaching external students soon afterwards, in order to fulfil its mission to educate regional Australia. The UK’s Open University inspired later developments, so that in the 1970s ‘dual mode’ institutions like Deakin University enrolled students in the same courses face to face and by distance, in order to offer the opportunity for higher education to students who might otherwise miss out.

As technologies changed, so did the modes of communication with distance students. Telephone calls and television broadcasts were added to written ‘correspondence’ courses, and teleconferences were conducted to allow collective student interaction. While email was used very quickly indeed, videoconferencing was less effective, for throughout the 1990s it was expensive to run and very difficult to access in any widespread way.

For distance education practitioners, eLearning was merely an extension of what they had always done: trying to find new and better ways of communicating with students and fostering interaction between them. The discipline of educational design, which grew rapidly as technological and staff development was demanded within all universities, simultaneously drew upon and ignored existing pedagogic practices in distance education. Distance educators were often frustrated by eLearning designers’ reinvention of the wheel and the language of innovation they used to pronounce their discovery of things distance educators had been doing for many years. I must admit, for five years before beginning my history PhD, I was one of those upstart eLearning practitioners. I am grateful for the patience with which my colleagues in distance education nevertheless supported my development. It was not only me: the long-term thinking and design habits of distance educators gave Australian eLearning a critical and methodological edge, particularly in the early years.

Not all eLearning was for the purpose of distance education, however, and terms to describe the use of technologies for campus-based students included ‘blended learning’ or ‘technologically enhanced learning’ and so on. In light of how rapidly campus-based technologies were adopted, within a decade of the universal rollout of learning management systems, some were already declaring the ‘e’ in eLearning redundant. Technology imbued all that we did, learning and otherwise. Campus-based and distance education each required technology, for everything did.

While distance education was not the sole function of eLearning, Peter Goodyear pointed out that it nevertheless carried a utopian promise that learning could be undertaken ‘anytime, anywhere’. The democratising discourse of distance education and the World Wide Web both compel the addition: ‘anyone’. ‘Anytime, anywhere, by anyone’ might be a little utopian, but for many members of society eLearning offered opportunities to participate in collaborative learning that they would not otherwise have had. As well as those located in remote regions, people with disabilities, single parents, full-time workers and others have enjoyed the flexibility that eLearning offered. Of course traditional distance education could have done something similar, but – well, frankly, it didn’t. eLearning made these opportunities more commonplace. Nevertheless, as Goodyear argued, the optimistic ‘can-do’ quality of eLearning and its capacity to democratise education needs some closer analysis.

--- This is a rather rough and totally unedited segment of my first draft of Chapter 9: Knowledge in the Age of Digital Reproduction, from the book I am writing Knowing Australia: a history of the modern university, for UNSW Press. ----

eLearning happened incredibly quickly. Nerdy types in universities – and one of the great things about universities is that they offer a generally supportive environment for technological experimentation – began to develop ways of using libraries and teaching online as soon as web technologies afforded it. Nevertheless, the Australasian Journal of Educational Technology shows that in the 1990s, most innovation was being poured into multimedia, with the web a more marginal issue for some years. Part of this was download speed: the ‘world wide wait’ was the web’s nickname in the mid-1990s, as cups of tea were made just waiting for basic pages to load over spaceship-looking modems that also inconveniently tied up the phone line. Delivering interactive course materials by CD or even floppy disk was often a better option – and there was no guarantee that students could access the web anyway.

While there were some frustrations as the technology emerged, the web came with some really sexy ideas. Hypertext was one. The idea that information could be organised in such a way that people could create their own pathway through it began to disrupt the idea of the author’s singular and linear authority. Roland Barthes’ 1968 declaration of ‘The Death of the Author’ now had a technology to assist with the author’s ongoing demise. Putting power into the hands of the ‘user’, the ‘reader’ or even the ‘consumer’ became a key element of the fantasy of the World Wide Web. These ideas resonated with some of the pedagogies promoted by the likes of Paulo Freire in the 1960s and 1970s that sought a relocation of power, and even knowledge, from teacher to student. Use of technology to support student learning was therefore somehow intrinsically ‘student centred’, a phrase meant to describe a revolutionary pedagogy but which soon became a type of university dogma deployed for enhancing profit – we will get to that later.

In 1996, the landscape changed again when Canadian Murray Goldberg presented WebCT to fellow educational technologists. With WebCT, Goldberg quickly proved there was a market for mass distribution of what became known as a ‘learning management system’. It was so successful that within four years of that presentation, WebCT was being used by 6 million students in 57 countries. Blackboard Inc. was established in 1997 and showed a similar rate of growth. Those two companies later displayed such predatory behaviour that they bought out nearly every competitor, until Blackboard eventually acquired WebCT itself, forming a single eLearning monolith.

It was truly a heady time for the World Wide Web, with investment in anything ending in .com or beginning with e- escalating at such a pace that companies could vastly increase their stock market value simply by adding a prefix or suffix. The total value of the stock market grew at such ridiculous rates that speculators soon identified it not as the revolution in business that it would eventually prove to be, but as an empty bubble – which burst in 2000, just before the internet really came into its own. Several of the dot-coms, like Amazon, recovered as the slower process of embedded change proceeded, but a good deal was lost in the rush.

The idea that hypertext embodied – that authority over the route through knowledge could now be moved away from central authoritarian experts – extended at around this time beyond the reader’s agency. Now, it was hoped, the web would also be ‘user-generated’ – the reader would become the author. The ‘wiki’, a simple technology that allowed multiple users to produce content online, became increasingly popular. The format (and the idea of shared online authority) was immortalised in Wikipedia, established in 2001.

The soft-anarchist idea of redistributing authority over knowledge, news and information also disrupted the notion that anyone ‘owned’ knowledge at all, a threat aimed in particular at media empires such as Rupert Murdoch’s, but also seen as a disruption to recent global moves in the ‘knowledge economy’ to commercialise education. Movements that opposed traditional constructions of intellectual property and copyright were attached to some sophisticated theoretical discussions. These included the suggestion that sharing knowledge, making it more open, would foster a new kind of wealth for the world: a position advocated by Yochai Benkler in The Wealth of Networks, published in 2006. An earlier version of this idea was expressed in Lawrence Lessig’s organisation, Creative Commons, which in 2002 released a legal structure by which knowledge and media could be shared. The ‘mashup’, creating something new from a range of existing works, was a politically subversive act in opposition to big media’s alliance with nation states to control copyright – though the mashup is now arguably common practice for nearly every university teacher preparing lectures.
The suggestion that the web offered a structure for knowledge that was fundamentally different to older forms of publishing encouraged software developers to think of ways of automating and facilitating user generation of material. Web 2.0 (where the point-zero was pronounced) was the trendy term of the moment, coined in 1999 but still in use a decade later. This form of interaction led to the development of social networking tools, such as Facebook, launched in 2004, which set a new standard for the purpose and nature of the web.

For university policy makers, it was a confusing set of issues to navigate. In the new environment, where a growing proportion of university income was from course fees (so that some courses were in open competition with those offered at other universities), should universities go to extra lengths to protect their course material? Or did the new environment give universities a moral imperative to become more open and make their courseware available freely? While most universities initially took the first route, Massachusetts Institute of Technology (MIT) very publicly took the second. From 2002, MIT progressively made their courses available. They started with 50 courses and by 2012 had uploaded more than two thousand.

The consequences were a little like the avalanche that was caused when universities began to advertise to attract students. An online presence was no longer sufficient for the rapidly growing number of universities. Many found that they now needed to be able to profile themselves through what they made available online. In 2007, Apple’s online music store, iTunes, began iTunes U, inviting universities to make use of its popular platform to showcase what they do. George Siemens, Stephen Downes and Dave Cormier experimented with carefully designed, widely available mass courses in 2008; in 2011, Stanford University offered an introductory Artificial Intelligence course to more than one hundred thousand registered participants for free, twenty thousand of whom completed it. In 2012, Coursera was established, franchising this concept: Massive Open Online Courses or, in acronym-loving nerd speak, MOOCs. In MOOCs, thousands of students were able to choose (so advocates claimed) the best material by the world’s experts, from anywhere in the world.

--- This is a segment of my first draft of Chapter 9: Knowledge in the Age of Digital Reproduction, from the book I am writing Knowing Australia: a history of the modern university, for UNSW Press. ----

"[There is an] ideological force driving the assault on the independence of universities in the (broadly conceived) West. This assault commenced in the 1980s as a reaction to what universities were doing in the 1960s and 1970s, namely, encouraging masses of young people in the view that there was something badly wrong with the way the world was being run and supplying them with the intellectual fodder for a critique of Western civilisation as a whole.

The campaign to rid the academy of what was variously diagnosed as a leftist or anarchist or anti-rational or anti-civilisational malaise has continued without let-up for decades, and has succeeded to such an extent that to conceive of universities any more as seedbeds of agitation and dissent would be laughable.

The response of the political class to the university's claim to a special status in relation to the polity has been crude but effectual: if the university, which, when the chips are down, is simply one among many players competing for public funds, really believes in the lofty ideals it proclaims, then it must show it is prepared to starve for its beliefs. I know of no case in which a university has taken up the challenge.

The fact is that the record of universities, over the past 30 years, in defending themselves against pressure from the state has not been a proud one. Resistance was weak and ill organised; routed, the professors beat a retreat to their dugouts, from where they have done little besides launching the intermittent satirical barb against the managerial newspeak they are perforce having to acquire."

At least now there is one woman considered to be an exceptional highlight.

Next we need a line-up and a theme that will do more than speak to the fantasies of a handful of politicians and vice-chancellors and actually confront the really significant challenges higher education has, perhaps with a view to lobbying on its behalf instead of telling politicians just how much higher education supports them. I am not currently hopeful.

About Me

I have kept this blog since 2008. In that time I completed a PhD in the history department at the University of Sydney called “The Ownership of Knowledge in Higher Education in Australia, 1939-1996” and have begun new work. This blog recounts my pathway through my research and thinking, which has come to include work on social inclusion, historiography, labour history and the history of knowledge. Hopefully it goes without saying that anything here is a draft. It is a blog, not a book. Lots of times I could be wrong - if I am, please tell me.