In twenty years' time, far fewer school leavers will be going to college, simply because there will be so many more options available to them. Many of these options will be based on technology and the Internet and will be attractive for many reasons, not least that they will be much less expensive for the school leaver. The strongest economies in Europe already have a balanced academic/vocational delivery model, and England is following with an apprenticeship levy. This rebalancing of the academic and the vocational is happening worldwide, including in Ireland. As well as the increased availability of apprenticeship courses in many more fields, colleges will deliver more of their own courses in an apprenticeship-style or work-based learning mode, and these will include a significant amount of online learning. They will reduce costs for learners by allowing them to live at home for longer and to earn while they complete their courses.

Much more, if not virtually all, university education will become available in distance-learning modes over the Internet, allowing school leavers to go straight into jobs and gain their qualifications more gradually, even taking more time to select the right course. In addition to learners cutting the cost of their education by working and living at home, fees will decrease simply because the large numbers of people simultaneously taking these courses will reduce unit costs. You can, right now, get a four-year degree in Computer Science from the University of the People, an accredited online university, for $4,000. Indeed, the claim that students need personal interaction with their lecturers and peers will be countered by the claims from employers that full-time students are not prepared for the workforce, and that those who work and learn at the same time learn more efficiently and are much better prepared for the world of work. To top it all, employers may become less interested in degrees.
Even now, major employers are announcing that they no longer require degrees from recruits where they once did. This is because learners now have many more ways, including free courses on the Internet, to gain knowledge and competencies, and employers are becoming better at assessing these competencies before hiring.

In March 2016, the Expert Group on Future Funding for Higher Education released the report "Investing in National Ambition: A Strategy for Funding Higher Education", also known as the "Cassells Report" after Peter Cassells, who chaired the group. As an invited participant at one of the consultative meetings informing the report, I was intrigued that, as we discussed possible funding mechanisms for higher education, nobody seemed interested in discussing the possibility of reducing the cost of providing it. To some extent, I accepted that this was probably beyond the brief of the expert group. However, the report that emerged did briefly mention the suggestion that information technology might reduce the cost of higher education, but quickly dismissed the idea, suggesting that it is not a "quick fix". This was despite a statement in the executive summary that the purpose of the report was to "consider issues related to long-term sustainable funding of higher education".

Now, rather than just disagree with the statement that technology cannot be an immediate solution to the funding problem (which is arguably false), I would like to suggest that the report is significantly flawed in addressing its objectives by not considering how costs can be reduced, in the medium term if not the short term, particularly through the use of information technology. Information technology is revolutionising almost all information and communication businesses, so why should it not have a significant impact on higher education? George Bernard Shaw once suggested that all professions are "conspiracies against the laity". Now, I don't want to accuse the higher education community (of which I am a member and in which I have many admirable colleagues) of conspiracy, but there is a natural tendency to support the system you are part of, even when times have changed and there may be less need for it. So it is not surprising that if you gather together a consultative group of higher education professionals, they will tend to tell you they need more rather than less money.

However, there may be another, less self-serving, explanation for the seeming lack of interest in cost-cutting. Perhaps the expert group did not have the full range of expertise required to consider this option. One clue that this might be the case is a statement in the report that the recent reduction in funding for higher education is leaving less time to "accommodate diverse learning styles". It has been well known in the educational research community for some time that "learning styles" do not exist as such, and the idea that improved learning outcomes can be achieved by catering for diverse "learning preferences" is essentially a myth. The issue here is not that the report might advocate an expensive, ineffective teaching method, but that the slipping of such a well-known educational myth into the report suggests that the group might not have had the full range of expertise it required.

As it happens, my personal opinion is that the group has recommended the best of the three funding options available to it, namely student loans. However, because public institutions do seem to have an insatiable appetite for funding, it could still end badly if no attempt is made to reduce unit costs. In the U.S. there are large numbers of people who cannot afford to repay their loans, either because they never completed their studies or because the qualifications they earned have not significantly improved their earning capacity. This is a significant cause of personal suffering for many and is also considered a significant risk to the economy as a whole. For that reason, even if we do improve access to student loans, it is important to try to reduce costs in order to minimise the level of debt incurred by students.

Although I have suggested that a major mechanism for reducing costs in the future may involve having more students studying off campus in distance-learning modes while working, many of the same principles can be used immediately to reduce on-campus costs. Campus students can be given access to online modules designed for distance learners, or to many of the free online courses on the web. Online modules can be created specifically to be shared by several campuses or colleges. The technology exists now to create such modules quite cheaply, for little more than the cost of delivering to a local group of students. If used in a "flipped classroom" mode, where students interact with course materials online before attending tutorials or workshops on campus, this can not only reduce unit costs but also improve the learning experience of the students. Such shared online courses will necessarily enrol large numbers of students in order to gain economies of scale, and so may need a certain amount of more personalised tuition added. However, even that can be made more efficient with tools that are available now. Computerised quizzes can provide feedback to learners on their progress and can provide timely information to lecturers on who most needs their help, allowing them to use their time more effectively. The use of computer-aided peer assessment systems, as well as rubric-based assignment grading, often with standardised feedback, allows lecturers to get important, timely feedback to large groups of students with much less effort.
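To illustrate how quiz data might direct a lecturer's limited tutorial time, here is a minimal sketch in Python. The student names, scores and thresholds are all invented for the example; this is not taken from any particular learning platform:

```python
# Minimal sketch: flag the students who most need a lecturer's attention,
# based on their scores in a sequence of online quizzes.
# All names and thresholds are illustrative, not from any real system.

def students_needing_help(quiz_scores, pass_mark=0.5, decline=0.2):
    """quiz_scores maps a student name to a list of scores (0.0 to 1.0),
    oldest first. A student is flagged if their latest score is below
    pass_mark, or if it has dropped by more than 'decline' since their
    first quiz."""
    flagged = []
    for student, scores in quiz_scores.items():
        if not scores:
            continue  # no quiz attempts yet; nothing to judge
        latest = scores[-1]
        if latest < pass_mark or (scores[0] - latest) > decline:
            flagged.append((student, latest))
    # Worst-performing students first, so tutor time goes where it matters most.
    return sorted(flagged, key=lambda pair: pair[1])

scores = {
    "aoife": [0.9, 0.85, 0.88],   # doing fine
    "brian": [0.8, 0.6, 0.45],    # declining and now below the pass mark
    "ciara": [0.4, 0.35, 0.3],    # consistently struggling
}
print(students_needing_help(scores))  # [('ciara', 0.3), ('brian', 0.45)]
```

A real system would, of course, use richer signals (time on task, forum activity), but even this simple rule turns quiz logs into an ordered list of who to contact first.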

Even more sophisticated tools based on artificial intelligence are now emerging that will allow lecturers to provide a higher-quality learning experience to larger numbers of students. Adaptive systems using deep learning techniques can analyse the behaviour and performance of the large numbers of students using the system to determine what learning materials to present to individual students next, as well as when to do so. Recently, in an experiment at Georgia Tech, a "chatbot" similar to those in computerised help systems was added to the pool of human teaching assistants and was so successful that one student nominated it for a teaching award. Interestingly, some other students spotted that it was a chatbot because it responded so quickly to queries.
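The core idea of such adaptive sequencing can be shown with a much simpler rule than deep learning. The sketch below (hand-written thresholds and invented names, nothing to do with the Georgia Tech system) chooses a student's next activity from their recent results on the current topic:

```python
# Minimal sketch of adaptive sequencing: decide what a student should see
# next from their recent results on the current topic. A real adaptive
# system would learn its policy from data on many students; this fixed
# rule only illustrates the idea.

def next_activity(topic, recent_results, mastery=0.8, struggling=0.4):
    """recent_results is a list of 1/0 outcomes (correct/incorrect) on
    the current topic, most recent last."""
    if not recent_results:
        return ("introduction", topic)     # nothing attempted yet
    accuracy = sum(recent_results) / len(recent_results)
    if accuracy >= mastery:
        return ("advance", topic)          # ready for the next topic
    if accuracy <= struggling:
        return ("remedial", topic)         # present remedial material
    return ("practice", topic)             # more practice at this level

print(next_activity("integration", [1, 1, 1, 0, 1]))  # ('advance', 'integration')
print(next_activity("integration", [0, 0, 1, 0]))     # ('remedial', 'integration')
```

A deep learning system replaces the fixed thresholds with a policy inferred from the behaviour of thousands of previous learners, but the lecturer-facing effect is the same: each student is routed to the material most likely to help them next.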

Spending more money, although often required on a temporary basis, is both the least ingenious and least sustainable way of solving a problem. Asking lecturers to work longer hours is not a very clever, or sustainable, way to improve productivity either. With technology, a small amount of ingenuity and possibly a significant amount of courage, we can improve quality, improve access and reduce costs in higher and further education in Ireland. And if we don't, someone else will.

“This House Believes Artificial Intelligence (AI) Could, Should and Will Replace Teachers” is the topic of a debate at the upcoming OEB conference in Berlin later this year (http://www.online-educa.com/OEB_Newsportal/this-house-believes-ais-could-and-will-replace-teachers/). Often, when I ask lecturers the question, “If computers could replace lecturers, should they?”, I get a negative response justified by various quite valid arguments that teachers will always be required for one reason or another. However, the question they are answering is not the one posed. It ignores the “if” at the start. As an engineer I have always assumed that it was my job to improve the world by making our work more efficient and reducing waste. Sometimes this might include the disappearance of certain professions but, looking back in time, it seems that this disruption, although unpleasant for the disappearing profession at the time, was best for society in the long run. So if you agree with that general principle, the answer to the question has to be “yes”: computers should replace lecturers if they can.

But before we give up the ghost and let management replace us with computers, there is the question of whether they are capable of replacing us. In the same spirit of misreading the question above, lecturers are loath to admit that computers can, to any great extent, do the tasks that we do well enough to replace us. This is the natural inclination of any profession to protect itself, but it may not be the best strategy if it ignores the changes that are undermining the profession and inhibits its ability to adapt.

So, to what extent is information technology capable of replacing lecturers? You might say that the first big scare we got was from the MOOCs, Massive Open Online Courses. The demise of higher education as we know it was being heralded by the sight of tens of thousands of learners taking courses from rockstar professors. It was cold comfort to the profession that the MOOC drop-out rate was high, or that the teaching was simple, or that interaction between learners was limited (and non-existent with the professors), because large numbers were still learning some very useful stuff.

However, we had an ace up our sleeve: assessment and accreditation. Despite our claims of lofty learning objectives, we know that young people come to university to have a good time and to get a certificate that will get them a decent-paying job. Employers, parents and students themselves trust us to maintain good standards and not to hand out certificates to anyone who is willing to pay the fees.

So we’re OK then? Maybe not. A number of years ago the presidents of US universities were polled on what they believed would cause the most change in higher education in the future. Sensibly, they suggested it wasn’t MOOCs. However, they did identify the idea of unbundling, or the separation of learning from assessment, which allows learners to learn as they please and to submit themselves for assessment whenever they are ready. If implemented in universities, it would unleash a wave of innovation in learning both inside and outside universities as learners seek the most cost-effective ways of learning wherever they can find them. Since then this has started to happen. Universities in the US are offering challenge examinations with credits attached. People predicted that the prestigious institutions would not get involved in this, but MIT is launching MicroMasters credentials where you can study the modules for free in the form of MOOCs and then pay for assessment to get the credits.

It could be even worse. These free and cheap online courses may get a lot better. Progress is being made in Artificial Intelligence (AI) and it may have a significant impact in several areas. Deep learning, an AI technique, can monitor the activity of very large numbers of learners in order to optimise and personalise learning pathways (including remedial activities) for individual students. AI has also made progress in creating “bots” that can act as first-line advisers to learners who are in difficulty. (Some recent tests found that the only way students could identify the bots was by their response speed.)

OK, so we may get replaced, eventually, but we’re smart and we can find other useful things to do (like research). And the universities will still act as assessors and accreditors, so they’re OK. Well, maybe. Employers have mixed feelings about the education young people receive at university. On one hand they complain about the lack of key skills in graduates, but on the other hand they find universities to be largely trustworthy in maintaining standards, and they particularly like their ability to help them evaluate job applicants. University degrees are effectively a cheap selection tool for employers. However, they are not so cheap for the candidates, or for the state, which may subsidise the process.

But what if employers found other, more efficient, ways of assessing what a job candidate knows or can do? Recently a major international management consultancy firm announced that it would no longer require a university degree from new employment applicants. Whatever this implies about the value of degrees, it certainly indicates that they believe that learning can come from other sources and that they have the competence to evaluate such learning. In domains where there are dire skills shortages, individuals are turning to alternatives such as “boot camps” and free online courses with added fees for assessment. Many of these are creating “alternative credentials” with electronic certification which can be displayed online and, more importantly, examined in detail by potential employers, who can drill down to see much more detail on the contents of the courses and the performance of the candidates. Should such credentials gain the trust of employers, the associated courses may well prove more attractive to school leavers, who need to balance the pleasure of the “college experience” against their employability and the total cost of their education.

So, to conclude, there is every reason to believe that technology may be able to replace a lot of what we as teachers do in the near future, and the trust the public rightly places in our institutions may not be enough to protect the profession. We may be heading for a time when the need for teachers is much less and their role is much changed. Not only do we owe it to ourselves to prepare for such a possibility, we also need to admit that if it provides better, cheaper and more accessible learning for the public, it is to be welcomed.

Friday, August 7, 2015

Over the last few years I have been reading research claiming that research activity does not improve teaching skills and that the two may even be negatively correlated. Perhaps my willingness to accept this was influenced by my own personal experiences (strangely, I have even witnessed PhDs in education who could not present well). I, perhaps naively, assumed that senior management in higher education had access to the same information sources as me and would be aware of this (and that a few in my own institution were exceptions because we were a small provincial institution).

However, recently, in a blog by Greg Foley, I became aware that this important finding is not widely known, as he described how a university president made several inaccurate comments on the value of research to teaching. (When I contacted Greg separately, he provided me with some helpful references by Richard Felder here and here.) More recently, at a consultative meeting of the great and the good of higher education in Ireland (yes, I know, “what was I doing there?”), my suggestion that research be separated from teaching in higher education, based on such findings, was met with some horror and disbelief.

So this brings me to this question: how is it that, when taking part in discussions on, and making decisions about, issues that are not in our own domains, we in academia place so little importance on evidence? Indeed, like the consultative group mentioned above, I have been at many internal academic meetings where there have been quite lively debates (bitter arguments?) based on a set of conflicting opinions, with very little evidence being presented.

Could it be that, where the evidence runs counter to our own personal interests, we deliberately choose to ignore it? Perhaps, if arguing about something that is outside our own domains, we are unaware of the research and too lazy to check it out. My suspicion is that it is the first of these two, as when I have referred to the above research in such debates, it has generally been met with scepticism and then ignored.

Friday, July 24, 2015

This very interesting article by Jason Potts of RMIT on "Why MOOCs will fail" seems to confirm what I have suspected for a while to be true. He makes two main points: firstly, that young people go to university to get a mate, and secondly, that they go to send signals to potential employers. They can get neither of these from MOOCs (or possibly any online courses, for that matter), and that is why they will fail.

The first point sounds like a joke, but I do get the impression (from my own kids, if not elsewhere) that higher education is a pleasant life experience that they feel entitled to, and the potential to meet romantic partners is no doubt part of that. He does make the interesting point that a very substantial portion of the economic return achieved by attending college is due to marrying a partner of similar social status and income potential. I'm not sure that this is explicitly in the minds of young people who want to go to college, but it may well be hidden in some subconscious evolutionary psychological urge to meet "people like us" in a romantic way.

The second point, I think, is much more important. Both parents and kids do think explicitly about how employers will view the education they pay for and receive, respectively. This may be why they choose more prestigious institutions even though there may be no evidence that the quality of learning is any better. From my rudimentary knowledge of economics, I believe that this is a form of signalling. The peacock has no great use for his flamboyant tail, but the tail does signal to a mate that he is in good health and has excess nutrition that he can waste on growing and carrying around the tail (perhaps like a first year with an iPhone and Beats headphones). Going to a good college signals to potential employers that (i) you had the smarts and/or work ethic to get in, (ii) you had the smarts/work ethic to get through (perhaps surviving bad teaching as a bonus, if the employer is aware of it), and (iii) you come from the right type of background (one that is smart and/or values education). If this is a very reliable signal, it saves employers a lot of effort in recruitment, and the candidate pays for it.

So even if I were to suggest that online work-based learning is superior to traditional campus based education, it's not going to disappear very quickly and growth will most likely be slower than we zealots would like. I don't think MOOCs or online learning will fail. They will succeed for other reasons. We'll just have to live with this problem and work around it.

Could it be that sending our children to college is an extravagance? Something that would be nice to have, but we can’t really afford and do not really need?

For many years now we have been told that it is reasonable to expect to send your children to college and that, if you can’t afford it, you should be able to get assistance in doing so. We are also told that it is in the interests of the economy that as many people as possible get a higher education; that, as a nation, we cannot afford not to send our children to college. This may well be true, but the question here is: can we afford to do it the way we are doing it, either as individuals or as a nation?

As an engineering student in the seventies I would, naturally, muse during lectures about the efficiency of the process. If the lecturer took an alternative approach, not only would a significant amount of time be saved by the lecturer but, more interestingly, a much greater amount of time would be saved by the 50 or 100 students sitting in the class. Now, 40 years later, there are many alternative teaching techniques available, but not a lot has changed.

To be fair, I have become aware of many of the alternative teaching approaches through my work for the last 20 years with learning technologies and more specifically in online distance learning. And if I were to be honest, the efficiencies I have observed in online distance learning have more to do with the type of student than the technologies used for teaching.

Our distance learners seem to be able to cover material in less time than the full-time students and achieve better scores in examinations. How can this be so? Is it the teaching medium? Is it that they can replay difficult parts of lectures over and over again or post questions to their lecturers and classmates at any hour of the day or night? Perhaps, but I think it may be something else.

Our distance learners seem to be very highly motivated. They are very interested in the content and keen to achieve. They see the relevance of the knowledge, often directly in work they are currently doing, and seem to assimilate it faster and remember it better.

We have always been aware of the merits of “work-based learning” and it has been the basis of the apprenticeship system that has served us well for centuries, even in so-called higher professions such as accounting, law and architecture. So why have we moved away from this model of learning? Could it be that knowledge became so specialised that students had to travel to the source of that knowledge? Could it be that as we grew richer over a period of a few hundred years, first the middle classes and then the working classes felt they had the right to emulate the practices of the aristocracy?

We might argue about the reasons the current system of higher education emerged, but there is growing evidence that higher education can be supplied more cheaply and more effectively through a combination of work-based and online learning. As well as reducing the cost of providing courses, this can reduce the financial burden on individuals and families, as well as on the state, in other ways. As learners are mostly working and do not need to live away from home, they can more easily afford the fees, often with assistance from their employers and with less subsidy from the state.

But what about the social development aspect of full-time higher education? Perhaps because I live in a small town in the West of Ireland I have a broader range of friends than many in my profession, but as you can imagine, my many friends who never received a higher education would laugh if I suggested to them that they were less socially developed than me because they did not go to college.

Would school leavers be mature enough to survive in this new model of learning? Well, many believe that they were in the past, and that perhaps we don’t challenge them enough these days. Many young people actually still don’t know what they want to do when they leave school. It could be argued that it is too early to choose a profession and that it might be better to get a menial job in a field that you might be interested in and take a little more time before committing to a course of study. This kind of flexibility is much more feasible with work-based online learning.

And what about my own children? Despite my advice to go and get a job when they finish school, they are insisting that they get the pleasure of going to college just as I did. I am lucky that the state is borrowing money to subsidise them and I am well enough paid to afford to pay the rest and indulge them. Others are not so lucky. Should individuals and the state be spending or borrowing so much for what now could be seen as a pleasant rite of passage for privileged individuals? An extravagance?

Brian Mulligan is a lecturer and Programme Manager in the Centre for Online Learning in Institute of Technology, Sligo, Ireland. He can be contacted via his blog at elearngrump.blogspot.com

It is now thirty years since I started teaching at Institute of Technology, Sligo and twelve years since I started working with distance learners online. During those twelve years I made two significant observations that have led me to the conclusion that the way we approach higher education needs to change. However, the change I am proposing here is not a small one: we should get rid of full-time undergraduate education.

In the early days of our online teaching, worrying that some people might be sceptical of this form of education, we always ensured that our online students sat the same examinations as our full-time students. We very quickly noticed how much better these working adults performed in examinations than the full-time students. We would have liked to attribute this performance to our online teaching methods, but we knew it was more likely due to the fact that they were situated in workplaces where they could see the relevance of what they were learning. Although this first observation came as early as 2003, when we ran our first examinations, the second came much more slowly: online learning has the potential to be much more cost-effective than campus-based education and, in certain situations, to be of even higher quality. I was led to conclude that undergraduate education, in most countries, is more expensive than it needs to be, and less effective than it should be.

So, if this were true, how might you design an alternative approach to undergraduate education? Well, as it happens, such an approach already exists in the apprenticeship model. We have long recognised that the best way for people to learn a trade is to combine work with learning. In fact, it is only relatively recently that many higher professions, such as architecture, law and accountancy, have moved away from this work-based approach to learning.

However, there were good reasons why universities emerged in the Middle Ages as repositories of knowledge and places where rich young men were sent to become familiar with all of the advanced knowledge of the time. As we moved towards the massification of education during the last century, it was expedient that other forms of education copied this model and even tried to gain some of the status of these institutions by taking the title of “University”. But this is the 21st century, and we are now well into the information age, where we do not need to travel to access the knowledge of our greatest minds or to enter into rich discussions with fellow learners. We are no longer working under the constraints of the past that required physical access to these centres of learning.

To add to this, the cost of higher education has been steadily increasing to the point where states, if not people, can barely afford it. As manufacturing and services companies constantly, and successfully, strive to reduce their costs and improve their quality, do we, as educators, not owe the same to our funders and learners: a better education at a lower cost?

So I’d like to propose that we get rid of full-time undergraduate education and replace it with work-based learning, where learners take positions, even menial ones, in workplaces closely associated with the profession they wish to pursue, and take most of their courses online, attending their colleges occasionally to help build relationships with their classmates and to carry out activities that are best done in that setting. It may be necessary to stretch the courses out over a longer time, but this will bring significant savings, including the opportunity to earn while studying, and result in better learning outcomes.

Will our young people be mature enough to survive in this new model of learning? Well, many believe that they were in the past, and that perhaps we don’t challenge them enough these days. What about the social and personal development aspect of a college education? Well, I made the point to my brother, who entered the Civil Service as an 18-year-old in 1972, that, as I had been to university, I was more developed socially and personally than he was. I will leave it to you to imagine what his response was. And what about our guilt at denying our young people the pleasure of a college education? Spending the state’s money on pleasures we cannot afford might just fit the definition of extravagance.

This has also been posted on Ferdinand von Prondzynski's University Diary blog where there is more discussion via the comments: http://goo.gl/7rtt5A