MOOCs

Massive open online courses (MOOCs) have captured the nation’s imagination. The notion of online classes enrolling more than 100,000 students is staggering. Companies are springing up to sponsor MOOCs, growing numbers of universities are offering them, and the rest of America’s colleges are afraid they will be left behind if they don’t.

But MOOCs alone are unlikely to reshape American higher education. When history looks back on them, they may receive no more than a footnote. They are, however, one marker of a revolution in higher education that is already occurring and will continue.

America is shifting from a national, analog, industrial economy to a global, digital, information economy. Our social institutions, colleges and universities included, were created for the former. Today they all seem to be broken. They work less well than they once did. Through either repair or replacement — more likely a combination — they need to be refitted for a new age.

Higher education underwent this kind of evolution in the past as the United States shifted from an agricultural to an industrial economy. The classical agrarian college, imported from 17th-century England with a curriculum rooted in the Middle Ages, was established to educate a learned clergy to govern the colonies. This model held sway until the early 19th century.

In the years before the Civil War, the gap between colleges and society grew larger. European higher education modernized, creating models that would inspire America to grow its own. Innovations, mostly small, were attempted; many failed. During and after the war, the scale of experimentation increased with the founding of universities such as Cornell University and Johns Hopkins University and, a few decades later, the University of Chicago. Other institutions, such as Harvard University, remade themselves. The innovations spread. By the mid-20th century a new model of higher education for an industrial era had coalesced. It was codified in California’s 1960 master plan, balancing selectivity with access and workforce development.

This transition brought new institutions that better met the needs of an industrializing America.

An entity called the university was imported from Germany, with what would become a mission of teaching, research and service. It offered instruction in professions essential to an industrial society, organized knowledge into relevant specialties, and hired expert faculty in those areas. It not only transmitted the knowledge of the past, but advanced the frontiers of knowledge for the future.

The federal government created the land-grant college to bridge the old agrarian America and the emerging industrial one. Now found in all 50 states, the land-grant college was designed to provide instruction in agriculture and the mechanic arts without excluding classical studies.

Specialized institutions emerged. Some, like the Massachusetts Institute of Technology, were modeled on the European polytechnics; they promoted industrial science and technology and prepared leaders in these fields. Others, the normal schools, sought to provide more and better teachers as the evolving economy demanded more education of its citizenry.

The two-year college — originally called a junior college, later a community college, sometimes Democracy’s College — was initially established to offer lower-division undergraduate education in the local community.

As these institutions emerged, the curriculum changed. Graduate studies were introduced. New professional schools in fields like engineering, business and education became staples. Continuing education and correspondence courses were added. Elective courses and majors arose. Disputation, recitation, and memorization, the teaching methods of the agrarian college, gave way to lectures, seminars, and laboratories.

The colleges that persisted adopted many of the era’s changes, and the classical curriculum largely disappeared.

This is the history of higher education in America. Change has occurred by accretion. The new has been added to the old and the old, over time, modernized. Change occurs with no grand vision of the system that the future will require. New ideas are tried; some succeed, many fail. By successive approximations, what emerges is the higher education system necessary to serve the evolved society.

Social change is a constant, and so is the need for higher education to adapt to it. When the change in society is deleterious, as in the McCarthy era, it is the responsibility of higher education to resist it and right the society. It is a natural process, almost like a dance. However, in times of massive social change like the transformation of America to an information economy, a commensurate transformation on the part of higher education is required.

We are witnessing precisely that today. MOOCs, like the university itself or graduate education or technology institutes, are one element of the change. They may or may not persist or be recognizable in the future that unfolds.

What does seem probable is this. As in the industrial era, the primary changes in higher education are unlikely to occur from within. Some institutions will certainly transform themselves as Harvard did after the Civil War, but the boldest innovations are likelier to come from outside or from the periphery of existing higher education, unencumbered by the need to slough off current practice. They may be not-for-profits, for-profits or hybrids. Names like Western Governors University, Coursera, and Udacity leap to mind.

We are likely to see one or more new types of institution emerge. As each economic and technological revolution creates new needs for higher education, unique institutions emerge to meet them. In the agrarian era, only a tiny percentage of the population needed higher education, and the college served these elite few. When industrial America required more education, more research, and mass access to college, two major institutions were established: the university and the community college.

The information economy, which requires a more educated population than ever before in history, will seek universal postsecondary education and is likely to create new institutions to establish college access for all at low cost. These institutions will operate globally, not locally, which will dictate a digital format. Because information economies emphasize time-variable, common outcomes — unlike the industrial era’s common processes and fixed times (think assembly lines) — universal-access institutions will offer individualized, time-variable instruction, rooted in mastery of explicit learning outcomes. Degrees and credits are likely to give way to competency certification and badges.

Traditional higher education institutions — universities and colleges — will continue, evolving as did their colonial predecessors. Their numbers will likely decline. At greatest risk will be regional, part-time commuter universities and less-selective, low-endowment private colleges, particularly in New England, the Mid-Atlantic, and the Midwest. The future of the community college and its relationship to the universal-access university is a question mark. It is possible that sprawling campuses will shed real estate in favor of more online programs, more compact learning centers and closer connections with employers and other higher education units.

In this era of change, traditional higher education — often criticized for low productivity, high cost, and limited use of technology — will be under enormous pressure to change.

Policy makers and investors are among those forces outside of education bringing that pressure to bear. It’s time for higher education to be equally aware and responsive.

Arthur Levine, a former president of Teachers College, Columbia University, is president of the Woodrow Wilson National Fellowship Foundation.

There’s a legendary story about Anne Sexton’s learning how to write a sonnet by watching I. A. Richards’s educational-television series in the late fifties. I’ve thought about that fairly often while reading the daily stories on MOOCs. In the Sexton/Richards instance, there was a fortuitous electronic meeting of an excellent teacher who saw possibilities in the then “new” technology of television and a motivated student who was ready to write as if -- and according to her this was indeed the case -- her life depended on it.

That hyperbolic tone of the last sentence above -- a tone that readers of Sexton’s later poems and interviews are already familiar with -- is also the tone of a good many declarations about MOOCs.

Thomas Friedman’s latest column “The Professors’ Big Stage” is a case in point. His piece on “the MOOCs revolution” is riddled with contradictions, shallow thinking -- and an error in basic arithmetic.

Friedman begins by excitedly informing us that he’s just returned from a “great conference” sponsored by M.I.T. and Harvard on “Online Learning and the Future of Residential Education.” He doesn’t explain why he had to attend in person, or question why the conference wasn’t online, but he adds his own title, “How can colleges charge $50,000 a year if my kid can learn it all free from massive open online courses?” That premise, it soon becomes clear, is moot.

As Friedman goes on to extol the virtues of using MOOCs as supplements for traditional courses and programs, MOOCs then become an example of preliminary programmed learning -- the sort of thing that community colleges have been doing in terms of remedial aid for quite a while. Publishers like Bedford/St. Martin’s have offered online drills for years. And if the MOOC is tied to an accredited college’s course, then Junior and his dad are still paying for Junior’s education.

According to Friedman, students enrolled in a hybrid course at San Jose State, which combines M.I.T.’s introductory online Circuits and Electronics course with traditional in-seat class time, have done quite well: “Preliminary numbers indicate that those passing the class went from nearly 60 percent to about 90 percent.” The news for those students is even better than Friedman’s assessment suggests: he calls the improvement one-third; in fact, a jump from a 60 percent pass rate to a 90 percent pass rate means the number of students passing the class increased by one-half, or 50 percent.
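The correction is worth a line of arithmetic: a relative increase is measured against the starting value, so a pass rate rising from 60 to 90 percent grows by half, while dividing the same 30-point gain by the ending rate produces the mistaken one-third. A minimal check (the 60 and 90 percent figures are Friedman’s rounded numbers):

```python
# Relative change is measured against the starting value, not the ending one.
def relative_increase(before: float, after: float) -> float:
    """Fractional increase from `before` to `after`."""
    return (after - before) / before

# Friedman's rounded figures: a pass rate of ~60 percent rose to ~90 percent.
print(round(relative_increase(0.60, 0.90), 2))   # 0.5  (a 50 percent increase)

# Dividing the 30-point gain by the *ending* rate instead yields
# the mistaken "one-third" figure.
print(round((0.90 - 0.60) / 0.90, 2))            # 0.33
```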

We should note that this is an argument for remedial preparation and/or immersion in a subject -- not necessarily an argument for online versus in-seat instruction.

And that, of course, is just one class. Friedman sees MOOCs as going far “beyond the current system of information and delivery -- the professorial ‘sage on the stage’ and students taking notes, followed by a superficial assessment.” This description not only fails to describe adequately the current system but also ironically illuminates some of the biggest problems with MOOCs. Given the scale of MOOC courses, the only kinds of student assessment that can be accomplished are superficial. And we will have to hope that some enrolled students, unlike Friedman, still believe in note taking. The MOOC lecture system, however, puts that sage right back on the stage -- as the very title of Friedman’s op-ed indicates.

Moreover, his discussion of Michael Sandel, the Harvard professor whose Justice course will have its American debut on March 12 as the first humanities offering on the M.I.T./Harvard edX online learning platform, focuses not on aspects of the course but on Sandel’s old-fashioned appearances on the lecture circuit.

Sandel, whose course has been translated into Korean and shown on national South Korean television, recently traveled to Seoul (again, why?), where he lectured “in an outdoor amphitheater to 14,000 people, with audience participation.” There was no indication as to how long the Q&A session ran.

Academicians often fall prey to magical thinking; at my former college, each time we hired a new provost (10 in my 16 years), we were certain that this was the one who would be our savior.

Each time we created a new central curriculum (three in my 16 years; the final step before I left was to exempt adult students from completing the college’s core requirements), we were certain that this was the answer. Smaller, struggling colleges may see offering licensed supersized online courses as a cost-saving move -- an escape from their current predicament, in which every small school worries that it must go online or go bust.

Many of these colleges turned to creating their own individual online courses -- already being referred to as “traditional online courses” -- as a solution, only to find that the expenses have outweighed the successes: they are costly in terms of faculty training, serve very small audiences (often sitting only a building or two away), and put severe strain on IT departments.

Online consortiums in which struggling schools have banded together have also proved problematic. I am thinking in particular of one class I was asked to review for my former college, a member of such a consortium: an accelerated multi-genre writing class, which asked students to write one poem, one short story, and one essay over a period of five weeks. The “final project” consisted of one additional work, in the student’s choice of genre. It was thus possible to fulfill 50 percent of the course requirements with two haiku.

MOOCs, of course, have their ur-versions, which include not only Henry Ford’s production line and the rise of fast food, but also massive online delivery experiments in the mid-1990s, online remedial drills, large introductory-course in-seat lectures, Sunrise Semester, the Great Lecture Series, and the 19th-century lecture. And possibly there was someone who asked Harvard for credit for attending Thoreau’s lecture on “Society” -- or for attending a lecture by P. T. Barnum.

Friedman does note, near the end of his exhortatory column, that “We still need more research on what works.”

Indeed. Along with the return of the sage on the stage, this newest educational/industrialized development has brought with it -- no surprise to anyone who has taught a traditional online class, a class with online components, or a traditional in-seat class -- some old concerns: problems with technology; problems with underprepared and unmotivated students; problems with class participation in discussions (one sage walked off the stage); and concerns about retention and plagiarism.

Assessment will continue to be one of the biggest concerns: both assessment of the overall course and assessment of any student work that goes beyond the level of a drill. Financial issues will come into play, as will workforce issues. Hierarchical divides among students, faculty members, and institutions will not disappear.

Finally, there is a dynamic in a traditional classroom that MOOCs simply can’t provide. In small, in-seat courses and workshops, students discover that they are part of a community, in which each person has a responsibility to contribute and the reward of personal interaction. Such courses allow for flexibility, Socratic questioning, and serendipity. Face-to-face meetings and small-group dynamics are important parts of education and socialization. And they provide an essential break for students from their hours of online gaming, posting and browsing.

One other analogy that comes up in discussions of MOOCs is “correspondence course.” It’s considered a dirty term, and yet, it may be an accurate description as thousands of students and piecework adjuncts labor at their solitary tasks.

And there may be something to be learned from a fictional account of a correspondence school: J. D. Salinger’s “De Daumier-Smith’s Blue Period.” The alienated protagonist concludes that “We are all nuns” -- working silently, separately, seeking salvation.

Carolyn Foster Segal is a professor emeritus of English at Cedar Crest College. She currently teaches at Muhlenberg College.

The top of the annual performance review form at my university has a blank space for us to list any additional education we obtained during the previous year. I’ve never filled that space in before, but that will change in my review for 2012 because I spent part of my sabbatical last fall as a student in a massive open online course (or MOOC).

I'm an American historian by training, but ever since I left graduate school a global perspective has become increasingly important for historians of all kinds. That’s why I decided to get some free professional development in world history, courtesy of Coursera. I learned a lot of interesting and useful specific factual information from the MOOC instructor (or superprofessor, as the lingo goes) that has already helped me become a better teacher and scholar.

But I didn’t just listen to the lectures. Like any other student (since that’s what I was), I also wrote out all the assignments and helped grade papers written by my peers in class. This peer grading process differs from peer evaluation (which I use in class all the time) since students not only read each other’s work, they assign grades that the course professor never sees. Professors in the trenches tend to guard their monopoly on evaluating their students’ work jealously, since it helps them control the classroom by reinforcing their power and expertise. Superprofessors (and the MOOC providers they teach for), on the other hand, have begun to experiment with having students grade other students out of necessity, since no single instructor could ever hope to grade assignments from tens of thousands of students by himself or herself.

With MOOCs in their infancy, few precedents exist for designing online peer grading arrangements for humanities courses. For this reason, I don’t intend to criticize my superprofessor’s choices here. However, I do have to describe some of the peer grading process from my class in order for my critique of peer grading in general to make sense. All students in the MOOC were supposed to write six essays between the start of the course and its end. For each assignment, we could choose one of three single-sentence questions to answer in 750 to 1,000 words. The week after we submitted those essays, we were supposed to grade the essays of five of our peers with respect to their argument, evidence and exposition, and leave comments. If you didn’t grade the essays your peers wrote, you didn’t get to see the grade you earned.
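The gating rule described above (grade five peers or forfeit seeing your own grade) implies some assignment machinery running behind the scenes. The essay does not describe Coursera’s actual algorithm, so what follows is only a hypothetical sketch of one balanced scheme; the five-reviewer figure comes from the essay, and everything else, including the function name, is an assumption for illustration:

```python
import random

def assign_peer_reviews(student_ids, reviews_per_student=5, seed=0):
    """Hypothetical sketch: assign each student `reviews_per_student`
    peer essays to grade, so that nobody grades their own work and
    every essay receives the same number of reviews."""
    ids = list(student_ids)
    if reviews_per_student >= len(ids):
        raise ValueError("need more students than reviews per student")
    # Shuffle once so the pairing isn't predictable, then rotate:
    # the reviewer at position i grades the students at positions
    # i+1 ... i+reviews_per_student (mod n).
    random.Random(seed).shuffle(ids)
    n = len(ids)
    return {
        reviewer: [ids[(i + k) % n] for k in range(1, reviews_per_student + 1)]
        for i, reviewer in enumerate(ids)
    }

# Example: 8 students, 5 reviews each -- balanced load, no self-review.
peers = assign_peer_reviews(range(8))
assert all(me not in essays for me, essays in peers.items())
```

The rotation guarantees balance mechanically; it says nothing, of course, about whether the reviews themselves are any good, which is the essay’s point.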

With respect to the grades I earned, I think my peers graded my essays just right. The grading scale in our MOOC went from zero to three. When I already knew a fair bit about the topic of the question that I answered or I tried very hard to write the best essay I could, I earned mostly threes from my peers. When I didn’t try very hard, I tended to get twos. While I listened to all my superprofessor’s lectures fairly closely, I never read the recommended textbook, which also undoubtedly hurt my scores.

For me at least, the primary problem with peer grading lay in the comments. While I received five comments on my first essay, for every subsequent essay I received number grades with no comments from a minimum of two peers and as many as four. In one case, I got no peer grades whatsoever. That meant that the only student who evaluated my essay was me. Every time I did get a comment, no peer ever wrote more than three sentences. And why should they? Comments were anonymous so the hardest part of the evaluative obligation lacked adequate incentive and accountability.

I read in The New York Times a few weeks ago that a study had begun to examine whether peer grades would match the grades assigned by professors and teaching assistants in one sociology MOOC. While a match would be an impressive feat, it would in no way validate the process of peer grading. Learning, as any humanities professor knows, comes not from the grades themselves but from students reading comments explaining why they got the grades they got. That’s how students find out how to do better next time.

To be fair, the course included a good set of instructions on how to grade a history essay, linked from the course homepage. Unfortunately, there was no way for the superprofessor to force students to read those instructions, and given the inevitable pressure to cover as much world history as possible, he never discussed how to grade in any of the class lectures. How could he? Good grading technique is difficult enough for graduate students to learn. Because of the size of the course, I can safely assume that many of my fellow MOOC students had no history background at all, yet the peer grading structure forced them to evaluate whether other students were actually doing history right.

The implicit assumption of any peer grading arrangement is that students with minimal direction can do what humanities professors get paid to do and I think that’s the fatal flaw of these arrangements. This assumption not only undermines the authority of professors everywhere; it suggests that the only important part of college instruction is the content that professors transmit to their students. How many of the books you read in college can you even name, let alone describe? It’s the skills you learn in college that matter, not the specific details in any particular class, particularly those outside the major.

Over the course of my career, I have increasingly begun to spend much more time in class teaching skills than I do content. Some of this has been a reaction to encountering students who do not seem as prepared for reading or writing college-level material as the students I had back when I started teaching. However, I have also come to believe that teaching these skills is much more important than teaching any particular historical fact. After all, it really is possible to Google nearly anything these days.

Certainly good students can do a good job grading peer essays and I got a few short but insightful comments on the papers I wrote for my MOOC. Even if all of my comments had been less than helpful, I didn’t come into the MOOC process seeking to improve my writing skills. I wanted to learn new information, and many other students who engaged the material the same way that I did probably felt the same way.

Students like me won’t be the ones who’ll suffer because of peer grading. Its victims will be the future students who take MOOCs to earn college credit at increasingly cash-strapped universities. Who will teach them how to write well? Who will monitor their progress through the peer grading assignments? Who will help them understand that history is as much about argument as it is about facts, or that literature can be appreciated on multiple levels? While students can certainly teach one another some things, they can never teach everything that a living, breathing professor can.

Education startups like Coursera are experimenting with peer grading not because it is the best way for students to learn history or English, but because it is the only way that the MOOC machine can ever run itself in a humanities course. If MOOCs incurred high labor costs the same way that colleges do, those startups would never be able to extract a profit from those classes. While that’s a legitimate concern for Coursera’s venture capital investors, everyone else in academia – even the superprofessors – should give more weight to purely educational concerns.

MOOCs (or massive open online courses) seem to have arrived almost out of nowhere, in quick succession – first Udacity in February of last year, followed by Coursera in April, then edX in May. Remarkable as it may seem, MOOCs as we know them today have been with us only for as long as it has taken the Earth to make one orbit around the sun.

“I like to call the last year ‘the decade of online learning,’ ” joked Anant Agarwal, president of edX, during my recent visit to the offices of his bustling startup in the Kendall Square area of Cambridge, Mass.

As accelerated as the progression of MOOCs has been from curious acronym to household name, and as much as it may seem that MOOCs themselves have fallen from the sky, in truth MOOCs have been ripening for some time.

Consider the free “courses” delivered through iTunes U for the last several years, or TED Talks, and Khan Academy, not to mention some of the early progenitors of MOOCs themselves, including Dave Cormier, credited with coining the phrase in 2008, as well as George Siemens, Stephen Downes, Alec Couros, David Wiley, and others.

Recall Carnegie Mellon’s Open Learning Initiative, the “open educational resources” movement, and MIT’s OpenCourseWare, launched all the way back in 2002. And let’s not forget Fathom.com, an initiative out of Columbia University launched at the turn of the millennium, or even the early days of America Online and CompuServe, both of which offered educational content through their services as early as the 1990s.

MOOCs, then, are not as new as they seem – though the world today appears to be more ready for them than it was in decades past. Indeed, it isn’t hard to see how forces as diverse as Clayton Christensen’s theory of “disruptive innovation” from the late 1990s, the expansion of online enrollments over the last decade, the reformist intentions of the Spellings Commission on the Future of Higher Education from 2005-2006, the great recession of 2007-2009, or the completion agenda supported by the Lumina and Gates Foundations over the last few years have all contributed to a public thirst for what look like very high-quality educational offerings at very low – or even zero – cost.

And like many innovations before them, MOOCs have been received with the usual contradictory apocalyptic fervor – some believers foresee the arrival of an educational golden age, while others see the eventual destruction of our institutions, our faculty, and the intangible value of face-to-face learning.

Writing in The American Interest this month, for example, Nathan Harden claimed that “ten years from now Harvard will enroll ten million students." He went on to argue that as a result of the MOOC movement, “the changes ahead will ultimately bring about the most beneficial, most efficient and most equitable access to education that the world has ever seen."

At the other end of the apocalyptic continuum, Gregory Ferenstein, writing for TechCrunch last month, foresaw a future in which MOOCs would wreak terrible devastation on the land, as “part-time faculty get laid off, more community colleges are shuttered, extracurricular college services are closed, and humanities and arts departments are dissolved for lack of enrollment.”

The real significance of MOOCs lies, however, not in their being a harbinger of our educational salvation or demolition. Nor does their real significance lie principally in their potential to increase access or reduce costs – at least not for Agarwal and edX.

“We are about two things,” Agarwal told me. “We are about dramatically increasing quality and impacting campus learning. We are being very deliberate. This is not a numbers game – this is not a game at all. This is a quality quest.”

Funded with $60 million in seed capital from MIT and Harvard, edX can make a claim to being the first MOOC platform to market, inasmuch as its predecessor, MITx, was launched in December 2011. Until this week, the edX consortium featured five independent member institutions (MIT, Harvard, the University of California at Berkeley, Georgetown University, and Wellesley College) and one state university system comprising 15 colleges and universities (the University of Texas System). Thursday, it added six more, including several outside the United States.

In less than a year, edX’s 25 courses have enrolled close to 700,000 people. “That’s more than the combined alumni of MIT and Harvard over their combined 500-year history,” Agarwal observed with a mixture of pride, enthusiasm and amazement. What really pleases him, though, is something else.

Rolling his chair across the office, Agarwal waves me over to his monitor and shows me the virtual laboratories edX has been developing for its courses. We start with his own course on Circuits and Electronics (6.002x in the edX course catalog).

“Many MOOCs are just about analyzing problems,” he said. “We give you a blank sheet of paper and say, ‘Go build, design, create, construct something.’ ” With drag-and-drop alacrity, Agarwal moves the components of a circuit into place on a piece of digital graph paper and clicks a button to test its performance. “Computers do the grading,” he said, “in real time.”

“The media focus on numbers, they focus on cost,” Agarwal sighed. “But they should focus on something else – quality. And they should focus on efficiency. What is efficiency? It’s a ratio of quality and cost.”

Agarwal knows that MOOCs have their doubters, and he believes that they can only be persuaded with proof. He cites the case of San Jose State University, which licensed his own course on circuits and ran it as an adjunct to the school’s own classroom-based instruction. The results, Agarwal claims, were impressive. “The fail rate dropped from 40 percent to 9 percent,” he told me. “That’s a quality improvement.” And the costs to San Jose State were minimal. That’s efficiency. Agarwal says San Jose will be sharing more details about their experience with edX in the near future.

With the avidity of the prototypical startup entrepreneur, Agarwal talked excitedly about the potential for MOOCs to improve pedagogy. “We have our xConsortium,” he said. “All of the schools in our consortium have access to all the data in the platform in an anonymized format. This is what I call ‘the particle accelerator of learning’ – big data in learning in real-time.” In a sense, then, edX’s quality quest, as Agarwal calls it, is seeking out the educational equivalent of the Higgs boson, as well as the other fundamental elements of learning, in order to better understand what kinds of learning objects, real-time remediation, and learning materials – whether analysis or laboratory or other – produce the best results from one learning context to the next.

I ask Agarwal what distinguishes edX from its fellow MOOC platforms. “We have a fundamentally different mission,” he replied. “We’re nonprofit. We’re open source. Our technology is for everyone. And we have a commitment to campus learning.”

Earlier this month, the American Council on Education completed an evaluation of five courses on the Coursera platform, developed by Duke University, the University of California at Irvine, and the University of Pennsylvania. Intriguingly, all five courses were approved for credit through the ACE credit transfer program. But just in case the future of MOOCs was beginning to make sense to you, consider this – all three of these institutions have made it clear that they, at least, will not be awarding credit for the courses, irrespective of the fact that they developed the courses themselves.

MOOCs are puzzling.

Will they last? It’s not, I suspect, a question that would bother Agarwal very much one way or the other. “For us,” he said, “it’s not about MOOCs. We are trying to reimagine our own campus. The lecture wasn’t working. Quality has been static for decades, but costs are going up. There’s a trillion dollars in student debt. We are trying to reimagine campus education from the ground up – with new ways of learning that are more enriching, more engaging, more efficient, and that produce better outcomes.”

How do you like them apples?

Peter Stokes is executive director of postsecondary innovation in the College of Professional Studies at Northeastern University, and author of the Peripheral Vision column.

With great interest, I read the recent news announcing that the American Council on Education (ACE) had evaluated five Coursera MOOCs and recommended them for credit. But I had hoped for something different.

Having prestigious traditional institutions make their online content open to the world (without, of course, their prestigious credit attached) was an exciting development. A race to post courses ensued. On the surface, it’s an altruistic move to make learning available to anyone, anywhere, for free.

Dig deeper and we are left to ask, how many MOOC courses will really be worth college credit, where will the credits be accepted, and for how long will college credits even be the primary measurement of learning?

Now that ACE has evaluated a few courses, MOOC providers will see how the process goes as students start actually finding proctors and taking tests -- or finding other methods of assessment -- to prove they learned the material. But a few courses will not be enough to really help students earn degrees, and with MOOC courses and providers continuing to proliferate, course-by-course review does not seem like a viable way to keep up with demand.

Regardless, it is more than likely that the universities that agreed to the ACE CREDIT review will never accept an ACE CREDIT transcript themselves. The students with ACE CREDIT transcripts will need to present those transcripts to “lesser-known” schools that are not among the elite players – colleges with much lower tuition and a willingness to serve post-traditional students.

More troubling is the fact that the ACE process for credit review is still course-based. Will this really be flexible enough in the future? Will it measure competencies and individual learning outcomes? Even if it seems scalable, will it mean all MOOC evaluations have to run through ACE and only ACE? Will students have to wait until ACE has evaluated a MOOC course before they can get credit?

Moreover, this raises the question: Are course evaluations and testing really the best or only way to deal with this new era of learning? What about experiential learning? If someone has college-level learning from their life experience is it invalid unless they take a course?

As Inside Higher Ed points out in its article, this was a fast move in an industry that moves at a glacial pace. But when ice really begins to melt, it can quickly turn into a waterfall. Students have more options for learning, and can get more information, from a variety of sources. So the question for education becomes, how can we best accommodate that?

I would assert that a portfolio assessment of students’ learning is the best way. Just as an artist shows a portfolio to a prospective employer, students should be able to demonstrate learning from wherever they have learned -- work, MOOCs, informal training, military service, volunteer service, and more -- all in one place. And much of this learning will not involve a course at all.

If MOOCs are to be truly disruptive, they must link to competencies, credentials, degrees and/or ultimately jobs. Using a course-by-course, credit hour-by-credit hour approach to do this will not dramatically change the way people earn degrees. And dramatic change that allows for individual demonstrations of competencies is the only way to provide the education quality and agility necessary to truly recognize learning derived from free resources on the web. By focusing on competencies, we can align and accept learning experiences from everywhere.

Pamela Tate is president/CEO of the Council for Adult and Experiential Learning.