Jeff Young has a great piece in the Chronicle of Higher Ed today about what he calls the “bandwidth divide” and how most MOOCs require learners to have persistent high-speed internet access. When we created the Mechanical MOOC course, we built it on existing open resources mostly because we thought it was the most efficient and cost-effective way to do it–by leveraging the investments already made in creating MIT OpenCourseWare, OpenStudy and Codecademy.

We realized very quickly that a lot of additional flexibility came with leveraging these resources. Because they were from mature projects focused on openly sharing their resources and functionality, they had developed alternate modes of delivery to address bandwidth issues:

• the 6.189 course used an open textbook that was downloadable
• the 6.189 course materials (assignments, notes) could themselves be downloaded in a single zip file
• the 6.00SC videos we used were downloadable from iTunes U and the Internet Archive
• OpenStudy was launching a beta mobile interface just as the course kicked off

And our learners downloaded the materials in large numbers:

Beyond that, we were also able to leverage the deep investments made in translating these open resources. The text is available in a dozen languages, and the course materials have been translated into Chinese. By building our course on open resources, we saved money and leveraged the work that these projects have already put into reaching audiences working without persistent internet or in other languages. A win-win-win.

How big is the typical MOOC? – while an enrollment of 180,000 is often cited as the largest MOOC so far, 50,000 students enrolled is a much more typical MOOC size.

How many students complete courses? – completion rates can approach 20%, although most MOOCs have completion rates of less than 10%.

What factors might affect completion rate? – the way that the course is assessed may affect completion rates; the completion rates of courses which use automatic grading range from 4.6% to 19.2%, while the rates for courses which use peer grading range from 0.7% to 10.7%. This may present a greater challenge for teaching MOOCs in certain subjects.

Do more students drop out if courses are longer? – there does not appear to be a negative correlation between course length and completion rate, which is interesting as you might expect fewer students to ‘keep going’ and complete longer courses.

It’s great to see some data on completion rates, and this will certainly stir up more debate on the topic.

But one issue not addressed in the current discussion is who really cares about MOOC completion? Certainly the groups offering them do, and educational researchers do. A fair guess is that many non-profit funders do as well. Interestingly, though, some of the data coming out of the Mechanical MOOC Python course suggest that in the absence of extrinsic carrots like credit or certificates, learners may not.

In the eighth and final week of the class, we asked the 5,775 learners who signed up for the first iteration of the Python course a series of end-of-course questions; we received 21 partial and 61 complete responses. Assuming a survey response rate of 3% (typical of what we see for MIT OCW surveys) or 5% (really good for an OCW survey), that would suggest a rough engaged population of learners (that is, still reading the e-mails we were sending out to structure the course) of between 2,733 and 1,640 people during the last week of the course.*

One question asked which of the course’s eight weeks was the last they had completed. Here’s the response:

At the point of the survey, midway through the eighth week, 12.1% indicated they had completed the course and 13.8% had completed week 7. If we assume 25% attrition among those who had completed week 7, about 10.4% (three quarters of the 13.8%) would be expected to finish the course. So in very rough numbers, 20.5% of the survey respondents might be expected to finish.

Apply that number to the estimated engaged population of learners above, and we get very rough estimates of completers: 560 – 336, or 9.7% – 5.8% of signups, somewhere in the mid-to-low range for MOOCs out there, which might be expected, since we weren’t offering a certificate or other incentive for finishing. Now there are plenty of places to take issue with the above numbers, and since our course setup doesn’t have a solid way of counting course completers, this really should be taken for the back-of-the-envelope analysis it is. But…
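For the curious, the whole back-of-the-envelope chain can be reproduced in a few lines. The 3% and 5% response rates and the 20.5% expected-finish rate are the post’s working assumptions, not measured values:

```python
# Rough estimate of Python-course completers from survey figures.
# All rates here are the post's assumptions, not measured values.
signups = 5775
responses = 21 + 61          # partial + complete survey responses

# Engaged population implied by each assumed survey response rate
engaged_high = responses / 0.03   # 3% response rate (typical for OCW surveys)
engaged_low = responses / 0.05    # 5% response rate (a really good OCW survey)

# Combined expected-finish rate among respondents, per the post
finish_rate = 0.205

completers_high = finish_rate * engaged_high
completers_low = finish_rate * engaged_low

print(round(engaged_high), round(engaged_low))        # 2733 1640
print(round(completers_high), round(completers_low))  # 560 336
print(f"{completers_high / signups:.1%} - {completers_low / signups:.1%}")  # 9.7% - 5.8%
```

The estimate is only as good as the assumed response rate; halving or doubling it moves the completer count by the same factor, which is why the post gives a range rather than a single number.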

What is really interesting to me here is the distribution of learners across the weeks completed. There is a large cohort of students (68.9% of respondents) that reported most recently completing weeks 4–7, which is to say they progressed significantly through the course but most of them were not positioned to finish the course “on schedule.”

How do they feel about this? Apparently pretty good. Granted the n’s are painfully small here, but if you ask how successful they felt they were in the portions of the class they completed, most report being completely or mostly successful:

Further, if you ask whether they feel prepared for further study based on what they had learned so far in the class, they likewise responded largely that they were very or somewhat prepared:

The data’s a little thin, yes, but this would seem to at least suggest that while MOOC providers and higher education commentators wring their hands about the completion rates of MOOCs, the learners may not really care that much. If they are learning for the sake of learning, they may be quite content to fit in what learning they can given the constraints of their lives and be happy with wherever they finish up.

There’s a great deal of excitement (and fear) over whether MOOCs will replace parts of the current higher education system, but right now I suspect most of the activity with MOOCs (as has been the case with OER more generally) is in extending educational opportunity beyond the current higher education system. If this is the case, we may need some better metric for understanding student success and satisfaction than completion rate.

Another good year for MIT OpenCourseWare. Big story here is the tremendous jump in YouTube numbers–this plus the 12 million iTunes U downloads and the redistribution we are getting through Chinese sites like 163.com (which we get no reporting from) means that a lot of the OCW activity is through our video redistribution.

Just completed the 2011 evaluation summary. Hope as always to follow with a more detailed report, but for now, this gives a general idea of directions and trends.

Most interesting thing in here for me is the increase in % of students (up to 45% from 42%), making them now the largest constituency instead of self learners (at 42% down from 43%). These are margin-of-error-ish changes, but interesting nonetheless. Could be a result of the time of year we did the survey, could indicate more people returning to school in a tough economy–lots of possible explanations.

Also interesting that the primary student use is now complementing materials from an enrolled course (up to 45% from 39%) instead of learning outside the scope of formally enrolled coursework (down from 44% to 40%). This may indicate that more students are coming from undergraduate and community colleges, as this lines up more with past measures of usage scenarios at that level, but I’ll have to dig deeper to see if that holds.

October is traditionally OCW’s annual high-water mark for traffic, and last month was no disappointment in that regard. The site received a record 1,733,198 visits from 1,026,004 unique visitors. This eclipses the previous high of 1,602,561/1,015,112 from August last year, and is a 12.4%/12.8% increase over last October.

Other groups are also growing at an impressive rate. 18.01 Single Variable Calculus has nearly 8,500 participants. 21F.101 Chinese I has over 2,000. Several others are closing in on a thousand participants, and all but one of the recently introduced OCW Scholar groups have attracted participants in the hundreds.

• Educational resources are, on balance, beneficial to those who have access to them.
• Being “open” doesn’t diminish the value of “educational resources.”
• Obtaining permission to publish under full copyright is as expensive as publishing under an open license.
• The capacity open licenses provide for translation of OER into other languages, which has extended access to millions, is itself sufficient benefit to justify their use.

In preparation for the upcoming OCW Consortium meeting, I’ve been surveying the OCW/OER landscape, and have come up with what is (for me, anyway) a bit of a startling observation: the number of universities in the 2010 US News World’s Best Universities list that have significant OCW/OER programs underway. By my count, 9 of the top 25, and 15 of the top 50.

This is in no way to downplay the importance of the hundreds of other universities worldwide that are sharing their materials as well. The less resourced universities that serve larger and less prepared student populations provide some of the most valuable materials, but if you want a measure of where the world’s leading universities are in their thinking about OCW and OER, this is worth considering.

While we haven’t yet conducted a survey to gather detailed feedback about the new OCW Scholar courses, we do have one month of analytics now, and the results look pretty good. Collectively, the five courses received just over 95,000 visits in their first month, and all are in the top 30 courses on the site for the period January 12 to February 12, 2011. The top course, 18.01SC, received just under 28,000 of those visits.

All the user feedback we’ve gotten has been positive as well, along the lines of this comment from a student (location unknown): “This is a great help for my incoming local engineering licensure exam. I kind of sped past these topics during college. So when i saw the videos on 8.02 the thoroughness of explanation just amazed me . I think I am beginning to enjoy engineering again :)”

I’ve had my share of existential moments, but this is one I didn’t realize I was having. Or rather my profession was having. I like Taylor’s book for the most part, and think it serves as a useful examination of the field, but I do think it misses on a couple of fronts, which I will discuss in later posts.

But for now, the soundbite, an Ira Fuchs quote: “If you take away OCW completely, I’m not sure that higher education would be noticeably different.” Sure, especially US higher education. The same could be said of Wikipedia. And once again I am filled with the sense that as a movement, we are failing to adequately define success and so leaving ourselves open to having others define it for us.

When OCW was announced, I think there were many out there who hoped it would provide the leverage to break away from the artisan model of teaching to something more scalable. There seem to be two varieties of this hope: one, that faculty around the world would just pick up and use MIT’s curriculum, saving time and improving quality in one fell swoop (the “dirty underwear” model); two, that OCW would repeat Wikipedia’s success and teachers around the world would collaborate on one “killer app” curriculum. A third variety that emerges in Taylor’s book is that online resources might supplant live teachers entirely: the OLI model.

All three I think grow from a view of education that holds it is essentially knowledge transfer, and that there ought to be one “best” way to do it, measurable and precise. Education, at least for me, is intensely local and personal, learning how to learn. I won’t dwell, and plenty of people have spoken more intelligently and articulately on the issue. Comments like Ira’s I think express the frustration of revolutionaries expecting a revolution.

OCW by its nature, though, reinforces the artisanal model of education by providing an example of one of the best artisanal communities of educators in the world hard at work. When we were first going to faculty and encouraging OCW participation, one of the constant refrains we heard was that MIT’s materials were designed for MIT students, and likely weren’t going to be appropriate for most people out there. Not that they were necessarily too high level, just that they were created for a specific community working within specific conditions.

However, OCW materials do provide educators a window into how the MIT faculty community operates, how they craft educational experiences, and other craftsmen and women around the world can draw inspiration and resources from OCW as they create their own educational experiences. But this is not the kind of activity you see writ large on the face of institutions. Nor does it change the fundamental model.

Large parts of the OCW story also take place outside the walls of institutions, offering educational opportunity to people who previously had none, and Ira’s comment completely ignores this. OCW has the potential of impacting a great many lives, and appears at some level to have already done so for hundreds of thousands. But this is a difficult story to document and tell, not one measured in pre-tests and post-tests.

Which doesn’t mean we shouldn’t be doing it, it just means we have a lot of hard work ahead of us. This again is not a post about Ira Fuchs’s comment, it’s a post about our own failures to make the case. To define success. To share what we know about the ways OCW is making a difference around the world. Ira’s right, we haven’t noticeably changed higher education. But we have noticeably changed lives all around the world, and we need to be getting that message out there.