In Fall
2012, I offered my course CSCI 1730. This is a junior-, senior-, and
beginning-graduate-level course in programming languages (not in how to
program, but rather in linguistic mechanisms). Together with my PhD student
(and graduate TA) Joe Politz, I decided to offer it on-line in addition to
in-class.

My primary
goal was to understand this new teaching medium. As someone who runs very
interactive classes and teaches solely by writing on a board, I had long been
convinced that my teaching methods would simply never work with a remote
audience. Having maintained this position for many years, I felt it important
to experiment and learn how to adapt: everyone of a certain age (or pop culture
sensibility) recognizes the phrase, “video killed the radio star.”

I did not do it for the reasons that the
founders of Coursera have proclaimed: that they had almost no student
engagement in their classes, they were tired of telling the same old jokes, and
so on. One might conclude from their narrative that teaching and learning at
Stanford must be a terrible experience, though a more charitable (and much more
likely) reading is that they are exaggerating for corporate effect.

Hype and exaggeration apart, I do believe higher education is at a potentially
critical juncture. Against this backdrop, Brown is engaging in a large planning
effort, investing significant energy and resources in campus space. We are
fortunate to be having this discussion after the MOOC (Massive Open On-Line
Course: the idea of teaching courses through electronic media to large numbers
of students, as exemplified by courses on Coursera, Udacity, EdX, and other
organizations) phenomenon has begun; it would be unfortunate if it did not
significantly affect these conversations, especially given its impact on the
classroom (which I think is likely to be enormous).

THE ONLINE COURSE, AND BROWN'S VALUE ADDITION

It was always clear that we could not offer exactly the same
course as we gave Brown students. One of the important parts of my course is a
set of open-ended written assignments. I consider these extremely important in
measuring student understanding of the material, but we almost certainly lacked
the resources to grade them for the on-line students. Nor were we willing, as
many MOOCs are, to “grade” using simple computer-driven textual analysis; we
wanted to read the responses in depth. Thus the courses differed, and we were
able to point to tangible differences—beyond the evident intangibles—between
the Brown and on-line offerings.

CERTIFICATION LEVELS

Because we were not offering Brown’s course in full, we were
free to customize our course to different on-line clientele. Instead of grades
(which would suggest having done the equivalent of the Brown course), we publicized
three different “certification levels”:

Lite: Completing a sufficient number of daily quizzes (but no more)

Mezzanine: Beyond Lite, completing the minor project that occupies the first month

Ninja: Beyond Mezzanine, completing the major project that occupies the remaining two months

When we noticed that many of our initial sign-ups were
professional programmers, we added a fourth:

Sprint: The minor project, and quizzes during its duration

The Sprint option enabled people to engage intensively for
one month, and then disengage fully from the course and return to their
professional and other lives. The completion numbers indicate that this was a wise
addition.

BY THE NUMBERS

We had about 1650 signups initially. In keeping with all other MOOCs,
attendance dropped off rapidly (especially after we made the opening assignment
quite hard). Our completion ratio, at just under 5%, was about what one might
expect for an upper-level technical course: 80 students finished, distributed
as follows:

Lite: 23

Sprint: 23

Mezzanine: 32

Ninja: 2

The distribution of sign-ups looked like a heat-map of
computer science: large clusters in the US Northeast, the Pacific Northwest,
and Northern and Southern California; a strong showing in the London area; and
an especially strong cluster in India’s technology hub (and my hometown),
Bangalore (now officially Bengaluru). We were surprised by the relative lack of
signups from China, Japan, and Korea, but attributed this to our publicity
methods and to potential language difficulties.

The distribution of finishers was not at all the same. We
had one each from Argentina, Australia, Tanzania (a Dutchman who has lived
there for a long time doing missionary work with his doctor wife), Thailand,
China, Finland, Belarus, Hungary, Romania, Belgium, Spain, and Portugal. Only
Russia, Germany, Canada, Japan, and India, other than the US, provided multiple
finishers; the Indians were distributed around the country, in no way matching
the distribution of signups. The American finishers also did not correspond to
the signup distribution, with a very strong showing from the Midwest and
Northeast, nobody from the US Pacific Northwest, and one each from Northern and
Southern California. In general, therefore, tech hubs seem to offer masses of
enthusiasts whose initial interest does not translate into completion. (To our
delight, though, we had at least one person on each settled continent!)

I also analyzed the finishers by self-described occupation: “IT” means anyone
in the computing industry, while “student” could mean anything from high school
upwards (though I don’t believe any of the high-schoolers who enrolled got very
far). Note that some people did not provide this information.

The ages were distributed as follows; though we had several in the 13-18 age
range sign up, none of them survived the course:

            19-25   26-34   35-50   Over 50
Lite          5       3       3        1
Sprint        3       8       5        1
Mezzanine     5      12       7
Ninja         1       1

At
sign-up, we also asked people what their likelihood was of finishing each of
the certification levels. Suffice it to say these expectations greatly outstripped
reality (not least because roughly 1500 participants failed to complete any
level).

THE BOTTOM LINE

I expected
my in-class experience would remain largely unchanged, while I would learn most
from the on-line component. The exact reverse was true. The on-line component
went along mostly predictable lines, with few surprises. In contrast, the
provision of videos had a dramatic and (in my mind) undesirable effect on the
in-class experience: of sixty students, only about twenty attended class regularly.

Many students attributed their lack of attendance to the “early” hour of the
class: 10am on MWF. As a card-carrying computer scientist, I’m guilty of having
had similar views as an undergraduate. However, the same course has been
offered at 10am for years with attendance always close to perfect, and this
year’s class didn’t seem especially different in constitution. In short,
providing videos has the potential to significantly reduce class attendance,
even in relatively interactive, discussion-oriented classes.

PUBLICITY

We made
our decision during the summer preceding the course, well before Brown’s
Coursera announcement. We therefore had to do all publicity ourselves. We made
announcements on some mailing lists, and on our own social media pages. We did
not employ any other means of advertisement, such as purchasing Google ads. It
was never our goal to bulk up with large numbers of students (we were frankly
surprised when signups first crossed 100!), so other means of advertising made
no sense.

FORMAT

I normally put all my course material on-line, without any
firewall (like the abominable Blackboard and its siblings). What changed is
that we created mechanisms for grading on-line student work (more on this
later), and also published videos of all the classes. Rather than create
off-line video snippets (as used in flipped classrooms), we simply recorded
class and published it in full. Some on-line students reported that they
enjoyed the sense this gave of actually being in the class.

To avoid visibility problems, I changed from writing on the
board to writing on a tablet computer projected on a screen: nearly the same
writing experience for me, but with perfect visibility on video. (Indeed, the
tablet offered some advantages a whiteboard does not, such as the ability to
move a block of text from one location to another.) To protect the privacy of
students, we recorded from the back of the room so their faces were not seen.

After every class, we converted the videos and published
them on YouTube. On-line student discussion took place on Piazza, where Brown
students were welcome (but most did not actively participate, at least not by
name).

PLATFORMS

Instead of
sticking with one packaged platform, we used a variety of on-line media: Google
Plus, Google Documents, Google Groups, Batchgeo (to make maps), Dropbox (to
share videos), Piazza (for discussion), JotForm (for uploading solutions),
Brown Computer Science facilities, and software we wrote ourselves. We chose
this route so we could understand, from the ground up, what tools such an
effort needs, and not be hemmed in by any one platform. Because I had a staff
of world-class problem
solvers, I was confident we could fight our way out of any tight corners, and
this approach indeed worked well.

STUDY GROUPS

We felt it
was important to help people form local study groups, and many students were
interested in this, too. Lacking a platform to do this for us, we created an
open Google Map that any participant could edit, so they could drop pins indicating
where they were and find one another. This worked well enough, and several
study groups sprang up around the world.

ONLINE STUDENT BEHAVIOR

The
on-line students generally behaved in exemplary fashion. Once we had weeded out
the “tourists” (my term for those who were never going to be serious students
in the class), the remainder were often genuinely grateful for the class
experience, and were far less demanding than I expected. Indeed, I think they
were undemanding to the point of hurting their educational experience.

I was especially
afraid of being pestered with email messages of the “i dont know how to install
ur software” variety. These never materialized. The few people who contacted us
by email had good reasons and kept it brief and on point. We would actually
have enjoyed more interaction with some of the on-line students.

The
beginning of the semester was problematic on Piazza. Because there was nothing
much to do, the on-line students turned it into yet another Web discussion site
(perhaps to shake out their anxieties), holding forth vapidly on the course
topic and much else. I believe this turned off many Brown students, in response
to which we created a Brown-only announcement mailing list. Perhaps if we had
performed better crowd control initially, Piazza would have remained the single
forum everyone used.

I
encountered only one moment of angst: when a male on-line student made an
inappropriate remark responding to a female on-line student. I caught this
within an hour of its appearance (during which time it had received fewer than
twenty views), deleted it immediately, and posted a chastising comment on the
discussion site. Happily, the female student stayed with the course until the
very end, and remained a strong contributor.

There
was just one sense in which on-line students were very demanding: in digital
formats. We initially expected we would simply upload our videos to YouTube.
But some students complained they couldn’t easily access YouTube, or wanted the
video for off-line viewing (e.g., while commuting to and from work), so we had
to make a direct link also accessible. Some wanted low-resolution versions of
the video due to weak Internet access. Some wanted access to the digital
version of what I wrote on the “board”. Some even wanted audio-only access to the
lectures. Keeping all these different needs satisfied was a significant and
constant burden. Surveys suggested each of these formats was useful to just
enough students to be worth continuing to provide, and once we had begun to
offer one we couldn’t take it away.
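
To give a sense of this burden, here is a minimal sketch, in Python, of the
kind of fan-out each lecture recording required. This is an illustration rather
than our actual pipeline: it assumes ffmpeg is installed, and the file and
directory names are hypothetical.

    # A sketch (not our actual tooling) of fanning one master recording
    # out into the formats students asked for. Assumes ffmpeg is installed.
    import os
    import subprocess

    MASTER = "lecture.mp4"  # hypothetical master recording
    os.makedirs("formats", exist_ok=True)

    def transcode(args, output):
        """Run one ffmpeg job over the master recording."""
        subprocess.run(["ffmpeg", "-y", "-i", MASTER] + args + [output],
                       check=True)

    # Full-resolution copy for direct (off-line) download: no re-encoding.
    transcode(["-c", "copy"], "formats/full.mp4")

    # Low-resolution version for students with weak Internet access.
    transcode(["-vf", "scale=480:-2", "-b:v", "300k"], "formats/low.mp4")

    # Audio-only version for those who listened while commuting.
    transcode(["-vn", "-c:a", "libmp3lame"], "formats/audio.mp3")

Multiply this by three lectures a week, plus uploads and link maintenance, and
the ongoing cost becomes clear.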

The
timing of our homeworks had an interesting and unintended consequence. Because
I was redesigning the course from scratch, many of the projects were brand new
and needed debugging. We put out assignments on Fridays. Most of the on-line
students, being working professionals, did them immediately, and helped us find
and fix most of the problems. Thus, by the time most Brown students got to the
assignments, they encountered much better versions of them.

STAFFING

I did not have any additional resources to teach the on-line
offering. My regular course staff consisted of my grad TA and six undergrad
TAs. I informed the undergrad TAs that, because this was a project being run by
my grad TA and me, they were under no obligation to participate. Though they
largely did not help with Piazza, they handled the video recording and
publication almost entirely on their own. (These videos obviously benefited the
undergrads also, but without the UTAs there would have been no on-line course
at all, so in that sense they were indispensable. Accordingly, I’d like to
thank Liam Elberty, Jonah Kagan, Peter Kaufman, Scott Newman, Jon Sailor, and
Varun Singh.)

COMPARISON TO COURSE GRADES

Several people have asked me how these certification levels
correspond to letter grades. They don’t at all, because the Brown students had
to do additional work (the written homeworks). However, very loosely, doing a
reasonable job on the written homeworks, combined with completing the Sprint
requirements, earned a C; doing better on the written homeworks and completing
the Ninja requirements at a reasonable level earned a B; and doing well on both
the written homeworks and the Ninja requirements earned an A. In short, the
grade requirements for Brown students were much higher than for on-line
students (which is why we created entirely different names rather than using
letter grades). Despite this, Brown students did much better than the on-line
students: 40 A’s, 7 B’s, 8 C’s, and 8 NC’s (in a non-required course).

GRADING

Because we only graded the programming-related assignments
for on-line students, all their grading could be automated. Most on-line
programming courses have students upload programs that are run by grading
scripts. We decided that we didn’t want the headache of dealing with
potentially malicious programs (it may help—or hurt—that Joe and I both do
computer security research), nor the expense of running these programs on a
cloud provider. We therefore instead handed out a binary program for each
assignment that would run the same checks on the students’ own machine, and
report the results back to us. (As Joe pointed out, this puts the trust
relationship in the right direction: we have no reason to trust them, but if
they don’t trust us enough to run our program, why are they taking a course
from us?)

Of course, when the students are reporting their answers to
us, it’s too easy for them to cheat. We therefore embedded a little ad hoc
cryptographic protocol—Joe appositely labeled it “craptography”—in the grading
programs to make this difficult. Our goal was not to create something
impregnable, but rather to deter casual (indeed, all but the most determined)
cheating. In retrospect, this process worked well.
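
To make the shape of such a scheme concrete, here is a minimal sketch in
Python. This is not our actual protocol, and every name in it is hypothetical;
it simply shows how a grading program can run checks locally and produce a
result the server can validate.

    # A sketch of the general shape such a scheme can take; not our actual
    # protocol, and every name here is hypothetical. The deliberate weakness
    # is that the key ships inside the grading program, so it deters only
    # casual forgery -- hence "craptography".
    import hashlib
    import hmac
    import json

    EMBEDDED_KEY = b"baked-into-the-distributed-binary"  # extractable in principle

    def run_checks(submission):
        """Run each check against the student's code on their own machine."""
        checks = {
            "empty-list": submission([]) == 0,
            "singleton": submission([5]) == 5,
        }
        return {name: bool(ok) for name, ok in checks.items()}

    def make_report(student_id, results):
        """Serialize the results, tagged so casual tampering is detectable."""
        payload = json.dumps({"id": student_id, "results": results},
                             sort_keys=True)
        tag = hmac.new(EMBEDDED_KEY, payload.encode(),
                       hashlib.sha256).hexdigest()
        return {"payload": payload, "tag": tag}

    def verify_report(report):
        """Server side: recompute the tag and compare in constant time."""
        expected = hmac.new(EMBEDDED_KEY, report["payload"].encode(),
                            hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, report["tag"])

    # Toy usage: grade Python's built-in `sum` as if it were a submission.
    report = make_report("student-42", run_checks(sum))
    assert verify_report(report)

Because the key is embedded in the program being distributed, a determined
student could extract it and forge a perfect report; as noted above, the goal
is only to raise the bar well above casual tampering.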

WHO GAINED FROM THIS EXERCISE?

I gained the most. I got to experiment with what is clearly
an upcoming challenge to our profession. I got the opportunity to reach out to
whole new segments of the computing population. (We already have a new master’s
student applicant from this on-line audience, and I wouldn’t be surprised if
some of the participants end up becoming PhD applicants down the road.)

Joe and the other course staff also learned a lot about the
needs and demands of on-line teaching platforms. One TA, in particular, has a
deep interest in MOOCs, and has been considering job offers from companies such
as Coursera and Khan Academy. For these students it was a valuable real-world
software requirements-gathering experience.

The benefits for Brown students were probably fewer, but
that is also because we worked to insulate them from the on-line crowd. I do
think the students benefited some from interactions, especially with
professionals. For instance, they got to see some important differences between
how they and professionals tackled some tasks, and at least some students found
this thought-provoking.

My wife pointed out one subtle benefit for Brown. Over the years, I’ve found it
difficult to explain the chasm between our courses and those almost everywhere
else in the world. Offerings like this give the world
a window into what we do, and let them judge just how demanding (and good) our
courses are. This raises the profile of our students with potential employers
and others who need to evaluate them. By being uncompromising in the quality of
our courses, and by showing that there’s more to a Brown course than what is
offered on-line, we signal to the best students worldwide that we are a place
where they might feel at home.