Newswise — Grading homework assignments for a large class can be a time-consuming chore, and overworked teaching assistants often have little time to give students detailed feedback. But help may be on the way: A new crowdsourcing tool developed at the University of California, Santa Cruz, lets students get involved in the grading.

CrowdGrader (www.crowdgrader.org) allows students to submit their homework online and then distributes the submitted solutions anonymously to other students in the class for grading. Using a novel crowdsourcing algorithm that relies on a reputation system to assess each student's grading accuracy, CrowdGrader combines the student-provided grades into a consensus grade for each submission. As an incentive to take the grading task seriously, each student's overall grade for an assignment depends in part on the quality of his or her work as a grader.
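To illustrate the general idea of reputation-weighted consensus grading, here is a minimal sketch. It is not the actual CrowdGrader algorithm (the technical report describes the real one); the update rule, function names, and constants below are illustrative assumptions. The intuition it captures is the one described above: graders whose grades track the consensus earn more weight, and the consensus is recomputed with those weights.

```python
# Hypothetical sketch of reputation-weighted consensus grading.
# NOT the actual CrowdGrader algorithm; the iteration scheme and
# the reputation update rule are illustrative assumptions.

def consensus_grades(reviews, iterations=10):
    """reviews: dict mapping submission -> {grader: grade}.

    Returns (consensus, reputation): a consensus grade per
    submission and a reputation weight per grader.
    """
    graders = {g for revs in reviews.values() for g in revs}
    reputation = {g: 1.0 for g in graders}  # start everyone equal
    consensus = {}
    for _ in range(iterations):
        # 1. Consensus = reputation-weighted average of the grades
        #    each submission received.
        for sub, revs in reviews.items():
            total_w = sum(reputation[g] for g in revs)
            consensus[sub] = sum(
                reputation[g] * grade for g, grade in revs.items()
            ) / total_w
        # 2. Each grader's reputation shrinks with their mean
        #    deviation from the current consensus.
        errors = {g: [] for g in graders}
        for sub, revs in reviews.items():
            for g, grade in revs.items():
                errors[g].append(abs(grade - consensus[sub]))
        for g, errs in errors.items():
            mean_err = sum(errs) / len(errs)
            # Small constant avoids division by zero; accurate
            # graders end up with large weights.
            reputation[g] = 1.0 / (0.1 + mean_err)
    return consensus, reputation
```

For example, if two graders score a submission near 8 and a third scores it 2, repeated iterations pull the consensus toward 8 while the outlier's reputation, and hence influence, shrinks. A grade for the grading work itself could then be derived from each student's final reputation, echoing CrowdGrader's incentive design.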

Luca de Alfaro, professor of computer science at UCSC's Baskin School of Engineering, worked with graduate student Michael Shavlovsky to develop CrowdGrader. They have been evaluating it for programming assignments (C++, Java, and Android) in computer science classes taught by de Alfaro and others at UCSC and at the University of Naples in Italy. Though the tool is still an experimental project, the preliminary results are encouraging, de Alfaro said.

"My impression is that the accuracy is not perfect, but it's no worse than what a TA does. There is always imprecision in grading," he said. "The real benefit is in the learning experience."

Grading other students' work actually helps students develop their programming skills, he said, because they get to see how other students have solved the same problem. Also, because their own work is graded by up to five other students, they get more feedback than they would from a single teaching assistant.

"TAs are fairly consistent in the way they grade the homework submissions, but they tend to have schematic grading criteria that focus on some things and might miss others. From what I saw, the feedback from students gave a more holistic evaluation of the students' work," de Alfaro said.

De Alfaro and Shavlovsky described CrowdGrader in a recent technical report (arxiv.org/abs/1308.5273), which also presents the results of preliminary user studies. While more user studies are under way, de Alfaro said that he was happy with the results so far. He plans to use CrowdGrader in more classes this fall.

CrowdGrader is available online (www.crowdgrader.org), and anyone can use it. "It's hosted on a very stable platform, so if anybody wants to try it for their own classes, they can," de Alfaro said.

This work was supported in part by a Google Research Award for de Alfaro's work on crowdsourced ranking.