Due to all the changes we made over the last few months, most of the tests didn’t work at all. They referred to non-existent properties, methods, or fixtures. I tried to fix them all, and now most of our unit tests run and pass.

I ran into a small problem: I first ran all the tests one by one, with the command:

ruby -I test test/unit/name_of_the_file.rb

There are 34 files for the unit tests alone, so that meant 34 runs of this command. The command

rake test:units

launches all the tests at once. For now, it runs 99 tests with 138 assertions, hits 2 errors, and then aborts…
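Incidentally, the one-by-one runs can also be scripted. This is just a convenience sketch (the helper name is mine, and it’s not something from our Rakefile):

```ruby
# A convenience sketch, not project code: find every unit test
# file so we can run them in a loop instead of typing 34 commands.
def unit_test_files(dir = "test/unit")
  Dir.glob(File.join(dir, "*.rb")).sort
end

# One `ruby -I test ...` process per file, just like doing it by hand.
unit_test_files.each do |file|
  system("ruby", "-I", "test", file)
end
```

Of course, `rake test:units` does exactly this in one process, which is why I switched to it.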

We now have an internal test server set up. Everybody interested in the current state of development is very welcome to experiment with it. We are doing our best to keep this test server close to the current state of development. Feedback, bug reports, etc. are greatly appreciated. Interested people who have access to lab 2270 in Bahen can go to desktop machine b2270-20 and try it out. There is a user ‘test’ with password ‘test’ set up, and you should find a link to the test server (running cutting-edge OLM) on the desktop after login. Then use username ‘s’ and any password longer than one character to log in to checkmark. If you have questions, we are sitting right next to you 🙂 Thanks!

We are already two weeks into the summer and I haven’t published the promised set of goals for the summer. This happened partly because of my procrastination, and partly because Nelle, Mike, and Severin dove into the project with such vigour that I was afraid of holding them back.

Nevertheless, we need to keep our eyes on the goals:

0. Refactor the database schema and finish the refactoring from last term

This wasn’t on my original set of goals, but was identified on day 1 by Nelle, Mike, and Severin as the first step. We did a big refactoring last term, and not all of the dead code was removed. We are well on the way to completing this task.

1. Deployment.

This is the most important goal of the summer. We plan to use OLM in the classroom in September, so all decisions on features this summer will be based on whether or not they can be fully completed by September.

The first thing we are doing to work on this goal is meeting with Alan in the last week of May to figure out how to get an intermediate version running on our production server. Hopefully, if we spend time on this in late May and early June, we can iron out the system administration aspects of deployment.

Also, Severin is working on a test server that we can frequently update both to run tests and to use a demo server so that we can start to try out some of the new features on other instructors and students.

2. User permissions and visibility

The roles (student, ta, instructor) exist in the code but we aren’t doing full checks to make sure that users can see only the views they are supposed to.
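To make the missing checks concrete, here is the shape of the rule we need to enforce. The table and method names are illustrative, not OLM’s actual code; in the Rails app, something like this would back a before_filter on each controller.

```ruby
# Illustrative only: which views each role is allowed to see.
ROLE_VIEWS = {
  "student"    => %w[submission result],
  "ta"         => %w[grader],
  "instructor" => %w[admin grader submission result],
}.freeze

# True only when the role's list includes the requested view;
# unknown roles get nothing.
def can_view?(role, view)
  ROLE_VIEWS.fetch(role, []).include?(view)
end
```

The point of the full checks is that a student who types a grader or admin URL by hand gets redirected rather than served the page.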

3. File submission backend

It would be really nice to use a version control system to store the student file submissions. The current plan is to build an abstraction layer for file add, delete, replace, and view. This will allow us to build 3 versions of the backend: an in-memory store for testing, a file-based store for simplicity and completeness, and the desired SVN store. There are a bunch of design issues here that are unresolved.
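To make the abstraction concrete, the in-memory store for testing might be as small as this. The class name and error handling are my sketch, not settled design, but the four operations match the list above, and the file-based and SVN backends would expose the same interface.

```ruby
# Sketch of the testing backend: the same add/delete/replace/view
# interface the file-based and SVN stores would also implement.
class MemoryFileStore
  def initialize
    @files = {}
  end

  def add(path, content)
    raise ArgumentError, "#{path} already exists" if @files.key?(path)
    @files[path] = content
  end

  def replace(path, content)
    raise ArgumentError, "#{path} not found" unless @files.key?(path)
    @files[path] = content
  end

  def delete(path)
    raise ArgumentError, "#{path} not found" unless @files.key?(path)
    @files.delete(path)
  end

  # Raises KeyError if the path was never added.
  def view(path)
    @files.fetch(path)
  end
end
```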

4. TA mapping

The view that allows an instructor to attach TAs to the groups that they will mark is incomplete.

5. Administrative interface

This version of OLM includes a web interface for the instructors to do all the setup for an assignment. This includes assignment setup, group formation, rubric creation, downloading data (grades, annotations, files), and probably a few other things. We haven’t put any thought into the UI for this yet, so that’s on the list.

6. Grader view

This view is mostly complete, but it needs to be tightened up and made prettier. Undoubtedly, it needs more testing.

7. UI design

We’ve spent most of the effort so far (with a few exceptions) on the back end, so we have lots of work to do on the UI. Fortunately, we will be able to work with a Season of Usability student on this part of the project.

No list of goals would be complete without a list of the new features that would be nice to have. I’m not sure how many of these we will be able to get to, but I’m hoping we can get a few of them in.

Logging: I’d like to be able to log all events and display them to the instructor.

Metrics: I’d like to be able to display graphs and different views of the data we collect such as mark breakdowns, histograms of marks, number and length of annotations, number of submissions per group, etc.

Mark entry: If OLM becomes really useful, it would be nice to have a view to use to enter grades from another source such as lab marks or test marks.

Another marking scheme: The rubric approach has proven to be pretty flexible, but it would be nice to have another way to specify a marking scheme.

Automated testing: This one is the highest-priority long-term goal. I don’t think we will get to it this summer, but I would like to be able to allow students to submit their code and receive immediate feedback from an automated system.
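As a taste of the metrics item above, a mark histogram is only a few lines of Ruby. The marks and the 10-point bucket size here are invented:

```ruby
# Bucket marks into 10-point bins, one of the breakdowns we'd
# like to show instructors.
def mark_histogram(marks)
  marks.group_by { |m| (m / 10) * 10 }
       .transform_values(&:size)
       .sort.to_h
end

mark_histogram([72, 85, 91, 67, 85, 58, 73])
# => {50=>1, 60=>1, 70=>2, 80=>2, 90=>1}
```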

One of my long term goals with OLM is to support automated testing where students could submit code and get immediate automated feedback. They might do this multiple times before submitting a final version. This means that we need to be able to keep multiple versions of a student’s submission. So, rather than inventing an ad hoc way to do this, our plan is to store student submissions in an SVN repository.
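In outline, “store a submission” then becomes “commit into a working copy”. The helper below is only a sketch of that idea; the svn client calls and all the names are my assumptions, and the real code would sit behind the file storage abstraction layer rather than stand alone.

```ruby
# Sketch only: every upload of the same filename becomes a new
# SVN revision. Assumes a checked-out working copy and the `svn`
# command-line client on the PATH.
def commit_submission(working_copy, filename, content, author)
  path = File.join(working_copy, filename)
  first_version = !File.exist?(path)
  File.write(path, content)
  system("svn", "add", path) if first_version   # new files must be scheduled
  system("svn", "commit", "-m", "Submission from #{author}", working_copy)
end
```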

We don’t want to write a full subversion interface in OLM, partly because it just seems wrong, and partly because I’m not sure that there is good motivation to ask our first year students to use a version control system for their assignments.

Let’s first look at the issues from the point of view of a first-year student who doesn’t know how the files are stored in OLM. When a student uploads a file with the same name as one that was previously uploaded, the new file becomes the new version. From an SVN point of view, no attempt is made to merge the files. I don’t think there is a problem with this scenario, but students who understand how SVN works may not expect this behaviour.

However, if students are already using repositories for their assignment work (as they do in our second year programming courses), it would be really nice if OLM could just use those repositories. When we use repositories in our courses, students submit assignments by committing files to their repo before the due time.

Design question 1: Should students who are already using repositories have a web interface to “submit” their assignment?

If they do have a web interface, it complicates OLM’s job, because OLM then has to worry about conflicts. There may also be confusion for the students about which mechanism to use, and the differing behaviour between the two mechanisms. Students may need to worry about dealing with conflicts and merging changes when they are actively using a repository.

Design question 2: Do students need to be able to submit directories through the web interface? Should a student’s submission be able to include subdirectories?

The issue here is whether the web interface for assignment submission needs to support the ability to create and delete directories, or whether we can just support browsing them. Here again there is a distinction between the cases where the students are actively using the repositories and where they have only the web interface for assignment submission. If students have direct access to the repository, then it makes sense to do all directory adds and removes through SVN. If they don’t have access to the repository, then do they really need subdirectories? (We didn’t support subdirectories in the original OLM.)

The problem lay where I didn’t expect it – it was Firebug. I was using an alpha version of Firebug, which hadn’t updated itself to the latest version. It was capturing the AJAX POST request going out, but didn’t let the parameters carry through. So, I updated to Firebug 1.2.1, and I’m set. Everything’s jake.