Monthly Archives: February 2008

How do we nurture and sustain our open source projects? That’s one of the key questions that emerged from a workshop showcasing three JISC Capital Programme projects working on e-assessment held recently at the University of Cambridge. The projects – AQuRate, Minibix and ASDEL – demonstrated the end-to-end assessment process which interoperability can support, with items authored in AQuRate, uploaded to and extracted from the Minibix item bank and successfully delivered in ASDEL. All three tools implement the QTI v2.1 specification, building where possible on earlier open source work funded by JISC. Code will be added to the projects’ SourceForge space, and it’s hoped that the quality of these tools combined with the collaborative tools provided by SourceForge will help to sustain them after the end of the funding period in March.
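To give a flavour of the kind of item these tools exchange, here is a minimal sketch of a QTI v2.1 multiple-choice item, parsed with Python’s standard library to pull out the declared correct response. The item itself is invented for illustration and isn’t taken from any of the three projects:

```python
import xml.etree.ElementTree as ET

# The QTI v2.1 namespace used by conformant items.
QTI_NS = "http://www.imsglobal.org/xsd/imsqti_v2p1"

# A minimal choice item of the sort authored in a tool like AQuRate and
# delivered in a player like ASDEL. Identifiers and text are invented.
item_xml = f"""\
<assessmentItem xmlns="{QTI_NS}" identifier="demo-choice" title="Demo item"
                adaptive="false" timeDependent="false">
  <responseDeclaration identifier="RESPONSE" cardinality="single" baseType="identifier">
    <correctResponse><value>choiceA</value></correctResponse>
  </responseDeclaration>
  <itemBody>
    <choiceInteraction responseIdentifier="RESPONSE" shuffle="false" maxChoices="1">
      <prompt>Which specification do the three tools implement?</prompt>
      <simpleChoice identifier="choiceA">QTI v2.1</simpleChoice>
      <simpleChoice identifier="choiceB">QTI v1.2.1</simpleChoice>
    </choiceInteraction>
  </itemBody>
</assessmentItem>
"""

def correct_response(xml_text: str) -> str:
    """Return the identifier declared as the item's correct response."""
    root = ET.fromstring(xml_text)
    value = root.find(f"./{{{QTI_NS}}}responseDeclaration/"
                      f"{{{QTI_NS}}}correctResponse/{{{QTI_NS}}}value")
    return value.text

print(correct_response(item_xml))  # choiceA
```

Because the response declaration is separated from the interaction that renders it, an item bank such as Minibix can store and search items without needing to know how any particular player will present them.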

A recent graphic from Le Monde illustrates the geographic variation in popularity of different social networking services across the world. The graphic, based on data from August 2007, shows that while MySpace and Facebook have an unsurprising stranglehold on the North American market, things are very different in the rest of the world.

Slideshows and mp3s from last week’s joint JISC CETIS Assessment and Educational Content SIGs meeting are now available on the wiki. It was a lively and interesting day, covering a wide range of topics of relevance to both communities.

Steve Lay of CARET, University of Cambridge, who had kindly offered to host the event, provided an update on the IMS QTI specification. Steve is co-chair of the IMS Assessment SIG which is responsible for the development of the QTI specification, and provided attendees with background information and an update on the current position of QTI v2.1. The specification was released in public draft form in July 2006, and it was hoped that the final version would be released in early 2008. Delays to the interoperability demonstration required before final release have pushed the release back to later this year, with an addendum to the public draft scheduled to appear in the meantime.

Steve also described some of the issues around profiling specifications and the role of IMS’s Application Profile Management Group, particularly in relation to the IMS Common Cartridge specification which currently includes a profile of QTI v1.2.1. His examination of the pressures put on the scope of the specification is particularly useful.

Wilbert Kraan from CETIS complemented this with an update on content packaging specifications, covering OAI Object Reuse and Exchange (ORE), Content Packaging v1.2, IEEE RAMLET and a proposed packaging transcoding service. CP v1.2 is still in draft stage and will, like QTI v2.1, be released to the public once IMS members have developed implementations and shown them in interoperability demonstrations. There is quite a lot of updated material in the new version, but the lack of current implementations means that its immediate future is uncertain.

RAMLET is an ontology which enables mapping between IMS Content Packaging, METS, MPEG-21 DID and Atom. Wilbert raised the particularly interesting question of whether this approach could apply to question and test materials, not just in QTI but in other formats too, potentially including HTML. Steve confirmed that such content should transform readily to QTI, and highlighted the potential value for enhancing accessibility.

CETIS’s Deputy Director Adam Cooper presented a postcard from the IMS Quarterly meeting in Long Beach held the week before. This was an extremely useful update on recent developments within IMS and current work in progress, which includes Enterprise Web Services v2.0, Learning Tools Interoperability v2.0, Common Cartridge and Common Cartridge Schools (CCK12), Digital Interactive Content Exchange and various ‘odds and sods’ including QTI v2.1.

Moving away from the more abstract topic of specification development to their real-world uses, Ross Mackenzie and Sarah Wood of the Open University discussed their experiences of creating Common Cartridges for the OU’s OpenLearn, which releases free content under a Creative Commons licence for use worldwide. Content, largely drawn from OU archives, was transformed into XML, an approach which allows material to be subsequently rerendered in multiple formats. After hand-crafting a small number of cartridges, the team developed an automated process which has so far produced around 400 cartridges for download; assessment material has not yet been covered but is of obvious interest. Issues around certification and validation were also highlighted: some Common Cartridge Alliance members have proposed cartridge testing fees of up to several hundred dollars, a cost that sits awkwardly with an initiative which aims to give content away for free.
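The single-source approach described above can be sketched in a few lines: one structured XML source, several renderers. The element names and renderers here are invented for illustration and bear no relation to the OU’s actual production schema:

```python
import xml.etree.ElementTree as ET

# A toy single-source unit: structured XML that can be rerendered into
# several output formats. The schema is invented for illustration only.
unit_xml = """\
<unit title="Reusing open content">
  <section heading="Introduction">
    <para>Structured source files can be rerendered in many formats.</para>
  </section>
</unit>
"""

def to_html(xml_text: str) -> str:
    """One of several renderers driven by the same source XML."""
    root = ET.fromstring(xml_text)
    parts = [f"<h1>{root.get('title')}</h1>"]
    for section in root.findall("section"):
        parts.append(f"<h2>{section.get('heading')}</h2>")
        for para in section.findall("para"):
            parts.append(f"<p>{para.text}</p>")
    return "\n".join(parts)

def to_plain_text(xml_text: str) -> str:
    """A second renderer: the same content as plain text."""
    root = ET.fromstring(xml_text)
    lines = [root.get("title")]
    for section in root.findall("section"):
        lines.append(section.get("heading"))
        lines.extend(p.text for p in section.findall("para"))
    return "\n".join(lines)

print(to_html(unit_xml))
```

The pay-off of this design is that adding a new target format (a Common Cartridge, say, or print) means writing one more renderer, not re-editing the content.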

Cartridge creation tools mentioned included OU Publisher (which it’s hoped will be made available in Moodle at some point), eXe and Microsoft Grava; desktop players include UCompass based on Adobe AIR and a Microsoft development based on Silverlight; it’ll be interesting to see how this particular battle works out.

Assessment SIG regulars will be familiar with the work Niall Barr of NB Software has done around assessment and QTI, including some valuable developer resources. He has now moved on to work on the IMS Common Cartridge and Tools Interoperability specifications, with particular reference to assessment and the QTI specification, and presented some of this work to the meeting. An mp3 recording of his talk is available, and we hope to have the slides available shortly.

Linn van der Zanden of the Scottish Qualifications Authority (SQA) closed the meeting with a fascinating look at some of the more innovative assessment activities the SQA has been piloting in recent months. This particular project, led by Mhairi McAlpine, has introduced blogs and wikis to support assessment of a PBNC in Health and Safety. The course places heavy emphasis on collaborative work, which makes it difficult to assess individual contributions. A team wiki enables assessors to evaluate individuals through its history function, with discussion pages providing evidence of debate and dissent. This approach also helps to identify ‘freeloaders’ who contribute little, as well as stronger personalities who might take over group activities. Personal blogs support reflective learning, while traditional e-assessment facilities support the submission of project plans. Login requirements provide a degree of authentication of contributions, and students have responded positively to the approach. The current small-scale pilot, involving fifty students in two colleges, is likely to be rolled out on a wider scale over the next few years.

Our thanks go to our friendly and helpful hosts at CARET and to all our speakers who helped to make this such a useful and interesting event, and my thanks go to Sheila, our Educational Content SIG coordinator, for collaborating on the event and chairing the meeting so effectively on the day. You can read Sheila’s discussion of some of the issues raised by the meeting on her blog.

Attendees at last September’s SIG meeting will remember Martin Hawksey’s lively presentation on the Re-Engineering Assessment Practices in Scottish Higher Education (REAP) project. Funded by the Scottish Funding Council and supported by JISC, the project explored ways in which technology can be used to enhance and transform assessment practice in large first year university classes, resulting in enhanced learner skills, greater achievement rates, and deeper engagement.

A final report on the project is available, covering project achievements and lessons learned; preparing for, managing and coping with large-scale organisational change; the pedagogic principles underlying transformation; and a study of the use of electronic voting systems (EVS) and the surprising impact they can make on learning and achievement.

The figures reported are impressive: one course saw mean pass marks rise from 51.1% to 57.4%, another’s examination failure rate dropped from 24% to 4.6%, while a third saw a 10.4% gain in mean examination marks; hundreds of hours of staff time were saved through reductions in lectures, tutorials and the use of online assessments while students actually spent more time ‘on task’, and the nature of staff-student contact became more supportive and facilitative. Self-assessment and peer assessment gave students more responsibility for and ownership of their learning, to which students generally responded positively.

As the report suggests, ‘these findings suggest that these processes of transformation are a plausible prospect more generally in the HE sector’.

The latest NMC Horizon Report focuses on six technologies or practices in particular: grassroots video and collaboration webs, predicted to enter the mainstream over the next year; mobile broadband and data mashups (two to three years); and collective intelligence and social operating systems (four to five years). The emphasis is on educational applications of these technologies, with a range of example projects and products illustrating them in action.

Earlier Horizon reports and other publications can also be freely downloaded from the NMC site.