Feedback

I’ve mentioned the appear.in service a couple of times. It allows you to convene small meetings (up to eight people) with video, voice and chat, without the need for logins or additional browser plugins, on both desktop and mobile (my quick video demo here). Today I got an email from appear.in saying:

Get notified when someone enters your room!

We have now made it even easier to start a video conversation. When someone enters your appear.in room, you will receive a desktop notification that you can click to enter the room.

How can you use notifications?

Get notified when someone shows up for a meeting

People who want to talk to you can just go into your room

Make sure everyone on your team is alerted when your team meetings start

Notifications work using a Chrome extension, but once you have it installed you can monitor multiple rooms.

So if you wanted to run remote tutor support hours you could claim an appear.in room and enable notifications. Once you advertise your office hours you can monitor the room, get on with other work and wait for a notification.

Because appear.in allows you to ‘lock’ rooms, if you are providing one-to-one support you can prevent someone else ‘walking in’.

The awkward bit is handling the locked room. There is no queuing service, and anyone visiting a locked room will be presented with the message below. Unfortunately, if someone visits a locked room and sees the locked message, the message doesn’t go away when the room is unlocked.

A way around this might be to have two rooms: a ‘corridor’ and an ‘office’. The corridor room would always be open. As people arrive in the corridor you could greet them and invite them into your office, locking the office during the consultation. Once done you could go back to the corridor if anyone else is waiting. If the corridor gets busy (more than seven people) you’ll have to sit in it yourself or lose the ability to enter (unless, as the owner, you get priority).

[Writing this it’s all sounding very faffy. I’d imagine you could do something similar with Google Hangouts but I love the fact appear.in requires no login. What do you think?]

Here is some text I prepared for a possible Google Apps Developer blog guest post. It doesn’t look like it’s going to get published so rather than letting it go to waste I thought I’d publish here:

Martin Hawksey is a Learning Technology Advisor for the JISC funded Centre for Educational Technology and Interoperability Standards (JISC CETIS) based in the UK. Prior to joining JISC CETIS, and in his spare time, Martin has been exploring the use of Google Apps and Apps Script for education. In this post Martin highlights some features of a Google Apps Script solution which combines Google Spreadsheet and Google Documents to speed up and standardise personal feedback returned to students at Loughborough College.

One of the things that drew me to Apps Script over two years ago was the ease with which you could interact with other Google services. As a ‘hobbyist’ programmer, I also found the combination of Google Spreadsheets and a coding syntax I recognised ideal.

Late last year when I was approached by Loughborough College to take part in their ‘Fast Tracking Feedback’ project, I saw it as an ideal opportunity to get staff using Apps Script and showcase the possibilities of Apps Script to the Google Apps for Education community.

The goal of the project was to produce a mechanism that allows tutors to input assignment grades using a custom UI that mirrors the final feedback sheet, or to enter details directly into a Google Spreadsheet. These details are then pushed out as individually personalised Google Documents shared with the student. This sounds relatively simple, but the complication is that each assignment needs to map to a predefined set of rubrics which vary between units. For example, in one course alone there are over 40 units, and every unit can be assessed using multiple assignments with any combination of predefined criteria across pass, merit and distinction.

Below is an example student feedback form highlighting the regions that are different for each assignment.

The video below demonstrates how the current version of the ‘Fast Tracking Feedback’ system is set up and used:

Solution highlights

A number of Apps Script Services have been used as part of this project. Let’s look at how some of these have been implemented.

DocList Service – The self-filing Google Spreadsheet

The eventual plan is to rollout the Fast Tracking Feedback system to teaching teams across the College. To make the life of support staff easier it was decided to use a common filing structure. Using a standardised structure will help tutors stay organised and aid creation of support documentation.

When a tutor runs the setup function on a new feedback spreadsheet it checks that the correct folder structure exists (creating it if not) and moves the current spreadsheet into the pre-defined collection.
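The check-then-create logic boils down to walking a predefined filing path and creating any missing levels. Here is a minimal sketch in plain JavaScript; the filing path is made up for illustration, and in Apps Script itself the array lookup would be replaced by DocList Service calls:

```javascript
// Given the folders that already exist, work out which levels of a
// predefined filing path still need to be created, in order.
function foldersToCreate(existingFolders, requiredPath) {
  var missing = [];
  for (var i = 0; i < requiredPath.length; i++) {
    if (existingFolders.indexOf(requiredPath[i]) === -1) {
      missing.push(requiredPath[i]);
    }
  }
  return missing;
}

// Hypothetical filing structure for a new feedback spreadsheet.
// In Apps Script each missing level would be created with
// DocsList.createFolder() and the spreadsheet moved with addToFolder().
var path = ['Fast Tracking Feedback', 'BTEC Business', 'Unit 3'];
var existing = ['Fast Tracking Feedback'];
var toCreate = foldersToCreate(existing, path); // ['BTEC Business', 'Unit 3']
```

Doing the check every time setup runs means the same script works whether support staff have pre-built the structure or the tutor is starting from scratch.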

UI Service – Hybrid approach

A central design consideration was to make the Fast Tracking Feedback system easy for College staff to support and change. Consequently, wherever possible the Apps Script GUI Builder was used to create the user interface. Because of the dynamic nature of the assessment rubrics, part of the form is added programmatically by selecting an element holder and adding labels, select lists and textareas. Other parts of the form, like the student information at the top, are created in the GUI Builder as textfields named using normalized versions of the spreadsheet column headers, which allows them to be populated directly with data.
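The header-to-field-name mapping relies on a normalisation helper along the lines of the one in Google’s Apps Script tutorials. A sketch of that idea (illustrative, not the project’s actual code):

```javascript
// Turn a spreadsheet column header like "Student Name" into a
// normalized key like "studentName", so a GUI Builder textfield with
// that name can be populated straight from the matching column.
function normalizeHeader(header) {
  var key = '';
  var upperCaseNext = false;
  for (var i = 0; i < header.length; i++) {
    var letter = header[i];
    if (letter === ' ' && key.length > 0) {
      upperCaseNext = true;          // next word starts a camelCase hump
      continue;
    }
    if (!/[A-Za-z0-9]/.test(letter)) {
      continue;                      // drop punctuation such as '(' or '%'
    }
    if (key.length > 0 && upperCaseNext) {
      key += letter.toUpperCase();
      upperCaseNext = false;
    } else {
      key += letter.toLowerCase();
    }
  }
  return key;
}

var fieldName = normalizeHeader('Student Name'); // 'studentName'
```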

Document Services – Master and custom templates

The process for filling in personalised feedback forms has three main steps. First a duplicate of the Master Template is made and given a temporary name (DocList Services). Next the required assessment criteria are added to the form using the Document Services, mainly the TableCell Class. Parts of the document that are going to be filled with data from the spreadsheet are identified using a similar technique to the Apps Script Simple Mail Merge Tutorial. Finally, for each student the assignment-specific template is duplicated and filled with their personalised feedback.
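The placeholder-filling step can be sketched as follows. This is the string version of the technique from the Simple Mail Merge Tutorial; the real system applies the same idea to text elements of the duplicated Google Document, and the template syntax and column names here are illustrative:

```javascript
// Replace ${"Column Name"} style placeholders in a template string
// with values from a row object keyed by spreadsheet column header.
function fillInTemplate(template, data) {
  var filled = template;
  // Find all ${"..."} tokens in the template
  var tokens = template.match(/\$\{"[^"]+"\}/g) || [];
  for (var i = 0; i < tokens.length; i++) {
    var columnName = tokens[i].slice(3, -2); // strip ${" and "}
    var value = data[columnName] !== undefined ? data[columnName] : '';
    filled = filled.split(tokens[i]).join(value);
  }
  return filled;
}

var template = 'Dear ${"First Name"}, your grade for ${"Unit"} is ${"Grade"}.';
var row = { 'First Name': 'Sam', 'Unit': 'Unit 3', 'Grade': 'Merit' };
var letter = fillInTemplate(template, row);
// → 'Dear Sam, your grade for Unit 3 is Merit.'
```

Running this once per student row is what turns one template into a stack of individually personalised documents.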

Currently the system is configured to place generated feedback forms into a draft folder. Once the tutor is happy for the feedback to be released, either individual or class feedback forms are distributed to students from a menu option in the assignment’s feedback spreadsheet, with a record kept of the status and location of each document.

This project will effectively combine Google Apps for Education and Google Apps Script in order to create a tool which allows tutors to enter grades and feedback in a single spreadsheet which then automatically populates individual feedback proforma, simultaneously sharing these results with students, progress tutors, and administrators as appropriate.

The benefit will be an increase in the efficiency with which assessment feedback can be shared, improving the speed and quality of paper-less student feedback. A successful conclusion to this project will be demonstrated by reduced submission turnaround times and a reduction in the errors brought about by inconsistencies in data entry.

Project funding is not just for deploying technology but also for increasing capacity within the organisation at the operational level. With this in mind I have been working with Loughborough, helping them with the technical aspects of developing the Fast-Tracking Feedback System and also to learn about Google Apps Script via a series of workshops. Friday was the first of these and I thought I’d share the story so far.

So below are two links to the current versions of the Google Apps Script Spreadsheet and example Document template, followed by a quick video to show how it is used. Obviously these are still work in progress as there are still six months to run on the project, but there’s already enough there for others to benefit from and perhaps to feed back on the design.

Last week my colleague, Kenji Lamb, and I were up in Inverness providing some support to the University of the Highlands and Islands (UHI) EDU Team. We were exploring the use/approach to assessment and feedback, sharing what is going on in the sector for the EDU Team to disseminate around UHI. Below are a couple of slide decks I used over the two days.

Having worked on the REAP project a couple of years ago, there was a bit of material I recycled from it (the ripples from that project are still resonating, finding their way into publications like Effective Assessment in a Digital Age and workshop/design tools like the JISC funded Viewpoints project). Note to self: must write about Viewpoints once the online tool is available.

Recently a member of staff from one of our supported institutions, interested in the use of this form of feedback, contacted me with concerns over students reposting personal feedback in the public domain: just as a tutor respects a student’s privacy by not publishing a student’s work without permission, shouldn’t students do the same? In particular they were wondering if any student declaration was needed to prevent this from happening.

My initial response was that any feedback produced by the tutor would remain the intellectual property of the institution, so any public reposting would automatically need the consent of the institution; therefore all the tutor needs to do is highlight the existing legal position rather than having students make any extra declarations. But as I wasn’t completely sure of my interpretation of IPR I put a query to JISC Legal, and here is the response I got (Disclaimer: The following text is provided as information only and does not constitute formal legal advice):

The recording of the feedback given by the lecturer will either belong to the lecturer or the institution. S.11(2) of the Copyright, Designs and Patents Act 1988 provides that the employer will be the first owner of copyright, unless there has been an agreement otherwise. It could be that there is sufficient ‘dramatic’ content in giving the feedback that there is a performer’s right in the recording too, which would stay with the academic, unless there is agreement to transfer those rights to the institution.

In any case, the student would need to get permission before doing any of the copyright-restricted acts, which would include copying the work, adapting it, and communicating it to the public by internet dissemination in this particular case. It may be worth reminding the students of this, and I’d suggest including an explanation that the feedback is personal and given within the teaching relationship, and so dissemination of the work would be disrespectful as well as copyright infringement. Beyond the legal issue, it might also be worthwhile addressing the underlying reasons why the student or students might want to share the feedback – is there a need for more generic feedback that can be shared more widely?

So generally speaking my guidance was along the right lines, but the information from JISC Legal not only identifies particular nuances of the legal implications but also highlights how the risk of running into problems can be mitigated, addressing some of the underlying pedagogy. It’s hard to see how advice like this could get any better.

This isn’t the first time JISC Legal have provided some first-rate guidance, and if you haven’t checked out their service it’s well worth an explore. Before you think this level of support is only available to JISC Advance and other JISC-related staff, it’s not. JISC Legal endeavour to support anyone in the UK tertiary education sector “to ensure that legal issues do not become a barrier to the adoption and use of new information and communications technologies”.

As well as individual guidance JISC Legal have a wealth of support material. Recent goodies include:

On the 28th May 2009 I wrote a post on Generating Student Video Feedback using ScreenToaster. As ScreenToaster is now ‘toast’ I thought I’d repost highlighting screenr instead. As the process for using ScreenToaster/screenr is so similar I haven’t re-recorded the demo video, but hopefully you get the idea (I’m glad I downloaded the original and put it on vimeo ;)

In my original post I highlighted Using Tokbox for Live and Recorded Video Feedback as a possible solution for distributing video feedback. At the time I felt there were two niggling issues with using Tokbox. First there was the requirement to install the ManyCams software to allow you to display your desktop, and secondly Tokbox was very slow in uploading video you had recorded. For live video feedback Tokbox might still be worth considering, but shortly after publishing the post I discovered ScreenToaster; for recorded feedback you might now do better with screenr.

Screenr allows you to record your desktop without installing any software. It’s very easy to set up and the videos you create can be immediately uploaded, allowing you to decide how you want to distribute and share them (you can also publish them directly to YouTube and/or download the video in MP4 format). The following video shows you how easy it is to set up and highlights some of the useful features. Even if you are not interested in delivering video feedback to students, this is still a great site for recording other material like demonstrations of software.

A long, long, long time ago I wrote a post Using Tokbox for Live and Recorded Video Feedback in which I demonstrated how the free ManyCam software could be used to turn your desktop into a virtual webcam to provide feedback on students’ work in a Russell Stannard styley. Recently my colleague Kenji Lamb was showing me how you can directly record your webcam using YouTube, so I thought I would revisit this idea.

This time, instead of focusing on the use of the visual element as a tool to direct students’ attention to a specific part of an assessment submission (e.g. highlighting and talking about parts of a Word document), I thought it would be interesting to demonstrate it in a more abstract way, using images to reinforce audio comments (e.g. you did good – happy face; you did bad – sad face).

When previously looking at audio feedback I’ve been very aware that reducing the administrative burden as much as possible is important. Online form filling, whether through the VLE, other systems or, as in this example, YouTube, can be a bit of a chore, so in this demonstration I also touch upon using bookmarklets to remove some of the burden. Here is a link to the bookmarklet I created for student feedback on YouTube (YouTube Feedback Template – you should be able to drag and drop this to your bookmark toolbar, but if you are reading this through an RSS reader it might get stripped out).
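For anyone curious how such a bookmarklet hangs together, here is a hypothetical sketch; the selector and template fields are made up for illustration, and the actual YouTube Feedback Template bookmarklet linked above may differ:

```javascript
// A bookmarklet is just a javascript: URI run against the current page.
// The idea: pre-fill the comment box with a feedback template so the
// tutor only has to fill in the blanks. The field names below are
// hypothetical, not taken from the original bookmarklet.
function buildFeedbackTemplate(fields) {
  return fields.map(function (f) { return f + ': '; }).join('\n');
}

var template = buildFeedbackTemplate(['Strengths', 'Areas to improve', 'Grade']);

// Wrapped as a bookmarklet it would look something like this
// (a single line, URL-encoded in practice):
// javascript:(function(){var t=document.querySelector('textarea');
//   if(t){t.value='Strengths: \nAreas to improve: \nGrade: ';t.focus();}})();
```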

For the next post in my ALT-C series I’m going to highlight a session I didn’t actually attend but immediately regretted when comments started filtering in on twitter.

The session was based around the paper by Rodway-Dyer, Dunne and Newcombe from the University of Exeter, which summarises a study of audio and visual feedback used in two first-year undergraduate classes. Click here for the paper and abstract.

Comments I picked up on this paper via twitter appeared to show audio feedback was not well received. Issues highlighted were:

the finding that “76% of students wanted face-to-face from a tutor in addition to other forms of feedback” [@adamread, @JackieCarter]

students found that receiving negative audio comments was harder than when written [@adamread, @ali818, @narcomarco]. Although this is still open to debate as @gillysalmon said that “duckling project at Leicester has found human voice easier to give negative feedback by audio than text”

Obviously there are issues with making assumptions based on a few 140-character tweets, and it should be noted that the authors conclude that overall “there is considerable potential in using audio and screen visual feedback to support learning”, although students did express concerns in a number of areas.

Having had a chance to digest the paper the question I’m left with is how much of the negative experiences were a result of the wider assessment design rather than the use of audio feedback in itself. For example, reading the focus group discussions for audio feedback in geography I noted that:

students were not notified that they would be receiving audio feedback;

that despite the tutor’s best attempts students hadn’t engaged with the assessment criteria; and

that this was the first essay the students had submitted at university level and they were unclear about the expected standards.

Similar issues to these were addressed in the Re-Engineering Assessment Practices (REAP) project, which produced an evolving set of assessment principles. Principles which could be successfully applied to the geography example might be:

Help clarify what good performance is – this could be achieved in a number of ways, including creating an opportunity for the tutor to discuss criteria with students, or perhaps providing exemplars of previous submissions with associated audio feedback.

Providing opportunities to act on feedback – as this was the students’ first submission, providing feedback on a draft version of their essay would have allowed them to act on it (it’s not surprising that students ignore feedback when they have no opportunity to use it).

Facilitates self-assessment and reflection – one of the redesigns piloted during REAP was the Foundation Pharmacy class, in which students submitted a draft using a pro-forma similar to that used by tutors to grade their final submission. Students were required to reflect on distinct sections of their essay, which also allowed them to engage with the assessment criteria.

Encourage positive motivational beliefs – using the staged feedback described above would perhaps also address the issue of students becoming disillusioned.

Talking to a friend during the lunch break, the authors’ research methodology also came up, in particular the use of ‘stimulated recall’. For this the authors played back examples of audio feedback to the tutor, asking him to explain his thought processes and reflect on how his students would have responded to his comments. This methodology seems particularly appropriate for evaluating the use of audio feedback, and is something I want to take a closer look at.

The presenter, Phil Ice, has been working on audio feedback in the US for a number of years and has a number of interesting findings (and research methodologies) I haven’t seen in the UK.

For example, Ice and his team report:

“students used content for which audio feedback was received approximately 3 times more often than content for which text-based feedback [was] received”

and that

“students were 5 to 6 times more likely to apply content for which audio feedback was received at the higher levels of Bloom’s Taxonomy [than] content for which text-based feedback was received”.

These results were from a small-scale study of approximately 30 students so aren’t conclusive. Ice has also conducted a larger study with over 2,000 students which used the Community of Inquiry Framework Survey. Positive differences were found across a number of indicators, although excessive use of audio to address feedback at lower levels was perceived as a barrier by students.

Ice has also conducted studies which break audio feedback into four types: global – overall quality; mid-level – clarity of thought/argument; micro – word choice/grammar/punctuation; and other – scholarly advice. These studies indicate that students prefer a combination of audio and text for global and mid-level comments.

Findings from Ice have been submitted for publication in the Journal of Educational Computing Research (which will soon feature a special issue on ‘Technology-Mediated Feedback for Teaching and Learning’).

Screenshot showing inline audio comments

Finally, I would like to mention the method Ice uses for audio feedback. He uses the audio comment tool within Acrobat Pro 8 to record comments ‘inline’. This appears to be particularly useful in helping students relate comments to particular sections of their submitted work. Click here for a sample PDF document with audio feedback (this isn’t compatible with all PDF readers – I’ve tested it in Acrobat Reader and Foxit Reader).

Hopefully this post has not only stimulated some ideas on the use of audio feedback but also highlighted a range of methodologies for evaluating it effectively.

The National Student Survey results have been published by HEFCE, which has no doubt left school/department managers burning the midnight oil to see how they have fared. Feedback remains a talking point, with only just over half of Scottish students agreeing or strongly agreeing that feedback has been prompt, detailed and helpful.

But what about the students who neither agree nor disagree? If you turn the question around and ask what proportion of students disagree or strongly disagree with the level of feedback they receive, then you are looking at approximately a quarter of students. Obviously this is still a substantial number and still makes feedback the worst-performing area, but if you are drilling down into course-level performance perhaps it is worth bearing in mind.

Table 1 below shows the results for the percentage of Scottish students who responded disagree or strongly disagree to the NSS questions.

Looking at how this analysis affects the overall satisfaction rankings of Scottish HEIs, the most notable changes are the University of Stirling and Robert Gordon University, who (by my calculations*) jump two places. Below, (#) denotes rank.

Institution | % agree or strongly agree (#) | % disagree or strongly disagree (#)
University of St Andrews | 91% (1) | 3% (1)
University of Glasgow | 91% (1) | 5% (2)
University of Aberdeen | 89% (3) | 6% (4)
University of Stirling | 88% (4) | 5% (2)
University of Dundee | 88% (4) | 7% (5)
University of Strathclyde | 87% (6) | 7% (5)
Robert Gordon University | 84% (7) | 7% (5)
Glasgow Caledonian University | 84% (7) | 8% (8)
University of Edinburgh | 82% (9) | 9% (9)
Napier University | 81% (10) | 9% (9)
Heriot-Watt University | 81% (10) | 10% (11)
Glasgow School of Art | 69% (12) | 22% (12)

*Data provided by the NSS is susceptible to rounding errors. For example, the University of St Andrews has an overall percentage agree for Q22 of 92%, yet the percentage breakdown is 35% agree and 56% strongly agree, which sums to 91%. To allow comparison with the percentage of disagreement, the sum of the percentages for agree and strongly agree has been used.
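The re-ranking described above can be reproduced with a few lines of JavaScript using the figures from the table (standard competition ranking, so tied scores share a rank, just as in the table):

```javascript
// Rank a list of scores: each score's rank is 1 plus the number of
// strictly better scores, so ties share a rank (competition ranking).
function rank(values, better) {
  return values.map(function (v) {
    return 1 + values.filter(function (w) { return better(w, v); }).length;
  });
}

// Figures from the table, in the same institution order;
// index 3 is the University of Stirling.
var agree = [91, 91, 89, 88, 88, 87, 84, 84, 82, 81, 81, 69];
var disagree = [3, 5, 6, 5, 7, 7, 7, 8, 9, 9, 10, 22];

// Rank by % agree descending, and by % disagree ascending.
var agreeRank = rank(agree, function (a, b) { return a > b; });
var disagreeRank = rank(disagree, function (a, b) { return a < b; });
// Stirling: agreeRank[3] is 4, disagreeRank[3] is 2 – a jump of two places.
```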

Disclaimer

The views I express here are mine alone and do not necessarily reflect the views of my employer or any other party.

All code, applications and templates on this site are distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.