Over the summer, managed teaching room computers were upgraded to Windows 10 and Office 2016. To ensure compatibility, the TurningPoint software on these teaching machines has also been upgraded to a new version (v.8). While the new version of TurningPoint is very similar to previous versions, there are some changes, and a new licensed receiver will need to be used.

What does this mean for staff?

Presentations created in the old version (TPv5) will run in the new version (TPv8); however, there is no backwards compatibility, so once a presentation has been run or used in TPv8 you will not be able to edit or run it in the old version.

The old receivers/dongles will NOT work with the new version of TurningPoint. The Digital Education team have issued central services with a set of new licensed receivers. These will be issued when a loan is taken out for the handsets. Note that the new receivers will accept a maximum of 500 responses.

Teaching rooms with built-in handsets (Harrie Massey LT, Christopher Ingold Auditorium, Cruciform LT1) have the new receivers installed on the managed PC. If you choose to use your own laptop and have updated the version of TP on it, make sure you borrow the appropriate receiver from the Central loans desk. If using Cruciform LT1, you can plug the labelled cable emerging from the teaching podium into your laptop.

If you are using equipment loaned by departmental teams and intend to use the managed PC in the teaching space, check that you have been provided with a new receiver. (Note: old receivers will continue to work with the old version of the software, but you may experience some issues if moving between different campuses and teaching setups.)

We recommend that you test your presentation in the latest version of TP before running a live session, and contact Digital Education Services (digi-ed@ucl.ac.uk) if you experience any problems.

For more information regarding TurningPoint and to access training guides, click here

Questions or experiencing issues with the new software? Please email digi-ed@ucl.ac.uk.

Learning Technologies in Science, Technology, Engineering and Public Policy (STEaPP)

In Science, Technology, Engineering and Public Policy (STEaPP) learning technology support is provided by the department’s Learning Technologist (Alan Seatwo). Part of this work involves assisting colleagues to explore emerging teaching themes prior to the start of the MPA Programme. These sessions focus on implementing the UCL E-learning Baseline, exploring classroom learning technology and using video both for students’ presentation assessment and as a self-reflection tool. Prevalent learning technologies in the department include the UCL Moodle Virtual Learning Environment (VLE), rapid eLearning development tools, video editing and production, cloud storage, webinars, screencasts, online surveys and classroom learning technologies, such as electronic voting handsets. In this post Alan explains how his department used learning technologies in 2015.

The department was already equipped with a collection of good-quality video recorders, and in August it further invested in new hardware and software for video recording and editing, including a range of camera and mobile phone mounts, a tripod, a wireless microphone and a copy of Adobe Creative Suite. In addition, streaming video and webinar platforms were explored during the organisation and delivery of a seminar by Professor Daniel Kammen, and a written report was presented to the department for possible future use.

Doctoral student virtual presentation

There have been no reports of teaching staff or students experiencing major issues using Moodle. Although there has been some maintenance downtime on UCL networks, overall access to Moodle is excellent. Colleagues are supportive of the idea of using classroom learning technology. Specifically, Word-Cloud was used in How to Change the World 2015; Kahoot! and Socrative were used in Policy Making and Policy Analysis; Communication and Project Management Skills; and the Vodafone – UCL Public Policy Intensive Programme. Feedback from colleagues and students about the use of such software was very positive.

UCLeXtend is a separate Moodle platform for external use. The Vodafone – UCL Public Policy Intensive Programme was granted the use of the platform to deliver its online learning elements. This enabled the department to gain experience in organising and delivering online learning programmes to non-UCL users, which may prove useful in the future.

Vodafone UCL Public Policy Intensive Programme UCL eXtend course

Students’ presentations were recorded, stored and made available for course assessment and self-reflection. The experience of exploring video streaming for Professor Kammen’s event enabled the process of screencasting, video recording and webinars to be refined. The average turnaround time for delivering edited student presentation videos is around 24 hours after recording takes place.

Two Virtual Open Day sessions were conducted in Blackboard Collaborate (webinar software). A series of online interviews using BB Collaborate, Skype and Google Hangouts were also held with potential students.

Other learning technologies being used in the department include:

Opinio to support research activities in STEaPP Grant Research Funding Proposal Form, City Health Diplomacy and Science Diplomacy;

Articulate Storyline 2 to create two online self-assessments in the undergraduate programme ENGS102P: Design and Professional Skills 2015/16.

There have been no major issues reported by staff using Moodle to organise and disseminate learning content and to facilitate discussion via the forums. The level of usage by students is also good. Data from Moodle show that students responded to staff instructions to access learning content and submit their assignments electronically. One area that could be further enhanced is the use of learning analytics, which can help staff identify usage trends for the activities and content they have designed.

When you have a problem or question, E-Learning Environments (ELE) is always more than happy to hear from you, and we will do all we can to help you as quickly as possible. However, this process can be slowed down if we don’t have all the information we need to investigate your problem or answer your question. So here are some top tips for what to include in an email or ticket to ELE, so you can help us to help you.

1. Course name (and link)

UCL is a large university with hundreds of courses, and even more modules, so it is very difficult for us to investigate a problem without knowing the name of the course or module involved. Much problem solving is reverse engineered: we try to replicate the problem for ourselves and then work out what is wrong, using our familiarity with the components of the technology. It is also helpful to include a link to the course or module in question, as these are not always obvious when searching in Moodle or Lecturecast. Asking for the course name is always our first step, so by including it in your original email you will save time and help us resolve the problem faster.

2. Activity/ resource name (and link)

As well as there being a lot of courses at UCL, an individual course may have more than one of a particular activity, such as a Turnitin assignment or forum. It takes ELE extra time to search through all of them to find the problem, and sometimes we cannot be sure we have found the right one. By including the name and location of the activity in your original email, ELE can go straight to it and get to work determining the problem.

3. Screenshots

When we look at a course, it might not always be possible for ELE to replicate a problem. This might be because the issue is related to the particular browser you are using, or to permissions on your account. As these conditions might not apply to ELE, we may not be able to see the problem, which makes it much harder for us to help. If you can take a screenshot (using the PrtScn key), paste it into a document and send it as an attachment, it will help us see the problem and any error messages you are receiving. Sometimes we can answer the question or give a solution straight away upon seeing the screenshot.

4. Error messages

Screenshots of error messages are good, but if you can’t take one then including what an error message says will help ELE to diagnose and resolve the problem. It also helps us if we have to deal with any third party suppliers (such as Turnitin).

5. Specifics

A summary of the problem is best, as ELE might not have time to read a long email, and it may be possible to determine and resolve an issue with only the few key details listed above. However, it also helps to be specific. If you are reporting a problem, list the steps that cause it: which buttons are you clicking, and in what order? Details are also helpful if you are asking about a new activity you would like to start but are not sure which tool to use. If you include specific details about what you want to do, ELE can suggest the tool that best fits your needs.

By following these tips you will have an easier and quicker experience with ELE, and we will be able to resolve more problems and questions in less time.

In 2012 I conducted research, in parallel with my job at UCL, focusing on increasing student interaction with, and staff engagement with, an in-class question and response system colloquially known as ‘clickers’. Evidence suggests clickers provide interaction opportunities that stimulate and engage learners[1] and have a benign or positive effect on student performance[2]. Clickers are popular across many disciplines, in particular the physical sciences, but uptake is particularly low in the medical sciences.

I wanted to address this shortcoming directly, so I enlisted two academics in the UCL Medical School. I familiarised myself with the current method of teaching and the materials used (K1). From there we adapted a learning activity to align with the new tool being applied (A1). I underpinned the use of the technology with existing literature and the evidence for realigning the ‘sage on the stage’ into the ‘guide on the side’[3] (K2), which evidence suggests is an effective method for learning and teaching (K3, V3). I provided pre-lecture technical support to reduce technical barriers and was on hand in the lecture to support as and when needed (A2). Questions were designed into the lectures, and the clickers provided immediate feedback (A3). Staff reacted to clicker data with an approach called ‘contingent teaching’[4], dynamically responding to the answers and feedback provided (A3).

I designed evaluation questions for each lecture based on Bloom’s Taxonomy[5] for a learner-based evaluation of the teaching approach and learning outcomes (A4). Questions were derived by categorising Bloom into three sub-categories: remember or understand, apply or analyse, and evaluate or create new knowledge (K5). When questioned, 74% of students agreed or strongly agreed that the clickers and the related teaching approach encouraged interaction and helped to achieve metacognitive learning (K5). I integrated these data with post-lecture interviews with the lecturers. Using this analysis, we designed next steps for future use and identified gaps and areas for improvement (A5).

I conducted evidence-based research and followed best practice around clickers to ensure their inclusion was academically merited (V3). Measuring (and increasing) engagement within the traditional lecture aimed to promote participation for learners (V2). It was understood that clickers do not directly enhance learning but can lead to higher-order learning. I used my understanding of the wider field of evidence to define their most appropriate use within the lectures (V1, V3).

By implementing a technology which was new to staff and guiding them with appropriate techniques known to increase interaction and engagement, I provided an evidence-informed approach which could be used to transform didactic content delivery into something more engaging. My research adds to a disproportionately small body of knowledge about clickers in medical education, and the study overall was positive. The staff involved still use the clickers, and the impact I measured, together with the evidence collected, can be used to promote clickers within UCL, the Medical School and beyond. The work earned me a Distinction in my MSc Learning Technologies and furthered my ambition to make a lasting, positive difference to higher education.

(493 words)

HEA Professional Standards Framework links referenced in this case study:

Areas of Activity

A1 Design and plan learning activities and/or programmes of study

A2 Teach and/or support learning

A3 Assess and give feedback to learners

A4 Develop effective learning environments and approaches to student support and guidance

A5 Engage in continuing professional development in subjects/disciplines and their pedagogy, incorporating research, scholarship and the evaluation of professional practices

Core Knowledge

K1 The subject material

K2 Appropriate methods for teaching, learning and assessing in the subject area and at the level of the academic programme

K3 How students learn, both generally and within their subject/disciplinary area(s)

K5 Methods for evaluating the effectiveness of teaching

Professional Values

V1 Respect individual learners and diverse learning communities

V2 Promote participation in higher education and equality of opportunity for learners

V3 Use evidence-informed approaches and the outcomes from research, scholarship and continuing professional development

Over the summer a new desktop service, ‘Desktop@UCL’, was rolled out to all Cluster room, Lecture Theatre and Kiosk PCs. As part of this project the version of the software used for electronic voting was upgraded from version 4.3.2 (also known as TurningPoint 2008) to version 5.2.1.

If you have a personal installation of TurningPoint 2008, we recommend that you upgrade it to version 5.2.1. The download for TurningPoint 5.2.1 can be found on the Software Database.

Unfortunately, presentations created in one version cannot be run in the other. If you attempt to open a presentation created in TurningPoint 2008 in TurningPoint 5.2.1, it will prompt you to convert the file, which is a one-way process; there is no backwards conversion from 5.2.1 to version 4.3.2 (TurningPoint 2008). If you have presentations created in TurningPoint 2008 that you want to use in either version, the best advice is to make two copies of the file: label one ‘2008’ and use it with TurningPoint 2008, and label the other ‘5.2.1’ and use it with TurningPoint 5.2.1.

There are updated user guides for creating and delivering presentations with the new software here

E-Learning Environments is happy to provide one-to-one and small-group support. In particular, we can usually offer to support staff the first time they use electronic voting in action, which can provide much reassurance and confidence. We are also happy to advise on ways in which EVS can be used within teaching and on the design of effective voting questions.

If you have any questions about the use of Electronic Voting then please contact E-Learning Environments.

Jane Britton and Matt Whyndham recently piloted LectureTools with a small group of 17 students in a short course in project management. LectureTools is a cloud-based electronic voting system which students and their teachers can access via their laptops or mobile devices. The system works by importing an existing PowerPoint presentation and then adding interactivity to it, through varied question formats. LectureTools allows students to take notes on their devices alongside each slide; they can also flag when they are confused about a particular slide, or submit questions, which will be displayed on the tutor ‘dashboard’ (see the screenshots below, click on each one to see a full size image).

LectureTools presenter interface (the ‘dashboard’), showing an activity slide at the top, student responses and comprehension on the right, and a panel displaying student questions on the middle left. A preview of the adjacent slides is shown at the bottom of the screen.

LectureTools student interface, showing the PowerPoint slides on the left with the interactive options above and the note-taking area on the right.

As E-Learning Evaluation Specialist within ELE, I carried out an evaluation, gathering data from a range of sources. These included an observation of one of Jane’s interactive lectures and a student questionnaire followed by a focus group discussion with a sample of participants. Both educators were also interviewed at the end of the course. Students rated the system positively overall for stimulating their engagement in the subject, allowing them to measure their understanding, fostering discussion in the classroom and facilitating easy note-taking. In addition, they perceived that it helped them prepare for their forthcoming examination. Student comments included:

“I liked the LectureTools a lot. I’m really impressed by it. It’s so easy to use and so helpful and most of us nowadays work on computers anyway during the lecture so it just makes it easier not to write everything in Word, copy the slides, we have everything on one screen.”

“We haven’t really asked a question to a lecturer but I think that’s great, that you can write a question and then the lecturer looks there and then they can answer it.”

Both Jane and Matt felt it was helpful to know what students were thinking and to be able to provide timely feedback, although having a class of students all staring at their laptops at various points was initially disconcerting:

“I think I notice that you get a lot of heads down working all of a sudden and it looks very disconcerting at first … you need to just be aware that they are working and they’re thinking about your stuff but they’re not looking at your face.”

One potential issue that came out of the observation and the survey of students was the opportunity for distraction; this generally happened when students had typed in their responses to open questions and were waiting for other students to ‘catch up’:

“I do think that the multiple choice questions, or putting the order questions, those are very good ones because all of us answered relatively quickly … so we had no time for distractions but the written ones … when you don’t have anything to do you start to do other things.”

Learning activities need to be carefully structured in order to give students enough time and opportunities to think about their topic, but not so much that they use the laptop to access resources not related to their studies. For this reason, the students and Jane and Matt considered that closed questions such as multiple choice questions might be better than open questions for large lectures.

A working paper of this study will shortly be uploaded to UCL Discovery.

E-Learning Environments is working with staff in various departments around the university to explore the potential of LectureTools to facilitate interactive lectures. If you would like more information, or would like to pilot LectureTools or a comparable electronic voting system, please contact me or Janina Dewitz, our Innovations Officer.