Abstract

We describe a study on the motivation of trainees in e‑learning‑based professional training and on the effect of their motivation on the perceptions they build about the quality of the courses. We propose the concepts of perceived motivational gap and real motivational gap as indicators of e‑learning quality, reflecting changes in both the perceived and the real motivation of students. These indicators help evaluate changes in trainees' motivation, as well as the bias that occurs in perceptions of initial motivation.
In the sample analyzed, the real motivational gap was more negative when the perceived motivational gap was negative, and less positive when the perceived motivational gap was positive. We found a perceptual bias on initial motivation whenever the perceived motivational gap was non‑zero. This means that, for the sample analyzed, the trainees may have “adjusted” their perception of their initial motivation as a function of their final motivation, bringing it closer to the latter and thus supporting their final status. We also show that these gaps help explain how the trainees' perception of quality is affected: the gaps were minimized at higher levels of perceived quality, and when the gaps were positive, the perception of quality was above average.
The two proposed conceptual gaps are useful for measuring quality in e‑learning and for implementing specific actions to improve it. The results of our study are useful because they create insights into perceptions of quality in an indirect way, i.e., without asking the trainees to reflect on what they believe quality is in order to quantify it. They also enable training companies to create additional, complementary indicators of the quality of e‑learning courses that can help explain changes in perceptions of quality.
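As a minimal illustrative sketch (not taken from the paper: the function name, the numeric motivation scale, and the simple difference formulas are assumptions based on one plausible reading of the abstract), the two gaps and the perceptual bias on initial motivation might be computed per trainee as follows:

```python
def motivational_gaps(initial_real, initial_perceived, final):
    """Hypothetical computation of the two motivational gaps for one trainee.

    Assumes motivation is measured on a numeric scale (e.g. 1-5) at the
    start of the course (initial_real), recalled at the end of the course
    (initial_perceived), and measured again at the end (final).
    """
    real_gap = final - initial_real            # change vs. the pre-course measurement
    perceived_gap = final - initial_perceived  # change vs. the recalled starting point
    # The abstract's "perceptual bias": the recalled initial motivation
    # drifting away from the measured one, toward the final motivation.
    initial_bias = initial_perceived - initial_real
    return real_gap, perceived_gap, initial_bias

# Example: a trainee recalls having started more motivated than was measured,
# so the perceived gap understates the real gap.
print(motivational_gaps(initial_real=3.0, initial_perceived=4.0, final=4.5))
# -> (1.5, 0.5, 1.0)
```

Under this reading, a non‑zero `initial_bias` is exactly the perceptual adjustment the study reports when the perceived motivational gap is non‑zero.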

Abstract

With the uptake of distance learning (DL), which has in fact been marginal for most academics, teaching contexts, traditional power structures and relationships have been transformed, leaving lecturers potentially disenfranchised. Institutional and cultural change is vital, particularly change concerning academic roles. The advent of DL has caused role ambiguity; however, the published literature on academic roles is confusing and lacks clear guidance. For academics involved in postgraduate clinical education, the information is even more incomplete. Using a framework of communities, this study is a direct response to these concerns. The aim was to systematically and critically evaluate the implementation of clinical DL in an effort to improve practice.
Following a practitioner inquiry methodology, this study investigated the development and delivery of a new DL module. Data collection consisted of documentary analysis of meetings, interviews with staff and students, student evaluations, and analytics. Data analysis incorporated both quantitative and qualitative methods to triangulate the research findings.
New competencies for academics emerged, including leadership and management. Barriers to staff progress included: ambiguity in roles, lack of leadership and unpreparedness for responsibilities, time, and workload. Student barriers included: time, fear, relevance of learning, isolation and increased autonomy. Explicit planning, organisational support and working within communities were requisite to create a ‘sustaining’ technology.
This study contributes to educational practice on two levels. Firstly, by striving for rigour, it demonstrates that practitioner inquiry is a legitimate research approach that is accessible and valuable to teachers. Secondly, it adds to the useful and applied knowledge concerning DL practice. Avoiding erroneous traditional assumptions about workload, this study provides new models of organisational roles and responsibilities. The results challenge the evolutionary nature of academia, suggesting that working in communities and new competencies are required, whilst traditional roles and culture must be redefined.

Abstract

This paper primarily discusses the methodology of a case study into the interactions and working practices of an e‑learning team, both online and offline. Although several ethnographies of online learning have been published, there are apparently none involving the communities that develop courses. This study therefore offers a unique insight, bringing a new view of course and staff development. The e‑learning team develops courses in the Faculty of Medical Sciences Graduate School of a UK higher education institution. Interactions occur both online and offline, the team’s workplace ‘setting’. The ethnography is intended to inform future staff development by analysing interaction beyond the team with the subject specialists: generally time‑poor clinicians and research scientists who have varied experience of e‑learning but are required to provide course content and to teach their subjects in online distance learning courses. Records kept by team members were enlarged upon in weekly interviews and collated by a team member, who developed a narrative that was subsequently coded into content themes. The main themes were technology, pedagogy and communication. Conversation analysis provided theories on methods useful in staff development for later action research. Consideration was also given to issues of power within the interactional relationships. The paper discusses the challenges and strengths of this collaborative self‑ethnography as a research methodology in this e‑learning setting. It was concluded that collaborative self‑ethnography is a highly suitable methodology for this type of study.

Abstract

This paper presents a methodological discussion of the potential and challenges of involving mobile eye tracking technology in studies of knowledge generation and learning in a science centre context. The methodological exploration is based on eye‑tracking studies of audience interaction and knowledge generation in the technology‑enhanced health promotion exhibition PULSE at a science centre in Copenhagen, Denmark. The current study is part of the larger PULSE project, which aims to develop innovative health promotion activities in which a science centre exhibition is a key setting. The primary target groups were families with children aged 6–12 years and school classes with students from 4th to 6th grade. The main purpose of the study was to understand the methodological potential and challenges that mobile eye tracking presents during the different stages of research on informal e‑learning in a science centre context utilising digital platforms to enhance informal learning and interaction. The paper presents how eye‑tracking methods influence research on three levels: 1) the interventional level: the role that eye tracking and eye‑tracking equipment play in interventions; 2) the data level: the new types of data that eye‑tracking methods specifically contribute; and 3) the analytical level: how analysis of eye‑tracking data can supplement and contribute to other analytical approaches. Finally, the article discusses how the methodological approach presented invites consideration of other ways of understanding how users experience technology‑enhanced exhibitions.

Abstract

The growing influence of technology in a cross‑section of fields within formal education, not to mention in the broader social world, has given rise to new ways of viewing learning, i.e. what constitutes valid knowledge and how we arrive at that knowledge. Some scholars have claimed that technology is but a tool to support the meaning‑making that lies at the root of knowledge production, while others argue that technology is increasingly and inextricably intertwined not just with knowledge construction but with changes to knowledge makers themselves. Regardless of which side one takes in this growing debate, it is difficult to deny that the processes we use to research technology‑supported learning, in order to understand these growing intricacies, have profound implications.
In this paper, my aim is to argue for and defend a call, within ICT research, for a critically reflective approach to researching technology use. Using examples from qualitative research in e‑learning that I have conducted on three continents over 15 years, in diverse educational contexts, I seek to unravel the means of, and justification for, research approaches that can help close the gap between research and practice. These studies, combined with those from a cross‑disciplinary array of fields, support the promotion of a research paradigm that examines the socio‑cultural contexts of learning with ICT, at a time when technology is becoming a social networking facilitator. Beyond the examples and the justification of the merits and power of qualitative research to uncover the stories that matter in these socially embodied e‑learning contexts, I discuss the methodologically and ethically charged decisions involved in using emerging affordances of technology for analyzing and representing results, including visual ethnography. The implications, for both the consumers and the producers of research, of moving outside the box of established research practices are as yet unfathomable but exciting.

Abstract

This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e‑learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on the latter, this paper presents five studies on upper secondary and higher education teachers’ professional development and on teaching and learning through video conferencing. Through analysis and discussion of these studies’ findings, we argue that workshops provide a platform that can aid researchers in identifying and exploring relevant factors in a given domain by providing means for understanding complex work and knowledge processes that are supported by technology (for example, e‑learning). The approach supports identifying factors that are not obvious to either the participants or the researchers prior to commencing the workshop process. This paper also discusses the facilitator’s different clinical and ethnographic roles and highlights the risks and ethical issues involved during both the workshop process and the workshop data analysis. As such, these collaborative and immersive aspects frame workshops as a research approach that has the potential to advance meaning negotiation between researchers and participants.

Abstract

E‑learning has made course evaluation easier in many ways, as a multitude of learner data can be collected and related to student performance. At the same time, open learning environments can be a difficult field for evaluation, with a large variance in participants’ knowledge level, learner behaviour, and commitment. In this study, the effectiveness of a mathematics pre‑course administered to four cohorts of prospective students at a technical faculty in Germany was evaluated. Deficits in basic mathematics knowledge are considered a risk factor for graduation in STEM‑related subjects, so the overall goal was to investigate whether the pre‑course enabled “at risk” students to improve their starting position. A data analysis was performed, relating students’ preconditions when entering university, their attitude towards mathematics, and their use of learning strategies to subsequent study success. The strongest determinant of first‑year performance was the result of a diagnostic pretest, confirming both the importance of basic mathematics knowledge for academic achievement in engineering and the reliability of the chosen pre‑/post‑test design. Other outcomes were quite unexpected and demanded deeper analysis. Students who had participated in additional face‑to‑face courses, for example, showed smaller learning gains than students who had participated in an e‑tutoring version. It could also be observed that meta‑cognitive variables failed to explain successful course participation. Reasons for these outcomes are discussed, suggesting reliability threats and interactions between students’ preconditions and their learner behaviour. A significant and unmoderated impact on students’ learning gains in the pre‑course was found for the number of online test attempts, making this variable a reliable indicator of student engagement.
The evaluations show that open learning designs with heterogeneous learner groups can deliver meaningful information, provided that limitations are considered and that external references, like academic grades, are available in order to establish consistency.

Abstract

E‑learning projects and related research generate an increasing amount of evidence within and across various disciplines and contexts. The field is very heterogeneous, as e‑learning approaches are often characterized by rather unique combinations of situational factors that guide the design and realization of e‑learning in a bottom‑up fashion. Comprehensive theories of e‑learning that would allow deductive reasoning, and hence a more top‑down strategy, are so far missing but highly desirable. Given this situation, inductive reasoning is the prevalent mode of scientific progress in e‑learning research and the first step toward theory development: individual projects provide the insights needed to gradually build comprehensive theories and models. In this context, comparability and generalizability of project results are the keys to success. Here we propose a new model – the E‑Learning Setting Circle – that promotes comparability and generalizability of project results by structuring, standardizing, and guiding e‑learning approaches at the level of a general research methodology. The model comprises three clusters – context setting, structure setting, and content setting – each of which contains three individual issues that are not necessarily sequential but are frequently encountered in e‑learning projects. Two further elements are incorporated: on the one hand, we delineate the central role of objective setting and of assessing the level of goal attainment (guiding element); on the other hand, we highlight the importance of multi‑criteria decision‑making (universal element). Overall, the proposed circular model is a strategic framework intended to foster theory development in the area of e‑learning projects and research.