Teaching smarter: How mining ICT data can inform and improve learning and teaching practice

Shane Dawson
Graduate School of Medicine, University of Wollongong

Erica McWilliam and Jen Pei-Ling Tan
ARC Centre of Excellence for Creative Industries and Innovation, Queensland University of Technology

The trend to greater adoption of online learning in higher education institutions means an increased opportunity for instructors and administrators to monitor student activity and interaction with course content and peers. This paper demonstrates how the analysis of data captured from various IT systems can be used to inform decision-making processes for university management and administration. It does so by providing details of a large research project designed to identify the range of applications of LMS-derived data for informing strategic decision makers and teaching staff. The visualisation of online student engagement/effort is shown to afford instructors early opportunities for providing additional student learning assistance and intervention when and where it is required. The capacity to establish early indicators of at-risk students provides timely opportunities for instructors to re-direct or add resources to facilitate progression towards optimal patterns of learning behaviour. The project findings provide new insights into student learning that complement the existing array of evaluative methodologies, including formal evaluations of teaching. Thus the project provides a platform for further investigation into new suites of diagnostic tools that can, in turn, provide new opportunities to inform continuous, sustained improvement of pedagogical practice.

Keywords: academic analytics, data mining, evaluation, ICT, social networking

Introduction

Like higher education institutions (HEIs) around the world, Australian universities are undergoing rapid organisational and operational change.
This is due to ongoing government reforms to higher education, increased accountability and competition, as well as changes in the composition and expectations of the student body. Alongside the many institutional transformations directed towards improving the student learning experience (Ryan, 2005), we have seen an associated re-examination of the criteria and methods for measuring what represents quality education (Coates, 2005). This in turn has focused attention on finding new value-add methodologies for monitoring, demonstrating and reporting quality education outcomes. The concept of academic analytics is now gaining momentum as a process for providing HEIs with the data necessary to respond to the reporting and decision-making challenges facing contemporary universities. For example, Campbell, DeBlois and Oblinger (2007) have proposed that the analysis of data captured from various IT systems could be used to inform decision-making processes for university management and administration: In responding to internal and external pressures for accountability in higher education, especially in the areas of improved learning outcomes and student success, IT leaders may soon become critical partners with academic and student affairs. IT can help answer this call for accountability through academic analytics, which is emerging as a new tool for a new era. (Campbell et al., 2007, p.40)

Academic analytics describes a method for harvesting and analysing institutional data to inform decision making and reporting processes. In essence, academic analytics involves the extraction of large volumes of data from institutional databases and the application of various statistical techniques in order to identify patterns and correlations (Campbell et al., 2007). While the practice of data mining has long been utilised in the commercial sector, to date there has been limited interest within the academy (Goldstein & Katz, 2005). However, the quantity and diversity of data currently accessible to HEIs are now making it possible to exploit more fully the potential of academic analytics in order to inform a range of key activities within the academy, from strategic decision making to instructor practice. The challenge for HEIs is no longer simply to generate data and make it available, but rather to readily and accurately interpret data and translate such findings to practice.

The ubiquitous adoption of information and communication technologies (ICT) across the HE sector in recent times has provided institutions with additional expansive data sets that capture student learning behaviours through user online interactions. For example, the vast majority of Australian universities have adopted learning management systems (LMS) to support student learning. At Queensland University of Technology in Australia, for instance, approximately 40,000 students and 5,000 staff use the commercial LMS BlackBoard (BB) as part of their daily course experience. Increasingly these systems provide the essential infrastructure which mediates student access to learning resources, and facilitates student-student and student-lecturer interaction. Additionally, these systems can provide sophisticated levels of institution-wide data on student demographics, academic performance, learning pathways, user engagement, online behaviour, and development and participation within social networks. These data can also be used to promote practitioner reflection for professional development, as well as to identify students who may require additional scaffolding and/or early learning support. The importance of early intervention has been highlighted in recent research undertaken by John Campbell (2007) at Purdue University.

Proceedings ascilite Melbourne 2008: Full paper: Dawson, McWilliam & Tan 221
Campbell stresses the importance of further developing the field of academic analytics in order to assist instructors in making informed decisions regarding their pedagogical practice. Deriving from his work in academic analytics, Campbell developed a predictive model to identify potential at-risk students. This model is based on prior academic results (aptitude) and the analysis of student behaviour within the LMS (effort). The identification of online student engagement/effort affords teachers timely opportunities for implementing learning assistance and strategic interventions. The capacity to establish early indicators of at-risk students allows instructors to re-direct or add resources to facilitate progression towards patterns of behaviour more characteristic of a low-risk category (e.g. participation in group discussions, regular access of discipline content and review of assessment criteria). In the Australian context, Dawson (2006a; 2006b; 2008) has also applied academic analytics to identify relationships between student online user behaviour and perceived sense of community. Essentially, these findings demonstrate the potential for ICT user behaviours to inform and predict factors influencing student learning outcomes and overall satisfaction (e.g. cognitive engagement, participation in social networks and sense of community). Despite the vast volumes of data captured and recorded within these systems, there has been minimal research that investigates how the rubber of such analytics can meet the road of design, delivery and evaluation of teaching and learning practices (Dawson, 2008; Goldstein & Katz, 2005). This paper addresses this research deficit by presenting examples of data derived from an institution-wide LMS in order to demonstrate the usefulness of applications of ICT-driven academic analytics.
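Campbell's aptitude/effort model is described here only at a high level. Purely as an illustration of how prior results and LMS activity might be combined into a risk flag, a rule-based sketch follows; all thresholds, field names and the three-way classification are assumptions for this sketch, not the published model.

```python
def risk_category(prior_gpa, logins_per_week, class_median_logins):
    """Classify a student as low/medium/high risk from aptitude
    (prior academic results) and effort (LMS activity).

    Thresholds are illustrative only, not Campbell's published model.
    """
    effort_ok = logins_per_week >= 0.75 * class_median_logins
    aptitude_ok = prior_gpa >= 2.5  # e.g. on a 4-point scale; assumed cut-off
    if aptitude_ok and effort_ok:
        return "low"
    if aptitude_ok or effort_ok:
        return "medium"
    return "high"

# A student with strong prior results but little LMS activity is
# flagged for a closer look:
print(risk_category(prior_gpa=3.4, logins_per_week=1,
                    class_median_logins=10))  # → medium
```

In practice such a rule would be replaced by a model fitted to historical outcome data; the point of the sketch is only that both data sources contribute to the flag.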
Although HEIs have widely adopted LMS, the extraction and reporting of captured data has often been fragmented, with poor visualisation tools militating against effective interpretation and subsequent user action (Mazza & Dimitrova, 2007). Put simply, the reporting of data derived from these LMS can and should be accessed and implemented into the current suite of tools available for teaching staff in more effective ways. This will allow a better understanding of the student cohort's learning behaviours, and thus an improved capacity to respond to their immediate learning needs and to assess the impact of enacted learning and teaching activities. What follows is the outline of a current international study investigating the application of academic analytics not only to inform instructor practice, but also to identify key tools for benchmarking an institution's level of integration of ICT into mainstream educational practice. The discussion presents examples of the application of ICT data derived from an institutional LMS to: (a) inform the decision-making process for senior management with regard to future ICT-related initiatives; and (b) enable individual instructors to evaluate student engagement and the impact of implemented learning and teaching practices.

Project and findings

The project outlined here was an initial exploratory study designed to identify the range of applications of LMS-derived data for informing strategic decision makers and teaching staff. Specifically, the study extracted data from an institutional LMS (BlackBoard Vista, formerly known as WebCT Vista) and analysed this data in the context of the various levels of institutional operations, namely, enterprise (whole of institution), faculty management, and instructor. All data pertinent to the study were drawn from a large Canadian university with approximately 45,000 full-time equivalent students and 5,000 teaching and sessional staff. The ICT data was extracted from BlackBoard Vista using the WebCT PowerSight application over a period of 19 weeks (second semester) from August to December. Approximately 800 teaching units were included in the enterprise-level analysis.

Enterprise

Faculty exemplars were identified from the initial enterprise-level assessment. Exemplars consisted of faculties that were most active, as determined by the overall number of interactions recorded by the LMS per full-time equivalent student (FTE) enrolled for the teaching period investigated (Table 1). Interactions were recorded by the LMS within each distinct tool area, such as the discussion forum, content page and assessment page. Interactions were then identified as either non-student (teaching and support staff) or student (enrolled students). Due to the diversity of interaction types recorded and the differences in their associated teaching intent, the aggregation of interaction data at this level has some inherent complications. For instance, discussion forum interactions are dissimilar to content interactions and, as such, foster alternate student learning outcomes. Thus, the enterprise analysis merely provides the capacity to identify trends of behaviour and LMS adoption.
The faculty-level data was further refined by identifying the most active teaching units in terms of online interactions. These were extracted and analysed for ICT tool, instructor and student usage. Analysis of this data allowed specific teaching units to be examined for learning trends and potential indicators of student engagement and achievement. One large teaching unit (n = 1026) is presented and discussed as an exemplar of this process.

Table 1: Most active faculties identified through LMS enterprise data

Faculty            % of total interactions / FTE*
Applied Science    11.45%
Arts                2.44%
Dentistry           0.12%
Education          16.88%
Land & Food Sys    46.99%
Medicine            1.62%
Science            20.5%
Total             100%

* Interactions refer to all online access by students and faculty, e.g. posting to a forum, accessing content or participating in chat sessions. Analysis per full-time equivalent (FTE) student standardises against interaction increases due to faculty size.

The LMS data were analysed using the software package SPSS for Windows (version 15.0). Statistical analyses incorporated descriptive statistics, ordinary least squares regression analyses, as well as parametric and non-parametric tests of significant differences between and among groups. Examination of the enterprise-level data indicated marked differences in faculty usage trends. For example, the highest users of the LMS were the Land and Food Systems Faculty and the Science Faculty, comprising 47% and 20% of the total recorded online interactions respectively (excluding distance education offerings). In contrast, the faculty of Dentistry contributed less than 1% of the total online interactions per full-time equivalent student. The range of data extracted from the LMS also provided an indication of the types of ICT tools utilised across the university. In this study, the dominant tool was the discussion forum, representing over 80% of all interactions in the online environment.
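The per-FTE standardisation behind Table 1 is a simple normalisation step; it can be sketched as follows. The faculty names and counts below are invented for illustration, not the study's raw data.

```python
# Normalise raw LMS interaction counts by full-time-equivalent (FTE)
# enrolment, then express each faculty as a share of the normalised total.
# All figures below are made up for illustration.
raw = {
    "Science":   {"interactions": 410_000, "fte": 9_000},
    "Arts":      {"interactions": 60_000,  "fte": 11_000},
    "Dentistry": {"interactions": 1_500,   "fte": 600},
}

per_fte = {f: d["interactions"] / d["fte"] for f, d in raw.items()}
total = sum(per_fte.values())
share = {f: 100 * v / total for f, v in per_fte.items()}

for faculty, pct in sorted(share.items(), key=lambda kv: -kv[1]):
    print(f"{faculty:10s} {per_fte[faculty]:8.1f} interactions/FTE  ({pct:.1f}%)")
```

Dividing by FTE first, as here, prevents large faculties from dominating the comparison purely by enrolment size.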
The second most commonly utilised tool among staff and students was the content page (Figure 1). During the initial weeks of teaching, other LMS tools, such as the announcement and assignment tools, were incorporated into the online teaching units. However, the degree of usage of these tools was minimal in comparison to the discussion forum and content page (Figure 1).

Figure 1: Identified dominant tools (discussion forum and content page) as a percentage of the total interactions recorded by the LMS

Examination of the LMS data aggregated to an enterprise level can provide a strong indicator of the stage of e-learning adoption across the university. For example, the discussion forum in this instance was identified as the dominant tool in terms of integration and quantity of recorded interactions, while other tools such as online assessment quizzes, wikis and blogs were infrequently utilised or adopted by the end-users. This is not to say that these tools have not been incorporated into individual learning and teaching practices via alternate external platforms such as local versions of Moodle or Elgg. However, given the high level of technical sophistication often required to house or access such external platforms, it is unlikely that these technologies were or are pervasive.

Rogers' (2003) oft-cited diffusion of innovations theory suggests that an innovation moving through a social system is commonly integrated first by early innovators, then the early majority and finally the late majority. The widespread adoption of the discussion forum observed in the present study suggests that this resource is no longer considered a new innovation. The discussion forum can be seen to be a mainstream tool, commonly integrated into educational practice and as frequently utilised as the PowerPoint presentation. However, technology, and more importantly the student experience and familiarisation with ICTs, are now no longer limited to discussion forums and static content pages. Yet while the more technically ambitious educators might be embedding web 2.0 technologies (e.g. social networking sites, collaborative editing, social bookmarks, blogs, wikis) into their teaching and learning activities, such resources remain on the periphery of mainstream education. This can be remediated in part by using the data extracted from the LMS to monitor the adoption of current and future web 2.0 related technologies and to identify instances of how these tools are integrated into educational practice, for the purposes of research and the provision of teaching support.
In this way, the data derived from BB and other associated ICT systems can be used to guide and inform the diffusion of technology and its integration into learning and teaching activities.

Visualisation of data

Although not available in current versions of BB, the aggregation of data around tool design provides instructors and managers with an effective method for assisting in the interpretation of the derived data. For example, discussion forum, chat, and who's online are three LMS tools that focus on student engagement. Table 2 outlines the LMS tools available in BB and their associated thematic category. The analysis and presentation of data in alignment with tool design categories illustrates that content and engagement are the primary modes of LMS tool use (Figure 2). Although these categories are not exclusive, because each tool can be adopted for multiple purposes, the categorisations do serve as a starting point to assist staff in the alignment of observed student online behaviour. Further ongoing analysis of this data at the faculty and individual teaching unit level provides a more refined and readily interpretable framework for informing teaching practice.

Table 2: LMS tool design framework

Administration    Assessment    Content                        Engagement
Announcements     My grades     Student bookmarks, Notes       Discussion forum
File manager      Assessments   Content page, Printable view   Chat
Tracking          Assignments   File, Search, Weblinks         Who's online
Calendar          Syllabus                                     Mail
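Rolling raw tool-level counts up into the Table 2 categories is a simple mapping-and-aggregation step. A minimal sketch follows; the event format (tool name plus hit count) and the exact tool-name strings are assumptions about what the tracking logs might contain.

```python
from collections import Counter

# Map each LMS tool to its design category, following Table 2.
# Only a subset of tools is shown; names are assumed log identifiers.
CATEGORY = {
    "announcements": "administration", "calendar": "administration",
    "assessments": "assessment", "assignments": "assessment",
    "content page": "content", "weblinks": "content",
    "discussion forum": "engagement", "chat": "engagement",
}

def categorise(events):
    """events: iterable of (tool_name, hit_count) pairs from tracking logs."""
    totals = Counter()
    for tool, hits in events:
        totals[CATEGORY.get(tool, "other")] += hits
    return totals

# Invented weekly log; engagement dominates, as in the enterprise data.
log = [("discussion forum", 8200), ("content page", 1300),
       ("announcements", 210), ("assessments", 90)]
print(categorise(log))
```

Computing these category totals per week yields exactly the kind of trend series plotted in Figure 2.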

Figure 2: Enterprise usage trends for LMS tool design categories: administration, assessment, content and engagement

Faculty

Analysis of faculty-level data sought to identify trends in the adoption of various LMS tools and in instructor and student use of the resources. Again, the data allow for a more refined benchmark of activity that provides senior management with an indication of which teaching units have high or low adoption and usage trends. For example, within the faculty of Science, data related to the level of tool interaction per member (non-student and student) revealed significant differences between the various Science Schools (Tables 3, 4). School 1 was observed to be the most active in terms of instructor (non-student) usage. Despite this high level of activity, this School was not the highest rated in terms of the level of student interaction with the online teaching units (Table 3). Anderson, Rourke and Garrison (2001) have described the importance of promoting a teacher presence for effective online learning and developing student community. In essence, these authors noted that instructor facilitation of discussion and an ongoing active presence in the discussion forum and other online communication media are essential for maintaining high levels of student engagement. Although teacher presence is crucial for developing student online engagement, the results of this study suggest that the relationship is not uni-dimensional. That is, both the quantity and the quality of teacher presence are influencing factors in developing and maintaining student online interactions. For instance, while School 1 presented the highest levels of instructor interaction, it did not promote the highest levels of student interaction.
Discussions with course coordinators indicated that response time to student discussion queries may be a significant factor in promoting discussion and further student involvement. School 2 (highest level of student interaction) established early guidelines relating to response times via email and the discussion forum. In contrast, School 1 responded rapidly to student queries and discussion points. While the quantity of work undertaken by the instructor to assist students in their learning was admirable, the depth of discussion among the student cohort was minimal. As Salmon (2000) has suggested, the early establishment of goals and guidelines in discussion forum etiquette provides a foundation for higher order usage. In this instance, providing students with reflection time and an expectation of student discussion and peer response may well be crucial to facilitating engagement.

Table 3: Kruskal-Wallis ANOVA test for differences in instructor and student mean levels of interaction between Science schools

Group       Instructor mean     Student mean
School 1
School 2
School 3
School 4
            Chi = ; Df = 3      Chi = ; Df = 3
            Asymp Sig =         Asymp Sig =
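The Kruskal-Wallis comparison reported in Table 3 can be reproduced with standard tools. A minimal sketch using scipy follows; the per-student interaction counts are invented for illustration and are not the study's data.

```python
from scipy.stats import kruskal

# Mean weekly interaction counts per student, grouped by School.
# These values are invented for illustration only.
school_1 = [42, 55, 38, 61, 47]
school_2 = [88, 92, 79, 95, 84]
school_3 = [12, 18, 9, 15, 11]
school_4 = [33, 29, 41, 36, 30]

# Kruskal-Wallis is the non-parametric analogue of one-way ANOVA,
# appropriate because interaction counts are typically skewed.
h_stat, p_value = kruskal(school_1, school_2, school_3, school_4)
print(f"H = {h_stat:.2f}, df = 3, p = {p_value:.4f}")
```

With four groups the test statistic is compared against a chi-square distribution with 3 degrees of freedom, which matches the Df = 3 reported in Table 3.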

Table 4: Mann-Whitney U test for differences in instructor and student mean levels of interaction between Science schools

          School 1                          School 2                          School 3
          Instructor mean  Student mean     Instructor mean  Student mean     Instructor mean  Student mean
School 2  *                **
School 3  **               **               **               **
School 4  **               *                **               **               Not significant  Not significant

** Significant at p < 0.001; * Significant at p < 0.05

The analysis and presentation of data at a faculty level provide stakeholders with a method for distinguishing differences in online behaviours across Schools, and a benchmark of current activity for future comparison. These measures are of particular interest when attempting to implement learning and teaching plans or to identify courses that promote high levels of online engagement. The data derived from the system in its current form cannot be used to assess the quality of these interactions. Correlations of the harvested LMS data with other quality metrics, such as standardised evaluations of teaching, will be needed to assist educators in developing more sensitive and accurate performance frameworks.

Teacher/instructor

The study examined the online behaviour trends of a large first-year pre-requisite Science class (n = 1026). The analysis of data was undertaken with a view to identifying potential differences between high academically performing students and those requiring early learning support interventions. Campbell and Oblinger (2007) discussed the application of ICT data to inform a predictive classification model identifying students as having a high, medium or low risk of attrition or failure. This model, based on the degree of student time online, was developed at Purdue University and serves to provide instructors with lead indicators for the early identification of students requiring additional learning support. However, the model does not differentiate where students spend time online.
For instance, time spent in content areas or within discussion forum activities may well have a very different learning objective and therefore outcome. Numerous authors have espoused the importance of developing social learning opportunities where students actively debate, exchange and clarify ideas with their peers (Brook & Oliver, 2003; Brown & Duguid, 2000; Lave & Wenger, 1991; Shapiro & Levine, 1999; Vygotsky, 1978). In an online context this is commonly mediated through the discussion forum. The present study supports the notion that teaching staff are frequently embedding discussion forum activities. However, further investigation is required to determine if the inclusion of such discussion activities is an enabler of social learning oriented pedagogies in general.

Figure 3 illustrates the peak periods for student online engagement in terms of the number of postings per student in the study. Immediately apparent in the figure are four discernible peaks extending beyond the class average of postings. These peaks directly correspond to the assessment periods: three mid-term assessment items (pre-week 10) and a final exam (post-week 16). While this finding appears to support the notion that students are likely to be most involved at times when assessment performance is looming, the data can also be used to highlight peak periods for staff intervention. Additionally, qualitative analysis of postings can assist in determining the types of questions posted, i.e. concepts, factual recall, or administration relating to assessment processes. The identification of the types and frequency of question categories could assist instructors in the future design and development of the online environment, such as the inclusion of Frequently Asked Questions, and the design and timing of discussion forum activities and triggers.

To post or not to post

Prior research has foregrounded the strong relationship between student time online and academic performance.
Figure 3: Discussion forum postings per student in a large Science class over a 19 week period

Morris, Finnegan and Wu (2005), for example, noted that student time online in class discussion forums was related to their overall academic performance, and that students with higher levels of participation in online discussions have higher grades than their less interactive peers. Similarly, this study observed that students who actively participated in discussion forum activities (i.e. made a posting to the class discussion forum) scored a mean grade 8% higher than non-participating students (66.9% and 58.89% respectively) (t = 6.870, df = 1024, mean difference = 8.02, p < 0.001). Cohen (1988) has noted that for effect size (ES) calculations for t tests of means, 0.2 is considered small, 0.5 medium and 0.8 large. The differences observed between students who actively make postings to discussion forums and those who do not produced an ES of . This statistical result provides a clear message to students: those who achieve better grades also use the discussion forums, and therefore there is a greater likelihood of achieving better grades if one actively and productively engages with forum activities. For teaching staff, the early monitoring of discussion forum activity can assist in the timely identification of students potentially requiring additional assistance.

Session times and academic performance

As previously noted, the project aimed to identify potential differences in online behaviour between high and low performing students. Thus, the student cohort was grouped according to academic performance. The recorded student behaviours for the quartile groupings were then tested for significant differences. The results indicated a significant difference between low and high performing students in terms of the number of online sessions attended during the course (Table 5). Similar findings were observed for the total time online. Interestingly, the observed difference in the total time per session for low and high performing students was not statistically significant. In short, low and high performing students spend similar amounts of time in each session; however, low performing students attend fewer online sessions.
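The posters-versus-non-posters comparison above (an independent-samples t test reported alongside Cohen's d) can be sketched with the standard library alone. The grade lists below are invented for illustration; only the technique, not the data, follows the study.

```python
from math import sqrt
from statistics import mean, stdev

def cohens_d(a, b):
    """Cohen's d for two independent samples, using the pooled SD."""
    na, nb = len(a), len(b)
    pooled = sqrt(((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2)
                  / (na + nb - 2))
    return (mean(a) - mean(b)) / pooled

# Final grades, invented for illustration only.
posters = [72, 68, 75, 61, 70, 66, 74]
non_posters = [58, 63, 55, 60, 52, 57, 61]

d = cohens_d(posters, non_posters)
print(f"mean difference = {mean(posters) - mean(non_posters):.1f}, d = {d:.2f}")
```

By Cohen's benchmarks quoted above, a d beyond 0.8 (as in this invented sample) would be read as a large effect; in practice the t statistic and p value would be computed alongside it (e.g. with scipy's `ttest_ind`).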
Although further research is needed to substantiate this point, it can be argued that low-performing students may not be optimising their learning time online in the same way their higher-performing counterparts are able to. This could be due in part to their struggle with the level of discipline and intrinsic motivation required for this type of learning. Ertmer and Newby (1996) have previously described differences between expert and novice learners. They found that expert performers display a range of learning strategies (reflection, discipline, control, playfulness) to optimise and evaluate their individual progress. As noted by a number of researchers (e.g. Bocchi, Eastman, & Swift, 2004; Doherty, 2006; Kerr, Rynearson, & Kerr, 2006; Ross & Powell, 1990; Rovai, 2002, 2003), online learning demands high levels of motivation, discipline, persistence and academic integration from students in order to perform well. In contrast to their low performing peers, high performers were also observed to spend a greater degree of time online in both discussion forum and content pages (Table 5). Similar findings have been noted by Morris et al. (2005), who identified significant differences in online behaviour between successful (i.e. received a passing grade) and non-successful completers of a course. Successful students were observed to spend a greater degree of time online in both discussion and content areas of the online site. The tracking of student online behaviour provides clear benefits in assessing student progress and the early identification of potential at-risk individuals. The identification of the point of disengagement from learning activities, or the point at which a student drops below the class average in quantity and frequency of online sessions (content and discussion areas), may assist staff in maintaining student motivation and overall persistence.
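The quartile comparisons reported in Table 5 rely on the Mann-Whitney U test. A minimal sketch using scipy follows; the session counts are invented for illustration and are not the study's data.

```python
from scipy.stats import mannwhitneyu

# Number of online sessions per student, invented for illustration:
# bottom (0-25) quartile versus top quartile by academic performance.
bottom_quartile = [5, 8, 3, 7, 6, 4, 9, 5]
top_quartile = [21, 18, 25, 19, 23, 20, 17, 22]

# Mann-Whitney U compares the two groups without assuming normality,
# which suits count data such as session frequencies.
u_stat, p_value = mannwhitneyu(bottom_quartile, top_quartile,
                               alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.4f}")
```

Running one such pairwise test per measure (sessions, postings, content views) and per quartile pair reproduces the grid of significant and non-significant cells shown in Table 5.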

Table 5: Mann-Whitney U test for differences in quantity of recorded sessions for each quartile

Quantity of online sessions
    0-25 quartile    quartile    quartile    X
Number of discussion postings
    0-25 quartile    quartile    quartile    X  X  X
Number of content views
    0-25 quartile    quartile    quartile    X

Statistically significant at p < 0.05; X = not statistically significant

Social network visualisation

Despite the relative ease of extracting tracking data on student online interactions, the aggregation and visualisation of this data by staff should not be presumed to happen automatically. In short, the poor visualisation tools available to teaching staff constrain staff understanding of the linkage between student online interactions and the implemented pedagogical approach. To mitigate this problem, the results informing this project have led to the development of a prototype social network visualisation tool (see Endnote 1) that extracts discussion forum data in order to build a visual representation of the learning network. This network visualisation can be generated at any stage of course progression, thereby providing a timeline of engagement or insights into student online behaviour following designed learning activities. Additionally, the identification of students actively involved in a network also provides instructors with timely information about those students who are disconnected from the learning network. The software provides a small link on the BB discussion forum page for extracting and tabulating the discussion forum data. The extracted forum data is exported into NetDraw (Borgatti, 2002), a social network visualisation tool. Figure 4 illustrates the visualisation results from a large class BB discussion forum.
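The extraction step behind such a visualisation can be sketched as follows. The post/reply log format is an assumption, and plain Python stands in here for the Greasemonkey/NetDraw pipeline described above.

```python
from collections import defaultdict

# Each forum record: (author, replied_to); replied_to is None for a
# thread-starting post. The log below is invented for illustration.
posts = [
    ("amy", None), ("ben", "amy"), ("cho", "amy"),
    ("ben", "cho"), ("dee", None),
]

# Build the reply edge list (the sociogram's edges) and a degree count
# per student. The edge list is what a tool such as NetDraw would plot.
edges = [(author, target) for author, target in posts if target is not None]
degree = defaultdict(int)
for author, target in edges:
    degree[author] += 1
    degree[target] += 1

# Students enrolled but absent from the reply network can be flagged for
# follow-up: "eli" never posted; "dee" posted but drew no replies.
enrolled = {"amy", "ben", "cho", "dee", "eli"}
disconnected = sorted(s for s in enrolled if degree[s] == 0)
print("students outside the discussion network:", disconnected)
```

Comparing the degree counts against the enrolment list is exactly the step that yields the list of non-participating students discussed below.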
Each node represents an individual student, and while student names have been removed for the purposes of this paper, the diagram clearly illustrates, at the time of analysis, the degree of engagement, the central nodes in the discussion, and the students potentially isolated from the discussion network. The identification of students involved in the discussion network allows a list of non-participating students to be developed. It is a small step to automate this process and thus allow staff to intervene at selected and/or critical points in time, either to mobilise a specific student to interact or to enquire about a student's learning issues and status.

Conclusion

This study has provided strong indications that there are discrete online behaviour patterns representative of students' overall engagement and final assessment grade. The visualisation of online student engagement/effort affords instructors early opportunities for providing additional student learning assistance and intervention when and where it is required. The capacity to establish early indicators of at-risk students provides timely opportunities for instructors to re-direct or add resources to facilitate progression towards patterns of behaviour that represent what Campbell and others (2007) have previously termed a low-risk category (e.g. participation in group discussions, regular access of discipline content and review of assessment criteria). It is clear that the future trend will be toward greater adoption of online learning in HEIs. Thus, there will be increased opportunity for instructors and administrators to monitor student activity and interaction with course content and peers. The analyses of these data sets are directly relevant to student engagement and to evaluating implemented learning activities.
Questions related to how and when students are engaged, and what activities promote student engagement, can be answered through the monitoring and analysis of student online behaviour. While the data examined in this paper are indicative at this stage, and the interpretation of the results may be influenced to some degree by a number of exogenous variables, the findings nonetheless provide new insights into student learning that complement the existing array of evaluative methodologies (e.g. evaluations of teaching). In this regard, the study provides a platform for further investigation into new suites of diagnostic tools that can, in turn, provide new opportunities and new data sets to inform instructor reflection for continuous and incremental improvement of pedagogical practice.

Figure 4: Sociogram illustrating density of the social network and students isolated from the network (isolated nodes are labelled 'Disconnected students')

Endnote

1. The social network software was developed by Aneesha Bakharia and Shane Dawson. The script was developed using Greasemonkey, a Mozilla Firefox browser extension. The use of the Greasemonkey script avoided issues surrounding access to a proprietary database. For further details refer to:

Acknowledgements

The authors would like to acknowledge the contributions of Aneesha Bakharia, who provided expert technical advice and developed the social network resource. This research has been supported by the Australian Learning and Teaching Council.

References

Anderson, T., Rourke, L., & Garrison, D. R. (2001). Assessing teaching presence in a computer conferencing context. Journal of Asynchronous Learning Networks, 5(2),
Bocchi, J., Eastman, J. K., & Swift, C. O. (2004). Retaining the online learner: Profile of students in an online MBA program and implications for teaching them. Journal of Education for Business, 79(4),
Borgatti, S. P. (2002). NetDraw: Graph visualization software. Harvard: Analytic Technologies.
Brook, C., & Oliver, R. (2003). Online learning communities: Investigating a design framework. Australian Journal of Educational Technology, 19(2),
Brown, J. S., & Duguid, P. (2000). The social life of information. Cambridge: Harvard Business School.
Campbell, J., DeBlois, P. B., & Oblinger, D. (2007). Academic analytics: A new tool for a new era. EDUCAUSE Review, 42(4),
Campbell, J., & Oblinger, D. (2007). Academic analytics. Retrieved 25th October, 2007, from
Coates, H. (2005). The value of student engagement for higher education quality assurance. Quality in Higher Education, 11(1),
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.
