Transcript of "Learner Analytics: from Buzz to Strategic Role for Academic Technologists"

2.
“But everything we know about cognition suggests that a small group of people, no matter how intelligent, simply will not be smarter than the larger group. ... Centralization is not the answer. But aggregation is.” - J. Surowiecki, The Wisdom of Crowds, 2004

8.
What’s the promise of Analytics for Academic Technologists?
1. Decision-making (and service-evaluating) based on practices (not just perceptions) and performance outcomes
2. If we’re moving into a strategic role re: teaching and learning, analytics can:
– demonstrate the link between technology and learning
– distinguish our role from a technology service provider
(PS - anyone else concerned about the validity of student evaluations and self-reported data? – “Rate your level of technology expertise (novice, intermediate, expert)”)

14.
Learner Analytics: “... measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.” (Siemens, 2011)

24.
Learner Analytics on Chico Vista Usage
1. What is the relationship between LMS usage and student achievement?
2. What is the relationship between the number of LMS tools used (aka ‘breadth of faculty LMS adoption’) and student achievement?
3. Perform analysis within courses
4. Ultimate goal: provide administrators and faculty with what-if modeling tools, building on reports in data warehouse

26.
Call to Action
1. Metrics reporting is the foundation for Analytics
2. You don’t need to wait for student performance data; good metrics can inspire access to performance data
3. You’re *not* behind the curve; this is a rapidly emerging area that we can (and should) lead ...

33.
Research Findings
1. There is no relationship between sophistication of technology and sophistication of application/deployment
– The largest raw number of advanced users had simple transactional reporting tools
2. Factors leading to higher levels of application:
– Leadership commitment to evidence-based decision making
– Staff skills
– Effective end-user training

35.
Data Dashboard Theoretical Framework & Guiding Questions
1. What percentage of students reach each of the leading indicators?
2. What is the impact of reaching each of the leading indicators on success rate?
3. Does meeting any of the indicators reduce or eliminate gaps between student groups?
(Source: Advancing by Degrees: A Framework for Increasing College Completion – Institute for Higher Education Leadership and Policy and The Education Trust)
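The first two guiding questions reduce to simple conditional rates. A minimal sketch, using invented student records (the indicator flags and completion outcomes below are illustrative only):

```python
# Sketch of the dashboard's first two guiding questions: what share of
# students reach a leading indicator, and how does success rate differ
# for those who do versus those who don't. Records are hypothetical.
students = [
    # (reached_indicator, completed_degree)
    (True, True), (True, True), (True, False), (False, False),
    (False, True), (False, False), (True, True), (False, False),
]

reached = [s for s in students if s[0]]
missed = [s for s in students if not s[0]]

pct_reached = len(reached) / len(students)
success_reached = sum(c for _, c in reached) / len(reached)
success_missed = sum(c for _, c in missed) / len(missed)

print(f"Reached indicator: {pct_reached:.0%}")
print(f"Success rate (reached): {success_reached:.0%}")
print(f"Success rate (missed):  {success_missed:.0%}")
```

The third question (gaps between student groups) is the same computation repeated per demographic group and compared.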

42.
Why such a small increase?
1. Variation in usage creates “missing data” for tools not used in other courses
2. Lesson learned: perform analysis relative to students within the same course
3. Next-generation implementation: Purdue Biology course using “Signals” early warning system with students (Arnold, 2010)
– D/F grades reduced 14%
– B/C grades increased 12%
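The "analyze within the same course" lesson can be sketched as within-course standardization: compare each student's activity to classmates rather than to the whole population, so tools unused in a given course never register as missing data. Data and field names below are hypothetical:

```python
# Sketch of within-course relative analysis: z-score each student's LMS
# activity against peers in the same course. Records are invented.
from statistics import mean, stdev
from collections import defaultdict

# (course_id, student_id, total_lms_hits) -- hypothetical records
records = [
    ("BIO101", "s1", 120), ("BIO101", "s2", 80), ("BIO101", "s3", 100),
    ("ENG200", "s4", 30),  ("ENG200", "s5", 50), ("ENG200", "s6", 40),
]

by_course = defaultdict(list)
for course, student, hits in records:
    by_course[course].append((student, hits))

# z-score each student relative to peers in the same course
z_scores = {}
for course, rows in by_course.items():
    values = [h for _, h in rows]
    m, s = mean(values), stdev(values)
    for student, hits in rows:
        z_scores[student] = (hits - m) / s

print(z_scores)
```

Note that s1 (120 hits) and s5 (50 hits) come out equally "high" relative to their own courses, even though their raw counts differ by more than a factor of two.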

43.
Macfadyen and Dawson (2010)
In a fully online biology course at the University of British Columbia (n=118, 5 sections, 3 semesters), they found that:
1. 33% of student grade variability could be explained by 3 variables (discussion messages posted, mail messages sent, and assessments completed)
2. 13 variables (out of 22 studied) had significant correlations with final student grade (R² values from .05 to .27)
– Significant variables included the number of online sessions, total time online, and activities within the content, mail, assessment, and discussion areas
– Variables that were not significant included some predictable items, such as visits to MyGrades, uses of search, ‘who is online’, and the ‘compile’ tool. They also included surprising items, such as the number of assignments read, the time spent on assignments, and announcement views
3. 73.7% of students were correctly classified as at-risk (i.e. final grade of D or F) through predictions based on these three variables
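The at-risk classification idea in item 3 can be illustrated with a toy rule over the same three predictors. Everything below (the data, the cutoffs, and the two-of-three rule) is invented for illustration; the original study fit a regression model, not fixed thresholds:

```python
# Toy sketch of the at-risk prediction idea: flag likely D/F students
# from discussion posts, mail messages sent, and assessments completed.
# Data, cutoffs, and the rule itself are hypothetical.

# (discussion_posts, mail_sent, assessments_done, got_D_or_F)
students = [
    (40, 12, 10, False), (5, 1, 3, True), (25, 8, 9, False),
    (3, 0, 2, True), (30, 10, 8, False), (8, 2, 4, True),
]

def flag_at_risk(posts, mail, assessments):
    # Flag when at least two of the three activity counts fall below
    # arbitrary illustrative cutoffs.
    low = sum([posts < 10, mail < 3, assessments < 5])
    return low >= 2

correct = sum(flag_at_risk(p, m, a) == df for p, m, a, df in students)
print(f"Correctly classified: {correct}/{len(students)}")
```

On real data the classification accuracy (73.7% in the study) would be measured the same way: compare predicted at-risk status against the actual final grade for every student.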