Learning analytics in a time of an insatiable thirst for data and evidence: A provocation

Public lecture, 22 November 2017 - International Seminar "Evidence-based research: methodological approaches and practical outcomes" hosted by the UNESCO Chair in Education and Technology for Social Change at the Open University of Catalonia, Spain


1.
By Paul Prinsloo
(University of South Africa)
@14prinsp
Public lecture, 22 November, 2017 - International Seminar "Evidence-based research:
methodological approaches and practical outcomes" hosted by the UNESCO Chair in
Education and Technology for Social Change at the Open University of Catalonia, Spain
Learning analytics in a time of an insatiable thirst
for data and evidence: A provocation

2.
Acknowledgements
I do not own the copyright of any of the images in this
presentation. I hereby acknowledge the original copyright and
licensing regime of every image and reference used.
This work (excluding the images) is licensed under a Creative
Commons Attribution 4.0 International License.

3.
Image credit: http://www.basicknowledge101.com/subjects/reality.html
Open Public Session Lecture:
Will the future of higher
education be evidence-based?

4.
Some (more?) important questions to ask are…
• Why is there a need for evidence? (e.g. efficiency, transparency,
accountability?)
• Who will define what counts as evidence and what is not
regarded as evidence? (e.g. the role of gatekeepers, those who
formulate criteria and standards; quantitative/qualitative data)
• Who will verify the evidence as valid and appropriate for the
purpose for and context in which it was collected?
• Who will use the evidence and for what purpose?
• What is the relationship between data, evidence, intelligence,
knowledge and wisdom?
• And finally, how do these questions impact the collection,
analysis and use of student data?

5.
[Diagram: a student's learning journey, ending in success, failure or early departure]
Mapping the collection, analysis and use of student
demographic and learning data to inform/support
pedagogy and learning
• What happened/is happening – descriptive analytics
• Why did it happen/is it happening – diagnostic analytics
• What will happen – predictive analytics
• How can we make it happen – prescriptive analytics
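The four analytics levels listed above can be sketched on invented data. Everything in this toy illustration (the student names, weekly login counts, and the at-risk threshold) is a hypothetical assumption for the sake of the example, not part of the lecture:

```python
# Toy weekly-login counts per student (invented data).
logins = {"ana": [5, 4, 6, 5], "ben": [3, 1, 0, 0], "chloe": [2, 2, 3, 4]}

# Descriptive: what happened? Average activity per student.
averages = {s: sum(w) / len(w) for s, w in logins.items()}

# Diagnostic: why is it happening? Here, flag declining activity.
declining = [s for s, w in logins.items() if w[-1] < w[0]]

# Predictive: what will happen? Naive linear extrapolation of next week.
def predict_next(weeks):
    slope = (weeks[-1] - weeks[0]) / (len(weeks) - 1)
    return max(0.0, weeks[-1] + slope)

# Prescriptive: how can we make it happen? Recommend an intervention
# for students predicted to fall below an assumed threshold of 2 logins.
at_risk = [s for s in logins if predict_next(logins[s]) < 2]

print(averages["ben"], declining, at_risk)
```

Each level asks more of the data than the previous one, which is exactly why the questions on the following slides (what data we have, what it proxies for, and whose choices it encodes) matter.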

6.
What data do we need to describe, understand, predict and
prescribe the learning journey?
What data do we already have, in which formats, for what
purposes, where are the data stored, and who governs them?
What data don’t we have and that we need to describe,
understand, predict and prescribe the learning journey?
What are our assumptions about learning, and data-as-evidence?
What data do students need to make better informed choices
and to take ownership of their learning journey?

7.
Learning analytics
• defines ‘learning’, ‘progression’ and ‘(dis)engagement’;
• decides samples, what data to collect (and when), what to use as
proxies and what data to ignore;
• decides who will interpret the analyses and findings, and how;
• decides what findings to share with whom as well as choosing
methods of dissemination (e.g. dashboards, early alerts, etc.)
(Re)considering learning analytics:

8.
Our understanding of learning analytics as a process of
collecting evidence and measuring/increasing
success and efficiency is shaped by…
… how we understand and describe
students and learning

9.
Source credit: https://www.edsurge.com/amp/news/2017-11-13-what-can-higher-ed-learn-from-precision-medicine
Students as broken and sick…
“Focused hospitals are built around a very specific
value-adding process activity: They take incomplete
or broken parts and then transform them into more
complete outputs of higher value while charging a fee
for the outcome.”

11.
How will we know that learning analytics is having a
positive impact?
Impact as:
• More successful/satisfied students?
• More effective and appropriate teaching?
• Better allocation/utilisation of resources?

12.
Retrieved from https://pdfs.semanticscholar.org/245d/03eb70a6ad06314257d997f570b1a71ab4cc.pdf
Where is the evidence? A call to action for
learning analytics

14.
We need to understand evidence of impact - or the lack
of it - as entangled in our assumptions about
evidence, data and the messy processes informing
implementation
Image credit: https://pixabay.com/en/telephone-telegraph-pole-wire-1822040/

16.
Source credit: https://www.avert.org/professionals/hiv-around-world/sub-saharan-africa/south-africa
Antiretroviral program (ART) in South Africa: Number
of people receiving antiretroviral treatment

18.
Source credit: https://www.avert.org/professionals/hiv-around-world/sub-saharan-africa/south-africa
At the intersections of evidence, politics and
policy
Antiretroviral program (ART) in South Africa: Number
of people receiving antiretroviral treatment

19.
Example 2: At the intersections of politics and evidence
Problem: Russia, 20th century – the need for a more
sustainable water supply
Stalin:
What is the problem? Do research. Propose an
evidence-based plan
Palchinsky:
Smaller dams will be more effective
Stalin:
I want to build the world's largest hydroelectric
dam
Stalin 1, Palchinsky 0
Outcome: Massive costs, 100,000 people
displaced
Source credit: https://www.amazon.com/Adapt-Success-Always-Starts-Failure/dp/1250007550

20.
Source credit: http://yournewswire.com/30-solid-scientific-studies-that-prove-vaccines-cause-autism/
Example 3: At the intersections of evidence and
Zeitgeist

33.
2. How do we think about ‘evidence’ in a
world of ‘fake news’, myths, alternative
facts, and post-truths?
Image credit: https://www.flickr.com/photos/freepress/6641427981

34.
“This is a president who has never
shown much fidelity to facts, unless
they are his own alternative ones.”
Site credit: https://www.nytimes.com/2017/09/09/opinion/sunday/trump-epa-pruitt-science.html

41.
Source credit: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3067552
“The promised Wealth of Networks has given way to a black box society –
one where trolls, bots, and even foreign governments maraud to distort the
information environment on Twitter, Facebook, Google News, Reddit, and
other networks.”

43.
5. How do we think about ‘evidence’ in a
world where ‘knowing’ does not mean
we will act?

44.
Lessons from Rwanda - Lessons for Today - Assessment of the Impact
and Influence of the Joint Evaluation of Emergency Assistance to
Rwanda –
Source: https://www.alnap.org/help-library/lessons-from-rwanda-lessons-for-today-assessment-of-the-impact-and-influence-of-the
Image credit: https://www.flickr.com/photos/dfid/3403032782
“… the refusal of international agencies and world
leaders to take seriously and use the data they were
given” (Patton, 2008, p. 9)

45.
6. How do we engage with ‘evidence’ in a world where
‘knowing’ does not mean we have the resources or
capacity to respond effectively and appropriately?
Image credit: https://commons.wikimedia.org/wiki/File:Wounded_Triage_France_WWI.jpg

47.
El-Khawas, E. 2000. Patterns of communication and miscommunication between research and policy. In S.
Schwarz & U. Teichler (Eds.), The institutional basis of higher education research. Experiences and perspectives
(pp. 45-55). London, UK: Kluwer Academic Publishers
8. How do we present ‘evidence’ in a world
where the evidence will only be considered
“when a problem is recognized as serious and
requiring a new solution; when the policy
community develops a financially and
technically workable solution; and when
political leaders find it advantageous to approve
it” (El-Khawas, 2000, p. 50; emphasis added)

49.
Contested. Political. Incomplete. Fragile.
(Re)considering learning analytics
in the context of the social
imaginary and realities
surrounding ’evidence’

50.
1. (Re)considering Evidence-Based
Management (EBM)
“Evidence-based education seems to favour a
technocratic [and quantitative] model in which it is
assumed that the only relevant research questions
are about the effectiveness of educational means
and techniques, forgetting, among other things,
that what counts as “effective” crucially depends
on judgments about what is educationally
desirable” (Biesta, 2007, p. 5)

51.
1. (Re)considering Evidence-Based
Management (EBM)
Evidence-based decision making is entangled at
the intersection of policy makers' impatience to
find quick solutions (in line with their political
agendas), stakeholders ‘on the ground’, and the
specialists who research the problems
(El-Khawas, 2000)

54.
We have access to ever increasing volumes, velocity and variety
of student digital data that allow us not only to expand the
traditional scope of institutional research with regard to student
data, but also to infer relations unthinkable ten years ago. We
may therefore be tempted to rush to look for patterns without
considering our own assumptions and epistemologies
Noisy systems with underdeveloped theory – there is a real
danger of mistaking the noise for a signal, and of not realising
that the noise pollutes our data with false alarms, “setting back
our ability to understand how the system really works” (Silver,
2012, p. 162)
2. (Re)considering Data

56.
2. (Re)considering Data (3)
• Data do not speak for themselves (as claimed by Mayer-
Schönberger & Cukier, 2013)
• It is not enough to know that people do things without
understanding why they act in a particular way
• N ≠ all (contra Mayer-Schönberger & Cukier, 2014)
• Data sets represent cultural, moral, and instrumental
choices (Brock, 2015)

57.
2. (Re)considering Data (4)
• More or big(ger) data are not (necessarily) better data
(Prinsloo, Archer, Barnes, Chetty & Van Zyl, 2015)
• “The sheer size of analysis does not eschew the limitations
of subjectivity” – the “unbearable lightness of information”
(Papacharissi, 2015, p. 1097)
• What happens when we collect, analyse and use the
bleeps, the logins, the downloads, the posts in and outside
of an institutional Learning Management system (LMS) as
the full/only narrative of what is happening in our
students’ lives, of their aspirations, and of their learning?

58.
What are the implications when we use their data
points to describe, diagnose, predict and prescribe
their learning journeys without ever asking them
what these data points mean to them, and what
data would matter to them to allow them to make
more informed decisions, to complete their journeys
and not ours?
2. (Re)considering Data (5)

59.
3. (Re)considering learning
How does evidence look and function when we consider
education as an open and recursive system, worlds
apart from research environments where we can control
variables?
How does evidence look when we consider student success
as a complex, non-linear process and result of intersecting,
often interdependent and mutually constitutive variables
in the nexus between students’ life-worlds, capital and
contexts, institutional assumptions, epistemologies and
(in)efficiencies, and macro-societal shifts and changes?

61.
Image credit: https://pixabay.com/en/binary-code-man-display-dummy-face-1327512/
Pointer 1: Learning analytics as moral practice
“Learning analytics should not only focus on what is effective,
but also aim to provide relevant pointers to decide what is
appropriate and morally necessary”
(Slade & Prinsloo, 2013, p. 1519)

63.
If learning analytics’ primary aim is to improve students’
learning (Gašević, Dawson, & Siemens, 2015), maybe we should
ask…
• To what extent has learning analytics become our voice-over of
student learning experiences by telling them ‘this is what your
learning looks like and it is the only narrative that matters’…?
• How can we replace our epistemological arrogance and
colonisation of the student experience with an epistemological
shift from ‘knowing’ to listening, humility and respect?
Pointer 2: Change the narrative: Whose story is it
anyway?
Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning. TechTrends, 59(1), 64-71.
Tuck, E., & Fine, M. (2007). Inner angles. A range of ethical responses to/with indigenous and decolonising theories. In Norman
K Denzin and Michael D Giardina (Eds), Ethical futures in qualitative research. Decolonising the politics of knowledge (pp.
145-168). Walnut Creek, CA: Left Coast Press Inc.

64.
• What happens if we stop talking about students as ‘sick’,
‘broken’ and ‘dropouts’? What happens when we move from
a deficit model of student learning to one that focuses on
what they have, what they bring to the table?
• What data do we have that, if students had access to it,
would allow them to make better and more informed
choices?
• What data do they have that, if we had access to it with
their permission, would allow us to help them make
better choices?
Pointer 2: Changing the narrative: Whose story
is it anyway? (cont.)

65.
Page credit: https://www.slideshare.net/prinsp/eden-triage
11 June 2014
Pointer 3: It is also our story and being response-able

69.
Pointer 4: Recognise the political nature of data
and evidence
How do we collect, analyse and use data when the
data represent the results of generations of structural
inequalities and injustices?
How do we use learning analytics to break inter-
generational cycles of bias, discrimination and
exclusion?

70.
Pointer 5: Consider the difference between
correlation and causation in highly complex and
dynamic systems
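The correlation/causation distinction can be made concrete with a minimal sketch on fabricated data: a hidden confounder (here labelled "prior preparation") drives both forum activity and grades, so the two correlate strongly even though neither causes the other in this toy model. All variable names, noise levels and numbers are assumptions for illustration only:

```python
import random

random.seed(42)

# Hidden confounder: each student's prior preparation (invented).
preparation = [random.random() for _ in range(500)]

# Both observed variables depend on the confounder plus small noise;
# neither depends on the other.
forum_posts = [p + random.gauss(0, 0.1) for p in preparation]
grades = [p + random.gauss(0, 0.1) for p in preparation]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Strong correlation arises from the confounder alone.
r = pearson(forum_posts, grades)
print(round(r, 2))
```

In a highly complex, dynamic system like education, such confounders are the rule rather than the exception, which is why a strong correlation in a dashboard is not, by itself, evidence of a causal lever.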

72.
Pointer 7: Higher education cannot afford NOT
to collect, analyse and use student learning data
• We have a contractual duty to ensure effective and appropriate
learning experiences
• We have a fiduciary duty of care in the context of the
asymmetrical power relationship between institution and
students
• We have the opportunity and the authority to act/care/respond
• We don’t have unlimited resources
• We need to be transparent and accountable for what we can and
cannot do.
• Once we know, we cannot un-know – we have a moral
duty to respond

74.
Image credit: https://pixabay.com/en/question-mark-pile-question-mark-1481601/
We should see evidence as an invitation to a
conversation and not the end of the
conversation
(In)conclusions

75.
Image credit: http://www.basicknowledge101.com/subjects/reality.html
The question therefore is not “Will the future of
Higher Education be evidence-based?”, but how
do we define evidence, who defines it, and
what/whose purpose will it serve when acted
upon?