1. Dr Paul Prinsloo, Education Consultant and Researcher,
Directorate for Curriculum and Learning Development,
University of South Africa
ETHICS AND LEARNING
ANALYTICS AS A FAUSTIAN
PACT: BETWEEN ORWELL,
HUXLEY, KAFKA AND THE
DEEP BLUE SEA
5. OVERVIEW OF THE PRESENTATION
• Some questions to ponder on…
• Learning analytics and ethics – introductory remarks
• A short history of profiling
• Ethics in learning analytics – different approaches
• Orwell, Huxley, Kafka and Faust
• A short overview of different frameworks
• Learning analytics as moral practice
• Some questions to ponder on… (revisited)
• (In)conclusions
6. SOME QUESTIONS TO PONDER ON…
1. What are some of the dangers in learning analytics?
2. Is “raw data” an oxymoron?
3. Should students be allowed to opt out of having their
personal digital footprints harvested and analysed?
4. To what extent should students have access to the
content of their digital dossiers, who has access to
these dossiers, and what they are used for?
5. How complete and permanent a picture do our data
provide about students?
7. SOME QUESTIONS TO PONDER ON… (cont.)
6. To what extent do we provide students the option to
update their digital dossiers and provide extra (possibly
qualitative) data?
7. Do students have the right to request that their digital
dossiers be deleted on graduation?
8. If we outsource the collection (and analysis) of student
digital data to companies, do students need to give
consent? [Who owns a student’s data?]
9. Are bigger data sets always better, or do they provide
more complete pictures?
10. What responsibility comes with ‘knowing’?
10. Amidst the hype and
potential…
Learning analytics, like knowledge
and progress, is not an unqualified
good – “it can be used as much as a
curse as as a blessing”
(Gray 2004, p. 70 referring to knowledge and
progress)
12. The information network…
An elaborate lattice of information
networks creates digital collages and
biographies from shards of data to form
digital dossiers used by governments,
employers, higher education institutions
and the corporate sector
(Solove, 2004)
13. The gaps…
“We are more than the bits of data we give off
as we go about our lives. Our digital biography is
revealing of ourselves but in a rather
standardized way” (p. 46). Our digital
biographies are therefore not only (to some
extent) unauthorized, but also to a huge extent
reductive, partial and often inaccurate.
(Solove, 2004)
14. Six provocations of Big Data…
1. “Automating research changes the definition of
knowledge” (p.3)
2. “Claims to objectivity and accuracy are
misleading” (p. 4)
3. “…bigger data are not always better data” (p. 6)
4. “Not all data are equivalent” (p. 8)
5. “Just because it is accessible doesn’t make it
ethical” (p. 10)
6. “Limited access to big data creates new digital
divides” (p. 12)
(boyd & Crawford, 2013)
15. Contested objectivity…
“Data and data sets are not objective; they are
creations of human design. We give numbers
their voice, draw inferences from them, and
define their meaning through our
interpretations” (Crawford, 2013, para.2).
“…raw data is an oxymoron” (Crawford, 2013,
para. 7).
16. A number of disclaimers…
I am not going to refer to
• Aristotle, Immanuel Kant, John Stuart Mill, and John Rawls
• Various international and South African laws on privacy,
protection of personal information, personal and national
security
• Institutional review boards’ criteria and processes
I will
• Share my personal sense-making of an extremely complex and
volatile field
• Provoke and question some of my own and possibly our
assumptions about data, surveillance and our current techno-
determinism and reductionist approaches to interpreting
student data
30. How can learning analytics assist us to stop the
revolving door in higher education?
31. A brief overview of
different approaches
to thinking about
ethics in learning
analytics
32. Different approaches to discussing ethical implications
of surveillance, and profiling:
• Philosophical approaches – e.g. Willis, Campbell and
Pistilli (2013)
• Legal approaches – Marx (1998, 2001); Solove (2004)
• A rights-based approach versus a discursive-disclosive
approach (Stoddart, 2012)
• Tensions between “public” and “private” (Marx, 2001)
• Borders of personal information (Marx & Muschert,
2007)
• Socio-critical (Bauman & Lyon, 2013; Prinsloo & Slade,
2013; Slade & Prinsloo, 2013)
34. “Metaphors are tools of shared cultural
understanding” (Balkin, 1998, in Solove, 2004,
p. 28)
Metaphors don’t function “to render a precise
descriptive presentation of the problem; rather,
they capture our concerns over privacy in a way
that is palpable, potent, and compelling”
(Solove, 2004, p. 28).
35. TOWARDS A FRAMEWORK FOR
ETHICS IN LEARNING ANALYTICS
• 1973 – “Code of fair information
practices”
• 29 questions by Marx (1998)
• 9 Principles by Pounder (2008)
36. 1. There must be no personal-data record-keeping systems whose
very existence is secret.
2. There must be a way for an individual to find out what
information about him (sic) is in the record and how it is used.
3. There must be a way for an individual to prevent information
obtained about him for one purpose from being used or made
available for other purposes without his consent.
4. There must be a way for an individual to correct or amend a
record of identifiable information about him.
5. Any organisation creating, maintaining, using, or disseminating
records of identifiable personal data must assure the reliability
of the data for their intended use and must take reasonable
precautions to prevent misuse of the data.
1973 Code of fair information practices
37. 29 questions by Marx (1998)
Means: Harm, boundaries, trust, personal
relationships, validity
Data collection contexts: Awareness, consent,
golden rule, minimisation, public decision making,
human review, right of inspection, right to
challenge, redress and sanction, stewardship and
protection, unintended precedents, etc.
Uses: Beneficiary, proportionality, alternative
means, consequences of inaction, protections, etc.
38. 9 Principles by Pounder (2008)
Principle 1: The justification principle
Principle 2: The approval principle
Principle 3: The separation principle
Principle 4: The adherence principle
Principle 5: The reporting principle
Principle 6: The independent supervision principle
Principle 7: The privacy principle
Principle 8: The compensation principle
Principle 9: The unacceptability principle
39. 6 Principles for learning analytics as moral
practice (Slade & Prinsloo, 2013)
1. Learning analytics as moral practice
2. Students as agents
3. Student identity and performance as temporal
dynamic constructs
4. Student success is a complex and
multidimensional phenomenon
5. Transparency
6. Higher education cannot afford not to use data
40. A number of considerations for an ethics
architecture for learning analytics (Slade &
Prinsloo, 2013)
1. Who benefits and under what conditions?
2. Conditions for consent, de-identification, and
opting out – including considerations regarding
vulnerability and harm
3. Data collection, analyses, access and storage
4. Governance and resource allocation
41. SOME QUESTIONS TO PONDER ON…
Revisited
1. What are some of the dangers in learning analytics?
2. Is “raw data” an oxymoron?
3. Should students be allowed to opt out of having their
personal digital footprints harvested and analysed?
4. To what extent should students have access to the
content of their digital dossiers, who has access to
these dossiers, and what they are used for?
5. How complete and permanent a picture do our data
provide about students?
42. SOME QUESTIONS TO PONDER ON… (cont.)
6. To what extent do we provide students the option to
update their digital dossiers and provide extra (possibly
qualitative) data?
7. Do students have the right to request that their digital
dossiers be deleted on graduation?
8. If we outsource the collection (and analysis) of student
digital data to companies, do students need to give
consent? [Who owns a student’s data?]
9. Are bigger data sets always better, or do they provide
more complete pictures?
10. What responsibility comes with ‘knowing’?
43. (In)conclusions
Referring to his set of 29 questions, Marx
(1998) states that the questions will not
satisfy philosophers and practitioners who
“lust after a Rosetta stone of clear and
consistent justifications”, but that they
strive to serve more as “an imperfect compass
than a detailed map” (Marx, 1998, p. 182).
44. A detailed map “can lead to the
erroneous conclusion that ethical
directions can be easily reached or to a
statement so far in the stratosphere that
only angels can see and apply it” (Marx,
1998, p. 182).
45. Thank you for sharing my
personal journey to make sense
of ethics in learning analytics as a
Faustian pact, caught between
Orwell, Huxley, Kafka and the
deep blue sea…
46. Thank you. Baie dankie. Ke a leboga.
Paul Prinsloo
TVW4-69
P O Box 392
Unisa
0003
prinsp@unisa.ac.za
http://opendistanceteachingandlearning.wordpress.com
Twitter: 14prinsp
+27124293683
+27823954113
47. REFERENCES
boyd, d., & Crawford, K. (2013). Six provocations for Big Data. Retrieved from
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1926431
Code of fair information practices. (1973). Retrieved from
http://simson.net/ref/2004/csg357/handouts/01_fips.pdf
Crawford, K. (2013, April 1). The hidden biases in big data. [Web log post]. Harvard
Business Review. Retrieved from
http://blogs.hbr.org/cs/2013/04/the_hidden_biases_in_big_data.html
Gray, J. (2004). Heresies. Against progress and other illusions. London, UK: Granta Books.
Marx, G.T. (1998). Ethics for the new surveillance. The Information Society: An
International Journal, 14(3), 171-185. DOI: 10.1080/019722498128809
Marx, G.T. (2001). Murky conceptual waters: the public and the private. Ethics and
Information Technology, 3, 157-169.
Marx, G.T., & Muschert, G.W. (2007). Personal information, borders, and the new
surveillance. Annual Review of Law and Social Science, 3, 375-395.
Pounder, C.N.M. (2008). Nine principles for assessing whether privacy is protected in a
surveillance society. Identity in the Information Society, 1, 1-22. DOI: 10.1007/s12394-
008-0002-2.
48. REFERENCES (CONT.)
Prinsloo, P., & Slade, S. (2013). An evaluation of policy frameworks for addressing ethical
considerations in learning analytics. LAK '13 Proceedings of the Third International
Conference on Learning Analytics and Knowledge (LAK13), pp. 240-244. Retrieved
from http://dl.acm.org/citation.cfm?id=2460344
Silver, N. (2012). The signal and the noise. The art and science of prediction. London, UK:
Allen Lane.
Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American
Behavioral Scientist, XX(X), 1-20. DOI: 10.1177/0002764213479366
Solove, D.J. (2004). The digital person. Technology and privacy in the information age.
New York, NY: New York University Press.
Stoddart, E. (2012). A surveillance of care. Evaluating surveillance ethically. In K. Ball,
K.D. Haggerty, and D. Lyon (eds.), Routledge handbook of surveillance studies.
London, UK: Routledge, pp. 369-376.
Willis, J.E. III, Campbell, J.P., & Pistilli, M.D. (2013, May 6). Ethics, big data, and analytics: a
model for application. EDUCAUSEreview online. Retrieved from
http://www.educause.edu/ero/article/ethics-big-data-and-analytics-model-
application