
A brave new world - Student surveillance in higher education - Revisited

Presentation at the Association for Institutional Research (AIR15) Forum, May 26-29, 2015, Denver, Colorado, USA


  1. A brave new world: student surveillance in higher education – revisited. By Paul Prinsloo (University of South Africa). Presentation at the Association for Institutional Research (AIR) Forum, Denver, May 26-29 – “Data and decisions for higher education”. Photograph: Paul Prinsloo
  2. ACKNOWLEDGEMENTS
     • Except for the photographs on the title and last slides, I don’t own the copyright of any of the images used and hereby acknowledge their original copyright and licensing regimes. All the images used in this presentation have been sourced from Google and were labeled for non-commercial reuse.
     • This work (excluding the licensing regimes of the images from Google) is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
     • An earlier version of this presentation won the Best Paper Award at the Southern African Association for Institutional Research (SAAIR) Conference, 16-18 October 2014, held in Pretoria, South Africa.
  3. When we hear ‘surveillance’ we may think of… Source credit: http://mashable.com/2013/06/09/edward-snowden/
  4. Source: http://www.pewinternet.org/2015/03/16/Americans-Privacy-Strategies-Post-Snowden/#the-public-has-divided-sentiments-about-the-surveillance-programs
  5. Source: http://www.pewinternet.org/2015/03/16/Americans-Privacy-Strategies-Post-Snowden/#the-public-has-divided-sentiments-about-the-surveillance-programs
  6. Or we may think of… the ‘Panopticon’ (Jeremy Bentham, 1787); or of Greek mythology – Argus Panoptes, a giant with 100 eyes. Image credit: http://commons.wikimedia.org/wiki/File:Penetentiary_Panopticon_Plan.jpg
  7. And we may also think of…
  8. “Secrets are lies” “Sharing is caring” “Privacy is theft” (Eggers, 2013, p. 303). And more recently… TruYou – “one account, one identity, one password, one payment system, per person. (…) The devices knew where you were… One button for the rest of your life online… Anytime you wanted to see anything, use anything, comment on anything or buy anything, it was one button, one account, everything tied together and trackable and simple…” (Eggers, 2013, p. 21). The eerie resemblance to the way higher education sees the institutional learning management system (LMS) is accidental…
  9. Every breath you take / Every move you make / Every bond you break / Every step you take / I'll be watching you / Every single day / Every word you say / Every game you play / Every night you stay / I'll be watching you / O can't you see / You belong to me… (Sting – Every breath you take)
  10. Image credit: http://en.wikipedia.org/wiki/List_of_Back_to_the_Future_characters
  11. The Paperholder – “le serre-papiers” (1749)
     • In 1749 Jacques François Guillauté proposed “le serre-papiers” – the Paperholder – to King Louis XV
     • One of the first attempts to articulate a new technology of power – one based on traces and archives (Chamayou, n.d.)
     • The stored documents comprised individual reports on each and every citizen of Paris
     The technology would allow the sovereign “…to know every inch of the city as well as his own house, he will know more about ordinary citizens than their own neighbours and the people who see them everyday (…) in their mass, copies of these certificates will provide him with an absolute faithful image of the city” (Chamayou, n.d.)
     Image source: https://www.mpiwg-berlin.mpg.de/en/news/features/feature14 (copyright could not be established)
  12. The great Ivy League photo scandal, 1940-1970. “… a person’s body, measured and analysed, could tell much about intelligence, moral worth, and probably future achievement… The data accumulated… will eventually lead on to proposals to ‘control and limit the production of inferior and useless organisms’” (Rosenbaum, 1995). Image credit: http://iconicphotos.wordpress.com/2010/07/29/the-great-ivy-league-photo-scandal/
  13. So how do we understand and critically engage with the issues surrounding the increasing surveillance of students in higher education? Image credit: http://graffitiwatcher.deviantart.com/art/Big-Brother-is-Watching-173890591
  14. If we accept that “… ‘educational technology’ needs to be understood as a knot of social, political, economic and cultural agendas that are riddled with complications, contradictions and conflicts” (Selwyn, 2014, p. 6) – what are the implications for the collection, analysis and use of student data?
  15. Understanding the collection, analysis and use of student data in the contexts of…
     • Broader trends in higher education
     • The move from surveillance to sousveillance
     • The discourses on data and, increasingly, Big Data
  16. Understanding the collection, analysis and use of student data in the context of some of the broader trends in higher education:
     1. Changes in funding regimes – funding follows performance rather than preceding it
     2. Increasing concerns regarding student retention and dropout
     3. Ranking systems and the internationalization of higher education
     4. Higher education as business – the dominant neoliberal paradigm
     5. The algorithmic turn and quantification fetish in higher education
     6. The increasing digitization of learning and teaching
     7. The gospel of technosolutionism in higher education
     8. The hype, promise and dangers of (Big) data
  17. (Student) data as Medusa – techno-solutionism in action. Higher education is mesmerized/seduced by the potential of the collection, analysis and use of student data. Image credit: http://en.wikipedia.org/wiki/Medusa
  18. Every page you view / Every click you make / Every link you follow / Every step you (don't) take / I'll be watching you / Every single day / Every word you say / Every game you play / Every night you stay / I'll be watching you / O can't you see / You belong to me… (Adapted from Sting – Every breath you take)
  19. The dream/nightmare of personalizing learning… As student data are increasingly used to personalise learning and to allocate resources, we are creating a “brave new world” with Delta children wearing khaki, Epsilons wearing black (“they’re too stupid to be able to read or write”), and Gammas wearing green (Huxley, 2007, p. 22)
  20. Understanding the collection, analysis and use of student data in the context of the change from surveillance to sousveillance. Image credit: http://commons.wikimedia.org/wiki/File:SurSousVeillanceByStephanieMannAge6.png
  21. From surveillance to sousveillance… Jennifer Ringley – 1996-2003 – webcam. If I did not share it on Facebook, did it really happen? We share more than ever before, we are watched more than ever before and we watch each other more than ever before… Source: http://onedio.com/haber/tum-zamanlarin-en-etkili-ve-onemli-internet-videolari-36465
  22. We are increasingly watched/measured. We increasingly watch/measure each other. We increasingly watch ourselves. Image credit: http://en.wikipedia.org/wiki/Surveillance#/media/File:Surveillance_video_cameras,_Gdynia.jpeg
  23. Understanding the collection, analysis and use of student data in the context of the discourses on (Big) data
  24. The Trinity of Big Data:
     • Different sources/variety of quality/integrity of data
     • Different role-players with different interests – individuals, corporates, governments, higher education, data brokers, fusion centers
     • Different methods/types of surveillance, harvesting and analysis
     Issues re informed consent; reuse/contextual integrity/context collapse; ethics/privacy/justice/care.
     Adapted & refined from Prinsloo, P. (2014). A brave new world. Presentation at SAAIR, 16-18 October. http://www.slideshare.net/prinsp/a-brave-new-world-student-surveillance-in-higher-education
  25. Three sources of data:
     • Directed – a digital form of surveillance wherein the “gaze of the technology is focused on a person or place by a human operator”
     • Automated – generated as “an inherent, automatic function of the device or system and include traces …”
     • Volunteered – “gifted by users and include interactions across social media and the crowdsourcing of data wherein users generate data” (emphasis added) (Kitchin, 2013, pp. 262-263)
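A minimal sketch of how Kitchin's three sources above might be applied in practice, tagging items in an institutional data inventory. The event names and tags here are hypothetical illustrations, not drawn from the presentation:

```python
from enum import Enum

class DataSource(Enum):
    DIRECTED = "directed"        # a human operator aims the gaze (e.g. proctoring)
    AUTOMATED = "automated"      # traces generated as a side effect of system use
    VOLUNTEERED = "volunteered"  # gifted by users (e.g. forum posts, profiles)

# Hypothetical examples of how common higher-education data points might be tagged
inventory = {
    "exam_proctoring_video": DataSource.DIRECTED,
    "lms_clickstream_log": DataSource.AUTOMATED,
    "discussion_forum_post": DataSource.VOLUNTEERED,
}

for item, source in inventory.items():
    print(f"{item}: {source.value}")
```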
  26. Image credit: http://commons.wikimedia.org/wiki/File:Red_sandstone_Lattice_piercework,_Qutb_Minar_complex.jpg
  27. Big data presents “a paradigm shift in the ways we understand and study our world” (Eynon, 2013, p. 237). Image credit: http://commons.wikimedia.org/wiki/File:DARPA_Big_Data.jpg
  28. Amidst the hype and the potential there are also…
     • “Rule by algorithm? Big data and the threat of algocracy” – Danaher (2014)
     • “Big Data’s big unintended consequences” – Wigan & Clarke (2013)
     • “Six provocations for Big Data” – boyd & Crawford (2011)
     • “Critical questions for Big Data” – boyd & Crawford (2012)
     • “Judged by the Tin Man: Individual rights in the age of Big Data” – Tene & Polonetsky (2013)
     • “Algorithmic accountability” – Diakopoulos (2014)
     • “The scored society: Due process for automated predictions” – Citron & Pasquale (2014)
     • “Big data, data integrity, and the fracturing of the control zone” – Lagoze (2014)
  29. Caught between visions of Utopias and Dystopias…
  30. Shi(f)t happens…
     • The claim that Big Data is equivalent to “allness” (Lagoze, 2014) – n=all – providing a complete view of reality
     • Big Data “lessen[s] our desire for exactitude” (Mayer-Schönberger & Cukier, 2013, in Lagoze, 2014)
     • It is no longer necessary to investigate why things happen; what matters is noting what is happening – the data speak for themselves…
  31. “Hidden algorithms can make (or ruin) reputations, decide the destiny of entrepreneurs, or even devastate an entire economy. Shrouded in secrecy and complexity, decisions at major Silicon Valley and Wall Street firms were long assumed to be neutral and technical. But leaks, whistleblowers, and legal disputes have shed new light on automated judgment” Source: http://www.hup.harvard.edu/catalog.php?isbn=9780674368279
  32. Critical questions for big data – boyd & Crawford (2012)
     1. Big data changes the definition of knowledge – “Who knows why people do what they do? The point is they do it, and we can track and measure it with unprecedented fidelity. With enough data, the numbers speak for themselves” (Anderson, 2008, in boyd & Crawford, 2012, p. 666)
     2. Claims to objectivity and accuracy are misleading – “working with Big Data is still subjective, and what it quantifies does not necessarily have a closer claim on objective truth” (boyd & Crawford, 2012, p. 667). Big Data “enables the practice of apophenia: seeing patterns where none actually exist, simply because enormous quantities of data can offer connections that radiate in all directions” (ibid., p. 668)
  33. Critical questions for big data (2) – boyd & Crawford (2012)
     3. Bigger data are not always better data
     4. Taken out of context, Big Data loses its meaning – leading to context collapse & lack of contextual integrity
     5. Just because it is accessible does not make it ethical – the difference in ethical review procedures and oversight between research and ‘institutional research’
     6. Limited access to Big Data creates new digital divides
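The apophenia point in critical question 2 above can be made concrete numerically: with enough candidate variables, strong correlations appear in pure noise. A minimal, self-contained sketch, with the student-data framing purely illustrative:

```python
import random

random.seed(42)

def corr(xs, ys):
    """Pearson correlation, computed by hand to stay dependency-free."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# A purely random "outcome" (e.g. a course result) for 30 students...
outcome = [random.random() for _ in range(30)]
# ...and 1,000 equally random candidate "predictors" (e.g. click counts).
predictors = [[random.random() for _ in range(30)] for _ in range(1000)]

best = max(abs(corr(p, outcome)) for p in predictors)
print(f"Strongest correlation found in pure noise: {best:.2f}")
# Typically prints a value above 0.5: a pattern where none actually exists.
```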
  34. Points of departure: (Big) data is… not an unqualified good (boyd & Crawford, 2011) and “raw data is an oxymoron” (Gitelman, 2013) – see Kitchin, 2014. Technology, and specifically the use of data, has been and will always be ideological (Henman, 2004; Selwyn, 2014) and embedded in relations of power (Apple, 2004; Bauman, 2012)
  35. The collection, analysis and use of student data: where surveillance and sousveillance meet…
  36. Towards a typology of surveillance in higher education (Knox, 2010):
     • Panoptic – automated/internalised visibility
     • Rhizomatic – multidirectional/sousveillance/surveillant assemblages – merging of different spheres/multiplying/amplifying
     • Predictive
  37. Predictive surveillance
     • The (digital) past + (digital) real-time => future behavior… “The past becomes a simulated prologue” (Knox, 2010, p. 6; emphasis added)
     • Structuring/constraining choices/possibilities – shaping material conditions (Henman, 2004; Napoli, 2013)
     • Issues of identity/race/gender/class
     • The end of forgetting (Mayer-Schönberger, 2009; Rosen, 2010)
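A minimal sketch of the predictive logic described above: past and real-time digital traces combine into a risk score that then structures what a student is offered. All weights, thresholds and field names are invented for illustration, not a validated model:

```python
def risk_score(past_gpa, logins_this_week, forum_posts):
    """Toy predictive score: the (digital) past becomes a 'simulated prologue'.
    Weights are arbitrary illustrations only."""
    score = 0.0
    score += (3.0 - past_gpa) * 0.4               # past performance dominates
    score += max(0, 3 - logins_this_week) * 0.15  # real-time engagement signal
    score += max(0, 2 - forum_posts) * 0.1
    return score

def structure_treatment(score):
    """The prediction alters the student's environment (Knox's 'structuring')."""
    if score > 0.8:
        return "flagged: advisor contact, restricted module choice"
    return "default pathway"

student = {"past_gpa": 2.1, "logins_this_week": 1, "forum_posts": 0}
s = risk_score(**student)
print(f"score={s:.2f} -> {structure_treatment(s)}")
```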
  38. Preliminary seven dimensions of surveillance (Knox, 2010):
     1. Automation
     2. Visibility
     3. Directionality
     4. Assemblage
     5. Temporality
     6. Sorting
     7. Structuring
  39. 1. Automation – key questions and dimensional intensity (low ↔ high):
     • What is the timing of the collection? Intermittently/infrequently ↔ Continuous
     • Locus of control? Human ↔ Machine
     • Can it be turned on and off (and by whom)? All the monitoring can be turned on/off ↔ None of the monitoring can be turned off
  40. 2. Visibility – key questions and dimensional intensity (low ↔ high):
     • Is the surveillance apparent and transparent? All parts (collection, storage, processing and viewing) are visible ↔ None of the monitoring is visible
     • Ratio of self-to-surveillant knowledge? Subject knows everything the surveillant knows ↔ Subject does not know anything that the surveillant knows
  41. 3. Directionality – key questions and dimensional intensity (low ↔ high):
     • What is the relative power of surveillant to subject? Subjects hold all the power ↔ Surveillant holds all the power
     • Who has access to monitoring/recording/broadcasting functions? Subjects ↔ Surveillant
  42. 4. Assemblage – key questions and dimensional intensity (low ↔ high):
     • Medium of surveillance? Single medium (e.g. text) ↔ Multimedia
     • Are the data stored? No ↔ Yes
     • Who stores the data? Subject or collector ↔ Third party
  43. 5. Temporality – key questions and dimensional intensity (low ↔ high):
     • When does the monitoring occur? Confined to the present ↔ Combines the present with the past
     • How long is the monitoring frame? One isolated, relatively short frame (e.g. a test) ↔ Long periods, or indefinitely
     • Does the system attempt to predict future behavior/outcomes? No – only assessment of the present ↔ Present + past used to predict the future
     • When are the data available? All of the data available only after the event is completed ↔ Available in real time and experienced as instantaneous
  44. 6. Sorting – key questions and dimensional intensity (low ↔ high):
     • Are subjects’ data compared with other data – other individuals/groups/abstract configurations/state mandates? None ↔ Other data are used as basis for comparison
  45. 7. Structuring – key questions and dimensional intensity (low ↔ high):
     • Are data used to alter the environment (i.e. treatment, experience, etc.)? Not used ↔ Used to alter the environment of all subjects
     • Are data used to target the subject for different treatment than they would otherwise receive? No data are used as basis for differing treatment ↔ Based on data, treatment is prescribed
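Read together, Knox's seven dimensions amount to a scoring rubric. A sketch of how an institution might rate one of its own systems along each dimension; the 0-1 scale and the scores for this hypothetical LMS are invented for illustration, not taken from Knox:

```python
# Each dimension scored 0.0 (low intensity) to 1.0 (high intensity).
# Scores below are invented for a hypothetical LMS, purely to show the shape.
DIMENSIONS = ["automation", "visibility", "directionality",
              "assemblage", "temporality", "sorting", "structuring"]

hypothetical_lms = {
    "automation": 0.9,      # continuous, machine-driven, cannot be switched off
    "visibility": 0.8,      # subjects see little of what the surveillant sees
    "directionality": 0.7,  # institution holds most of the power and access
    "assemblage": 0.8,      # multimedia traces, stored, partly by third parties
    "temporality": 0.9,     # past + present combined, real-time, predictive
    "sorting": 0.6,         # compared against cohorts and external benchmarks
    "structuring": 0.7,     # data alter treatment and available pathways
}

profile = [hypothetical_lms[d] for d in DIMENSIONS]
print("Mean surveillance intensity:", round(sum(profile) / len(profile), 2))
for d, v in zip(DIMENSIONS, profile):
    print(f"  {d:15s} {'#' * int(v * 10)}")
```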
  46. Do students know/have the right to know…
     • what data we harvest from them
     • about the assumptions that guide our algorithms
     • when we collect data & for what purposes
     • who will have access to the data (now & later)
     • how long we will keep the data & for what purpose & in what format
     • how we will verify the data &
     • do they have access to confirm/enrich their digital profiles…?
     Adapted from Prinsloo, P., & Slade, S. (2015). Student privacy self-management: implications for learning analytics. Presentation at LAK15, Poughkeepsie, NY, 16 March 2015. http://www.slideshare.net/prinsp/lak15-workshop-vulnerability-final
  47. Do they know? Do they have the right to know? Can they opt out, and what are the implications if they do/don’t? Adapted from Prinsloo, P., & Slade, S. (2015). Student privacy self-management: implications for learning analytics. Presentation at LAK15, Poughkeepsie, NY, 16 March 2015. http://www.slideshare.net/prinsp/lak15-workshop-vulnerability-final
  48. What are the implications for the collection, analysis and use of student (digital) data? (Prinsloo & Slade, 2015)
     1. The duty of reciprocal care
     • Make terms and conditions (T&Cs) as accessible and understandable as possible (the latter may mean longer…)
     • Make it clear what data is collected, when, for what purpose, for how long it will be kept, and who will have access and under what circumstances
     • Provide users access to information and data held about them, to verify and/or question the conclusions drawn, and where necessary, provide context
     • Provide access to a neutral ombudsperson (Prinsloo & Slade, 2015)
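The reciprocal-care bullets above describe, in effect, a disclosure record that could accompany every act of collection. A minimal sketch; all field names and example values are invented, not taken from Prinsloo and Slade:

```python
from dataclasses import dataclass, field

@dataclass
class DataCollectionDisclosure:
    """One record answering the questions students have a right to ask."""
    what: str            # which data are collected
    when: str            # timing of collection
    purpose: str         # why they are collected
    retention: str       # how long, and in what format
    access: list = field(default_factory=list)  # who can see them, and when
    verification: str = "students may view, question, and contextualise"
    ombud: str = "neutral ombudsperson contact available on request"

disclosure = DataCollectionDisclosure(
    what="LMS clickstream and assessment submissions",
    when="continuously while logged in",
    purpose="early-warning support, not resource gatekeeping",
    retention="anonymised after 5 years",
    access=["course team", "institutional research (aggregated only)"],
)
print(disclosure)
```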
  49. What are the implications…? (2)
     2. The contextual integrity of privacy and data – ensure the contextual integrity and lifespan of personal data. Context matters…
     3. Student agency and privacy self-management
     • The fiduciary duty of higher education implies a social contract of goodwill and ‘do no harm’
     • The asymmetrical power relationship between institution and students necessitates transparency, accountability, access and input/collaboration
     • Empower students – digital citizenship/care
     • The costs and benefits of sharing data with the institution should be clear
     • Higher education should not accept a non-response as equal to opting in… (Prinsloo & Slade, 2015)
  50. What are the implications…? (3)
     4. Future direction and reflection
     • Rethink consent and employ nudges – move away from thinking just in terms of a binary of opting in or out, and provide a range of choices in specific contexts or needs
     • Develop partial privacy self-management – based on context/need/value
     • Adjust privacy’s timing and focus – the downstream use of data, the importance of contextual integrity, the lifespan of data
     • Move toward substance over neutrality – blocking troublesome and immoral practices, but also soft, negotiated spaces of reciprocal care (Prinsloo & Slade, 2015)
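A sketch of what "partial privacy self-management" with a range of choices might look like in code: consent recorded per data type and per purpose rather than as one global opt-in/opt-out, with a non-response defaulting to asking rather than consenting. The contexts and purposes are invented examples:

```python
# Partial privacy self-management: consent is granted per context and per
# purpose, not as a single all-or-nothing switch. Contexts are illustrative.
DEFAULT = "ask"  # a non-response is NOT treated as opting in

consent = {
    ("lms_clicks", "personal tutor feedback"): "granted",
    ("lms_clicks", "institutional benchmarking"): "denied",
    ("forum_posts", "learning analytics research"): "ask",
}

def may_use(data, purpose):
    decision = consent.get((data, purpose), DEFAULT)
    if decision == "ask":
        return f"prompt the student before using {data} for {purpose}"
    return f"{decision}: {data} for {purpose}"

print(may_use("lms_clicks", "institutional benchmarking"))
print(may_use("attendance", "room allocation"))  # unknown context -> ask first
```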
  51. (In)conclusions: “Technology is neither good nor bad; nor is it neutral… technology’s interaction with social ecology is such that technical developments frequently have environmental, social, and human consequences that go far beyond the immediate purposes of the technical devices and practices themselves” (Kranzberg, 1986, p. 545, in boyd & Crawford, 2012)
  52. (In)conclusions: Learning analytics are a structuring device – not neutral, informed by current beliefs about what counts as knowledge and learning, colored by assumptions about gender/race/class/capital/literacy, and in service of and perpetuating existing or new power relations. Welcome to a brave new world…
  53. Last thing I remember, I was running for the door / I had to find the passage back to the place I was before / “Relax,” said the night man, “We are programmed to receive. / You can check out any time you like, but you can never leave!” (Eagles – Hotel California)
  54. THANK YOU
     Paul Prinsloo (Prof), Research Professor in Open Distance Learning (ODL), College of Economic and Management Sciences, Office number 3-15, Club 1, Hazelwood, P O Box 392, Unisa, 0003, Republic of South Africa
     T: +27 (0) 12 433 4719 (office) | T: +27 (0) 82 3954 113 (mobile)
     prinsp@unisa.ac.za | Skype: paul.prinsloo59
     Personal blog: http://opendistanceteachingandlearning.wordpress.com
     Twitter profile: @14prinsp
     Photograph: Paul Prinsloo
  55. References and additional reading
     Apple, M.W. (Ed.). (2010). Global crises, social justice, and education. New York, NY: Routledge.
     Bauman, Z. (2012). On education: Conversations with Riccardo Mazzeo. Cambridge, UK: Polity.
     Booth, M. (2012, July 18). Learning analytics: the new black. EDUCAUSEreview, [online]. Retrieved from http://www.educause.edu/ero/article/learning-analytics-new-black
     boyd, d., & Crawford, K. (2011). Six provocations for Big Data. Retrieved from http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1926431
     boyd, d., & Crawford, K. (2012). Critical questions for Big Data. Information, Communication & Society, 15(5), 662-679.
     Chamayou, G. (n.d.). Every move will be recorded. [Web log post]. Retrieved from https://www.mpiwg-berlin.mpg.de/en/news/features/feature14
     Citron, D.K., & Pasquale, F. (2014). The scored society: Due process for automated predictions. Retrieved from http://ssrn.com/abstract=2376209
     Danaher, J. (2014, January 6). Rule by algorithm? Big Data and the threat of algocracy. [Web log post]. Retrieved from http://philosophicaldisquisitions.blogspot.com/2014/01/rule-by-algorithm-big-data-and-threat.html
     Deleuze, G. (1992). Postscript on the societies of control. October, 59, 3-7.
     Diakopoulos, N. (2014). Algorithmic accountability. Digital Journalism. DOI: 10.1080/21670811.2014.976411
     Gitelman, L. (Ed.). (2013). “Raw data” is an oxymoron. London, UK: MIT Press.
     Gray, J. (2004). Heresies: Against progress and other illusions. London, UK: Granta Books.
     Henman, P. (2004). Targeted!: Population segmentation, electronic surveillance and governing the unemployed in Australia. International Sociology, 19, 173-191.
  56. References and additional reading (cont.)
     Kitchin, R. (2013). Big data and human geography: opportunities, challenges and risks. Dialogues in Human Geography, 3, 262-267. DOI: 10.1177/2043820613513388
     Knox, D. (2010). Spies in the house of learning: a typology of surveillance in online learning environments. Paper presented at Edge, Memorial University of Newfoundland, Canada, 12-15 October.
     Kranzberg, M. (1986). Technology and history: “Kranzberg’s laws”. Technology and Culture, 27(3), 544-560.
     Lagoze, C. (2014). Big Data, data integrity, and the fracturing of the control zone. Big Data & Society (July-December), 1-11.
     Mayer-Schönberger, V. (2009). Delete: The virtue of forgetting in the digital age. Princeton, NJ: Princeton University Press.
     Mayer-Schönberger, V., & Cukier, K. (2013). Big data. London, UK: Hachette.
     Morozov, E. (2013a, October 23). The real privacy problem. MIT Technology Review. Retrieved from http://www.technologyreview.com/featuredstory/520426/the-real-privacy-problem/
     Morozov, E. (2013b). To save everything, click here. London, UK: Penguin Books.
     Napoli, P. (2013). The algorithm as institution: Toward a theoretical framework for automated media production and consumption. In Media in Transition Conference (pp. 1-36). DOI: 10.2139/ssrn.2260923
     Pasquale, F. (2015). The black box society. Cambridge, MA: Harvard University Press.
     Prinsloo, P. (2009). Modelling throughput at Unisa: The key to the successful implementation of ODL. Retrieved from http://uir.unisa.ac.za/handle/10500/6035
  57. References and additional reading (cont.)
     Prinsloo, P., & Slade, S. (2014). Educational triage in open distance learning: Walking a moral tightrope. The International Review of Research in Open and Distributed Learning, 15(4), 306-331. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/1881/3060
     Prinsloo, P., & Slade, S. (2015, March). Student privacy self-management: implications for learning analytics. In Proceedings of the Fifth International Conference on Learning Analytics And Knowledge (pp. 83-92). ACM. Retrieved from http://dl.acm.org/citation.cfm?id=2723585
     Rambam, S. (2008). Privacy is dead. Get over it. Retrieved from https://www.youtube.com/watch?v=Vsxxsrn2Tfs&index=1&list=PL8C71542205AA51E5
     Rosen, J. (2010, July 21). The web means the end of forgetting. New York Times [Online].
     Rosenbaum, R. (1995, January 15). The great Ivy League nude posture photo scandal. The New York Times. Retrieved from http://www.nytimes.com/1995/01/15/magazine/the-great-ivy-league-nude-posture-photo-scandal.html
     Selwyn, N. (2014). Distrusting educational technology: Critical questions for changing times. New York, NY: Routledge.
     Subotzky, G., & Prinsloo, P. (2011). Turning the tide: A socio-critical model and framework for improving student success in open distance learning at the University of South Africa. Distance Education, 32(2), 177-193.
     Tene, O., & Polonetsky, J. (2013). Judged by the Tin Man: Individual rights in the age of Big Data. Journal on Telecommunications and High Technology Law, 11, 351.
  58. References and additional reading (cont.)
     Therborn, G. (Ed.). (2006). Inequalities of the world: New theoretical frameworks, multiple empirical approaches. London, UK: Verso Books.
     Totaro, P., & Ninno, D. (2014). The concept of algorithm as an interpretive key of modern rationality. Theory, Culture & Society, 31, 29-49. DOI: 10.1177/0263276413510051
     Wagner, E., & Ice, P. (2012, July 18). Data changes everything: delivering on the promise of learning analytics in higher education. EDUCAUSEreview, [online]. Retrieved from http://www.educause.edu/ero/article/data-changes-everything-delivering-promise-learning-analytics-higher-education
     Watters, A. (2013, October 13). Student data is the new oil: MOOCs, metaphor, and money. [Web log post]. Retrieved from http://www.hackeducation.com/2013/10/17/student-data-is-the-new-oil/
     Watters, A. (2014). Social justice. [Web log post]. Retrieved from http://hackeducation.com/2014/12/18/top-ed-tech-trends-2014-justice
     Wigan, M.R., & Clarke, R. (2013). Big data’s big unintended consequences. Computer, (June), 46-53.
