A hands-on approach to digital tool criticism: Tools for (self-)reflection

Jan. 4, 2019

  1. A Hands-on Approach to Digital Tool Criticism: Tools for (self-)Reflection
     Marijn Koolen (KNAW Humanities Cluster)
     Jasmijn van Gorp (Utrecht University)
     Jacco van Ossenbruggen (Centrum Wiskunde & Informatica, VU Amsterdam)
     Digital Hermeneutics in History: Theory and Practice, University of Luxembourg, 25 October 2018
  2. Motivation
     Setting:
     ● Many datasets and tools are available and used in (digital) humanities research
     ● Methods (hand)books in the humanities are primarily aimed at analogue research → not yet up to date with digital research
     Questions:
     ● Which methodological steps do we have to take to use digital tools and data responsibly?
     ● To what extent do these steps differ from the steps taken in the pre-digital era?
       ○ How to align tool impact with the user's intentions
       ○ How to reach a shared understanding of tools in methodology
     Goal: develop and test a method for performing Digital Tool Criticism
  3. Overview
     - Digital Tool Criticism
     - Workshop Format and Findings
     - Role of Reflection
     - Entanglement of Data and Tools
  4. Digital Tool Criticism
     Definition: With digital tool criticism we mean the reflection on the role of digital tools in the research methodology and the evaluation of the suitability of a given digital tool for a specific research goal. The aim is to understand the impact of any limitation of the tool on the specific goal, not to improve a tool's performance. That is, ensuring that, as a scholar, you are aware of the impact of a tool on research design, methods, interpretations and outcomes. (Koolen, van Gorp & van Ossenbruggen, 2018)
  5. Digital Tools in Research Design
     Model taken from Maxwell (2013).
     Trevor Owens (2014): research design is an iterative process; it can start from goals, questions, methods, ...
  6. Guiding Questions
     ● Starting point: (digital) source criticism
       ○ Method/approach in the humanities, and specifically in historical research (cf. Fickers, 2012)
       ○ Internal source criticism: the content of the text
       ○ External source criticism: the metadata of the text (context)
         ■ Who created the text?
         ■ What kind of document is it?
         ■ Where was it made and distributed?
         ■ When was it made?
         ■ Why was it made?
     ● Digital Tool Criticism
       ○ What makes digital tool criticism different from digital source criticism?
       ○ Tool hermeneutics: what was the tool's intended use? Does that align with my intended use? How does it affect the digital sources/data it operates on?
  7. Workshop Format
     ● Do digital research experiments in groups, keep track of findings
       ○ Participants log/write down their steps and choices
       ○ Present and discuss findings at the end of the workshop
     ● Experiments on one or more research phases
       ○ E.g. exploration, gathering, analysis, synthesis
       ○ Limited number of digital tools (with overlap)
       ○ Research topic/theme:
         ■ Broad when focusing on single research phases with one or two tools
         ■ Narrow when analyzing tool criticism across research phases
     ● Meta-discussion
       ○ General findings
       ○ Lessons learned and suggestions for next steps
  8. DTC Workshops
     ● DH Benelux 2017 workshop:
       ○ Single research phase: exploration
       ○ Research theme: discourse around migration and refugees in broadcast media and politics
       ○ Two experiments: 1) tools to explore, 2) tools to gather resources
     ● DH Benelux 2018 workshop:
       ○ Multiple phases: exploration, gathering, analysis
       ○ Research theme: representation of labour immigration in Dutch media from the 1950s to the 1980s
       ○ Two experiments: 1) tools to explore and gather, 2) tools to do analysis
       ○ Create and compare Research DNA visualizations
  9. Workshop Findings
     Three parts:
     1. General trends in the research process
     2. Impact of data and tools on research questions
     3. Meta-discussion
  10. Findings 1: General Trends in the Research Process
      We analysed the collaborative notes of the individual groups and colour-coded them based on 5 aspects: research question, method, tool, dataset, reflection.
  11. Coding Collaborative Notes
      Colour-coded the group notes based on 5 aspects: research question, method, tool, dataset, reflection.
  12. Coding Collaborative Notes
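As a rough illustration of this coding step, the sketch below tags free-text workshop notes with the five aspects via simple keyword matching. The keyword lists and the tag_note helper are illustrative assumptions, not the codebook actually used in the workshops.

```python
# Hypothetical sketch: tag free-text workshop notes with the five aspects
# (research question, method, tool, dataset, reflection) by keyword matching.
# The keyword lists below are illustrative assumptions, not the workshop codebook.

ASPECT_KEYWORDS = {
    "research question": ["question", "hypothesis", "rq"],
    "method": ["method", "approach", "step"],
    "tool": ["tool", "interface", "search engine"],
    "dataset": ["data", "dataset", "corpus", "archive"],
    "reflection": ["limitation", "bias", "assume", "why"],
}

def tag_note(note: str) -> list[str]:
    """Return every aspect whose keywords occur in the note (case-insensitive)."""
    text = note.lower()
    return [aspect for aspect, words in ASPECT_KEYWORDS.items()
            if any(word in text for word in words)]

notes = [
    "Refined our research question after the first search",
    "The tool's interface hides how results are ranked",
    "Dataset coverage stops in 1979, adjusted the period",
]
for note in notes:
    print(tag_note(note), "-", note)
```

In practice a human coder resolves ambiguous notes; a keyword pass like this would only give a first approximation to discuss in the group.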
  13. Research DNA Visualizations
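One hedged way to produce a Research DNA visualization from such coded notes: render the sequence of aspect codes as a coloured strip, one segment per note, in the order the notes were taken. The colour palette and the sample sequence below are assumptions for illustration, not the visualization used at the workshops.

```python
import matplotlib.pyplot as plt

# Illustrative colour scheme for the five aspects (an assumption, not the
# workshop's palette) and a made-up coded note sequence for one group.
COLOURS = {"research question": "tab:red", "method": "tab:orange",
           "tool": "tab:blue", "dataset": "tab:green",
           "reflection": "tab:purple"}
sequence = ["research question", "tool", "dataset", "reflection",
            "method", "tool", "reflection", "research question"]

# Draw one coloured segment per note, left to right in session order.
fig, ax = plt.subplots(figsize=(8, 1.5))
for i, aspect in enumerate(sequence):
    ax.barh(0, 1, left=i, color=COLOURS[aspect], edgecolor="white")
ax.set_yticks([])
ax.set_xlabel("note order during the session")
ax.set_title("Research DNA: aspects of one group's notes over time")
plt.tight_layout()
plt.show()
```

Comparing such strips across groups makes it visible, for example, whether a group reflects continuously or only at the end of a session.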
  14. Findings 2: Impact of Data and Tools on RQs
      ● First steps in this exploratory phase:
        ○ All groups use rapid searches
          ■ Establish the suitability of data/tool for a certain line of inquiry
          ■ Many dead ends: a search reveals the data is limited or the tool lacks functionality
          ■ Iteratively adjusted questions, tools and data selections
        ○ Reflection on tools affects research questions
      ● Once questions, tools and data are aligned:
        ○ Exploration continues in a specific direction
        ○ Groups use the same strategy to refine questions and hypotheses
        ○ In line with findings by Solberg (2012, p. 64)
        ○ Continuous reflection is required to keep the alignment!
  15. Findings 3: Meta-Discussion
      ● Collaborative reflection
        ○ Explicit discussion and questioning encourages reflection
        ○ Prompts questions that otherwise would not have been asked
        ○ Note taking helps understand the process and its choice points
      ● Data literacy
        ○ Need to understand how the data is structured
        ○ "Give us the raw data!": scholars often want direct access to the underlying data
        ○ Tools and data need better documentation
        ○ Tools and data are hard to critique separately
      ● Workshop format
        ○ (Collaborative) reflection-in-action increases awareness of tool impact and choices
        ○ Effective to group tool users and tool builders together
  16. Need to Integrate Reflection in Methods
      We have to reflect on how digital tools organize, access and analyse our materials.
      "Late 19th and early 20th century scholarship was dominated not by big ideas, but by methodological refinement and disciplinary consolidation. Denigrated in the later 20th century as unworthy of serious attention by scholars, the 19th and early 20th century, by contrast, took activities like philology, lexicology, and especially bibliography very seriously. Serious scholarship was concerned as much with organizing knowledge as it was with framing knowledge in an ideological construct." (Scheinfeldt 2008)
  17. Model: Reflection as Integrative Practice
      Koolen, van Gorp & van Ossenbruggen (2018). Toward a model for digital tool criticism: Reflection as integrative practice. Digital Scholarship in the Humanities. https://academic.oup.com/dsh/advance-article/doi/10.1093/llc/fqy048/5127711
  18. Research Design as Wicked Problem
      ● Wicked problem
        ○ Design theory concept: a problem that is inherently ill-defined (Rittel 1967)
        ○ Working towards a solution changes the nature of the problem
      ● Humanities research is designed iteratively (Bron et al. 2016)
        ○ Impossible to predict where an investigation takes you
        ○ Engagement with the research materials shifts the goal posts
        ○ This affects the appropriateness of the design for the RQ
      ● Uncritical use of digital tools exacerbates the problem
        ○ Graphical User Interfaces (GUIs) often hide relevant data transformations and manipulations
        ○ This requires an active, reflective attitude
  19. Role of Reflection
      ● Reflection in action
        ○ The process is often unpredictable and uncertain (Schön 1983, p. 40)
        ○ Some actions, recognitions and judgements we carry out spontaneously, without thinking about them (p. 54)
        ○ Use reflection to criticize the tacit understanding that has grown from repetitive experiences (p. 61)
      ● This fits certain aspects of scholarly practice
        ○ E.g. searching, browsing and selecting using various information systems (digital archives and libraries, catalogues and other databases)
        ○ But information systems already apply pre-selection, which is rarely well documented (digital source criticism!)
  20. Reflective Tools and Methods
      ● How can tools include components that encourage reflection?
        ○ Add "About" pages and documentation on data pre-processing and algorithmic choices
        ○ Add functionality for analysing data quality
        ○ Visualize missing and erroneous data values, provide statistics in aggregate overviews (see the sketch below)
        ○ Use pop-up boxes to ask users to question what they see and do
      ● Methods
        ○ Research DNA visualizations
        ○ Analyse transaction log data
        ○ Make tool evaluation results available to end users, not just to developers
        ○ Also evaluate bias, not just error rates
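As a minimal sketch of the kind of aggregate data-quality overview proposed above, the pandas snippet below reports per-field missing-value fractions and flags implausible values. The column names and values are invented for illustration; a real tool would compute this over its own collection metadata.

```python
import pandas as pd

# Toy metadata sample with the gaps and errors a reflective tool should surface;
# the columns and values are invented for illustration only.
records = pd.DataFrame({
    "title": ["Item A", "Item B", None, "Item D"],
    "year": [1954, 2099, 1978, None],          # 2099 is clearly erroneous
    "broadcaster": ["NOS", None, "VARA", "NOS"],
})

# Aggregate overview: what fraction of each field is missing?
missing = records.isna().mean().rename("fraction_missing")
print(missing)

# Flag implausible values so users can question what they see.
suspect_years = records[(records["year"] < 1900) | (records["year"] > 2025)]
print(suspect_years)
```

Surfacing such an overview in the interface, rather than only in developer logs, lets scholars judge whether the gaps matter for their specific question.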
  21. Digital Source & Tool Criticism Canvas (v0.1)
      Jacco van Ossenbruggen, CWI. DOI: 10.5281/zenodo.1283308
      What is your research question? What is your research method?
      Data in: What existing input data sets do you need? (5W+H)
        Fitness: To what extent does this data fit your question and method?
      Tools: What tools do you need to access, transform and present this data? (5W+H)
        Fitness: To what extent does each tool fit your question and method?
      Data out: What data will your study generate and how should it be interpreted? (5W+H)
        Fitness: To what extent does your generated data actually answer your question?
      Tool bias: How have the tools influenced your method & generated data and overall outcome?
      Publish: How do you communicate these influences to your audience?
      Costs: What does it cost to make data and tools available to you (licensing, pay wall, renting compute resources, etc.)? Are data and tools compatible? Do data and tools need to be adapted otherwise for an adequate fit (data cleaning, software modifications)? How much time and other resources will be involved in preparation and running all computational steps? What are other disadvantages of using these data sets and tools with respect to your research question and method?
      Benefits: What are the advantages of using digital data & tools compared to more traditional methods? How much will other researchers benefit from your data, tools, method, experience?
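Since the canvas is essentially a structured questionnaire, it could also travel with a project's notes in machine-readable form. The rendering below is a hypothetical sketch: the prompts follow the canvas, but the dict structure, field names and example answer are assumptions for illustration.

```python
# Hypothetical machine-readable rendering of the canvas: each field maps to the
# guiding question(s) a researcher should answer. The structure itself is an
# illustrative assumption, not part of the published canvas.
CANVAS_V01 = {
    "research_question": "What is your research question?",
    "research_method": "What is your research method?",
    "data_in": "What existing input data sets do you need? (5W+H)",
    "data_in_fitness": "To what extent does this data fit your question and method?",
    "tools": "What tools do you need to access, transform and present this data? (5W+H)",
    "tools_fitness": "To what extent does each tool fit your question and method?",
    "data_out": "What data will your study generate and how should it be interpreted? (5W+H)",
    "data_out_fitness": "To what extent does your generated data actually answer your question?",
    "tool_bias": "How have the tools influenced your method, generated data and overall outcome?",
    "publish": "How do you communicate these influences to your audience?",
    "costs": "What does it cost to make data and tools available to you?",
    "benefits": "What are the advantages over more traditional methods?",
}

# A group fills in answers alongside the prompts and saves them with its notes.
answers = {field: "" for field in CANVAS_V01}
answers["research_question"] = "How was labour immigration represented in Dutch media, 1950s-1980s?"
for field, prompt in CANVAS_V01.items():
    print(f"{field}: {prompt}\n  -> {answers[field] or '(unanswered)'}")
```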
  22. Entanglement of Data and Tools
  23. Entanglement of Data and Tools
      Each step changes the underlying data!
  24. Data Scopes
      ● The importance of documenting the data manipulation process in digital history research (Hoekstra & Koolen 2018)
        ○ Many in-the-moment decisions are forgotten, and their consequences become invisible
        ○ Discuss activities in terms of: modelling, selecting, normalizing, linking, classifying
        ○ How to communicate/publish the data manipulation process (see the sketch below)
      ● Related to both digital tool criticism and digital source criticism
        ○ We should explore how they complement and strengthen each other
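A hedged sketch of how those in-the-moment decisions could be captured as a provenance log, with each step labelled by one of the five Data Scopes activities. The log_step helper and the example entries are hypothetical illustrations, not part of Hoekstra & Koolen's published work.

```python
import datetime

# Hypothetical provenance log: record each data manipulation under one of the
# five Data Scopes activities (modelling, selecting, normalizing, linking,
# classifying) so decisions stay visible and can be published with the results.
ACTIVITIES = {"modelling", "selecting", "normalizing", "linking", "classifying"}
provenance = []

def log_step(activity: str, description: str) -> None:
    """Append a timestamped, categorized data manipulation step to the log."""
    assert activity in ACTIVITIES, f"unknown activity: {activity}"
    provenance.append({
        "when": datetime.datetime.now().isoformat(timespec="seconds"),
        "activity": activity,
        "description": description,
    })

log_step("selecting", "Kept only newspaper articles from 1950-1980")
log_step("normalizing", "Lower-cased titles and stripped punctuation")
log_step("classifying", "Labelled articles mentioning 'gastarbeider' as labour migration")

for step in provenance:
    print(step["when"], step["activity"], "-", step["description"])
```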
  25. Model: Reflection as Integrative Practice
      Koolen, van Gorp & van Ossenbruggen (2018). Toward a model for digital tool criticism: Reflection as integrative practice. Digital Scholarship in the Humanities. https://academic.oup.com/dsh/advance-article/doi/10.1093/llc/fqy048/5127711
  26. References
      Bilgin, A., Hollink, L., van Ossenbruggen, J., Tjong Kim Sang, E., Smeenk, K., Harbers, F. & Broersma, M. (2018). Utilizing a Transparency-driven Environment toward Trusted Automatic Genre Classification: A Case Study in Journalism History. IEEE 14th International Conference on e-Science. (arXiv preprint arXiv:1810.00968)
      van Eijnatten, J., Pieters, T. & Verheul, J. (2013). Big Data for Global History: The Transformative Promise of Digital Humanities. BMGN - Low Countries Historical Review, 128(4), pp. 55-77. http://doi.org/10.18352/bmgn-lchr.9350
      Fickers, A. (2012). Towards a New Digital Historicism? Doing History in the Age of Abundance. VIEW Journal, 1(1). http://orbilu.uni.lu/bitstream/10993/7615/1/4-4-1-PB.pdf
      Hitchcock, T. (2013). Confronting the Digital: Or How Academic History Writing Lost the Plot. Cultural and Social History, 10(1), pp. 9-23. https://doi.org/10.2752/147800413X13515292098070
      Hoekstra, R. & Koolen, M. (2018). Data Scopes for Digital History Research. Historical Methods: A Journal of Quantitative and Interdisciplinary History, 51(3).
      Maxwell, J. (2013). Qualitative Research Design: An Interactive Approach, 3rd edition. SAGE Publications.
      Owens, T. (2014). Where to Start? On Research Questions in the Digital Humanities. http://www.trevorowens.org/2014/08/where-to-start-on-research-questions-in-the-digital-humanities/
      Putnam, L. (2016). The Transnational and the Text-Searchable: Digitized Sources and the Shadows They Cast. American Historical Review, 121(2), pp. 377-402.
      Scheinfeldt, T. (2008). Sunset for Ideology, Sunrise for Methodology? Found History. http://foundhistory.org/2008/03/sunset-for-ideology-sunrise-for-methodology/
      Solberg, J. (2012). Googling the Archive: Digital Tools and the Practice of History. Advances in the History of Rhetoric, 15, pp. 53-76.
  27. Thank You! Questions? Slides: http://bit.ly/DHiH-2018-ToolCrit