Tools that Encourage Criticism - Leiden University Symposium on Tools Criticism


  1. Tools that Encourage Criticism: Digital Humanities Infrastructures and Research. Marijn Koolen, KNAW Humanities Cluster Research & Development. Symposium on Tools Criticism, Leiden University, 21-11-2019
  2. Tools & Reflection in Three Themes ● Digital Tool Criticism ○ Reflection on tool use ● Data Scopes ○ Conceptual framework for data transformations ● Documenting Research Process ○ Tracing tool interactions, choices and decisions
  3. Digital Tool Criticism
  4. Search Engines: Experts in Retrieval, Masters at Hiding ● Fast and easy to use ● Aim to hide technical details, but search fields are not a ‘technical’ detail ● Most interfaces not made with tool criticism in mind ○ But there are too many details! ○ Which should be brought to our attention?
  5. Communicating Choices and Options
  6. Reflection and Transparency in Tool Interfaces ● The inspection tool in the CLARIAH Media Suite is a first attempt ○ What are other ways to identify and flag issues in data and tools? ○ Input from researchers and developers needed! ● How do/can other often-used search interfaces deal with transparency? ○ Nederlab - research portal for linguistic analysis ○ Delpher - Dutch digitized newspaper archive ○ WorldCat ○ Europeana ○ HathiTrust Research Center ○ Google
  7. Tool Criticism as Reflection. Use of digital tools in the context of research: criticism should incorporate all elements of research, e.g. RQ, methods, data and tools (Koolen, van Gorp & van Ossenbruggen, DSH 2019). More later in Jasmijn van Gorp’s presentation.
  8. Critical Reflection and Technical Knowledge ● Switch from close to distant reading ○ Puts us in a new, less familiar environment and paradigm ○ Unknown pitfalls: what are the reflective questions that signal where to expect them? ● Perceived conflict of investing time ○ Acquiring new ‘technical’ skills vs. doing the ‘actual’ research ○ Self-defeating beliefs about one’s own possibilities to learn (Fook 2015, p. 445) ● What are ways to break out of this impasse?
  9. Digital Tool Criticism Workshops ● Tool Criticism through Collaboration and Experimentation ○ Workshops: Tool Crit. 2015, DH Benelux 2017, DH Benelux 2018, Utrecht University 2018 ● Collaboratively using tools prompts discussions ○ Face-to-face: collaboratively looking under the hood and its consequences ○ Explaining how you think it works is a great way to bring out gaps in your own understanding (Sloman and Fernbach 2017) ● Many research questions require a huge number of skills ○ Need to collaborate to ensure at least someone involved understands specific tool details ● Experimentation with tools to deepen understanding ○ “We learn differently when we are learning to perform than when we are learning to understand what is being communicated to us.” (Mezirow 1991, p. 1) ● But what level of skills do we need? ○ Ben Schmidt (2016): not algorithmic steps, but data transformations
  10. Data Scopes
  11. Data Scopes ● Data needs processing to offer insights for research questions ○ Making data transformations offers different perspectives or scopes on data ■ Often left out of publications, outsourced as “technical detail” ■ But details matter, process is intellectual effort! ● Data Scopes concept (Hoekstra and Koolen 2018) ○ Conceptual tool for thinking and communicating about research data processing ○ Especially for combining data from different sources
  12. Data Scopes ● Data needs processing to offer insights for research questions ○ Making data transformations offers different perspectives or scopes on data ■ Often left out of publications, outsourced as “technical detail” ■ But details matter, process is intellectual effort! ● Data Scopes concept (Hoekstra and Koolen 2018) ○ Conceptual tool for thinking and communicating about research data processing ○ Especially for combining data from different sources ● Five types of transforming activities: ○ Selecting ○ Modeling ○ Normalizing ○ Linking ○ Classifying ● A form of scholarly primitives (Unsworth 2000, Anderson et al. 2010)
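To make the five activities concrete, here is a minimal Python sketch, with invented toy records and lookup tables (not the project's actual data or code), showing each activity as an explicit, named step rather than one opaque cleaning pass:

```python
# Toy sketch of the five data scope activities as explicit, named steps.
# All records, identifiers and lookup tables below are invented.

raw_records = [
    {"site": "hebban", "title": " Komt Een Vrouw Bij De Dokter", "rating": "4"},
    {"site": "bol", "title": "komt een vrouw bij de dokter", "rating": None},
]

# Selecting: keep only records that actually have a rating
selected = [r for r in raw_records if r["rating"] is not None]

# Modeling: decide which fields make up a 'review' and give them explicit types
modeled = [{"site": r["site"], "title": r["title"], "rating": int(r["rating"])}
           for r in selected]

# Normalizing: map surface variation (case, whitespace) to a single form
for r in modeled:
    r["title"] = r["title"].strip().lower()

# Linking: connect each record to an external identifier (fake lookup table)
isbn_lookup = {"komt een vrouw bij de dokter": "978-xxxxxxxxxx"}
for r in modeled:
    r["isbn"] = isbn_lookup.get(r["title"])

# Classifying: assign each record to a category (made-up genre table)
genre_lookup = {"978-xxxxxxxxxx": "fiction"}
for r in modeled:
    r["genre"] = genre_lookup.get(r["isbn"], "unknown")

print(modeled)
```

Writing the steps out like this keeps every transformation visible and discussable, which is the point of the data scopes framework.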
  13. Use Case: Analyzing Online Book Reviews ● Online book response corpus (Boot 2017) ○ ~400,000 book reviews in Dutch ○ From different review sites (Bol, Hebban, Dizzie, Wat Lees Jij Nu, …) ● Research questions ○ What impact does reading fiction have on readers? ■ How do reviewers describe the impact of a book? ■ Are there differences across genres/authors?
  14. Reading Impact ● “Je gaat Stijn eigenlijk een beetje begrijpen, …” (“You start to understand Stijn, …”) ● “Helaas is de schrijfstijl bedroevend, presenteert Kluun zich als een pseudo-intellectueel …” (“Unfortunately the writing style is pathetic, Kluun presents himself as a pseudo-intellectual …”)
  15. Reading Impact Rules ● 349 rules ○ Identifying 4 types of impact: general, narrative, style, reflection ● Term: boeiend ○ Rule 2: Style impact: boeiend + style term - e.g. “boeiend taalgebruik” (engaging use of language) ○ Rule 3: Reflection: boeiend + topic term - e.g. “boeiende thematiek” (engaging themes) ● Phrase: in één ruk * uitlezen ○ Rule 79: General impact - “Ik heb het boek in één ruk helemaal uitgelezen.” (“I finished the whole book in one go.”) ○ Many variants ■ één/een/1 ■ adem/avond/dag/keer/middag/ruk/stuk/zucht/... ■ uitlezen/uit
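A rough sketch of how rules like these could be matched against lemmatized review sentences; the term lists and the classify_impact function below are invented stand-ins for illustration and do not reproduce the actual 349 rules:

```python
# Invented stand-ins for the rule components mentioned on the slide.
STYLE_TERMS = {"taalgebruik", "schrijfstijl", "stijl"}
TOPIC_TERMS = {"thematiek", "onderwerp", "thema"}

# Rule 79 ("in één ruk * uitlezen") has many surface variants; matching on
# lemma sets keeps the rule itself small.
EEN_VARIANTS = {"één", "een", "1"}
UNIT_VARIANTS = {"adem", "avond", "dag", "keer", "middag", "ruk", "stuk", "zucht"}
FINISH_VARIANTS = {"uitlezen", "uit"}

def classify_impact(lemmas):
    """Return the impact types matched by one lemmatized sentence."""
    matches = set()
    lemma_set = set(lemmas)
    if "boeiend" in lemma_set:
        if lemma_set & STYLE_TERMS:
            matches.add("style")       # e.g. "boeiend taalgebruik"
        if lemma_set & TOPIC_TERMS:
            matches.add("reflection")  # e.g. "boeiende thematiek"
    # Crude, order-insensitive check for "in één/een/1 <unit> ... uitlezen/uit"
    if ("in" in lemma_set and lemma_set & EEN_VARIANTS
            and lemma_set & UNIT_VARIANTS and lemma_set & FINISH_VARIANTS):
        matches.add("general")
    return matches

print(classify_impact(["ik", "hebben", "het", "boek", "in", "één", "ruk", "uitlezen"]))
# -> {'general'}
```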
  16. Data Scopes for Reading Impact Analysis (1/2) ● Gather review data ○ Select review sites ○ Model review from web page (book title, author, ISBN, reviewer, date, rating, website, …) ○ Link author and book title to WorldCat record (for missing data) ■ Select ISBN, publisher, publication year ○ Link ISBN to record in boek.nl database ■ Select genre classification (NUR code)
  17. Data Scopes for Reading Impact Analysis (2/2) ● Extract impact expressions ○ Select individual sentences from book reviews ○ Normalise words in sentences to their lemmas ○ Select all sentences that match an impact rule ○ Classify sentences by impact rule ● Analyse impact ○ Select impact matches by book genre or author or book ID or reviewer ID
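A compact sketch of how the steps on these two slides might be chained; the toy data, the match_impact_rules stub and the grouping by genre are assumptions for illustration, not the project's actual pipeline:

```python
from collections import defaultdict

# Invented toy data: reviews already linked to a genre (previous slide) and
# with sentences already normalized to lemmas.
reviews = [
    {"review_id": 1, "genre": "thriller",
     "sentences": [["wat", "een", "boeiend", "taalgebruik"]]},
    {"review_id": 2, "genre": "literaire roman",
     "sentences": [["ik", "lezen", "het", "boek", "in", "één", "ruk", "uit"]]},
]

def match_impact_rules(lemmas):
    # Stand-in for the real 349-rule matcher; returns matched impact types.
    matched = set()
    if "boeiend" in lemmas and "taalgebruik" in lemmas:
        matched.add("style")
    if {"in", "één", "ruk"} <= set(lemmas):
        matched.add("general")
    return matched

# Select sentences, classify them by impact rule, then select matches by genre.
impact_by_genre = defaultdict(list)
for review in reviews:
    for lemmas in review["sentences"]:                   # selecting sentences
        for impact_type in match_impact_rules(lemmas):   # classifying
            impact_by_genre[review["genre"]].append(impact_type)

print(dict(impact_by_genre))
# -> {'thriller': ['style'], 'literaire roman': ['general']}
```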
  18. Entanglement of Data and Tools
  19. Entanglement of Data and Tools: each step changes the underlying data!
  20. Documenting Research Process
  21. Documenting Tool Interactions ● We should develop/ask for tools that ○ Show or describe implementation choices ○ Allow data inspection ○ Self-document interactions or encourage documenting ● Examples ○ OpenRefine: spreadsheets on steroids, powerful data cleaning, enrichment and analysis ○ Notebooks: e.g. JupyterLab and JupyterHub ● Workshop Documenting Research Practices (DH Benelux 2019) ○ Research practice and teaching ○ 20 participants, none had a systematic approach ○ Assignments aimed at increasing awareness of reasons and methods
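One way a notebook-based workflow could document its own tool interactions is to wrap each transformation step in a small logger. This is a generic sketch with invented names (documented, research_log.jsonl), not a feature of OpenRefine or Jupyter:

```python
import datetime
import functools
import json

LOG_FILE = "research_log.jsonl"  # assumed log file name

def documented(step):
    """Decorator that appends a log entry for every call to a transformation step."""
    @functools.wraps(step)
    def wrapper(data, **params):
        result = step(data, **params)
        entry = {
            "step": step.__name__,
            "params": params,
            "records_in": len(data),
            "records_out": len(result),
            "timestamp": datetime.datetime.now().isoformat(timespec="seconds"),
        }
        with open(LOG_FILE, "a") as log:
            log.write(json.dumps(entry) + "\n")
        return result
    return wrapper

@documented
def select_reviews_with_rating(reviews, min_rating=1):
    """Keep only reviews that have a rating of at least min_rating."""
    return [r for r in reviews if r.get("rating", 0) >= min_rating]

reviews = [{"rating": 4}, {"rating": 2}, {}]
kept = select_reviews_with_rating(reviews, min_rating=3)
# research_log.jsonl now records which step ran, with which parameters,
# and how many records went in and came out.
```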
  22. Data Has No Memory ● Through linking, our review dataset now has a complete set of ISBNs ○ Allows comparing reviews of different editions of a book ○ E.g. does a plain edition affect readers differently from a critical edition? ● Each edition has its own ISBN ○ Modeling: group reviews by ISBN, group ISBNs by title+author or NTSC
  23. Data Has No Memory ● Through linking, our review dataset now has a complete set of ISBNs ○ Allows comparing reviews of different editions of a book ○ E.g. does a plain edition affect readers differently from a critical edition? ● Each edition has its own ISBN ○ Modeling: group reviews by ISBN, group ISBNs by title+author or NTSC ● Banana peel: we’ve hidden uncertainty! ○ Some reviews don’t specify an ISBN (we looked them up separately) ○ So we don’t know which edition is reviewed ○ But the transformed dataset implies we do! ● Possible solution: add provenance info on data and process ○ How can tools help with this?
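A minimal sketch of what such provenance could look like on the transformed records, so the hidden uncertainty about editions stays visible; the field names (isbn, isbn_source) and the ISBN values are assumptions for the example, not the project's actual data model:

```python
# Record *where* each ISBN came from when enriching the dataset, so that a
# later grouping by ISBN does not silently imply the review specified it.

reviews = [
    {"review_id": 1, "isbn": "9781234567897"},  # fake ISBN stated in the review
    {"review_id": 2, "isbn": None},             # edition unknown
]

def add_isbn_provenance(review, looked_up_isbn=None):
    if review["isbn"] is not None:
        review["isbn_source"] = "stated_in_review"
    elif looked_up_isbn is not None:
        review["isbn"] = looked_up_isbn
        review["isbn_source"] = "looked_up"      # the edition is an assumption
    else:
        review["isbn_source"] = "unknown"
    return review

enriched = [add_isbn_provenance(reviews[0]),
            add_isbn_provenance(reviews[1], looked_up_isbn="9781234567903")]

# Downstream analyses can now filter or flag reviews whose edition was inferred.
uncertain = [r for r in enriched if r["isbn_source"] != "stated_in_review"]
print(uncertain)
# -> [{'review_id': 2, 'isbn': '9781234567903', 'isbn_source': 'looked_up'}]
```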
  24. OpenRefine
  25. Wrap Up ● Pragmatic approach to discuss transparency in DH research and infrastructure ○ Digital Tool Criticism: reflection, checklist + questions ○ Data Scopes: understanding data transformations in the research process ○ Documenting Research Practices: data has no memory ● Infrastructure should ○ Invite us to collaborate, experiment, question, reflect ○ Reveal and document transformations
  26. Humanities scholar at the CLARIAH Toogdag: “Our students are too stupid to write queries in a structured query language!”
  27. Acknowledgements ● A lot of this work is a collaboration with: ○ Rik Hoekstra ○ Jasmijn van Gorp ○ Jacco van Ossenbruggen ○ Antske Fokkens ○ Liliana Melgar ○ Peter Boot ○ Ronald Haentjens Dekker ○ Marijke van Faassen ○ Lodewijk Petram ○ Jelle van Lottum ○ Marieke van Erp ○ Adina Nerghes ○ Melvin Wevers
  28. Thank You! Questions?
  29. References
      Anderson, S., Blanke, T. and Dunn, S. 2010. Methodological commons: arts and humanities e-Science fundamentals. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 368(1925), pp. 3779-3796.
      Boot, P. 2017. A Database of Online Book Response and the Nature of the Literary Thriller. In: Digital Humanities 2017, Montreal, Conference Abstracts.
      Burke, T. 2011. How I Talk About Searching, Discovery and Research in Courses. May 9, 2011.
      Da, N.Z. 2019. The Computational Case against Computational Literary Studies. Critical Inquiry, 45(3), pp. 601-639.
      Drabenstott, K.M. 2001. Web Search Strategy Development. Online, 25(4), pp. 18-25.
      Fickers, A. 2012. Towards a New Digital Historicism? Doing History in the Age of Abundance. VIEW Journal, 1(1). http://orbilu.uni.lu/bitstream/10993/7615/1/4-4-1-PB.pdf
      Hitchcock, T. 2013. Confronting the Digital - Or How Academic History Writing Lost the Plot. Cultural and Social History, 10(1), pp. 9-23. https://doi.org/10.2752/147800413X13515292098070
      Hoekstra, R. and Koolen, M. 2018. Data Scopes for Digital History Research. Historical Methods: A Journal of Quantitative and Interdisciplinary History, 51(2).
  30. References
      Koolen, M., van Gorp, J. and van Ossenbruggen, J. 2018. Lessons Learned from a Digital Tool Criticism Workshop. Digital Humanities in the Benelux 2018 Conference.
      Putnam, L. 2016. The Transnational and the Text-Searchable: Digitized Sources and the Shadows They Cast. American Historical Review, 121(2), pp. 377-402.
      Schmidt, B. 2016. Do Digital Humanists Need to Understand Algorithms? Debates in the Digital Humanities, 2016 Edition.
      Sloman, S.A. and Fernbach, P.M. 2017. The Knowledge Illusion: Why We Never Think Alone. New York: Riverhead Books.
      Unsworth, J. 2000. Scholarly primitives: What methods do humanities researchers have in common, and how might our tools reflect this. In: Symposium on Humanities Computing: Formal Methods, Experimental Practice. King’s College, London.
      Vakkari, P. 2016. Searching as Learning: A systematization based on literature. Journal of Information Science, 42(1), pp. 7-18.
      Yakel, E. 2010. Searching and seeking in the deep web: Primary sources on the internet. In: Working in the Archives: Practical Research Methods for Rhetoric and Composition, pp. 102-118.
