Doing Web Science Subjectivity - Clare J. Hooper


Talk at Web Science Montpellier Meetup - 13th May 2011


  1. Doing Web Science: Balancing Qualitative and Quantitative Methods, and Grappling with Subjectivity
  2. Subjectivity?
  3. Subjectivity and WebSci
     • The examination of web-mediated individual and group experiences
       – How do people experience the mobile web?
       – What about the social web?
       – How do we evaluate web design processes?
     • Experiences are subjective: how to model them?
     • Mixed methods help
  4. Why I am giving this talk
     1. Training often only covers half of the equation
     2. Understanding qualitative and quantitative methods doesn’t equate to being able to use them in conjunction
     3. Mixed methods (as in ‘lab-work and fieldwork’ as well as ‘qualitative and quantitative’) open up richer insights and multiple types of finding
  5. So…
     • We need to understand methods and how to combine them.
     • For example…
  6. Method          | Strengths                   | Weaknesses
     ----------------|-----------------------------|--------------------------------------------------------
     Lab-based study | Specific, controlled, broad | Artificial
     Case study      | Grounded in practice        | Lack of control; indirect data
     Questionnaire   | Breadth, efficiency         | Tough to design; can’t follow up interesting responses
     Interview       | Delve deep                  | Confirmation bias; time required
     Focus group     | Efficiency, delve deep      | Confirmation bias; individuals may dominate
     Expert review   | More objective insights     | Confirmation bias; finding experts
  7. Approaches to analysis
     • Quantitative analysis
     • Coding qualitative data
     • Meta-analysis
       – A follow-up study, e.g. an expert review to reflect on results from a lab study
       – A meta-analysis of data from multiple sources
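Coding qualitative data is usually checked for reliability by having two researchers code the same material independently and measuring their agreement. As a minimal illustration (not from the talk), here is Cohen’s kappa in plain Python; the coder names and category labels are entirely hypothetical:

```python
# Sketch: chance-corrected agreement between two hypothetical coders
# who independently labelled the same interview excerpts.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: fraction of items both coders labelled the same.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement from each coder's marginal label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Made-up labels for six excerpts about a web experience study.
coder_a = ["usability", "trust", "trust", "privacy", "usability", "trust"]
coder_b = ["usability", "trust", "privacy", "privacy", "usability", "trust"]
print(round(cohens_kappa(coder_a, coder_b), 2))  # → 0.75
```

A kappa well below 1 is a prompt to refine the coding scheme before drawing conclusions from the coded data.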
  8. Combining methods
     • Statistical analysis and qualitative coding: corroborate (‘triangulate’) findings for richer insights
     • Expert reviews: gain further insight into prior results (but think about verifying that data…)
     • Case studies: build on knowledge gained from lab studies, answer ‘how’ and ‘why’ questions
     • More methods == more certainty
  9. A problem?
     • There is a perceived ‘pressure’ in some fields to seek quantitative results, and to assume numbers == rigour.
     • (Also: are we doing stats badly? cacm/107125-stats-were-doing-it-wrong/fulltext)
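One common remedy raised by the ‘doing stats badly’ critique is to report an effect size alongside any significance test, so the numbers say how big a difference is, not just that it exists. A hedged sketch with entirely made-up task-completion times (not data from the talk):

```python
# Sketch: Cohen's d, a standardised mean difference between two groups,
# computed with a pooled standard deviation. Data are hypothetical.
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Standardised mean difference (group_a minus group_b)."""
    na, nb = len(group_a), len(group_b)
    pooled_var = ((na - 1) * stdev(group_a) ** 2 +
                  (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)
    return (mean(group_a) - mean(group_b)) / pooled_var ** 0.5

# Hypothetical task-completion times (seconds) for two web designs.
design_a = [34, 29, 41, 38, 30, 33]
design_b = [45, 39, 47, 42, 50, 44]
print(round(cohens_d(design_a, design_b), 2))
```

Here the negative d indicates design A was faster; reporting its magnitude alongside a p-value gives reviewers something substantive to judge.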
  10. Summary
      • Mixed methods isn’t as hard as it sounds!
      • Multiple perspectives; corroborate results; follow up interesting results
      • You need to
        – Know your methods
        – Know your methodology
      • We need to talk about education
  11. Some further reading
      • Mixed methods: multiple perspectives and triangulation
        – Shneiderman, B. and Plaisant, C. 2006. Strategies for Evaluating Information Visualization Tools: Multi-dimensional In-depth Long-term Case Studies. BELIV 2006, Venice, Italy.
        – Isomursu, M., Tahti, M., Vainamo, S. and Kuutti, K. 2007. Experimental evaluation of five methods for collecting emotions in field settings with mobile applications. International Journal of Human-Computer Studies, 65, 404-418.
      • Pressure for quantitative results
        – Shneiderman, B. 2007. Creativity Support Tools: Accelerating Discovery and Innovation. Communications of the ACM, 50, 20-32.
      • Rigour in quantitative and qualitative methods
        – Fallman, D. and Stolterman, E. 2010. Establishing Criteria of Rigor and Relevance in Interaction Design Research. create10, Edinburgh, UK.
      • Mixed methods in WebSci
        – Hooper, C.J. 2010. Towards Designing More Effective Systems by Understanding User Experiences. Doctor of Engineering thesis, University of Southampton.