
Making Sense of Statistics in HCI: part 4 - so what

So what? – making sense of results

http://alandix.com/statistics/course/

You have done your experiment or study and have your data – what next? How do you make sense of the results? In fact, one of the best ways to design a study is to imagine this situation before you start!

This part will address a number of questions to think about during analysis (or design), including:
  • Is your work testing an existing hypothesis (validation) or finding out what you should be looking for (exploration)?
  • Is it a one-off study, or part of a process (e.g. ‘5 users’ for iterative development)?
  • How do you make sure your results and data can be used by others (e.g. repeatability, meta-analysis)?
  • Looking at the data, and asking whether it makes sense given your assumptions (e.g. Fitts’ Law experiments that assume the index of difficulty is all that matters).
  • Thinking about the conditions – what have you really shown: some general result, or simply that one system or group of users is better than another?

Published in: Data & Analytics


  1. Making Sense of Statistics in HCI: From P to Bayes and Beyond – Alan Dix. Part 4 – So What? making sense of results. http://alandix.com/statistics/
  2. why are you doing it? exploration vs. validation; process vs. product
  3. [diagram] research: exploration – finding questions (ethnography, in-depth interviews, detailed observation; qualitative data); validation – answering them (big data, experiments, large-scale surveys; quantitative data); explanation – finding why and how (theoretical models, mechanism)
  4. [diagram] development: the design–build–test cycle; process vs. product; formative (make it better) vs. summative (does it work?)
  5. exploration / formative – find any interesting issues; stats are about deciding priorities. validation / summative – exhaustive (find all problems/issues); verifying (is the hypothesis true, does the system work); mensuration (how good, how prevalent). explanation – matching qualitative/quantitative, small/large samples
  6. are five users enough? The original work (Nielsen & Landauer, 1993) was about an iterative process, not summative evaluation – not for stats! How many? Enough to find plenty to do in the next development cycle – it depends on the size and complexity of the project. Nowadays, with cheap development, maybe n=1, but always test more in the next cycle. N.B. later work on saturation.
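The arithmetic behind the ‘five users’ result can be sketched as follows (a minimal sketch: p≈0.31 is the average problem-discovery rate Nielsen & Landauer reported, and the assumption that users find problems independently is the model’s, not a guarantee):

```python
def prop_found(n_users, p=0.31):
    """Expected proportion of usability problems found by n users,
    assuming each user independently hits any given problem with
    probability p (p = 0.31 is Nielsen & Landauer's reported average)."""
    return 1 - (1 - p) ** n_users

five = prop_found(5)   # ≈ 0.84 – the origin of "five users are enough"
one  = prop_found(1)   # ≈ 0.31
```

Note that the curve flattens quickly: each extra user in the same cycle finds fewer new problems, which is exactly why iterating beats one big test.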
  7. (image-only slide)
  8. look at the data – don’t just add up the numbers!
  9. look at the data: eyeball the raw data – are there anomalies or extreme values? does it match your model? but remember, randomness can be misleading – data is not truth!
  10. example: Fitts’s Law data. It is easy to jump straight to the index of difficulty (IoD – log of distance/size), but sometimes the data is more complex! A regression line of time against IoD was significant, but the IoD really hides details.
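A Fitts’s Law analysis of the kind described above typically fits movement time against the index of difficulty by simple regression. This sketch uses invented trial data and the Shannon formulation IoD = log2(D/W + 1); the slide’s warning applies – after fitting, eyeball the per-condition residuals rather than trusting the line:

```python
import math

# Hypothetical trials: (distance, width, movement time in ms) – invented numbers.
trials = [(64, 16, 420), (128, 16, 530), (256, 16, 650),
          (64, 8, 540), (128, 8, 660), (256, 8, 760)]

# Shannon formulation of the index of difficulty.
xs = [math.log2(d / w + 1) for d, w, _ in trials]
ys = [t for _, _, t in trials]

# Ordinary least squares: MT ≈ a + b * IoD.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
a = my - b * mx

# Don't stop at the fit – inspect residuals per distance/width condition
# for structure that the single IoD number may be hiding.
residuals = [y - (a + b * x) for x, y in zip(xs, ys)]
```

A significant slope says nothing about whether distance and width behave identically once folded into the IoD; grouping the residuals by condition is the cheapest way to check.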
  11. choice of baseline: helps you see small differences … but magnifies them. Good for rhetoric … but may be misleading – e.g. an old aspirin advert (bars at 57.92, 57.93, 57.94).
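The truncated-baseline effect can be quantified directly (57.92 and 57.94 echo the slide’s values; the 57.90 baseline is an invented illustration):

```python
a, b = 57.92, 57.94

# Bars drawn from zero: heights are nearly identical.
full_ratio = b / a   # ≈ 1.0003

# Bars drawn from a truncated baseline of 57.90: one bar now looks
# twice as tall as the other, though the data has not changed.
baseline = 57.90
trunc_ratio = (b - baseline) / (a - baseline)   # ≈ 2.0
```

The same numbers support either visual story, which is why the baseline choice belongs in the caption, not just the axis.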
  12. trough to peak and basepoint – where do you start? peak to trough, or annual average? https://www.ons.gov.uk/economy/governmentpublicsectorandtaxes/publicsectorfinance/timeseries/dzls/pusf
  13. (image-only slide)
  14. what have you really shown? stats are about the measure, but what does it measure?
  15. what have you really shown? • think about the conditions – are there other explanations for the data? • individual or population – a small number of groups/individuals with many measurements: significant statistics mean the effect is reliable for each individual, but are the individuals representative of all? • systems vs. properties
  16. a little story … at a BIG ACM conference, a ‘good’ empirical paper looked at collaborative support for a task X, with three pieces of software: A – domain-specific, synchronous; B – generic, synchronous; C – generic, asynchronous.
  17. the experiment: sensible quality measures, reasonable numbers of subjects in each condition, significant results (p<0.05): domain-specific > generic, asynchronous > synchronous. Conclusion: we really want asynchronous domain-specific software.
  18. what’s wrong with that? Interaction effects: the gap is interesting to study, but not necessarily good to implement. More important … if you blinked at the wrong moment … these are NOT independent variables – three different pieces of software is like an experiment on 3 people! Say system B was just bad: that alone would explain both B < A and B < C.
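The confound in the three-systems story can be shown with invented numbers: give the two factors no real effect at all and simply make system B a poor implementation – both headline comparisons still appear:

```python
# Invented 'true' quality scores: no effect of domain vs. generic or
# sync vs. async – system B just happens to be badly built.
quality = {"A": 7.0,   # domain-specific, synchronous
           "B": 4.0,   # generic, synchronous (the weak implementation)
           "C": 7.0}   # generic, asynchronous

domain_vs_generic = quality["A"] - quality["B"]   # +3.0: "domain beats generic"?
async_vs_sync     = quality["C"] - quality["B"]   # +3.0: "async beats sync"?
# Both apparent effects come solely from B. With one piece of software
# per cell, the design cannot distinguish the factor from the implementation.
```

This is the “experiment on 3 people” point in miniature: the software is the experimental unit, and there is only one per condition.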
  19. what went wrong? A borrowed psychology method … but the method embodies assumptions: a single simple cause in a controlled environment. Interaction needs ecologically valid experiments: multiple causes, open situations. What to do? Understand the assumptions and modify the method.
  20. (image-only slide)
  21. diversity – individual/task: good for, not just good
  22. don’t just look at the average! e.g. overall, system A has a lower error rate than system B, but … system B is better for experts.
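The average-vs-subgroup trap can be reproduced with invented error counts (the novice/expert split and all numbers are illustrative):

```python
# Hypothetical (errors, trials) per user group and system.
data = {
    "novice": {"A": (10, 100), "B": (30, 100)},
    "expert": {"A": (8, 100),  "B": (4, 100)},
}

def overall(system):
    """Pooled error rate across all groups."""
    errs = sum(data[g][system][0] for g in data)
    trials = sum(data[g][system][1] for g in data)
    return errs / trials

def rate(group, system):
    """Error rate within one group."""
    e, t = data[group][system]
    return e / t

# overall("A") = 0.09 < overall("B") = 0.17 – A looks better on average,
# yet rate("expert", "B") = 0.04 < rate("expert", "A") = 0.08 – B wins for experts.
```

Which system to recommend depends on who will use it, which is the slide’s point: report the breakdown, not just the pooled figure.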
  23. … and tasks too. e.g. the PieTree (an interactive circular treemap): the exploding pie chart is good for finding large things; the unfolding hierarchical text view is good for finding small things.
  24. more important to know who or what something is good for
  25. (image-only slide)
  26. mechanism – from what happens to how and why
  27. mechanism: quantitative and statistical methods tell you what is true of the end-to-end phenomena; qualitative and theoretical work tells you why and how – the mechanism.
  28. generalisation: empirical data at best interpolates; understanding the mechanism allows extrapolation and application in new contexts.
  29. example: mobile font size. An early paper on fonts in mobile menus reported a well-conducted experiment with statistically significant results, and its conclusion gave the best font size. But … a menu selection task includes: 1. visual search (better with big fonts); 2. if not found, scroll/page the display (better with small fonts); 3. when found, touch the target (better with big fonts). There is no single best size – the balance depends on menu length, etc.
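The three-stage trade-off can be illustrated with a toy cost model. Every constant below is invented; the only claim, matching the slide, is that the optimum font size shifts once menu length changes the balance between reading, touching, and scrolling:

```python
import math

def menu_time(font, n_items, screen=600, read=40.0, touch=300.0, page=10.0):
    """Toy cost model (all constants invented) of the slide's three stages:
    per-item reading and target-touching get easier as the font grows,
    but fewer items fit per screen, so more pages must be scrolled."""
    per_screen = max(1, screen // font)
    pages = math.ceil(n_items / per_screen)
    return n_items * read / font + touch / font + page * pages

fonts = range(10, 80, 5)
best_short = min(fonts, key=lambda f: menu_time(f, 8))   # short menu
best_long  = min(fonts, key=lambda f: menu_time(f, 40))  # long menu
# The optimum shifts: the longer menu favours a smaller font.
```

Under these made-up constants the short menu prefers the largest candidate font while the long menu prefers a smaller one – there is no single answer, only a balance that depends on menu length.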
  30. (image-only slide)
  31. complex issues – probability can be hard!
  32. the Monty Hall problem: 3 doors – one hides a car, two hide goats. The contestant chooses one (say door 1); Monty Hall opens one of the remaining doors, revealing a goat. Should the contestant change their mind? Even mathematicians get confused!! https://en.wikipedia.org/wiki/Monty_Hall_
  33. tip: make the numbers extreme. Imagine a million doors (one car) – you choose one; Monty Hall opens all the rest except one. Do you change?
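A short simulation settles the Monty Hall argument empirically (a sketch; by symmetry the contestant’s pick is fixed as door 0):

```python
import random

def monty_trial(switch, n_doors=3):
    """One game: the car is behind a random door; the contestant picks
    door 0; the host opens every other door except one, never revealing
    the car. Returns True if the contestant wins."""
    car = random.randrange(n_doors)
    pick = 0
    # The one door left closed besides the pick: the car's door if the
    # pick missed it, otherwise an arbitrary goat door.
    unopened = car if car != pick else 1
    return (unopened if switch else pick) == car

random.seed(1)
n = 100_000
stick = sum(monty_trial(False) for _ in range(n)) / n   # ≈ 1/3
swap  = sum(monty_trial(True) for _ in range(n)) / n    # ≈ 2/3
```

Passing `n_doors=1_000_000` realises the extreme-numbers tip directly: sticking wins only if the original one-in-a-million pick was right, so switching is almost certain to win.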
  34. lots of real examples! DNA in court: say DNA matching accuracy is 1 in 100,000. Case 1: a person is murdered after arguing with a friend; the friend matches DNA at the scene – convincing evidence? ✓ Case 2: a person is murdered and the only clue is DNA at the scene; a person is found by a police DNA database search – convincing evidence? ✗
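A back-of-envelope Bayes calculation shows why the two DNA cases differ so sharply (the 50% prior and the million-person database are invented illustrations, not forensic figures):

```python
# All numbers are illustrative assumptions.
match_rate = 1e-5   # chance a random innocent person matches the scene DNA

# Case 1: the suspect is identified by independent evidence (the argument),
# THEN DNA-tested. Suppose the other evidence alone gives a 50% prior.
prior = 0.5
posterior = prior / (prior + (1 - prior) * match_rate)
# ≈ 0.99999 – the match is strong confirmation.

# Case 2: the suspect is found only by trawling a database of a million people.
db_size = 1_000_000
expected_innocent_matches = db_size * match_rate   # about 10 innocent matches
posterior_trawl = 1 / (1 + expected_innocent_matches)
# ≈ 0.09 – roughly a one-in-eleven chance this is the right person,
# and the true culprit may not even be in the database.
```

The same 1-in-100,000 accuracy yields near-certainty in one case and near-irrelevance in the other; what changed is the prior, which is exactly the base-rate trap the slide points at.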
  35. (image-only slide)
  36. building for the future – adding to the discipline
  37. building for the future: • repeatability – comparisons are more robust than measures – RepliCHI • meta-analysis • data publishing and open science
  38. (image-only slide)
