Bm515 h8
  2. TERM PAPER SCHEDULE
     - December 19th
       - Set your topic/research question
       - Make a short literature review and write the draft version (2-3 pages)
       - Write the methodology
       - Turn in your draft: 10 pts
       - Late submission: -2 pts per day
     - January 9th
       - Paper submission
       - Late submission: -5 pts per day
  3. TERM PAPER
     - Survey
       - One approach is to choose an area of research, read all the relevant studies, and organize them in a meaningful way.
       - An example of an organizing theme is a conflict or controversy in the area, where you might first discuss the studies that support one side, then discuss the studies that support the other.
       - Another approach is to choose an organizing theme or a point that you want to make, then select your studies accordingly.
     - Empirical
       - Gather data on an HCI topic
  4. LITERATURE REVIEWS VERSUS RESEARCH ARTICLES
     - Literature reviews survey research done in a particular area. Although they also evaluate methods and results, their main emphasis is on knitting together theories and results from a number of studies to describe the "big picture" of a field of research.
     - Research articles, on the other hand, are empirical articles, specifically describing one or a few related studies. Research articles tend to focus on methods and results to document how a particular hypothesis was tested.
     - The introduction of a research article is like a condensed literature review that gives the rationale for the study that has been conducted.
  5. EMPIRICAL PAPER - STRUCTURE
     - Title
     - Abstract
     - Introduction (literature review & research question)
     - Method
     - Results
     - Discussion
     - Bibliography
  6. ABSTRACT
     - A one-paragraph summary
     - A statement of the objective/purpose of the investigation
     - Description of participants
     - Brief description of what participants did
     - Summary of findings
  7. INTRODUCTION
     - Literature review
       - Background & rationale (previous research, what it found, what it identifies as possible issues/questions)
       - Use EBSCO, ScienceDirect, SCI, SSCI, etc.
     - Statement of purpose
       - "The current study was conducted to evaluate the effect of X on Y" ... or ... "to find out what factors lead to Z" or "to determine the relationship between A and B"
  8. METHOD
     - Enough detail for a reader to replicate
     - Who participated (number, characteristics, volunteer or randomly selected)
     - What materials were employed (systems, questionnaires - design, validity, and reliability)
     - What data were collected (dependent variables, i.e., scores, ratings, responses)
     - What participants were required to do (where, who, sequence of events - include instructions & tasks)
  9. RESULTS
     - How have the data been treated?
     - Text and graphs
     - Statistics - descriptive/inferential
     - Summarize the results
  10. INDEPENDENT AND DEPENDENT VARIABLES
     - Independent - what the experimenter does to the subject, e.g., exposure to an interface, training, or a mental model, or selection by age or gender
       - Levels & controls
     - Dependent - any behavior/performance/attitude of the subject which is measured as the 'outcome', e.g., scores on a test, type of knowledge
  11. DISCUSSION
     - Interpretation
       - What do the results mean in terms of your original question?
       - Why do you think they turned out like this?
     - Critique your study (limitations) and recommend improvements
     - Suggestions for further research
  12. CREDIBILITY OF THE STUDY
     - Definition of the construct being measured
     - Congruence between method & question
     - Measurement
       - Bias: instructions & instrument (wording), administration
       - Reliability (stability/decision consistency)
         - The degree to which an instrument measures the same way each time it is used under the same conditions with the same subjects
       - Validity
         - The strength of our conclusions, inferences, or propositions
         - Were we right?
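Test-retest reliability, as defined above, is often quantified as the correlation between two administrations of the same instrument to the same subjects. A minimal sketch, using made-up scores for illustration (the data and function name are assumptions, not from the lecture):

```python
# Sketch: test-retest reliability as the Pearson correlation between two
# administrations of the same instrument under the same conditions.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

test1 = [12, 15, 11, 18, 14]   # first administration (hypothetical scores)
test2 = [13, 14, 12, 17, 15]   # same subjects, same conditions, later

r = pearson_r(test1, test2)
print(f"test-retest reliability r = {r:.2f}")  # a value near 1.0 suggests a stable instrument
```

A reliability coefficient close to 1.0 indicates the instrument measures consistently; values much lower suggest the scores are unstable across administrations.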
  13. PILOT YOUR METHOD
     - Try your method before capturing data for real
     - Ask friends to answer your survey, take your test, perform your experiment, etc.
     - Look for issues that confuse them (or you!) - modify accordingly
  14. STATISTICS (IF YOU CHOOSE A QUANTITATIVE APPROACH)
     - Descriptive/summary stats are sufficient
       - Mean (median, mode)
       - Range
       - Standard deviation (if you know how)
     - Run tests if you are comfortable
     - Provide tabulated raw data if possible (put in appendices if large)
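The summary statistics listed above can all be produced with Python's standard library. A minimal sketch; `task_times` is hypothetical data (seconds each participant needed for a task), not from the lecture:

```python
# Sketch: the descriptive statistics the slide lists (mean, median, mode,
# range, standard deviation), computed with the stdlib `statistics` module.
import statistics

task_times = [42, 55, 48, 55, 61, 39, 55, 50]  # hypothetical seconds per participant

print("mean   :", statistics.mean(task_times))
print("median :", statistics.median(task_times))
print("mode   :", statistics.mode(task_times))
print("range  :", max(task_times) - min(task_times))
print("stdev  :", round(statistics.stdev(task_times), 2))  # sample standard deviation
```

Reporting all of mean, median, and mode guards against a few slow participants skewing the picture, which is common with task-time data.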
  15. COMMON PITFALLS
     - Rambling, unfocused style
       - Keep a question in mind as you write
     - All claims and opinions, no evidence
       - Cite literature that supports your argument
     - Misses relevant topics from class
       - Try to see how the readings and lectures fit
  16. DETERMINANTS OF USABILITY RATING
     [Diagram: the design process feeds a usability rating, which is determined by effectiveness, efficiency, and satisfaction]
  17. PLAN A USABILITY EVALUATION
     Describe:
     - What data will you collect?
     - What will these data tell you?
     - What data collection methods will you employ?
     - How long will it take to produce the test results?
     - What form of feedback will you provide?
     List advantages/disadvantages of this plan
  18. THOUGHT ACTIVITY
     - VESTEL is designing an interface for its new VCD product (including the remote control) for the Turkish market, which aims to be the 'most usable' on the market - how would you test this claim?
     - ASELSAN wishes to test its new weapon control system. You are charged with designing the test.
     - An educational software company has developed an educational game for use in elementary schools - they want to know if it is usable/enjoyable.
     - The Municipality of Ankara is producing an information booth for use in visitor centers, ASTI, etc., offering information and advice to visitors. They ask for a usability test. (Zero learning time)
  19. SETTING USABILITY CRITERIA
     "Product X is usable to the extent that 70% of users, with no additional training, can perform all tasks with 95% accuracy, 25% faster than with the existing application, and report at least equivalent satisfaction"
  20. OR...
     "Product X is usable to the extent that 80% of users, with 2 days of training, can perform 90% of routine tasks with >90% accuracy, as efficiently as with the existing application, and report increased satisfaction"
  21. INSTEAD OF...
     "Product X is usable" (a meaningless statement for HCI)
     "This new application is more usable than the old application" (begs the question: "More usable in what sense? And for whom? And where?")
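The point of a measurable criterion like the first example above is that it can be checked mechanically against collected data. A minimal sketch of that check; the thresholds mirror the slide's 70%/95%/25%-faster criterion, while the `results` list and all names are illustrative assumptions:

```python
# Sketch: turning the slide's measurable usability criterion into an
# automatic pass/fail check over per-user measurements.

CRITERIA = {
    "min_success_share": 0.70,  # at least 70% of users must meet the bar below
    "min_accuracy": 0.95,       # each such user: >= 95% task accuracy
    "max_time_ratio": 0.75,     # and >= 25% faster than the existing application
}

# (accuracy, time_new / time_old) per untrained user -- hypothetical results
results = [(0.97, 0.70), (0.96, 0.74), (0.91, 0.60), (0.98, 0.72), (0.95, 0.72)]

def meets_criterion(results, c=CRITERIA):
    """True if enough users hit both the accuracy and the speed-up targets."""
    passing = [
        (acc, ratio) for acc, ratio in results
        if acc >= c["min_accuracy"] and ratio <= c["max_time_ratio"]
    ]
    return len(passing) / len(results) >= c["min_success_share"]

print("Product X meets the usability criterion:", meets_criterion(results))
```

Unlike "Product X is usable", this formulation says exactly in what sense, for whom, and against what baseline the claim holds.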
  22. WHO SETS THE USABILITY CRITERIA?
     - Purchasers
       - Can be the basis for a contract
     - Designers
       - Basis for design targets
     - Evaluators
       - Provide context/limits of generalization for evaluation
     - Users
       - Key stakeholders with privileged knowledge
  23. HOW ARE CRITERIA DERIVED?
     - User analysis
     - Task analysis
     - Situation analysis
  24. OUTPUT
     - Scenarios of use
       - "Stories" of interaction in which users, tasks, and contexts are described
     - Scenarios form the basis of decisions on
       - Effectiveness
       - Efficiency
       - Satisfaction
  25. FRAMEWORK FOR USABILITY EVALUATION
     - Approach and Type
       - Approach refers to the source of data
         - User, Expert, or Model
       - Type refers to the purpose of evaluation
         - Diagnostic (formative) or metrication (summative)
     - Any evaluation method is a combination of approach and type
  26. EVALUATION APPROACH
     The approach defines the source of the data, i.e., where does the evaluator gain the data about usability?
     - From real users? (User-based)
     - From usability experts or self-evaluation? (Expert-based)
     - From the application of a formal theory or model? (Model-based)
  27. CONDUCTING YOUR TEST: THINGS TO CONSIDER
     - How many users?
     - Length of test session?
     - Where to conduct the session?
     - Role of facilitator:
       - Put participant(s) at ease (you are testing the material, not them)
       - Observe and take notes
       - Do not intervene or assist
     - Role, placement, and responsibilities of other observers
     - Verbal protocol ("think-aloud")
     - Token reward for participation (if appropriate)
  28. DATA COLLECTION
     - Quantitative data
       - Number of errors made using the system
       - Time required for each activity
     - Qualitative data
       - Ease of use - are materials convenient, easy to locate, easy to use?
       - Learners' reactions to materials, activities, evaluation
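The two quantitative measures above (errors and time per activity) are simple to capture in a small log. A minimal sketch; the class, field names, and sample task are assumptions for illustration:

```python
# Sketch: a minimal per-task log of the quantitative measures the slide
# names -- errors made and time required for the activity.
import time
from dataclasses import dataclass

@dataclass
class TaskRecord:
    task: str             # short task description
    errors: int = 0       # error count, incremented by the observer
    start: float = 0.0    # monotonic clock at task start
    duration: float = 0.0 # seconds the participant needed

    def begin(self):
        self.start = time.monotonic()

    def finish(self):
        self.duration = time.monotonic() - self.start

rec = TaskRecord("find the help page")  # hypothetical task
rec.begin()
# ... participant works; the facilitator increments `errors` on each slip ...
rec.errors += 1
rec.finish()
print(f"{rec.task}: {rec.errors} error(s) in {rec.duration:.1f}s")
```

Using a monotonic clock (rather than wall-clock time) keeps durations correct even if the system clock is adjusted mid-session.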
  29. ANALYZING & REPORTING YOUR USABILITY RESULTS
     - Quantitative data
       - Descriptive data (number of users, time spent, errors)
       - Be sure to discuss any data tables (what do they mean?)
     - Qualitative data
       - Consolidate your observations (negatives and positives!)
       - Extract common themes
       - Identify critical themes (e.g., length of time required)
       - Perform member checking if possible
       - Determine solutions for addressing the problems
       - Summarize and present your findings and solutions
  30. ANALYZING & REPORTING YOUR USABILITY RESULTS
     Observations → Interpretation → Recommendation
  31. PROTOCOL
     Introduction
     - Thank you... for agreeing to participate in this session.
     - Product description... a CD-ROM "book" on the topic of visual design for instructional multimedia.
     - Purpose... to make this product better.
       - This product does have problems.
       - Any problems you have or find with the product are the product's fault, not yours.
     Instructions...
     - I'll be asking you to do certain things with the program, watching, and writing notes as you do them. That's just to help me remember how things went later on.
     - To help me do this, I'd like you to "think out loud" as you use the program and make your decisions to do certain things.
     - I'd like you to try to perform the given tasks on your own as best you can. If you're really stuck, I may be able to help, but I'd really like you to try it without my help.
     - At any time, you can quit a particular task and move on, or you may choose to quit the entire session.
  32. OBSERVATION SHEET
     Efficiency: Start time: ___   Finish time: ___
     Effectiveness: for each page/link visited, beginning with the name of the starting page, record the page/link name, notes, and a +/- mark
  33. AND NOW... THE USER
     - How should we think of users?
     - User as a psychological being
     - Parameters of human information processing
     - Emergence of skilled behavior
  34. NIELSEN (1993) - THE RANGE OF USER TYPES
     [Diagram: users range along two axes - computer experience (minimal to extensive) and domain knowledge (ignorant to knowledgeable)]
  35. FOUR LEVELS OF THE USER IN HCI
     - Psychophysiological
       - HCI issue: brain-computer interaction (BCI)
     - Perceptual
       - HCI issue: screen layout, readability
     - Cognitive
       - HCI issue: task structure, human learning
     - Social
       - HCI issue: CSCW, organizational impact
  36. BASIC PROPERTIES OF ALL USERS
     - Changes with experience
     - Actively learns
     - Limited attention
     - Makes mistakes
     - Models the system in their mind
     - Remains unique
     - Goal oriented
  37. SOME RESOURCES
     - (research-based evidence)
     - comp.human-factors (USENET newsgroup)
     SOME JOURNALS
     - ACM Transactions on Computer-Human Interaction
     - Human-Computer Interaction
     - ACM Interactions
     - International Journal of Human-Computer Interaction
     - User Modeling and User-Adapted Interaction
     - Computers in Human Behavior
     - Human Factors in Computing Systems