Benefits of Concurrent Cognitive and Usability Testing

Typically, survey pretesting involves separate timelines and research staffs for cognitive and usability testing. In this paper, we make the case that a more comprehensive and less labor-intensive approach to pretesting is to conduct cognitive and usability testing concurrently. By testing the same questionnaire concurrently with respondents and interviewers (the users in this case), potentially problematic question wording and instrument design can be identified more efficiently, in a way that improves the questionnaire for both the respondent and the interviewer.

In 2005 and 2006, the U.S. Census Bureau conducted separate rounds of cognitive and usability testing on an interviewer-administered nonresponse follow-up questionnaire in preparation for the 2010 Census. The usability testing was conducted in the Census Bureau's Usability Lab with an early version of the instrument. Later, the Census Bureau's Cognitive Lab conducted cognitive testing of the instrument. In doing the testing separately, we learned that usability testing identifies question-wording issues in addition to usability issues, but that usability staff do not have the specialized experience (or sometimes the authority) to make recommendations in that arena. Similarly, while examining question wording, cognitive testing also identifies poor usability features, but cognitive-testing staff lack the experience with usability testing to recommend improvements in that arena.

Based on this observation, in 2008 the Cognitive and Usability Labs at the Census Bureau conducted 40 cognitive interviews and 20 usability interviews concurrently to test the questionnaire, and presented results and recommendations from both types of testing together. When testing is conducted concurrently, staff from both labs, representing both specialties, can be at the table at once, creating a more efficient methodology. By examining these two case studies, this paper discusses what can be gained by conducting these studies in concert, above and beyond conducting them independently. Examples of the kinds of findings that are possible through this joint research, and of the synergy from having both research teams involved, are described.

Comments

  • Under Usability Testing, it could be argued that the focus is on both the “look and feel” of the form (i.e., the visual design) and the ways in which the various controls behave.
  • Note that I added some text under Usability Lab for you to consider. We do not measure how “intuitive” a user interface is because there is no defined unit of intuitiveness. We measure accuracy, efficiency (time to complete tasks), and satisfaction with the user interface. Accuracy and efficiency together represent the user’s success in completing the survey. So we measure success and satisfaction, and we diagnose what it was about the design that led to something less than success and satisfaction. We find that a problem with Tourangeau’s model is that people don’t always read the question. They often look first at the response options, and they infer the question. People’s goal with an online survey is to finish it as quickly as possible and get back to their lives. Usability recommendations are grounded in what we know about human capabilities and limitations in attention, perception, memory, learning, expectations, problem solving, and decision making, i.e., everything we know about human cognition.
  • If this is where the lists came from, it should be presented earlier. I was thinking the lists were on flash cards.

Presentation Transcript

  • Benefits of Concurrent Cognitive and Usability Testing
    Jennifer Childs, Jennifer Romano, Erica Olmsted-Hawala, and Elizabeth Murphy
    U.S. Census Bureau
    Paper presented at the Questionnaire Evaluation Standards Conference, Bergen, Norway, May 17-20, 2009
  • Overview
    • Background – Cognitive and Usability Labs
    • Case Study – U.S. 2010 Census
      – Example 1 – Separate Cognitive and Usability Testing
      – Example 2 – Joint Cognitive and Usability Testing
    • Conclusions
  • Description of Pretesting Methods
    • Cognitive Testing
      – Focus on respondent’s understanding of questions
      – Some focus on navigation
      – Often used to determine final question wording
    • Usability Testing
      – Focus on user’s ability to complete the survey
        • User = Interviewer
        • User = Respondent
      – Focus on users’ interaction with the questionnaire and on effects of visual design
      – Used to improve visual design and navigational controls
  • U.S. Census Bureau Lab Structure
    • Cognitive Lab
      – Psychologists, sociologists, anthropologists, demographers, etc.
      – Based on CASM and Tourangeau’s 4 stages
      – Measures comprehension, accuracy, and ability and willingness to respond
    • Usability Lab
      – Psychologists, human factors and usability specialists
      – Based on principles of user-centered design (UCD)
      – Measures accuracy, efficiency (time to complete tasks), and satisfaction
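    As a concrete illustration of the Usability Lab’s three measures listed above (accuracy, efficiency as time on task, and satisfaction), the following is a minimal Python sketch of how such session data might be summarized. The TaskResult fields and the 1-7 satisfaction scale are illustrative assumptions, not the Census Bureau’s actual data layout or analysis code.

        from dataclasses import dataclass
        from statistics import mean

        @dataclass
        class TaskResult:
            # Hypothetical record of one usability-test task (illustrative fields only).
            completed: bool      # task finished correctly? (accuracy)
            seconds: float       # time on task (efficiency)
            satisfaction: int    # rating on an assumed 1-7 questionnaire scale

        def summarize(results: list[TaskResult]) -> dict:
            """Summarize accuracy, efficiency, and satisfaction across sessions."""
            return {
                "accuracy": sum(r.completed for r in results) / len(results),
                "mean_time_s": mean(r.seconds for r in results),
                "mean_satisfaction": mean(r.satisfaction for r in results),
            }

        # Example with made-up numbers:
        print(summarize([TaskResult(True, 42.0, 6), TaskResult(False, 95.5, 3)]))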
  • Census Bureau Lab Techniques
    • Cognitive Lab
      – Concurrent think aloud
      – Concurrent and/or retrospective probing
      – Retrospective debriefing
      – Emergent and expansive probing
      – Vignettes (hypothetical situations)
      – In-depth ethnographic interviews
    • Usability Lab
      – Random assignment (as appropriate)
      – Scenarios (hypothetical situations)
      – Concurrent think aloud
      – Concurrent and/or retrospective probing
      – Eye tracking
      – Satisfaction questionnaire
      – Retrospective debriefing
  • Case Studies: United States 2010 Census
  • U.S. 2010 Census
    • Approximately 10 questions
      – Names, relationships, ages, sex, race and Hispanic origin for each household member
    • Mailed forms to most households
    • Non-response follow-up by personal visit
  • U.S. Census Nonresponse Followup (NRFU)
    • Developed and tested CAPI instrument
      – Usability and Cognitive testing independently conducted
      – Example 1
    • Turned to PAPI data collection
      – Usability and Cognitive testing conducted concurrently
      – Example 2
  • Example 1: CAPI Instrument Testing
    • Usability testing and cognitive testing conducted separately
  • CAPI Usability Testing Methods
    • Early version of software
    • 2 rounds
      – Round 1 – 6 users (4 English and 2 Spanish speakers)
      – Round 2 – 5 users (4 English and 1 Spanish speaker)
  • CAPI Cognitive Testing Methods
    • 4 rounds (2 English and 2 Spanish)
    • English Round 1
      – 14 interviews
      – Paper script
    • English Round 2
      – 16 interviews
      – Computer
    • Spanish Rounds 1 and 2
      – 30 interviews total
      – Paper script
  • CAPI Usability Findings (1)
    • Data Entry
      – The way the cursor moves between entry boxes
    • Navigation issues
      – Next and Back buttons navigated between “questions,” not “screens”
  • CAPI Usability Findings (2)
    • Question Text
      – Repeating questions for each household member
      – Unclear whether or not to read examples associated with question text
      – Unclear if verification for sex was ok
      – Identified instances of problematic question wording
    • Interviewer Tasks
      – Administering flashcard was difficult for interviewers
      – Difficult fills – e.g., Is this (house/apartment/mobile home)
  • CAPI Cognitive Findings
    • Navigation Issues
      – “Back” and “Next” buttons
    • Question Text
      – Repetitive to read each question for each person
      – Reference period unclear
      – Long, complex questions
      – Problematic question wording
    • Interviewer Tasks
      – Difficulty administering flashcard
      – Difficult fill, e.g., (house/apartment/mobile home)
  • Separate Usability and Cognitive Testing Conclusions
    • Separate
      – Unique results
      – Common results
      – Areas of expertise and knowledge of relevant research
    • Together
      – Fully understand how to best resolve problems
  • Example 2: Joint Cognitive and Usability PAPI Testing
    • Paper and Pencil Instrument
    • 40 cognitive interviews
    • 20 usability sessions
    • Concurrent testing led to development of joint recommendations
  • Methods
    • Cognitive Testing
      – Retrospective probing
      – Comprehension, accuracy, ability to answer questions given personal situation
    • Usability Testing
      – Shortened interviewer training
      – Test scenarios
      – Accuracy, satisfaction and ease of use
  • Joint Findings and Recommendations
  • Information Sheet
  • Information Sheet Findings
    • Cognitive Findings
      – Respondents understood and were able to use the Information Sheet for all relevant questions
    • Usability Findings
      – Interviewers were successful in administering the Information Sheet and associated questions
  • Hispanic Origin Question
  • Hispanic Origin Findings
    • Cognitive Findings
      – A lot of difficulty for Hispanic respondents
        • Responding “Latino” or “Spanish” without a country of origin
        • Describing race and origin
        • Unable to respond unassisted
      – With probing by the interviewer, able to provide a codeable answer
    • Usability Findings – success rate by scenario:
        Scenario             Success Rate
        Colombian            95%
        Cambodian            95%
        Mexican              100%
        Dominican Republic   95%
        Puerto Rican         100%
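    To show how per-scenario success rates like those above can be tabulated from individual usability-session outcomes, here is a minimal Python sketch. The scenario names are taken from the slide, but the outcome values are invented for illustration and are not the study’s data.

        from collections import defaultdict

        # Hypothetical (scenario, succeeded) outcomes from usability sessions;
        # values are invented for illustration, not the study's actual records.
        outcomes = [
            ("Colombian", True), ("Colombian", False),
            ("Mexican", True), ("Mexican", True),
            ("Puerto Rican", True),
        ]

        def success_rates(records):
            """Return the percentage of successful sessions for each scenario."""
            totals, successes = defaultdict(int), defaultdict(int)
            for scenario, ok in records:
                totals[scenario] += 1
                successes[scenario] += ok
            return {s: 100.0 * successes[s] / totals[s] for s in totals}

        for scenario, rate in success_rates(outcomes).items():
            print(f"{scenario}: {rate:.0f}%")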
  • Hispanic Origin Discussion and Recommendations
    • Respondents had some difficulty, but…
    • Interviewers in usability testing were able to successfully navigate the question.
    • Recommendations focused on training
  • Conclusions from Joint Cognitive and Usability Studies
    • Understanding of how:
      – Respondents react to questions
      – Interviewers react to questionnaire
      – Interviewers react to respondent situations
    • Recommendations:
      – Improve the form – question wording, visual design, and/or navigational instructions
      – Improve interviewer training
  • General Recommendations
    • Conduct cognitive and usability testing concurrently with early versions of the questionnaire, with time to change:
      – Question wording
      – Visual design (i.e., format and layout of the questionnaire)
      – Navigational strategies
      – Interviewer training
    • Conduct iterative testing
  • Future Research
    • Cognitive and usability testing of the American Community Survey Internet data collection instrument
      – Early in development cycle
      – Iterative testing
  • Thank you! Questions or comments? E-mail: jennifer.hunter.childs@census.gov