
User-Centered Research on the Paying for College Website and Tools - EDUI 2014


The Paying for College website is designed to help consumers make informed decisions about financing college. The Consumer Financial Protection Bureau (CFPB) developed this tool set through a user-centered design process, and it is now in its fourth iteration. The college cost and financial aid comparison tool, a central feature of these resources, supports efforts by the Department of Education to standardize financial aid disclosures.

During this session, we’ll cover the most recent rounds of usability testing conducted with multiple groups across the U.S. We’ll highlight the difficulties of designing and testing for multiple audiences with different needs, and of testing and iterating with live and prototype versions of the site. We’ll also emphasize data, sharing our collection methodologies (click paths, eye tracking, questionnaires, etc.) and the importance of each, and provide insights into planning, execution, and reporting and how these findings informed major changes on the website.

Published in: Education


  1. 1. Paying For Education: User-Centered Research on the Paying For College Website and Tools September 30, 2014 Jennifer Romano Bergstrom & Lamar Alayoubi @romanocog
  2. 2. Eye Tracking in UX – October 16, 2014 – eyetrackingux2014.com (EDUI = 20% OFF). User Focus – October 17, 2014 – 2014.userfocus.org (EDUI = 20% OFF)
  3. 3. Usability = “the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use” (ISO 9241-11). + emotions and perceptions = UX @romanocog #eduiconf
  4. 4. User Experience 4 User Experience Design (P. Morville): http://semanticstudios.com/publications/semantics/000029.php @romanocog #eduiconf
  5. 5. 5 When to test @romanocog #eduiconf
  6. 6. 6 What we design for…and reality Krug, S. (2000) Don’t Make Me Think. Pearson Education *NEW 3rd Edition (2014)* @romanocog #eduiconf @skrug
  7. 7. UX Data @romanocog #eduiconf
     OBSERVATIONAL: + Time on page/task + Reaction time + Selection/click behavior + Success/fail rate + Conversion rate + Behavioral analysis
     IMPLICIT: + Eye tracking + Electrodermal activity (EDA) + Pupil dilation + Facial expression analysis
     SUBJECTIVE: + Satisfaction & difficulty ratings + Verbal responses + Moderator follow up + Real-time +/- dial + Verbalization analysis
  8. 8. Why should we measure implicit? @romanocog #eduiconf Traditional UX research is good at explaining what people say and do, not what they think and feel.
  9. 9. Implicit Data. User Experience Design (P. Morville): http://semanticstudios.com/publications/semantics/000029.php @romanocog @richmondux #eduiconf
  10. 10. Eye Tracking in the Past @romanocog @dcaapor
  11. 11. Modern Eye Tracking
  12. 12. UX Research on Paying For College. Tested the user experience (UX) of: • Overall navigation • Comparing the cost of colleges side-by-side • Organization of information: federal vs. private loan information, federal loan interest rates, financial aid disbursement information, college bank accounts.
     Iterative process with client: task creation, SATQ items, other materials, debriefing, reports. “A spectator sport” @skrug @romanocog #eduiconf
  13. 13. Testing Iterations & Rounds @romanocog #eduiconf
     Iteration 1 (February/March 2014) – High School Students and Influencers: Round 1: Austin, TX; Round 2: Spokane, WA; Round 3: Arlington, VA. Participants used live site version 1 (in one round, live site version 1 or prototype site version 2).
     Iteration 2 (April 2014) – High School Students and College Graduates: Round 1: Atlanta, GA; Round 2: Milwaukee, WI; Round 3: Arlington, VA. Participants used live site version 2.
  14. 14. Methodology @romanocog #eduiconf [Photos: testing environment, eye tracker, moderator & participant]
  15. 15. Methods: UX Metrics @romanocog #eduiconf
     OBSERVATIONAL: + Time to complete tasks + Time to first click + First click accuracy + Task accuracy. IMPLICIT: + Eye tracking. SUBJECTIVE: + Difficulty and satisfaction ratings + Think aloud protocol + Debriefing interview.
     Use eye tracking to: • Gain insight into which visual elements attract attention and the order in which users view them • Identify how much of the text respondents read (e.g., terms/instructions) • Classify how respondents complete tasks and the order in which they process page elements.
     • Eye tracking [implicit] tells us what participants focus on. • Think aloud & debriefing [subjective] tell us why they focus there. • Observational measures tell us how they go about completing a task. • The combination of subjective, observational, & implicit metrics paints the whole picture.
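To make the combination of metric families concrete, here is a minimal Python sketch of a per-participant task record that mixes observational, implicit, and subjective measures. The field names and sample values are hypothetical illustrations, not data from the study.

```python
from dataclasses import dataclass

@dataclass
class TaskRecord:
    """One participant's data for one task, mixing the three metric families."""
    participant: str
    task: str
    # Observational
    time_to_first_click_s: float
    first_click_correct: bool
    task_success: bool
    # Implicit (eye tracking)
    fixations_on_target: int
    # Subjective
    difficulty_rating: int  # 1 = Not Difficult at All ... 5 = Extremely Difficult

# Hypothetical records for two participants on the same task.
records = [
    TaskRecord("P01", "compare-graduation-rates", 9.2, True, True, 14, 1),
    TaskRecord("P02", "compare-graduation-rates", 31.5, False, True, 6, 3),
]

# Observational summary: share of participants who completed the task.
success_rate = sum(r.task_success for r in records) / len(records)
print(f"Task success: {success_rate:.0%}")
```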
  16. 16. UX Metrics in Action: Live Website Testing 16 @romanocog #eduiconf
  17. 17. Navigation should be intuitive. TASK: Let’s say that you have your list narrowed down to the two schools that you have financial aid offers from. You are interested in graduation rates of those two schools. Which university graduates a higher percentage of students who enroll?
     Iteration 1 – Measure: first-click accuracy. Accuracy: 50% (6 successful, 6 unsuccessful). First clicks: Compare costs (4), Research schools (3), Compare get started (2), Search function (1), Home (1), Graduate (1).
     Iteration 2 – Measure: first-click accuracy. Accuracy: 94% (17 successful, 1 unsuccessful). First clicks: Compare financial aid offers (14), Compare costs (3), Search function (1).
     @romanocog #eduiconf
  18. 18. Navigation should be intuitive. TASK: Let’s say that you have your list narrowed down to the two schools that you have financial aid offers from. You are interested in graduation rates of those two schools. Which university graduates a higher percentage of students who enroll?
     Iteration 1 – Measure: first-click accuracy. Accuracy: 50% (6 successful, 6 unsuccessful). First clicks: Compare costs (4), Research schools (3), Compare get started (2), Search function (1), Home (1), Graduate (1).
     Iteration 2 – Measure: first-click accuracy. Accuracy: 94% (17 successful, 1 unsuccessful). First clicks: Compare financial aid offers (14), Compare costs (3), Search function (1).
     UX Best Practices & Recommendations: • Ensure that the primary navigation and other navigation options are intuitive and match user expectations.
     UX Metrics: • First Click Accuracy: If people make the correct first click, the design is intuitive (for that particular task).
     @romanocog #eduiconf
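The accuracy figures above are simple proportions of correct first clicks. The minimal Python sketch below shows how such a tally might be computed from a click log; the set of “successful” targets is an assumption chosen to reproduce the slide’s 17-of-18 (94%) figure, and the helper name is hypothetical.

```python
from collections import Counter

def first_click_accuracy(first_clicks, successful_targets):
    """Share of participants whose first click landed on an acceptable target."""
    hits = sum(1 for click in first_clicks if click in successful_targets)
    return hits / len(first_clicks)

# Iteration 2 first clicks as reported on the slide (18 participants).
iteration2_clicks = (
    ["Compare financial aid offers"] * 14
    + ["Compare costs"] * 3
    + ["Search function"] * 1
)

# Assumption: both comparison links counted as successful, matching 17 of 18;
# only the search-function click counted as unsuccessful.
successful_targets = {"Compare financial aid offers", "Compare costs"}

print(Counter(iteration2_clicks))
print(f"First-click accuracy: {first_click_accuracy(iteration2_clicks, successful_targets):.0%}")
```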
  19. 19. Navigation should be intuitive and consistent. TASK: Let’s say you have decided on the university you want to attend and you know you’re going to need to take out a loan to pay for tuition. What are the different types of loans that are available to you? (“What are the types of loans available?”)
     Iteration 1 – Measure: first-click accuracy. Accuracy: 81% (13 successful, 3 unsuccessful). First clicks: Student loans (7), More about student loans (5), Student financial guides (1), Get Assistance – Info for students (1), Participate – Student loans (1), Compare financial aid offers (1).
     Iteration 2 – Measure: first-click accuracy. Accuracy: 94% (11 successful, 1 unsuccessful). First clicks: Choose a loan – timeline (8), Choose a loan – Get started (3), Home (1).
     @romanocog #eduiconf
  20. 20. Navigation should be intuitive and consistent. TASK: Let’s say you have decided on the university you want to attend and you know you’re going to need to take out a loan to pay for tuition. What are the different types of loans that are available to you? (“What are the types of loans available?”)
     Iteration 1 – Measure: first-click accuracy. Accuracy: 81% (13 successful, 3 unsuccessful). First clicks: Student loans (7), More about student loans (5), Student financial guides (1), Get Assistance – Info for students (1), Participate – Student loans (1), Compare financial aid offers (1).
     Iteration 2 – Measure: first-click accuracy. Accuracy: 94% (11 successful, 1 unsuccessful). First clicks: Choose a loan – timeline (8), Choose a loan – Get started (3), Home (1).
     UX Best Practices & Recommendations: • Ensure that navigation options are consistent; for a horizontal navigational layout, either all headings should have dropdown options, or none should. • Use category headings that accurately convey what users will find on those pages.
     @romanocog #eduiconf
  21. 21. Getting started with a tool should be intuitive. TASK: Let’s say that you have your list narrowed down to the two schools that you have financial aid offers from. You are interested in graduation rates of those two schools. Which university graduates a higher percentage of students who enroll?
     Iteration 1: Gaze plots for 4 participants on the Compare Costs tool main page. Disorderly gaze patterns suggest that it isn’t readily apparent where to start on this page. Time to start: 14 sec.
     Iteration 2: Gaze plots for 4 participants on the Compare Costs tool main page. More orderly gaze patterns clustered at the top of the page suggest that it is more apparent where to start on this page. Time to start: 27 sec.
     @romanocog #eduiconf
  22. 22. Getting started with a tool should be intuitive. TASK: Let’s say that you have your list narrowed down to the two schools that you have financial aid offers from. You are interested in graduation rates of those two schools. Which university graduates a higher percentage of students who enroll?
     Iteration 1: Gaze plots for 4 participants on the Compare Costs tool main page. Disorderly gaze patterns suggest that it isn’t readily apparent where to start on this page. Time to start: 14 sec.
     Iteration 2: Gaze plots for 4 participants on the Compare Costs tool main page. More orderly gaze patterns clustered at the top of the page suggest that it is more apparent where to start on this page. Time to start: 27 sec.
     UX Best Practices & Recommendations: • It should be apparent to the user how to get started using a web tool. • Use a prominent location and, if necessary, highlight the button that users need to get started.
     UX Metrics: • Time to First Click: If people make the correct first click quickly, the design is intuitive (for that particular task). • Eye Tracking Gaze Plots: Gaze patterns inform us of the order in which a user processes information. Disorderly gaze patterns imply a poor design.
     @romanocog #eduiconf
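Time to first click is typically measured from page load to the participant’s first interaction. The sketch below shows one way such a summary might be computed from a hypothetical per-participant event log; the log structure, participant IDs, and timings are illustrative assumptions, not study data.

```python
from statistics import median

def time_to_first_click(events):
    """Seconds from page load to the participant's first click, or None if they never clicked."""
    load_time = next(t for t, kind, _ in events if kind == "page_load")
    click_times = [t for t, kind, _ in events if kind == "click"]
    return min(click_times) - load_time if click_times else None

# Hypothetical per-participant event streams: (seconds, event_type, target).
sessions = {
    "P01": [(0.0, "page_load", None), (9.2, "click", "Get started")],
    "P02": [(0.0, "page_load", None), (31.5, "click", "Add a school")],
    "P03": [(0.0, "page_load", None)],  # never clicked; excluded from the summary below
}

times = [t for s in sessions.values() if (t := time_to_first_click(s)) is not None]
print(f"Median time to first click: {median(times):.1f} sec")
```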
  23. 23. Input methods should be apparent.
     • When attempting to put a number in a box that requires a previous box to be completed, no indication is given that the prior field must be filled out first, and participants become increasingly frustrated.
     • Participants commented that there weren’t enough boxes for all of the information they had; specifically, when entering scholarship information there is only one blank for “other scholarships and grants,” prompting several participants to say that they wished there were additional blanks.
     • The Cost of Attendance dropdown automatically opens, but the Money for School dropdown does not. As a result, some participants did not notice that they could enter additional information, resulting in confusion.
     “It made me think that the site was incomplete or I missed something.” “I don’t understand. What’s the point if you can’t compare the costs? Maybe I did it wrong.”
     Iteration 1: Compare tool input fields. Iteration 2: Compare tool input fields.
     TASK: Let’s say that you have your list narrowed down to the two schools that you have financial aid offers from. You are interested in graduation rates of those two schools. Which university graduates a higher percentage of students who enroll?
     @romanocog #eduiconf
  24. 24. Input methods should be apparent.
     • When attempting to put a number in a box that requires a previous box to be completed, no indication is given that the prior field must be filled out first, and participants become increasingly frustrated.
     • Participants commented that there weren’t enough boxes for all of the information they had; specifically, when entering scholarship information there is only one blank for “other scholarships and grants,” prompting several participants to say that they wished there were additional blanks.
     • The Cost of Attendance dropdown automatically opens, but the Money for School dropdown does not. As a result, some participants did not notice that they could enter additional information, resulting in confusion.
     “It made me think that the site was incomplete or I missed something.” “I don’t understand. What’s the point if you can’t compare the costs? Maybe I did it wrong.”
     Iteration 1: Compare tool input fields. Iteration 2: Compare tool input fields.
     UX Metrics: • Think Aloud Protocol: Users verbalize their experience while completing tasks and tell us in the moment when they are frustrated. We quantify the percentage of users who verbalize a particular problem and gain insight into how severe the problem is.
     UX Best Practices & Recommendations: • Provide relevant error messages when users enter information in the incorrect order (or skip a field) to promote recovery and minimize frustration. • Provide enough places to enter information to match user expectations. In this example, more fields for entering scholarship amounts would have been clearer and less cognitively taxing for the user.
     TASK: Let’s say that you have your list narrowed down to the two schools that you have financial aid offers from. You are interested in graduation rates of those two schools. Which university graduates a higher percentage of students who enroll?
     @romanocog #eduiconf
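Quantifying think-aloud data usually means coding each participant’s verbalizations against a list of observed issues and reporting the share of participants who voiced each one. A small Python sketch under that assumption follows; the issue codes and participant data are hypothetical.

```python
from collections import Counter

# Hypothetical issue codes assigned while reviewing think-aloud recordings.
verbalized_issues = {
    "P01": {"locked_field_no_message", "too_few_scholarship_fields"},
    "P02": {"locked_field_no_message"},
    "P03": {"money_for_school_collapsed"},
    "P04": {"locked_field_no_message", "money_for_school_collapsed"},
}

n_participants = len(verbalized_issues)
issue_counts = Counter(issue for issues in verbalized_issues.values() for issue in issues)

# Share of participants who voiced each problem, most frequent first.
for issue, count in issue_counts.most_common():
    print(f"{issue}: {count}/{n_participants} participants ({count / n_participants:.0%})")
```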
  25. 25. Infographics should be easily understandable. 25 • Because participants are confused about how to enter their information into the tool correctly, the output graphics that the tool generates are also confusing. • The Left to pay number is not calculated into the debt burden, leading to several participants being confused as to why they’ll have so little debt at graduation. • No explanation is provided for where salary is pulled from, leading several participants to question the accuracy of that number. Iteration 2: Compare tool output graphics. @romanocog #eduiconf TASKS: After you graduate, what would be the debt burden for [THE SECOND SCHOOL THEY ENTER]? After you graduate, what is the monthly payment for the loans you would take out while attending [THE FIRST SCHOOL THEY ENTER]? How much debt burden would you have at [SECOND SCHOOL] vs. [FIRST SCHOOL]?
  26. 26. Infographics should be easily understandable.
     • Because participants are confused about how to enter their information into the tool correctly, the output graphics that the tool generates are also confusing.
     • The Left to pay number is not calculated into the debt burden, leading to several participants being confused as to why they’ll have so little debt at graduation.
     • No explanation is provided for where salary is pulled from, leading several participants to question the accuracy of that number.
     Iteration 2: Compare tool output graphics.
     UX Best Practices & Recommendations: • Thoroughly explain how to use a diagram; do not assume that the user will be able to figure out how to interpret an infographic. • Use hover-over pop-up boxes to explain what each portion of the infographic actually represents. • Identify sources of information whenever appropriate. For example, salary information and graduation rates should be explained and sourced in this infographic.
     TASKS: After you graduate, what would be the debt burden for [THE SECOND SCHOOL THEY ENTER]? After you graduate, what is the monthly payment for the loans you would take out while attending [THE FIRST SCHOOL THEY ENTER]? How much debt burden would you have at [SECOND SCHOOL] vs. [FIRST SCHOOL]?
     @romanocog #eduiconf
  27. 27. @romanocog #eduiconf Users concentrate on the most important site features. 27 Participants in the Graduate group were asked what content on the site was relevant to them. They looked at the primary features of the site in order to locate relevant content. PFC main homepage. Fixation count heat map of 12 participants on the main page, fixating mostly on the PFC header and on the four main menu tabs, including the Repay Student Debt tab.
  28. 28. Users concentrate on the most important site features. Participants in the Graduate group were asked what content on the site was relevant to them. They looked at the primary features of the site in order to locate relevant content.
     UX Metrics: • Eye Tracking Heat Maps: Heat maps allow us to see concentrations of people’s fixations; in this instance, people fixate on the introductory text and the 4 labels of the main navigation, which is exactly where we want them to look.
     PFC main homepage: fixation count heat map of 12 participants on the main page, fixating mostly on the PFC header and on the four main menu tabs, including the Repay Student Debt tab.
     @romanocog #eduiconf
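A fixation-count heat map is essentially a 2-D histogram of fixation coordinates. The sketch below bins hypothetical fixation points into a grid with NumPy; a real study would use the eye tracker’s fixation export and overlay the result on a page screenshot.

```python
import numpy as np

# Hypothetical fixations exported from an eye tracker: (x_px, y_px) per fixation,
# pooled across the participants who viewed the page.
fixations = np.array([
    [512, 140], [530, 150], [610, 145],               # header / introductory text
    [200, 320], [420, 318], [640, 322], [860, 325],   # four main menu tabs
    [515, 700],                                        # stray fixation below the fold
])

# Bin fixation counts into a coarse grid over a 1024x768 page; cells with more
# fixations are the hot spots a fixation-count heat map would highlight.
counts, y_edges, x_edges = np.histogram2d(
    fixations[:, 1], fixations[:, 0],
    bins=(8, 8), range=[[0, 768], [0, 1024]],
)
print(counts.astype(int))
```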
  29. 29. UX Metrics in Action: Prototype Testing 29 @romanocog #eduiconf
  30. 30. Repay Student Debt Prototype Methodology. CFPB proposed testing three non-functional prototypes of the main page of the Repay Student Debt tool.* The look and feel of those prototypes differed in each round of testing. The main purpose of the prototype testing was to determine which starting point participants thought was the most logical. Four different starting points were tested:
     • Loan Type: allowed participants to select what type(s) of loans they have. This was similar to how the live site worked. Version 1 of the Loan prototype included images, while Version 2 did not; all participants saw one version of this prototype.
     • Borrower Type: allowed participants to choose whether they were a typical borrower or had special circumstances. The Borrower prototype included images.
     • Repayment Type: allowed participants to choose what kind of repayment plan they were interested in. Only Round 2 participants saw this prototype.
     • Goal Type: allowed participants to choose whether they wanted to lower their monthly payment, pay less interest over time, or inquire about other programs. Only Round 3 participants saw this prototype.
     *Prototypes presented were non-functional PDF prototypes of the starting page of the repayment tool. @romanocog #eduiconf
  31. 31. Participants “clicked” the ‘Get Started’ content on each prototype first. A majority of participants said that they would click on the Get Started content on every version of all prototypes except the Repayment Type prototype.
     [Bar chart: first “clicks” on any of the Get Started buttons, by prototype (Borrower Type, Loan Type, Repayment Type*, Goal Type*) and round (Round 2, Round 3). Image: Borrower Type Version 2.]
     *The Repayment Type prototype was only seen by Round 2 participants; the Goal Type prototype was only seen by Round 3 participants.
     @romanocog #eduiconf
  32. 32. Participants “clicked” the ‘Get Started’ content on each prototype first. A majority of participants said that they would click on the Get Started content on every version of all prototypes except the Repayment Type prototype.
     [Bar chart: first “clicks” on any of the Get Started buttons, by prototype (Borrower Type, Loan Type, Repayment Type*, Goal Type*) and round (Round 2, Round 3). Image: Borrower Type Version 2.]
     *The Repayment Type prototype was only seen by Round 2 participants; the Goal Type prototype was only seen by Round 3 participants.
     UX Metrics: • First Clicks: PDF prototypes were not clickable. We ask participants where they would click first and gain insight into how users would interact with these pages if they were live.
     @romanocog #eduiconf
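Because the PDF prototypes were not clickable, the “first clicks” here are stated intentions that get tallied by prototype and round. A minimal Python sketch of that tally follows; the participant responses shown are hypothetical, and only the prototype and round names come from the slides.

```python
from collections import Counter

# Hypothetical stated first "clicks": (round, prototype, element the participant pointed to).
stated_clicks = [
    ("Round 2", "Borrower Type", "Get Started"),
    ("Round 2", "Loan Type", "Get Started"),
    ("Round 2", "Repayment Type", "Help Me Decide"),
    ("Round 3", "Borrower Type", "Get Started"),
    ("Round 3", "Goal Type", "Get Started"),
    ("Round 3", "Goal Type", "Content below the fold"),
]

# Count, per prototype and round, how many participants said they would hit Get Started first.
get_started_counts = Counter(
    (prototype, rnd) for rnd, prototype, element in stated_clicks if element == "Get Started"
)
for (prototype, rnd), count in sorted(get_started_counts.items()):
    print(f"{prototype} ({rnd}): {count} first 'clicks' on Get Started")
```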
  33. 33. Prototype Summary & Recommendations @romanocog #eduiconf
     • No single prototype was an overwhelming favorite among participants.
       – 4 out of 18 (22%) participants preferred the Borrower Type prototype
       – 6 out of 18 (33%) participants preferred the Loan Type prototype
       – 4 out of 8 (50%) participants preferred the Repayment Type prototype
       – 4 out of 10 (40%) participants preferred the Goal Type prototype
     Recommendations:
     • Start with either the goal or the loan prototype. Although there was no real consensus around one prototype, almost all participants said something positive about the goal prototype. Having a goal-oriented way of getting into the tool matches how users think of the process. The loan prototype was basically a different-looking version of the live site, which participants found generally intuitive.
     • Include a ‘Help Me Decide’ button. Help Me Decide was a popular first-click button on the one prototype it was shown with. It would give users who don’t necessarily have a goal a way to get in.
     • Include graphics. Although graphics were removed from the second iteration of the prototypes for content-testing purposes, the feedback on them was generally positive in the first round of testing.
     • Remove content below the fold. Most participants clicked on one of the Get Started options; however, some clicked on the other content below the fold. If the main purpose of this site is to get people to use the tool, then content below the fold could be removed to simplify the page.
  34. 34. Prototype Summary & Recommendations @romanocog #eduiconf
     • No single prototype was an overwhelming favorite among participants.
       – 4 out of 18 (22%) participants preferred the Borrower Type prototype
       – 6 out of 18 (33%) participants preferred the Loan Type prototype
       – 4 out of 8 (50%) participants preferred the Repayment Type prototype
       – 4 out of 10 (40%) participants preferred the Goal Type prototype
     Recommendations:
     • Start with either the goal or the loan prototype. Although there was no real consensus around one prototype, almost all participants said something positive about the goal prototype. Having a goal-oriented way of getting into the tool matches how users think of the process. The loan prototype was basically a different-looking version of the live site, which participants found generally intuitive.
     • Include a ‘Help Me Decide’ button. Help Me Decide was a popular first-click button on the one prototype it was shown with. It would give users who don’t necessarily have a goal a way to get in.
     • Include graphics. Although graphics were removed from the second iteration of the prototypes for content-testing purposes, the feedback on them was generally positive in the first round of testing.
     • Remove content below the fold. Most participants clicked on one of the Get Started options; however, some clicked on the other content below the fold. If the main purpose of this site is to get people to use the tool, then content below the fold could be removed to simplify the page.
     UX Metrics: • Debriefing Interviews: After the tasks and testing are over, a debriefing interview allows us to ask users about their preferences. We can then quantify those preferences to gain insight into which prototypes users prefer.
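The preference percentages above are simple shares of the participants who saw each prototype (the denominators differ because not every participant saw every prototype). The short Python sketch below reproduces that arithmetic using the counts reported on the slide.

```python
# Preference counts from the debriefing interviews, as reported on the slide:
# prototype -> (participants who preferred it, participants who saw it).
preferences = {
    "Borrower Type": (4, 18),
    "Loan Type": (6, 18),
    "Repayment Type": (4, 8),
    "Goal Type": (4, 10),
}

for prototype, (preferred, shown_to) in preferences.items():
    print(f"{prototype}: {preferred}/{shown_to} ({preferred / shown_to:.0%}) preferred it")
```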
  35. 35. Study Impacts @romanocog #eduiconf
     • Combination of subjective, observational, and implicit data to evaluate the UX.
     • Many UX elements improved from Iteration 1 to Iteration 2; some need further exploration.
     [Charts: distribution of ratings from “Not Difficult at All” through “Extremely Difficult,” Iteration 1 vs. Iteration 2, for “Please rate your overall experience with the Paying for College website” and “Please rate your difficulty with navigating on the Paying for College website.”]
  36. 36. Study Impacts @romanocog #eduiconf
     • Combination of subjective, observational, and implicit data to evaluate the UX.
     • Many UX elements improved from Iteration 1 to Iteration 2; some need further exploration.
     [Charts: distribution of ratings from “Not Difficult at All” through “Extremely Difficult,” Iteration 1 vs. Iteration 2, for “Please rate your overall experience with the Paying for College website” and “Please rate your difficulty with navigating on the Paying for College website.”]
     UX Metrics: • Satisfaction Questionnaire: After the tasks and debriefing, asking users to rate their satisfaction allows us to make comparisons in iterative testing.
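Comparing iterations with a rating questionnaire comes down to comparing the distribution of responses on the same scale. Here is a minimal Python sketch under that assumption; the per-participant ratings are hypothetical, and only the scale labels come from the slide.

```python
from collections import Counter

SCALE = ["Not Difficult at All", "Slightly Difficult", "Moderately Difficult",
         "Very Difficult", "Extremely Difficult"]

# Hypothetical per-participant ratings for the navigation-difficulty item.
iteration1 = ["Slightly Difficult"] * 6 + ["Moderately Difficult"] * 4 + ["Not Difficult at All"] * 2
iteration2 = ["Not Difficult at All"] * 10 + ["Slightly Difficult"] * 2

def distribution(ratings):
    """Percentage of participants choosing each point on the scale."""
    counts = Counter(ratings)
    return {label: counts[label] / len(ratings) for label in SCALE}

for name, ratings in [("Iteration 1", iteration1), ("Iteration 2", iteration2)]:
    shares = ", ".join(f"{label}: {share:.0%}" for label, share in distribution(ratings).items())
    print(f"{name} -> {shares}")
```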
  37. 37. UX Research Success: User Types @romanocog #eduiconf
     Iteration 1 (February/March 2014) – High School Students and Influencers: Round 1: Austin, TX; Round 2: Spokane, WA; Round 3: Arlington, VA. Participants used live site version 1 (in one round, live site version 1 or prototype site version 2).
     Iteration 2 (April 2014) – High School Students and College Graduates: Round 1: Atlanta, GA; Round 2: Milwaukee, WI; Round 3: Arlington, VA. Participants used live site version 2.
  38. 38. UX Research Success: Client Collaboration @romanocog #eduiconf
     Iteration 1 (February/March 2014) – High School Students and Influencers: Round 1: Austin, TX; Round 2: Spokane, WA; Round 3: Arlington, VA. Participants used live site version 1 (in one round, live site version 1 or prototype site version 2).
     Iteration 2 (April 2014) – High School Students and College Graduates: Round 1: Atlanta, GA; Round 2: Milwaukee, WI; Round 3: Arlington, VA. Participants used live site version 2.
     Iterative process with client: task creation, SATQ items, other materials, debriefing, reports. “A spectator sport” @skrug
  39. 39. UX Research Success: Multiple Data 39 @romanocog #eduiconf OBSERVATIONAL + Time to complete tasks + Time to first click + First click accuracy + Task accuracy IMPLICIT + Eye tracking SUBJECTIVE + Difficulty and satisfaction ratings + Think aloud protocol + Debriefing interview • Eye tracking [implicit] tells us what participants focus on. • Think aloud & debriefing [subjective] tell us why they focus there. • Observational measures tell us how they go about completing a task. • Combination of subjective, observational, & implicit metrics paints the whole picture.
  40. 40. UX Research Success 40 • All relevant user types participated. • Students • Graduates • Educators • Parents • All relevant parties were involved. • Designers • Developers • Key stakeholders • Multiple data sources provided insight into the complete experience. • Iterative process: Test, improve, test again • Ensure changes improve the UX. • Ensure something else did not break. @romanocog #eduiconf
  41. 41. Thank you! • Twitter: @forsmarshgroup • LinkedIn: http://www.linkedin.com/company/fors-marsh-group • Blog: http://www.forsmarshgroup.com/categories/blog/ Jennifer Romano Bergstrom @romanocog jbergstrom@forsmarshgroup.com Lamar Alayoubi lamar.alayoubi@cfpb.gov
  42. 42. Eye Tracking in UX – October 16, 2014 – eyetrackingux2014.com (EDUI = 20% OFF). User Focus – October 17, 2014 – 2014.userfocus.org (EDUI = 20% OFF)
