
Establishing best practices to improve usefulness and usability of web interfaces providing atmospheric data


  1. Establishing best practices to improve usefulness and usability of web interfaces providing atmospheric data
     Nina S. Oakley, Britta Daudert
  2. Atmospheric data are increasingly important to a broad audience: resource managers, ecologists, social scientists, public health officials, policy makers, farmers, educators, hydrologists, geologists, engineers
  3. Web has become favored way to disseminate atmospheric data
  4. Data can be frustrating to access
     “I just want to download a year of data for Reno Airport... what do I click?”
  5. Usability addresses this issue
     • Developed from e-commerce needs (for web)
     • “The extent to which a web product can be used to achieve goals with effectiveness, efficiency, and satisfaction.” (ISO, 2014)
     • Focuses on user rather than developer needs
     • Assumes users are busy people trying to accomplish tasks
     • Users (not developers) decide whether a product is easy to use (Dumas & Redish, 1999)
  6. Why is usability important to atmospheric science?
     • Often many sites to access the same data
     • Users have a “reservoir of goodwill”; they leave a site if frustrated
     • E-commerce: loyal users spend more money than first-time users (Nielsen, 2000)
     • Loyal following = funding?
     • Usability testing is relatively cheap! (Krug, 2005)
  7. Time spent on site before leaving (Nielsen, 2011)
     • Users leave web pages in ~10-20 seconds
     • Make a clear, strong value proposition to get them to remain longer
  8. How to employ usability?
     • Follow general usability guidelines (literature, usability.gov)
     • Perform usability testing on your site (we attended training: nngroup.com)
  9. Site tested: SCENIC (Southwest Climate ENvironmental Information Collaborative)
     • wrcc.dri.edu/csc/scenic
     • Serves researchers in the DOI SW-CSC
     • Interface to the ACIS database
  10. SCENIC allows for customizable queries and analyses
      • List data (several formats)
      • Create summaries, perform statistical analyses
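Since SCENIC is a front end to ACIS, the same kind of query can be sketched against the public ACIS web services (data.rcc-acis.org). Below is a minimal Python sketch of the “year of data for Reno Airport” request from earlier; the endpoint and element names follow the public ACIS documentation, but the station id "RNO" and the chosen elements are illustrative assumptions, not SCENIC’s own code.

```python
# A sketch of a station-data query against the public ACIS web services
# that SCENIC interfaces with. The StnData endpoint and element names
# follow the ACIS docs; "RNO" (Reno Airport, FAA id) is an assumption.
import requests

params = {
    "sid": "RNO",                 # station id (assumed FAA identifier)
    "sdate": "2013-01-01",        # start date: one year of daily data
    "edate": "2013-12-31",
    "elems": [{"name": "maxt"},   # daily maximum temperature
              {"name": "mint"},   # daily minimum temperature
              {"name": "pcpn"}],  # daily precipitation
}

resp = requests.post("https://data.rcc-acis.org/StnData", json=params, timeout=30)
resp.raise_for_status()
rows = resp.json()["data"]        # [[date, maxt, mint, pcpn], ...]

for date, maxt, mint, pcpn in rows[:5]:
    print(date, maxt, mint, pcpn)  # missing values come back as "M"
```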
  11. SCENIC creates high-resolution graphics
  12. Methods: Usability lab
      • Choice of Mac or PC, any browser
      • Facilitator works with participant
      • Camtasia software for screen recording (Krug, 2005)
  13. Methods: Recruiting participants
      • 5 participants uncover 80% of usability issues (Nielsen 2000, 2012)
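The “5 participants” rule of thumb comes from Nielsen and Landauer’s problem-discovery model, found(n) = 1 − (1 − L)^n, where L is the probability that one tester encounters a given problem. A quick sketch, using Nielsen’s reported cross-project average L ≈ 0.31 as the assumed parameter:

```python
# Nielsen & Landauer's problem-discovery model behind the "5 users"
# rule: found(n) = 1 - (1 - L)**n, with L the per-participant chance
# of hitting a given problem (Nielsen reports L ~= 0.31 on average).
L = 0.31

for n in range(1, 11):
    found = 1 - (1 - L) ** n
    print(f"{n:2d} participants -> ~{found:.0%} of problems found")

# With L = 0.31, 5 participants surface roughly 85% of problems,
# which is why two small rounds (test, fix, retest) beat one big one.
```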
  14. Methods: Recruiting participants
      • Performed 2 rounds of testing with 5 participants; made improvements between rounds
      • Test representatives of your target user group, not necessarily your colleague down the hall
        • Sought people in ecology, resource management
      • Compensate participants! Provides motivation to show up and give quality feedback
  15. Methods: Designing test questions
      • PART 1: Online, 3 tasks
        1) List data for all stations in Shasta County, CA that recorded snowfall and precipitation data for all dates December 15-December 31, 2013
        2) Find the highest temperature ever recorded in March at Winnemucca AP, Nevada
        3) Find the lowest minimum temperature among grid points approximately covering Pyramid Lake in December 2013
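For context, task 1 maps naturally onto a multi-station ACIS query. A sketch against the public MultiStnData endpoint, where the county FIPS code "06089" (Shasta County, CA) and the element names are assumptions for illustration rather than SCENIC internals:

```python
# Task 1 as a sketch against the public ACIS MultiStnData endpoint:
# all stations in a county reporting snowfall and precipitation for
# a date range. FIPS "06089" (Shasta County, CA) is an assumption.
import requests

params = {
    "county": "06089",                     # Shasta County, CA (FIPS)
    "sdate": "2013-12-15",
    "edate": "2013-12-31",
    "elems": [{"name": "snow"},            # daily snowfall
              {"name": "pcpn"}],           # daily precipitation
}

resp = requests.post("https://data.rcc-acis.org/MultiStnData", json=params, timeout=30)
resp.raise_for_status()

for stn in resp.json()["data"]:
    print(stn["meta"]["name"], "-", len(stn["data"]), "days returned")
```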
  16. Methods: Designing test questions
      • PART 2: Written, System Usability Scale (SUS)
      • Standard usability questionnaire; results interpreted against a large number of prior usability studies
      • Produces valid results on small sample sizes (Brooke, 1986; Bangor et al., 2009)
      • 10 questions on a Likert-style scale: 5 choices between agree strongly and disagree strongly
      • Gives a score or “grade” to the usability of a site
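SUS scoring is mechanical once the ten responses are collected: odd-numbered (positively worded) items contribute (score − 1), even-numbered (negatively worded) items contribute (5 − score), and the sum is multiplied by 2.5 to give a 0-100 score. A sketch with a made-up response set:

```python
# Standard SUS scoring (Brooke). Ten Likert items scored 1-5; odd
# items are positively worded, even items negatively worded. The
# example responses are fabricated purely to show the arithmetic.
def sus_score(responses):
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = sum(r - 1 if i % 2 == 0 else 5 - r  # i = 0 is item 1 (odd)
                for i, r in enumerate(responses))
    return total * 2.5                          # rescale 0-40 to 0-100

example = [4, 2, 4, 1, 3, 2, 4, 2, 3, 3]        # hypothetical participant
print(sus_score(example))                       # -> 70.0
```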
  17. Methods: Designing test questions
      • PART 3: Verbal, 7 questions
      • Questions on interpretation of common terms used in climate data (raw data, tool, product, climate anomaly map, etc.)
      • Card sorting activity to inform on how people search for climate data (where, when, what, type, source)
      • General questions, discussion about site
  18. Methods: Researching general usability guidelines
      • Overall goal: reduce cognitive load on user
      • No need to reinvent the wheel (in most cases)
      • Not specific to climate data; applies to web pages in general (Krug, 2005)
  19. Methods: Researching general usability guidelines
      • Adhere to web conventions: navigation along top of page, links recognizable, search bar in upper right or left, ? = help
  20. Example: User expected links, got text
  21. Methods: Researching general usability guidelines
      • Be consistent within pages: color scheme, formatting, layout same throughout SCENIC; navigation menu always available
      • Provide help texts, but expect most users to muddle through first
  22. Example: Help texts help!
  23. Methods: Researching general usability guidelines
      • Hide unnecessary options
      • Make labels clear and meaningful; a descriptive label was much more successful than “submit”
      • Page headings match link name
  24. Example: Autofill mistaken for dropdown menu; getting users to recognize autofill
  25. Results: How did participants rate SCENIC?
      • Round 1: Mean = 63
      • Round 2: Mean = 67.5
  26. Results: How did participants rate SCENIC?
      • Round 1: Mean = 63
      • Round 2: Mean = 67.5
      • Somewhere between “OK” and “Good” (Bangor et al., 2009)
  27. Results: How participants search for data
      • N = 14 (15 participants, 1 abstaining), meeting requirements for significant results (Tullis and Wood, 2004)
      • WHERE (location) most important, SOURCE (originator of data) least important
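One simple way facet rankings like these can be aggregated is by mean rank position (lower = more important). The sketch below uses fabricated rankings purely to illustrate the computation; it is not the study’s data or analysis method.

```python
# Aggregating card-sort rankings of search facets by mean rank
# position. The three participant rankings here are made up to
# demonstrate the calculation, not taken from the study.
from statistics import mean

facets = ["where", "when", "what", "type", "source"]
rankings = [
    ["where", "when", "what", "type", "source"],
    ["where", "what", "when", "type", "source"],
    ["when", "where", "what", "source", "type"],
]

mean_rank = {f: mean(r.index(f) + 1 for r in rankings) for f in facets}
for facet, rank in sorted(mean_rank.items(), key=lambda kv: kv[1]):
    print(f"{facet:6s} mean rank {rank:.2f}")   # lowest = most important
```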
  28. Results: Labeling is challenging
      • Gridded or modeled data? All gridded data here are modeled, not observed; participants found “gridded” more descriptive and useful
      • Climate anomaly maps and time series? Participants agreed on, and liked, these names
  29. Example: Labeling
  30. Results: Labeling is challenging
      • “Tool” and “product” misleading: general agreement that a “tool” manipulates data while a “product” is static; the term “Data Tools” did not elicit the desired response, so it was changed to “Data Analysis”
      • “Raw data”: some thought it had QC applied, some not; used “Data Lister” instead
      • “Historic” data: confusing, replaced with “Data Lister”
  31. Example: Confusion about historic vs. current data
  32. Results: Biggest challenge
      • Getting participants to utilize data analysis tools; all participants wanted to list data first! Some say they would do the analysis themselves
      • Are analysis tools valuable for this audience? How to motivate people to know they are available?
  33. Example: Muddling and listing data
  34. Conclusions
      • Usability testing extremely valuable; applies to SCENIC and future projects
      • Removed many issues on site
      • Learned about how people use the web
      • Still improvements to be made; testing informs on issues, but it is not always clear how to solve them
      • Usability is challenging!
  35. Conclusions: Recommendations
      • Perform testing early and often
      • Work with your target audience; consider the way your audience searches for atmospheric data
      • Naming/labeling is the most challenging task; test names on participants, compare with other agencies
      • Adhere to general usability guidelines; usability.gov and Krug’s Don’t Make Me Think (2005) are good places to start
  36. Moving forward
      • Broad survey on how people look for climate data
      • Standardization of terms (labels) across agencies
      • Interpretation of atmospheric data
      • Research what help tools are most effective (video, forums, text, tutorials)
      • Other groups in atmospheric science sharing usability testing results
  37. Thank you! World Usability Day is coming up: November 13, 2014. This project was supported by DAS EDGES 2013.
