NEEShub – Heuristic Evaluation


  1. Assignment 05: Heuristic Evaluation
     Team 02: Omid Farivar | Evan Kerrigan | Benjamin Nayer
     SI 622: Evaluation of Systems and Services
     March 17, 2011
  2. Agenda
     1. Project Introduction
     2. Methods
     3. Findings & Recommendations
     4. Discussion
  3. What is NEES?
     ‣ The Network for Earthquake Engineering Simulation is an NSF-funded research organization
     ‣ Serves as a network to coordinate and share data and research among earthquake, civil, and structural engineers and 14 laboratories throughout the U.S.
     ‣ Offers multiple services, including NEEShub, which is our focus
  4. Project Warehouse? NEEShub?
     ‣ NEEShub is a suite of software and tools to store, share, and analyze data, tailored to the needs of the NEES community
     ‣ At its heart is the “Project Warehouse,” which serves as the system’s repository for uploading, managing, and sharing research data
     ‣ NEEShub also offers an array of tools for visualizing and analyzing data outside of the Project Warehouse
  5. Heuristic Evaluation (photo cc flickr user delilah021)
  6. Nielsen’s Heuristics
     1. Visibility of system status
     2. Match between system and the real world
     3. User control and freedom
     4. Consistency and standards
     5. Error prevention
     6. Recognition rather than recall
     7. Flexibility and efficiency of use
     8. Aesthetic and minimalist design
     9. Help users recognize, diagnose, and recover from errors
     10. Help and documentation
     Source: Nielsen, J. (1994). Heuristic evaluation. In J. Nielsen & R. L. Mack (Eds.), Usability Inspection Methods. New York, NY: John Wiley & Sons.
  7. Individual Evaluations
     ‣ Interviews & personas guided identification of different system aspects: researchers, data viewers, data creators, and “uploaders”
     ‣ Selected separate, pre-existing projects to analyze individually
     ‣ Created and edited projects and experiments
     ‣ Conducted several individual site passes
     ‣ Ranked severity and importance to success
  8. Rating Scale
     0: Team does not agree that issue impacts system usability
     1: Cosmetic problem only; need not be fixed unless extra time is available on project
     2: Minor usability problem; fixing this should be given low priority
     3: Major usability problem; important to fix, so should be given high priority
     4: Usability catastrophe; imperative to fix before product can be released
     Source: Nielsen, J. (1994). Heuristic evaluation. In J. Nielsen & R. L. Mack (Eds.), Usability Inspection Methods. New York, NY: John Wiley & Sons.
  9. Team Evaluations
     ‣ Assembled as a team to discuss, explain, and defend individual findings
     ‣ Team collectively re-rated issue severity
     ‣ A score of ‘0’ was accorded to items that the team felt were not truly usability issues
  10. Findings & Recommendations (photo cc flickr user stefanoodle)
  11. Finding 01: Inaccurate Presentation of Folder Size and Contents
      ‣ Project experiment folders in the file browser are labeled as very large (tens to hundreds of GB)
      ‣ When the folders are accessed, they appear to be empty: is the content loading? deleted? restricted?
      ‣ Violates visibility of system status, since the system provides inaccurate information: users cannot know if something went wrong
  12. Recommendation
      ‣ Present correctly labeled folder sizes
      ‣ Use a status spinner to indicate when the system is loading
      ‣ Offer helpful error notifications, particularly for permission issues (see the sketch below)
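As a rough illustration of these recommendations, the sketch below shows a folder-listing handler that keeps a spinner visible while loading, distinguishes an empty folder from a failed or permission-denied request, and surfaces a readable error. The endpoint, element IDs, and type are hypothetical, not NEEShub’s actual API.

```typescript
// Hypothetical folder-listing handler; /api/folders, #spinner, and #file-list
// are illustrative names, not NEEShub's real API or markup.
interface FolderEntry {
  name: string;
  sizeBytes: number;
}

async function renderFolder(folderId: string): Promise<void> {
  const spinner = document.getElementById("spinner")!;
  const list = document.getElementById("file-list")!;
  spinner.hidden = false; // visibility of system status: show that loading is underway
  try {
    const res = await fetch(`/api/folders/${encodeURIComponent(folderId)}`);
    if (res.status === 403) {
      // Permission problems get their own message instead of a blank folder
      list.textContent = "You do not have permission to view this folder.";
      return;
    }
    if (!res.ok) throw new Error(`listing failed (HTTP ${res.status})`);
    const entries: FolderEntry[] = await res.json();
    list.textContent = entries.length === 0
      ? "This folder is empty." // distinguish "empty" from "error"
      : entries
          .map((e) => `${e.name} (${(e.sizeBytes / 1e6).toFixed(1)} MB)`)
          .join("\n");
  } catch (err) {
    list.textContent = `Could not load folder contents: ${(err as Error).message}`;
  } finally {
    spinner.hidden = true; // always clear the spinner, success or failure
  }
}
```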
  13. Finding 02: Inconsistent Filetype Upload Behavior
      ‣ Users can upload various filetypes (videos, images, etc.) to project experiments, but the documentation about acceptable filetypes is inconsistent
      ‣ The system provides unhelpful error responses when incorrect formats are uploaded
      ‣ Violates the consistency and standards, error prevention, and “help users recognize, diagnose, and recover from errors” heuristics
  14. Recommendation
      ‣ Provide clear, consistent contextual help and documentation
      ‣ Validate formats during file upload
      ‣ Provide clearer, more accurate error messages (see the sketch below)
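One way to realize the validation step is sketched below. The accepted-extension list and the #upload input are assumptions made for illustration, since the deck does not specify NEEShub’s actual upload policy.

```typescript
// Illustrative client-side format check; ACCEPTED_EXTENSIONS is an assumed
// policy, not NEEShub's documented one.
const ACCEPTED_EXTENSIONS = [".jpg", ".png", ".mp4", ".avi", ".pdf", ".txt"];

function validateUpload(file: File): string | null {
  const dot = file.name.lastIndexOf(".");
  const ext = dot >= 0 ? file.name.slice(dot).toLowerCase() : "";
  if (!ACCEPTED_EXTENSIONS.includes(ext)) {
    // Name the file, the problem, and the fix in one message
    return `"${file.name}" is not an accepted format. ` +
      `Accepted formats: ${ACCEPTED_EXTENSIONS.join(", ")}.`;
  }
  return null; // null means the file passed validation
}

document
  .querySelector<HTMLInputElement>("#upload")
  ?.addEventListener("change", (ev) => {
    const input = ev.target as HTMLInputElement;
    for (const file of Array.from(input.files ?? [])) {
      const error = validateUpload(file);
      if (error) alert(error); // surface the problem before the upload starts
    }
  });
```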
  15. Finding 03: Uploading Sensor Data Places Excessive Recall Burden on Users
      ‣ Uploading sensor data to an experiment is performed by populating a blank spreadsheet; the specific file format requirements and other formatting constraints are shown only before the spreadsheet is downloaded
      ‣ The spreadsheet itself contains none of this information, forcing the user to remember it
      ‣ Violates the recognition rather than recall heuristic
  16. Recommendation
      ‣ Add formatting requirement instructions within the provided spreadsheet
  17. Finding 04: Difficulty Performing Multiple File Searches
      ‣ Each project’s File Browser has a search function, which works normally at first
      ‣ Subsequent searches return no new results: the query simply fails, and reloading the page is the only solution
      ‣ No error message or feedback is given; it just fails
      ‣ Violates the error prevention, error recovery, and system visibility heuristics
  18. Recommendation
      ‣ Investigate why the issue occurs and develop a fix
      ‣ Display current search terms prominently and provide error messages if queries fail (see the sketch below)
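A minimal sketch of the second recommendation, assuming a hypothetical /api/search endpoint and status/results elements: the handler echoes the active query back to the user and reports failures instead of failing silently.

```typescript
// Illustrative search wrapper; /api/search and the element IDs are assumptions.
async function runSearch(query: string): Promise<void> {
  const status = document.getElementById("search-status")!;
  const results = document.getElementById("search-results")!;
  status.textContent = `Searching for "${query}"...`; // echo the current query
  try {
    const res = await fetch(`/api/search?q=${encodeURIComponent(query)}`);
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    const hits: string[] = await res.json();
    status.textContent = `Results for "${query}" (${hits.length} found)`;
    results.textContent = hits.join("\n");
  } catch (err) {
    // Report the failure so repeated searches never fail silently
    status.textContent =
      `Search for "${query}" failed: ${(err as Error).message}. Please try again.`;
  }
}
```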
  19. Finding 05: Unclear Options in File Browser Search Menu
      ‣ The search box includes an ambiguous drop-down menu with the options “Find by: ‘Title’” and “Find by: ‘Name’”
      ‣ There is no ‘Title’ column in the file browser, so it is unclear what that option means; there is no apparent difference when searching with either option
      ‣ Violates the consistency and standards heuristic
  20. Recommendation
      ‣ Two possibilities:
        • Replace the ‘Title’ option with a clearer, more meaningful filter (e.g., ‘timestamp’)
        • Remove the drop-down menu altogether
  21. Finding 06: Difficulty Determining State of Upload Progress
      ‣ File uploaders are expected to handle large amounts of data
      ‣ No progress feedback is provided while files upload
      ‣ Users can’t navigate away from the uploader without losing the upload lightbox
      ‣ Violates the visibility of system status heuristic
  22. Recommendation
      ‣ Implement a progress bar to display upload status: time elapsed, time remaining, etc. (see the sketch below)
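A sketch of such a progress bar, built on the standard XMLHttpRequest upload "progress" events; the /api/upload endpoint and element IDs are illustrative, not NEEShub’s actual uploader.

```typescript
// Minimal upload-progress sketch; endpoint and element IDs are assumed.
function uploadWithProgress(file: File): void {
  const bar = document.querySelector<HTMLProgressElement>("#upload-progress")!;
  const label = document.getElementById("upload-label")!;
  const started = Date.now();
  const xhr = new XMLHttpRequest();
  bar.max = 100;

  xhr.upload.addEventListener("progress", (ev) => {
    if (!ev.lengthComputable) return;
    const fraction = ev.loaded / ev.total;
    const elapsed = (Date.now() - started) / 1000;
    // Naive remaining-time estimate from the average rate so far
    const remaining = fraction > 0 ? (elapsed * (1 - fraction)) / fraction : 0;
    bar.value = fraction * 100;
    label.textContent =
      `${Math.round(fraction * 100)}% uploaded, ` +
      `${elapsed.toFixed(0)}s elapsed, ~${remaining.toFixed(0)}s remaining`;
  });
  xhr.addEventListener("load", () => {
    label.textContent = "Upload complete.";
  });
  xhr.addEventListener("error", () => {
    label.textContent = "Upload failed. Please try again.";
  });

  const form = new FormData();
  form.append("file", file);
  xhr.open("POST", "/api/upload");
  xhr.send(form);
}
```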
  23. Finding 07: Usability Strengths of NEEShub
      ‣ Extensive documentation is provided, especially when creating and editing projects or experiments
      ‣ Conspicuous help button for each data field
      ‣ Consistent iconography, such as the help buttons and filetype icons in the file browser
      ‣ Navigation aids:
        • “Breadcrumb”-style secondary navigation bar
        • Similar display in the file browser
  24. Caveats & Limitations (photo cc flickr user yanivg)
  25. Caveats & Limitations
      ‣ We used the minimum optimal number of evaluators; a larger and more diverse set of evaluators may have helped
      ‣ Prior experience with NEEShub may introduce biases
      ‣ Our decision to select different, pre-existing projects may have resulted in less-detailed investigations of specific pages
      ‣ Some of the issues, particularly the file-upload ones, may be less significant usability problems, as an external tool is used for file management
  26. Q&A
      Team 02: Omid Farivar | Evan Kerrigan | Benjamin Nayer
