3. What is NEES?
‣ The Network for Earthquake Engineering Simulation is an NSF-funded research organization
‣ Serves as a network to coordinate and share data and research among earthquake, civil, and structural engineers and 14 laboratories throughout the U.S.
‣ Offers multiple services, including NEEShub, which is our focus
4. Project Warehouse? NEEShub?
‣ A suite of software and tools to store, share, and analyze data, tailored to the needs of the NEES community
‣ At its heart is the “Project Warehouse,” which serves as the system’s repository for uploading, managing, and sharing research data
‣ Also offers an array of tools for visualizing and analyzing data outside of the Project Warehouse
6. Nielsen’s Heuristics
1. Visibility of system status
2. Match between system and the real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation
Source: Nielsen, J. (1994). Heuristic evaluation. In J. Nielsen & R. L. Mack (Eds.), Usability Inspection Methods. New York: John Wiley & Sons.
7. Individual Evaluations
‣ Interviews and personas guided identification of the system’s different user roles: researchers, data viewers, and data creators (“uploaders”)
‣ Selected separate, pre-existing projects to analyze individually
‣ Created and edited projects and experiments
‣ Conducted several individual site passes
‣ Ranked severity and importance to success
8. Rating Scale
Ranking | Meaning
0 | Team does not agree that issue impacts system usability
1 | Cosmetic problem only; need not be fixed unless extra time is available on project
2 | Minor usability problem; fixing this should be given low priority
3 | Major usability problem; important to fix, so should be given high priority
4 | Usability catastrophe; imperative to fix before product can be released
Source: Nielsen, J. (1994). Heuristic evaluation. In J. Nielsen & R. L. Mack (Eds.), Usability Inspection Methods. New York: John Wiley & Sons.
9. Team Evaluations
‣ Assembled as a team to discuss, explain, and defend individual findings
‣ Team collectively re-rated issue severity
‣ A score of ‘0’ was accorded to items that the team felt were not truly usability issues
11. Finding 01: Inaccurate Presentation of Folder Size and Contents
‣ Project experiment folders in the file browser are labeled as very large (tens to hundreds of GB)
‣ When a folder is opened, it appears to be empty. Is the content loading? Deleted? Restricted?
‣ Violates the visibility of system status heuristic, since it provides inaccurate system information: users cannot know whether something went wrong
13. Recommendation
‣ Present correctly labeled folder sizes
‣ Offer helpful error notifications, particularly for permission issues
‣ Use a status spinner to indicate when the system is loading
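The three recommendations reduce to labeling sizes accurately and telling the user which of the ambiguous states (loading, restricted, empty) they are actually in. A minimal, hypothetical Python sketch of that logic — `format_size` and `folder_status` are invented names, not NEEShub code:

```python
def format_size(num_bytes: int) -> str:
    """Render a byte count as a human-readable label (base-1024 units)."""
    units = ["B", "KB", "MB", "GB", "TB"]
    size = float(num_bytes)
    for unit in units:
        if size < 1024 or unit == units[-1]:
            return f"{size:.1f} {unit}"
        size /= 1024

def folder_status(listing, loaded: bool, readable: bool) -> str:
    """Distinguish 'loading', 'permission denied', and 'genuinely empty'
    so users never see a large-but-blank folder with no explanation."""
    if not loaded:
        return "Loading folder contents..."
    if not readable:
        return "You do not have permission to view this folder."
    if not listing:
        return "This folder is empty."
    return f"{len(listing)} item(s)"
```

The key design point is that "empty" is only reported after the other two explanations have been ruled out.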
14. Finding 02: Inconsistent Filetype Upload Behavior
‣ Users can upload various filetypes to project experiments (videos, images, etc.), but the documentation about acceptable filetypes is inconsistent
‣ Provides unhelpful error responses when users upload unsupported formats
‣ Violates the following heuristics: consistency and standards, error prevention, and help users recognize, diagnose, and recover from errors
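One common way to address all three violated heuristics is a single authoritative allow-list checked before upload, with an error message that names the rejected extension and lists the accepted ones. This is a sketch under assumed names — `ALLOWED_EXTENSIONS` is a placeholder set, not NEEShub's real filetype policy:

```python
# Hypothetical allow-list; the actual set of accepted filetypes
# would come from one authoritative source in the documentation.
ALLOWED_EXTENSIONS = {"avi", "mov", "jpg", "png", "csv", "txt"}

def check_upload(filename: str) -> tuple[bool, str]:
    """Validate a filename before upload; return (ok, message).

    Rejecting unsupported types up front prevents the error, and a
    message naming the offending extension plus the accepted set helps
    users recognize, diagnose, and recover from it."""
    ext = filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    if ext in ALLOWED_EXTENSIONS:
        return True, "OK"
    accepted = ", ".join(sorted(ALLOWED_EXTENSIONS))
    return False, (f"'.{ext}' files are not supported. "
                   f"Accepted filetypes: {accepted}")
```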
17. Finding 03: Uploading Sensor Data Places Excessive Recall Burden on Users
‣ Uploading sensor data to an experiment is performed by populating a blank spreadsheet; the specific file-format requirements and other formatting constraints are shown only before the spreadsheet is downloaded
‣ The spreadsheet itself contains none of this information, forcing users to remember it
‣ Violates the recognition rather than recall heuristic
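One way to shift this from recall to recognition is to embed the constraints in the downloaded template itself. The sketch below is hypothetical throughout — the column names and constraint text are invented stand-ins, not NEEShub's actual sensor-data requirements:

```python
import csv
import io

# Invented column and constraint definitions for illustration only.
COLUMNS = ["sensor_id", "timestamp", "value", "units"]
CONSTRAINTS = {
    "timestamp": "ISO 8601, e.g. 2012-05-01T13:00:00Z",
    "units": "SI units only",
}

def make_template() -> str:
    """Build a CSV template whose leading comment rows restate the
    formatting constraints, so users recognize the rules inside the
    file instead of recalling them from a page seen before download."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    for col, rule in CONSTRAINTS.items():
        writer.writerow([f"# {col}: {rule}"])
    writer.writerow(COLUMNS)
    return buf.getvalue()
```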
20. Finding 04: Difficulty Performing Multiple File Searches
‣ Each project’s File Browser has a search function, which works normally at first
‣ Subsequent searches return no new results; the query simply fails, and reloading the page is the only solution
‣ No error message or feedback is given: it just fails
‣ Violates the error prevention, error recovery, and visibility of system status heuristics
22. Recommendation
‣ Investigate why the issue occurs and develop a fix
‣ Display current search terms prominently and provide error messages when queries fail
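Both recommendations can be sketched as a thin wrapper around whatever the File Browser's search actually calls: echo the query back for display, and turn failures into explicit feedback instead of silence. `run_search` and `backend` are assumed names for illustration:

```python
def run_search(query: str, backend) -> dict:
    """Wrap a search backend so failures surface as explicit feedback.

    `backend` is any callable returning a list of results; it stands in
    for the real search call. The returned dict always echoes the query,
    supporting prominent display of the current search terms."""
    try:
        results = backend(query)
    except Exception as exc:
        return {"query": query, "results": [],
                "error": f"Search for '{query}' failed: {exc}. Please retry."}
    return {"query": query, "results": results, "error": None}
```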
23. Finding 05: Unclear Options in File Browser Search Menu
‣ The search box has an ambiguous drop-down menu containing the options “Find by: ‘Title’” and “Find by: ‘Name’”
‣ There is no ‘Title’ column in the file browser, so it is unclear what that option means; searching with either option yields no apparent difference
‣ Violates the consistency and standards heuristic
26. Finding 06: Difficulty Determining State of Upload Progress
‣ File uploaders are expected to handle large amounts of data
‣ No progress feedback is provided while files upload
‣ Users cannot navigate away from the uploader without losing the upload lightbox
‣ Violates the visibility of system status heuristic
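For large uploads, even a simple percentage label restores visibility of system status. A minimal sketch, assuming only that bytes-sent and total size are observable (`upload_progress` is an invented helper, not NEEShub code):

```python
def upload_progress(sent: int, total: int) -> str:
    """Format a progress label for the upload lightbox; guards against
    a zero or unknown total so the indicator never divides by zero."""
    if total <= 0:
        return "Uploading... (size unknown)"
    pct = min(100, round(100 * sent / total))
    return f"Uploading... {pct}% ({sent} of {total} bytes)"
```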
29. Finding 07: Usability Strengths of NEEShub
‣ Extensive documentation is provided, especially when creating and editing projects or experiments
‣ Conspicuous help button for each data field
‣ Consistent iconography, such as the help buttons and filetype icons in the file browser
‣ Navigation aids
  • “Breadcrumb”-style secondary navigation bar
  • Similar display in the file browser
32. Caveats & Limitations
‣ Minimum optimal number of evaluators
‣ Our decision to select different, pre-existing projects may have resulted in less-detailed investigations of specific pages
‣ Some of the issues, particularly the file-upload ones, may be less significant usability problems, as an external tool is used for file management
‣ Prior experience with NEEShub may introduce biases
‣ A larger and more diverse set of evaluators may have helped