Website Usability
    Presentation Transcript

    • Web Site Usability. October 20, 2009. Vincci Kwong, Franklin D. Schurz Library, Indiana University South Bend. Indiana Library Federation Annual Conference.
    • Website Usability
      • “The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use.” (British Standards Institution, 1998)
      • “Making whatever you’re working on easier to use for whoever is going to use it.” (Steve Krug, 2001)
      • “An element of design, focusing on ‘can this be used when it’s done?’ rather than just making it look good.” (Steve Krug, 2001)
      • “On the Web, usability is a necessary condition for survival. If a website is difficult to use, people leave.” (Jakob Nielsen, 2003)
    • Components of Website Usability
      • The five Es:
      • Effective
      • Efficient
      • Engaging
      • Error tolerant
      • Easy to learn
    • Types of Usability Testing
      • Scenario-Based Inspection
      • Heuristic Evaluation
      • User Observation
      • Prototyping Test
      • Card Sorting
    • Prerequisites for All Usability Testing
      • Information gathering
        • Who are the users?
        • What do users do?
        • What do users want?
        • Limitations of the current system
      • Understanding of design principles and rules
    • Scenario-Based Inspection
      • Evaluator examines the website in the context of the tasks users perform
      [Diagram: a goal breaks down into tasks, and each task into actions]
    • Setting Up a Scenario-Based Inspection
      • Come up with scenarios
      • Explore different ways to accomplish each scenario
      • Make note of problems encountered while working through each scenario
    • Heuristic Evaluation
      • Inspector examines the website to check whether it complies with a set of design principles
        • Nielsen’s Heuristics
        • MILE+ Heuristics
          • Triacca, Luca, Alessandro Inversini, and Davide Bolchini. “Quality of Web Usability Evaluation Methods: An Empirical Study on MiLE+.” Seventh IEEE International Symposium on Web Site Evolution. 2005. 22-29. Print.
        • W3C Content Accessibility Guidelines 1.0
        • Section 508 Standards
    • Setting Up a Heuristic Inspection
      • Choose inspectors
      • Decide on heuristic criteria
      • Prepare data collection and analysis form
      • Prepare task description
      • Reserve a room
      Fig. 1 An example of a data collection and analysis form.
      Task Scenario No.: 1 | Inspector’s Name: Kelly | Session Date: 3/20/2009 | Session Start Time: 9:00 am | Session End Time: 10:00 am
      Heuristic violated | Usability defect | Comments
      Navigation | Inconsistent overall navigation | The user was confused by the different navigation systems.
      Semiotics | Unclear link labels | The user would like simple wording without the use of jargon.
      Graphics | Contrast between text and background | The user found it difficult to read the wording on the webpage.
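A form like the one in Fig. 1 can be modeled as a small record type so that defects can be tallied across inspectors. This is an illustrative sketch only; the `Finding` class and its field names are my own, not part of the slides:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Finding:
    """One row of a data collection form (field names are illustrative)."""
    heuristic: str  # heuristic violated, e.g. "Navigation"
    defect: str     # short description of the usability defect
    comment: str    # inspector's note

# Rows taken from the example form above
findings = [
    Finding("Navigation", "Inconsistent overall navigation",
            "The user was confused by the different navigation systems."),
    Finding("Semiotics", "Unclear link labels",
            "The user would like simple wording without jargon."),
    Finding("Graphics", "Contrast between text and background",
            "The user found it difficult to read the wording on the webpage."),
]

# Tally how often each heuristic is violated across all findings
violations = Counter(f.heuristic for f in findings)
print(violations.most_common())
```

With several inspectors' forms merged into one list, the tally shows which heuristics are violated most often.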
    • Pros and Cons of Heuristic Evaluation
      • Pros
        • Less expensive
        • Inspectors usually suggest solutions to the problems they identify
        • Helps to reduce obvious errors
      • Cons
        • Inspectors’ viewpoints may not reflect those of actual users
        • Inspectors may have their own preferences, biases, and views
        • Results depend on the skills and experience of the inspectors
    • User Observations
      • User carries out specific tasks while under observation
    • Setting Up a User Observation Session
      • Choose participants
      • Select facilitator and observers (if any)
      • Prepare task description
      • Create evaluation script
      • Prepare permission form
      • Decide on how to record session
      • Pick a location
    • Choose participants
      • Real users vs. representative users
      • How many participants?
      • Availability of participants
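One common rule of thumb for the "how many participants?" question (not from the slides) is the Nielsen–Landauer problem-discovery model: the share of usability problems found by n users is roughly 1 − (1 − L)^n, with L ≈ 0.31, which is why about five users tend to surface roughly 85% of problems. A quick sketch:

```python
# Nielsen–Landauer problem-discovery model: proportion of usability
# problems found by n test users is 1 - (1 - L)**n, with L ≈ 0.31.
L = 0.31
for n in (1, 3, 5, 10):
    found = 1 - (1 - L) ** n
    print(f"{n} users: ~{found:.0%} of problems found")
```

The curve flattens quickly, which is the usual argument for running several small test rounds rather than one large one.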
    • Select facilitators and observers
      • Facilitator
        • Maintains a welcoming and relaxed atmosphere
        • Keeps the participant talking
        • Ensures the purpose of the evaluation is fulfilled
      • Observer
        • Observes the testing session
    • Prepare task description
      • Decide on what function to test
      • Come up with scenarios
        • 1 scenario per function
      • Elaborate scenarios
        • Clear wordings
      • Document task description
        • Task card
    • Create evaluation script
      • A script the evaluator uses to guide the participant through the testing session
      • Popular components of evaluation script
        • Welcome participant
        • Explain the evaluation and confidentiality
        • Perform preliminary interview
        • Administer tasks
        • Conduct exit interview
        • Thank the participant
    • Prepare permission form
      • Participant Consent Form
        • Permission to record
        • Permission for use of data
        • Permission to share test results
        • Nondisclosure agreements
    • Recording a session
      • Paper Note
      • Audio recording
      • Video recording
      • Eye tracking
    • Our Experience with User Observation Session
    • Choose Users
      • 4 faculty members
      • 4 students
      • 2 staff
    • Prepare task description
      • Each user completed five tasks
        • You have a citation for a journal article that you are interested in and you want to find out if the library has a copy of the journal article.
        • You are working on research related to the economic recession and you want to find some books that provide information on it.
        • You are working on a term paper and need to cite the resources used in your paper, but you have no idea how to format citations. You want to find out if the library provides information on citation.
        • You just realized that two of the books you checked out from the library are overdue, and you want to find out how much you owe in fines.
        • You have some photos that you want to resize, so you want to find out if Adobe Photoshop is available on the computers at the library.
    • Create evaluation script
      • Introduction
        • Welcome and explain the purpose of the usability test
      • Preliminary Interview
        • How much time do you normally spend on the Web in a given week?
        • Have you used any library website to look up information in the past?
      • Evaluation Instruction
      • Task
      • Wrap Up & Brainstorm
        • Additional questions you would like to ask
        • Invite user to provide any suggestion/comment for improvement
        • Express thanks to user
    • Decide on how to record data
      • Record session through video and screen capture
        • Laptop
        • Webcam
        • Microphone
        • Camtasia Studio
    • Testing is over, now what?
    • Data Analysis and Interpretation
      • Review and summarize data
        • Quantitative data
        • Qualitative data
      • Group findings
      • Assign Severities
      • Write evaluation report
    • Review and summarize data
      • Quantitative data
        • Easy to understand
        • More objective than qualitative data
      • Qualitative data
        • Provide insights into the causes of problems, e.g.:
          • “Labels and headings not intuitive”
          • “Crowded, clustered, clumsy”
          • “Nothing stands out to me”
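As an illustration of summarizing quantitative data, the sketch below computes a per-task completion rate and average time. The participant data and task names are invented for the example, not taken from the actual test:

```python
from statistics import mean

# Hypothetical per-task results: (participant, task, completed?, seconds taken)
results = [
    ("P1", "find journal article", True, 95),
    ("P2", "find journal article", False, 180),
    ("P1", "citation help", True, 60),
    ("P2", "citation help", True, 75),
]

# Summarize quantitative data per task: completion rate and average time
tasks = {t for _, t, _, _ in results}
for task in sorted(tasks):
    rows = [r for r in results if r[1] == task]
    rate = sum(r[2] for r in rows) / len(rows)
    print(f"{task}: {rate:.0%} completed, avg {mean(r[3] for r in rows):.0f}s")
```

Numbers like these are easy to compare across tasks, while the qualitative quotes above explain why a task scored poorly.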
    • Group findings
      • Chronological order
      • Severity of defect
      • Type of issue
      • Difficulty of fix
    • Assigning Severities
      • Helps to prioritize the work list
      • Severities scale varies
      Severity 3: Unclear link labels [Semiotics]; Inconsistent overall navigation [Navigation]; Information overload [Cognitive]; Inefficient search function [Others]; Insufficient system reaction to errors for a user [Technology/Performance]; Unmatched/unexpected information [Content]
      Severity 2: Contrast between text and background [Graphics]; Backward navigation [Navigation]; Duplicate information [Navigation]; Item not in category as expected [Navigation]; Image information [Graphics]; Scripting errors [Technology/Performance]; Video is not able to accommodate all users [Others]
      Severity 1: Grouping of left navigation bar [Navigation]; Visited vs unvisited states [Navigation]; Inflexible layout [Graphics]; Uncertainty on currency of information [Content]; Pop-up windows [Others]; Unexpected file format [Others]
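Severity ratings like these can drive a simple work-list ordering: fix the most severe issues first, and among equally severe issues, take the easiest fixes first. A hypothetical sketch (the difficulty scores are invented for the example):

```python
# Hypothetical issue list: (issue, severity 1-3, difficulty of fix 1-3)
issues = [
    ("Unclear link labels", 3, 1),
    ("Inconsistent overall navigation", 3, 3),
    ("Contrast between text and background", 2, 1),
    ("Pop-up windows", 1, 2),
]

# Prioritize: highest severity first, then easiest fix first
work_list = sorted(issues, key=lambda i: (-i[1], i[2]))
for name, sev, diff in work_list:
    print(f"sev {sev} / fix difficulty {diff}: {name}")
```

Other orderings from the "Group findings" slide (chronological, by issue type) just swap in a different sort key.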
    • Write evaluation report
      • Document what you did
      • Serves as a communication tool
    • Summing up our experience
      • Do a pre-test play-around of the site
      • Prepare different task sets
      • Watch the wording of task descriptions
    • Helpful Readings
      • Stone, Debbie, et al. User Interface Design and Evaluation. Boston: Elsevier, 2005.
      • Krug, Steve. Don’t Make Me Think. Indianapolis: New Riders, 2000.
      • Nielsen, Jakob, and Marie Tahir. Homepage Usability: 50 Websites Deconstructed. Indianapolis: New Riders, 2002.
      • George, Carole. User-Centered Library Websites. Oxford: Chandos, 2008.
    • Helpful Resources
      • Paper Prototyping: A How-To Video. Fremont: Nielsen Norman Group, 2003.
      • http://cardsorting.pbworks.com/ by Ellyssa Kroski
    • Questions?
      • Feel free to contact me at
        • Email: [email_address]
        • AIM: himffy
        • Yahoo: vincci_kwong
        • MSN: [email_address]
        • Phone: 574-520-4444