Home Mess System III
Evaluation: the 3rd phase of interaction design.

Home Mess System III: Presentation Transcript

  • Home-Mess System
    Presentation III: Evaluation of Prototype I
    Arundhati, Ihab, Ibrahim, Fareed, Zain
    • Testing is the large, often ignored part of a project that exists to uncover design and construction flaws in a solution.
    • The earlier we start testing, and the better the testing strategy and coverage, the sooner we can reveal flaws and address them.
    • A flaw uncovered while a solution is still in its early construction phases is comparatively cheap and easy to resolve.
     “TESTING” THE MOMENT OF TRUTH
    Introduction
    Source- http://blogs.msdn.com/willy-peter_schaub
  • Objectives
    • Discovering strengths and weaknesses in prototype design
    • Identification of barriers to successful uptake of the system’s functions
    • Evaluation of design elements that users do not respond to or engage with (non-participation)
    • Use of the evaluation results and outcomes to improve the system’s layout and architecture
    “Reaching a deeper understanding of the users' expectations and impressions of the system.”
  • Our Approach
    Step 1: Decide on testing strategy
    • Technical review of the solution
    • Involving both developers and users
    • Starting from the ‘inside’
    • Define the boundaries and scope
    Step 2: Prepare the ‘battle plan’
    • Define the types, objectives, and target users for testing
    • Define the task scenarios and process for testing
  • Evaluation Methods
  • Triangulation of three evaluation methods: User Testing (field-based), Cognitive Walkthrough (theory-based), and Heuristic Evaluation (expert-based)
  • Theory-Based: Cognitive Walkthrough
    Measures usability by collecting empirical data on task breakdowns and by recognizing the sequence/path taken by the user.
     
    Field-Based: User Testing
    Observation of users in their home environment, with a basic structure kept as a guideline. It is a user-centric approach.
     
    Expert-Based: Heuristic Evaluation
    Identifies usability problems based on established human-factors principles and provides recommendations for design improvements.
     
  • Cognitive Walkthrough
  • Diagram: the stages in a cognitive walkthrough and the dependencies between them, divided into a Preparation Phase and an Execution Phase
  • Task Based Walkthroughs- Approach
    Users Identified:
    what knowledge, skills, experience will they have?
    Tasks Identified:
    set of representative tasks
    sequence of actions needed to achieve each task
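
A minimal sketch of what this preparation could produce, assuming a plain Python structure: the task names echo the task scenarios used later in this evaluation, while the action steps are invented examples rather than the actual Home-Mess System flows.

```python
# Sketch: representative tasks and the expected action sequence for each,
# as prepared for a task-based walkthrough. Action steps are hypothetical.

tasks = {
    "Create a new task": [
        "Open the home screen",
        "Select 'Tasks'",
        "Press 'New'",
        "Enter the task details",
        "Confirm with 'Save'",
    ],
    "Check for reminders": [
        "Open the home screen",
        "Select 'Reminders'",
        "Read the reminder list",
    ],
}

def walk(task_name):
    """Step through the expected action sequence for one task."""
    for step, action in enumerate(tasks[task_name], start=1):
        # At each step the evaluator asks: will the user know what to do,
        # see how to do it, and understand the system's response?
        print(f"{task_name} - step {step}: {action}")

walk("Create a new task")
```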
  • Diagram: a processing model of human cognition
  • Predefined problem criteria
    User articulates a goal & cannot succeed in attaining it within 2 minutes
    User explicitly gives up
    User articulates a goal and has to try three or more actions to find a solution
    User produces a result different from the task given
    User expresses surprise
    User expresses some negative affect or says something is a problem
    User makes a design suggestion
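
One way to make these criteria operational during a session, sketched below under the assumption of a simple observer log (the data structure and the example entries are illustrative, not part of the original study):

```python
# Sketch: tallying observed breakdowns against the predefined criteria.
# The criteria mirror the list above; the example log entries are invented.
from collections import Counter

CRITERIA = [
    "goal not attained within 2 minutes",
    "user explicitly gives up",
    "three or more actions tried before a solution",
    "result differs from the task given",
    "user expresses surprise",
    "user expresses negative affect / names a problem",
    "user makes a design suggestion",
]

def tally(observations):
    """Count how often each criterion was triggered.

    `observations` is a list of (task, criterion) pairs logged by the
    observer during testing.
    """
    counts = Counter(criterion for _, criterion in observations)
    return {c: counts.get(c, 0) for c in CRITERIA}

log = [
    ("Leave a direct message", "three or more actions tried before a solution"),
    ("Leave a direct message", "user expresses negative affect / names a problem"),
    ("Create a new event", "user expresses surprise"),
]
print(tally(log))
```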
  • KLM: Keystroke Level Model
    KLM Operators:
    K - Press a key or button
    P - Point to a target on the display
    H - Home hands on the input device
    D - Draw a line segment
    M - Mentally prepare for an action
    R - System response time (system-dependent)
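
To show how KLM turns an action sequence into a time estimate, here is a small sketch. The operator durations are the commonly quoted textbook defaults (for example K ≈ 0.2 s for an average typist, M ≈ 1.35 s), not measurements from this evaluation, and R must be supplied per system.

```python
# Sketch: Keystroke-Level Model estimate for one action sequence.
# Operator times are the commonly quoted defaults (Card, Moran & Newell);
# they are assumptions, not Home-Mess System measurements.

OPERATOR_TIMES = {
    "K": 0.20,  # press a key or button (average skilled typist)
    "P": 1.10,  # point to a target on the display
    "H": 0.40,  # home hands on the input device
    "M": 1.35,  # mentally prepare for an action
    # "D" (draw) and "R" (system response) vary per case, so R is
    # supplied explicitly below.
}

def klm_estimate(sequence, response_times=None):
    """Sum operator times for a space-separated sequence such as 'M P K'."""
    pending_responses = list(response_times or [])
    total = 0.0
    for op in sequence.split():
        if op == "R":
            total += pending_responses.pop(0)  # measured response time
        else:
            total += OPERATOR_TIMES[op]
    return total

# Hypothetical sequence: think, point, click, home to keyboard, type five
# characters, think, point, click, then wait for the system (0.5 s).
print(klm_estimate("M P K H K K K K K M P K R", response_times=[0.5]))
```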
  • User Testing
  • Think-aloud protocol: Approach
    Set task scenarios for user testing
    Recruit prospective users
    Record each session of testing
    Observe and analyse the data
    Follow up with a cooperative evaluation questionnaire
  • Task Scenario
    “Create a new task”
    To ascertain the path that the user would follow and their understanding of the system layout
    “Leave a direct message”
    To test the user’s intuition when no clear path has been defined.
    “Create a new event”
    “Check for reminders”
  • Users Tested
    "Anything that can go wrong will go wrong” - Murphy’s Law
    Measures of usability should cover:
     
    • Effectiveness (the ability of users to complete tasks using the system, and the quality of the output of those tasks)
    • Efficiency (the level of resource consumed in performing tasks)
    • Satisfaction (users’ subjective reactions to using the system)
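
A rough sketch of how those three measures could be computed from per-session records follows; the record layout and the example numbers are assumptions for illustration, not results from this study.

```python
# Sketch: effectiveness, efficiency and satisfaction from session records.
# Field layout and example data are invented.

sessions = [
    # (task, completed?, time_on_task_seconds, satisfaction_rating_1_to_5)
    ("Create a new task",      True,  95,  4),
    ("Leave a direct message", False, 180, 2),
    ("Create a new event",     True,  120, 4),
    ("Check for reminders",    True,  40,  5),
]

completed = [s for s in sessions if s[1]]

effectiveness = len(completed) / len(sessions)               # completion rate
efficiency = sum(s[2] for s in completed) / len(completed)   # mean time on completed tasks
satisfaction = sum(s[3] for s in sessions) / len(sessions)   # mean subjective rating

print(f"effectiveness: {effectiveness:.0%}")
print(f"efficiency (mean time on task): {efficiency:.0f} s")
print(f"satisfaction (mean rating out of 5): {satisfaction:.1f}")
```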
  • Scales: Usefulness and Ease of Use (5-point rating)
    5 - Very Easy
    4 - Easy
    3 - Somewhat Easy
    2 - Somewhat Difficult
    1 - Difficult
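
Assuming the 5-to-1 mapping reconstructed above, questionnaire answers can be converted to numeric scores and averaged; the responses below are invented examples.

```python
# Sketch: converting ease-of-use answers on the 5-point scale to numbers.
SCALE = {
    "Very Easy": 5,
    "Easy": 4,
    "Somewhat Easy": 3,
    "Somewhat Difficult": 2,
    "Difficult": 1,
}

responses = ["Easy", "Very Easy", "Somewhat Difficult", "Easy"]  # invented
scores = [SCALE[r] for r in responses]
print(f"mean ease-of-use score: {sum(scores) / len(scores):.1f} / 5")
```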
  • Questionnaire
  • User Quotes
    “Good white space – links are obvious – clearly labeled – browsing divided very nicely – good subcategories.”
    “What is ‘camera’ icon for? It was the first choice I noticed.”
    “I think the designers have done well”
    “I don’t know which button to click with the options present in more than one place on the main screen.”
  • Heuristic Evaluation
  • Severity Rating Scale
  • Severity ratings
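
The rating scale itself appears on the slide as an image; as a hedged sketch, assuming the 0-4 severity scale commonly used in heuristic evaluation (0 = not a problem, 4 = usability catastrophe), problems can be prioritised by averaging the ratings given by independent evaluators. The problem names and ratings below are invented for illustration.

```python
# Sketch: averaging severity ratings from several evaluators per problem.
# Assumes the usual 0-4 severity scale; the data is invented.

ratings = {
    "ambiguous camera icon":            [3, 4, 3],
    "duplicate options on main screen": [2, 3, 2],
    "no clear path to direct messages": [4, 3, 4],
}

prioritised = sorted(
    ((sum(r) / len(r), problem) for problem, r in ratings.items()),
    reverse=True,
)
for average, problem in prioritised:
    print(f"{average:.1f}  {problem}")
```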
  • Thematic Problems Identified
  • Compiled Evaluation Analysis
  • Implementing Changes
  • Conclusion
    • Evaluation reveals flaws in the system
    • Triangulated across three methods: Cognitive Walkthrough, User Testing, and Heuristic Evaluation
    • Compiled the data and analyzed the severity of each problem
    • Derived possible design and functionality solutions for the next phase of prototype development
  • Any Questions?
    See You with our REDESIGNED prototype