The document summarizes Lecture 3 of the Human-Computer Interaction Course 2014 given by Lora Aroyo. It discusses interaction design concepts like design principles, affordances, constraints, mappings, feedback and visibility. It also outlines four psychological principles of user interaction and how they can be applied in design. Specific concepts like consistency, affordances, mappings, feedback and cultural associations are explained in detail along with examples. Design guidelines, standards and principles for optimizing the user experience are also presented.
2. Human-Computer Interaction Course 2014: Lecture 3
Interaction Design Concepts
• Design principles
• Affordance, Constraints, and Mapping
• Feedback
• Visibility
• Conceptual models
• Other factors:
– transfer effects
– cultural associations
– individual differences
Lora Aroyo, Web & Media Group 2
Four Psychological Principles
1. Users See What They Expect to See
2. Users Have Difficulty Focusing on More than One Activity at a Time
3. It Is Easier to Perceive a Structured Layout
4. It Is Easier to Recognize Something than to Recall It
Principle 1
Users See What They Expect to See
– avoid confusing users with the UI: maintain consistency and exploit prior knowledge
– users see what they want to see
– people don’t read
Principle 2
Users Have Difficulty Focusing on More Than One Activity at a Time
– The Cocktail Party Effect
• Principle of Perceptual Organization
– Group alike things together
• Principle of Importance
– Prominent display for important items
Principle 3
It Is Easier to Perceive a Structured Layout
• Law of proximity
• Law of similarity
• Law of closure
• Law of continuity
• Law of symmetry
Principle 4
It Is Easier to Recognize Something Than to Recall It
• Principle of recognition
• Knowledge in the head & knowledge in the world
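As a minimal illustration of recognition over recall, an interface can surface matching options as the user types instead of requiring the exact name from memory. The command list and `suggest` helper below are hypothetical, not from the lecture:

```python
COMMANDS = ["open", "open-recent", "save", "save-as", "print"]  # hypothetical menu

def suggest(prefix: str) -> list[str]:
    """Offer completions so the user can recognize the command, not recall it."""
    return [c for c in COMMANDS if c.startswith(prefix)]

print(suggest("sa"))  # ['save', 'save-as']
```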
Principles from Experience: Affordance
The Principle of Affordance:
– It Should Be Obvious How a Control Is Used
Affordances
• The perceived and actual fundamental properties of the object that determine how it could possibly be used (Gibson 1977)
• Some affordances are obvious, some learned
• They carry suggestions or clues about how to use these properties
• Can depend on the experience, knowledge, and culture of the actor
• Can make an action easy or difficult
Affordances in Screen-based UI
(based on a slide by Saul Greenberg)
In graphical, screen-based interfaces:
– the designer has control over perceived affordances
• display screen
• pointing device
• selection buttons
• keyboard
– these afford touching, pointing, looking, and clicking on every pixel of the display
Constraints
Constraints
• Restricting interaction to reduce errors
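A constraint can be sketched in code as an interface that only offers legal choices, so the error cannot be made in the first place. The size menu below is an invented example, not from the slides:

```python
SIZES = ("small", "medium", "large")  # hypothetical fixed menu

def choose_size(index: int) -> str:
    """Selecting from a closed menu constrains interaction: no free-text typos."""
    if not 0 <= index < len(SIZES):
        raise ValueError(f"choose an index between 0 and {len(SIZES) - 1}")
    return SIZES[index]

print(choose_size(1))  # medium
```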
Mappings
• Relationships between controls and their results
• For devices, appliances
– natural mappings use constraints and correspondences in the physical world
• Controls on a stove
• Controls on a car
– Radio volume
» Knob goes left to right to control volume
» Should also go in and out for front-to-rear speakers
• For computer UI design
– mapping between controls and their actions on the computer
• Controls on a digital watch
• Controls on a word processor program
Mapping Controls to Physical Outcomes
(slide adapted from Saul Greenberg; four stove knobs controlling the back-left, back-right, front-left, and front-right burners)
• arbitrary full mapping – 24 possibilities; requires visible labels and memory
• paired mapping – 2 possibilities per side = 4 total possibilities
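The counts on this slide follow from simple combinatorics: an arbitrary assignment of four knobs to four burners allows 4! orderings, while pairing each side's two knobs with its two burners leaves only two orderings per side. A quick check:

```python
import math

# Arbitrary full mapping: any of the 4 knobs may control any of the 4 burners.
arbitrary = math.factorial(4)  # 4! = 24, hence the need for labels and memory
# Paired mapping: 2 possible orderings per side, 2 sides.
paired = 2 * 2

print(arbitrary, paired)  # 24 4
```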
Transfer Effects
(based on a slide by Saul Greenberg)
People transfer their expectations from familiar objects to similar new ones:
– positive transfer: previous experience applies to the new situation
– negative transfer: previous experience conflicts with the new situation
Visibility
• Making it obvious which actions are available
Consistency
• Similar functions are performed in the same way
• Identical terminology for identical operations
Feedback
• Send information about what is happening back to the user
Feedback
The Principle of Feedback:
– It Should Be Obvious When a Control Has Been Used
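The feedback principle can be sketched as a control that immediately reports its new state after every use. The `Lamp` class is illustrative only, not the lecture's example:

```python
class Lamp:
    """A toggle that always tells the user what just happened."""

    def __init__(self) -> None:
        self.on = False

    def toggle(self) -> str:
        self.on = not self.on
        # Feedback: make it obvious the control has been used.
        return f"lamp is now {'ON' if self.on else 'OFF'}"

lamp = Lamp()
print(lamp.toggle())  # lamp is now ON
print(lamp.toggle())  # lamp is now OFF
```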
Visibility (perceived affordance)
The Principle of Visibility:
– It Should Be Obvious What a Control Is Used For
Consistency
Uniformity in appearance, placement, and behavior
Cultural Associations
• Groups of people learn idioms
– red = danger, green = go
• But these differ in different places
– Light switches
• America: down is off
• Britain: down is on
– Faucets
• America: counter-clockwise is on
• Britain: counter-clockwise is off
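One way to handle such idioms in software is to treat them as per-region data rather than hard-coded assumptions. The values below come from the slide; the lookup structure itself is an invented sketch:

```python
# Idioms per region (values from the slide; the table layout is an assumption).
LIGHT_SWITCH_DOWN = {"America": "off", "Britain": "on"}
FAUCET_COUNTER_CLOCKWISE = {"America": "on", "Britain": "off"}

def switch_down_means(region: str) -> str:
    """Resolve the idiom per region instead of assuming one culture's habit."""
    return LIGHT_SWITCH_DOWN[region]

print(switch_down_means("Britain"))  # on
```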
What’s Wrong?
• Mapping – we’d expect the switch to go off, low, high
• Feedback – when the lamp is on, it is hard to tell from the switch position whether it is in low or high mode
30. Human-Computer Interaction Course 2014: Lecture 3
What’s Wrong?
CONSISTENCY: Different procedures for setting different intervals of time.
FEEDBACK: When the timer is set to under 15 minutes, it is hard to tell whether it is actually on (silent failure).
34. Standards
• ISO 9241: Ergonomic requirements for office work with visual display terminals (VDTs)
– defines usability as the effectiveness, efficiency and satisfaction with which users accomplish tasks
• ISO 14915: Software ergonomics for multimedia user interfaces
– guidelines for the design of multimedia interfaces
• ISO 13407: Human-centered design processes for interactive systems
– management guidance through the development life cycle
• ISO/CD 20282: Ease of operation of everyday products
– a four-part standard to ensure products can be used as consumers expect
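ISO 9241's three usability components can be operationalised per task: effectiveness as completion rate, efficiency as completed work per unit time, and satisfaction as a mean questionnaire score. A minimal sketch in Python (the session fields and the specific formulas are illustrative assumptions, not mandated by the standard):

```python
from dataclasses import dataclass

@dataclass
class TaskSession:
    """One user's attempt at a task (illustrative fields)."""
    completed: bool
    seconds: float      # time on task
    satisfaction: int   # e.g. a 1-5 post-task rating

def usability_metrics(sessions: list[TaskSession]) -> dict[str, float]:
    """Operationalise ISO 9241's effectiveness / efficiency / satisfaction."""
    n = len(sessions)
    done = [s for s in sessions if s.completed]
    effectiveness = len(done) / n                 # task completion rate
    # Efficiency: completed tasks per minute of total time spent
    total_minutes = sum(s.seconds for s in sessions) / 60
    efficiency = len(done) / total_minutes if total_minutes else 0.0
    satisfaction = sum(s.satisfaction for s in sessions) / n
    return {"effectiveness": effectiveness,
            "efficiency": efficiency,
            "satisfaction": satisfaction}

sessions = [TaskSession(True, 90, 4), TaskSession(True, 120, 5),
            TaskSession(False, 200, 2)]
print(usability_metrics(sessions))
```

The point of the sketch is that the ISO definition only becomes testable once each component is tied to a concrete measure; which measures are appropriate depends on the product and tasks.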
35. Guidelines (1/2)
• For optimizing the user experience
– abstract guidelines (principles) applicable during early life-cycle activities
– detailed guidelines (style guides) applicable during later life-cycle activities
• http://www.usability.gov/pdfs/guidelines.html
36. Guidelines (2/2)
• Accessibility
• Page Layout
– Navigation, Scrolling and Paging, Headings, Titles, and Labels
• Content Organization
– Text Appearance
– Lists
– Links
– Screen–Based Controls (Widgets)
– Graphics, Images, and Multimedia
• Search
45. Tolerance
Help the user avoid mistakes, and recover from those that do occur
– Prevention
– Recoverability
• forward error recovery – the system accepts the error and helps the user accomplish their goal
• backward error recovery – undo the effects of the previous interaction
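Backward error recovery is typically implemented by keeping enough history to undo the previous interaction. A minimal sketch in Python using state snapshots (the editor class and its methods are illustrative assumptions):

```python
class UndoableEditor:
    """Text buffer supporting backward error recovery via an undo stack."""

    def __init__(self) -> None:
        self.text = ""
        self._history: list[str] = []   # snapshots of previous states

    def insert(self, s: str) -> None:
        self._history.append(self.text)  # save state before the change
        self.text += s

    def undo(self) -> None:
        """Undo the effects of the previous interaction (backward recovery)."""
        if self._history:
            self.text = self._history.pop()

editor = UndoableEditor()
editor.insert("Hello")
editor.insert(" wrold")   # a slip the user wants to take back
editor.undo()
print(editor.text)        # → Hello
```

Real systems usually store commands with inverse operations rather than full snapshots, but the recoverability principle is the same: no interaction should be irreversible by accident.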
47. Location on the screen
• Mind the typical ad locations
• Use typical locations
48. W3C Accessibility Guidelines
W3C Web Content Accessibility Guidelines
http://www.w3.org/TR/WAI-WEBCONTENT/
1. Provide alternatives to auditory and visual content
2. Don’t rely on color alone
3. Use markup and style sheets properly
4. Clarify natural language usage
• abbreviations and foreign-language text
5. Create tables that transform gracefully
6. Ensure that pages featuring new technologies transform gracefully
• remain accessible when newer technologies are not supported
49. W3C Accessibility Guidelines (cont.)
7. Ensure user control of time-sensitive content
– pausing/stopping of animation, scrolling, etc.
8. Ensure direct accessibility of embedded UI
9. Design for device independence
– various input devices
10. Use interim solutions (for older browsers to function)
11. Use W3C technologies and guidelines
12. Provide context and orientation information
13. Provide clear navigation mechanisms
14. Ensure that documents are clear and simple
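Several of these guidelines can be checked mechanically. For example, guideline 1 (provide alternatives to auditory and visual content) implies every image needs alternative text. A minimal sketch using Python's standard html.parser module (the class name and the simple non-empty-alt heuristic are assumptions of this sketch, not part of the W3C guidelines):

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flags <img> tags that lack a (non-empty) alt attribute."""

    def __init__(self) -> None:
        super().__init__()
        self.missing: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            d = dict(attrs)
            if not d.get("alt"):
                # record which image is missing its text alternative
                self.missing.append(d.get("src", "?"))

checker = AltTextChecker()
checker.feed('<img src="logo.png" alt="Company logo"><img src="chart.png">')
print(checker.missing)  # → ['chart.png']
```

Automated checks like this catch only the mechanical part; whether the alt text actually conveys the image's meaning still requires human review.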
50. Style Guides
• A typical guide includes:
– description of required interaction styles & user interface controls
– guidance on when and how to use the various styles or controls
– illustrations of styles and controls
– screen templates
51. Example Style Guides
• Apple Human Interface Guidelines
– http://developer.apple.com/DOCUMENTATION/UserExperience/Conceptual/AppleHIGuidelines/
• Microsoft Windows XP UI Guidelines
– http://www.microsoft.com/whdc/Resources/windowsxp/default.mspx
• IBM’s Common User Access
– http://en.wikipedia.org/wiki/Common_User_Access
• Motif Style Guide
– http://www.opengroup.org/motif/motif.data.sheet.htm
• Sun Microsystems’ Java Look and Feel
– http://java.sun.com/products/jlf/ed2/book/HIGTitle.html
53. Design Rationale
• Design rationale is information that explains why a system is the way it is
• Benefits of design rationale
– supports communication throughout the life cycle
– enables reuse of design knowledge across products
– enforces design discipline
– presents arguments for design trade-offs
– organizes a potentially large design space
– captures contextual information
• Process-oriented rationale
– preserves the order of deliberation and decision making
• Structure-oriented rationale
– emphasizes post hoc structuring of the design alternatives considered
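Structure-oriented rationale is often captured in a Questions-Options-Criteria (QOC) style notation: each design question lists the options considered and how each fares against the criteria. A minimal sketch of such a record in Python (the field names and the boolean criterion encoding are illustrative assumptions of this sketch):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Option:
    """One design alternative, assessed against named criteria."""
    name: str
    # criterion name -> whether this option satisfies it (illustrative encoding)
    assessments: dict = field(default_factory=dict)

@dataclass
class Question:
    """One design question with the alternatives that were considered."""
    text: str
    options: list = field(default_factory=list)
    chosen: Optional[str] = None

q = Question("How should the user switch the lamp between low and high?")
q.options.append(Option("rotary switch", {"visible state": True, "cheap": False}))
q.options.append(Option("two labelled buttons", {"visible state": True, "cheap": True}))
q.chosen = "two labelled buttons"

# The record preserves rejected alternatives and the criteria behind the choice
for opt in q.options:
    print(opt.name, opt.assessments)
```

Keeping the rejected options in the record is what makes the rationale reusable: a later redesign can see not just what was chosen but what was ruled out and why.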