The document provides an introduction to human-computer interaction (HCI). It discusses key topics in HCI including the relationship between humans and computers, components of HCI involving humans, computers and their interaction, and models of interaction. The document outlines several learning outcomes related to understanding fundamental HCI concepts such as user interface design, interaction styles, and how technology influences interfaces. It also covers input and output devices, human memory and capabilities, and the importance of considering human factors in design.
Human-computer interaction (HCI) is a multidisciplinary field of study focusing on the design of computer technology and, in particular, the interaction between humans (the users) and computers. While initially concerned with computers, HCI has since expanded to cover almost all forms of information technology design.
The document discusses human-computer interaction and interaction models. It describes Norman's execution-evaluation cycle, which outlines 7 stages of interaction: establishing a goal, formulating intention, specifying actions, executing actions, perceiving system state, interpreting state, and evaluating state. The document also discusses Abowd and Beale's interaction framework, which includes the system, user, input language, and output language. It describes how interaction involves translating between these. Finally, the document discusses ergonomic considerations for interaction, like arrangement of controls and the physical environment.
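The seven stages of Norman's execution-evaluation cycle described above can be sketched as a simple data structure; a minimal illustration in Python (the identifier names paraphrase Norman's stage labels and are not an official API):

```python
from enum import Enum

class NormanStage(Enum):
    """The seven stages of Norman's execution-evaluation cycle."""
    ESTABLISH_GOAL = 1    # decide what to achieve
    FORM_INTENTION = 2    # commit to acting toward the goal
    SPECIFY_ACTIONS = 3   # work out the sequence of actions
    EXECUTE_ACTIONS = 4   # perform the actions on the interface
    PERCEIVE_STATE = 5    # observe the system's response
    INTERPRET_STATE = 6   # make sense of what was perceived
    EVALUATE_STATE = 7    # compare the outcome against the goal

def cycle():
    """Yield the stages in the order a user traverses them."""
    yield from NormanStage  # Enum iterates in definition order

print([stage.name for stage in cycle()][:3])
# → ['ESTABLISH_GOAL', 'FORM_INTENTION', 'SPECIFY_ACTIONS']
```

In Norman's model, stages 2-4 sit on the execution side of the cycle and stages 5-7 on the evaluation side, which is where the gulf of execution and gulf of evaluation get their names.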
Human Computer Interaction (HCI) is an interdisciplinary field that focuses on the design, evaluation and implementation of interactive computing systems for human use, and the study of major phenomena surrounding them. The goal of HCI is to improve the interaction between users and computers by making computers more user-friendly and responsive to user needs. Key aspects of HCI include usability testing of interfaces for effectiveness, efficiency and satisfaction. Emerging areas of HCI research include pervasive/ubiquitous computing, which embeds technology in everyday objects, and ambient intelligence, which aims to make technology invisible to users.
PPT based on Human Computer Interface which is easy to understand and to use for presentations at conferences. If you need the documentation, please leave a comment below. Enjoy the PPT, and good luck!
HCI 3e - Ch 14: Communication and collaboration models (Alan Dix)
Chapter 14: Communication and collaboration models
from
Dix, Finlay, Abowd and Beale (2004).
Human-Computer Interaction, third edition.
Prentice Hall. ISBN 0-13-239864-8.
http://www.hcibook.com/e3/
This document introduces human-computer interaction (HCI). It defines HCI as a field that deals with humans, computers, and the interaction between them. The objective of HCI is to design interactive systems that support people in their everyday lives. HCI considers both the user and the computer, where the user can be an individual or group, and the computer encompasses any technology from desktops to embedded systems. Interaction in HCI refers to any communication between the user and computer, whether direct or indirect. The document provides a formal definition of HCI and discusses elements of a successful HCI product.
The goal of this project is to provide a platform that allows for communication between able-bodied and disabled people, or between computers and human beings. There has been great emphasis in Human-Computer-Interaction research on creating easy-to-use interfaces by directly employing the natural communication and manipulation skills of humans. As gestures are made with an important part of the body, recognizing hand gestures is very important for Human-Computer-Interaction. In recent years, there has been a tremendous amount of research on hand gesture recognition.
"Human-Computer Interaction is a discipline concerned with the design, evaluation and implementation of interactive computing systems for human use and with the study of major phenomena surrounding them." (ACM/IEEE)
Chapter 4: Paradigms
from
Dix, Finlay, Abowd and Beale (2004).
Human-Computer Interaction, third edition.
Prentice Hall. ISBN 0-13-239864-8.
http://www.hcibook.com/e3/
Chapter 12: Cognitive models
from
Dix, Finlay, Abowd and Beale (2004).
Human-Computer Interaction, third edition.
Prentice Hall. ISBN 0-13-239864-8.
http://www.hcibook.com/e3/
This document discusses human-computer interaction (HCI). It defines HCI as the study of how humans interact with computer systems. The history and evolution of HCI is covered, from its origins in the 1970s-1990s, when it focused on desktop usability, to the modern fields of user experience (UX) design, human-robot interaction, and human data interaction. Key differences between HCI as a field of study and UX as an application of HCI theory are outlined. Finally, potential career paths for HCI graduates, such as user researcher, product designer, and interface engineer, are presented.
The document discusses various input and output devices used in computer systems. It describes keyboards, mice, touchscreens, displays, printers and scanners. It explains how these devices work and how they allow interaction with computers. Different interaction techniques are suitable depending on the devices used, such as direct interaction with touchscreens versus indirect interaction with mice.
Human Computer Interaction (HCI) is the study of how humans interact with computers and how to design interfaces so that users can interact with systems effectively, efficiently and with satisfaction. HCI aims to make computers more usable by understanding users and designing appropriate input/output devices and interaction styles. The goals of HCI include improving safety, utility, effectiveness and efficiency of computer systems to benefit both users and organizations.
The document discusses various topics related to interaction design basics, including goals and constraints of design, understanding users through personas and scenarios, prototyping and iteration, navigation design, screen design principles, and more. It emphasizes the importance of a user-centered design approach and provides examples and guidelines to help design intuitive interactions.
The document discusses key aspects of human-computer interaction (HCI), including its importance, elements, interaction styles, input and output devices, and eye tracking techniques. HCI aims to design human-centered systems by understanding users' visual, intellectual, motor, and memory capabilities. Serious HCI research promises to fundamentally change computing by creating excellent user interfaces. Understanding users and conducting evaluations are important for practitioners. Common interaction styles include command lines, menus, and WIMP interfaces. Input devices include keyboards, while outputs include displays; humans interact visually, auditorily, and through touch. Various eye tracking methods, such as electrooculography and video-based techniques, aim to measure gaze. HCI is an interdisciplinary field.
The document provides an introduction to human-computer interaction (HCI). It defines HCI as the study of the interaction between humans and computers, including the design and evaluation of interactive systems. The document discusses why HCI is important, focusing on creating usable, intuitive systems. It also outlines some of the historical roots of HCI in fields like computer graphics, operating systems, and cognitive psychology. Finally, it discusses potential future developments in HCI, such as ubiquitous computing, mixed media interfaces, and more natural human-computer interaction.
This document provides an overview of human-computer interaction (HCI) from the perspective of a student group consisting of Buwenaka, Piyumika, Thilan, Sachith, and Nuwan. It defines HCI as the discipline concerned with designing, evaluating, and implementing interactive computing systems for human use. The document discusses key aspects of HCI like the importance of understanding how humans and computers interact, defining user interfaces, principles of HCI design, the history and importance of HCI, and different types of user interfaces.
The document discusses models of interaction between users and computer systems. It describes Norman's seven-stage model of interaction which focuses on the user's perspective when interacting with an interface. It also discusses Abowd and Beale's framework which identifies the major components involved in interaction, including user input and system output. Different styles of interaction are examined, such as command line interfaces, menus, and WIMP interfaces.
HCI 3e - Ch 16: Dialogue notations and design (Alan Dix)
Chapter 16: Dialogue notations and design
from
Dix, Finlay, Abowd and Beale (2004).
Human-Computer Interaction, third edition.
Prentice Hall. ISBN 0-13-239864-8.
http://www.hcibook.com/e3/
The document discusses human-computer interaction in the software engineering process. It describes the typical lifecycle of software development, including requirements specification, design, implementation, testing, and maintenance. For interactive systems, a linear waterfall model is not suitable due to the need for extensive user testing and feedback. Usability engineering aims to make usability measurable by specifying requirements. Iterative design and prototyping help overcome incomplete requirements through simulations and prototypes to gather user feedback. Design rationale records the reasons for design decisions to aid communication, reuse of knowledge, and evaluation of tradeoffs.
HCI 3e - Ch 13: Socio-organizational issues and stakeholder requirements (Alan Dix)
Chapter 13: Socio-organizational issues and stakeholder requirements
from
Dix, Finlay, Abowd and Beale (2004).
Human-Computer Interaction, third edition.
Prentice Hall. ISBN 0-13-239864-8.
http://www.hcibook.com/e3/
Human Computer Interaction Notes 176.pdf (vijaykumarK44)
1. Human-computer interaction (HCI) studies the design and use of computer technology and aims to ensure interactions between humans and computers are as usable and understandable as possible.
2. A key goal of HCI is to minimize the barriers between what users want to accomplish and the computer's understanding of the task.
3. Early computer interfaces involved command-line text input which was difficult for many users. Graphical user interfaces using icons, windows, and pointing devices like mice revolutionized human-computer interaction by making interactions more intuitive and direct.
I made this with my 3 partners for my CEC marks in the 3rd semester of MCA. It includes information about HCI: its definition, types, how it works, common queries, etc.
One can easily get an idea about HCI after referring to this presentation.
This document provides an overview of human-computer interaction (HCI) as an academic discipline and design field. It discusses what students will learn, including understanding systems and humans through analysis, and applying that understanding to design solutions with a focus on real users. It outlines topic areas like design processes, underlying theories of human cognition, and specific domains. It also explores the roots of HCI in fields like psychology and computing. Finally, it discusses changes in the field with increasing device multiplicity, ubiquitous and wearable technologies, and a shift from computer dialogue to dialogue with the world.
This document provides an overview of human information processing and cognition. It discusses how humans receive and interpret visual and auditory information. It describes short-term and long-term memory, including different memory models. It also covers topics like problem solving, reasoning, decision making, and how emotion can influence cognitive abilities.
Ubiquitous computing (ubicomp) involves integrating computation into everyday objects and environments. It aims to make many computers available throughout the physical world and make them effectively invisible to the user. Ubicomp enhances computer use by bringing computing capabilities to any device or location. Key aspects of ubicomp include ubiquity, adaptation to the environment, and intuitive interfaces. Ubicomp raises issues around privacy, adaptability to different contexts, and availability in various locations. It involves context-aware computing that tailors services to a user's location, activities, and environment.
Human-computer interaction is a multidisciplinary field that focuses on designing computer technology for interaction between humans and computers. The document discusses the history and future of human-computer interaction. It notes that early interactions were complex due to the size of computers, cost, and lack of intention for mass usage. Developments in areas like microprocessors, virtual reality, robotics, smart assistants, and smart gadgets have led to more intuitive present-day interactions. The future of human-computer interaction is expected to be powered by advancements in artificial intelligence.
Chapter 7: Design rules
from
Dix, Finlay, Abowd and Beale (2004).
Human-Computer Interaction, third edition.
Prentice Hall. ISBN 0-13-239864-8.
http://www.hcibook.com/e3/
This document provides an overview of human-computer interaction (HCI). It begins with early computing in 1945, which involved large specialized machines. As computers developed, they became smaller, cheaper, and more widely used. HCI emerged as a field to study the interaction between humans and computers. Key aspects of HCI include understanding human abilities and limitations as well as the computer system components that enable interaction such as input devices, output displays, and memory. The document explores various interaction paradigms that have developed over time including command lines, menus, natural language interfaces, and graphical user interfaces. It provides examples of how interaction involves both the human and computer systems working together.
Blue Eyes technology aims to create machines that have human-like perceptual and sensory abilities. It uses cameras and microphones to identify user actions and emotions. The technology is being developed by researchers at Poznan University of Technology and Microsoft to build machines that can understand emotions, listen, talk, verify identity, and interact naturally with humans. Some applications include using eye tracking to improve pointing and selection, speech recognition to control devices with voice commands, and monitoring user focus and interests to provide relevant information on screens.
The document discusses various types of human-computer interfaces. It describes interfaces such as command line interfaces, menu driven interfaces, and graphical user interfaces. It outlines the advantages and disadvantages of each type. The document also discusses other interfaces including natural language interfaces, virtual reality interfaces, and interfaces that can help disabled users interact with computers.
The document discusses universal design and emerging technologies in interface design. It defines universal design as designing systems to be used by anyone in any circumstance. It describes seven principles of universal design for interactive systems including equitable use, flexibility in use, and perceptible information. It also discusses multimodal technology, accessibility features like narrators, and emerging technologies like wearable computing and their impact.
Pen-based systems use a pen or stylus for inputting data by writing on a special pad or directly on the screen. They are commonly used for collecting data or inputting signatures. Touch-screen systems accept input directly through the monitor by touching options with a finger, and are well-suited for simple applications like ATMs or kiosks. Alternative input devices also include game controllers, scanners, microphones, webcams and digital cameras which provide specialized input for tasks like gaming, document scanning, audio/video recording and photography.
This document discusses different methods of input for information technology including voice input, touch screens, pen input, and video input. Voice input allows a user to control a computer through speech by speaking into a microphone. Touch screens are sensitive displays that allow direct input through touch gestures. Pen input utilizes a stylus to write or draw directly on the screen. Video input is the process of capturing video and images and storing them on a computer through devices like webcams.
This document discusses touch screen technology. It provides a brief history, describing the development of early touch sensors in the 1970s and the growing popularity and use of touch screens. It then describes the main touch screen technologies - resistive, capacitive, and interruptive - and explains the basic components of a touch screen system, including the touch sensor, controller, and software driver. Finally, it outlines some key advantages of touch screen technology, such as its usefulness for public displays, retail/restaurant systems, customer self-service, control systems, computer-based training, and assistive technology applications.
This document discusses various aspects of human-computer interaction (HCI) and user interface design. It begins by defining HCI and its goals of making systems useful, usable and satisfying to users. It then discusses why good UI design is important, covering both explicit and implicit forms of interaction. The document outlines challenges in areas like ubiquitous access and personalized spaces. It analyzes interfaces for different devices like PCs, mobile phones, games consoles and remote controls. It also covers multimodal interaction, gestures, wearable and implanted devices. Finally, it briefly introduces the human-centered design process.
The document discusses various input and output devices used in computer systems. It describes keyboards, mice, touchscreens, and other pointing devices used for input. It also covers display technologies like CRT and LCD screens, as well as emerging technologies like digital paper. The document explores how these devices enable different styles of interaction and discusses some of the technical considerations around devices like resolution, color depth, and health concerns related to older display technologies.
Chapter 2: The computer
from
Dix, Finlay, Abowd and Beale (2004).
Human-Computer Interaction, third edition.
Prentice Hall. ISBN 0-13-239864-8.
http://www.hcibook.com/e3/
2. LEARNING OUTCOME
Relate the interaction between human and computer
Identify the fundamental components of HCI
Describe the importance of user interface design
Discuss human interaction usage
Explain how computer technology influences the
nature of interaction and the style of the interface
Describe the various interface styles
Explain the role of ergonomics in interface design
Describe models of interaction
3. 1.1 RELATE THE INTERACTION
BETWEEN HUMAN AND COMPUTER
“Human-computer interaction is a discipline concerned with
the design, evaluation and implementation of interactive
computing systems for human use and with the study of major
phenomena surrounding them.”
[ ACM SIGCHI Curricula for Human-Computer Interaction ]
WHAT IS HCI?
6. WHY IS HCI IMPORTANT?
The study of our interface with information.
It is not just ‘how big should I make buttons’ or ‘how to
lay out menu choices’
HCI can assist in building products/systems that are
Useful, accomplish what’s required
Usable, do it easily and naturally
Used, make people want to use them
It can affect
Effectiveness
Productivity
Morale
Safety
7. WHY IS HCI IMPORTANT?
Increasing participation
Ensuring interfaces and systems are accessible.
International Directives and Standards (EC Directive
90/270/EEC; ISO9241) place requirements on systems in
terms of usability
Safety and Security
improve productivity of individuals and organizations:
cost reduction, improved support, organizational
enhancement
human responses: satisfaction, reduced machine-induced stress
organization: quality and initiative, flexibility
8. brainstorming
Take 5 minutes for everyone to write down one common
device with substantial HCI design choices, then discuss
the pros and cons with a partner. How does it affect you or
other users?
9. My Choice
iPod by Apple Computers
Pros:
portable
power
ease of use
# of controls
Cons:
scratches easily
no speech for car use
proprietary
11. Human (User)
Humans are limited in their capacity to process information. This
has important implications for design.
Information is received and responses given via a number of
input and output channels :
Visual Channel - relating to seeing or sight
Auditory Channel - relating to the sense of hearing
Haptic Channel - relating to the sense of touch
Movement
Information is stored in three types of memory:
Sensory memory
ultra-short-term memory
lasts roughly 1/5 to 1/2 a second
Short-term (working) memory
lasts 10 – 15 seconds; holds a small amount of information, typically around 7 items or fewer
Long-term memory
12. brainstorming
Which line is longer?
Brainstorm: which one is longer, A or B? Explain
your answer.
13. Computer
In terms of basic structure, there is little difference
between a human and a computer.
A computer consists of:
Input Devices
Output Devices
Memory
Processing
A computer can be:
a mobile phone,
a spacecraft cockpit,
a microwave oven,
a VCR, etc.
“HCI is about how to allow humans and computers to interact
toward some common goal”
Humans and computers are similar in that they both :
Receive input
Produce output
Process information in between
(Diagram: input -> human -> output; input -> computer -> output)
14. Interaction
Communication between the user and the system
Physical Interaction – Interaction devices
Conceptual Interaction – Interaction styles
Interaction Framework
15. The Importance Of User
Interface Design
The ONLY contact medium that the user has with
the system
The interface is the system designer’s way of
representing the system to the user; known as
conceptual model
If the system has a confusing interface – the user
may choose not to use the system at all OR will use
it incorrectly
A well-designed interface can increase
productivity
16. The Model Human
Processor
Consists of 3 interacting systems (each
with its own memory & processor)
I. Perceptual processor
Outputs into audio storage
Outputs into visual storage
II. Cognitive processor
Outputs into working memory
Has access to :
Working memory (short-term
memory)
Long term memory
III. Motor processor
Carries out actions
17. Human Capabilities &
Limitations
Understanding humans requires knowledge from many fields
Human information processing can be modelled
Human physiology plays an important role in designing systems
Vision
Eye tracking, eyes can be tricked, pre-attentive processing
Gestalt psychology
Hearing
Audibility, pain threshold, spatial hearing
Touch
Input & output
Memory
Sensory, short-term (working), and long-term memory
Short-term memory: 7 ± 2 chunks
Long-term memory: episodic and structural memory
Generating new information: deduction, induction, abduction
20. Keyboards
Most common text input device
Allows rapid entry of text by experienced users
Keypress closes connection, causing a character
code to be sent
Usually connected by cable, but can be wireless
21. layout – QWERTY
Standardised layout
but …
◦ non-alphanumeric keys are placed differently
◦ accented symbols needed for different scripts
◦ minor differences between UK and USA keyboards
QWERTY arrangement not optimal for typing
– layout to prevent typewriters jamming!
Alternative designs allow faster typing but large social
base of QWERTY typists produces reluctance to change.
1 2 3 4 5 6 7 8 9 0
Q W E R T Y U I O P
A S D F G H J K L
Z X C V B N M , .
SPACE
22. Phone Pad and T9 entry
use numeric keys with
multiple presses
2 - a b c   3 - d e f   4 - g h i   5 - j k l
6 - m n o   7 - p q r s   8 - t u v   9 - w x y z
hello = 4433555[pause]555666
surprisingly fast!
T9 predictive entry
◦ type as if single key for each letter
◦ use dictionary to ‘guess’ the right word
◦ hello = 43556 …
◦ but 26 -> menu ‘am’ or ‘an’
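The T9 scheme above can be sketched as a dictionary lookup: map each letter to its key digit, index a word list by the resulting digit string, and offer a menu when several words collide. The word list here is a tiny invented sample, not a real T9 dictionary.

```python
# Minimal T9-style predictive entry sketch (illustrative word list).
KEYS = {'2': 'abc', '3': 'def', '4': 'ghi', '5': 'jkl',
        '6': 'mno', '7': 'pqrs', '8': 'tuv', '9': 'wxyz'}
# Reverse map: letter -> digit
LETTER_TO_DIGIT = {ch: d for d, letters in KEYS.items() for ch in letters}

WORDS = ['hello', 'am', 'an', 'good', 'home']  # sample dictionary

def to_digits(word):
    """Encode a word as the single-press digit sequence T9 uses."""
    return ''.join(LETTER_TO_DIGIT[ch] for ch in word.lower())

def candidates(digits):
    """All dictionary words whose encoding matches the typed digits."""
    return [w for w in WORDS if to_digits(w) == digits]

print(to_digits('hello'))   # 43556
print(candidates('26'))     # ['am', 'an'] -> user must pick from a menu
print(candidates('43556'))  # ['hello']
```

When two words share an encoding (as 'am' and 'an' both map to 26), the phone must fall back to a selection menu, exactly as the slide notes.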
23. Handwriting recognition
Text can be input into the
computer using a pen and a
digitizing tablet
◦ natural interaction
Technical problems:
◦ capturing all useful information
- stroke path, pressure, etc. in a
natural manner
◦ segmenting joined up writing
into individual letters
◦ interpreting individual letters
◦ coping with different styles of
handwriting
Used in PDAs, and tablet
computers …
… leave the keyboard on the
desk!
24. Speech Recognition
Improving rapidly
Most successful when:
◦ single user – after initial training,
the system learns their peculiarities
◦ limited vocabulary systems
Problems with
◦ external noise interfering
◦ imprecision of
pronunciation
◦ large vocabularies
◦ different speakers
25. Numeric keypads
for entering numbers quickly:
◦ calculator, PC keyboard
for telephones
not the same!! calculators put 7 8 9 on the top row,
telephones put 1 2 3 on top
ATM keypads follow the phone layout
27. positioning in 3D space
moving and grasping
seeing 3D (helmets and caves)
Virtual reality and
3D visualization
28. Positioning in 3D space
cockpit and virtual controls
steering wheels, knobs and dials
… just like real!
the 3D mouse
six-degrees of movement: x, y, z +
roll, pitch, yaw
data glove
fibre optics used to detect finger
position
VR helmets
detect head motion and possibly
eye gaze
whole body tracking
accelerometers strapped to limbs
or reflective dots and video
processing
29. 3D Displays
desktop VR
ordinary screen, mouse
or keyboard control
perspective and motion
give 3D effect
seeing in 3D
use stereoscopic vision
VR helmets
screen plus shuttered
specs, etc.
30. VR Headsets
small TV screen for each eye
slightly different angles
3D effect
31. VR Motion Sickness
time delay
move head … lag … display
moves
conflict: head movement vs.
eyes
depth perception
headset gives different stereo
distance
but all focused in same plane
conflict: eye angle vs. focus
conflicting cues => sickness
helps motivate improvements
in technology
32. Simulators and VR Caves
scenes projected on
walls
realistic environment
hydraulic rams!
real controls
other people
33. Physical Devices
Can be divided into:
sound,
touch,
feel,
smell,
environmental and bio-sensing
35. Touch, feel, smell
touch and feeling important
in games … vibration, force
feedback
in simulation … feel of surgical
instruments
called haptic devices
texture, smell, taste
current technology very limited
36. Environment and bio-sensing
Sensors are physical devices
that measure physical quantities.
The same property can be measured with different sensors.
Sensors provide raw information, which can be treated in
various ways, i.e., processed to various levels.
sensors all around us
car courtesy light – small switch on door
ultrasound detectors – security, washbasins
RFID security tags in shops
temperature, weight, location
… and even our own bodies …
iris scanners, body temperature, heart rate, galvanic skin
response, blink rate
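As a sketch of "processing raw sensor information to various levels": a noisy reading stream (the sample values and threshold below are invented for illustration) is first smoothed, then interpreted as a higher-level event.

```python
# Sketch: raising raw sensor readings to higher levels of abstraction.
# Level 0: raw samples -> Level 1: smoothed signal -> Level 2: event.

def moving_average(samples, window=3):
    """Smooth raw readings with a simple sliding-window mean."""
    return [sum(samples[i:i + window]) / window
            for i in range(len(samples) - window + 1)]

def detect_over(smoothed, threshold):
    """Higher-level interpretation: is the (smoothed) quantity too high?"""
    return any(value > threshold for value in smoothed)

raw = [20.1, 20.3, 30.0, 20.2, 20.4, 26.0, 26.2, 26.1]  # one noisy spike
smoothed = moving_average(raw)
print(detect_over(smoothed, threshold=25.0))  # True: the sustained rise fires
```

Smoothing discards the single 30.0 spike (its windows average below 25) but keeps the sustained rise at the end, showing how the same raw data yields different answers at different processing levels.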
37. Example of Physical Device
BMW iDrive
for controlling menus
feel small ‘bumps’ for each
item
makes it easier to select
options by feel
uses haptic technology from
Immersion Corp.
Physical controls
specialist controls needed …
industrial controls, consumer
products, etc.
38. Memory Capacity
Involves re-encoding and remembering
Humans cannot remember everything:
information must be processed and managed
Context is very important in memory
Humans find recognition easier than recall,
which is why GUIs have replaced
instruction-based interfaces
Humans find images easier to remember than
words,
so use icons rather than words
39. Short-term Memory - RAM
Random access memory (RAM)
on silicon chips
100 nano-second access time
usually volatile (lose information if power turned off)
data transferred at around 100 Mbytes/sec
Some non-volatile RAM used to store
basic set-up information
Typical desktop computers:
64 to 256 Mbytes RAM
40. Long-term Memory - disks
magnetic disks
floppy disks store around 1.4 Mbytes
hard disks typically 40 Gbytes to 100s of Gbytes
access time ~10ms, transfer rate 100kbytes/s
optical disks
use lasers to read and sometimes write
more robust than magnetic media
CD-ROM
- same technology as home audio, ~ 650 Mbytes
DVD - for AV applications, or very large files
41. Blurring boundaries
PDAs
often use RAM for their main
memory
Flash-Memory
used in PDAs, cameras etc.
silicon based but persistent
plug-in USB devices for data
transfer
42. Finite Processing Speed
Designers tend to assume fast processors, and
make interfaces more and more complicated
But problems occur, because processing cannot
keep up with all the tasks it needs to do
cursor overshooting because system has buffered keypresses
icon wars - user clicks on icon, nothing happens, clicks on
another, then system responds and windows fly everywhere
Also problems if system is too fast - e.g. help
screens may scroll through text much too rapidly
to be read
44. Moore’s law
computers get faster and faster!
1965 …
Gordon Moore, co-founder of Intel, noticed a pattern
processor speed doubles every 18 months
PC … 1987: 1.5 MHz, 2002: 1.5 GHz
similar pattern for memory
but doubles every 12 months!!
hard disk … 1991: 20Mbyte : 2002: 30 Gbyte
a baby born today could
record all sound and vision;
by age 70, all life’s memories stored in a grain of dust!
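The slide's figures are consistent with the stated doubling rates; a quick arithmetic check (using only the numbers above):

```python
# Check the doubling arithmetic behind the Moore's-law examples above.
def growth(start, years, months_per_doubling):
    """Value implied by doubling every `months_per_doubling` months."""
    n = (years * 12) / months_per_doubling  # number of doublings
    return start * (2 ** n)

# Processor speed: doubling every 18 months, 1987 -> 2002 (15 years)
print(growth(1.5, 15, 18))   # 10 doublings: 1.5 * 1024 = 1536.0 MHz ~ 1.5 GHz

# Hard disk: doubling every 12 months, 1991 -> 2002 (11 years)
print(growth(20.0, 11, 12))  # 11 doublings: 20 * 2048 = 40960.0 Mbytes ~ 40 Gbyte
```

The disk projection (~40 Gbyte) lands in the same order of magnitude as the slide's 30 Gbyte figure, which is all such back-of-envelope doubling estimates claim.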
45. The various interface style
a. Command line interface
b. Menus
c. Natural language
d. WIMP interface: Windows, icon, menus
and pointers.
e. Question/answer and query dialog
f. Form-fills and spreadsheet
g. Point-and-click interfaces
h. 3D interfaces
i. Web navigation
46. a. Command Line Interface
operating systems - DOS/UNIX
short instructions for efficiency/repetition
√ good for experts
× poor for novices
hard to remember, so choose command names carefully
includes keyboard shortcuts and function keys.
simple, quick
developed from Teletypewriters (TTY)
not for complex tasks
useful for operating system tools, compilers, …
relies on recall rather than recognition
Operation        DOS command   Unix program
directory list   dir           ls
display file     type          cat
rename file      rename        mv
search in file   findstr       grep
47. b. Menus
a set of options
no need to remember, only recognize, so options must be
self-explanatory
they can take up space, so we have
pull-down/pop-up/pin-up/cascading/pie menus
Selection by:
numbers, letters, arrow keys, mouse
– or a combination (e.g. mouse plus accelerators)
• Options are often hierarchically grouped
– sensible grouping is needed
• A restricted form of the full WIMP system
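Selection "by numbers" can be sketched as a minimal text menu; the option labels below are invented for illustration. The point is that the user recognizes an option rather than recalling a command name.

```python
# Minimal text-menu sketch: recognition rather than recall.
OPTIONS = ['Open file', 'Save file', 'Print', 'Quit']  # illustrative options

def render(options):
    """Show numbered, self-explanatory choices."""
    return '\n'.join(f'{i}. {label}' for i, label in enumerate(options, 1))

def select(options, keypress):
    """Map a numeric keypress back to the chosen option (or None)."""
    if keypress.isdigit() and 1 <= int(keypress) <= len(options):
        return options[int(keypress) - 1]
    return None

print(render(OPTIONS))
print(select(OPTIONS, '2'))  # Save file
print(select(OPTIONS, '9'))  # None: out of range, a real UI would re-prompt
```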
48. c. Natural Language
Method whereby inputs to and outputs from a computer-based application
are in a conventional spoken language such as English.
Current implementations are tedious and difficult to work with, not as viable
as other interaction methods.
Applications for Natural Language
Speech Input
Hands-free operation
Poor Lighting Situations
Mobile Applications
In the home
Patients and disabled
Speech Output
On-board navigational systems
Two areas of development
Speech recognition
Semantics
Grammar issues
Vague meanings
Contradictory statements
49. d. WIMP Interface
Default interaction style for the majority of interactive computer
systems, especially PCs and desktop machines.
WIMP (Windows, Icons, Menus, Pointers) systems replace
typed commands.
They have the following features:
objects are visible and directly manipulable
rapid, reversible, incremental actions
50. Why are these systems popular?
novices learn quickly
programs tend to have the same look and feel
based on recognition not remembering
provide immediate feedback
contain recognizable widgets, e.g. buttons, dialogs
provide a feeling of control
There are conceptual problems; e.g. the gulfs of execution
and evaluation.
51. Windows
Independent terminals in their own right.
Contain text/graphics and can be moved or
resized
More than one window can be on screen at
once
52. Windows may also be tiled, when they adjoin but do not
overlap each other.
Windows have various things associated with them that
increase their usefulness
Ex:
Scrollbars –allowing the user to move the contents of the window
up and down/from side to side
Title bar – identifying it to the user
Special boxes in the corners – to aid resizing, closing or
maximising the window.
53. Icons
a graphical representation of an object
can be dragged and dropped
Icons embody the idea that different people have different
cognitive styles.
Some users prefer text-oriented views and some prefer
graphics.
An icon is an image, picture or symbol representing a
concept.
Can take many forms
Realistic
Highly stylized
Arbitrary symbol (difficult for the user to interpret)
54. Consider these guidelines when creating or allocating icons:
represent the object or action in a familiar manner
limit the number of icons
make the icon stand out
try 3D icons
make each one clearly visible from the background
ensure icon ‘family’ harmony
group icons appropriately
add information to show use, e.g. tooltips
55. Menus
a list of command buttons
pull-down or pop-up
Presents a choice of operations or services that can be
performed by the system at a given time
The name used for the commands in the menu should be
meaningful and informative.
Main Menu can be visible to the user
Websites use a variety of menu bar locations (top, bottom and
either side of the screen)
Main menu can be hidden and upon request it will pop up onto
the screen
56. Types of menu:
Pop up menu – allow one to examine properties
of particular on screen objects
Pull down menu – dragged down from the title at
the top of the screen, moving the pointer into the
title bar area and pressing the button.
Fall down menu – menu automatically appears
when the mouse pointer enters the title bar
without pressing the button
Pin up menu – “pinned” to the screen
58. Pointers
mouse or similar pointing device
Example: Text pointer and Mouse pointer
The most important component for pointing at and selecting
things such as icons.
Different shapes of cursor are often used to
distinguish modes
ex:
normal cursor (arrow) – change to cross hair when
drawing a line
Watch/hourglass cursor – system busy reading a file
60. e. Question/Answer & Query
A simple mechanism for providing input to an application in a
specific domain
input is constrained - ATM, "wizard" dialogue
good for novices but restricted functionality
next question/action depends on last answer
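The "next question depends on last answer" pattern can be sketched as a branching script; the ATM-style questions and the 200-unit limit below are invented for illustration.

```python
# Wizard-style question/answer dialogue: each answer constrains input
# and decides the next question, as in an ATM or setup wizard.
def atm_wizard(answers):
    """Walk a canned answer sequence through a tiny branching dialogue.

    Returns the list of questions that were actually asked."""
    asked = []
    replies = iter(answers)

    asked.append('Withdraw or balance? (w/b)')
    choice = next(replies)
    if choice == 'w':
        asked.append('Amount?')
        amount = int(next(replies))
        if amount > 200:  # illustrative limit for an extra confirmation step
            asked.append('Large withdrawal: confirm? (y/n)')
            next(replies)
    else:
        asked.append('Print receipt? (y/n)')
        next(replies)
    return asked

print(atm_wizard(['b', 'n']))          # balance path: 2 questions
print(atm_wizard(['w', '500', 'y']))   # withdrawal path: 3 questions
```

Because input at each step is constrained, the dialogue is easy for novices but, as the slide says, offers only restricted functionality.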
61. Differences between Question/Answer & Query Dialog
Question/Answer:
A simple mechanism for providing input to an
application in a specific domain
The user is led through the interaction
step by step via a series of questions
Limited in functionality and power
Appropriate for restricted domains and
for novice or casual users
Query dialog:
Constructs queries to retrieve
information from a database
Requires understanding of the
database structure and language syntax
Often does not provide direct confirmation
of what was requested, so the only
validation the user has is the
result of the search
Effective use requires some expertise
62. f. Form-Fills/Spreadsheets
Primarily for data entry but can also be useful in data retrieval
applications
hard-copy metaphor
used for data entry & specifying a data retrieval operation
good for novices as it is familiar (similar to paper form)
need to allow editing of errors
need to allow easy movement between fields
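A form-fill dialogue of the kind described above can be sketched as a set of named fields with per-field validation and free movement between fields; the field names and rules here are invented for illustration.

```python
# Form-fill sketch: named fields, per-field validation, editable in any order.
FIELDS = {
    'name':  lambda v: len(v) > 0,   # must not be empty
    'age':   lambda v: v.isdigit(),  # digits only
    'email': lambda v: '@' in v,     # crude check, illustration only
}

def validate(form):
    """Return the names of fields whose current values fail their rule."""
    return [f for f, rule in FIELDS.items() if not rule(form.get(f, ''))]

form = {'name': 'Ada', 'age': 'abc'}  # fields can be filled in any order
print(validate(form))                 # ['age', 'email'] still need editing
form['age'] = '36'
form['email'] = 'ada@example.org'
print(validate(form))                 # [] - form complete
```

Reporting which fields fail, rather than rejecting the whole form, is what "allow editing of errors" amounts to in practice.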
63. g. Point and click interfaces
Closely related to the WIMP style
More closely tied to ideas of hypertext
not tied to mouse-based interfaces – extensively used in
touchscreen information systems
Popularized by world wide web pages
Incorporate all the above types of point and click navigation:
Highlighted words,
Maps
Iconic buttons
65. h. 3D interfaces
Human-computer interaction in which the user's tasks are
performed directly in a 3D spatial context.
VR is only part of a range of 3D techniques available to
the interface designer.
Simple technique: WIMP elements (buttons, scroll bars,
etc.) are given a 3D appearance using shading
Complex technique: interfaces with a 3D workspace
Objects appear in perspective when at an angle to the viewer,
and shrink when they are further away
66. i. Web navigation
Two basic interaction styles
Link-based navigation
Sensitive to articulatory distance
Ambiguous link labels increase the gulf of evaluation
Search
Sensitive to semantic distance
Inadequate search engine algorithms increase the gulf of execution
Slight advantage in development of mental models
Readers need a sense of context of their place within an
organization of information.
67. Styles of web navigations
a. Text links– The anchor text, link label, link text, or
link title is the visible, clickable text in a hyperlink
b. Navigation bar – A navigation bar (or navigation
system) is a section of a website or online page
intended to aid visitors in travelling through the
online document
c. Sitemap – A site map (or sitemap) is a list of pages
of a web site accessible to crawlers or users. It can
be either a document in any form used as a planning
tool for Web design, or a Web page that lists the
pages on a Web site, typically organized in
hierarchical fashion
68. d. Breadcrumbs - Breadcrumbs or breadcrumb trail is a
navigation aid used in user interfaces. It allows users to keep
track of their locations within programs or documents. The
term comes from the trail of breadcrumbs left by Hansel and
Gretel in the popular fairy tale
e. Named anchor - An anchor element is called an anchor
because web designers can use it to anchor a URL to some
text on a web page. When users view the web page in a
browser, they can click the text to activate the link and visit
the page whose URL is in the link.
f. Dropdown menu - In computing with graphical user
interfaces, a dropdown menu (or drop-down menu, drop-down
list) is a GUI control ("widget"), similar to a list box,
which allows the user to choose one value from a list.
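A breadcrumb trail of the kind described above can be derived mechanically from a page's position in the site hierarchy; the page names below are invented for illustration.

```python
# Build a breadcrumb trail from a page's path in the site hierarchy.
def breadcrumbs(path):
    """Turn 'home/products/laptops' into a 'Home > Products > Laptops' trail."""
    parts = [p for p in path.split('/') if p]
    return ' > '.join(p.capitalize() for p in parts)

print(breadcrumbs('home/products/laptops'))  # Home > Products > Laptops
```

Each segment of the trail would normally be rendered as a link, letting the user jump back to any ancestor page and keep track of their location.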
69. Human factors in interface
design
Limited short-term memory
People can instantaneously remember about 7 items of
information. If you present more than this, they are more
liable to make mistakes.
People make mistakes
When people make mistakes and systems go wrong,
inappropriate alarms and messages can increase stress
and hence the likelihood of more mistakes.
People are different
People have a wide range of physical capabilities.
Designers should not just design for their own capabilities.
People have different interaction preferences
Some like pictures, some like text.
70. Ergonomics (Human Factors)
Ergonomics in interface design
• Study of the physical characteristics of interaction
• Also known as human factors – but this can also be used
to mean much of HCI!
• Ergonomics good at defining standards and guidelines
for constraining the way we design certain aspects of
systems
71. Role of Ergonomic in
Interface Design
This is a huge and established field; we will consider briefly some
aspects:
arrangement of controls
e.g. controls grouped according to function or frequency of
use, or sequentially
physical environment
e.g. seating arrangements adaptable to cope with all sizes of
user
health issues
e.g. physical position, environmental conditions
(temperature, humidity), lighting, noise
use of colour
e.g. use of red for warning, green for okay, awareness of colour-
blindness, etc
72. Ergonomics: the
arrangement of controls
Control layout is important
Safety-critical systems: poor layout can lead to disaster!
Routine applications: poor layout leads to inefficiency,
user dissatisfaction, poor mental model building, etc.
Controls can be laid out in various ways:
functional - task related controls grouped together
sequential - layout in order of use
frequency - common controls easy to access
Other factors
Controls should be easy to reach
Controls should not be so close to each other that they hamper usage
‘Dangerous' controls should be hard to reach - prevents accidents
73. Ergonomics: The physical
environment
Unsatisfactory working conditions can at best lead to stress and
dissatisfaction, and at worst harm workers' health
'Physical' here means the kinds of things physicists know and love -
heat, light, noise, dusts, chemicals, and so on.
For example, there's a thermal comfort range which suits people
best - unless they are doing hard physical work, in which case
they might prefer a cooler range. Similarly with noise; at night,
you might want things quiet so that you can sleep; in a club, you
might want it a bit louder!
About understanding the effects of these aspects of the
environment on people, and in particular, the harmful effects.
74. Ergonomics: Health issues
Poor working conditions can harm workers' health.
Some factors to consider:
physical position - should be comfortable
temperature - should not be extreme
Lighting - should be low-glare & sufficient
Noise - should not be excessive; high levels hamper
perception
Time - don't expect users to work at an interactive
system continuously for extended periods
75. Ergonomics: Colour
Colour is a powerful cue, but it is easy to misuse.
It should not be applied just because it is available.
Topics:
Colour Vision & Perception
Principles & Guidelines
76. Colour Vision
the eye consists of millions of photo receptors
sensitive to light
two types of photo receptors
1. rods
not sensitive to colour
high density at periphery
highly sensitive
low resolution
77. Colour Vision
2. cones
sensitive to colour; different cones for red,
green and blue light
high density in centre (fovea)
less sensitive - can tolerate bright light
78. How are colours generated?
Subtractive colour system
Non-luminous objects (e.g. paper) selectively
absorb and reflect different wavelengths of
light, creating the perception of colour.
Additive colour system
Luminous objects (e.g. CRT screen) generate
colour by addition of Red/Green/Blue.
79. Other Colour models
As well as the RGB system, a number of other
descriptive models are in use.
The most common other model is the HLS (or
HSB) system
HLS describes more closely the colours that we
actually can see. (Many colours that we can
see are not describable in the RGB system.)
80. Other Colour models
The HLS colour model has three (3) dimensions:
1. hue - the basic component
2. saturation - the degree to which the hue differs
from a neutral gray
3. lightness - the level of illumination
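Python's standard-library `colorsys` module implements this model directly, so the three HLS dimensions can be demonstrated with a short sketch (the variable names are ours):

```python
import colorsys

# colorsys works on floats in [0, 1]; hue is a fraction of the colour wheel.
r, g, b = 1.0, 0.0, 0.0              # pure red in RGB
h, l, s = colorsys.rgb_to_hls(r, g, b)
print(h, l, s)                       # hue 0.0, lightness 0.5, saturation 1.0

# The conversion round-trips back to the same RGB triple:
print(colorsys.hls_to_rgb(h, l, s))  # (1.0, 0.0, 0.0)
```

Note that a fully saturated hue has lightness 0.5: moving lightness toward 1.0 washes the colour out to white, and toward 0.0 darkens it to black.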
81. Colour Principles &
Guidelines
don't rely on colour alone - have some other redundant cue
some optimal combinations are known:
include a bright colour in the foreground
best background - black
worst backgrounds - brown or green
use colour sparingly; design in B&W first
use colour to group/highlight information
use colour to support search tasks
avoid using colour in non-task-related ways
82. Colour Principles &
Guidelines
allow customisation
ensure colours differ in lightness (aids colourblind
users)
limit colour to eight (8) distinct colours; four (4)
preferred
avoid saturated blues for text
choose foreground and background colour with
care
colours are hard to distinguish when objects are
small, far apart, or close together on the colour
spectrum
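The guideline "ensure colours differ in lightness" can be checked numerically. The sketch below uses the WCAG relative-luminance and contrast-ratio formulae; WCAG is not mentioned in the slides, so treat this as one possible assumption about how to quantify "differ in lightness".

```python
# Contrast check between two 8-bit RGB colours, using the WCAG 2.x
# relative-luminance formula (an assumption, not from the slides).

def luminance(rgb):
    """Relative luminance of an (R, G, B) colour with channels in 0-255."""
    def channel(c):
        c /= 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(c1, c2):
    """WCAG contrast ratio; ranges from 1 (identical) to 21 (black/white)."""
    l1, l2 = sorted((luminance(c1), luminance(c2)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

print(contrast_ratio((255, 255, 255), (0, 0, 0)))  # ~21 for black on white
```

Two colours that differ only in hue, not lightness, score a ratio near 1 and will be indistinguishable to many colourblind users.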
83. Model Of Interaction
Why develop a model for interaction?
To help us to understand an interactive
dialogue.
To identify likely difficulties.
To provide a framework to compare
different interaction styles.
84. Some concepts:
Users want to achieve goals in some domain.
Operations in the domain are tasks.
Task analysis investigates the problem in terms of
domain, goals, intentions, tasks
The system and the user have different languages
The core language describes computational
aspects of the domain
The task language describes psychological
aspects of the domain
85. Mental Model
What is a mental model?
• Norman (1988, p. 17):
... the models people have of themselves, others, the
environment, and the things with which they interact.
Mental models are used to explain observable events in
terms of unobservable structures and events
86. Formation of Mental Model
Cognitive model: theory of action. Norman's (1986)
seven-stage model of activity for users at an interface:
a) establishing the goal - task language; imprecise
b) forming the intention - specific
c) specifying the action sequence
d) executing the action
e) perceiving the system state
f) interpreting the system state
g) evaluating the system state with respect to the
goals and intentions
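The seven stages above can be sketched as one pass through a loop over a toy system. Everything here (the `Light` class, the `interact` function) is illustrative, not any real API; it simply walks the stages in order for a light switch.

```python
# A minimal sketch of Norman's seven stages for a toy light switch.
# All class and function names are illustrative.

class Light:
    def __init__(self):
        self.on = False
    def flip(self):
        self.on = not self.on
    def state(self):
        return "on" if self.on else "off"

def interact(goal_state, light):
    # 1. goal is goal_state ("on"/"off"); 2. form the intention
    intention = f"make light {goal_state}"
    # 3. specify the action sequence
    actions = ["flip"] if light.state() != goal_state else []
    # 4. execute the actions
    for a in actions:
        getattr(light, a)()
    perceived = light.state()               # 5. perceive system state
    interpreted = f"light is {perceived}"   # 6. interpret it
    achieved = perceived == goal_state      # 7. evaluate against the goal
    return interpreted, achieved

light = Light()
print(interact("on", light))  # ('light is on', True)
```

The execution side is stages 1-4 (goal down to action); the evaluation side is stages 5-7 (perception back up to the goal).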
87. An example: reading breaking news
on the web
1. Set goal: “find out about breaking
news”
• decide on news website
2. Form an intention
• check out a news website
3. Specify what to do
• search with Google for Fairfax
News website
4. Execute action sequence
• find and select suitable link
5. Check what happens at the interface
• see a new page pop up on the
screen
6. Interpret it
• confirm that the web page is
correct one
7. Evaluate it with respect to the goal
• read breaking news
88. Interface Problems:
Since the human and the computer do not share the same
concepts (do not 'speak the same language'), interfaces
cause problems. These problems can be described in terms of:
gulf of execution - difference between the user's formulation
of the actions and the actions allowed by the system
gulf of evaluation - difference between the physical
presentation of the system state and the user's expectation
89. These gulfs can be 'bridged':
users can change to suit the interface
designers can design “knowing the
user“
users can change their interpretation
of system responses
designers can change output
characteristics
90. The Interaction Framework
components
system, user, input, output
input and output together form the interface
each of the components may have its own language
to describe the objects and actions it is concerned with
92. Abowd and Beale considered four (4) components,
each with its own language.
This models the overall interaction more accurately
and leads to analyses that locate the
sources of interface problems
93. Interaction problems are explained as language
translation difficulties
User - Input (articulating a goal)
How easy is it to translate a goal requirement into
the input language? e.g. Difficult: bank of light
switches, stovetop element controls
Easy: virtual reality system
Input - System (performance)
Can all the necessary system stimuli be expressed in
the input language?
Consider a remote control (or front panel) with limited
functions.
94. System - Output (execution & evaluation)
Can system output device provide a complete view
of system state? e.g.
Consider document editing with limited view of data
Output - User (interpretation by user)
Is information presented to the user in a way that is easy
to interpret? e.g.
Difficult to read an unmarked analogue clock.
Difficult to observe the result of a hierarchical file
copy using a command line interface.
95. Translations Between Components
The interactive cycle: articulation, performance,
presentation, observation.
articulation
user translates task intentions into the input language
performance
input language is translated into stimuli for the system
presentation
system activities are translated
into output language
observation
output language is translated into
the user’s task model
96. Example: Light in a Room
controlling the lighting in a room
articulation: “I’m going to bed now, so I better turn off the light in the living
room. To do this, I need to flip the switch.”
task language: turn lights on/off
input language: flip switch
system language: close/open circuit for light bulbs
output language: lights on/off
translations
articulation
user decides to turn off the light, and flips the switch
performance
flipped switch opens the circuit
presentation
light bulb stops emitting light
observation
user notices that the light is off
frequent problem
multiple switches in large rooms
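The four translations in the light-switch example can be sketched in code. This is a toy model under our own naming (the `Circuit` class and the four functions are illustrative, not a real API): each function carries one translation of the framework.

```python
# Abowd and Beale's four translations, sketched for the light-switch
# example. All names are illustrative.

class Circuit:                  # system: core language = circuit state
    def __init__(self):
        self.closed = False

def articulation(task):         # task language -> input language
    if task in ("turn light on", "turn light off"):
        return "flip switch"
    return None                 # goal not expressible at this interface

def performance(action, circuit):   # input language -> system stimuli
    if action == "flip switch":
        circuit.closed = not circuit.closed

def presentation(circuit):      # system state -> output language
    return "light on" if circuit.closed else "light off"

def observation(output):        # output language -> user's task model
    return f"user sees: {output}"

circuit = Circuit()
performance(articulation("turn light on"), circuit)
print(observation(presentation(circuit)))  # user sees: light on
```

A gulf of execution shows up when `articulation` returns `None` (the goal cannot be phrased in the input language); a gulf of evaluation shows up when `presentation` reveals too little of the system state to judge success.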
97. Interactivity is the defining feature of an
interactive system, e.g. the interface
semantics and closeness to real-time
interaction (speech recognition,
visualisation, menu dynamics).
In older systems, the system pre-empted the order of
interaction. Newer systems still retain some of these
features, e.g. modal forms.
Of course all interaction occurs in some
wider social and organizational context.
98. People are usually involved and there are
issues of desire to impress, competition and
fear of failure.
Motivation will reduce if systems do not
match requirements but new technology
may increase motivation if systems are well
designed and integrated with the user's
work.
99. Comments:
These two frameworks are quite general.
Frameworks help us to understand the interaction
process.
Frameworks help us to judge overall usability of an
entire interactive system.
Evaluation of a system can only be with respect to
a set of goals and tasks which the system is being
used to achieve (e.g. Word vs LaTeX).