SCADA & HMI



This presentation describes how SCADA systems work and how they are laid out.



  1. 1. Submitted To: Ramandeep Singh. Submitted By: Gurvinder Singh, Ashish Kapoor, Amritpal Singh, Jugvinder Singh Sidhu.
  2. 2. TABLE OF CONTENTS: 1. What is SCADA? 2. What is Telemetry? 3. What is Data Acquisition? 4. Differences between SCADA and DCS 5. Components of SCADA (i. Field Instrumentation, ii. Remote Station, iii. Communication Network, iv. Central Monitoring System (CMS)) 6. Typical System Configuration 7. Modes of Communication 8. SCADA Example Application 9. SCADA System Benefits 10. Futuristic Technology for SCADA: BAN 11. Limitations (continued...)
  3. 3. 12. Human Machine Interface (HMI) (i. Introduction, ii. Terminology, iii. Definition, iv. Goals) 13. Human–Machine Interaction 14. Human Machine Interface 15. Design Methodologies 16. Thirteen Principles of Display Design 17. Modalities and Modes 18. Interaction Technique 19. Human Interface Device 20. Bibliography
  4. 4. Supervisory Control And Data Acquisition
  5. 5. What is SCADA? SCADA (Supervisory Control And Data Acquisition) refers to the combination of telemetry and data acquisition. It consists of collecting information, transferring it back to a central site, carrying out any necessary analysis and control, and then displaying this data on a number of operator screens. The SCADA system is used to monitor and control a plant or equipment. Control may be automatic or can be initiated by operator commands. SCADA systems were first used in the 1960s. SCADA generally refers to an industrial control system: a computer system monitoring and controlling a process. The process can be industrial, infrastructure or facility-based, as described below. Industrial processes include manufacturing, production, power generation, fabrication, and refining, and may run in continuous, batch, repetitive, or discrete modes. Infrastructure processes may be public or private, and include water treatment and distribution, wastewater collection and treatment, oil and gas pipelines, electrical power transmission and distribution, wind farms, civil defense siren systems, and large communication systems. Facility processes occur in both public and private facilities, including buildings, airports, ships, and space stations; they monitor and control HVAC, access, and energy consumption.
  6. 6. SCADA Software: The supervisory computer consists of a PC running either Campbell Scientific's HMI software or another vendor's software. InTouch, Intellution, Lookout, and other software packages can be used in conjunction with our OPC client/server software application. Like other HMI software packages, our software provides a graphical interface that the operator uses to view the status of remote sites, acknowledge alarms, and control the units. What is Telemetry? Telemetry is usually associated with SCADA systems. It is a technique used in transmitting and receiving information or data over a medium. The information can be measurements, such as voltage, speed or flow. These data are transmitted to another location through a medium such as cable, telephone or radio. Information may come from multiple locations, so a way of addressing these different sites is incorporated in the system. What is Data Acquisition? Data acquisition refers to the method used to access and control information or data from the equipment being controlled and monitored. The data accessed are then forwarded to a telemetry system ready for transfer to the different sites. They can be analog and digital information gathered by sensors, such as flowmeters, ammeters, etc. It can also be data used to control equipment such as actuators, relays, valves, motors, etc.
  7. 7. What are the differences between SCADA and DCS? Similar to SCADA systems are Distributed Control Systems (DCS). A DCS is usually used in factories and located within a more confined area. It uses a high-speed communications medium, such as a local area network (LAN), and a significant amount of closed-loop control is present on the system. The SCADA system covers larger geographical areas. It may rely on a variety of communication links such as radio and telephone, and closed-loop control is not a high priority in this system. Supervision vs. control: There is, in several industries, considerable confusion over the differences between SCADA systems and distributed control systems (DCS). Generally speaking, a SCADA system usually refers to a system that coordinates, but does not control, processes in real time. The discussion on real-time control is muddied somewhat by newer telecommunications technology, enabling reliable, low-latency, high-speed communications over wide areas. Most differences between SCADA and DCS are culturally determined and can usually be ignored. As communication infrastructures with higher capacity become available, the difference between SCADA and DCS will fade.
  8. 8. Common system components: A SCADA system usually consists of the following subsystems. A Human-Machine Interface or HMI is the apparatus which presents process data to a human operator, and through this, the human operator monitors and controls the process. A supervisory (computer) system gathers (acquires) data on the process and sends commands (control) to the process. Remote Terminal Units (RTUs) connect to sensors in the process, converting sensor signals to digital data and sending digital data to the supervisory system. Programmable Logic Controllers (PLCs) are used as field devices because they are more economical, versatile, flexible, and configurable than special-purpose RTUs. Communication infrastructure connects the supervisory system to the Remote Terminal Units. Systems concepts: The term SCADA usually refers to centralized systems which monitor and control entire sites, or complexes of systems spread out over large areas (anything between an industrial plant and a country). Most control actions are performed automatically by Remote Terminal Units ("RTUs") or by programmable logic controllers ("PLCs"). Host control functions are usually restricted to basic overriding or supervisory-level intervention. For example, a PLC may control the flow of cooling water through part of an industrial process, but the SCADA system may allow operators to change the set points for the flow, and enable alarm conditions, such as loss of flow and high temperature, to be displayed and recorded. The feedback control loop passes through the RTU or PLC, while the SCADA system monitors the overall performance of the loop.
  9. 9. Data acquisition begins at the RTU or PLC level and includes meter readings and equipment status reports that are communicated to SCADA as required. Data is then compiled and formatted in such a way that a control room operator using the HMI can make supervisory decisions to adjust or override normal RTU (PLC) controls. Data may also be fed to a Historian, often built on a commodity Database Management System, to allow trending and other analytical auditing. SCADA systems typically implement a distributed database, commonly referred to as a tag database, which contains data elements called tags or points. A point represents a single input or output value monitored or controlled by the system. Points can be either "hard" or "soft". A hard point represents an actual input or output within the system, while a soft point results from logic and math operations applied to other points. (Most implementations conceptually remove the distinction by making every property a "soft" point expression, which may, in the simplest case, equal a single hard point.) Points are normally stored as value-timestamp pairs: a value, and the timestamp when it was recorded or calculated. A series of value-timestamp pairs gives the history of that point. It is also common to store additional metadata with tags, such as the path to a field device or PLC register, design-time comments, and alarm information.
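The tag-database idea above (hard points, soft points, value-timestamp histories) can be sketched in a few lines of Python. This is an illustrative toy, not part of the original presentation or of any real SCADA product; the class and tag names are invented for the example.

```python
import time

class Tag:
    """A single SCADA point: stores value-timestamp pairs as its history."""
    def __init__(self, name):
        self.name = name
        self.history = []  # list of (value, timestamp) pairs

    def update(self, value, timestamp=None):
        self.history.append((value, timestamp if timestamp is not None else time.time()))

    @property
    def value(self):
        return self.history[-1][0] if self.history else None

class SoftTag(Tag):
    """A soft point: its value results from a formula over other points."""
    def __init__(self, name, formula, sources):
        super().__init__(name)
        self.formula = formula
        self.sources = sources

    def recompute(self, timestamp=None):
        self.update(self.formula(*(t.value for t in self.sources)), timestamp)

# Two hard points from the field, plus a soft point derived from them
flow_a = Tag("PUMP1.FLOW")
flow_b = Tag("PUMP2.FLOW")
total = SoftTag("STATION.TOTAL_FLOW", lambda a, b: a + b, [flow_a, flow_b])

flow_a.update(12.5)
flow_b.update(7.5)
total.recompute()
print(total.value)  # 20.0
```

Each `update` appends to the history rather than overwriting, which is what lets a Historian trend a point over time.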
  10. 10. Human Machine Interface (Typical Basic SCADA Animations): A Human-Machine Interface or HMI is the apparatus which presents process data to a human operator, and through which the human operator controls the process. An HMI is usually linked to the SCADA system's databases and software programs, to provide trending, diagnostic data, and management information such as scheduled maintenance procedures, logistic information, detailed schematics for a particular sensor or machine, and expert-system troubleshooting guides. The HMI system usually presents the information to the operating personnel graphically, in the form of a mimic diagram. This means that the operator can see a schematic representation of the plant being controlled. For example, a picture of a pump connected to a pipe can show the operator that the pump is running and how much fluid it is pumping through the pipe at the moment. The operator can then switch the pump off, and the HMI software will show the flow rate of the fluid in the pipe decrease in real time. Mimic diagrams may consist of line graphics and schematic symbols to represent process elements, or may consist of digital photographs of the process equipment overlain with animated symbols. The HMI package for the SCADA system typically includes a drawing program that the operators or system maintenance personnel use to change the way these points are represented in the interface. These representations can be as simple as an on-screen traffic light, which represents the state of an actual traffic light in
  11. 11. the field, or as complex as a multi-projector display representing the position of all of the elevators in a skyscraper or all of the trains on a railway. An important part of most SCADA implementations is alarm handling. The system monitors whether certain alarm conditions are satisfied, to determine when an alarm event has occurred. Once an alarm event has been detected, one or more actions are taken (such as the activation of one or more alarm indicators, and perhaps the generation of email or text messages so that management or remote SCADA operators are informed). In many cases, a SCADA operator may have to acknowledge the alarm event; this may deactivate some alarm indicators, whereas other indicators remain active until the alarm conditions are cleared. Alarm conditions can be explicit (for example, an alarm point is a digital status point that has either the value NORMAL or ALARM, calculated by a formula based on the values in other analogue and digital points) or implicit (the SCADA system might automatically monitor whether the value in an analogue point lies outside high and low limit values associated with that point). Examples of alarm indicators include a siren, a pop-up box on a screen, or a coloured or flashing area on a screen (that might act in a similar way to the "fuel tank empty" light in a car); in each case, the role of the alarm indicator is to draw the operator's attention to the part of the system in alarm so that appropriate action can be taken. In designing SCADA systems, care is needed in coping with a cascade of alarm events occurring in a short time, otherwise the underlying cause (which might not be the earliest event detected) may get lost in the noise. Unfortunately, when used as a noun, the word alarm is used rather loosely in the industry; thus, depending on context it might mean an alarm point, an alarm indicator, or an alarm event.
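The explicit/implicit alarm distinction and the acknowledge cycle described above can be sketched as follows. This is a simplified illustration (names and limit values are invented), not a real SCADA alarm engine.

```python
NORMAL, ALARM = "NORMAL", "ALARM"

def implicit_alarm(value, low, high):
    """Implicit alarm: the system checks an analogue point against limits."""
    return ALARM if value < low or value > high else NORMAL

class AlarmPoint:
    """Explicit alarm point: a digital status point, NORMAL or ALARM,
    whose indicator stays active until the operator acknowledges it."""
    def __init__(self):
        self.state = NORMAL
        self.acknowledged = True

    def evaluate(self, in_alarm):
        if in_alarm and self.state == NORMAL:
            self.state = ALARM
            self.acknowledged = False  # new alarm event: indicator active
        elif not in_alarm:
            self.state = NORMAL        # alarm condition cleared

    def acknowledge(self):
        self.acknowledged = True

# A temperature of 95 lies outside the 10-80 band, raising the alarm:
temp_alarm = AlarmPoint()
temp_alarm.evaluate(implicit_alarm(95.0, low=10.0, high=80.0) == ALARM)
print(temp_alarm.state)  # ALARM
temp_alarm.acknowledge()
```

Note that acknowledging does not clear the alarm state itself; only a value back inside the limits does, matching the behaviour described in the slide.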
  12. 12. Components of a SCADA System: A SCADA system is composed of the following: 1. Field Instrumentation 2. Remote Stations 3. Communications Network 4. Central Monitoring Station. Field Instrumentation refers to the sensors and actuators that are directly interfaced to the plant or equipment. They generate the analog and digital signals that will be monitored by the Remote Station. Signals are also conditioned to make sure they are compatible with the inputs/outputs of the RTU or PLC at the Remote Station. The Remote Station is installed at the remote plant or equipment being monitored and controlled by the central host computer. This can be a Remote Terminal Unit (RTU) or a Programmable Logic Controller (PLC). The Communications Network is the medium for transferring information from one location to another. This can be via telephone line, radio or cable. The Central Monitoring Station (CMS) refers to the location of the master or host computer. Several workstations may be configured on the CMS, if necessary. It uses a Man-Machine Interface (MMI) program to monitor the various types of data needed for the operation. The following is a sample configuration of a SCADA system for water distribution.
  13. 13. SCADA Component: Field Instrumentation. Field Instrumentation refers to the devices that are connected to the equipment or machines being controlled and monitored by the SCADA system. These are sensors for monitoring certain parameters and actuators for controlling certain modules of the system. These instruments convert physical parameters (e.g., fluid flow, velocity, fluid level) to electrical signals (e.g., voltage or current) readable by the Remote Station equipment. Outputs can be either analog (continuous range) or digital (discrete values). Some of the industry-standard analog outputs of these sensors are 0 to 5 volts, 0 to 10 volts, 4 to 20 mA and 0 to 20 mA. The voltage outputs are used when the sensors are installed near the controllers (RTU or PLC); the current outputs are used when the sensors are located far from the controllers. Digital outputs are used to differentiate the discrete status of the equipment. Usually, <1> is used to mean EQUIPMENT ON and <0> for EQUIPMENT OFF status. This may also mean <1> for FULL or <0> for EMPTY. Actuators are used to turn certain equipment on or off. Likewise, digital and analog inputs are used for control. For example, digital inputs can be used to turn modules on equipment on and off, while analog inputs are used to control the speed of a motor or the position of a motorized valve.
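The 4 to 20 mA current-loop signals mentioned above are linearly scaled to engineering units, with 4 mA at the bottom of the range and 20 mA at the top. A minimal conversion sketch (the sensor range in the example is invented for illustration):

```python
def current_to_engineering(ma, lo_eng, hi_eng):
    """Convert a 4-20 mA loop current to an engineering value.
    4 mA maps to lo_eng, 20 mA maps to hi_eng; anything outside
    that band usually indicates a wiring or sensor fault."""
    if not 4.0 <= ma <= 20.0:
        raise ValueError("signal out of 4-20 mA range - possible fault")
    return lo_eng + (ma - 4.0) * (hi_eng - lo_eng) / 16.0

# A level sensor spanning 0-5 m reporting 12 mA reads exactly mid-scale:
print(current_to_engineering(12.0, 0.0, 5.0))  # 2.5
```

The live-zero at 4 mA is one reason current loops suit long cable runs: a reading of 0 mA is distinguishable from a legitimate minimum value.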
  14. 14. Remote Station: Field instrumentation connected to the plant or equipment being monitored and controlled is interfaced to the Remote Station to allow process manipulation at a remote site. It is also used to gather data from the equipment and transfer it to the central SCADA system. The Remote Station may be either an RTU (Remote Terminal Unit) or a PLC (Programmable Logic Controller). It may also be a single-board or modular unit. RTU versus PLC: The RTU (Remote Terminal Unit) is a ruggedized computer with very good radio interfacing. It is used in situations where communications are more difficult. One disadvantage of the RTU is its poor programmability; however, modern RTUs now offer programmability comparable to PLCs. The PLC (Programmable Logic Controller) is a small industrial computer usually found in factories. Its main use is to replace the relay logic of a plant or process. Today, the PLC is being used in SCADA systems due to its very good programmability. Earlier PLCs had no serial communication ports for interfacing to radio for transferring data; nowadays, PLCs have extensive communication features and wide support for the popular radio units used in SCADA systems. In the near future we will see the merging of RTUs and PLCs. Micrologic offers an inexpensive RTU for SCADA systems where a PLC may be an overkill solution. It is a microcontroller-based RTU and can be interfaced to radio modems for transmitting data to the CMS.
  15. 15. Single Board versus Modular Unit: The Remote Station is usually available in two types, namely the single board and the modular unit. The single board provides a fixed number of input/output (I/O) interfaces. It is cheaper, but does not offer easy expandability to a more sophisticated system. The modular type is an expandable remote station and is more expensive than the single-board unit. Usually a backplane is used to connect the modules; any I/O or communication modules needed for future expansion may be easily plugged into the backplane. Communication Network: The Communication Network refers to the communication equipment needed to transfer data to and from different sites. The medium used can be cable, telephone or radio. The use of cable is usually implemented in a factory. It is not practical for systems covering large geographical areas because of the high cost of the cables, conduits and the extensive labor of installing them. The use of telephone lines (i.e., leased or dial-up) is a cheaper solution for systems with large coverage. A leased line is used for systems requiring an on-line connection with the remote stations. This is expensive, since one telephone line is needed per site, and leased lines cost more than ordinary telephone lines. Dial-up lines can be used on systems requiring updates at regular intervals (e.g., hourly updates); here ordinary telephone lines can be used, and the host dials a particular remote site's number to get the readings and send commands. Remote sites are often not accessible by telephone lines. The use of radio offers an economical solution: radio modems connect the remote sites to the host, and on-line operation can also be implemented on a radio system. For locations where a direct radio link cannot be established, a radio repeater is used to link these sites.
  16. 16. Central Monitoring Station (CMS): The Central Monitoring Station (CMS) is the master unit of the SCADA system. It is in charge of collecting the information gathered by the remote stations and of generating the necessary action for any event detected. The CMS can have a single-computer configuration or it can be networked to workstations to allow sharing of information from the SCADA system. A Man-Machine Interface (MMI) program runs on the CMS computer. A mimic diagram of the whole plant or process can be displayed on screen for easier identification with the real system. Each I/O point of the remote units can be displayed with a corresponding graphical representation and the present I/O reading. A flow reading can be displayed on a graphical representation of a flowmeter; a reservoir can be displayed with fluid contents corresponding to the actual tank level. Set-up parameters such as trip values, limits, etc. are entered in this program and downloaded to the corresponding remote units to update their operating parameters. The MMI program can also create a separate window for alarms. The alarm window can display the alarm tag name, description, value, trip-point value, time, date and other pertinent information. All alarms are saved to a separate file for later review. Trending of required points can be programmed on the system, and trending graphs can be viewed or printed at a later time. Generation of management reports can also be scheduled for a specific time of day, on a periodic basis, upon operator request, or initiated by alarm events. Access to the program is permitted only to qualified operators. Each user is given a password and a privilege level to access only particular areas of the program, and all actions taken by the users are logged to a file for later review.
  17. 17. MMI Screen Showing Pipe System Diagram and Repair Areas. Typical System Configurations: There are two typical network configurations for wireless telemetry radio-based SCADA systems: 1. Point-to-Point Configuration 2. Point-to-Multipoint Configuration
  18. 18. 1. Point-to-Point Configuration: The Point-to-Point configuration is the simplest set-up for a telemetry system. Here data is exchanged between two stations; one station can be set up as the master and the other as the slave. An example is a set-up of two RTUs: one for a reservoir or tank and the other for a water pump at a different location. Whenever the tank is nearly empty, the RTU at the tank sends an EMPTY command to the other RTU. Upon receiving this command, the RTU at the water pump starts pumping water to the tank. When the tank is full, the tank's RTU sends a FULL command to the pump's RTU to stop the motor. (Point-to-Point Configuration)
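The tank/pump EMPTY and FULL exchange described above can be modelled in a few lines. The class names and threshold levels here are illustrative assumptions, not part of the original example.

```python
class PumpRTU:
    """RTU at the water pump: reacts to commands from the tank's RTU."""
    def __init__(self):
        self.running = False

    def receive(self, command):
        if command == "EMPTY":
            self.running = True    # start pumping water to the tank
        elif command == "FULL":
            self.running = False   # stop the motor

class TankRTU:
    """RTU at the tank: sends EMPTY/FULL commands based on level."""
    def __init__(self, peer, low=0.1, high=0.9):
        self.peer, self.low, self.high = peer, low, high

    def on_level(self, level):     # level as a fraction of tank capacity
        if level <= self.low:
            self.peer.receive("EMPTY")
        elif level >= self.high:
            self.peer.receive("FULL")

pump = PumpRTU()
tank = TankRTU(pump)
tank.on_level(0.05)   # nearly empty -> pump starts
print(pump.running)   # True
tank.on_level(0.95)   # full -> pump stops
print(pump.running)   # False
```

In a real deployment the `receive` call would travel over a radio link rather than a direct method call; the control logic is the same.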
  19. 19. 2. Point-to-Multipoint Configuration: The Point-to-Multipoint configuration is one where a single device is designated as the master unit to several slave units. The master is usually the main host and is located at the control room, while the slaves are the remote units at the remote sites. Each slave is assigned a unique address or identification number. (Point-to-Multipoint Configuration)
  20. 20. Modes of Communication: There are two modes of communication available, namely the polled system and the interrupt system. 1. Polled System: In the Polled or Master/Slave system, the master is in total control of communications. The master regularly polls data (i.e., sends and receives data) to each slave in sequence; the slave unit responds to the master only when it receives a request. This is called the half-duplex method. Each slave unit has its own unique address to allow correct identification. If a slave does not respond for a predetermined period of time, the master retries polling it a number of times before continuing to poll the next slave unit. Advantages: • The process of data gathering is fairly simple • No collisions can occur on the network • Link failures can easily be detected. Disadvantages: • Interrupt-type requests from a slave requiring immediate action cannot be handled immediately • Waiting time increases with the number of slaves • All communication between slaves has to pass through the master, with added complexity
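The polling cycle with retries and link-failure detection described above can be sketched as follows. The function name and retry count are illustrative assumptions.

```python
def poll_cycle(slaves, read_slave, retries=3):
    """One polling cycle: the master queries each slave in sequence.
    read_slave(address) returns the slave's data, or None on no response.
    A slave that stays silent for all retries is marked as a link failure."""
    readings, failed = {}, []
    for address in slaves:
        for _ in range(retries):
            data = read_slave(address)
            if data is not None:
                readings[address] = data
                break
        else:
            failed.append(address)  # link failure detected for this slave
    return readings, failed

# Slave 3 never answers; the master moves on after its retries are exhausted.
responses = {1: "tank=80%", 2: "pump=ON"}
readings, failed = poll_cycle([1, 2, 3], responses.get)
print(readings)  # {1: 'tank=80%', 2: 'pump=ON'}
print(failed)    # [3]
```

This makes the listed trade-offs concrete: data gathering and failure detection are simple, but every slave waits its turn, so the cycle time grows with the number of slaves.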
  21. 21. 2. Interrupt System: The interrupt system is also referred to as a Report by Exception (RBE) configured system. Here the slave monitors its inputs; when it detects a significant change or when a value exceeds a limit, the slave initiates communication to the master and transfers data. The system is designed with an error detection and recovery process to cope with collisions. Before any unit transmits, it must first check whether any other unit is transmitting, which can be done by first detecting the carrier of the transmission medium. If another unit is transmitting, some form of random delay is required before it tries again. Excessive collisions result in erratic system operation and possible system failure. To cope with this, if after several attempts the slave still fails to transmit a message to the master, it waits until polled by the master. Advantages: • The system reduces unnecessary transfer of data compared with polled systems • Quick detection of urgent status information • Allows slave-to-slave communication. Disadvantages: • The master may only detect a link failure after a period of time, that is, when the system is polled • Operator action is needed to obtain the latest values • Collision of data may occur and may cause delays in communication
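The slave-side report-by-exception logic, including the listen-before-transmit check and random backoff, can be sketched like this. The deadband, delay and attempt limit are illustrative assumptions, not values from the original.

```python
import random
import time

def report_by_exception(value, last_reported, deadband, channel_busy,
                        max_attempts=5):
    """Slave-side RBE logic: transmit only on a significant change, backing
    off randomly while the shared channel is busy. Returns True if the
    update was sent; False means 'stay quiet or wait to be polled'."""
    if abs(value - last_reported) < deadband:
        return False  # change not significant: no transmission needed
    for _ in range(max_attempts):
        if not channel_busy():              # carrier sense: channel free?
            return True                     # transmit the update
        time.sleep(random.uniform(0, 0.01)) # random delay, then retry
    return False  # too many collisions: fall back to waiting for a poll

# With a free channel, a 2-unit change past a deadband of 1 is reported:
print(report_by_exception(12.0, 10.0, 1.0, channel_busy=lambda: False))  # True
```

The deadband filter is what "reduces unnecessary transfer of data", and the final fallback to being polled mirrors the recovery rule in the slide.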
  22. 22. SCADA Example Application: Sedimentation Tank: monitor coliform, TSS, and on/off status of pumps; control on/off status of pumps. Clarifier: monitor torque; control on/off status and torque alarms. Generator: monitor and control temperatures and flow rates within the exhaust heat recovery unit and heat exchanger. Trickling Filter: monitor on/off status of pumps and blowers, dissolved oxygen, flow rate, and wet-well level; control on/off status of pumps and blowers. Chlorine Contact Tank: monitor ORP; control Cl2 and SO2 injection. Digester: monitor and control temperature.
  23. 23. SCADA System Benefits: 1. Control units function as PLCs, RTUs, or DCUs. 2. Control units perform advanced measurement and control independent of the central computer. 3. PID control continues even if communications to the main computer are lost. 4. Control units have many channel types to measure most available sensors. 5. Systems are compatible with our own or other vendors' HMI software packages. 6. Control units have their own UPS; during AC power loss, they continue to measure and store time-stamped data. 7. Control units provide on-board statistical and mathematical processing. 8. Systems are easily expandable: add new sites or add sensors to existing sites. 9. Control units have wide operating temperature ranges and operate in rugged environments.
  24. 24. Futuristic Technology for SCADA: BAN. What is BAN? BAN stands for Body Area Network, still under research at Chiba University, Japan, under the supervision of Prof. Hideyuki Nebiya. This technology works on the basic principle that the human body possesses its own electric field and is a good conductor of electricity. Prof. Nebiya and his team found that whenever the human body comes in contact with some electrical equipment, say if someone touches a TV screen, the intensity of the body's electric field increases considerably. From this research, they concluded that the human body can be used for transmission using different frequency signals.
  25. 25. What is human body communication? It is a means of communication between devices via the human body. It will contribute to a reduction in information leakage, because the communication is carried out via the human body. In addition, the transmission loss is believed to be smaller compared with that of wireless spatial transmission, realizing wireless communication with low power consumption. In this technology an electronic card is used which acts as a transmitter. Signals are transmitted through the human body; a receiver on the other side receives the signal, acknowledges that the signal has been received, and acts accordingly. (Electronic Receiver / Electronic Transmitter)
  26. 26. APPLICATIONS: It can be used for locking and unlocking doors as well as for medical and healthcare purposes. It can also be adopted by the entertainment field for transmission of music and image information, and in automotive keyless entry systems and wearable computing systems. 1. MEDICAL FIELD: In the medical field, a wristwatch-like device is given to the patient to wear, which monitors the patient's state and records every change that takes place in the patient's body. Nurses also wear a similar device; when they touch the patient, all the data is transferred to the nurse's device, which then transmits it to the concerned doctor. In this way many patients can be monitored at the same time from a central control room using SCADA.
  27. 27. 2. CORPORATE SECTOR: In this field the technology plays a vital role: company officials need not carry bulky files or even a pen to sign a contract. All data is stored in their Electronic Data Cards, and it is transferred to another person on permission. To sign a contract, two people just need to shake hands, and their signed contracts are exchanged through their Electronic Data Cards.
  28. 28. 3. ENTERTAINMENT INDUSTRY: In this field the technology is set to flourish at the maximum rate. A simple example gives an idea of how useful a development it could be in this sector. Nowadays iPods have wires for the headphones, but with this technology there is no need for wired headphones: just switch on the music player and put on your headphones, and the human body acts as the carrier. Moreover, imagine your friend also wants to listen to the same music; all he has to do is wear a headset and hold your hand. In a similar way, up to 15 people can connect and listen to music at the same time. Further research is needed to develop this to higher levels.
  29. 29. How about the progress in the development of its usage and market? A number of companies have been working on the employment of human body communication technologies and have actually developed prototypes. A variety of companies, including electronics manufacturers, mobile phone companies, office equipment manufacturers, automobile manufacturers and house builders, have shown demonstrations. In the demonstrations, they used products incorporating human body communication technologies, such as keys for locking and unlocking, cash registers for retail shops, and transmission and shared use of video, music and textual information in the entertainment field. Some demonstrations could be seen at Security Show 2009 and IC Card World 2009, both of which took place in March 2009. There have recently been many inquiries about applications to sensor networks, which are drawing attention not only from the industrial sector but also from the medical and healthcare sectors. People in medical organizations and healthcare companies seem to be placing greater expectations on human body communication technologies, probably because the human body information gained by sensing can be transmitted to devices via the human body. "THIS TECHNOLOGY RECONNECTS FAR-APART HUMAN BEINGS WITH A TOUCH"
  30. 30. LIMITATIONS: Like every other technology, this technology also has some limitations, such as: 1. Loss of information and the threat of important information being stolen. 2. Physical effects on the human body. Researchers are still working on these factors before the technology can be used commercially.
  31. 31. HUMAN MACHINE INTERFACE (HMI). Introduction: To work with a system, users have to be able to control and assess the state of the system. For example, when driving an automobile, the driver uses the steering wheel to control the direction of the vehicle, and the accelerator pedal, brake pedal and gearstick to control the speed of the vehicle. The driver perceives the position of the vehicle by looking through the windshield, and the exact speed of the vehicle by reading the speedometer. The user interface of the automobile is on the whole composed of the instruments the driver can use to accomplish the tasks of driving and maintaining the automobile. Terminology: There is a distinct difference between a user interface and an operator interface or Human Machine Interface (HMI). The term user interface is often used in the context of (personal) computer systems and electronic devices, where a network of equipment or computers is interlinked through an MES (Manufacturing Execution System) or host. An HMI is typically local to one machine or piece of equipment, and is the interface method between the human and the equipment/machine. An operator interface is the interface method by which multiple pieces of equipment, linked by a host control system, are accessed or controlled. A system may expose several user interfaces to serve different kinds of users. For example, a computerized library database might provide two user interfaces: one for library patrons (a limited set of functions, optimized for ease of use) and the other for library personnel (a wide set of functions, optimized for efficiency). The user interface of a mechanical system, a vehicle or an industrial installation is sometimes referred to as the human-
  32. 32. machine interface (HMI). HMI is a modification of the original term MMI (man-machine interface). In practice, the abbreviation MMI is still frequently used, although some may claim that MMI now stands for something different. Another abbreviation is HCI, but this is more commonly used for human-computer interaction. Other terms used are operator interface console (OIC) and operator interface terminal (OIT). However it is abbreviated, the terms refer to the layer that separates a human operating a machine from the machine itself. In science fiction, HMI is sometimes used to refer to what is better described as a direct neural interface. However, this latter usage is seeing increasing application in the real-life use of (medical) prostheses, the artificial extensions that replace a missing body part (e.g., cochlear implants). In some circumstances computers might observe the user and react according to their actions without specific commands. A means of tracking parts of the body is required, and sensors noting the position of the head, direction of gaze and so on have been used experimentally. This is particularly relevant to immersive interfaces. Definition: The Human-Machine Interface is quite literally where the human and the machine meet. It is the area of the human and the area of the machine that interact during a given task. Interaction can include touch, sight, sound, heat transference or any other physical or cognitive function. Also known as: Man-Machine Interface. Examples: A typical computer workstation has four human-machine interfaces: the keyboard (hand), the mouse (hand), the monitor (eyes) and the speakers (ears).
  33. Goals

A basic goal of HMI is to improve the interactions between users and machines (computers) by making computers more usable and receptive to users' needs. Specifically, HMI is concerned with:
- methodologies and processes for designing interfaces (i.e., given a task and a class of users, design the best possible interface within given constraints, optimizing for a desired property such as learnability or efficiency of use)
- methods for implementing interfaces (e.g. software toolkits and libraries; efficient algorithms)
- techniques for evaluating and comparing interfaces
- developing new interfaces and interaction techniques
- developing descriptive and predictive models and theories of interaction

A long-term goal of HMI is to design systems that minimize the barrier between the human's cognitive model of what they want to accomplish and the computer's understanding of the user's task.

Professional practitioners in HMI are usually designers concerned with the practical application of design methodologies to real-world problems. Their work often revolves around designing graphical user interfaces and web interfaces. Researchers in HMI are interested in developing new design methodologies, experimenting with new hardware devices, prototyping new software systems, exploring new paradigms for interaction, and developing models and theories of interaction.
  34. HUMAN–MACHINE INTERACTION

Human–machine interaction (HMI) is the study of interaction between people (users) and machines (computers). Interaction between users and machines occurs at the user interface (or simply interface), which includes both software and hardware: for example, characters or objects displayed by software on a personal computer's monitor, input received from users via hardware peripherals such as keyboards and mice, and other user interactions with large-scale computerized systems such as aircraft and power plants. The Association for Computing Machinery defines human-machine interaction as "a discipline concerned with the design, evaluation and implementation of interactive computing systems for human use and with the study of major phenomena surrounding them". An important facet of HMI is securing user satisfaction.

Because human-machine interaction studies a human and a machine in conjunction, it draws from supporting knowledge on both the machine and the human side. On the machine side, techniques in computer graphics, operating systems, programming languages, and development environments are relevant. On the human side, communication theory, graphic and industrial design disciplines, linguistics, social sciences, cognitive psychology, and human factors are relevant. Engineering and design methods are also relevant. Due to the multidisciplinary nature of the field, people with different backgrounds contribute to its success. HCI is also sometimes referred to as human–machine interaction (HMI) or computer–human interaction (CHI).
  35. HUMAN–MACHINE (COMPUTER) INTERFACE

The human–machine (computer) interface can be described as the point of communication between the human user and the computer. The flow of information between the human and the computer is defined as the loop of interaction. The loop of interaction has several aspects, including:
- Task environment: the conditions and goals set upon the user.
- Machine environment: the environment that the computer is connected to, e.g. a laptop in a college student's dorm room.
- Areas of the interface: non-overlapping areas involve processes of the human and the computer not pertaining to their interaction, while the overlapping areas concern only the processes pertaining to their interaction.
- Input flow: the flow of information that begins in the task environment, when the user has some task that requires using the computer.
- Output: the flow of information that originates in the machine environment.
- Feedback: loops through the interface that evaluate, moderate, and confirm processes as they pass from the human through the interface to the computer and back.
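The loop of interaction described above can be sketched as a simple cycle. This is only an illustrative toy, not part of any standard; all names (`interaction_loop`, the lambda "machine") are hypothetical.

```python
# Minimal sketch of the loop of interaction: input flows from the task
# environment into the machine environment, output flows back, and
# feedback confirms each exchange to the user.

def interaction_loop(tasks, machine):
    feedback_log = []
    for task in tasks:                        # input flow: begins with a user task
        result = machine(task)                # machine environment processes it
        feedback_log.append((task, result))   # feedback: confirm the result
    return feedback_log

# Usage: a toy "machine" that simply acknowledges each command as output.
log = interaction_loop(["open file", "save file"], lambda cmd: f"done: {cmd}")
print(log[-1])
```

The point of the sketch is the shape of the cycle, input, processing, output, feedback, rather than any particular machine behavior.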
  36. Usability

User interfaces are considered by some authors to be a prime ingredient of computer user satisfaction.

The design of a user interface affects the amount of effort the user must expend to provide input for the system and to interpret the output of the system, and how much effort it takes to learn how to do this. Usability is the degree to which the design of a particular user interface takes into account the human psychology and physiology of the users, and makes the process of using the system effective, efficient and satisfying.

Usability is mainly a characteristic of the user interface, but it is also associated with the functionality of the product and the process of designing it. It describes how well a product can be used for its intended purpose by its target users with efficiency, effectiveness, and satisfaction, also taking into account the requirements of its context of use.

Consistency

A key property of a good user interface is consistency. There are three important aspects. First, the controls for different features should be presented in a consistent manner so that users can find the controls easily. For example, users find it very difficult to use software when some commands are available through menus, some through icons, and some through right-clicks. A good user interface might provide shortcuts or "synonyms" that provide parallel access to a feature, but users should not have to search multiple sources to find what they're looking for. Second, the "principle of least astonishment" is crucial: various features should work in similar ways. For example, some features in Adobe Acrobat are "select tool, then select the text to apply it to", while others are "select text, then apply an action to the selection". Third, user interfaces should not change from version to version; user interfaces must remain upward compatible.

Good user interface design is about setting and meeting user expectations. Better (from a programmer's point of view) is not better. The same (from a user's point of view) is better.
  37. USER INTERFACES IN COMPUTING

In computer science and human-computer interaction, the user interface (of a computer program) refers to the graphical, textual and auditory information the program presents to the user, and the control sequences (such as keystrokes with the computer keyboard, movements of the computer mouse, and selections with the touchscreen) the user employs to control the program.

Types

Currently (as of 2009) the following types of user interface are the most common:
- Graphical user interfaces (GUI) accept input via devices such as the computer keyboard and mouse and provide articulated graphical output on the computer monitor. There are at least two different principles widely used in GUI design: object-oriented user interfaces (OOUIs) and application-oriented interfaces.
- Web-based user interfaces or web user interfaces (WUI) accept input and provide output by generating web pages which are transmitted via the Internet and viewed by the user using a web browser program. Newer implementations utilize Java, AJAX, Adobe Flex, Microsoft .NET, or similar technologies to provide real-time control in a separate program, eliminating the need to refresh a traditional HTML-based web page. Administrative web interfaces for web servers, other servers and networked computers are often called control panels.

User interfaces that are common in various fields outside desktop computing:
- Command-line interfaces, where the user provides input by typing a command string with the computer keyboard and the system provides output by printing text on the computer monitor. Used by programmers and system administrators, in engineering and scientific environments, and by technically advanced personal computer users.
- Tactile interfaces supplement or replace other forms of output with haptic feedback methods. Used in computerized simulators, etc.
- Touch user interfaces are graphical user interfaces using a touchscreen display as a combined input and output device. Used in many types of point-of-sale systems, industrial processes and machines, self-service machines, etc.

Other types of user interfaces:
- Attentive user interfaces manage the user's attention, deciding when to interrupt the user, the kind of warnings to give, and the level of detail of the messages presented to the user.
- Batch interfaces are non-interactive user interfaces, where the user specifies all the details of the batch job in advance of batch processing and receives the output when all the processing is done. The computer does not prompt for further input after the processing has started.
- Conversational interface agents attempt to personify the computer interface in the form of an animated person, robot, or other character (such as Microsoft's Clippy the paperclip), and present interactions in a conversational form.
- Crossing-based interfaces are graphical user interfaces in which the primary task consists of crossing boundaries instead of pointing.
- Gesture interfaces are graphical user interfaces which accept input in the form of hand gestures, or mouse gestures sketched with a computer mouse or a stylus.
- Intelligent user interfaces are human-machine interfaces that aim to improve the efficiency, effectiveness, and naturalness of human-machine interaction by representing, reasoning, and acting on models of the user, domain, task, discourse, and media (e.g., graphics, natural language, gesture).
- Motion tracking interfaces monitor the user's body motions and translate them into commands; currently being developed by Apple.[2]
- Multi-screen interfaces employ multiple displays to provide a more flexible interaction. This is often employed in computer game interaction, in both commercial arcades and, more recently, the handheld market.
- Non-command user interfaces, which observe the user to infer his or her needs and intentions, without requiring that he or she formulate explicit commands.
- Object-oriented user interfaces (OOUI).
- Reflexive user interfaces, where the users control and redefine the entire system via the user interface alone, for instance to change its command verbs. Typically this is only possible with very rich graphical user interfaces.
- Tangible user interfaces, which place a greater emphasis on touch and the physical environment or its elements.
- Task-focused interfaces are user interfaces which address the information-overload problem of the desktop metaphor by making tasks, not files, the primary unit of interaction.
- Text user interfaces are user interfaces which output text, but accept other forms of input in addition to or in place of typed command strings.
- Voice user interfaces, which accept input and provide output by generating voice prompts. The user input is made by pressing keys or buttons, or by responding verbally to the interface.
- Natural-language interfaces, used for search engines and on web pages. The user types in a question and waits for a response.
- Zero-input interfaces get input from a set of sensors instead of querying the user with input dialogs.
- Zooming user interfaces are graphical user interfaces in which information objects are represented at different levels of scale and detail, and where the user can change the scale of the viewed area in order to show more detail.
  40. DESIGN PRINCIPLES

When evaluating a current user interface, or designing a new user interface, it is important to keep in mind the following experimental design principles:
- Early focus on users and tasks: establish how many users are needed to perform the task(s) and determine who the appropriate users should be; someone who has never used the interface, and will not use the interface in the future, is most likely not a valid user. In addition, define the task(s) the users will be performing and how often the task(s) need to be performed.
- Empirical measurement: test the interface early on with real users who come in contact with the interface on an everyday basis. Keep in mind that results may be skewed if the performance level of the test user is not an accurate depiction of real human-computer interaction. Establish quantitative usability specifics such as the number of users performing the task(s), the time to complete the task(s), and the number of errors made during the task(s).
- Iterative design: after determining the users, tasks, and empirical measurements to include, perform the following iterative design steps:
  1. Design the user interface
  2. Test
  3. Analyze results
  4. Repeat

Repeat the iterative design process until a sensible, user-friendly interface is created.
  41. DESIGN METHODOLOGIES

A number of diverse methodologies outlining techniques for human–machine interaction design have emerged since the rise of the field in the 1980s. Most design methodologies stem from a model of how users, designers, and technical systems interact. Early methodologies, for example, treated users' cognitive processes as predictable and quantifiable, and encouraged design practitioners to look to cognitive science results in areas such as memory and attention when designing user interfaces. Modern models tend to focus on constant feedback and conversation between users, designers, and engineers, and push for technical systems to be wrapped around the types of experiences users want to have, rather than wrapping the user experience around a completed system.
- User-centered design: user-centered design (UCD) is a modern, widely practiced design philosophy rooted in the idea that users must take center stage in the design of any computer system. Users, designers and technical practitioners work together to articulate the wants, needs and limitations of the user and create a system that addresses these elements. Often, user-centered design projects are informed by ethnographic studies of the environments in which users will be interacting with the system. This practice is similar but not identical to participatory design, which emphasizes the possibility for end users to contribute actively through shared design sessions and workshops.
- Principles of user interface design: these are seven principles that may be considered at any time during the design of a user interface, in any order: tolerance, simplicity, visibility, affordance, consistency, structure and feedback.

Display design: displays are human-made artifacts designed to support the perception of relevant system variables and to facilitate further processing of that information. Before a display is designed, the task that the display is intended to support must be defined (e.g. navigating, controlling, decision making, learning, entertaining, etc.). A user or operator must be able to process whatever information a system generates and displays; therefore, the information must be displayed according to principles, in a manner that supports perception, situation awareness, and understanding.
  42. THIRTEEN PRINCIPLES OF DISPLAY DESIGN

These principles of human perception and information processing can be utilized to create an effective display design. A reduction in errors, a reduction in required training time, an increase in efficiency, and an increase in user satisfaction are a few of the many potential benefits that can be achieved through utilization of these principles.

Certain principles may not be applicable to particular displays or situations. Some principles may seem to conflict, and there is no simple rule that says one principle is more important than another. The principles may be tailored to a specific design or situation. Striking a functional balance among the principles is critical for an effective design.

Perceptual Principles

1. Make displays legible (or audible). A display's legibility is critical and necessary for designing a usable display. If the characters or objects being displayed are not discernible, the operator cannot effectively make use of them.

2. Avoid absolute judgment limits. Do not ask the user to determine the level of a variable on the basis of a single sensory variable (e.g. color, size, loudness). These sensory variables can contain many possible levels.

3. Top-down processing. Signals are likely perceived and interpreted in accordance with what is expected based on a user's past experience. If a signal is presented contrary to the user's expectation, more physical evidence of that signal may need to be presented to ensure that it is understood correctly.

4. Redundancy gain. If a signal is presented more than once, it is more likely to be understood correctly. This can be done by presenting the signal in alternative physical forms
(e.g. color and shape, voice and print, etc.), as redundancy does not imply repetition. A traffic light is a good example of redundancy, as color and position are redundant.

5. Similarity causes confusion: use discriminable elements. Signals that appear similar will likely be confused. The ratio of similar features to different features causes signals to be similar. For example, A423B9 is more similar to A423B8 than 92 is to 93. Unnecessarily similar features should be removed and dissimilar features should be highlighted.

Mental Model Principles

6. Principle of pictorial realism. A display should look like the variable that it represents (e.g. a high temperature on a thermometer shown as a higher vertical level). If there are multiple elements, they can be configured in a manner that looks like they would in the represented environment.

7. Principle of the moving part. Moving elements should move in a pattern and direction compatible with the user's mental model of how they actually move in the system. For example, the moving element on an altimeter should move upward with increasing altitude.

Principles Based on Attention

8. Minimizing information access cost. When the user's attention is diverted from one location to another to access necessary information, there is an associated cost in time or effort. A display design should minimize this cost by placing frequently accessed sources at the nearest possible position. However, adequate legibility should not be sacrificed to reduce this cost.

9. Proximity compatibility principle. Divided attention between two information sources may be necessary for the completion of one task. These sources must be mentally integrated and are
defined to have close mental proximity. Information access costs should be low, which can be achieved in many ways (e.g. proximity, linkage by common colors, patterns, shapes, etc.). However, close display proximity can be harmful by causing too much clutter.

10. Principle of multiple resources. A user can more easily process information across different resources. For example, visual and auditory information can be presented simultaneously rather than presenting all information visually or all aurally.

Memory Principles

11. Replace memory with visual information: knowledge in the world. A user should not need to retain important information solely in working memory or to retrieve it from long-term memory. A menu, checklist, or other display can aid the user by easing the load on memory. However, the use of memory may sometimes benefit the user by eliminating the need to reference some type of knowledge in the world (e.g. an expert computer operator would rather use direct commands from memory than refer to a manual). The use of knowledge in the user's head and knowledge in the world must be balanced for an effective design.

12. Principle of predictive aiding. Proactive actions are usually more effective than reactive actions. A display should attempt to eliminate resource-demanding cognitive tasks and replace them with simpler perceptual tasks to reduce the use of the user's mental resources. This allows the user not only to focus on current conditions, but also to think about possible future conditions. An example of a predictive aid is a road sign displaying the distance to a certain destination.

13. Principle of consistency. Old habits from other displays will easily transfer to support processing of new displays if they are designed in a consistent manner. A user's long-term memory will trigger actions that are expected to be appropriate. A design must accept this fact and utilize consistency among different displays.
  45. MODALITIES AND MODES

A modality is a path of communication employed by the user interface to carry input and output. Examples of modalities:
- Input: a computer keyboard allows the user to enter typed text; a digitizing tablet allows the user to create free-form drawings.
- Output: a computer monitor allows the system to display text and graphics (visual modality); a loudspeaker allows the system to produce sound (auditory modality).

The user interface may employ several redundant input and output modalities, allowing the user to choose which ones to use for interaction.

A mode is a distinct method of operation within a computer program, in which the same input can produce different perceived results depending on the state of the computer program. Heavy use of modes often reduces the usability of a user interface, as the user must expend effort to remember current mode states and switch between mode states as necessary.

In the industrial design field of human-machine interaction, the user interface is the place where interaction between humans and machines occurs. The goal of interaction between a human and a machine at the user interface is effective operation and control of the machine, and feedback from the machine which aids the operator in making operational decisions. Examples of this broad concept of user interfaces include the interactive aspects of computer operating systems, hand tools, heavy machinery operator controls, and process controls. The design considerations applicable when creating user interfaces are related to, or involve, such disciplines as ergonomics and psychology.

A user interface is the system by which people (users) interact with a machine. The user interface includes hardware (physical) and software (logical) components. User interfaces exist for various systems, and provide a means of:
- input, allowing the users to manipulate a system, and/or
- output, allowing the system to indicate the effects of the users' manipulation.
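The effect of modes, the same input producing different results depending on program state, can be shown with a small vi-style sketch. This is a hypothetical toy, not the behavior of any real editor: in "command" mode the key `i` switches to "insert" mode, while in "insert" mode the same key is inserted as text.

```python
# Minimal sketch of a moded interface: pressing "i" has a different effect
# depending on which mode the program is currently in.

class ModalEditor:
    def __init__(self):
        self.mode = "command"   # mode state the user must keep track of
        self.text = ""

    def press(self, key):
        if self.mode == "command":
            if key == "i":            # "i" switches to insert mode...
                self.mode = "insert"
            elif key == "x":          # ...while "x" deletes the last character
                self.text = self.text[:-1]
        else:  # insert mode
            if key == "ESC":          # ESC returns to command mode
                self.mode = "command"
            else:
                self.text += key      # any other key is inserted literally

editor = ModalEditor()
for key in ["i", "h", "i", "ESC", "x"]:   # the two "i" presses differ in effect
    editor.press(key)
print(editor.text)  # "h": first "i" changed mode, second "i" typed a letter
```

This also illustrates why heavy use of modes hurts usability: the user must remember which mode is active before each keystroke.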
Generally, the goal of human-machine interaction engineering is to produce a user interface which makes it easy, efficient and enjoyable to operate a machine in the way which produces the desired result. This generally means that the operator needs to provide minimal input to achieve the desired output, and also that the machine minimizes undesired outputs to the human.

Ever since the increased use of personal computers and the relative decline in societal awareness of heavy machinery, the term "user interface" has taken on overtones of the (graphical) user interface, while industrial control panel and machinery control design discussions more commonly refer to human-machine interfaces.

Other terms for user interface include human-computer interface (HCI) and man-machine interface (MMI).
  47. INTERACTION TECHNIQUE

Fold n' Drop: a crossing-based interaction technique for dragging and dropping files between overlapping windows.

An interaction technique, user interface technique or input technique is a combination of hardware and software elements that provides a way for computer users to accomplish a single task. For example, one can go back to the previously visited page in a web browser by clicking a button, pressing a key, performing a mouse gesture or uttering a speech command. It is a key concept in human-computer interaction.

Definition

Although there is no general agreement on the exact meaning of the term "interaction technique", the most popular definition is from the computer graphics literature: an interaction technique is a way of using a physical input/output device to perform a generic task in a human-computer dialogue.

A more recent variation is: an interaction technique is the fusion of input and output, consisting of all software and hardware elements, that provides a way for the user to accomplish a task.
The computing view

From the computer's perspective, an interaction technique involves:
- one or several input devices that capture user input,
- one or several output devices that display user feedback,
- a piece of software that:
  - interprets user input into commands the computer can understand,
  - produces user feedback based on user input and the system's state.

Consider, for example, the process of deleting a file using a contextual menu. This assumes the existence of a mouse (input device), a screen (output device), and a piece of code that paints a menu and updates its selection (user feedback) and sends a command to the file system when the user clicks on the "delete" item (interpretation). User feedback can be further used to confirm that the command has been invoked.

The user's view

From the user's perspective, an interaction technique is a way to perform a single computing task and can be informally expressed with user instructions or usage scenarios. For example: "to delete a file, right-click on the file you want to delete, then click on the delete item".

The designer's view

From the user interface designer's perspective, an interaction technique is a well-defined solution to a specific user interface design problem. Interaction techniques as conceptual ideas can be refined, extended, modified and combined. For example, contextual menus are a solution to the problem of rapidly selecting commands. Pie menus are a radial variant of contextual menus.

Level of granularity

Interaction techniques are usually fine-grained entities. For example, a desktop environment is too complex to be an interaction technique, whereas Exposé fits the common intuitive understanding of the term perfectly well. In general, a user
interface can be seen as a combination of many interaction techniques, some of which are not necessarily as explicit as widgets.

Interaction tasks and domain objects

An interaction task is "the unit of an entry of information by the user" [1], such as entering a piece of text, issuing a command, or specifying a 2D position. A similar concept is that of a domain object, which is a piece of application data that can be manipulated by the user.[3]

Interaction techniques are the glue between physical I/O devices and interaction tasks or domain objects. Different types of interaction techniques can be used to map a specific device to a specific domain object. For example, different gesture alphabets exist for pen-based text input.

In general, the less compatible the device is with the domain object, the more complex the interaction technique. For example, using a mouse to specify a 2D point involves a trivial interaction technique, whereas using a mouse to rotate a 3D object requires more creativity to design the technique and more lines of code to implement it.

A current trend is to avoid complex interaction techniques by matching physical devices to the task as closely as possible, as exemplified by the field of tangible computing. But this is not always a feasible solution. Furthermore, device/task incompatibilities are unavoidable in computer accessibility, where a single switch can be used to control the whole computer environment.

Interaction style

Interaction techniques that share the same metaphor or design principles can be seen as belonging to the same interaction style. General examples are command-line and direct-manipulation user interfaces.

Visualization technique

Interaction techniques essentially involve data manipulation and thus place greater emphasis on input than output. Output is merely used to convey affordances and provide user feedback. The use of the term "input technique" further reinforces the central role of input. Conversely, techniques that mainly
involve data exploration, and thus place greater emphasis on output, are called visualization techniques. They are studied in the field of information visualization.

Research and innovation

A large part of research in human-computer interaction involves exploring easier-to-learn or more efficient interaction techniques for common computing tasks. This includes inventing new (post-WIMP) interaction techniques, possibly relying on methods from user interface design, and comparing them with existing techniques using methods from experimental psychology. Examples of scientific venues for these topics are the UIST and CHI conferences. Other research focuses on the specification of interaction techniques, sometimes using formalisms such as Petri nets for the purposes of formal verification.
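Returning to the computing view described earlier, the delete-a-file technique can be sketched as input events being interpreted into a file-system command, with feedback at each step. All names here (`handle_event`, the event strings) are hypothetical and stand in for what a real toolkit would provide.

```python
# Sketch of the contextual-menu "delete file" interaction technique:
# raw input events -> interpretation into a command -> user feedback.
import os
import tempfile

def handle_event(event, target_path, feedback):
    """Interpret a raw input event and report the result (feedback)."""
    if event == "right-click":
        feedback.append("menu shown: [open, rename, delete]")  # paint the menu
    elif event == "click:delete":
        os.remove(target_path)                                 # the command
        feedback.append("deleted " + os.path.basename(target_path))  # confirm

# Usage: simulate the user's two input events against a temporary file.
feedback = []
fd, path = tempfile.mkstemp()
os.close(fd)
handle_event("right-click", path, feedback)
handle_event("click:delete", path, feedback)
print(feedback)
```

The split mirrors the definition: the event strings play the role of the input device, the `feedback` list plays the role of the screen, and `handle_event` is the interpreting software.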
  51. HUMAN INTERFACE DEVICES

A human interface device or HID is a type of computer device that interacts directly with, and most often takes input from, humans, and may deliver output to humans. The term "HID" most commonly refers to the USB-HID specification. The term was coined by Mike Van Flandern of Microsoft when he proposed that the USB committee create a Human Input Device class working group. The working group was renamed the Human Interface Device class at the suggestion of Tom Schmidt of DEC, because the proposed standard supported bi-directional communication.

History

The primary motivations for HID were to enable innovation in PC input devices and to simplify the process of installing these devices. Prior to HID, devices usually conformed to very narrowly defined protocols for mice, keyboards and joysticks (for example, the standard mouse protocol at the time supported relative X- and Y-axis data and binary input for up to two buttons). Any innovation in hardware required overloading the use of data in an existing protocol, or creating custom device drivers and evangelizing a new protocol to application developers. By contrast, all HID devices deliver self-describing packages that may contain an infinite variety of data types and formats. A single HID driver on the PC parses the data and enables dynamic association of data I/O with application functionality. This has enabled rapid innovation and proliferation of new human interface devices.

The HID standard was developed by a working committee with representatives from several companies; the list of participants can be found in the "Device Class Definition for Human Interface Devices (HID)" document. The concept of a self-describing extensible protocol was initially conceived by Mike Van Flandern and Manolito Adan, working on a project named Raptor at Microsoft, and independently by Steve McGowan, working on a device protocol for Access Bus while at Forte.
After comparing notes at a Consumer Game Developer Conference, Steve and Mike agreed to collaborate on a new standard for the emerging Universal Serial Bus.
  52. HARDWARE

Hardware input/output devices and peripherals:

List of input devices:
- Unit record equipment
- Barcode scanner
- Keyboard
  - Computer keyboard
  - Keyboard shortcut
  - Ways to make typing more efficient: command history, autocomplete, autoreplace and IntelliSense
- Microphone
- Pointing device
  - Computer mouse
  - Mouse chording

List of output devices:
- Visual devices
  - Graphical output device
  - Display device
    - Computer display
    - Video projector
  - Computer printer
  - Plotter
- Auditory devices
  - Speakers
  - Earphones
- Tactile devices
  - Refreshable Braille display
  - Braille embosser
  - Haptic devices
  53. Common HIDs
- Keyboard
- Mouse, trackball, touchpad, pointing stick
- Graphics tablet
- Joystick, gamepad, analog stick
- Webcam
- Headset

Less common HIDs
- Driving-simulator and flight-simulator devices have HIDs such as gear sticks, steering wheels and pedals.
- Wired glove (Nintendo Power Glove)
- Dance pad
- Wii Remote
- Surface computing devices
- Apple's Sudden Motion Sensor (SMS) device in Macs

Most operating systems will recognize standard USB HID devices, like keyboards and mice, without needing a special driver. When one is installed, a message saying that a "HID-compliant device" has been recognized generally appears on screen. In comparison, this message does not usually appear for devices connected via the PS/2 6-pin DIN connectors which preceded USB. PS/2 does not support plug-and-play, which means that connecting a PS/2 keyboard or mouse with the computer powered on does not always work. In addition, PS/2 does not support the HID protocol. A USB HID is described by the USB human interface device class.
  54. 54. Components of the HID protocol
In the HID protocol there are two entities: the "host" and the "device". The device is the entity that directly interacts with a human, such as a keyboard or mouse. The host communicates with the device and receives input data from the device about actions performed by the human. Output data flows from the host to the device and then to the human. The most common example of a host is a computer, but some cell phones and PDAs can also be hosts.
The HID protocol makes implementation of devices very simple. Devices define their data packets and then present a "HID descriptor" to the host. The HID descriptor is a hard-coded array of bytes that describes the device's data packets. This includes how many packets the device supports, how large the packets are, and the purpose of each byte and bit in the packet. For example, a keyboard with a calculator-program button can tell the host that the button's pressed/released state is stored as the 2nd bit in the 6th byte of data packet number 4 (note: these locations are only illustrative and are device-specific). The device typically stores the HID descriptor in ROM and does not need to intrinsically understand or parse it. Some mouse and keyboard hardware on the market today is implemented using only an 8-bit CPU.
The host is expected to be a more complex entity than the device. The host needs to retrieve the HID descriptor from the device and parse it before it can fully communicate with the device. Parsing the HID descriptor can be complicated: multiple operating systems are known to have shipped bugs in the device drivers responsible for parsing HID descriptors years after those drivers were originally released to the public. However, this complexity is also the reason why rapid innovation with HID devices is possible.
The mechanism above describes what is known as HID "report protocol". Because it was understood that not all hosts would be capable of parsing HID descriptors, HID also defines a "boot protocol". In boot protocol, only specific devices with only specific features are supported, because fixed data-packet formats are used. The HID descriptor is not used in this mode, so innovation is limited. However, the benefit is that minimal functionality is still possible on hosts that would otherwise be unable to support HID. The only devices supported in boot protocol are:
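To make the "hard-coded array of bytes" idea concrete, the sketch below spells out the well-known example report descriptor for a simple three-button mouse from the USB HID specification, annotated item by item. It is shown here as a Python byte array purely for illustration; a real device would bake these bytes into its ROM and send them to the host on request.

```python
# Example HID report descriptor for a simple 3-button mouse (the sample
# descriptor from the USB HID specification).  Each pair of bytes is one
# "item": a one-byte tag followed by a one-byte value.
MOUSE_REPORT_DESCRIPTOR = bytes([
    0x05, 0x01,  # Usage Page (Generic Desktop)
    0x09, 0x02,  # Usage (Mouse)
    0xA1, 0x01,  # Collection (Application)
    0x09, 0x01,  #   Usage (Pointer)
    0xA1, 0x00,  #   Collection (Physical)
    0x05, 0x09,  #     Usage Page (Buttons)
    0x19, 0x01,  #     Usage Minimum (Button 1)
    0x29, 0x03,  #     Usage Maximum (Button 3)
    0x15, 0x00,  #     Logical Minimum (0)
    0x25, 0x01,  #     Logical Maximum (1)
    0x95, 0x03,  #     Report Count (3)
    0x75, 0x01,  #     Report Size (1)
    0x81, 0x02,  #     Input (Data, Variable, Absolute): 3 button bits
    0x95, 0x01,  #     Report Count (1)
    0x75, 0x05,  #     Report Size (5)
    0x81, 0x01,  #     Input (Constant): 5 bits of padding
    0x05, 0x01,  #     Usage Page (Generic Desktop)
    0x09, 0x30,  #     Usage (X)
    0x09, 0x31,  #     Usage (Y)
    0x15, 0x81,  #     Logical Minimum (-127)
    0x25, 0x7F,  #     Logical Maximum (127)
    0x75, 0x08,  #     Report Size (8)
    0x95, 0x02,  #     Report Count (2)
    0x81, 0x06,  #     Input (Data, Variable, Relative): X and Y deltas
    0xC0,        #   End Collection
    0xC0,        # End Collection
])

# The whole descriptor is only 50 bytes -- small enough for the ROM of
# the 8-bit microcontrollers mentioned in the text.
assert len(MOUSE_REPORT_DESCRIPTOR) == 50
```

The host parses this once and then knows, without any device-specific driver, that each input report carries three button bits, five padding bits, and two signed 8-bit movement deltas.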
  55. 55. Keyboard — Any of the first 256 key codes ("Usages") defined in the HID Usage Tables, Usage Page 7, can be reported by a keyboard using the boot protocol, but most systems handle only a subset of these keys. Most systems support all 104 keys on the IBM AT-101 layout, plus the three new keys designed for Microsoft Windows 95. Many systems also support additional keys on the basic western European 105-, Korean 106-, Brazilian ABNT 107- and Japanese DOS/V 109-key layouts. Buttons, knobs and keys that are not reported on Usage Page 7 are not available. For example, a particular US keyboard's QWERTY keys will function, but its Calculator and Logoff keys will not, because they are defined on Usage Page 12 and cannot be reported in boot protocol.
 Mouse — Only the X-axis, Y-axis, and the first 3 buttons will be available. Any additional features on the mouse will not function.
One common use of boot mode is during the first moments of a computer's boot-up sequence; directly configuring a computer's BIOS is often done using only boot mode.
Other protocols using HID
Since HID's original definition over USB, HID is now also used on other computer communication buses. This enables HID devices that were traditionally found only on USB to be used on alternative buses as well. This is done because existing support for USB HID devices can typically be adapted much faster than an entirely new protocol to support mice, keyboards, and the like could be invented. Known buses that use HID are:
 Bluetooth HID — Bluetooth is a wireless communications technology. Several Bluetooth mice and keyboards already exist in the marketplace.
 Serial HID — Used in Microsoft's Windows Media Center PC remote-control receivers
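The boot-protocol mouse limitation described above (X-axis, Y-axis, first 3 buttons only) follows from its fixed report format: one byte of button bits followed by signed 8-bit X and Y deltas. Because the layout never varies, a host can decode it without parsing any descriptor. A minimal host-side decoder might look like the following sketch (the function and key names are illustrative, not part of any specification):

```python
def parse_boot_mouse_report(report: bytes) -> dict:
    """Decode a fixed-format boot-protocol mouse report.

    Byte 0 holds the button bitmap (bits 0-2 = buttons 1-3);
    bytes 1 and 2 are signed 8-bit X and Y movement deltas.
    """
    if len(report) < 3:
        raise ValueError("boot mouse report is at least 3 bytes")

    def to_signed(b: int) -> int:
        # Interpret one byte as an 8-bit two's-complement value.
        return b - 256 if b > 127 else b

    return {
        "button1": bool(report[0] & 0x01),
        "button2": bool(report[0] & 0x02),
        "button3": bool(report[0] & 0x04),
        "dx": to_signed(report[1]),
        "dy": to_signed(report[2]),
    }

# Button 1 held while moving 5 units right and 5 units up (negative Y):
state = parse_boot_mouse_report(bytes([0x01, 0x05, 0xFB]))
# -> {'button1': True, 'button2': False, 'button3': False, 'dx': 5, 'dy': -5}
```

A mouse with a scroll wheel or extra buttons would still produce these three bytes in boot mode, which is exactly why "any additional features on the mouse will not function" there.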