ISSN: 2277 – 9043
International Journal of Advanced Research in Computer Science and Electronics Engineering
Volume 1, Issue 5, July 2012

Roadmaps for Usability Engineering

Sanjeev Narayan Bal
Asst. Prof., Dept. of Comp. Sc.
Trident Academy of Creative Technology, Bhubaneswar
firstname.lastname@example.org

Abstract-- Usability engineering is a cost-effective, user-centered process that ensures a high level of effectiveness, efficiency, and safety in complex interactive systems. A major challenge, and thus opportunity, in the field of human-computer interaction (HCI), and specifically usability engineering (UE), is designing effective user interfaces for emerging technologies that have no established design guidelines or interaction metaphors, or that introduce completely new ways for users to perceive and interact with technology and the world around them. This paper presents a brief description of usability engineering activities and discusses our experiences leading usability engineering activities for different applications. We propose a usability engineering approach that employs user-based studies to inform design, by iteratively inserting a series of user-based studies into a traditional usability engineering lifecycle to better inform initial user interface designs. We will also discuss interaction, visualization, and adaptive user support for maps on mobile devices, and we propose an entangled User Centered System Design (UCSD) and Feature Driven Development (FDD) process for MCIs, in which a usability engineering model is used.

Index Terms-- Usability engineering, mobile devices, Mass Casualty Incident, User Centered Design, Feature Driven Development, virtual environments

I. INTRODUCTION

Usability engineering is a cost-effective, user-centered process that ensures a high level of effectiveness, efficiency, and safety in complex interactive systems. Activities in this process include user analysis, user task analysis, conceptual and detailed user interface design, quantifiable usability metrics, rapid prototyping, and various kinds of user-centered evaluations of the user interface. These activities are further explained in Section "Activities in Usability Engineering." Usability engineering produces highly usable user interfaces that are essential to reduced manning, reduced human error, and increased productivity. Unfortunately, managers and developers often have the misconception that usability engineering activities add costs to a product's development life cycle. In fact, usability engineering can reduce costs over the life of the product, by reducing the need to add missed functionality later in the development cycle, when such additions are more expensive. The process is an integral part of interactive application development, just as are systems engineering and software engineering. Usability engineering activities can be tailored as needed for a specific project or product development effort. The usability engineering process applies to any interactive system, ranging from training applications to multimedia CD-ROMs to augmented and virtual environments to simulation applications to graphical user interfaces (GUIs). The process is flexible enough to be applied at any stage of the development life cycle, although early use provides the best opportunity for cost savings.

Fig. 1 Typical user-centered activities associated with our usability engineering process. Although the usual flow is generally left-to-right from activity to activity, the arrows indicate the substantial iterations and revisions that occur in practice.

II. ACTIVITIES IN USABILITY ENGINEERING
As mentioned in the introduction, usability engineering consists of numerous activities. Figure 1 shows a simple diagram of the major activities. Usability engineering includes both design and evaluation with users; it is not just applicable at the evaluation phase. Usability engineering is not typically hypothesis-testing-based experimentation, but instead is structured, iterative user-centered design and evaluation applied during all phases of the interactive system development life cycle. Most existing usability engineering methods were spawned by the development of traditional desktop graphical user interfaces (GUIs). In the following sections, we discuss several of the major usability engineering activities, including domain analysis, expert evaluation (also sometimes called heuristic evaluation or usability inspection), formative usability evaluation, and summative usability evaluation.

Domain Analysis

Domain analysis is the process by which answers to two critical questions about a specific application context are determined:
• Who are the users?
• What tasks will they perform?
Thus, a key activity in domain analysis is user task analysis, which produces a complete description of the tasks, subtasks, and actions that an interactive system should provide to support its human users, as well as of other resources necessary for users and the system to cooperatively perform tasks. While it is preferable that user task analyses be performed early in the development process, like all aspects of user interface development, task analyses also need to be flexible and potentially iterative, allowing for modifications to user performance and other user interface requirements during any stage of development. In our experience, interviewing an existing and/or identified user base, along with subject matter experts and application "visionaries", provides very useful insight into what users need and expect from an application. Observation-based analysis requires a user interaction prototype and, as such, is used as a last resort. A combination of early analysis of application documentation (when available) and interviews with subject matter experts typically provides the most effective user task analysis. Domain analysis generates critical information used throughout all stages of the usability engineering life cycle. A key result is a top-down, typically hierarchical decomposition of detailed user task descriptions. This decomposition serves as an enumeration and explanation of desired functionality, and of required task sequences, for use by designers and evaluators. Other key results are one or more detailed scenarios describing potential uses of the application, and a list of user-centered requirements. Without a clear understanding of application domain user tasks and user requirements, both evaluators and developers are forced to "best guess" or interpret desired functionality, which inevitably leads to poor user interface design.

Expert Evaluation

Expert evaluation (also called heuristic evaluation or usability inspection) is the process of identifying potential usability problems by comparing a user interface design to established usability design guidelines. The identified problems are then used to derive recommendations for improving that design. This method is used by usability experts to identify critical usability problems early in the development cycle, so that these design issues can be addressed as part of the iterative design process. Often, usability experts rely explicitly and solely on established usability design guidelines to determine whether a user interface design effectively and efficiently supports user task performance (i.e., usability). But usability experts can also rely more implicitly on design guidelines and work through user task scenarios during their evaluation. Nielsen [10] recommends three to five evaluators for an expert evaluation, and has shown empirically that fewer evaluators generally identify only a small subset of problems, while more evaluators produce diminishing results at higher costs. Each evaluator first inspects the design alone, independently of other evaluators' findings. Then the evaluators combine their data to analyze both common and conflicting usability findings. Results from an expert evaluation should not only identify problematic user interface components and interaction techniques, but should also indicate why a particular component or technique is problematic. This is arguably the most cost-effective type of usability evaluation, because it does not involve users.

Formative Usability Evaluation

Formative evaluation is the process of assessing, refining, and improving a user interface design by having representative users perform task-based scenarios, observing their performance, and collecting and analyzing data to empirically identify usability problems. This observational evaluation method can ensure usability of interactive systems by including users early and continually throughout user interface development. The method relies heavily on usage context (e.g., user tasks, user motivation), as well as on a solid understanding of that context. Formative evaluation produces both qualitative and quantitative results collected from representative users during their performance of task scenarios. Qualitative data include critical incidents: user events that have a significant impact, either positive or negative, on users' task performance and/or satisfaction. Quantitative data include metrics such as how long it takes a user to perform a given task, the number of errors encountered during task performance, measures of user satisfaction, and so on. Collected quantitative data are then compared to appropriate baseline metrics, sometimes initially redefining or altering evaluators' perceptions of what should be considered baseline. Qualitative and quantitative data are equally important, since each provides unique insight into a user interface design's strengths and weaknesses.

Summative Usability Evaluation

Summative evaluation, in contrast to formative evaluation, is a process that is typically performed after a product or some part of its design is more or less complete. Its purpose is to statistically compare several different systems or candidate designs, for example to determine which one is "better," where "better" is defined in advance. In practice, summative evaluation can take many forms. The most common are the comparative study, the field trial, and, more recently, the expert review. While both the field trial and expert review methods are well suited for design assessment, they typically involve assessment of single prototypes or field-delivered designs. Our experience has been that the empirical comparative approach employing representative users is very effective for analyzing strengths and weaknesses of various well-formed candidate designs set within appropriate user scenarios. However, it is the most costly type of evaluation, because it may need large numbers of users to achieve statistical validity and reliability, and because data analysis can be complex and challenging.
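The combine-then-analyze step of expert evaluation described above is mechanical enough to sketch in code. The following Python fragment is illustrative only: the problem labels and the two-evaluator threshold for calling a finding "common" are our assumptions, not prescriptions from the paper. It merges independent evaluators' findings and separates problems reported by several evaluators from singletons that merit discussion.

```python
from collections import Counter

def aggregate_findings(per_evaluator_findings):
    """Merge independent evaluators' problem lists.

    Each evaluator inspects the design alone; findings are combined only
    afterwards, mirroring the process described above. A problem reported
    by more than one evaluator is treated as 'common'; a problem reported
    by a single evaluator is a 'singleton' flagged for group discussion.
    """
    counts = Counter()
    for findings in per_evaluator_findings:
        counts.update(set(findings))  # de-duplicate within one evaluator
    common = sorted(p for p, c in counts.items() if c > 1)
    singletons = sorted(p for p, c in counts.items() if c == 1)
    return {"evaluators": len(per_evaluator_findings),
            "common": common,
            "singletons": singletons}

# Hypothetical findings from three evaluators (Nielsen's recommended minimum):
report = aggregate_findings([
    ["ambiguous-icons", "no-undo"],
    ["no-undo", "low-contrast-text"],
    ["no-undo", "ambiguous-icons"],
])
print(report["common"])  # problems seen by more than one evaluator
```

In a real evaluation each finding would also carry the violated guideline and a severity rating; the counting logic stays the same.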
III. USABILITY ENGINEERING APPROACHES TO DESIGNING USER INTERFACES

To date, numerous approaches to software and user interface design have been developed and applied. The waterfall model, developed by Royce, was the first widely known approach to software engineering. It takes a top-down approach based on functional decomposition. Royce admitted that, while the process was designed to support large software development efforts, it was inherently flawed because it did not support iteration, a property he eventually added to the model. The spiral model (Boehm [1]) was the first widely recognized approach that utilized and promoted iteration. It is useful for designing user interfaces (as well as software), because it allows the details of user interfaces to emerge over time, with iterative feedback from evaluation sessions feeding design and redesign. As with usability engineering approaches, the spiral model first creates a set of user-centered requirements through a suite of traditional domain analysis activities (e.g., structured interviews, participatory design, etc.). Following requirements analysis, however, the second step simply states that a "preliminary design is created for the new system".

Hix and Hartson [2] describe a star life cycle that is explicitly designed to support the creation of user interfaces. The points of the star represent typical design/development activities such as "user analyses", "requirements/usability specifications", "rapid prototyping", etc., with each activity connected through a single central "usability evaluation" activity. The points of the star are not ordered, so one can start at any point in the process, but can only proceed to another point via usability evaluation. The design activities focus on moving from a conceptual design to a detailed design.

Mayhew [3] describes a usability engineering lifecycle that is iterative and centered on integrating users throughout the entire development process. With respect to design, this lifecycle relies on screen design standards, which are iteratively evaluated and updated. Both the screen design standards and the detailed user interface designs rely on style guides, which can take the form of a "platform" style guide (e.g., Mac, Windows, etc.), a "corporate" style guide (applying a corporate "look and feel"), a "product family" style guide (e.g., MS Office Suite), etc.

Gabbard, Hix and Swan [4] present a cost-effective, structured, iterative methodology for user-centered design and evaluation of virtual environment (VE) user interfaces and interaction. Fig. 2 depicts the general methodology, which is based on sequentially performing:
1. user task analysis,
2. expert guidelines-based evaluation,
3. formative user-centered evaluation, and
4. summative comparative evaluations.
While similar methodologies have been applied to traditional (GUI-based) computer systems, this methodology is novel because it was specifically designed for, and applied to, VEs, and it leverages a set of heuristic guidelines specifically designed for VEs. These sets of heuristic guidelines were derived from Gabbard's taxonomy of usability characteristics for VEs [5,6].

Fig. 2 The user-centered design and evaluation methodology for virtual environment user interaction described by Gabbard.

A shortcoming of this approach is that it does not give much guidance for design activities. The approach does not describe how to engage in design activities, but instead asserts that initial designs can be created using input from task descriptions, sequences, and dependencies, as well as guidelines and heuristics from the field. Since the methodology assumes the presence of guidelines and heuristics to aid the designs to be evaluated during the "expert guidelines-based evaluation" phase, it is not applicable to emerging technologies such as augmented reality, where user interface design guidelines and heuristics have not yet been established.

When examining many of the approaches described above, and specifically their design and evaluation activities, in most cases the design activities rely on leveraging existing metaphors, style guides, or standards in the field (e.g., drop-down menus, a browser's "back" button, etc.). However, where an application falls within an emerging technological field, designers often have no existing metaphors or style guides, much less standards, on which to base their design. Moreover, where the technology provides novel approaches to user interaction or fundamentally alters the way users perceive the interaction space (i.e., where technology and the real world come together), designers often have little understanding of the perceptual or cognitive ramifications of "best guess" designs. As a result, a process is needed to help designers of novel user interfaces iteratively create and evaluate designs, to gain a better understanding of effective design parameters, and to determine under what conditions those parameters are best applied. Without such a process, applications developed using traditional usability engineering approaches can only improve usability incrementally from initial designs which, again, are often based on developers' best guesses, given the absence of guidelines, metaphors, and standards.
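Of the lifecycles surveyed above, the star life cycle has the most clear-cut structural rule: the points are unordered, but every move between activities must pass through the central usability evaluation. A minimal sketch of that constraint follows; the activity names are paraphrased from the text, and the class itself is our illustration, not part of any of the cited lifecycles.

```python
# Points of the star (unordered design/development activities) and the
# single central activity that every transition must pass through.
STAR_POINTS = {
    "user analysis",
    "requirements/usability specifications",
    "conceptual design",
    "detailed design",
    "rapid prototyping",
}
CENTER = "usability evaluation"

class StarLifecycle:
    """Enforces the star rule: point -> point moves are illegal; one must
    go point -> evaluation -> point. One may start at any activity."""

    def __init__(self, start):
        assert start in STAR_POINTS or start == CENTER
        self.current = start

    def move_to(self, activity):
        assert activity in STAR_POINTS or activity == CENTER
        if self.current in STAR_POINTS and activity in STAR_POINTS:
            raise ValueError("must pass through usability evaluation first")
        self.current = activity
        return self.current

cycle = StarLifecycle("rapid prototyping")
cycle.move_to(CENTER)             # legal: any point -> evaluation
cycle.move_to("detailed design")  # legal: evaluation -> any point
```

Attempting `cycle.move_to("user analysis")` at this point would raise, because it would connect two star points without an intervening evaluation.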
User Interface Design Activities for Augmented Reality

As shown in Fig. 3, it can be argued that user-based experiments are critical for driving design activities, usability, and discovery early in an emerging technology's development (such as AR). As a technological field evolves, lessons learned from conducting user-based studies are not only critical for the usability of a particular application, but provide value to the field as a whole, in terms of insight into a part of the user interface design space (e.g., occlusion or text legibility). As time progresses, contributions to the field (from many researchers) begin to form a collection of informal design guidelines and metaphors from which researchers and application designers alike can draw. Eventually, the informal design guidelines are shaken down into a collection of tried-and-true guidelines and metaphors that are adopted by the community. Finally, the guidelines and metaphors become "de facto" standards, or at best are deemed "standards" by appropriate panels and committees. The context of the work reported here, however, falls within the application of user-based studies to inform user interface design: the left-most box of Fig. 3.

Fig. 3 User-based experiments are a critical vehicle for discovery and usability early in an emerging field's development. Over time, contributions from the field emerge, leading eventually to adopted user interface design guidelines and standards.

Based on our experiences performing usability engineering, and specifically design and evaluation activities for the Battlefield Augmented Reality System, we propose an updated approach to user interface design activities for augmented reality systems. This approach emphasizes iterative design activities between the user task analysis phase (where requirements are gathered and user tasks understood) and the formative user-centered evaluation phase (where an application-level user interface prototype has been developed and is under examination) (Fig. 2). With this approach, we couple expert evaluation and user-based studies to assist in the user interface design activity (Fig. 3). Expert evaluations can be iteratively combined with well-designed user-based studies to refine designers' understanding of the design space and of effective design parameters (e.g., to identify subsequent user-based studies), and, most importantly, to refine user interface designs. A strength of this approach is that interface design activities are driven by a number of inputs: the user task analysis phase (Fig. 2), user interface design parameters correlated with good user interface performance (derived from user-based studies), and expert evaluation results. With this methodology, expert evaluations along with user-based studies are iteratively applied to refine the user interface design space.

Of the three main activities shown in Fig. 3, there are two logical starting points: user interface design and user-based studies. An advantage of starting with user interface design activities is that designers can start exploring the design space prior to investing time in system development and, moreover, can explore a number of candidate designs quickly and easily. In the past, we have successfully used PowerPoint mockups to examine dozens of AR design alternatives. If mocked up correctly, the static designs can be presented through an optical see-through display, which allows designers to get an idea of how the designs may be perceived when viewed through an AR display in a representative context (e.g., indoors versus outdoors). Once a set of designs has been created, expert evaluations can be applied to assess the static user interface designs, culling those that are likely to be less effective than others. The expert evaluations are also useful for further understanding the design space, by identifying potential user-based experimental factors and levels. Once identified, user-based studies can be conducted to further examine those factors and levels, to determine, for example, whether the findings of the expert evaluation match those of the user-based studies. In cases where the design space is somewhat understood and designers have specific questions about how different design parameters might support user task performance, designers may be able to conduct a user-based study as a starting point. Under this approach, designers start with experimental design parameters as opposed to specific user interface designs. As shown in Fig. 4, user-based studies not only identify user interface design parameters to assist in UI design, but also have the potential to produce UI design guidelines and lessons learned, as well as to generate innovation, which provides tangible contributions to the field while also improving the usability of a specific application. Ultimately, a set of iteratively refined user interface designs is produced that forms the basis for the overall application user interface design. This design can then be evaluated using formative user-centered evaluation, as described by Hix, Gabbard, and Swan.

Fig. 4 We applied the depicted user-centered design activities as part of our overall usability engineering approach when developing the Battlefield Augmented Reality System.
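Once an expert evaluation has suggested experimental factors and levels, enumerating the conditions of a full-factorial user-based study is straightforward. The sketch below uses hypothetical AR factors; the factor and level names are ours for illustration, not taken from the paper.

```python
from itertools import product

# Hypothetical factors and levels of the kind an expert evaluation of AR
# mockups might surface (all names are illustrative assumptions):
factors = {
    "drawing_style": ["wireframe", "filled"],
    "lighting": ["indoor", "outdoor-overcast", "outdoor-sunny"],
    "text_size": ["small", "large"],
}

# A full-factorial study crosses every level of every factor.
names = list(factors)
conditions = [dict(zip(names, levels)) for levels in product(*factors.values())]
print(len(conditions))  # 2 * 3 * 2 = 12 experimental conditions
```

Crossing factors this way makes the cost of adding a factor explicit: each new factor multiplies the number of conditions, which is one reason expert evaluations are useful for culling factors before committing to a user-based study.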
IV. USABILITY ENGINEERING FOR MOBILE DEVICES

Usability determines to a major extent the success of products and services that are based on information and communication technology. Usability engineering is an approach to developing software that is easy to use, effective, and efficient, and is in our opinion based on three principles: 1) early and continuous focus on users and tasks, 2) empirical measurement, and 3) iterative design. In UE, several development cycles (Figure 5), with assessments and re-specifications, are worked through. A complex and interesting scenario is developed with users, from which user requirements and features can be derived. The features are implemented, and their quality is assessed by human-computer interaction (HCI) metrics. These metrics are closely related to the social challenges regarding comfort and acceptance, but the technological challenges are also addressed in the metrics. When taking the user as the center of the design, it is also important to take into account his or her cognitive task load, for which several metrics exist. The assessment of the metrics can be done in several different ways, either by experts or by users.

Fig. 5 The usability engineering method.

UE is a good design approach for a usable mobile map, but thorough understanding of the dynamic use context is crucial for user-centered design of mobile applications [7, 12]. With mobile devices, it is necessary to eventually test them while the user is mobile, which makes evaluation difficult. Depending on the task and the application, usability can differ greatly between a sunny day and a rainy day, or between a noisy environment and a silent one. The choice between a lab experiment and a field experiment is therefore far from trivial for mobile devices. Use of the mobile map is mostly a secondary task; the primary task (route planning, or looking for a nearby restaurant) has a strong influence on the way of use. Not only does the device have to have a high degree of fidelity, but the primary task has to be simulated realistically too. Zhang and Adipat [13] give an overview of challenges in usability testing of mobile devices. These challenges are the same as the design challenges for mobile applications: mobile context, connectivity, small screen size, different display resolutions, limited processing capability and power, and different data entry methods. An example is that low screen resolution can have disastrous effects on the usability of a mobile application. There are different frameworks for usability testing of mobile devices [11,13]. In all frameworks, an important question is whether to do a usability test in the laboratory or in the field, and whether experts or prospective users are involved. Lab experiments are appropriate for improving the interface design, for which the real device or an emulator can be used. Field experiments, on the other hand, are more appropriate when the final application is tested. The framework of Streefkerk et al. [11] extends the framework of Zhang and Adipat in that it not only gives instructions on which experimental method to use, but also gives constraints on when to use which experimental method.

We conclude that difficulties in usability testing for mobile devices exist, but several frameworks are available which can be used for designing usable mobile map applications. Because there are many challenges for mobile applications, ease of use and effectiveness are crucial. Using a sound usability engineering approach, we decide which interface features for mobile maps help in dealing with the technological, environmental, and social challenges.

Example: Tourist information and navigation support using 3D maps displayed on mobile devices

Laakso, Gjesdal and Sulebak [8] developed 3D maps with tourist information and GPS for mobile devices. In this study there was a strong focus on user requirements and on feedback from potential users on the prototypes. The study consisted of three iterations. In the first iteration, the intended user group was asked how they performed the task which the application was going to support, and which functionalities they would like to have in the proposed application. With the answers, it was possible to create a prototype, which was tested in the second iteration. The application was tested in a usability test and with focus groups. Both the participants of the usability test and the participants of the focus groups belonged to the group of intended users. The tasks performed in the usability test were typical tasks for the application, and were performed in a realistic field environment. There was one drawback in the design of the experiment: the 3D map was shown on a mobile device, whereas the 2D map was on paper, so it was difficult to compare the two views. In the final iteration, another prototype was evaluated with the use of usability tests and questionnaires. The 3D map was found fun but less usable than the 2D map, for which an explanation could be that participants are used to 2D maps. Furthermore, the results showed that location positioning, for example using GPS, is very important for map information on mobile devices. The experimental set-up of this study is a sound example of usability engineering. All three general principles are followed: there is an early and continuous focus on the user, empirical measurements are used, and the design is iterative.
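A comparison like the 2D-versus-3D map test above can be made statistical with very little machinery. The sketch below runs a two-sided permutation test on task-completion times, a stdlib-only alternative to a t-test for summative-style comparison of two candidate designs. The numbers are invented for illustration and are not data from the Laakso et al. study.

```python
import random

def mean(xs):
    return sum(xs) / len(xs)

def permutation_p_value(a, b, trials=10_000, seed=42):
    """Two-sided permutation test on the difference of mean task times.

    Repeatedly shuffles the pooled samples and counts how often a random
    split produces a mean difference at least as large as the observed one.
    """
    rng = random.Random(seed)
    observed = abs(mean(a) - mean(b))
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        diff = abs(mean(pooled[:len(a)]) - mean(pooled[len(a):]))
        if diff >= observed:
            hits += 1
    return hits / trials

# Hypothetical task-completion times in seconds (illustrative data only):
times_2d = [41, 38, 45, 39, 42, 40]
times_3d = [55, 60, 52, 58, 49, 57]
p = permutation_p_value(times_2d, times_3d)
```

With samples this cleanly separated the test reports a very small p-value; in practice, sample sizes large enough for statistical validity are exactly what makes summative comparison the most costly kind of evaluation.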
In this experiment, the technological challenge of viewpoint visualization was examined, and several HCI metrics were used to measure the usability of the design. Users were included in both the design and the usability testing.

V. USABILITY ENGINEERING FOR MASS CASUALTY INCIDENTS

To design an application system that is usable in case of an MCI, we propose an entangled approach of User-Centered System Design (UCSD) and Feature Driven Development (FDD) [14,15,16,17]. To prepare users to handle a system in case of an MCI, we propose that the MDGS provide training for MCIs within the regular day-to-day business. This should be accomplished by using similar support systems for handling MCIs and regular rescue and transport missions. Speaking in terms of software engineering, support for an MCI should be provided by an additional module of the same MDGS framework that is used for handling the regular day-to-day business.

Development Process

As already mentioned, we follow an entangled approach that combines UCSD and FDD to keep the project and development process focused (Figure 6). Classic elements of UCSD (e.g. user studies, interviews) will be complemented by:
• observing MCI exercises;
• accompanying paramedics and emergency physicians while they are using the MDGS in regular missions;
• evaluating the usability of the MDGS in regular missions;
• attending emergency medical aid and MCI related workshops.

Fig. 6 Process model combining FDD and UCSD.

Combining these and scientific information, features can be derived. Natural dependencies between these "small, client-valued function[s]" and a prioritization process will lead to a sorted list of feature sets. To work through the list, we use the iterative and incremental process of FDD. By using an entangled FDD/UCSD process as our software engineering paradigm, we are able to quickly roll out feature sets, as well as to keep them close to the users' needs and expectations through repeated user feedback.

General Principles for MDGSs

Figure 7 gives an overview of our proposed MDGS. The overall design is based on the assumption that an MDGS that follows the principle has to be technically feasible and suitable for handling day-to-day rescue and transport missions as well as MCIs. The basic features, as shown in the figure, are:
• All rescue workers (paramedics, emergency physicians, team leader, and incident commander) use the same handheld device, most likely a rugged tablet PC.
• All stakeholders (public-safety answering point, crisis squad, hospital staff, and rescue teams) are kept in the loop. They are aware of all necessary information. The users are guided by dialogues that are simple enough to still be useful even in very demanding situations.
• The location of every patient and rescue worker is made available by location-based services to the stakeholders in command.
• Ambulance crews are informed on their way to the area of operation as well as while waiting at the ambulance assembly area. Supplying this information to the rescue workers can help them cope with anxieties and prepare them for the situations they will be confronted with.
• All relevant information is stored on central servers for ad-hoc as well as post-hoc analyses. This information can be very useful for implementing organizational learning and gradually improving the whole man-machine system over time.

At the moment, our project is still in a first prototypical state. The goal is to integrate the system as a module into the R2-System, an end-to-end solution of DIGITALYS GmbH for regular transport and rescue missions.

Fig. 7 Handling of MCIs as an integrated part of a general MDGS.
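The step from "natural dependencies plus a prioritization process" to "a sorted list of feature sets" lends itself to a small sketch. The feature names, priorities, and dependencies below are invented stand-ins, not the project's actual backlog; the ordering is a topological sort with an assumed client-value priority breaking ties among currently unblocked features.

```python
from graphlib import TopologicalSorter

# Hypothetical MDGS feature sets; each maps to the features it depends on.
dependencies = {
    "triage capture form": set(),
    "patient location tracking": {"triage capture form"},
    "central server sync": {"triage capture form"},
    "post-hoc analysis views": {"central server sync"},
}
priority = {  # lower number = higher assumed client value
    "triage capture form": 1,
    "central server sync": 2,
    "patient location tracking": 3,
    "post-hoc analysis views": 4,
}

# Dependencies constrain the order; priority decides among the features
# that are ready (all of their dependencies already scheduled).
ts = TopologicalSorter(dependencies)
ts.prepare()
ordered = []
while ts.is_active():
    for feature in sorted(ts.get_ready(), key=priority.get):
        ordered.append(feature)
        ts.done(feature)
print(ordered)
```

The resulting list is what the iterative, incremental FDD process would then work through, feature set by feature set.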
VI. CONCLUSION

From these three applications, we have learned many lessons on how to improve the process of usability engineering. We have presented a modified usability engineering approach to design that employs a combination of user interface design, user-based studies, and expert evaluation to iteratively design a usable user interface, as well as to refine designers' understanding of a specific design space. In this paper we also looked at the usability of maps on mobile devices. Different methods of visualization, interaction, and adaptive user support to obtain good usability were discussed in view of the technical, environmental, and social challenges. We have also presented a model for usability engineering for Mass Casualty Incidents.

REFERENCES

[1] B.W. Boehm, "A Spiral Model of Software Development and Enhancement", IEEE Computer, 21(5), 61-72, 1988.
[2] D. Hix and H.R. Hartson, "Developing User Interfaces: Ensuring Usability Through Product and Process", New York, NY, John Wiley & Sons, 1993.
[3] D.J. Mayhew, "The Usability Engineering Lifecycle: A Practitioner's Handbook for User Interface Design", San Francisco, CA, Morgan Kaufmann Publishers, 1999.
[4] J.L. Gabbard, D. Hix and J.E. Swan II, "User-Centered Design and Evaluation of Virtual Environments", IEEE Computer Graphics and Applications, 19(6), 51-59, November/December 1999.
[5] J.L. Gabbard, "Taxonomy of Usability Characteristics in Virtual Environments", Master's thesis, Department of Computer Science, Virginia Tech, 1997.
[6] J.E. Swan II and J.L. Gabbard, "Survey of User-Based Experimentation in Augmented Reality", Proceedings of the 1st International Conference on Virtual Reality, HCI International 2005, Las Vegas, Nevada, USA, July 22-27, 2005.
[7] L. Gorlenko and R. Merrick, "No wires attached: Usability challenges in the connected mobile world", IBM Systems Journal, 42, 639-651, 2003.
[8] K. Laakso, O. Gjesdal and J.R. Sulebak, "Tourist information and navigation support by using 3D maps displayed on mobile devices", Proceedings of HCI in Mobile Guides, Udine, Italy (in conjunction with Mobile HCI 2003), 2003.
[9] M.A. Neerincx and J. Lindenberg, "Situated cognitive engineering for complex task environments", Proceedings of the Seventh International Naturalistic Decision Making Conference, 2005.
[10] J. Nielsen, "Usability Engineering", Morgan Kaufmann, 1994.
[11] J.W. Streefkerk, M.P. Esch-Bussemakers and M.A. Neerincx, "Designing personal attentive user interfaces in the mobile public safety domain", Computers in Human Behavior, 22, 749-770, 2006.
[12] F. Vetere, S. Howard, S. Pedell and S. Balbo, "Walking through mobile use: novel heuristics and their application", Proceedings of OzCHI, 24-32, 2003.
[13] D. Zhang and B. Adipat, "Challenges, Methodologies, and Issues in the Usability Testing of Mobile Applications", International Journal of Human-Computer Interaction, 18, 293-308, 2005.
[14] Y. Chu and A. Ganz, "WISTA: a wireless telemedicine system for disaster patient care", Mobile Networks and Applications, 12, 201-214, 2007.
[15] B. Demchak, T.C. Chan, W.G. Griswold and L.A. Lenert, "Situational awareness during mass-casualty events: command and control", AMIA Annual Symposium Proceedings, 905, 2006.
[16] D.A. Norman and S.W. Draper, "User Centered System Design: New Perspectives on Human-Computer Interaction", Erlbaum, Hillsdale, N.J., 1986.
[17] S.R. Palmer and J.M. Felsing, "A Practical Guide to Feature-Driven Development", Prentice Hall PTR, Upper Saddle River, NJ, 2002.

Author

Sanjeev Narayan Bal, MBA, MCA, MTech (CSE), is an Assistant Professor (CSE) at Trident Academy of Creative Technology, Bhubaneswar. He is a Life Member of ISTE and has published many papers in national and international journals.

All Rights Reserved © 2012 IJARCSEE