VIVA-Tech International Journal for Research and Innovation Volume 1, Issue 4 (2021)
ISSN(Online): 2581-7280
VIVA Institute of Technology
9th National Conference on Role of Engineers in Nation Building – 2021 (NCRENB-2021)
F-142
www.viva-technology.org/New/IJRI
THE USABILITY METRICS FOR USER EXPERIENCE
Prachi Desul 1, Prof. Chandani Patel 2
1 (Department of MCA, Viva School of MCA / University of Mumbai, India)
2 (Department of MCA, Viva School of MCA / University of Mumbai, India)
Abstract: The Usability Metric for User Experience (UMUX) is a four-item Likert scale used for the subjective assessment of an application's perceived usability. It is designed to provide results similar to those obtained with the 10-item System Usability Scale, and is organized around the ISO 9241-11 definition of usability. A pilot version was assembled from candidate items and tested alongside the System Usability Scale during usability testing. The two scales were shown to correlate well, to be reliable, and to align on one underlying usability factor. In addition, the Usability Metric for User Experience is compact enough to serve as a usability module in a broader user experience metric.
Keywords - metric, system usability scale, usability, user experience.
I. INTRODUCTION
Usability can be measured, but it rarely is: metrics are expensive and can be a poor use of typically scarce usability resources. Although measuring usability can cost about four times as much as conducting qualitative studies (which often generate better insight), metrics are sometimes well worth the expense. Among other things, metrics can help managers track design progress and support decisions about when to release a product. As organizations increase their usability investments, collecting actual measurements is a natural next step and does provide benefits. In general, usability metrics let you:
Track progress between releases. You cannot fine-tune your methodology unless you know how well you are doing.
Assess your competitive position. Are you better or worse than other companies? Where are you better or worse?
Make a go/no-go decision before launch. Is the design good enough to release to an unsuspecting world?
Create bonus plans for design managers and higher-level executives. For example, you might determine bonus amounts for development project leaders based on how many customer-support calls or emails their products generated during the year.
Usability is a quality attribute that assesses how easy user interfaces are to use. The word "usability" also refers to methods for improving ease of use during the design process.
Usability is defined by five quality components:
Learnability: How easy is it for users to accomplish basic tasks the first time they encounter the design?
Efficiency: Once users have learned the design, how quickly can they perform tasks?
Memorability: When users return to the design after a period of not using it, how easily can they re-establish proficiency?
Errors: How many errors do users make, how severe are these errors, and how easily can they recover from them?
Satisfaction: How pleasant is it to use the design?
There are many other important quality attributes. A key one is utility, which refers to the design's functionality: does it do what users need? Usability and utility are equally important and together determine whether something is useful: it matters little that something is easy to use if it is not what you want.
II. RELATED WORK
Software development organizations contain marketing, design, project management, development, and quality assurance teams. It is important for the various teams within the organization to understand the benefits and limitations of incorporating various usability testing methods into the software development life cycle. Some reasons for poor usability include conflicting effort priorities among the development, project management, and design teams. The part played by the usability engineer is to get involved as the heuristic evaluator and to ensure that the development and design efforts are based on usability principles while still adhering to the project timeline. Two approaches to usability inspection are user experience testing and expert review, more commonly referred to as Heuristic Evaluation (HE). This paper focuses on understanding the strength of HE as a strategy for defect detection. The results also reinforce the need to integrate traditional heuristics with modified heuristics customized to the domain or field of the project being tested, such as e-government. [1]
Another study describes an innovative methodology developed for usability tests of the IEEE PCS website that combines heuristic evaluation and task-based testing. Tests conducted on the PCS website evaluated whether the site facilitated members' ability to find information and participate in discussions, as well as developers' ability to find, contribute to, and manage administrative information on the site. The distinctive social characteristics of Communities of Practice (CoPs) provide context for tailoring design heuristics for informational websites that serve the needs and interests of CoP members. The discussion highlights technical communication principles that apply not only to evaluating the effectiveness of the PCS website design but also to all centralised technical communication products and media that increasingly demand user participation. [2]
Another work proposes a usability testing method that alters a given usability testing method to make it less expensive and less time-consuming for the investigator. The use of user-centred methods is encouraged and a combination of two such methods is suggested. In future, this method could be combined with other techniques to additionally detect the participant's state of satisfaction. User-based features such as emotions, opinions, and cognitive and conative effects are therefore considered, and a method for the joint analysis of all gathered data is proposed. [3]
More automated system testing could be instrumental in achieving these goals, and in recent years testing tools have been developed to automate interaction with software systems at the GUI level. However, there is little knowledge about the usability and applicability of these tools in an industrial setting. One study analyses two tools for automated visual GUI testing on real-world, safety-critical software developed by the company Saab AB. The tools are compared based on their characteristics as well as how they support automation of system test cases that were previously performed manually. The time to develop and the size of the automated test cases, as well as their execution times, are evaluated. [4]
Usability testing is important for software development companies to determine whether their products are usable or unusable. It is equally important for end-user companies to run usability studies as well. One paper presents the development of the Usability Management System (USEMATE), an automated system offered as an alternative solution to help usability testers and practitioners run usability testing more efficiently and effectively. The main objective of USEMATE is to replace the existing systems, which are paper-based and require manual score calculation in Excel and manual reaction-time recording, with a web-based management system. The tools used for the development comprise Adobe Photoshop CS2, Adobe Dreamweaver CS3, Apache Web Server, and a personal computer (PC). The modules and usability criteria included, and the approach used in the development of this automated system, were replicated from a case study on usability testing of a webpage conducted earlier. USEMATE is envisaged to minimize the lengthy working hours and the effort needed to manage the usability testing process from phase to phase. [5]
Traditional usability testing (UT) techniques are not sufficient or suitable given the growing complexity of websites and the constraints faced by usability practitioners. For example, Lab-Based Usability Testing (LBUT) is expensive and has less coverage than Exploratory Heuristics Evaluation (EHE), while EHE is subject to false alarms. A hybrid usability methodology (HUM) comprising LBUT and EHE is offered. Six experiments involving EHE and LBUT were performed at the early, in-between, and late stages of the SDLC of websites, in which the relative performance of each method was measured using the dependent variables, followed by the design of a HUM. To validate the HUM, four case
studies were conducted, in which remarkable improvements were observed in website effectiveness and efficiency. Based on the findings, HUM is a realistic approach for usability practitioners and also provides stakeholders with a validated situational decision-making framework for usability testing strategies that takes real-world constraints into account. [6]
III. METHODOLOGY
Usability is a multidimensional concept that aims at the fulfilment of a certain set of goals, mainly effectiveness, efficiency, and satisfaction; without these goals, usability cannot be achieved.
Effectiveness: refers to the accuracy and completeness of the user's goal achievement.
Efficiency: refers to the resources expended by users in order to ensure accurate and complete achievement of the goals.
Satisfaction: refers to the subjective thoughts of the user regarding their attitude, level of comfort, the relevance of the application, and the acceptability of use.
A system or a product is completely dependent on its specific and distinct context of use, the nature of the task, the users appointed to carry out the task, and finally the equipment used to perform it.
Measuring the usability of a system can be done by measuring the three goals using a number of observable and quantifiable usability metrics.
In light of the three goals mentioned earlier, we will go through the different metrics used to measure each goal; however, our main focus will be on effectiveness.
1. Success Rate
Effectiveness can be measured using two usability metrics: the success rate, also called the completion rate, and the number of errors. The success rate (or completion rate) is the percentage of users who were able to successfully complete the tasks. Despite the fact that this metric cannot provide insight into how the tasks were performed or why users fail, it is still critical and at the core of usability. The success rate is one of the most commonly used metrics among practitioners: 79% of them reported using it as the first metric to consider, both for its ease of use and during data collection and interpretation.
The success rate can be measured by assigning a binary value of 0 or 1 to each user, where 1 is assigned to those who successfully complete the task and 0 to those who fail to do so. Once the test is over and you have all the data you need, the next step is to divide the number of correctly completed attempts by the total number of attempts and multiply by 100. The completion rate is easy to measure and collect, but with one major pitfall to consider: it happens frequently that a user stops at some point during the task and fails to finish it, or finishes it but not in the expected way. Taking into account that they have completed some steps of the task successfully, how would you, as an evaluator, score what they have accomplished? I am going to dive a little into the details of how to score your users, taking into account the various stages of their success or failure, using an example to illustrate.
Let's consider, for instance, that your user's task is to order a box of dark chocolates with a card for their mother for Mother's Day. The scoring might seem simple at first glance: you could easily say that if the mother receives the box of dark chocolate with the card, then it is a case of success; on the other hand, if the mother does not receive anything, then it is simply a case of failure.
However, it is not that straightforward; there are other considerations:
Ordered a box of chocolate, but not the dark one (white, milk, or a mixture), along with the card.
Ordered the right chocolate box, but without a gift card.
Ordered more than one box of chocolate by mistake, along with a gift card.
Ordered a box of chocolate but did not add delivery information or an address.
Ordered a box of chocolates and a gift card successfully, but to the wrong address.
All these cases entail a percentage of success and failure in the process of fulfilling the task; their failure is partial, as is their success, which simply means that as an evaluator you need to bring your own judgement into the scoring.
If you decide that there are no middle grounds in the scoring, your success rate will differ from the one obtained when you give credit for the effort users made on the task you planned for them. There is no fixed rule when it comes to scoring your users, and success rates often become subjective, because different evaluators will not produce the same scores or estimate the same percentage of failure or success for the above cases in the same way. However, in order to standardise the method, you need to work out the important aspects of the task and what score you would allot to each part of it.
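One way to make such partial-credit scoring reproducible is to fix a rubric in advance. The sketch below is purely illustrative: the step names and weights are invented for the chocolate-ordering example, not taken from the paper.

```python
# Hypothetical rubric for the chocolate-ordering task: each key step
# gets a weight, and a user's score is the weighted share completed.
RUBRIC = {
    "correct_product": 0.4,   # dark chocolate, right quantity
    "gift_card_added": 0.2,
    "correct_address": 0.4,   # delivery details complete and right
}

def partial_score(steps_done):
    """steps_done: set of rubric step names the user completed."""
    return sum(w for step, w in RUBRIC.items() if step in steps_done)

# Right box and card, but sent to the wrong address:
print(round(partial_score({"correct_product", "gift_card_added"}), 2))  # 0.6
```

Averaging these fractional scores across users then gives a partial-credit success rate that different evaluators can reproduce.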
The success rate remains the simplest usability metric and the easiest to collect of the whole range of usability signals, mainly because it is quick and straightforward, does not require much preparation or time to gather, and, most importantly, lets you track progress within your system. It is one of the measures commonly employed by marketers and designers to see the big picture of how well their system is doing at the level of user experience; this does not change the fact that it remains subjective.
Help designers and developers to see that uncovering problems is not a sign of failure. Nobody does a perfect job the first time; users always surprise us. It is much better to find out about problems with a few users during a usability test than later, when the design is being reviewed or is already out in the marketplace. [7]
2. The Number of Errors
This metric gives the average number of times an error occurred per user when performing a given task. These errors can be either slips, where the user accidentally types the wrong email address or picks the wrong dates when making a reservation or booking a flight, or mistakes, where the user clicks on an image that is not clickable or intentionally double-clicks a button or a link. Normally, users of any interactive system make errors (roughly 2 out of every 3 users err), and there is no such thing as a "perfect" system anyway. To help you measure and obtain good diagnostic results, it is highly recommended to prepare a short description giving details of how to score those errors and the severity of each kind of error, to show how simple and intuitive your system is.
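The slip/mistake distinction and the severity scoring described above can be encoded in a small helper. The severity weights here are hypothetical, chosen only to show how such a scoring description might look; a real study would define its own.

```python
# Hypothetical severity weights; a real study would define its own.
SEVERITY = {"slip": 1, "mistake": 2, "critical": 4}

def error_metrics(errors_per_user):
    """errors_per_user: one list of error labels per participant.
    Returns (mean error count, mean severity-weighted score)."""
    n = len(errors_per_user)
    raw = sum(len(errs) for errs in errors_per_user) / n
    weighted = sum(sum(SEVERITY[e] for e in errs)
                   for errs in errors_per_user) / n
    return raw, weighted

raw, weighted = error_metrics([["slip"], [], ["slip", "mistake"]])
print(raw, weighted)  # mean count 1.0; weighted mean roughly 1.33
```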
3. Time-Based Efficiency
Also referred to as time on task, this metric measures the time spent by the user to complete the task, or the speed of work. This means there is a direct relationship between efficiency and effectiveness: we can say that efficiency is the user's effectiveness divided by the time the user spent.
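The "effectiveness divided by time" relationship can be sketched as follows. This is the common goals-per-second formulation, assuming each attempt is recorded as a 0/1 completion flag and a duration; the data is invented for illustration.

```python
def time_based_efficiency(attempts):
    """attempts: (completed, seconds) per user-task attempt.
    Mean of completion/time over attempts: goals per second."""
    return sum(int(done) / t for done, t in attempts) / len(attempts)

# One user finished in 30 s, another failed after 60 s:
print(time_based_efficiency([(True, 30), (False, 60)]))
```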
4. The Overall Relative Efficiency
This is measured as the time taken by users who successfully completed the task in relation to the total time taken by all users. Let's consider that we have two users, each of whom is supposed to complete a different task. The first user successfully completed task 1 yet failed to complete task 2, while the second user failed to complete task 1 but completed task 2 successfully.
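A minimal sketch of this metric, assuming each attempt is recorded as (completed, seconds); the result is the share of total testing time spent on successful attempts, and the example numbers are invented.

```python
def overall_relative_efficiency(attempts):
    """attempts: (completed, seconds) per user-task attempt.
    Percentage of total time spent on successful attempts."""
    successful = sum(t for done, t in attempts if done)
    total = sum(t for _, t in attempts)
    return 100.0 * successful / total

# User A: task 1 ok (40 s), task 2 failed (80 s).
# User B: task 1 failed (50 s), task 2 ok (30 s).
print(overall_relative_efficiency(
    [(True, 40), (False, 80), (False, 50), (True, 30)]))  # 35.0
```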
5. Post-Task Satisfaction
Once your users have finished the task, whether or not they completed it successfully, it is time to hand them a questionnaire to get an idea of the difficulty of the task from the users' point of view. Generally, these questionnaires consist of five questions, and the idea behind them is to give your users a space to judge the usability of your system.
6. Task-Level Satisfaction
This metric helps investigate the overall impression of users confronted with the system. To measure the level of satisfaction, you can use the smiley scale method, where the user is expected to choose one of five smileys as a reflection of their satisfaction or lack of it. The word method can also be used to measure the user's level of satisfaction, by listing a series of positive and negative connotations highlighted in green and red respectively.
In light of the conceptual framework discussed earlier, the user experience is highly influenced by everything that surrounds it. However, the tide might be turning on usability funding: I have recently worked on several projects to establish formal usability metrics in several companies. As noted in the introduction, collecting actual measurements is a natural next step as organizations increase their usability investments, letting you track progress between releases, assess your competitive position, make go/no-go decisions before launch, and create bonus plans for design managers and higher-level executives.
How to Measure
It is easy to specify usability metrics, but hard to collect them. Typically, usability is measured relative to users' performance on a given set of test tasks. The most basic measures are based on the definition of usability as a quality metric:
success rate (whether users can perform the task at all),
the time a task requires,
the error rate, and
users' subjective satisfaction. [8]
It is also possible to collect more specific metrics, such as the percentage of time that users follow an optimal navigation path or the number of times they need to backtrack. You can collect usability metrics for both novice users and experienced users. Few websites have truly expert users, since people rarely spend enough time on any given site to learn it in great detail. Given this, most websites benefit most from studying novice users. Exceptions are sites like Yahoo and Amazon, which have highly committed and frequent users and may benefit from studying expert users. Intranets, extranets, and web applications are similar to traditional software design and can hopefully count on skilled users; studying experienced users is thus more important there than working with the novice users who typically dominate public websites. With qualitative user testing, it is enough to test 3-5 users. After the fifth user tests, you have all the insight you are likely to get, and your best bet is to go back to the drawing board and improve the design so that you can test it again. Testing more than five users wastes resources, reducing the number of design iterations and compromising the final design quality. Unfortunately, when you are collecting usability metrics, you need to test with more than five users. In order to get a reasonably tight confidence interval on the results, I usually recommend testing 20 users for each design. Thus, conducting quantitative usability studies is approximately four times as expensive as conducting qualitative ones. Considering that you can learn more from the simpler studies, I usually recommend against metrics unless the project is very well funded. If metrics are collected, the success rate or completion rate is a good starting point, because it gives a general idea of the performance of the system.
IV. FIGURES AND TABLES
Comparing Two Designs
To illustrate quantitative results, we can look at those recently posted by Macromedia from its usability study of
a Flash site, aimed at showing that Flash is not necessarily bad. Basically, Macromedia took a design,
redesigned it according to a set of usability guidelines, and tested both versions with a group of users. Here are
the results:
Table no. 1: Task times and satisfaction scores for the original design vs. the redesign

                     Original Design   Redesign
Task 1               12 sec.           6 sec.
Task 2               75 sec.           15 sec.
Task 3               9 sec.            8 sec.
Task 4               140 sec.          40 sec.
Satisfaction score*  44.75             74.50

*Measured on a scale ranging from 12 (unsatisfactory on all counts) to 84 (excellent on all counts).
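The satisfaction scores in the table above are on the questionnaire's native 12-84 scale; rescaling them to 0-100 makes the improvement easier to read. A minimal sketch of that linear rescaling:

```python
def normalize_score(raw, lo=12, hi=84):
    """Rescale a satisfaction score from its native lo-hi range
    (12-84 for this questionnaire) onto 0-100."""
    return 100.0 * (raw - lo) / (hi - lo)

print(round(normalize_score(44.75), 1))  # original design: 45.5
print(round(normalize_score(74.50), 1))  # redesign: 86.8
```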
Table no. 2
Fig. 1, Fig. 2, Fig. 3
V. CONCLUSION
With usability metrics, it is possible to monitor and quantify the usability of any system, irrespective of whether it is software, hardware, web-based, or a mobile application. This is because the metrics presented here are based on extensive research and testing by various academics and experts and have withstood the test of time.
Moreover, they cover all three core elements that constitute the definition of usability: effectiveness, efficiency, and satisfaction, thus ensuring an all-round quantification of the usability of the system being tested. Too often, usability gets side-tracked and becomes something to be addressed later on. Tracking the usability of your product with metrics allows you to have a clear understanding of the experience you are providing to your users, and to improve it over time. Usability metrics are measured and aggregated into actionable results, which allows you to act immediately on the data you record. That makes it painless to keep track of how your design's usability progresses, detect issues, and improve your users' experience.
REFERENCES
Bevan, N. and Macleod, M., 1994. Usability measurement in context. Behaviour and Information Technology, 13: 132-145.
Ivory, M.Y. and Hearst, M.A., 2001. The state of the art in automating usability evaluation of user interfaces. ACM Computing Surveys, 33: 470-516.
Kirakowski, J. and Corbett, M., 1993. SUMI: The Software Usability Measurement Inventory. British Journal of Educational Technology, 24: 210-212.
Lin, H.X., Choong, Y.-Y., and Salvendy, G., 1997. A proposed index of usability: A method for comparing the relative usability of different software systems. Behaviour and Information Technology, 16: 267-277.
Macleod, M., 1994. Usability: Practical methods for testing and improvement. Proceedings of the Norwegian Computer Society Software Conference, Sandvika, Norway. Retrieved July 3, 2005 from http://www.usability.serco.com/papers/mm-us94.pdf.
Macleod, M. and Rengger, R., 1993. The development of DRUM: A software tool for video-assisted usability evaluation. Retrieved July 3, 2005 from http://www.usability.serco.com/papers/drum93.pdf.
Nielsen, J., 1993. Usability Engineering. London, UK: Academic Press.
Symposium on User Interface Software and Technology, New York: ACM Press, pp. 101-110.
Shackel, B., 1991. Usability—Context, framework, definition, design and evaluation. In B. Shackel and S. Richardson (Eds.), Human Factors for Informatics Usability. Cambridge, MA: University Press, pp. 21-38.
Landauer, T.K., 1995. The Trouble with Computers: Usefulness, Usability and Productivity. MIT Press.
Mayhew, D.J., 1999. The Usability Engineering Lifecycle: A Practitioner's Handbook for User Interface Design. Morgan Kaufmann, San Francisco.
Holzinger, A., 2005. Usability engineering for software developers. Communications of the ACM, 48(1): 71-74.
Seffah, A. and Metzker, E., 2004. The obstacles and myths of usability and software engineering. Communications of the ACM, 47(12): 71-76.
Nielsen, J., 2012. "Usability 101: Introduction to Usability". Nielsen Norman Group, 4 January 2012. Archived from the original on 1 September 2016. Retrieved 7 August 2016.
Approaches and Challenges of Software Reusability: A Review of Research Liter...Approaches and Challenges of Software Reusability: A Review of Research Liter...
Approaches and Challenges of Software Reusability: A Review of Research Liter...
 
User Experience Evaluation for Automation Tools: An Industrial Experience
User Experience Evaluation for Automation Tools: An Industrial ExperienceUser Experience Evaluation for Automation Tools: An Industrial Experience
User Experience Evaluation for Automation Tools: An Industrial Experience
 
2012 in tech-usability_of_interfaces (1)
2012 in tech-usability_of_interfaces (1)2012 in tech-usability_of_interfaces (1)
2012 in tech-usability_of_interfaces (1)
 
DESQA a Software Quality Assurance Framework
DESQA a Software Quality Assurance FrameworkDESQA a Software Quality Assurance Framework
DESQA a Software Quality Assurance Framework
 
USEFul: A Framework to Mainstream Web Site Usability through Automated Evalua...
USEFul: A Framework to Mainstream Web Site Usability through Automated Evalua...USEFul: A Framework to Mainstream Web Site Usability through Automated Evalua...
USEFul: A Framework to Mainstream Web Site Usability through Automated Evalua...
 
Factors Influencing the Efficacy of Agile Usage
Factors Influencing the Efficacy of Agile UsageFactors Influencing the Efficacy of Agile Usage
Factors Influencing the Efficacy of Agile Usage
 
Hci in-the-software-process-1
Hci in-the-software-process-1Hci in-the-software-process-1
Hci in-the-software-process-1
 
AN IMPROVED REPOSITORY STRUCTURE TO IDENTIFY, SELECT AND INTEGRATE COMPONENTS...
AN IMPROVED REPOSITORY STRUCTURE TO IDENTIFY, SELECT AND INTEGRATE COMPONENTS...AN IMPROVED REPOSITORY STRUCTURE TO IDENTIFY, SELECT AND INTEGRATE COMPONENTS...
AN IMPROVED REPOSITORY STRUCTURE TO IDENTIFY, SELECT AND INTEGRATE COMPONENTS...
 
Evaluation of Web Applications based on UX Parameters
Evaluation of Web Applications based on UX ParametersEvaluation of Web Applications based on UX Parameters
Evaluation of Web Applications based on UX Parameters
 
Performance assessment and analysis of development and operations based autom...
Performance assessment and analysis of development and operations based autom...Performance assessment and analysis of development and operations based autom...
Performance assessment and analysis of development and operations based autom...
 
A Systematic Review On Software Cost Estimation In Agile Software Development
A Systematic Review On Software Cost Estimation In Agile Software DevelopmentA Systematic Review On Software Cost Estimation In Agile Software Development
A Systematic Review On Software Cost Estimation In Agile Software Development
 

More from vivatechijri

Structural and Morphological Studies of Nano Composite Polymer Gel Electroly...
Structural and Morphological Studies of Nano Composite  Polymer Gel Electroly...Structural and Morphological Studies of Nano Composite  Polymer Gel Electroly...
Structural and Morphological Studies of Nano Composite Polymer Gel Electroly...
vivatechijri
 
An Alternative to Hard Drives in the Coming Future:DNA-BASED DATA STORAGE
An Alternative to Hard Drives in the Coming Future:DNA-BASED DATA STORAGEAn Alternative to Hard Drives in the Coming Future:DNA-BASED DATA STORAGE
An Alternative to Hard Drives in the Coming Future:DNA-BASED DATA STORAGE
vivatechijri
 
Recommender Systems
Recommender SystemsRecommender Systems
Recommender Systems
vivatechijri
 

More from vivatechijri (20)

Understanding the Impact and Challenges of Corona Crisis on Education Sector...
Understanding the Impact and Challenges of Corona Crisis on  Education Sector...Understanding the Impact and Challenges of Corona Crisis on  Education Sector...
Understanding the Impact and Challenges of Corona Crisis on Education Sector...
 
LEADERSHIP ONLY CAN LEAD THE ORGANIZATION TOWARDS IMPROVEMENT AND DEVELOPMENT
LEADERSHIP ONLY CAN LEAD THE ORGANIZATION  TOWARDS IMPROVEMENT AND DEVELOPMENT  LEADERSHIP ONLY CAN LEAD THE ORGANIZATION  TOWARDS IMPROVEMENT AND DEVELOPMENT
LEADERSHIP ONLY CAN LEAD THE ORGANIZATION TOWARDS IMPROVEMENT AND DEVELOPMENT
 
A study on solving Assignment Problem
A study on solving Assignment ProblemA study on solving Assignment Problem
A study on solving Assignment Problem
 
Structural and Morphological Studies of Nano Composite Polymer Gel Electroly...
Structural and Morphological Studies of Nano Composite  Polymer Gel Electroly...Structural and Morphological Studies of Nano Composite  Polymer Gel Electroly...
Structural and Morphological Studies of Nano Composite Polymer Gel Electroly...
 
Theoretical study of two dimensional Nano sheet for gas sensing application
Theoretical study of two dimensional Nano sheet for gas sensing  applicationTheoretical study of two dimensional Nano sheet for gas sensing  application
Theoretical study of two dimensional Nano sheet for gas sensing application
 
METHODS FOR DETECTION OF COMMON ADULTERANTS IN FOOD
METHODS FOR DETECTION OF COMMON  ADULTERANTS IN FOODMETHODS FOR DETECTION OF COMMON  ADULTERANTS IN FOOD
METHODS FOR DETECTION OF COMMON ADULTERANTS IN FOOD
 
The Business Development Ethics
The Business Development EthicsThe Business Development Ethics
The Business Development Ethics
 
Digital Wellbeing
Digital WellbeingDigital Wellbeing
Digital Wellbeing
 
An Alternative to Hard Drives in the Coming Future:DNA-BASED DATA STORAGE
An Alternative to Hard Drives in the Coming Future:DNA-BASED DATA STORAGEAn Alternative to Hard Drives in the Coming Future:DNA-BASED DATA STORAGE
An Alternative to Hard Drives in the Coming Future:DNA-BASED DATA STORAGE
 
Enhancing The Capability of Chatbots
Enhancing The Capability of ChatbotsEnhancing The Capability of Chatbots
Enhancing The Capability of Chatbots
 
Smart Glasses Technology
Smart Glasses TechnologySmart Glasses Technology
Smart Glasses Technology
 
Future Applications of Smart Iot Devices
Future Applications of Smart Iot DevicesFuture Applications of Smart Iot Devices
Future Applications of Smart Iot Devices
 
Cross Platform Development Using Flutter
Cross Platform Development Using FlutterCross Platform Development Using Flutter
Cross Platform Development Using Flutter
 
3D INTERNET
3D INTERNET3D INTERNET
3D INTERNET
 
Recommender Systems
Recommender SystemsRecommender Systems
Recommender Systems
 
Light Fidelity(LiFi)- Wireless Optical Networking Technology
Light Fidelity(LiFi)- Wireless Optical Networking TechnologyLight Fidelity(LiFi)- Wireless Optical Networking Technology
Light Fidelity(LiFi)- Wireless Optical Networking Technology
 
Social media platform and Our right to privacy
Social media platform and Our right to privacySocial media platform and Our right to privacy
Social media platform and Our right to privacy
 
Google File System
Google File SystemGoogle File System
Google File System
 
A Study of Tokenization of Real Estate Using Blockchain Technology
A Study of Tokenization of Real Estate Using Blockchain TechnologyA Study of Tokenization of Real Estate Using Blockchain Technology
A Study of Tokenization of Real Estate Using Blockchain Technology
 
A Study of Data Storage Security Issues in Cloud Computing
A Study of Data Storage Security Issues in Cloud ComputingA Study of Data Storage Security Issues in Cloud Computing
A Study of Data Storage Security Issues in Cloud Computing
 

Recently uploaded

Standard vs Custom Battery Packs - Decoding the Power Play
Standard vs Custom Battery Packs - Decoding the Power PlayStandard vs Custom Battery Packs - Decoding the Power Play
Standard vs Custom Battery Packs - Decoding the Power Play
Epec Engineered Technologies
 
Kuwait City MTP kit ((+919101817206)) Buy Abortion Pills Kuwait
Kuwait City MTP kit ((+919101817206)) Buy Abortion Pills KuwaitKuwait City MTP kit ((+919101817206)) Buy Abortion Pills Kuwait
Kuwait City MTP kit ((+919101817206)) Buy Abortion Pills Kuwait
jaanualu31
 
1_Introduction + EAM Vocabulary + how to navigate in EAM.pdf
1_Introduction + EAM Vocabulary + how to navigate in EAM.pdf1_Introduction + EAM Vocabulary + how to navigate in EAM.pdf
1_Introduction + EAM Vocabulary + how to navigate in EAM.pdf
AldoGarca30
 
scipt v1.pptxcxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx...
scipt v1.pptxcxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx...scipt v1.pptxcxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx...
scipt v1.pptxcxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx...
HenryBriggs2
 
Digital Communication Essentials: DPCM, DM, and ADM .pptx
Digital Communication Essentials: DPCM, DM, and ADM .pptxDigital Communication Essentials: DPCM, DM, and ADM .pptx
Digital Communication Essentials: DPCM, DM, and ADM .pptx
pritamlangde
 
Integrated Test Rig For HTFE-25 - Neometrix
Integrated Test Rig For HTFE-25 - NeometrixIntegrated Test Rig For HTFE-25 - Neometrix
Integrated Test Rig For HTFE-25 - Neometrix
Neometrix_Engineering_Pvt_Ltd
 

Recently uploaded (20)

School management system project Report.pdf
School management system project Report.pdfSchool management system project Report.pdf
School management system project Report.pdf
 
Standard vs Custom Battery Packs - Decoding the Power Play
Standard vs Custom Battery Packs - Decoding the Power PlayStandard vs Custom Battery Packs - Decoding the Power Play
Standard vs Custom Battery Packs - Decoding the Power Play
 
Kuwait City MTP kit ((+919101817206)) Buy Abortion Pills Kuwait
Kuwait City MTP kit ((+919101817206)) Buy Abortion Pills KuwaitKuwait City MTP kit ((+919101817206)) Buy Abortion Pills Kuwait
Kuwait City MTP kit ((+919101817206)) Buy Abortion Pills Kuwait
 
Linux Systems Programming: Inter Process Communication (IPC) using Pipes
Linux Systems Programming: Inter Process Communication (IPC) using PipesLinux Systems Programming: Inter Process Communication (IPC) using Pipes
Linux Systems Programming: Inter Process Communication (IPC) using Pipes
 
COST-EFFETIVE and Energy Efficient BUILDINGS ptx
COST-EFFETIVE  and Energy Efficient BUILDINGS ptxCOST-EFFETIVE  and Energy Efficient BUILDINGS ptx
COST-EFFETIVE and Energy Efficient BUILDINGS ptx
 
Online electricity billing project report..pdf
Online electricity billing project report..pdfOnline electricity billing project report..pdf
Online electricity billing project report..pdf
 
1_Introduction + EAM Vocabulary + how to navigate in EAM.pdf
1_Introduction + EAM Vocabulary + how to navigate in EAM.pdf1_Introduction + EAM Vocabulary + how to navigate in EAM.pdf
1_Introduction + EAM Vocabulary + how to navigate in EAM.pdf
 
scipt v1.pptxcxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx...
scipt v1.pptxcxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx...scipt v1.pptxcxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx...
scipt v1.pptxcxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx...
 
Digital Communication Essentials: DPCM, DM, and ADM .pptx
Digital Communication Essentials: DPCM, DM, and ADM .pptxDigital Communication Essentials: DPCM, DM, and ADM .pptx
Digital Communication Essentials: DPCM, DM, and ADM .pptx
 
Memory Interfacing of 8086 with DMA 8257
Memory Interfacing of 8086 with DMA 8257Memory Interfacing of 8086 with DMA 8257
Memory Interfacing of 8086 with DMA 8257
 
Ground Improvement Technique: Earth Reinforcement
Ground Improvement Technique: Earth ReinforcementGround Improvement Technique: Earth Reinforcement
Ground Improvement Technique: Earth Reinforcement
 
Employee leave management system project.
Employee leave management system project.Employee leave management system project.
Employee leave management system project.
 
NO1 Top No1 Amil Baba In Azad Kashmir, Kashmir Black Magic Specialist Expert ...
NO1 Top No1 Amil Baba In Azad Kashmir, Kashmir Black Magic Specialist Expert ...NO1 Top No1 Amil Baba In Azad Kashmir, Kashmir Black Magic Specialist Expert ...
NO1 Top No1 Amil Baba In Azad Kashmir, Kashmir Black Magic Specialist Expert ...
 
Augmented Reality (AR) with Augin Software.pptx
Augmented Reality (AR) with Augin Software.pptxAugmented Reality (AR) with Augin Software.pptx
Augmented Reality (AR) with Augin Software.pptx
 
Integrated Test Rig For HTFE-25 - Neometrix
Integrated Test Rig For HTFE-25 - NeometrixIntegrated Test Rig For HTFE-25 - Neometrix
Integrated Test Rig For HTFE-25 - Neometrix
 
S1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptx
S1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptxS1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptx
S1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptx
 
Introduction to Artificial Intelligence ( AI)
Introduction to Artificial Intelligence ( AI)Introduction to Artificial Intelligence ( AI)
Introduction to Artificial Intelligence ( AI)
 
Convergence of Robotics and Gen AI offers excellent opportunities for Entrepr...
Convergence of Robotics and Gen AI offers excellent opportunities for Entrepr...Convergence of Robotics and Gen AI offers excellent opportunities for Entrepr...
Convergence of Robotics and Gen AI offers excellent opportunities for Entrepr...
 
Basic Electronics for diploma students as per technical education Kerala Syll...
Basic Electronics for diploma students as per technical education Kerala Syll...Basic Electronics for diploma students as per technical education Kerala Syll...
Basic Electronics for diploma students as per technical education Kerala Syll...
 
PE 459 LECTURE 2- natural gas basic concepts and properties
PE 459 LECTURE 2- natural gas basic concepts and propertiesPE 459 LECTURE 2- natural gas basic concepts and properties
PE 459 LECTURE 2- natural gas basic concepts and properties
 

195

VIVA-Tech International Journal for Research and Innovation, Volume 1, Issue 4 (2021)
ISSN (Online): 2581-7280
VIVA Institute of Technology
9th National Conference on Role of Engineers in Nation Building – 2021 (NCRENB-2021)
F-142 www.viva-technology.org/New/IJRI

THE USABILITY METRICS FOR USER EXPERIENCE

Prachi Desul1, Prof. Chandani Patel2
1(Department of MCA, VIVA School of MCA / University of Mumbai, India)
2(Department of MCA, VIVA School of MCA / University of Mumbai, India)

Abstract: The Usability Metric for User Experience (UMUX) is a four-item Likert scale used for the subjective assessment of an application's perceived usability. It is designed to provide results similar to those obtained with the 10-item System Usability Scale, and is organized around the ISO 9241-11 definition of usability. A pilot version was assembled from candidate items and then tested alongside the System Usability Scale during usability testing. The two scales were shown to correlate well, to be reliable, and to load on a single underlying usability factor. In addition, the Usability Metric for User Experience is compact enough to serve as a usability module in a broader user experience metric.

Keywords: metric, system usability scale, usability, user experience.

I. INTRODUCTION
Usability can be measured, but it rarely is. Metrics are expensive and are often a poor use of typically scarce usability resources. Although measuring usability can cost four times as much as conducting qualitative studies (which often generate better insight), metrics are sometimes well worth the expense. Among other things, metrics can help managers track design progress and support decisions about when to release a product. As organizations increase their usability investments, collecting actual measurements is a natural next step and does provide benefits. In general, usability metrics let you:
Track progress between releases.
You cannot fine-tune your methodology unless you know how well you are doing.
Assess your competitive position. Are you better or worse than other companies? Where are you better or worse?
Make a go/no-go decision before launch. Is the design good enough to release to an unsuspecting world?
Create bonus plans for design managers and higher-level executives. For example, you might determine bonus amounts for development project leaders based on how many customer-support calls or emails their products generated during the year.
Usability is a quality attribute that assesses how easy user interfaces are to use. The word "usability" also refers to methods for improving ease of use during the design process. Usability is defined by five quality components:
Learnability: How easy is it for users to accomplish basic tasks the first time they encounter the design?
Efficiency: Once users have learned the design, how quickly can they perform tasks?
Memorability: When users return to the design after a period of not using it, how easily can they re-establish proficiency?
Errors: How many errors do users make, how severe are these errors, and how easily can they recover from them?
Satisfaction: How pleasant is it to use the design?
There are many other important quality attributes. A key one is utility, which refers to the design's functionality: does it do what users need? Usability and utility are equally important, and together they determine whether something is useful: it matters little that something is easy to use if it is not what you need.
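The abstract describes the UMUX as a four-item Likert questionnaire whose 0-100 score is comparable to the SUS, but the paper does not reproduce the scoring rules. The sketch below follows the commonly published scheme (positively worded items 1 and 3 contribute response − 1, negatively worded items 2 and 4 contribute 7 − response, normalized by the 24-point maximum); treat those details as an assumption rather than something this paper specifies.

```python
def umux_score(responses):
    """Convert four 1-7 Likert responses into a 0-100 UMUX score.

    Assumes the commonly published scheme: items 1 and 3 are positively
    worded, items 2 and 4 negatively worded, each yielding 0-6 points.
    """
    if len(responses) != 4 or not all(1 <= r <= 7 for r in responses):
        raise ValueError("UMUX needs four responses in the range 1-7")
    points = [(r - 1) if i % 2 == 0 else (7 - r)
              for i, r in enumerate(responses)]
    return 100.0 * sum(points) / 24.0  # 24 = maximum attainable points

# A respondent who strongly agrees with the positive items and strongly
# disagrees with the negative ones reaches the ceiling score:
print(umux_score([7, 1, 7, 1]))  # 100.0
```

Because half of the items are reverse-scored, a respondent who answers every item identically lands near the middle of the scale rather than at an extreme, which guards against straight-lining.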
RELATED WORK
Software development organizations contain marketing, design, project management, development, and quality assurance teams. It is important for the various teams within an organization to understand the benefits and limitations of incorporating the various usability testing methods into the software development life cycle. Some reasons for poor usability include effort-prioritization conflicts between the development, project management, and design teams. The part played by the usability engineer is to get involved as the heuristic judge and to ensure that development and design efforts are grounded in usability principles while adhering to the project timeline. Two approaches to usability inspection are user experience testing and expert review, more commonly referred to as Heuristic Evaluation (HE). This paper focuses on understanding the strength of HE as a strategy for defect detection. The results also point to the need for integrating traditional heuristics with modified heuristics customized to the domain or field of the project being tested, such as e-government. [1]
Another work describes an innovative methodology developed for usability tests of the IEEE PCS website that combines heuristic evaluation and task-based testing. Tests conducted on the PCS website evaluated whether the site facilitated members' ability to find information and participate in discussions, as well as developers' ability to find, contribute, and manage administrative information on the site.
The distinctive social characteristics of Communities of Practice (CoPs) provide context for tailoring design heuristics for informational websites that serve the needs and interests of CoP members. The discussion emphasizes technical communication principles that apply not only to evaluating the effectiveness of the PCS website design but also to all centralized technical communication products and media that increasingly demand user participation. [2]
Another study proposes a usability testing method that alters a given usability testing method to make it less expensive and less time-consuming for the investigator. The use of user-centred methods is encouraged and a combination of two such methods is suggested. In future work this method can be combined with other techniques to additionally detect the participant's state of satisfaction; user-based features such as emotions, opinions, and cognitive and conative effects are therefore considered, and a method for the joint analysis of all gathered data is proposed. [3]
More automated system testing could be instrumental in achieving these goals, and in recent years testing tools have been developed to automate interaction with software systems at the GUI level. However, there is little knowledge about the usability and applicability of these tools in an industrial setting. One study analyses two tools for automated visual GUI testing on real-world, safety-critical software developed by the company Saab AB. The tools are compared based on their characteristics as well as on how they support automation of system test cases that were previously performed manually. The time to develop the automated test cases, their size, and their execution times are evaluated. [4]
It is important for software development companies to perform usability testing to determine whether their products are usable or unusable.
It is equally important for end-user companies to run usability studies as well. This paper presents the development of the Usability Management System (USEMATE), an automated system offered as an alternative solution to help a usability tester or practitioner run usability testing more efficiently and effectively. The main objective of USEMATE is to improve on the current systems, which are paper-based and require manual score calculation in Excel and manual reaction-time recording, by moving to a web-based management system. The tools used for development comprise Adobe Photoshop CS2, Adobe Dreamweaver CS3, the Apache Web Server, and a personal computer (PC). The modules and usability criteria included, as well as the approach used in the development of this automated system, were replicated from an earlier case study on usability testing of a webpage. USEMATE is envisaged to minimize the lengthy working hours and effort needed to manage the usability testing process from phase to phase. [5]
Traditional usability testing (UT) techniques are not sufficient for, or suited to, the growing complexity of websites and the constraints faced by usability practitioners. For example, Lab-Based Usability Testing (LBUT) is expensive and has lower coverage than Exploratory Heuristic Evaluation (EHE), while EHE is subject to false alarms. A hybrid usability methodology (HUM) comprising LBUT and EHE is therefore offered. Six experiments involving EHE and LBUT were performed at the early, intermediate, and late stages of the SDLC of websites, in which the relative performance of each method was measured using the dependent variables, followed by the design of a HUM. To validate the HUM, four case
studies were conducted, in which remarkable improvements were observed in website effectiveness and efficiency. Based on these findings, HUM is a realistic approach for usability practitioners and also provides stakeholders with a validated situational decision-making framework for usability testing strategies that takes real-world constraints into account. [6]

II. METHODOLOGY
Usability is a multidimensional concept that aims at the fulfilment of a certain set of goals, mainly effectiveness, efficiency, and satisfaction; without these goals, usability cannot be achieved.
Effectiveness refers to the accuracy and completeness of the user's goal achievement.
Efficiency refers to the resources expended by users in order to ensure accurate and complete achievement of the goals.
Satisfaction refers to the subjective thoughts of the user regarding their attitude, level of comfort, the relevance of the application, and the acceptability of use.
A system or product is completely dependent on its specific and distinct context of use: the nature of the task, the users appointed to carry out the task, and the equipment used to perform it. The usability of a system can be measured through these three goals, using a number of observable and quantifiable usability metrics.
In light of the three goals mentioned earlier, we will go through the different metrics used to measure each goal; our main focus, however, will be on effectiveness.
1. Success Rate
Effectiveness can be measured using two usability metrics: the success rate (also called the completion rate) and the number of errors. The success rate is the percentage of users who were able to complete the tasks successfully. Although this metric cannot provide insights into how the tasks were performed or why users failed, it is still critical and sits at the core of usability. The success rate is one of the most commonly used metrics: 79% of practitioners report using it as the first metric to consider, because it is easy to use during data collection and interpretation. The success rate can be measured by assigning a binary value to each user: 1 to those who successfully complete the task and 0 to those who fail. Once the test is over and you have all the data you need, divide the total number of correctly completed attempts by the total number of attempts and multiply by 100. The completion rate is easy to measure and collect, but it comes with one major pitfall: it happens frequently that a user stops at some point during the task and fails to finish it, or finishes it but not in the expected way. Given that they completed some steps of the task successfully, how would you, as an evaluator, score what they accomplished? The following example illustrates how to score your users taking into account the various degrees of their success or failure.
Let us consider, for instance, that your user task is to order a box of dark chocolates with a card to their mother for Mother's Day. The scoring might seem simple at first glance: if the mother receives the box of dark chocolate with the card, it is a case of success; if the mother does not receive anything, it is a case of failure. However, it is not that straightforward; there are other cases to consider:
Ordered a box of chocolate, but not the dark one (white, milk, or an assortment), along with the card.
Ordered the right chocolate box without a gift card.
Ordered more than one box of chocolate by mistake, plus a gift card.
Ordered a box of chocolate but did not add delivery information or an address.
Ordered a box of chocolates and a gift card successfully, but to the wrong address.
All these cases entail a percentage of success and of failure in fulfilling the task; the failure is partial, and so is the success, which simply means that as an evaluator you need to bring your own judgment into the scoring. If you decide that there are no middle grounds, your success rate will differ from the one obtained when you give credit for the effort made despite the deviations from the task you planned. There is no fixed rule when it comes to scoring your users, and success rates often become subjective, because different evaluators will not produce the same scores or estimate the same percentage of failure or success for the cases above. However, in order to mainstream the process, you should work out the important aspects of the task and decide what score you will allot to each part of it.
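The two scoring policies discussed above, strict binary completion versus partial credit, can be sketched as follows. The step weights for the chocolate-ordering task are hypothetical; as the text notes, each evaluator must choose their own.

```python
def success_rate(outcomes):
    """Binary completion rate: outcomes are 1 (success) or 0 (failure)."""
    return 100.0 * sum(outcomes) / len(outcomes)

def partial_score(steps_done, weights):
    """Credit each completed step of a task by an evaluator-chosen weight."""
    return 100.0 * sum(weights[s] for s in steps_done) / sum(weights.values())

# Five test users, three of whom finished the chocolate order end to end:
print(success_rate([1, 0, 1, 1, 0]))  # 60.0

# Hypothetical integer weights: the right product matters most, then the
# delivery address, then the gift card.
weights = {"dark_chocolate": 5, "gift_card": 2, "correct_address": 3}
# A user who ordered the right box with a card, but to the wrong address:
print(partial_score(["dark_chocolate", "gift_card"], weights))  # 70.0
```

Writing the weights down before the test starts is one way to keep the partial-credit scoring consistent across evaluators, which directly addresses the subjectivity the example exposes.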
The success rate remains the simplest usability metric and the easiest to collect among the whole range of usability signals, mainly because it is quick and straightforward, requires little preparation and time to gather, and, most importantly, lets you track progress within your system. It is one of the general measures commonly employed by marketers and designers to see the big picture of how well their system is doing at the level of user experience, though this does not change the fact that it remains subjective. It also helps designers and developers see that uncovering problems is not a sign of failure: nobody does a perfect job the first time, and users always surprise us. It is much better to find out about problems with a few users during a usability test than later, once the design has been released and is out in the marketplace. [7]
2. The Number of Errors
This metric gives an idea of the average number of times an error occurred per user while performing a given task. These errors can be slips, where the user accidentally types the wrong email address or picks the wrong dates when making a reservation or booking a flight, or mistakes, where the user clicks on an image that is not clickable or deliberately double-clicks a button or a link. Users of any interactive system will make errors; roughly 2 out of every 3 users err, and there is no such thing as a "perfect" system anyway. To obtain good diagnostic results, it is highly recommended to write a short description detailing how to score those errors and the severity of each kind of error, which in turn shows how simple and intuitive your system is.
3. Time-Based Efficiency
Also referred to as time on task, this metric measures the time spent by the user to complete the task, i.e., the speed of work. There is consequently a direct relationship between efficiency and effectiveness, and we can say that efficiency is in effect the user's effectiveness divided by the time the user spends.
4. The Overall Relative Efficiency
This is measured through the users who successfully completed the task, in relation to the total time taken by all users. Consider two users, each of whom attempts two tasks. The first user successfully completed task (1) yet failed to complete task (2), while the second user failed to complete task (1) but completed task (2) successfully.
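The paper does not spell out the formulas for these two efficiency metrics. A common formulation in the usability-metrics literature (an assumption here, not something the paper states) computes time-based efficiency as the mean of goals-achieved-per-second across all user/task attempts, and overall relative efficiency as the share of total test time that was spent on successful attempts. A sketch with hypothetical data for the two-user, two-task example above:

```python
# Hypothetical session data: rows are users, columns are tasks.
# success[i][j] is 1 if user i completed task j, 0 otherwise;
# times[i][j] is the time (seconds) user i spent on task j.
success = [[1, 0],
           [0, 1]]
times = [[90.0, 120.0],
         [150.0, 60.0]]

def time_based_efficiency(success, times):
    """Average goals achieved per second over all user/task attempts."""
    pairs = [(s, t) for srow, trow in zip(success, times)
             for s, t in zip(srow, trow)]
    return sum(s / t for s, t in pairs) / len(pairs)

def overall_relative_efficiency(success, times):
    """Time spent on successful attempts as a percentage of all time spent."""
    won = sum(s * t for srow, trow in zip(success, times)
              for s, t in zip(srow, trow))
    total = sum(t for trow in times for t in trow)
    return 100.0 * won / total

print(round(overall_relative_efficiency(success, times), 1))  # 35.7
```

In this example only 150 of the 420 seconds of testing produced a completed task, so the overall relative efficiency is about 35.7%, mirroring the observation that each user succeeded on exactly one of the two tasks.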
5. Post-Task Satisfaction Once your users have finished the task, whether or not they completed it successfully, it is time to hand them a questionnaire to get an idea of the difficulty of the task from the users' point of view. Generally, these questionnaires consist of 5 questions, and the idea behind them is to give your users a space to judge the usability of your system. 6. Task-Level Satisfaction This metric helps in investigating the overall impression of users confronted with the system. To measure the level of satisfaction you can use the smiley-scale method, where the user is expected to choose one of 5 smileys as a reflection of their satisfaction or lack of it. The Word Method can also be used to measure the user's level of satisfaction, by listing a series of positive and negative connotations highlighted in green and red respectively. In light of the conceptual framework discussed earlier, the user experience is highly influenced by everything that surrounds it. However, the tide might be turning on usability funding. I have recently worked on several projects to establish formal usability metrics in several companies. As organizations increase their usability investments, collecting actual measurements is a natural next step and does provide benefits. In general, usability metrics let you: track progress between releases (you cannot fine-tune your methodology unless you know how well you are doing); assess your competitive position (are you better or worse than other companies, and where?); make a stop/go decision before launch (is the design good enough to release to an unsuspecting world?); and create bonus plans for design managers and higher-level executives (for example, you can determine bonus amounts for development project leaders based on how many customer-support calls or emails their products generated during the year). How to Measure
It is easy to specify usability metrics, but hard to collect them. Typically, usability is measured relative to users' performance on a given set of test tasks. The most basic measures are based on the definition of usability as a quality metric: success rate (whether users can perform the task at all), the time a task requires, the error rate, and users' subjective satisfaction. [8] It is also possible to collect more specific metrics, such as the percentage of time that users follow an optimal navigation path or the number of times they need to backtrack. You can collect usability metrics for both novice users and experienced users. Few websites have truly expert users, since people rarely spend enough time on any given site to learn it in great detail. Given this, most websites benefit most from studying novice users. Exceptions are sites like Yahoo and Amazon, which have highly committed and frequent users and may benefit from studying expert users. Intranets, extranets, and web applications are similar to traditional software designs and will hopefully have skilled users; studying experienced users is thus more important there than working with the novice users who typically dominate public websites. With qualitative user testing, it is enough to test 3-5 users. After the fifth user tests, you have gained all the insight you are likely to get, and your best bet is to go back to the drawing board and improve the design so that you can test it again. Testing more than five users wastes resources, reducing the number of design iterations and compromising the final design quality. Unfortunately, when you are collecting usability metrics, you must test with more than five users.
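The jump from five participants to a larger sample follows from basic sampling arithmetic: the confidence interval around a mean task time narrows only with the square root of the sample size. A sketch under an assumed standard deviation (the 20-second figure is invented purely for illustration):

```python
import math

# Half-width of an approximate 95% confidence interval on mean task time.
# Normal approximation; a t-multiplier would be somewhat wider at small n.
def ci_half_width(std_dev, n, z=1.96):
    return z * std_dev / math.sqrt(n)

std_dev = 20.0  # assumed standard deviation of task time, in seconds
for n in (5, 20):
    print(f"n={n:2d}: mean time known to within +/-{ci_half_width(std_dev, n):.1f} sec")
# Quadrupling the sample from 5 to 20 users halves the interval width.
```

This is why quantitative studies cost several times what qualitative ones do: precision is bought with participants, and the returns diminish as the square root.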
In order to get a fairly tight confidence interval on the results, I usually recommend testing 20 users for each design. Thus, conducting quantitative usability studies is approximately four times as expensive as conducting qualitative ones. Considering that you can learn more from the simpler studies, I usually recommend against metrics unless the project is extremely well funded. If you do collect a metric, the success rate or completion rate is a good starting point because it gives a general idea about the performance of the system.

IV. FIGURES AND TABLES

Comparing Two Designs
To illustrate quantitative results, we can look at those recently posted by Macromedia from its usability study of a Flash site, aimed at showing that Flash is not necessarily bad. Basically, Macromedia took a design, redesigned it according to a set of usability guidelines, and tested both versions with a group of users. Here are the results:

Table no. 1
                      Original Design    Redesign
Task 1                12 sec.            6 sec.
Task 2                75 sec.            15 sec.
Task 3                9 sec.             8 sec.
Task 4                140 sec.           40 sec.
Satisfaction score*   44.75              74.50

*Measured on a scale ranging from 12 (unsatisfactory on all counts) to 84 (excellent on all counts).

Table no. 2
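The gains reported in Table no. 1 can be turned into relative numbers. The sketch below simply recomputes percentages from the published figures; the 12-84 satisfaction range comes from the table's own footnote:

```python
# Per-task time savings of the redesign relative to the original design,
# using the timings from Table no. 1.
original = {"Task 1": 12, "Task 2": 75, "Task 3": 9, "Task 4": 140}  # seconds
redesign = {"Task 1": 6, "Task 2": 15, "Task 3": 8, "Task 4": 40}

for task in original:
    saved = 100.0 * (original[task] - redesign[task]) / original[task]
    print(f"{task}: {saved:.0f}% less time")

# Satisfaction rescaled from the 12-84 questionnaire range to 0-100%.
def rescale(score, lo=12.0, hi=84.0):
    return 100.0 * (score - lo) / (hi - lo)

print(f"satisfaction: {rescale(44.75):.1f}% -> {rescale(74.50):.1f}%")
```

Rescaling makes the satisfaction change comparable across studies that use different questionnaire ranges; the time savings range from modest (Task 3) to dramatic (Task 2).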
Fig. 1, Fig. 2, Fig. 3 (figure content not reproduced in this text version)

V. CONCLUSION
With usability metrics, it is possible to monitor and quantify the usability of any system, regardless of whether it is software, hardware, web-based, or a mobile application. This is because the metrics presented here are based on extensive research and testing by various academics and experts and have withstood the test of time. Moreover, they cover all three core elements that constitute the definition of usability: effectiveness, efficiency, and satisfaction, thus ensuring an all-round quantification of the usability of the system being tested. Too often, usability gets side-tracked and becomes something to be addressed later on. Tracking the usability of your product with metrics allows you to have a clear understanding of the experience you are providing to your users, and to improve it over time. Usability metrics are measured and aggregated into actionable results, which allows you to act instantly on the data you record. That makes it painless to keep track of how your design's usability progresses, detect issues, and improve your users' experience.

REFERENCES
Bevan, N. and Macleod, M., 1994. Usability measurement in context. Behaviour and Information Technology 13: 132-145.
Ivory, M.Y. and Hearst, M.A., 2001. The state of the art in automating usability evaluation of user interfaces. ACM Computing Surveys 33: 470-516.
Kirakowski, J. and Corbett, M., 1993. SUMI: The Software Usability Measurement Inventory. British Journal of Educational Technology 24: 210-212.
Lin, H. X., Choong, Y.-Y., and Salvendy, G., 1997.
A proposed index of usability: A method for comparing the relative usability of different software systems. Behaviour and Information Technology 16: 267-277.
Macleod, M., 1994. Usability: Practical methods for testing and improvement. Proceedings of the Norwegian Computer Society Software Conference, Sandvika, Norway. Retrieved July 3, 2005 from http://www.usability.serco.com/papers/mm-us94.pdf.
Macleod, M. and Rengger, R., 1993. The development of DRUM: A software tool for video-assisted usability evaluation. Retrieved July 3, 2005 from http://www.usability.serco.com/papers/drum93.pdf.
Nielsen, J., 1993. Usability Engineering. London, UK: Academic Press.
Symposium on User Interface Software and Technology, New York: ACM Press, pp. 101-110.
Shackel, B., 1991. Usability: Context, framework, definition, design and evaluation. In B. Shackel and S. Richardson (Eds.), Human Factors for Informatics Usability, Cambridge, MA: University Press, pp. 21-38.
Landauer, T.K., 1995. The Trouble with Computers: Usefulness, Usability and Productivity. MIT Press.
Mayhew, D.J., 1999. The Usability Engineering Lifecycle: A Practitioner's Handbook for User Interface Design. San Francisco: Morgan Kaufmann.
Holzinger, A., 2005. Usability engineering for software developers. Communications of the ACM 48(1): 71-74.
Seffah, A. and Metzker, E., 2004. The obstacles and myths of usability and software engineering. Communications of the ACM 47(12): 71-76.
Nielsen, J., 2012. "Usability 101: Introduction to Usability". Nielsen Norman Group, 4 January 2012. Archived from the original on 1 September 2016. Retrieved 7 August 2016.