Why Usability Testing Works for IT Audit (IACIS 2010)

  • Speaker note: This study used a traditional experimental design involving two simulated websites, a high-complexity site and a low-complexity site. The four dependent variables used to evaluate cognitive processing were (1) purchase-task success rate, (2) time for subjects to purchase products, (3) user-satisfaction score, and (4) voluntary exit rate. A data-collection booklet was developed to record the dependent variables; it included five items.

    1. WHY USABILITY TESTING WORKS FOR IT AUDIT. Voraphan Manomuth Raungpaka, PhD, Kasetsart University, Thailand, [email_address]; Usana Patramontree, CPA, Kasetsart University, Thailand, [email_address]. Presented by Voraphan M. Raungpaka at the IACIS Conference, Las Vegas, USA, 7 October 2010.
    2. OUTLINE
       • Introduction and Background
       • Research Methodology
       • Results
       • Recommendations and Practitioner's Take-Away
    3. IN OUR ENGAGEMENT PLAN
       • We were expected to perform typical Computer-Assisted Audit Techniques (CAATs), such as:
         • the test-data technique, and
         • generalized audit software techniques.
       • CAATs require high computer skill.
         • These techniques are under-utilized in the auditing field (Curtis & Payne, 2008; Celluro, 2003; IIA, 2002; AICPA, 1998).
    4. • The impact of real users' interaction with the computer, especially through a system interface, on data integrity seems to have been overlooked or only superficially commented on.
       • There is an obvious risk to database integrity when input data are incomplete and/or inaccurate because of mistakes real users make when using an interface.
    5. TYPICAL IT AUDIT TECHNIQUES
       [Diagram: user → application program → data]
       • Test data: used to discover weaknesses in program controls.
       • Audit software: used to discover data and database inaccuracy and incompleteness.
       • ????: What IT audit technique should be used to discover the usability problems of real users?
    6. DOES AN AUDITOR EVER KNOW ABOUT SYSTEM USABILITY PROBLEMS OF REAL USERS?
       • Auditors seldom know about system usability problems of real users, since auditors are hired to audit the company's historical data.
    7. WHAT HAVE WE GOT FROM PRELIMINARY SURVEYS?
       • Management and staff were quite satisfied with various aspects of the FMS, but some management and staff reported negative user-interface experiences.
       • So we came up with the idea of applying usability-testing techniques to gain better evidence on the users' experience when using the fund management system for specified tasks (in addition to administering the test data and the generalized audit package).
    8. WHAT IS FMS SOFTWARE?
       • Fund Management System (FMS) is a pseudonym for the main software, developed in-house to store data for grant projects, assist in managing the projects, and provide managerial reports.
    9. SOLUTION PROPOSED!
       [Diagram: user → application program → data]
       • Test data: used to discover weaknesses in program controls.
       • Audit software: used to discover data and database inaccuracy and incompleteness.
       • Usability testing: used to discover usability problems of real users.
    10. USABILITY TESTING TECHNIQUE
       • Usability = user + ability to use.
       • A technique used to evaluate a human-made product by testing it on users, acquiring direct information on how people use technology such as web applications, software applications, computer interfaces, devices, or any other products.
       • Involves watching people use the product to discover errors and areas for improvement.
    11. USABILITY TESTING
       Usability testing generally involves measuring how well test subjects respond in five areas (a metric sketch follows this list):
       • Efficiency: how long does it take subjects to complete basic tasks?
       • Accuracy: how many mistakes do subjects make?
       • Recall: how much do subjects remember afterwards, or after periods of non-use?
       • Emotional response: how do subjects feel about the completed task? Is the person confident or stressed? Would the user recommend this system to a friend?
       • Learnability: how easy is it for users to accomplish basic tasks the first time they encounter the design?
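As a rough illustration of how the quantitative areas might be tallied from raw observations, here is a minimal sketch; the TaskRecord structure and every value in it are hypothetical, not data from this study:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class TaskRecord:
    """One observed task attempt (hypothetical structure, not from the paper)."""
    minutes: float   # time taken on the task
    errors: int      # mistakes made along the way
    completed: bool  # whether the task was finished at all

def efficiency(records):
    """Efficiency: mean time per task, in minutes."""
    return mean(r.minutes for r in records)

def accuracy(records):
    """Accuracy: mean number of mistakes per task."""
    return mean(r.errors for r in records)

def completion_rate(records):
    """Share of tasks the subject completed."""
    return sum(r.completed for r in records) / len(records)

# Invented example session with three task attempts:
session = [
    TaskRecord(minutes=12.0, errors=2, completed=True),
    TaskRecord(minutes=15.5, errors=4, completed=False),
    TaskRecord(minutes=13.9, errors=1, completed=True),
]
print(f"efficiency: {efficiency(session):.1f} min/task")
print(f"accuracy:   {accuracy(session):.1f} errors/task")
print(f"completion: {completion_rate(session):.0%}")
```

Recall and emotional response are usually captured separately, through follow-up sessions and post-test questionnaires respectively, rather than from task logs.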
    12. METHODOLOGY USED TO EVALUATE USABILITY PROBLEMS
       • Usability testing techniques
    13. OBSERVATION
       • Tested participants' qualifications:
         • familiar with the use of computers and some other software applications,
         • but never trained on nor used the FMS software application.
       • In our pre-test briefing, we:
         • introduced them to the application, and
         • asked them to think aloud and perform the 10 specified tasks.
       • We carefully observed them during the 60-minute test session.
    14. THINK-ALOUD
       • A technique used to solicit participant feedback during the test.
       • Participants were asked to describe out loud how they were attempting to complete each task as they worked, while we video-recorded what they said and did.
       • We took notes of user comments and actions during the test and performed a critical-incident analysis on the participants' actions and comments.
    15. POST-TEST QUESTIONNAIRE
       • Developed using the Venkatesh et al. (2003) questions, following the UTAUT theory.
       • The questionnaire consisted of six parts (a scoring sketch follows this list):
         • performance expectancy,
         • effort expectancy,
         • attitude toward using,
         • facilitating conditions,
         • self-efficacy,
         • behavioral intention.
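A minimal sketch of how such a questionnaire is commonly scored, assuming 5-point Likert items: each UTAUT construct is summarized by the mean of its item scores. The construct names come from the slide; every response value below is invented for illustration:

```python
from statistics import mean

# Hypothetical 5-point Likert responses grouped by UTAUT construct.
responses = {
    "performance expectancy":  [2, 3, 2, 2],
    "effort expectancy":       [2, 2, 1, 3],
    "attitude toward using":   [3, 2, 2, 2],
    "facilitating conditions": [3, 3, 4, 3],
    "self-efficacy":           [2, 3, 3, 2],
    "behavioral intention":    [2, 2, 2, 3],
}

# Summarize each construct by its mean item score.
for construct, scores in responses.items():
    print(f"{construct:24s} {mean(scores):.2f}")
```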
    16. UTAUT THEORY
       • Unified Theory of Acceptance and Use of Technology (UTAUT), by Venkatesh et al. (2003).
    17. RESULTS
       • The study results show that user acceptance of the FMS software is generally low.
       • The high number of users who performed the tasks unsuccessfully, or who gave up, shows a high risk of inaccuracy and incompleteness of input data.
       • The mean task time of 13.8 minutes shows the inefficiency of this software.
    18. POST-TEST QUESTIONNAIRE RESULTS
    19. OBSERVATION RESULTS
       • Data collection: we counted a task as successful (S) if the user found the correct (C) answer. If the user gave a wrong (W) answer, or if he or she gave up (G), we counted the task as unsuccessful (U).
       • From the results, only 46 percent of users completed their tasks successfully, compared with 52 percent who gave up and 2 percent who gave wrong answers (a tally sketch follows this list). [Chart annotation: Higher Risk]
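A minimal sketch of how the C/W/G codes roll up into success and failure rates; the outcome list below is invented, not the study's data:

```python
from collections import Counter

# Hypothetical per-task outcomes, coded as on the slide:
# 'C' = correct answer, 'W' = wrong answer, 'G' = gave up.
outcomes = ['C', 'G', 'G', 'C', 'W', 'G', 'C', 'G', 'C', 'G']

counts = Counter(outcomes)
total = len(outcomes)

success_rate = counts['C'] / total                    # S: correct
unsuccess_rate = (counts['W'] + counts['G']) / total  # U: wrong or gave up

print(f"successful (S):   {success_rate:.0%}")
print(f"unsuccessful (U): {unsuccess_rate:.0%} "
      f"(gave up {counts['G'] / total:.0%}, wrong {counts['W'] / total:.0%})")
```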
    20. OBSERVATION RESULTS
       [Figure 2: Results from the post-test questionnaire. Chart annotations: Lower Risk, Higher Risk, Inefficiency]
    21. PROS AND CONS OF USING USABILITY TESTING TECHNIQUES IN IT AUDIT
       PROS:
       • It is not too difficult to perform.
       • It is relatively understandable by the firm's management and users.
       • It provides ways to measure users' attitudes and behavior.
       • The recommendations gained from the think-aloud process are beneficial for further improvement of the software.
       CONS:
       • It does not provide direct evidence of data inaccuracy and incompleteness; it only indicates the potential audit risk areas and the risk likelihood.
       • We still need other typical computer-assisted audit techniques to detect the actual risk and to correct the problem.
    22. CONTRIBUTION OF THIS REPORT
       • Helps auditors carry out their IT audit plans and engagements more effectively.
       • We hope that the experience gained from this work contributes beneficial knowledge to the IT audit profession.
    23. PRACTITIONER'S TAKE-AWAY
       • Individual usability testing gives you more information on users' behavior when using the application's interface, something we rarely had explicit audit evidence of before.
       • Individual usability testing supports the think-aloud method, which allows you to understand how a participant thinks and feels about whatever he or she is looking at and doing while going about the tasks.
       • Individual usability testing gives you almost immediate feedback on your design, and a much better understanding of which parts of your interface are difficult to use.
    24. PRACTITIONER'S TAKE-AWAY (CONT.)
       • The video-recorded test sessions allow us to refer back to what participants did and how they reacted, so we can analyze the video with an emphasis on identifying usability problems.
       • A usability test session followed by an individual interview, a focused discussion, enables us to clarify the user's comments made during the test session.
       • 'Zero users give zero insights' (Nielsen, 2000b), but usability testing with only five users will let you discover more than 75% of the usability problems in the design (a worked calculation follows this list).
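The five-user figure follows from the model described in Nielsen (2000b): with n test users, the expected share of problems found is 1 - (1 - L)^n, where L is the average probability that a single user uncovers a given problem (roughly 0.31 in the data Nielsen reports). A quick sketch of the arithmetic:

```python
# Nielsen's model: share of usability problems found by n test users
# is 1 - (1 - L)**n, where L is the per-user detection probability.
L = 0.31  # average rate reported in Nielsen (2000b)

for n in (1, 3, 5, 15):
    found = 1 - (1 - L) ** n
    print(f"{n:2d} users: {found:.0%} of problems found")
# With 5 users the model gives about 84%, above the 75% figure on the slide.
```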
    25. THANK YOU! Questions?
    26. REFERENCES
       • Boritz, E. (2005). IS practitioners' views on core concepts of information integrity. International Journal of Accounting Information Systems, 6(4), 260-279.
       • CICA/AICPA. (1999). Continuous auditing: Research report. Canadian Institute of Chartered Accountants.
       • Flowerday, S., Blundell, A. W., & Von Solms, R. (2006). Continuous auditing technologies and models: A discussion. Computers & Security, 25(5), 325-331.
       • Institute of Internal Auditors (IIA). (2008). Exposure draft: International standards for the professional practice of internal auditing. Retrieved March 5, 2009, from www.theiia.org/Guidance
       • Manomuth, V. (2006). An empirical test of the minimal cognitive processing theory of web buyer behavior. Unpublished doctoral dissertation, Utah State University, Logan.
       • Murthy, U., & Groomer, M. (2004). A continuous auditing web services model for XML-based accounting systems. International Journal of Accounting Information Systems, 5(2), 139-163.
       • Nielsen, J. (1994). Usability engineering. San Diego, CA: Morgan Kaufmann.
       • Nielsen, J. (2000a). Designing web usability. Indianapolis, IN: New Riders Publishing.
       • Nielsen, J. (2000b). Why you only need to test with 5 users. Retrieved January 25, 2009, from www.useit.com/alertbox/20000319.html
       • Nielsen, J. (2003). Usability 101: Introduction to usability. Retrieved January 25, 2008, from www.useit.com/alertbox/20030825.html
       • Norden, L., Creelan, J. M., Kimball, D., & Quesenbery, W. (2006). The machinery of democracy: Usability of voting systems. New York: Brennan Center for Justice at NYU School of Law.
       • Robert, L. (2003). Towards a paradigm for continuous auditing. Retrieved February 7, 2009, from webcache.googleusercontent.com/search?=cache:IkrtU7LxXVQJ:www.auditsoftware.net/community/how/run/tools/Towards%2520a%2520Paradigm%2520for%2520Continuous%2520Auditin1.doc+Towards+a+paradigm+for+continuous+auditing&cd=1&hl=en&ct=clnk&gl=th
       • Redish, J. (2007). Expanding usability testing to evaluate complex systems. Journal of Usability Studies, 2(3), 102-111.
       • Trites, G. (2004). Director responsibility for IT governance. Retrieved January 20, 2009, from www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6W6B-4CS4FMS-&_user=10&_coverDate=07%2F31%2F2004&_rdoc=1&_fmt=high&_orig=search&_sort=d&_docanchor=&view=c&_searchStrId=1360084478&_rerunOrigin=google&_acct=C000050221&_version=1&_urlVersion=0&_userid=10&md5=82a58e6e82df0c725ea400445fb40a9f
       • Tuttle, B., & Scott, V. D. (2007). An empirical examination of COBIT as an internal control framework for information technology. International Journal of Accounting Information Systems, 8(4), 240-263.
       • Van Duyne, D. K., Landay, J. A., & Hong, J. I. (2003). The design of sites: Patterns, principles, and processes for crafting a customer-centered web experience. San Francisco: Addison Wesley.
       • Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425-478.
