Short Analytical Critique on An Examination of Software Engineering Work Practices

Loren Karl Schwappach
Colorado Technical University
Abstract

This paper is a short analytical critique of the white paper An Examination of Software Engineering Work Practices, written by Janice Singer, Timothy Lethbridge, Norman Vinson, and Nicolas Anquetil. Their paper presented work practice data on the daily activities of software engineers, collected through four separate studies. In it they argue that Empirical Studies of Programmers and Human-Computer Interaction studies of programmers are problematic, and that a human-to-human work practices approach is therefore necessary for accurate data collection. This critique reviews the bias and accuracy of the methodologies they used, as well as the end result of their work: a source code search utility created as a solution to their hypothesized problem.
Short Analytical Critique on An Examination of Software Engineering Work Practices

1. Introduction

This paper is a short analytical critique of the white paper An Examination of Software Engineering Work Practices, written by Janice Singer, Timothy Lethbridge, Norman Vinson, and Nicolas Anquetil. In their paper the authors presented work practice data on the daily activities of software engineers, collected through four separate studies (one observing an individual, two observing a software engineering group, and one examining company-wide tool usage statistics). They argue that Empirical Studies of Programmers (ESP) and Human-Computer Interaction (HCI) studies of programmers are problematic, and that a human-to-human work practices approach is therefore necessary for accurate data collection. ESP holds that an understanding of the mental processes involved in programming will permit the design of tools that mesh with the programming process (Singer, Lethbridge, Vinson, & Anquetil, 1997). HCI holds that designers should attempt to ensure that prospective users can use the software without encountering difficulties; in other words, the design should clarify what actions the user should take (Singer, Lethbridge, Vinson, & Anquetil, 1997). The authors discount both of these methods, stating that neither produces tools that can actually be used. However, the authors did not have the knowledge we have today showing the impact that ESP and HCI have made in the twenty-first century (such as in AI operating systems and in tools such as Microsoft's Visual C++ 2010, which made heavy use of HCI). This short critique reviews the bias and accuracy of the methodologies used by the study group, as well as the end result of their work: a Unix-based source code search utility called tkSee, which they created as a solution to their hypothesized problem.
2. Critique of the Work

First, the authors should be given credit for their contributions to research in the study of work practices. Through their efforts they attempt to demonstrate that work practices provide a beneficial path to tool design, one that is an alternative (and, they believe, preferable) to human-computer interaction approaches such as ESP and HCI. They argue that by studying the user's cognitive processes and mental models with an emphasis on usability, more practical tool design becomes possible. This argument may or may not be true; however, the process by which they reach this conclusion, and thereby produce the data used to justify the development of their tkSee tool, is very arguable, not to mention the fact that their universal search utility (their end product) does not return any hits via Google.

While critiquing the white paper it quickly becomes apparent that the authors' bias and primary motivation for using the work practice methodology are questionable, simply from the authors' fields of study and the time the article was published: the lead author, Janice Singer, is a cognitive psychologist, and the document was published in 1997, a time when computers were just beginning to become commonplace devices.

This becomes even more apparent when examining the data collected by the group while observing the first study subject, a recently hired employee they name B and who they state "has worked in the software industry for many years" (Singer, Lethbridge, Vinson, & Anquetil, 1997); the number of years is never stated. In fact, while they argue that B probably spends more time reading source code due to his inexperience, the experienced group they use in their study is composed of eight members ranging from a recent college graduate to a member with eight years of experience, and they claim that this group is more experienced than employee B, who "has worked in the software industry for many years" (Singer, Lethbridge, Vinson, & Anquetil, 1997).

Right from the start the information and data seem to point to the authors' bias towards human-to-human work studies, and one might even buy in to the argument if the collection methods used for their conclusions had been more sound and accurate.

In their paper they first show the results of a questionnaire illustrating that 66% of the employees who took the survey spent time each day reading through documentation, while 50% spent time looking at source code, writing documentation, writing code, and attending meetings (Singer, Lethbridge, Vinson, & Anquetil, 1997). Later in the paper they argue that the employees were clearly wrong in their open-ended statements, based on watching employee B for fourteen half-hour sessions over a five-month period. Truly believing that watching an employee once every two weeks for half an hour is enough to compile a list of his daily activities is lunacy in itself. Had they run longer shadow sessions they may have come up with extremely different results and in fact validated the employees' questionnaire responses.

The next fallacy in their argument that ESP studies are worthless for accurate tool development is encountered in the company study showing company-wide tool usage statistics. Here they show that looking at source code and searching are among the main events logged daily. So we have computer-human data showing right away that source code browsing and searching are the most frequent daily activities. They use this data to build on their belief that a more efficient Just-In-Time Comprehension (JITC) search utility is needed to allow more efficient programming and debugging; an illustrative sketch of such a utility is given below. This is exactly the type of data that would have been encountered in an ESP study, yet the study group claims that ESP studies are worthless for tool development.
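To make concrete what such a search utility does, the following is a minimal, hypothetical sketch of a JITC-style identifier search over a source tree, written in Python for this critique. It illustrates the concept only; it is not the authors' tkSee implementation, whose internals their paper does not describe, and the file extensions and command-line interface are assumptions.

#!/usr/bin/env python3
"""Minimal, hypothetical sketch of a Just-In-Time Comprehension (JITC) style
source search: find every occurrence of an identifier in a source tree so a
maintainer can read only the code relevant to the current task.
Illustration only; this is not the tkSee tool described in the paper."""

import re
import sys
from pathlib import Path

SOURCE_EXTENSIONS = {".c", ".h", ".cpp", ".py", ".java"}  # assumed file types

def search_identifier(root: str, identifier: str):
    """Yield (file, line number, line text) for each whole-word match."""
    pattern = re.compile(r"\b%s\b" % re.escape(identifier))
    for path in Path(root).rglob("*"):
        if path.suffix not in SOURCE_EXTENSIONS or not path.is_file():
            continue
        try:
            lines = path.read_text(errors="ignore").splitlines()
        except OSError:
            continue  # skip unreadable files
        for lineno, line in enumerate(lines, start=1):
            if pattern.search(line):
                yield path, lineno, line.strip()

if __name__ == "__main__":
    # Usage: python jitc_search.py <source_root> <identifier>
    root_dir, name = sys.argv[1], sys.argv[2]
    for file_path, lineno, text in search_identifier(root_dir, name):
        print(f"{file_path}:{lineno}: {text}")

Run as, for example, "python jitc_search.py ./src parse_config" to list every file and line where the identifier appears, which is the kind of just-in-time lookup the authors built tkSee to support.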
Most of the fallacies encountered while critically reviewing this paper are due to the human limitations imposed by their work-study-driven methodology. Had they had computer logging or even video recording available to produce a more detailed analysis of the data, their study would probably have yielded much more usable results.

3. Conclusion

The authors conclude the paper by stating that the study of work practices through human interaction was a complete success and resulted in the development of a universal, multi-language search utility for enhancing JITC while writing and reviewing source code. While I agree that a focus on work practices increases the likelihood that tools can be smoothly integrated into users' work, the arguments and test methodology used are highly debatable, and it would seem that a computer-human methodology in today's world of high-speed computing is a much more attractive alternative, one that can analyze and review usage statistics far more accurately and efficiently than monitoring a single user's activities (especially in fourteen half-hour sessions over a five-month period). A minimal sketch of such automated usage logging follows.
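As a closing illustration, here is a minimal, hypothetical sketch of the kind of automated, computer-based usage logging argued for above, written in Python. The event names, user labels, and log format are assumptions made for this critique; they do not correspond to any system described in the original study.

#!/usr/bin/env python3
"""Hypothetical sketch of automated tool-usage logging and aggregation,
illustrating the computer-human data collection argued for in this critique.
Event names, users, and the log format are assumptions, not the studied
company's actual instrumentation."""

import json
import time
from collections import Counter
from pathlib import Path

LOG_FILE = Path("tool_usage.log")  # assumed log location

def log_event(user: str, tool: str, action: str) -> None:
    """Append one timestamped usage event as a JSON line."""
    event = {"time": time.time(), "user": user, "tool": tool, "action": action}
    with LOG_FILE.open("a") as fh:
        fh.write(json.dumps(event) + "\n")

def summarize() -> Counter:
    """Count how often each (tool, action) pair occurs across all logged events."""
    counts: Counter = Counter()
    if LOG_FILE.exists():
        for line in LOG_FILE.read_text().splitlines():
            event = json.loads(line)
            counts[(event["tool"], event["action"])] += 1
    return counts

if __name__ == "__main__":
    # Simulated events: what an editor or shell wrapper might record automatically.
    log_event("engineer_b", "editor", "open_source_file")
    log_event("engineer_b", "search", "grep_identifier")
    log_event("engineer_b", "search", "grep_identifier")
    for (tool, action), count in summarize().most_common():
        print(f"{tool}/{action}: {count}")

Instrumentation of this sort runs continuously for every user, so the resulting statistics cover far more activity than fourteen half-hour shadowing sessions could.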
References

Anderson, J. (1995). Cognitive Psychology and Its Implications. WH Freeman.

Beyer, H., & Holtzblatt, K. (1995). Apprenticing with the customer. Communications of the ACM, 38, 45-52.

Blomberg, J., Suchman, L., & Trigg, R. (1996). Reflections on a work-oriented design project. Human-Computer Interaction, 11, 237-265.

Brooks, R. (1983). Towards a theory of the comprehension of computer programs. International Journal of Man-Machine Studies, 18, 543-554.

Holt, R. Software Bookshelf: Overview and Construction. Retrieved from www.turing.toronto.edu/holt/papers/bsbuild.html

Lethbridge, T., & Anquetil, N. Architecture of a source code exploration tool: A software engineering case study. School of Information Technology and Engineering, Technical Report.

Lethbridge, T., & Singer, J. (1996). Strategies for studying maintenance. Workshop on Empirical Studies of Software Maintenance, Monterey.

Lethbridge, T., & Singer, J. (1997). Understanding software maintenance tools: Some empirical research. Workshop on Empirical Studies of Software Maintenance (WESS 97), Bari, Italy.

Littman, D., Pinto, J., Letovsky, S., & Soloway, E. (1986). Mental models and software maintenance. Empirical Studies of Programmers, 80-98.

Mayhew, D. (1991). Principles and Guidelines in Software User Interface Design. Prentice Hall.

Müller, H., Mehmet, O., Tilley, S., & Uhl, J. (1993). A reverse engineering approach to subsystem identification. Software Maintenance and Practice, 5, 181-204.

Pennington, N. (1987). Stimulus structures and mental representations in expert comprehension of computer programs. Cognitive Psychology, 19, 295-341.

Singer, J., & Lethbridge, T. (1996). Methods for studying maintenance activities. Workshop on Empirical Studies of Software Maintenance, Monterey.

Singer, J., & Lethbridge, T. (in preparation). Just-in-Time Comprehension: A New Model of Program Understanding.

Singer, J., Lethbridge, T., Vinson, N., & Anquetil, N. (1997). An Examination of Software Engineering Work Practices. NRC Publications Archive. Retrieved from http://nparc.cisti-icist.nrc-cnrc.gc.ca/npsi/ctrl?action=rtdoc&an=5209032&lang=en

Singer, J., Lethbridge, T., & Vinson, N. (1998). Work practices as an alternative method for tool design in software engineering. Submitted to CHI.

Storey, M., Fracchia, F., & Müller, H. (1997). Cognitive elements to support the construction of a mental model during software visualization. Proceedings of the 5th Workshop on Program Comprehension, Dearborn, MI, pp. 17-28, May 1997.

Take5 Corporation home page. http://www.takefive.com/index.htm

Vicente, K., & Pejtersen, A. (1993). Cognitive Work Analysis. In press.

von Mayrhauser, A., & Vans, A. From program comprehension to tool requirements for an industrial environment. Proceedings of the 2nd Workshop on Program Comprehension, Capri, Italy, pp. 78-86.

von Mayrhauser, A., & Vans, A. (1993). From code understanding needs to reverse engineering tool capabilities. Proceedings of the 6th International Workshop on Computer-Aided Software Engineering (CASE93), Singapore, pp. 230-239.

von Mayrhauser, A., & Vans, A. (1995). Program comprehension during software maintenance and evolution. Computer, pp. 44-55.