Guide to the Software Engineering Body of Knowledge
2004 Version
SWEBOK®
A project of the IEEE Computer Society Professional Practices Committee
SWEBOK® is an official service mark of the IEEE.
Guide to the Software Engineering Body of Knowledge
2004 Version
SWEBOK®

Executive Editors
Alain Abran, École de technologie supérieure
James W. Moore, The MITRE Corp.

Editors
Pierre Bourque, École de technologie supérieure
Robert Dupuis, Université du Québec à Montréal

Project Champion
Leonard L. Tripp, Chair, Professional Practices Committee, IEEE Computer Society (2001-2003)

http://computer.org
Los Alamitos, California
Washington • Brussels • Tokyo
Copyright © 2004 by The Institute of Electrical and Electronics Engineers, Inc. All rights reserved.

Copyright and Reprint Permissions: This document may be copied, in whole or in part, in any form or by any means, as is, or with alterations, provided that (1) alterations are clearly marked as alterations and (2) this copyright notice is included unmodified in any copy. Any other use or distribution of this document is prohibited without the prior express permission of IEEE.

You use this document on the condition that you indemnify and hold harmless IEEE from any and all liability or damages to yourself or your hardware or software, or third parties, including attorneys fees, court costs, and other related costs and expenses, arising out of your use of this document irrespective of the cause of said liability.

IEEE MAKES THIS DOCUMENT AVAILABLE ON AN "AS IS" BASIS AND MAKES NO WARRANTY, EXPRESS OR IMPLIED, AS TO THE ACCURACY, CAPABILITY, EFFICIENCY, MERCHANTABILITY, OR FUNCTIONING OF THIS DOCUMENT. IN NO EVENT WILL IEEE BE LIABLE FOR ANY GENERAL, CONSEQUENTIAL, INDIRECT, INCIDENTAL, EXEMPLARY, OR SPECIAL DAMAGES, EVEN IF IEEE HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.

IEEE Computer Society Order Number C2330
ISBN 0-7695-2330-7
Library of Congress Number 2005921729

Additional copies may be ordered from:
IEEE Computer Society Customer Service Center, 10662 Los Vaqueros Circle, P.O. Box 3014, Los Alamitos, CA 90720-1314; Tel: +1-714-821-8380; Fax: +1-714-821-4641; E-mail: cs.books@computer.org
IEEE Service Center, 445 Hoes Lane, P.O. Box 1331, Piscataway, NJ 08855-1331; Tel: +1-732-981-0060; Fax: +1-732-981-9667; http://shop.ieee.org/store/; customer-service@ieee.org
IEEE Computer Society Asia/Pacific Office, Watanabe Bldg., 1-4-2 Minami-Aoyama, Minato-ku, Tokyo 107-0062, JAPAN; Tel: +81-3-3408-3118; Fax: +81-3-3408-3553; tokyo.ofc@computer.org

Publisher: Angela Burgess
Group Managing Editor, CS Press: Deborah Plummer
Advertising/Promotions: Tom Fink
Production Editor: Bob Werner
Printed in the United States of America
TABLE OF CONTENTS

FOREWORD
PREFACE
CHAPTER 1   INTRODUCTION TO THE GUIDE
CHAPTER 2   SOFTWARE REQUIREMENTS
CHAPTER 3   SOFTWARE DESIGN
CHAPTER 4   SOFTWARE CONSTRUCTION
CHAPTER 5   SOFTWARE TESTING
CHAPTER 6   SOFTWARE MAINTENANCE
CHAPTER 7   SOFTWARE CONFIGURATION MANAGEMENT
CHAPTER 8   SOFTWARE ENGINEERING MANAGEMENT
CHAPTER 9   SOFTWARE ENGINEERING PROCESS
CHAPTER 10  SOFTWARE ENGINEERING TOOLS AND METHODS
CHAPTER 11  SOFTWARE QUALITY
CHAPTER 12  RELATED DISCIPLINES OF SOFTWARE ENGINEERING
APPENDIX A  KNOWLEDGE AREA DESCRIPTION SPECIFICATIONS FOR THE IRONMAN VERSION OF THE GUIDE TO THE SOFTWARE ENGINEERING BODY OF KNOWLEDGE
APPENDIX B  EVOLUTION OF THE GUIDE TO THE SOFTWARE ENGINEERING BODY OF KNOWLEDGE
APPENDIX C  ALLOCATION OF IEEE AND ISO SOFTWARE ENGINEERING STANDARDS TO SWEBOK KNOWLEDGE AREAS
APPENDIX D  CLASSIFICATION OF TOPICS ACCORDING TO BLOOM'S TAXONOMY
FOREWORD

In this Guide, the IEEE Computer Society establishes for the first time a baseline for the body of knowledge for the field of software engineering, and the work partially fulfills the Society's responsibility to promote the advancement of both theory and practice in this field. In so doing, the Society has been guided by the experience of disciplines with longer histories but was not bound either by their problems or their solutions.

It should be noted that the Guide does not purport to define the body of knowledge but rather to serve as a compendium and guide to the body of knowledge that has been developing and evolving over the past four decades. Furthermore, this body of knowledge is not static. The Guide must, necessarily, develop and evolve as software engineering matures. It nevertheless constitutes a valuable element of the software engineering infrastructure.

In 1958, John Tukey, the world-renowned statistician, coined the term software. The term software engineering was used in the title of a NATO conference held in Germany in 1968. The IEEE Computer Society first published its Transactions on Software Engineering in 1972. The committee established within the IEEE Computer Society for developing software engineering standards was founded in 1976.

The first holistic view of software engineering to emerge from the IEEE Computer Society resulted from an effort led by Fletcher Buckley to develop IEEE standard 730 for software quality assurance, which was completed in 1979. The purpose of IEEE Std 730 was to provide uniform, minimum acceptable requirements for preparation and content of software quality assurance plans. This standard was influential in completing the developing standards in the following topics: configuration management, software testing, software requirements, software design, and software verification and validation.

During the period 1981-1985, the IEEE Computer Society held a series of workshops concerning the application of software engineering standards. These workshops involved practitioners sharing their experiences with existing standards. The workshops also held sessions on planning for future standards, including one involving measures and metrics for software engineering products and processes. The planning also resulted in IEEE Std 1002, Taxonomy of Software Engineering Standards (1986), which provided a new, holistic view of software engineering. The standard describes the form and content of a software engineering standards taxonomy. It explains the various types of software engineering standards, their functional and external relationships, and the role of various functions participating in the software life cycle.

In 1990, planning for an international standard with an overall view was begun. The planning focused on reconciling the software process views from IEEE Std 1074 and the revised US DoD standard 2167A. The revision was eventually published as DoD Std 498. The international standard was completed in 1995 with the designation ISO/IEC 12207 and given the title of Standard for Software Life Cycle Processes. ISO/IEC 12207 provided a major point of departure for the body of knowledge captured in this book.

It was the IEEE Computer Society Board of Governors' approval of the motion put forward in May 1993 by Fletcher Buckley which resulted in the writing of this book. The Association for Computing Machinery (ACM) Council approved a related motion in August 1993. The two motions led to a joint committee under the leadership of Mario Barbacci and Stuart Zweben, who served as cochairs. The mission statement of the joint committee was "To establish the appropriate set(s) of criteria and norms for professional practice of software engineering upon which industrial decisions, professional certification, and educational curricula can be based." The steering committee organized task forces in the following areas:

1. Define Required Body of Knowledge and Recommended Practices.
2. Define Ethics and Professional Standards.
3. Define Educational Curricula for undergraduate, graduate, and continuing education.

This book supplies the first component: required body of knowledge and recommended practices.

The code of ethics and professional practice for software engineering was completed in 1998 and approved by both the ACM Council and the IEEE Computer Society Board of Governors. It has been adopted by numerous corporations and other organizations and is included in several recent textbooks.

The educational curriculum for undergraduates is being completed by a joint effort of the IEEE Computer Society and the ACM and is expected to be completed in 2004.

Every profession is based on a body of knowledge and recommended practices, although they are not always defined in a precise manner. In many cases, these are formally documented, usually in a form that permits them to be used for such purposes as accreditation of academic programs, development of education and training programs, certification of specialists, or professional licensing. Generally, a professional society or related body maintains custody of such a formal definition. In cases where no such formality exists, the body of knowledge and recommended practices are "generally recognized" by practitioners and may be codified in a variety of ways for different uses.

It is hoped that readers will find this book useful in guiding them toward the knowledge and resources they need in their lifelong career development as software engineering professionals.

The book is dedicated to Fletcher Buckley in recognition of his commitment to promoting software engineering as a professional discipline and his excellence as a software engineering practitioner in radar applications.

Leonard L. Tripp, IEEE Fellow 2003
Chair, Professional Practices Committee, IEEE Computer Society (2001-2003)
Chair, Joint IEEE Computer Society and ACM Steering Committee for the Establishment of Software Engineering as a Profession (1998-1999)
Chair, Software Engineering Standards Committee, IEEE Computer Society (1992-1998)
ASSOCIATE EDITORS

The following persons served as Associate Editors for either the Trial version published in 2001 or for the 2004 version.

Software Requirements: Peter Sawyer and Gerald Kotonya, Computing Department, Lancaster University, UK, {p.sawyer}{g.kotonya}@lancaster.ac.uk
Software Design: Guy Tremblay, Département d'informatique, UQAM, Canada, tremblay.guy@uqam.ca
Software Construction: Steve McConnell, Construx Software, USA, Steve.McConnell@construx.com; Terry Bollinger, the MITRE Corporation, USA, terry@mitre.org; Philippe Gabrini, Département d'informatique, UQAM, Canada, gabrini.philippe@uqam.ca; Louis Martin, Département d'informatique, UQAM, Canada, martin.louis@uqam.ca
Software Testing: Antonia Bertolino and Eda Marchetti, ISTI-CNR, Italy, {antonia.bertolino}{eda.marchetti}@isti.cnr.it
Software Maintenance: Thomas M. Pigoski, Techsoft Inc., USA, tmpigoski@techsoft.com; Alain April, École de technologie supérieure, Canada, aapril@ele.etsmtl.ca
Software Configuration Management: John A. Scott, Lawrence Livermore National Laboratory, USA, scott7@llnl.gov; David Nisse, USA, nissed@worldnet.att.net
Software Engineering Management: Dennis Frailey, Raytheon Company, USA, DJFrailey@Raytheon.com; Stephen G. MacDonell, Auckland University of Technology, New Zealand, smacdone@aut.ac.nz; Andrew R. Gray, University of Otago, New Zealand
Software Engineering Process: Khaled El Emam, served while at the Canadian National Research Council, Canada, khaled.el-emam@nrc-cnrc.gc.ca
Software Engineering Tools and Methods: David Carrington, School of Information Technology and Electrical Engineering, The University of Queensland, Australia, davec@itee.uq.edu.au
Software Quality: Alain April, École de technologie supérieure, Canada, aapril@ele.etsmtl.ca; Dolores Wallace, retired from the National Institute of Standards and Technology, USA, Dolores.Wallace@nist.gov; Larry Reeker, NIST, USA, Larry.Reeker@nist.gov
References Editor: Marc Bouisset, Département d'informatique, UQAM, Bouisset.Marc@uqam.ca
INDUSTRIAL ADVISORY BOARD

At the time of the publication, the following people formed the Industrial Advisory Board:

Mario R. Barbacci, Software Engineering Institute, representing the IEEE Computer Society
Carl Chang, representing Computing Curricula 2001
François Coallier, École de technologie supérieure, speaking as ISO/IEC JTC 1/SC7 Chairman
Charles Howell, The MITRE Corporation
Anatol Kark, National Research Council of Canada
Philippe Kruchten, University of British Columbia, served as representative of Rational Software
Laure Le Bars, SAP Labs (Canada)
Steve McConnell, Construx Software
Dan Nash, Raytheon Company
Fred Otto, Canadian Council of Professional Engineers (CCPE)
Richard Metz, The Boeing Company
Larry Reeker, National Institute of Standards and Technology, Department of Commerce, USA

The following persons served along with the IAB in the Executive Change Control Board for the 2004 edition:

Donald Bagert, Rose-Hulman Institute of Technology, representing the IEEE Computer Society Professional Practices Committee
Ann Sobel, Miami University, representing the Computing Curricula Software Engineering Steering Committee
PANEL OF EXPERTS

The following persons served on the panel of experts for the preparation of the Trial version of the Guide:

Steve McConnell, Construx Software
Roger Pressman, R.S. Pressman and Associates
Ian Sommerville, Lancaster University, UK
  • 12. REVIEW TEAM The following people participated in the review process of this Guide for the Trial version and/or for the 2004version.Abbas, Rasha, Australia Bierwolf, Robert, The Charette, Robert, USAAbran, Alain, Canada Netherlands Chevrier, Marielle, CanadaAccioly, Carlos, Brazil Bisbal, Jesus, Ireland Chi, Donald, USAAckerman, Frank, USA Boivin, Michel, Canada Chiew, Vincent, CanadaAkiyama, Yoshihiro, Japan Bolton, Michael, Canada Chilenski, John, USAAl-Abdullah, Mohammad, USA Bomitali, Evelino, Italy Chow, Keith, ItalyAlarcon, Miren Idoia, Spain Bonderer, Reto, Switzerland Ciciliani, Ricardo, ArgentinaAlawy, Ahmed, USA Bonk, Francis, USA Clark, Glenda, USAAlleman, Glen, USA Booch, Grady, USA Cleavenger, Darrell, USAAllen, Bob, Canada Booker, Glenn, USA Cloos, Romain, LuxembourgAllen, David, USA Börstler, Jürgen, Sweden Coallier, François, CanadaAmorosa, Francesco, Italy Borzovs, Juris, Latvia Coblentz, Brenda, USAAmyot, Daniel, Canada Botting, Richard, USA Cohen, Phil, AustraliaAndrade, Daniel, Brazil Bourque, Pierre, Canada Collard, Ross, New ZealandApril, Alain, Canada Bowen, Thomas, USA Collignon, Stephane, AustraliaArroyo-Figueror, Javier, USA Boyd, Milt, USA Connors, Kathy Jo, USAAshford, Sonny, USA Boyer, Ken, USA Cooper, Daniel, USAAtsushi, Sawada, Japan Brashear, Phil, USA Councill, Bill, USABackitis Jr., Frank, USA Briggs, Steve, USA Cox, Margery, USABagert, Donald, USA Bright, Daniela, USA Cunin, Pierre-Yves, FranceBaker, Jr., David, USA Brosseau, Jim, Canada DaLuz, Joseph, USABaker, Theodore, USA Brotbeck, George, USA Dampier, David, USABaldwin, Mark, USA Brown, Normand, Canada Daneva, Maya, CanadaBales, David, UK Bruhn, Anna, USA Daneva, Maya, CanadaBamberger, Judy, USA Brune, Kevin, USA Daughtry, Taz, USABanerjee, Bakul, USA Bryant, Jeanne, USA Davis, Ruth, USABarber, Scott, USA Buglione, Luigi, Italy De Cesare, Sergio, UKBarker, Harry, UK Bullock, James, USA Dekleva, Sasa, USABarnes, Julie, USA Burns, Robert, USA Del Castillo, Federico, PeruBarney, David, Australia Burnstein, Ilene, USA Del Dago, Gustavo, ArgentinaBarros, Rafael, Colombia Byrne, Edward, USA DeWeese, Perry, USABastarache, Louis, Canada Calizaya, Percy, Peru Di Nunno, Donn, USABayer, Steven, USA Carreon, Juan, USA Diaz-Herrera, Jorge, USABeaulac, Adeline, Canada Carroll, Sue, USA Dieste, Oscar, SpainBeck, William, USA Carruthers, Kate, Australia Dion, Francis, CanadaBeckman, Kathleen, USA Caruso, Richard, USA Dixon, Wes, USABelow, Doreen, USA Carvalho, Paul, Canada Dolado, Javier, SpainBenediktsson, Oddur, Iceland Case, Pam, USA Donaldson, John, UKBen-Menachem, Mordechai, Cavanaugh, John, USA Dorantes, Marco, MexicoIsrael Celia, John A., USA Dorofee, Audrey, USABergeron, Alain, Canada Chalupa Sampaio, Alberto Douglass, Keith, CanadaBerler, Alexander, Greece Antonio, Portugal Du, Weichang, CanadaBernet, Martin, USA Chan, F.T., Hong Kong Duben, Anthony, USABernstein, Larry, USA Chan, Keith, Hong Kong Dudash, Edward, USABertram, Martin, Germany Chandra, A.K., India Duncan, Scott, USABialik, Tracy, USA Chang, Wen-Kui, Taiwan Duong, Vinh, CanadaBielikova, Maria, Slovakia Chapin, Ned, USA Durham, George, USA xii © IEEE – 2004 Version
  • 13. Dutil, Daniel, Canada Gresse von Wangenheim, Jones, James E., USADutton, Jeffrey, USA Christiane, Brazil Jones, Alan, UKEbert, Christof, France Grigonis, George, USA Jones, James, USAEdge, Gary, USA Gupta, Arun, USA Jones, Larry, CanadaEdwards, Helen Maria, UK Gustafson, David, USA Jones, Paul, USAEl-Kadi, Amr, Egypt Gutcher, Frank, USA Ju, Dehua, ChinaEndres, David, USA Haas, Bob, USA Juan-Martinez, Manuel-Engelmann, Franz, Switzerland Hagar, Jon, USA Fernando, SpainEscue, Marilyn, USA Hagstrom, Erick, USA Juhasz, Zoltan, HungaryEspinoza, Marco, Peru Hailey, Victoria, Canada Juristo, Natalia, SpainFay, Istvan, Hungary Hall, Duncan, New Zealand Kaiser, Michael, SwitzerlandFayad, Mohamed, USA Haller, John, USA Kambic, George, USAFendrich, John, USA Halstead-Nussloch, Richard, Kamthan, Pankaj, CanadaFerguson, Robert, USA USA Kaner, Cem, USAFernandez, Eduardo, USA Hamm, Linda, USA Kark, Anatol, CanadaFernandez-Sanchez, Jose Luis, Hankewitz, Lutz, Germany Kasser, Joe, USASpain Harker, Rob, USA Kasser, Joseph, AustraliaFilgueiras, Lucia, Brazil Hart, Hal, USA Katz, Alf, AustraliaFinkelstein, Anthony, UK Hart, Ronald, USA Kececi, Nihal, CanadaFlinchbaugh, Scott, USA Hartner, Clinton, USA Kell, Penelope, USAForrey, Arden, USA Hayeck, Elie, USA Kelly, Diane, CanadaFortenberry, Kirby, USA He, Zhonglin, UK Kelly, Frank, USAFoster, Henrietta, USA Hedger, Dick, USA Kenett, Ron, IsraelFowler, Martin, USA Hefner, Rick, USA Kenney, Mary L., USAFowler, John Jr., USA Heinrich, Mark, USA Kerievsky, Joshua, USAFox, Christopher, USA Heinze, Sherry, Canada Kerr, John, USAFrankl, Phyllis, USA Hensel, Alan, USA Kierzyk, Robert, USAFreibergs, Imants, Latvia Herrmann, Debra, USA Kinsner, W., CanadaFrezza, Stephen, USA Hesse, Wolfgang, Germany Kirkpatrick, Harry, USAFruehauf, Karol, Switzerland Hilburn, Thomas, USA Kittiel, Linda, USAFuggetta, Alphonso, Italy Hill, Michael, USA Klappholz, David, USAFujii, Roger, USA Ho, Vinh, Canada Klein, Joshua, IsraelFUSCHI, David Luigi, Italy Hodgen, Bruce, Australia Knight, Claire, UKFuschi, David Luigi, Italy Hodges, Brett, Canada Knoke, Peter, USAGabrini, Philippe, Canada Hoffman, Douglas, Canada Ko, Roy, Hong KongGagnon, Eric, Canada Hoffman, Michael, USA Kolewe, Ralph, CanadaGanor, Eitan, Israel Hoganson, Tammy, USA Komal, Surinder Singh, CanadaGarbajosa, Juan, Spain Hollocker, Chuck, USA Kovalovsky, Stefan, AustriaGarceau, Benoît, Canada Horch, John, USA Krauth, Péter, HungaryGarcia-Palencia, Omar, Howard, Adrian, UK Krishnan, Nirmala, USAColombia Huang, Hui Min, USA Kromholz, Alfred, CanadaGarner, Barry, USA Hung, Chih-Cheng, USA Kruchten, Philippe, CanadaGelperin, David, USA Hung, Peter, USA Kuehner, Nathanael, CanadaGersting, Judith, Hawaii Hunt, Theresa, USA Kwok, Shui Hung, CanadaGiesler, Gregg, USA Hunter, John, USA Lacroix, Dominique, CanadaGil, Indalecio, Spain Hvannberg, Ebba Thora, Iceland LaMotte, Stephen W., USAGilchrist, Thomas, USA Hybertson, Duane, USA Land, Susan, USAGiurescu, Nicolae, Canada Ikiz, Seckin, Turkey Lange, Douglas, USAGlass, Robert, USA Iyengar, Dwaraka, USA Laporte, Claude, CanadaGlynn, Garth, UK Jackelen, George, USA Lawlis, Patricia, USAGoers, Ron, USA Jaeger, Dawn, USA Le, Thach, USAGogates, Gregory, USA Jahnke, Jens, Canada Leavitt, Randal, CanadaGoldsmith, Robin, USA James, Jean, USA LeBel, Réjean, CanadaGoodbrand, Alan, Canada Jino, Mario, Brazil Leciston, David, USAGorski, Janusz, Poland Johnson, Vandy, USA Lee, Chanyoung, USAGraybill, Mark, USA Jones, Griffin, USA Lehman, Meir (Manny), UK© IEEE – 2004 Version xiii
  • 14. Leigh, William, USA Miranda, Eduardo, Canada Phister, Paul, USALembo, Jim, USA Mistrik, Ivan, Germany Phister, Jr., Paul, USALenss, John, USA Mitasiunas, Antanas, Lithuania Piattini, Mario, SpainLeonard, Eugene, USA Modell, Howard, USA Piersall, Jeff, USALethbridge, Timothy, Canada Modell, Staiger,USA Pillai, S.K., IndiaLeung, Hareton, Hong Kong Modesitt, Kenneth, USA Pinder, Alan, UKLever, Ronald, The Netherlands Moland, Kathryn, USA Pinheiro, Francisco A., BrazilLevesque, Ghislain, Canada Moldavsky, Symon, Ukraine Plekhanova, Valentina, UKLey, Earl, USA Montequín, Vicente R., Spain Poon, Peter, USALinders, Ben, Netherlands Moreno, Ana Maria, Spain Poppendieck, Mary, USALinscomb, Dennis, USA Mosiuoa, Tseliso, Lesotho Powell, Mace, USALittle, Joyce Currie, USA Moudry, James, USA Predenkoski, Mary, USALogan, Jim, USA Msheik, Hamdan, Canada Prescott, Allen, USALong, Carol, UK Mularz, Diane, USA Pressman, Roger, USALounis, Hakim, Canada Mullens, David, USA Price, Art, USALow, Graham, Australia Müllerburg, Monika, Germany Price, Margaretha, USALutz, Michael, USA Murali, Nagarajan, Australia Pullum, Laura, USALynch, Gary, USA Murphy, Mike, USA Purser, Keith, USAMachado, Cristina, Brazil Napier, John, USA Purssey, John, AustraliaMacKay, Stephen, Canada Narasimhadevara, Sudha, Pustaver, John, USAMacKenzie, Garth, USA Canada Quinn, Anne, USAMacNeil, Paul, USA Narawane, Ranjana, India Radnell, David, AustraliaMagel, Kenneth, USA Narayanan, Ramanathan, India Rae, Andrew, UKMains, Harold, USA Navarro Ramirez, Daniel, Rafea, Ahmed, EgyptMalak, Renee, USA Mexico Ramsden, Patrick, AustraliaMaldonado, José Carlos, Brazil Navas Plano, Francisco, Spain Rao, N.Vyaghrewara, IndiaMarcos, Esperanza, Spain Navrat, Pavol, Slovakia Rawsthorne, Dan, USAMarinescu, Radu, Romania Neumann, Dolly, USA Reader, Katherine, USAMarm, Waldo, Peru Nguyen-Kim, Hong, Canada Reddy, Vijay,USAMarusca, Ioan, Canada Nikandros, George, Australia Redwine, Samuel, USAMatlen, Duane, USA Nishiyama, Tetsuto, Japan Reed, Karl, AustraliaMatsumoto, Yoshihiro, Japan Nunn, David, USA Reedy, Ann, USAMcBride, Tom, Australia ODonoghue, David, Ireland Reeker, Larry, USAMcCarthy, Glenn, USA Oliver, David John, Australia Rethard, Tom, USAMcChesney, Ian, UK Olson, Keith, USA Reussner, Ralf, GermanyMcCormick, Thomas, Canada Oskarsson, Östen, Sweden Rios, Joaquin, SpainMcCown, Christian, USA Ostrom, Donald, USA Risbec, Philippe, FranceMcDonald, Jim, USA Oudshoorn, Michael, Australia Roach, Steve, USAMcGrath Carroll, Sue, USA Owen, Cherry, USA Robillard, Pierre, CanadaMcHutchison, Diane, USA Pai, Hsueh-Ieng, Canada Rocha, Zalkind, BrazilMcKinnell, Brian, Canada Parrish, Lee, USA Rodeiro Iglesias, Javier, SpainMcMichael, Robert, USA Parsons, Samuel, USA Rodriguez-Dapena, Patricia,McMillan, William, USA Patel, Dilip, UK SpainMcQuaid, Patricia, USA Paulk, Mark, USA Rogoway, Paul, IsraelMead, Nancy, USA Pavelka, Jan, Czech Republic Rontondi, Guido, ItalyMeeuse, Jaap, The Netherlands Pavlov, Vladimir, Ukraine Roose, Philippe, FranceMeier, Michael, USA Pawlyszyn, Blanche, USA Rosca, Daniela, USAMeisenzahl, Christopher, USA Pecceu, Didier, France Rosenberg, Linda, USAMelhart, Bonnie, USA Perisic, Branko, Yugoslavia Rourke, Michael, AustraliaMengel, Susan, USA Perry, Dale, USA Rout, Terry, AustraliaMeredith, Denis, USA Peters, Dennis, Canada Rufer, Russ, USAMeyerhoff, Dirk, Germany Petersen, Erik, Australia Ruiz, Francisco, SpainMili, Hafedh, Canada Pfahl, Dietmar, Germany Ruocco, Anthony, USAMiller, Chris, Netherlands Pfeiffer, Martin, Germany 
Rutherfoord, Rebecca, USAMiller, Keith, USA Phillips, Dwayne, USA Ryan, Michael, IrelandMiller, Mark, USA Phipps, Robert, USA Salustri, Filippo, Canada xiv © IEEE – 2004 Version
  • 15. Salustri, Filippo, Canada Soundarajan, Neelam, USA Urbanowicz, Theodore, USASalwin, Arthur, USA Sousa Santos, Frederico, Van Duine, Dan, USASanden, Bo, USA Portugal Van Ekris, Jaap, NetherlandsSandmayr, Helmut, Switzerland Spillers, Mark, USA Van Oosterhout, Bram, AustraliaSantana Filho, Ozeas Vieira, Spinellis, Diomidis, Greece Vander Plaats, Jim, USABrazil Splaine, Steve, USA Vegas, Sira, SpainSato, Tomonobu, Japan Springer, Donald, USA Verner, June, USAsatyadas, antony, USA Staiger, John, USA Villas-Boas, André, BrazilSatyadas, Antony, USA Starai, Thomas, USA Vollman, Thomas, USASchaaf, Robert, USA Steurs, Stefan, Belgium Walker, Richard, AustraliaScheper, Charlotte, USA St-Pierre, Denis, Canada Walsh, Bucky, USASchiffel, Jeffrey, USA Stroulia, Eleni, Canada Wang, Yingxu, SwedenSchlicht, Bill, USA Subramanian, K.S., India Wear, Larry, USASchrott, William, USA Sundaram, Sai, UK Weigel, richard, USASchwarm, Stephen, USA Swanek, James, USA Weinstock, Charles, USASchweppe, Edmund, USA Swearingen, Sandra, USA Wenyin, Liu, ChinaSebern, Mark, USA Szymkowiak, Paul, Canada Werner, Linda, USASeffah, Ahmed, Canada Tamai, Tetsuo, Japan Wheeler, David, USASelby, Nancy, USA Tasker, Dan, New Zealand White, Nathan, USASelph, William, USA Taylor, Stanford, USA White, Stephanie, USASen, Dhruba, USA Terekhov, Andrey A., Russian Whitmire, Scott, USASenechal, Raymond, USA Federation Wijbrans, Klaas, TheSepulveda, Christian, USA Terski, Matt, USA NetherlandsSetlur, Atul, USA Thayer, Richard, USA Wijbrans-Roodbergen, Margot,Sharp, David, USA Thomas, Michael, USA The NetherlandsShepard, Terry, Canada Thompson, A. Allan, Australia Wilkie, Frederick, UKShepherd, Alan, Germany Thompson, John Barrie, UK Wille, Cornelius, GermanyShillato, Rrobert W, USA Titus, Jason, USA Wilson, Charles, USAShintani, Katsutoshi, Japan Tockey, Steve, USA Wilson, Leon, USASilva, Andres, Spain Tovar, Edmundo, Spain Wilson, Russell, USASilva, Andres, Spain Towhidnejad, Massood, USA Woechan, Kenneth, USASinger, Carl, USA Trellue, Patricia, USA Woit, Denise, CanadaSinnett, Paul, UK Trèves, Nicolas, France Yadin, Aharon, IsraelSintzoff, André, France Troy, Elliot, USA Yih, Swu, TaiwanSitte, Renate, Australia Tsui, Frank, USA Young, Michal, USASky, Richard, USA Tsuneo, Furuyama, Japan Yrivarren, Jorge, PeruSmilie, Kevin, USA Tuohy, Kenney, USA Znotka, Juergen, GermanySmith, David, USA Tuohy, Marsha P., USA Zuser, Wolfgang, AustriaSophatsathit, Peraphon, Thailand Turczyn, Stephen, USA Zvegintzov, Nicholas, USASorensen, Reed, USA Upchurch, Richard, USA Zweben, Stu, USA© IEEE – 2004 Version xv
The following motion was unanimously adopted by the Industrial Advisory Board on 6 February 2004.

The Industrial Advisory Board finds that the Software Engineering Body of Knowledge project initiated in 1998 has been successfully completed; and endorses the 2004 Version of the Guide to the SWEBOK and commends it to the IEEE Computer Society Board of Governors for their approval.

The following motion was adopted by the IEEE Computer Society Board of Governors in February 2004.

MOVED, that the Board of Governors of the IEEE Computer Society approves the 2004 Edition of the Guide to the Software Engineering Body of Knowledge and authorizes the Chair of the Professional Practices Committee to proceed with printing.
  • 17. PREFACESoftware engineering is an emerging discipline and has long offered a certification for computingthere are unmistakable trends indicating an increasing professionals.level of maturity: All of these efforts are based upon the presumption Several universities throughout the world offer that there is a Body of Knowledge that should be undergraduate degrees in software engineering. mastered by practicing software engineers. The Body For example, such degrees are offered at the of Knowledge exists in the literature that has University of New South Wales (Australia), accumulated over the past thirty years. This book McMaster University (Canada), the Rochester provides a Guide to that Body of Knowledge. Institute of Technology (US), the University of Sheffield (UK), and other universities. PURPOSE In the US, the Engineering Accreditation The purpose of the Guide to the Software Engineering Commission of the Accreditation Board for Body of Knowledge is to provide a consensually Engineering and Technology (ABET) is validated characterization of the bounds of the responsible for the accreditation of undergraduate software engineering discipline and to provide a software engineering programs. topical access to the Body of Knowledge supporting The Canadian Information Processing Society has that discipline. The Body of Knowledge is subdivided published criteria to accredit software engineering into ten software engineering Knowledge Areas (KA) undergraduate university programs. plus an additional chapter providing an overview of the The Software Engineering Institute’s Capability KAs of strongly related disciplines. The descriptions of Maturity Model for Software (SW CMM) and the the KAs are designed to discriminate among the new Capability Maturity Model Integration various important concepts, permitting readers to find (CMMI) are used to assess organizational their way quickly to subjects of interest. Upon finding capability for software engineering. The famous a subject, readers are referred to key papers or book ISO 9000 quality management standards have chapters selected because they succinctly present the been applied to software engineering by the new knowledge. ISO/IEC 90003. In browsing the Guide, readers will note that the The Texas Board of Professional Engineers has content is markedly different from computer science. begun to license professional software engineers. Just as electrical engineering is based upon the science of physics, software engineering should be based, The Association of Professional Engineers and among other things, upon computer science. In these Geoscientists of British Columbia (APEGBC) has two cases, though, the emphasis is necessarily begun registering software professional engineers, different. Scientists extend our knowledge of the laws and the Professional Engineers of Ontario (PEO) of nature while engineers apply those laws of nature to has also announced requirements for licensing. build useful artifacts, under a number of constraints. The Association for Computing Machinery Therefore, the emphasis of the Guide is placed on the (ACM) and the Computer Society of the Institute construction of useful software artifacts. 
of Electrical and Electronics Engineers (IEEE) Readers will also notice that many important aspects of have jointly developed and adopted a Code of information technology that may constitute important Ethics and Professional Practice for software software engineering knowledge are not covered in the engineering professionals.1 Guide, including specific programming languages, The IEEE Computer Society offers the Certified relational databases, and networks. This is a Software Development Professional certification consequence of an engineering-based approach. In all for software engineering. The Institute for fields—not only computing—the designers of Certification of Computing Professionals (ICCP) engineering curricula have realized that specific technologies are replaced much more rapidly than the engineering work force. An engineer must be equipped1 with the essential knowledge that supports the The ACM/CS Software Engineering Code of Ethics and Professional Practice can be found at selection of the appropriate technology at the http://www.computer.org/certification/ethics.htm. appropriate time in the appropriate circumstance. For © IEEE – 2004 Version xvii
  • 18. example, software might be built in Fortran using organizations in need of a consistent view of softwarefunctional decomposition or in C++ using object- engineering for defining education and trainingoriented techniques. The techniques for software requirements, classifying jobs, developingconfiguring instances of those systems would be quite performance evaluation policies, or specifyingdifferent. But, the principles and objectives of software development tasks. It also addressesconfiguration management remain the same. The practicing, or managing, software engineers and theGuide therefore does not focus on the rapidly changing officials responsible for making public policytechnologies, although their general principles are regarding licensing and professional guidelines. Indescribed in relevant KAs. addition, professional societies and educators definingThese exclusions demonstrate that this Guide is the certification rules, accreditation policies fornecessarily incomplete. The Guide covers software university curricula, and guidelines for professionalengineering knowledge that is necessary but not practice will benefit from SWEBOK, as well as thesufficient for a software engineer. Practicing software students learning the software engineering professionengineers will need to know many things about and educators and trainers engaged in definingcomputer science, project management, and systems curricula and course content.engineering—to name a few—that fall outside theBody of Knowledge characterized by this Guide. EVOLUTION OF THE GUIDEHowever, stating that this information should beknown by software engineers is not the same as stating From 1993 to 2000, the IEEE Computer Society andthat this knowledge falls within the bounds of the the Association for Computing Machinery (ACM) cooperated in promoting the professionalization ofsoftware engineering discipline. Instead, it should bestated that software engineers need to know some software engineering through their joint Softwarethings taken from other disciplines—and that is the Engineering Coordinating Committee (SWECC). The Code of Ethics was completed under stewardship ofapproach adopted in this Guide. So, this Guidecharacterizes the Body of Knowledge falling within the the SWECC primarily through volunteer efforts. Thescope of software engineering and provides references SWEBOK project was initiated by the SWECC in 1998.to relevant information from other disciplines. Achapter of the Guide provides a taxonomical overview The SWEBOK project’s scope, the variety ofof the related disciplines derived from authoritative communities involved, and the need for broadsources. participation suggested a need for full-time rather thanThe emphasis on engineering practice leads the Guide volunteer management. For this purpose, the IEEEtoward a strong relationship with the normative Computer Society contracted the Software Engineeringliterature. Most of the computer science, information Management Research Laboratory at the Université dutechnology, and software engineering literature Québec à Montréal (UQAM) to manage the effort. Inprovides information useful to software engineers, but recent years, UQAM has been joined by the École dea relatively small portion is normative. A normative technologie supérieure, Montréal, Québec.document prescribes what an engineer should do in a The project plan comprised three successive phases:specified situation rather than providing information Strawman, Stoneman, and Ironman. An earlythat might be helpful. 
The normative literature is prototype, Strawman, demonstrated how the projectvalidated by consensus formed among practitioners might be organized. The publication of the widelyand is concentrated in standards and related circulated Trial Version of the Guide in 2001 markeddocuments. From the beginning, the SWEBOK project the end of the Stoneman phase of the project andwas conceived as having a strong relationship to the initiated a period of trial usage. The current Guidenormative literature of software engineering. The two marks the end of the Ironman period by providing amajor standards bodies for software engineering (IEEE Guide that has achieved consensus through broadComputer Society Software Engineering Standards review and trial application.Committee and ISO/IEC JTC1/SC7) are represented in The project team developed two important principlesthe project. Ultimately, we hope that software for guiding the project: transparency and consensus.engineering practice standards will contain principles By transparency, we mean that the developmentdirectly traceable to the Guide. process is itself documented, published, and publicized so that important decisions and status are visible to allINTENDED AUDIENCE concerned parties. By consensus, we mean that the only practical method for legitimizing a statement ofThe Guide is oriented toward a variety of audiences, this kind is through broad participation and agreementall over the world. It aims to serve public and private by all significant sectors of the relevant community. xviii © IEEE – 2004 Version
  • 19. Literally hundreds of contributors, reviewers, and trial rearranged to make it more usable, but we were carefulusers have played a part in producing the current to include the same information that was approved bydocument. the earlier consensus process. We updated theLike any software project, the SWEBOK project has reference list so that it would be easier to obtain themany stakeholders—some of which are formally references.represented. An Industrial Advisory Board, composed Trial usage resulted in the recommendation that threeof representatives from industry (Boeing, Construx KAs should be rewritten. Practitioners remarked thatSoftware, the MITRE Corporation, Rational Software, the Construction KA was difficult to apply in aRaytheon Systems, and SAP Labs-Canada), research practical context. The Management KA was perceivedagencies (National Institute of Standards and as being too close to general management and notTechnology, National Research Council of Canada), sufficiently specific to software engineering concerns.the Canadian Council of Professional Engineers, and The Quality KA was viewed as an uncomfortable mixthe IEEE Computer Society, has provided financial of process quality and product quality; it was revised tosupport for the project. The IAB’s generous support emphasize the latter.permits us to make the products of the SWEBOK Finally, some KAs were revised to remove materialproject publicly available without any charge duplicating that of other KAs.(see http://www.swebok.org). IAB membership issupplemented with the chairs of ISO/IEC JTC1/SC7and the related Computing Curricula 2001 initiative. LIMITATIONSThe IAB reviews and approves the project plans, Even though the Guide has gone through an elaborateoversees consensus building and review processes, development and review process, the followingpromotes the project, and lends credibility to the effort. limitations of this process must be recognized andIn general, it ensures the relevance of the effort to real- stated:world needs. Software engineering continues to be infusedThe Trial Version of the Guide was the product of with new technology and new practices.extensive review and comment. In three public review Acceptance of new techniques grows and oldercycles, a total of roughly 500 reviewers from 42 techniques are discarded. The topics listed ascountries provided roughly 9,000 comments, all of “generally accepted” in this Guide are carefullywhich are available at www.swebok.org. To produce selected at this time. Inevitably, though, thethe current version, we released the Trial Version for selection will need to evolve.extensive trial usage. Trial application in specializedstudies resulted in 17 papers describing good aspects The amount of literature that has been publishedof the Guide, as well as aspects needing improvement. on software engineering is considerable and theA Web-based survey captured additional experience: reference material included in this Guide should573 individuals from 55 countries registered for the not be seen as a definitive selection but rather as asurvey; 124 reviewers from 21 countries actually reasonable selection. Obviously, there are otherprovided comments—1,020 of them. Additional excellent authors and excellent references thansuggestions for improvement resulted from liaison those included in the Guide. 
In the case of thewith related organizations and efforts: IEEE-CS/ACM Guide, references were selected because they areComputing Curricula Software Engineering; the IEEE written in English, readily available, recent, andCS Certified Software Development Professional easily readable, and—taken as a group—theyproject; ISO/IEC JTC1/SC7 (software and systems provide coverage of the topics within the KA.engineering standards); the IEEE Software Important and highly relevant reference materialEngineering Standards Committee; the American written in languages other than English have beenSociety for Quality, Software Division; and an omitted from the selected reference material.engineering professional society, the Canadian Council Additionally, one must consider thatof Professional Engineers. Software engineering is an emerging discipline. This is especially true if you compare it to certainCHANGES SINCE THE TRIAL VERSION more established engineering disciplines. ThisThe overall goal of the current revision was to improve means notably that the boundaries between thethe readability, consistency, and usability of the Guide. KAs of software engineering and betweenThis implied a general rewrite of the entire text to software engineering and its related disciplinesmake the style consistent throughout the document. In remain a matter for continued evolution.several cases, the topical breakdown of the KA was © IEEE – 2004 Version xix
The contents of this Guide must therefore be viewed as an "informed and reasonable" characterization of the software engineering Body of Knowledge and as baseline for future evolution. Additionally, please note that the Guide is not attempting nor does it claim to replace or amend in any way laws, rules, and procedures that have been defined by official public policy makers around the world regarding the practice and definition of engineering and software engineering in particular.
Alain Abran, École de technologie supérieure, and James W. Moore, The MITRE Corporation
Executive Editors of the Guide to the Software Engineering Body of Knowledge

Pierre Bourque, École de technologie supérieure, and Robert Dupuis, Université du Québec à Montréal
Editors of the Guide to the Software Engineering Body of Knowledge

Leonard Tripp, 1999 President, IEEE Computer Society
Chair of the Professional Practices Committee, IEEE Computer Society (2001-2003)

December 2004

The SWEBOK project Web site is http://www.swebok.org/

ACKNOWLEDGMENTS

The SWEBOK editorial team gratefully acknowledges the support provided by the members of the Industrial Advisory Board. Funding for this project has been provided by the ACM, Boeing, the Canadian Council of Professional Engineers, Construx Software, the IEEE Computer Society, the MITRE Corporation, the National Institute of Standards and Technology, the National Research Council of Canada, Rational Software, Raytheon Company, and SAP Labs (Canada). The team is thankful for the counsel provided by the Panel of Experts. The team also appreciates the important work performed by the Associate Editors. We would also like to express our gratitude for initial work on the Knowledge Area Descriptions completed by Imants Freibergs, Stephen Frezza, Andrew Gray, Vinh T. Ho, Michael Lutz, Larry Reeker, Guy Tremblay, Chris Verhoef, and Sybille Wolff. The editorial team must also acknowledge the indispensable contribution of the hundreds of reviewers.

The editorial team also wishes to thank the following people who contributed to the project in various ways: Mark Ardis, Yussef Belkebir, Michel Boivin, Julie Bonneau, Simon Bouchard, François Cossette, Vinh Duong, Gilles Gauthier, Michèle Hébert, Paula Hawthorn, Richard W. Heiman, Julie Hudon, Idrissa Konkobo, Rene Köppel, Lucette Lapointe, Claude Laporte, Luis Molinié, Hamdan Msheik, Iphigénie N'Diyae, Serge Oligny, Suzanne Paquette, Keith Paton, Dave Rayford, Normand Séguin, Paul Sinnett, Denis St-Pierre, Dale Strok, Pascale Tardif, Louise Thibaudeau, Dolores Wallace, Évariste Valery Bevo Wandji, and Michal Young.

Finally, there are surely other people who have contributed to this Guide, either directly or indirectly, whose names we have inadvertently omitted. To those people, we offer our tacit appreciation and apologize for having omitted explicit recognition here.
  • 23. CHAPTER 1 INTRODUCTION TO THE GUIDEIn spite of the millions of software professionals worldwide accounting.3 They concluded that an engineering profession isand the ubiquitous presence of software in our society, characterized by several components:software engineering has only recently reached the status of a An initial professional education in a curriculumlegitimate engineering discipline and a recognized profession. validated by society through accreditationAchieving consensus by the profession on a core body of Registration of fitness to practice via voluntaryknowledge is a key milestone in all disciplines and had been certification or mandatory licensingidentified by the IEEE Computer Society as crucial for the Specialized skill development and continuingevolution of software engineering towards professional status. professional educationThis Guide, written under the auspices of the ProfessionalPractices Committee, is part of a multi-year project designed Communal support via a professional societyto reach such a consensus. A commitment to norms of conduct often prescribed in a code of ethicsWHAT IS SOFTWARE ENGINEERING? This Guide contributes to the first three of these components.The IEEE Computer Society defines software engineering as Articulating a Body of Knowledge is an essential step toward developing a profession because it represents a broad“(1) The application of a systematic, disciplined, quantifiable consensus regarding what a software engineering professionalapproach to the development, operation, and maintenance of should know. Without such a consensus, no licensingsoftware; that is, the application of engineering to software. examination can be validated, no curriculum can prepare an(2) The study of approaches as in (1).”1 individual for an examination, and no criteria can be formulated for accrediting a curriculum. The development ofWHAT IS A RECOGNIZED PROFESSION? consensus is also a prerequisite to the adoption of coherent skills development and continuing professional educationFor software engineering to be fully known as a legitimate programs in organizations.engineering discipline and a recognized profession, consensuson a core body of knowledge is imperative. This fact is well WHAT ARE THE OBJECTIVES OF THE SWEBOK PROJECT?illustrated by Starr when he defines what can be considered alegitimate discipline and a recognized profession. In his The Guide should not be confused with the Body ofPulitzer Prize-winning book on the history of the medical Knowledge itself, which already exists in the publishedprofession in the USA, he states, literature. The purpose of the Guide is to describe what“The legitimization of professional authority involves three portion of the Body of Knowledge is generally accepted, todistinctive claims: first, that the knowledge and competence of organize that portion, and to provide a topical access to it.the professional have been validated by a community of his or Additional information on the meaning given to “generallyher peers; second, that this consensually validated knowledge accepted” can be found below and in Appendix A.rests on rational, scientific grounds; and third, that the The Guide to the Software Engineering Body of Knowledgeprofessional’s judgment and advice are oriented toward a set (SWEBOK) was established with the following fiveof substantive values, such as health. These aspects of objectives:legitimacy correspond to the kinds of attributes—collegial, 1. 
To promote a consistent view of software engineeringcognitive, and moral—usually embodied in the term worldwide“profession.”2 2. To clarify the place—and set the boundary—of software engineering with respect to other disciplinesWHAT ARE THE CHARACTERISTICS OF A PROFESSION? such as computer science, project management,Gary Ford and Norman Gibbs studied several recognized computer engineering, and mathematicsprofessions, including medicine, law, engineering, and 3. To characterize the contents of the software engineering discipline1 “IEEE Standard Glossary of Software Engineering Terminology,” IEEE std 610.12-1990, 1990. 3 G. Ford and N.E. Gibbs, A Mature Profession of Software Engineering,2 P. Starr, The Social Transformation of American Medicine, Basic Books, Software Engineering Institute, Carnegie Mellon University, Pittsburgh, 1982, p. 15. Pa., tech. report CMU/SEI-96-TR-004, Jan. 1996.© IEEE – 2004 Version 1-1
4. To provide a topical access to the Software Engineering Body of Knowledge
5. To provide a foundation for curriculum development and for individual certification and licensing material

The first of these objectives, a consistent worldwide view of software engineering, was supported by a development process which engaged approximately 500 reviewers from 42 countries in the Stoneman phase (1998–2001) leading to the Trial version, and over 120 reviewers from 21 countries in the Ironman phase (2003) leading to the 2004 version. More information regarding the development process can be found in the Preface and on the Web site (www.swebok.org). Professional and learned societies and public agencies involved in software engineering were officially contacted, made aware of this project, and invited to participate in the review process. Associate editors were recruited from North America, the Pacific Rim, and Europe. Presentations on the project were made at various international venues and more are scheduled for the upcoming year.

The second of the objectives, the desire to set a boundary for software engineering, motivates the fundamental organization of the Guide. The material that is recognized as being within this discipline is organized into the first ten Knowledge Areas (KAs) listed in Table 1. Each of these KAs is treated as a chapter in this Guide.

Table 1 The SWEBOK Knowledge Areas (KAs)
Software requirements
Software design
Software construction
Software testing
Software maintenance
Software configuration management
Software engineering management
Software engineering process
Software engineering tools and methods
Software quality

In establishing a boundary, it is also important to identify what disciplines share that boundary, and often a common intersection, with software engineering. To this end, the Guide also recognizes eight related disciplines, listed in Table 2 (see Chapter 12, "Related Disciplines of Software Engineering"). Software engineers should, of course, have knowledge of material from these fields (and the KA descriptions may make reference to them). It is not, however, an objective of the SWEBOK Guide to characterize the knowledge of the related disciplines, but rather what knowledge is viewed as specific to software engineering.

Table 2 Related disciplines
Computer engineering
Computer science
Management
Mathematics
Project management
Quality management
Software ergonomics
Systems engineering

HIERARCHICAL ORGANIZATION
The organization of the KA descriptions or chapters supports the third of the project's objectives—a characterization of the contents of software engineering. The detailed specifications provided by the project's editorial team to the associate editors regarding the contents of the KA descriptions can be found in Appendix A.
The Guide uses a hierarchical organization to decompose each KA into a set of topics with recognizable labels. A two- or three-level breakdown provides a reasonable way to find topics of interest. The Guide treats the selected topics in a manner compatible with major schools of thought and with breakdowns generally found in industry and in software engineering literature and standards. The breakdowns of topics do not presume particular application domains, business uses, management philosophies, development methods, and so forth. The extent of each topic's description is only that needed to understand the generally accepted nature of the topics and for the reader to successfully find reference material. After all, the Body of Knowledge is found in the reference materials themselves, not in the Guide.

REFERENCE MATERIAL AND MATRIX
To provide a topical access to the knowledge—the fourth of the project's objectives—the Guide identifies reference material for each KA, including book chapters, refereed papers, or other recognized sources of authoritative information. Each KA description also includes a matrix relating the reference material to the listed topics. The total volume of cited literature is intended to be suitable for mastery through the completion of an undergraduate education plus four years of experience.
In this edition of the Guide, all KAs were allocated around 500 pages of reference material, and this was the specification the associate editors were invited to apply. It may be argued that some KAs, such as software design, deserve more pages of reference material than others. Such modulation may be applied in future editions of the Guide.
It should be noted that the Guide does not attempt to be comprehensive in its citations. Much material that is both suitable and excellent is not referenced. Material was selected in part because—taken as a collection—it provides coverage of the topics described.

DEPTH OF TREATMENT
From the outset, the question arose as to the depth of treatment
the Guide should provide. The project team adopted an approach which supports the fifth of the project's objectives—providing a foundation for curriculum development, certification, and licensing. The editorial team applied the criterion of generally accepted knowledge, to be distinguished from advanced and research knowledge (on the grounds of maturity) and from specialized knowledge (on the grounds of generality of application). The definition comes from the Project Management Institute: "The generally accepted knowledge applies to most projects most of the time, and widespread consensus validates its value and effectiveness."4

Figure 1 Categories of knowledge: generally accepted knowledge (established traditional practices recommended by many organizations), specialized knowledge (practices used only for certain types of software), and advanced and research knowledge (innovative practices tested and used only by some organizations, and concepts still being developed and tested in research organizations).

However, the term "generally accepted" does not imply that the designated knowledge should be uniformly applied to all software engineering endeavors—each project's needs determine that—but it does imply that competent, capable software engineers should be equipped with this knowledge for potential application. More precisely, generally accepted knowledge should be included in the study material for the software engineering licensing examination that graduates would take after gaining four years of work experience. Although this criterion is specific to the US style of education and does not necessarily apply to other countries, we deem it useful. However, the two definitions of generally accepted knowledge should be seen as complementary.

LIMITATIONS RELATED TO THE BOOK FORMAT
The book format for which this edition was conceived has its limitations. The nature of the contents would be better served using a hypertext structure, where a topic would be linked to topics other than the ones immediately preceding and following it in a list.
Some boundaries between KAs, subareas, and so on are also sometimes relatively arbitrary. These boundaries are not to be given too much importance. As much as possible, pointers and links have been given in the text where relevant and useful.
Links between KAs are not of the input-output type. The KAs are meant to be views on the knowledge one should possess in software engineering with respect to the KA in question. The decomposition of the discipline within KAs and the order in which the KAs are presented are not to be assimilated with any particular method or model. The methods are described in the appropriate KA in the Guide, and the Guide itself is not one of them.

THE KNOWLEDGE AREAS
Figures 2 and 3 map out the eleven chapters and the important topics incorporated within them. The first five KAs are presented in traditional waterfall life-cycle sequence. However, this does not imply that the Guide adopts or encourages the waterfall model, or any other model. The subsequent KAs are presented in alphabetical order, and those of the related disciplines are presented in the last chapter. This is identical to the sequence in which they are presented in this Guide.

STRUCTURE OF THE KA DESCRIPTIONS
The KA descriptions are structured as follows.
In the introduction, a brief definition of the KA and an overview of its scope and of its relationship with other KAs are presented.
The breakdown of topics constitutes the core of each KA description, describing the decomposition of the KA into subareas, topics, and sub-topics. For each topic or sub-topic, a short description is given, along with one or more references. The reference material was chosen because it is considered to constitute the best presentation of the knowledge relative to the topic, taking into account the limitations imposed on the choice of references (see above). A matrix links the topics to the reference material.
The last part of the KA description is the list of recommended references. Appendix A of each KA includes suggestions for further reading for those users who wish to learn more about the KA topics. Appendix B presents the list of standards most relevant to the KA. Note that citations enclosed in square brackets "[ ]" in the text identify recommended references, while those enclosed in parentheses "( )" identify the usual references used to write or justify the text. The former are to be found in the corresponding section of the KA and the latter in Appendix A of the KA.
Brief summaries of the KA descriptions and appendices are given next.

4 A Guide to the Project Management Body of Knowledge, 2000 ed., Project Management Institute, www.pmi.org.

SOFTWARE REQUIREMENTS KA (SEE FIGURE 2, COLUMN A)
A requirement is defined as a property that must be exhibited in order to solve some real-world problem.
The first knowledge subarea is Software Requirements Fundamentals. It includes definitions of software requirements
themselves, but also of the major types of requirements: product vs. process, functional vs. nonfunctional, emergent properties. The subarea also describes the importance of quantifiable requirements and distinguishes between systems and software requirements.
The second knowledge subarea is the Requirements Process, which introduces the process itself, orienting the remaining five subareas and showing how requirements engineering dovetails with the other software engineering processes. It describes process models, process actors, process support and management, and process quality and improvement.
The third subarea is Requirements Elicitation, which is concerned with where software requirements come from and how the software engineer can collect them. It includes requirement sources and elicitation techniques.
The fourth subarea, Requirements Analysis, is concerned with the process of analyzing requirements to
- Detect and resolve conflicts between requirements
- Discover the bounds of the software and how it must interact with its environment
- Elaborate system requirements to software requirements
Requirements analysis includes requirements classification, conceptual modeling, architectural design and requirements allocation, and requirements negotiation.
The fifth subarea is Requirements Specification. Requirements specification typically refers to the production of a document, or its electronic equivalent, that can be systematically reviewed, evaluated, and approved. For complex systems, particularly those involving substantial non-software components, as many as three different types of documents are produced: system definition, system requirements specification, and software requirements specification. The subarea describes all three documents and the underlying activities.
The sixth subarea is Requirements Validation, the aim of which is to pick up any problems before resources are committed to addressing the requirements. Requirements validation is concerned with the process of examining the requirements documents to ensure that they are defining the right system (that is, the system that the user expects). It is subdivided into descriptions of the conduct of requirements reviews, prototyping, and model validation and acceptance tests.
The seventh and last subarea is Practical Considerations. It describes topics which need to be understood in practice. The first topic is the iterative nature of the requirements process. The next three topics are fundamentally about change management and the maintenance of requirements in a state which accurately mirrors the software to be built, or that has already been built. It includes change management, requirements attributes, and requirements tracing. The final topic is requirements measurement.

SOFTWARE DESIGN KA (SEE FIGURE 2, COLUMN B)
According to the IEEE definition [IEEE 610.12-90], design is both "the process of defining the architecture, components, interfaces, and other characteristics of a system or component" and "the result of [that] process." The KA is divided into six subareas.
The first subarea presents Software Design Fundamentals, which form an underlying basis to the understanding of the role and scope of software design. These are general software concepts, the context of software design, the software design process, and the enabling techniques for software design.
The second subarea groups together the Key Issues in Software Design. They include concurrency, control and handling of events, distribution of components, error and exception handling and fault tolerance, interaction and presentation, and data persistence.
The third subarea is Software Structure and Architecture, the topics of which are architectural structures and viewpoints, architectural styles, design patterns, and, finally, families of programs and frameworks.
The fourth subarea describes software Design Quality Analysis and Evaluation. While there is an entire KA devoted to software quality, this subarea presents the topics specifically related to software design. These aspects are quality attributes, quality analysis, and evaluation techniques and measures.
The fifth subarea is Software Design Notations, which are divided into structural and behavioral descriptions.
The last subarea describes Software Design Strategies and Methods. First, general strategies are described, followed by function-oriented design methods, object-oriented design methods, data-structure-centered design, component-based design, and others.

SOFTWARE CONSTRUCTION KA (SEE FIGURE 2, COLUMN C)
Software construction refers to the detailed creation of working, meaningful software through a combination of coding, verification, unit testing, integration testing, and debugging. The KA includes three subareas.
The first subarea is Software Construction Fundamentals. The first three topics are basic principles of construction: minimizing complexity, anticipating change, and constructing for verification. The last topic discusses standards for construction.
The second subarea describes Managing Construction. The topics are construction models, construction planning, and construction measurement.
The third subarea covers Practical Considerations. The topics are construction design, construction languages, coding, construction testing, reuse, construction quality, and integration.

SOFTWARE TESTING (SEE FIGURE 2, COLUMN D)
Software Testing consists of the dynamic verification of the
behavior of a program on a finite set of test cases, suitably selected from the usually infinite executions domain, against the expected behavior. It includes five subareas.
It begins with a description of Software Testing Fundamentals. First, the testing-related terminology is presented, then key issues of testing are described, and finally the relationship of testing to other activities is covered.
The second subarea is Test Levels. These are divided between the targets of the tests and the objectives of the tests.
The third subarea is Test Techniques. The first category includes the tests based on the tester's intuition and experience. A second group comprises specification-based techniques, followed by code-based techniques, fault-based techniques, usage-based techniques, and techniques relative to the nature of the application. A discussion of how to select and combine the appropriate techniques is also presented.
The fourth subarea covers Test-Related Measures. The measures are grouped into those related to the evaluation of the program under test and the evaluation of the tests performed.
The last subarea describes the Test Process and includes practical considerations and the test activities.

SOFTWARE MAINTENANCE (SEE FIGURE 2, COLUMN E)
Once in operation, anomalies are uncovered, operating environments change, and new user requirements surface. The maintenance phase of the life cycle commences upon delivery, but maintenance activities occur much earlier. The Software Maintenance KA is divided into four subareas.
The first one presents Software Maintenance Fundamentals: definitions and terminology, the nature of maintenance, the need for maintenance, the majority of maintenance costs, the evolution of software, and the categories of maintenance.
The second subarea groups together the Key Issues in Software Maintenance. These are the technical issues, the management issues, maintenance cost estimation, and software maintenance measurement.
The third subarea describes the Maintenance Process. The topics here are the maintenance processes and maintenance activities.
Techniques for Maintenance constitute the fourth subarea. These include program comprehension, re-engineering, and reverse engineering.

SOFTWARE CONFIGURATION MANAGEMENT (SEE FIGURE 3, COLUMN F)
Software Configuration Management (SCM) is the discipline of identifying the configuration of software at distinct points in time for the purpose of systematically controlling changes to the configuration and of maintaining the integrity and traceability of the configuration throughout the system life cycle. This KA includes six subareas.
The first subarea is Management of the SCM Process. It covers the topics of the organizational context for SCM, constraints and guidance for SCM, planning for SCM, the SCM plan itself, and surveillance of SCM.
The second subarea is Software Configuration Identification, which identifies items to be controlled, establishes identification schemes for the items and their versions, and establishes the tools and techniques to be used in acquiring and managing controlled items. The first topics in this subarea are identification of the items to be controlled and the software library.
The third subarea is Software Configuration Control, which is the management of changes during the software life cycle. The topics are: first, requesting, evaluating, and approving software changes; second, implementing software changes; and third, deviations and waivers.
The fourth subarea is Software Configuration Status Accounting. Its topics are software configuration status information and software configuration status reporting.
The fifth subarea is Software Configuration Auditing. It consists of software functional configuration auditing, software physical configuration auditing, and in-process audits of a software baseline.
The last subarea is Software Release Management and Delivery, covering software building and software release management.

SOFTWARE ENGINEERING MANAGEMENT (SEE FIGURE 3, COLUMN G)
The Software Engineering Management KA addresses the management and measurement of software engineering. While measurement is an important aspect of all KAs, it is here that the topic of measurement programs is presented. There are six subareas for software engineering management. The first five cover software project management and the sixth describes software measurement programs.
The first subarea is Initiation and Scope Definition, which comprises determination and negotiation of requirements, feasibility analysis, and process for the review and revision of requirements.
The second subarea is Software Project Planning and includes process planning, determining deliverables, effort, schedule, and cost estimation, resource allocation, risk management, quality management, and plan management.
The third subarea is Software Project Enactment. The topics here are implementation of plans, supplier contract management, implementation of measurement process, monitor process, control process, and reporting.
The fourth subarea is Review and Evaluation, which includes the topics of determining satisfaction of requirements and reviewing and evaluating performance.
The fifth subarea describes Closure: determining closure and closure activities.
Finally, the sixth subarea describes Software Engineering
Measurement, more specifically, measurement programs. Product and process measures are described in the Software Engineering Process KA. Many of the other KAs also describe measures specific to their KA. The topics of this subarea include establishing and sustaining measurement commitment, planning the measurement process, performing the measurement process, and evaluating measurement.

SOFTWARE ENGINEERING PROCESS (SEE FIGURE 3, COLUMN H)
The Software Engineering Process KA is concerned with the definition, implementation, assessment, measurement, management, change, and improvement of the software engineering process itself. It is divided into four subareas.
The first subarea presents Process Implementation and Change. The topics here are process infrastructure, the software process management cycle, models for process implementation and change, and practical considerations.
The second subarea deals with Process Definition. It includes the topics of software life cycle models, software life cycle processes, notations for process definitions, process adaptation, and automation.
The third subarea is Process Assessment. The topics here include process assessment models and process assessment methods.
The fourth subarea describes Process and Product Measurements. The software engineering process covers general product measurement, as well as process measurement in general. Measurements specific to KAs are described in the relevant KA. The topics are process measurement, software product measurement, quality of measurement results, software information models, and process measurement techniques.

SOFTWARE ENGINEERING TOOLS AND METHODS (SEE FIGURE 3, COLUMN I)
The Software Engineering Tools and Methods KA includes both software engineering tools and software engineering methods.
The Software Engineering Tools subarea uses the same structure as the Guide itself, with one topic for each of the other nine software engineering KAs. An additional topic is provided: miscellaneous tools issues, such as tool integration techniques, which are potentially applicable to all classes of tools.
The Software Engineering Methods subarea is divided into three subsections: heuristic methods dealing with informal approaches, formal methods dealing with mathematically based approaches, and prototyping methods dealing with software development approaches based on various forms of prototyping.

SOFTWARE QUALITY (SEE FIGURE 3, COLUMN J)
The Software Quality KA deals with software quality considerations which transcend the software life cycle processes. Since software quality is a ubiquitous concern in software engineering, it is also considered in many of the other KAs, and the reader will notice pointers to those KAs throughout this KA. The description of this KA covers three subareas.
The first subarea describes the Software Quality Fundamentals, such as software engineering culture and ethics, the value and costs of quality, models and quality characteristics, and quality improvement.
The second subarea covers Software Quality Management Processes. The topics here are software quality assurance, verification and validation, and reviews and audits.
The third and final subarea describes Practical Considerations related to software quality. The topics are software quality requirements, defect characterization, software quality management techniques, and software quality measurement.

RELATED DISCIPLINES OF SOFTWARE ENGINEERING (SEE FIGURE 3, COLUMN K)
The last chapter is entitled Related Disciplines of Software Engineering. In order to circumscribe software engineering, it is necessary to identify the disciplines with which software engineering shares a common boundary. This chapter identifies, in alphabetical order, these related disciplines. For each related discipline, using a recognized consensus-based source where one was found, the following are identified:
- an informative definition (when feasible);
- a list of KAs.
The related disciplines are:
Computer engineering
Computer science
Management
Mathematics
Project management
Quality management
Software ergonomics
Systems engineering

APPENDICES

APPENDIX A. KA DESCRIPTION SPECIFICATIONS
The appendix describes the specifications provided by the editorial team to the associate editors for the content, recommended references, format, and style of the KA descriptions.
APPENDIX B. EVOLUTION OF THE GUIDE
The second appendix describes the project's proposal for the evolution of the Guide. The 2004 Guide is simply the current edition of a guide which will continue evolving to meet the needs of the software engineering community. Planning for evolution is not yet complete, but a tentative outline of the process is provided in this appendix. As of this writing, this process has been endorsed by the project's Industrial Advisory Board and briefed to the Board of Governors of the IEEE Computer Society, but is not yet either funded or implemented.

APPENDIX C. ALLOCATION OF STANDARDS TO KAS
The third appendix is an annotated table of the most relevant standards, mostly from the IEEE and the ISO, allocated to the KAs of the SWEBOK Guide.

APPENDIX D. BLOOM RATINGS
As an aid, notably to curriculum developers (and other users), in support of the project's fifth objective, the fourth appendix rates each topic with one of a set of pedagogical categories commonly attributed to Benjamin Bloom. The concept is that educational objectives can be classified into six categories representing increasing depth: knowledge, comprehension, application, analysis, synthesis, and evaluation. Results of this exercise for all KAs can be found in Appendix D. This appendix must not, however, be viewed as a definitive classification, but much more as a starting point.
Figure 2 First five KAs
Figure 3 Last six KAs
CHAPTER 2
SOFTWARE REQUIREMENTS

ACRONYMS
DAG  Directed Acyclic Graph
FSM  Functional Size Measurement
INCOSE  International Council on Systems Engineering
SADT  Structured Analysis and Design Technique
UML  Unified Modeling Language

INTRODUCTION
The Software Requirements Knowledge Area (KA) is concerned with the elicitation, analysis, specification, and validation of software requirements. It is widely acknowledged within the software industry that software engineering projects are critically vulnerable when these activities are performed poorly.
Software requirements express the needs and constraints placed on a software product that contribute to the solution of some real-world problem. [Kot00]
The term "requirements engineering" is widely used in the field to denote the systematic handling of requirements. For reasons of consistency, though, this term will not be used in the Guide, as it has been decided that the use of the term "engineering" for activities other than software engineering ones is to be avoided in this edition of the Guide.
For the same reason, "requirements engineer," a term which appears in some of the literature, will not be used either. Instead, the term "software engineer" or, in some specific cases, "requirements specialist" will be used, the latter where the role in question is usually performed by an individual other than a software engineer. This does not imply, however, that a software engineer could not perform the function.
The KA breakdown is broadly compatible with the sections of IEEE 12207 that refer to requirements activities. (IEEE12207.1-96)
A risk inherent in the proposed breakdown is that a waterfall-like process may be inferred. To guard against this, subarea 2, Requirements Process, is designed to provide a high-level overview of the requirements process by setting out the resources and constraints under which the process operates and which act to configure it.
An alternate decomposition could use a product-based structure (system requirements, software requirements, prototypes, use cases, and so on). The process-based breakdown reflects the fact that the requirements process, if it is to be successful, must be considered as a process involving complex, tightly coupled activities (both sequential and concurrent), rather than as a discrete, one-off activity performed at the outset of a software development project.
The Software Requirements KA is related closely to the Software Design, Software Testing, Software Maintenance, Software Configuration Management, Software Engineering Management, Software Engineering Process, and Software Quality KAs.

BREAKDOWN OF TOPICS FOR SOFTWARE REQUIREMENTS

1. Software Requirements Fundamentals

1.1. Definition of a Software Requirement
At its most basic, a software requirement is a property which must be exhibited in order to solve some problem in the real world. The Guide refers to requirements on "software" because it is concerned with problems to be addressed by software. Hence, a software requirement is a property which must be exhibited by software developed or adapted to solve a particular problem. The problem may be to automate part of a task of someone who will use the software, to support the business processes of the organization that has commissioned the software, to correct shortcomings of existing software, to control a device, and many more. The functioning of users, business processes, and devices is typically complex. By extension, therefore, the requirements on particular software are typically a complex combination of requirements from different people at different levels of an organization and from the environment in which the software will operate.
An essential property of all software requirements is that they be verifiable. It may be difficult or costly to verify certain software requirements. For example, verification of the throughput requirement on the call center may necessitate the development of simulation software. Both the software requirements and software quality personnel must ensure that the requirements can be verified within the available resource constraints.
Requirements have other attributes in addition to the behavioral properties that they express. Common examples include a priority rating to enable trade-offs in the face of finite resources and a status value to enable project progress to be monitored. Typically, software requirements are uniquely identified so that they can be
subjected to software configuration control and managed over the entire software life cycle. [Kot00; Pfl01; Som05; Tha97]

1.2. Product and Process Requirements
A distinction can be drawn between product parameters and process parameters. Product parameters are requirements on software to be developed (for example, "The software shall verify that a student meets all prerequisites before he or she registers for a course.").
A process parameter is essentially a constraint on the development of the software (for example, "The software shall be written in Ada."). These are sometimes known as process requirements.
Some software requirements generate implicit process requirements. The choice of verification technique is one example. Another might be the use of particularly rigorous analysis techniques (such as formal specification methods) to reduce faults which can lead to inadequate reliability. Process requirements may also be imposed directly by the development organization, their customer, or a third party such as a safety regulator. [Kot00; Som97]

Figure 1 Breakdown of topics for the Software Requirements KA

1.3. Functional and Nonfunctional Requirements
Functional requirements describe the functions that the software is to execute; for example, formatting some text or modulating a signal. They are sometimes known as capabilities.
Nonfunctional requirements are the ones that act to constrain the solution. Nonfunctional requirements are sometimes known as constraints or quality requirements. They can be further classified according to whether they are performance requirements, maintainability requirements, safety requirements, reliability requirements, or one of many other types of software requirements. These topics are also discussed in the Software Quality KA. [Kot00; Som97]
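As a concrete illustration of the attributes discussed in topics 1.1 and 1.3, the following sketch shows one possible way to record a requirement as a uniquely identified item with a functional/nonfunctional classification, a status value, and a verification note. It is only a minimal sketch under stated assumptions: the field names, enumerations, and example requirements are invented for illustration and are not prescribed by the Guide or by any standard.

```python
from dataclasses import dataclass
from enum import Enum

class Kind(Enum):
    FUNCTIONAL = "functional"        # a capability the software must provide
    NONFUNCTIONAL = "nonfunctional"  # a constraint on the solution (performance, safety, ...)

class Status(Enum):
    PROPOSED = "proposed"
    APPROVED = "approved"
    IMPLEMENTED = "implemented"
    VERIFIED = "verified"

@dataclass
class Requirement:
    req_id: str        # unique identifier, so the item can be placed under configuration control
    statement: str     # the requirement text itself
    kind: Kind         # functional vs. nonfunctional classification (topic 1.3)
    status: Status     # status value used to monitor project progress (topic 1.1)
    verification: str  # how the requirement is expected to be verified

# Hypothetical examples, echoing the kinds of requirements quoted in this KA.
reqs = [
    Requirement("REQ-001",
                "The software shall verify that a student meets all prerequisites "
                "before he or she registers for a course.",
                Kind.FUNCTIONAL, Status.APPROVED, "acceptance test"),
    Requirement("REQ-002",
                "The call center software shall increase the center's throughput by 20%.",
                Kind.NONFUNCTIONAL, Status.PROPOSED, "simulation / measurement in operation"),
]

for r in reqs:
    print(r.req_id, r.kind.value, r.status.value)
```

Keeping requirements as uniquely identified records of this kind is what allows them to be placed under software configuration control and managed over the whole life cycle, as noted in topic 1.1.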
1.4. Emergent Properties
Some requirements represent emergent properties of software—that is, requirements which cannot be addressed by a single component, but which depend for their satisfaction on how all the software components interoperate. The throughput requirement for a call center would, for example, depend on how the telephone system, information system, and the operators all interacted under actual operating conditions. Emergent properties are crucially dependent on the system architecture. [Som05]

1.5. Quantifiable Requirements
Software requirements should be stated as clearly and as unambiguously as possible, and, where appropriate, quantitatively. It is important to avoid vague and unverifiable requirements which depend for their interpretation on subjective judgment ("the software shall be reliable"; "the software shall be user-friendly"). This is particularly important for nonfunctional requirements. Two examples of quantified requirements are the following: a call center's software must increase the center's throughput by 20%; and a system shall have a probability of generating a fatal error during any hour of operation of less than 1 × 10⁻⁸. The throughput requirement is at a very high level and will need to be used to derive a number of detailed requirements. The reliability requirement will tightly constrain the system architecture. [Dav93; Som05]
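To see what such a quantified reliability requirement implies, a short back-of-the-envelope calculation (added here only as an illustration; every number beyond the stated 10⁻⁸ figure is an assumption) translates the per-hour probability into an expected failure count under continuous operation:

```latex
% Expected number of fatal errors if the per-hour probability is p = 10^{-8}
% and the system runs continuously for one year (about 8760 hours):
\[
  E[\text{fatal errors per year}] \approx p \times 8760
  = 10^{-8} \times 8.76 \times 10^{3}
  = 8.76 \times 10^{-5}.
\]
% Over an assumed fleet of 1000 installations, the expected count is still only
\[
  1000 \times 8.76 \times 10^{-5} \approx 8.76 \times 10^{-2} \ \text{fatal errors per year},
\]
% which is why such a requirement tightly constrains both the architecture
% and the verification approach (simulation, reliability growth modeling, etc.).
```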
1.6. System Requirements and Software Requirements
In this topic, system means "an interacting combination of elements to accomplish a defined objective. These include hardware, software, firmware, people, information, techniques, facilities, services, and other support elements," as defined by the International Council on Systems Engineering. (INCOSE00)
System requirements are the requirements for the system as a whole. In a system containing software components, software requirements are derived from system requirements.
The literature on requirements sometimes calls system requirements "user requirements." The Guide defines "user requirements" in a restricted way, as the requirements of the system's customers or end-users. System requirements, by contrast, encompass user requirements, requirements of other stakeholders (such as regulatory authorities), and requirements without an identifiable human source.

2. Requirements Process
This section introduces the software requirements process, orienting the remaining five subareas and showing how the requirements process dovetails with the overall software engineering process. [Dav93; Som05]

2.1. Process Models
The objective of this topic is to provide an understanding that the requirements process
- Is not a discrete front-end activity of the software life cycle, but rather a process initiated at the beginning of a project and continuing to be refined throughout the life cycle
- Identifies software requirements as configuration items, and manages them using the same software configuration management practices as other products of the software life cycle processes
- Needs to be adapted to the organization and project context
In particular, the topic is concerned with how the activities of elicitation, analysis, specification, and validation are configured for different types of projects and constraints. The topic also includes activities which provide input into the requirements process, such as marketing and feasibility studies. [Kot00; Rob99; Som97; Som05]

2.2. Process Actors
This topic introduces the roles of the people who participate in the requirements process. This process is fundamentally interdisciplinary, and the requirements specialist needs to mediate between the domain of the stakeholder and that of software engineering. There are often many people involved besides the requirements specialist, each of whom has a stake in the software. The stakeholders will vary across projects, but always include users/operators and customers (who need not be the same). [Gog93]
Typical examples of software stakeholders include (but are not restricted to)
- Users: This group comprises those who will operate the software. It is often a heterogeneous group comprising people with different roles and requirements.
- Customers: This group comprises those who have commissioned the software or who represent the software's target market.
- Market analysts: A mass-market product will not have a commissioning customer, so marketing people are often needed to establish what the market needs and to act as proxy customers.
- Regulators: Many application domains such as banking and public transport are regulated. Software in these domains must comply with the requirements of the regulatory authorities.
- Software engineers: These individuals have a legitimate interest in profiting from developing the software by, for example, reusing components in other products. If, in this scenario, a customer of a particular product has specific requirements which compromise the potential for component reuse, the
software engineers must carefully weigh their own stake against those of the customer.
It will not be possible to perfectly satisfy the requirements of every stakeholder, and it is the software engineer's job to negotiate trade-offs which are both acceptable to the principal stakeholders and within budgetary, technical, regulatory, and other constraints. A prerequisite for this is that all the stakeholders be identified, the nature of their "stake" analyzed, and their requirements elicited. [Dav93; Kot00; Rob99; Som97; You01]

2.3. Process Support and Management
This topic introduces the project management resources required and consumed by the requirements process. It establishes the context for the first subarea (Initiation and Scope Definition) of the Software Engineering Management KA. Its principal purpose is to make the link between the process activities identified in 2.1 and the issues of cost, human resources, training, and tools. [Rob99; Som97; You01]

2.4. Process Quality and Improvement
This topic is concerned with the assessment of the quality and improvement of the requirements process. Its purpose is to emphasize the key role the requirements process plays in terms of the cost and timeliness of a software product, and of the customer's satisfaction with it [Som97]. It will help to orient the requirements process with quality standards and process improvement models for software and systems. Process quality and improvement is closely related to both the Software Quality KA and the Software Engineering Process KA. Of particular interest are issues of software quality attributes and measurement, and software process definition. This topic covers
- Requirements process coverage by process improvement standards and models
- Requirements process measures and benchmarking
- Improvement planning and implementation
[Kot00; Som97; You01]

3. Requirements Elicitation
[Dav93; Gog93; Lou95; Pfl01]
Requirements elicitation is concerned with where software requirements come from and how the software engineer can collect them. It is the first stage in building an understanding of the problem the software is required to solve. It is fundamentally a human activity, and is where the stakeholders are identified and relationships established between the development team and the customer. It is variously termed "requirements capture," "requirements discovery," and "requirements acquisition."
One of the fundamental tenets of good software engineering is that there be good communication between software users and software engineers. Before development begins, requirements specialists may form the conduit for this communication. They must mediate between the domain of the software users (and other stakeholders) and the technical world of the software engineer.
3.1. Requirements Sources
[Dav93; Gog93; Pfl01]
Requirements have many sources in typical software, and it is essential that all potential sources be identified and evaluated for their impact on it. This topic is designed to promote awareness of the various sources of software requirements and of the frameworks for managing them. The main points covered are
- Goals. The term goal (sometimes called "business concern" or "critical success factor") refers to the overall, high-level objectives of the software. Goals provide the motivation for the software, but are often vaguely formulated. Software engineers need to pay particular attention to assessing the value (relative to priority) and cost of goals. A feasibility study is a relatively low-cost way of doing this. [Lou95]
- Domain knowledge. The software engineer needs to acquire, or have available, knowledge about the application domain. This enables them to infer tacit knowledge that the stakeholders do not articulate, assess the trade-offs that will be necessary between conflicting requirements, and, sometimes, to act as a "user" champion.
- Stakeholders (see topic 2.2 Process Actors). Much software has proved unsatisfactory because it has stressed the requirements of one group of stakeholders at the expense of those of others. Hence, software is delivered which is difficult to use or which subverts the cultural or political structures of the customer organization. The software engineer needs to identify, represent, and manage the "viewpoints" of many different types of stakeholders. [Kot00]
- The operational environment. Requirements will be derived from the environment in which the software will be executed. These may be, for example, timing constraints in real-time software or interoperability constraints in an office environment. These must be actively sought out, because they can greatly affect software feasibility and cost, and restrict design choices. [Tha97]
- The organizational environment. Software is often required to support a business process, the selection of which may be conditioned by the structure, culture, and internal politics of the organization. The software engineer needs to be sensitive to these, since, in general, new software should not force unplanned change on the business process.

3.2. Elicitation Techniques
[Dav93; Kot00; Lou95; Pfl01]
Once the requirements sources have been identified, the software engineer can start eliciting requirements from them. This topic concentrates on techniques for getting human stakeholders to articulate their requirements. It is a very difficult area, and the software engineer needs to be sensitized to the fact that (for example) users may have difficulty describing their tasks, may leave important information unstated, or may be unwilling or unable to cooperate. It is particularly important to understand that elicitation is not a passive activity, and that, even if cooperative and articulate stakeholders are available, the software engineer has to work hard to elicit the right information. A number of techniques exist for doing this, the principal ones being [Gog93]
- Interviews, a "traditional" means of eliciting requirements. It is important to understand the advantages and limitations of interviews and how they should be conducted.
- Scenarios, a valuable means for providing context to the elicitation of user requirements. They allow the software engineer to provide a framework for questions about user tasks by permitting "what if" and "how is this done" questions to be asked. The most common type of scenario is the use case. There is a link here to topic 4.2 (Conceptual Modeling) because scenario notations such as use cases and diagrams are common in modeling software.
- Prototypes, a valuable tool for clarifying unclear requirements. They can act in a similar way to scenarios by providing users with a context within which they can better understand what information they need to provide. There is a wide range of prototyping techniques, from paper mock-ups of screen designs to beta-test versions of software products, and a strong overlap of their use for requirements elicitation and the use of prototypes for requirements validation (see topic 6.2 Prototyping).
- Facilitated meetings. The purpose of these is to try to achieve a summative effect whereby a group of people can bring more insight into their software requirements than by working individually. They can brainstorm and refine ideas which may be difficult to bring to the surface using interviews. Another advantage is that conflicting requirements surface early on in a way that lets the stakeholders recognize where there is conflict. When it works well, this technique may result in a richer and more consistent set of requirements than might otherwise be achievable. However, meetings need to be handled carefully (hence the need for a facilitator) to prevent a situation from occurring where the critical abilities of the team are eroded by group loyalty, or the requirements reflecting the concerns of a few outspoken (and perhaps senior) people are favored to the detriment of others.
- Observation. The importance of software context within the organizational environment has led to the adaptation of observational techniques for requirements elicitation. Software engineers learn about user tasks by immersing themselves in the
environment and observing how users interact with their software and with each other. These techniques are relatively expensive, but they are instructive because they illustrate that many user tasks and business processes are too subtle and complex for their actors to describe easily.

4. Requirements Analysis
[Som05]
This topic is concerned with the process of analyzing requirements to
- Detect and resolve conflicts between requirements
- Discover the bounds of the software and how it must interact with its environment
- Elaborate system requirements to derive software requirements
The traditional view of requirements analysis has been that it be reduced to conceptual modeling using one of a number of analysis methods, such as the Structured Analysis and Design Technique (SADT). While conceptual modeling is important, we include the classification of requirements to help inform trade-offs between requirements (requirements classification) and the process of establishing these trade-offs (requirements negotiation). [Dav93]
Care must be taken to describe requirements precisely enough to enable the requirements to be validated, their implementation to be verified, and their costs to be estimated.

4.1. Requirements Classification
[Dav93; Kot00; Som05]
Requirements can be classified on a number of dimensions. Examples include
- Whether the requirement is functional or nonfunctional (see topic 1.3 Functional and Nonfunctional Requirements).
- Whether the requirement is derived from one or more high-level requirements or an emergent property (see topic 1.4 Emergent Properties), or is being imposed directly on the software by a stakeholder or some other source.
- Whether the requirement is on the product or the process. Requirements on the process can constrain the choice of contractor, the software engineering process to be adopted, or the standards to be adhered to.
- The requirement priority. In general, the higher the priority, the more essential the requirement is for meeting the overall goals of the software. Often classified on a fixed-point scale such as mandatory, highly desirable, desirable, or optional, the priority often has to be balanced against the cost of development and implementation.
- The scope of the requirement. Scope refers to the extent to which a requirement affects the software and software components. Some requirements, particularly certain nonfunctional ones, have a global scope in that their satisfaction cannot be allocated to a discrete component. Hence, a requirement with global scope may strongly affect the software architecture and the design of many components, whereas one with a narrow scope may offer a number of design choices and have little impact on the satisfaction of other requirements.
- Volatility/stability. Some requirements will change during the life cycle of the software, and even during the development process itself. It is useful if some estimate of the likelihood that a requirement will change can be made. For example, in a banking application, requirements for functions to calculate and credit interest to customers' accounts are likely to be more stable than a requirement to support a particular kind of tax-free account. The former reflect a fundamental feature of the banking domain (that accounts can earn interest), while the latter may be rendered obsolete by a change to government legislation. Flagging potentially volatile requirements can help the software engineer establish a design which is more tolerant of change.
Other classifications may be appropriate, depending upon the organization's normal practice and the application itself.
There is a strong overlap between requirements classification and requirements attributes (see topic 7.3 Requirements Attributes).
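As an illustration of how the classification dimensions above can be put to work, the short sketch below flags requirements whose combination of priority, volatility, and scope deserves particular design attention. The attribute names and the fixed-point priority scale are assumptions made for the example; the Guide does not prescribe any particular scheme.

```python
from dataclasses import dataclass

@dataclass
class ClassifiedRequirement:
    req_id: str
    priority: str       # e.g. "mandatory", "highly desirable", "desirable", "optional"
    volatile: bool      # flagged as likely to change during the life cycle
    global_scope: bool  # satisfaction cannot be allocated to a single component

def needs_design_attention(reqs):
    """Return requirements that are both essential and likely to change or global in scope,
    since these argue for a design that is more tolerant of change."""
    return [r for r in reqs
            if r.priority == "mandatory" and (r.volatile or r.global_scope)]

reqs = [
    ClassifiedRequirement("REQ-010", "mandatory", volatile=False, global_scope=False),
    ClassifiedRequirement("REQ-011", "mandatory", volatile=True,  global_scope=False),
    ClassifiedRequirement("REQ-012", "desirable", volatile=True,  global_scope=True),
]

for r in needs_design_attention(reqs):
    print(r.req_id)  # REQ-011
```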
4.2. Conceptual Modeling
[Dav93; Kot00; Som05]
The development of models of a real-world problem is key to software requirements analysis. Their purpose is to aid in understanding the problem, rather than to initiate design of the solution. Hence, conceptual models comprise models of entities from the problem domain configured to reflect their real-world relationships and dependencies.
Several kinds of models can be developed. These include data and control flows, state models, event traces, user interactions, object models, data models, and many others. The factors that influence the choice of model include
- The nature of the problem. Some types of software demand that certain aspects be analyzed particularly rigorously. For example, control flow and state models are likely to be more important for real-time software than for management information software, while it would usually be the opposite for data models.
- The expertise of the software engineer. It is often more productive to adopt a modeling notation or
method with which the software engineer has experience.
- The process requirements of the customer. Customers may impose their favored notation or method, or prohibit any with which they are unfamiliar. This factor can conflict with the previous factor.
- The availability of methods and tools. Notations or methods which are poorly supported by training and tools may not achieve widespread acceptance even if they are suited to particular types of problems.
Note that, in almost all cases, it is useful to start by building a model of the software context. The software context provides a connection between the intended software and its external environment. This is crucial to understanding the software's context in its operational environment and to identifying its interfaces with the environment.
The issue of modeling is tightly coupled with that of methods. For practical purposes, a method is a notation (or set of notations) supported by a process which guides the application of the notations. There is little empirical evidence to support claims for the superiority of one notation over another. However, the widespread acceptance of a particular method or notation can lead to beneficial industry-wide pooling of skills and knowledge. This is currently the situation with the UML (Unified Modeling Language). (UML04)
Formal modeling, using notations based on discrete mathematics and traceable to logical reasoning, has made an impact in some specialized domains. Formal notations may be imposed by customers or standards or may offer compelling advantages to the analysis of certain critical functions or components.
This topic does not seek to "teach" a particular modeling style or notation but rather provides guidance on the purpose and intent of modeling.
Two standards provide notations which may be useful in performing conceptual modeling: IEEE Std 1320.1, IDEF0, for functional modeling; and IEEE Std 1320.2, IDEF1X97 (IDEFObject), for information modeling.
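Since the Guide deliberately stops short of teaching any particular notation, the following minimal sketch is offered only to illustrate what a state model captures. The entity (a hypothetical course-registration request), its states, and its transitions are invented for the example and are not taken from the Guide or from any standard.

```python
# Minimal illustrative state model for a hypothetical course-registration request.
# Allowed transitions are listed explicitly; anything else is rejected.
TRANSITIONS = {
    ("submitted", "validate"): "validated",
    ("validated", "approve"):  "registered",
    ("validated", "reject"):   "rejected",
}

def next_state(state: str, event: str) -> str:
    """Return the state reached when `event` occurs in `state`."""
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError(f"event '{event}' is not allowed in state '{state}'")

# Walking the model: submitted -> validated -> registered
state = "submitted"
for event in ("validate", "approve"):
    state = next_state(state, event)
print(state)  # registered
```

A static check over such a model (for instance, confirming that every state is reachable and every event is handled) is the kind of model validation discussed later in topic 6.3.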
4.3. Architectural Design and Requirements Allocation
[Dav93; Som05]
At some point, the architecture of the solution must be derived. Architectural design is the point at which the requirements process overlaps with software or systems design and illustrates how impossible it is to cleanly decouple the two tasks. [Som01] This topic is closely related to the Software Structure and Architecture subarea in the Software Design KA. In many cases, the software engineer acts as software architect because the process of analyzing and elaborating the requirements demands that the components that will be responsible for satisfying the requirements be identified. This is requirements allocation—the assignment, to components, of the responsibility for satisfying requirements.
Allocation is important to permit detailed analysis of requirements. Hence, for example, once a set of requirements has been allocated to a component, the individual requirements can be further analyzed to discover further requirements on how the component needs to interact with other components in order to satisfy the allocated requirements. In large projects, allocation stimulates a new round of analysis for each subsystem. As an example, requirements for a particular braking performance for a car (braking distance, safety in poor driving conditions, smoothness of application, pedal pressure required, and so on) may be allocated to the braking hardware (mechanical and hydraulic assemblies) and an anti-lock braking system (ABS). Only when a requirement for an anti-lock braking system has been identified, and the requirements allocated to it, can the capabilities of the ABS, the braking hardware, and emergent properties (such as the car weight) be used to identify the detailed ABS software requirements.
Architectural design is closely identified with conceptual modeling. The mapping from real-world domain entities to software components is not always obvious, so architectural design is identified as a separate topic. The requirements of notations and methods are broadly the same for both conceptual modeling and architectural design.
IEEE Std 1471-2000, Recommended Practice for Architectural Description of Software-Intensive Systems, suggests a multiple-viewpoint approach to describing the architecture of systems and their software items. (IEEE1471-00)

4.4. Requirements Negotiation
Another term commonly used for this sub-topic is "conflict resolution." This concerns resolving problems with requirements where conflicts occur between two stakeholders requiring mutually incompatible features, between requirements and resources, or between functional and non-functional requirements, for example. [Kot00; Som97] In most cases, it is unwise for the software engineer to make a unilateral decision, and so it becomes necessary to consult with the stakeholder(s) to reach a consensus on an appropriate trade-off. It is often important for contractual reasons that such decisions be traceable back to the customer. We have classified this as a software requirements analysis topic because problems emerge as the result of analysis. However, a strong case can also be made for considering it a requirements validation topic.

5. Requirements Specification
For most engineering professions, the term "specification" refers to the assignment of numerical values or limits to a
product's design goals. (Vin90) Typical physical systems have a relatively small number of such values. Typical software has a large number of requirements, and the emphasis is shared between performing the numerical quantification and managing the complexity of interaction among the large number of requirements. So, in software engineering jargon, "software requirements specification" typically refers to the production of a document, or its electronic equivalent, which can be systematically reviewed, evaluated, and approved. For complex systems, particularly those involving substantial non-software components, as many as three different types of documents are produced: system definition, system requirements, and software requirements. For simple software products, only the third of these is required. All three documents are described here, with the understanding that they may be combined as appropriate. A description of systems engineering can be found in Chapter 12, Related Disciplines of Software Engineering.

5.1. The System Definition Document
This document (sometimes known as the user requirements document or concept of operations) records the system requirements. It defines the high-level system requirements from the domain perspective. Its readership includes representatives of the system users/customers (marketing may play these roles for market-driven software), so its content must be couched in terms of the domain. The document lists the system requirements along with background information about the overall objectives for the system, its target environment, and a statement of the constraints, assumptions, and non-functional requirements. It may include conceptual models designed to illustrate the system context, usage scenarios, and the principal domain entities, as well as data, information, and workflows. IEEE Std 1362, Concept of Operations Document, provides advice on the preparation and content of such a document. (IEEE1362-98)

5.2. System Requirements Specification
[Dav93; Kot00; Rob99; Tha97]
Developers of systems with substantial software and non-software components, a modern airliner, for example, often separate the description of system requirements from the description of software requirements. In this view, system requirements are specified, the software requirements are derived from the system requirements, and then the requirements for the software components are specified. Strictly speaking, system requirements specification is a systems engineering activity and falls outside the scope of this Guide. IEEE Std 1233 is a guide for developing system requirements. (IEEE1233-98)

5.3. Software Requirements Specification
[Kot00; Rob99]
Software requirements specification establishes the basis for agreement between customers and contractors or suppliers (in market-driven projects, these roles may be played by the marketing and development divisions) on what the software product is to do, as well as what it is not expected to do. For non-technical readers, the software requirements specification document is often accompanied by a software requirements definition document.
Software requirements specification permits a rigorous assessment of requirements before design can begin and reduces later redesign. It should also provide a realistic basis for estimating product costs, risks, and schedules. Organizations can also use a software requirements specification document to develop their own validation and verification plans more productively.
Software requirements specification provides an informed basis for transferring a software product to new users or new machines. Finally, it can provide a basis for software enhancement.
Software requirements are often written in natural language, but, in software requirements specification, this may be supplemented by formal or semi-formal descriptions. Selection of appropriate notations permits particular requirements and aspects of the software architecture to be described more precisely and concisely than natural language. The general rule is that notations should be used which allow the requirements to be described as precisely as possible. This is particularly crucial for safety-critical and certain other types of dependable software. However, the choice of notation is often constrained by the training, skills, and preferences of the document's authors and readers.
A number of quality indicators have been developed which can be used to relate the quality of a software requirements specification to other project variables such as cost, acceptance, performance, schedule, reproducibility, etc. Quality indicators for individual software requirements specification statements include imperatives, directives, weak phrases, options, and continuances. Indicators for the entire software requirements specification document include size, readability, specification depth, and text structure. [Dav93; Tha97] (Ros98)
IEEE has a standard, IEEE Std 830 [IEEE830-98], for the production and content of the software requirements specification. Also, IEEE Std 1465 (similar to ISO/IEC 12119) is a standard treating quality requirements in software packages. (IEEE1465-98)
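The indicator categories above (imperatives, weak phrases, options, and so on) lend themselves to simple automated checks over an SRS. The sketch below only illustrates the idea; the particular word lists and example statements are assumptions chosen for this example, not indicator sets defined by the Guide or by the cited sources.

```python
import re

# Illustrative (assumed) indicator word lists for individual SRS statements.
WEAK_PHRASES = ["as appropriate", "if practical", "to the extent possible", "user-friendly"]
OPTIONS = ["may", "optionally", "at the discretion of"]
IMPERATIVES = ["shall", "must", "will"]

def statement_indicators(statement: str) -> dict:
    """Count occurrences of each indicator category in one requirement statement."""
    text = statement.lower()

    def count(words):
        return sum(len(re.findall(r"\b" + re.escape(w) + r"\b", text)) for w in words)

    return {
        "imperatives": count(IMPERATIVES),
        "weak_phrases": count(WEAK_PHRASES),
        "options": count(OPTIONS),
    }

srs = [
    "The software shall verify prerequisites before registration.",
    "The interface should be user-friendly and respond as appropriate.",
]
for s in srs:
    print(statement_indicators(s), "-", s)
```

A high count of weak phrases or options relative to imperatives would prompt the kind of rework and review discussed in the validation topics that follow.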
6. Requirements Validation
[Dav93]
The requirements documents may be subject to validation and verification procedures. The requirements may be validated to ensure that the software engineer has understood the requirements, and it is also important to verify that a requirements document conforms to company
IEEE has a standard, IEEE Std 830 [IEEE830-98], for the production and content of the software requirements specification. Also, IEEE 1465 (similar to ISO/IEC 12119) is a standard treating quality requirements in software packages. (IEEE1465-98)

6. Requirements Validation
[Dav93]

The requirements documents may be subject to validation and verification procedures. The requirements may be validated to ensure that the software engineer has understood the requirements, and it is also important to verify that a requirements document conforms to company standards, and that it is understandable, consistent, and complete. Formal notations offer the important advantage of permitting the last two properties to be proven (in a restricted sense, at least). Different stakeholders, including representatives of the customer and developer, should review the document(s). Requirements documents are subject to the same software configuration management practices as the other deliverables of the software life cycle processes. (Bry94, Ros98)

It is normal to explicitly schedule one or more points in the requirements process where the requirements are validated. The aim is to pick up any problems before resources are committed to addressing the requirements. Requirements validation is concerned with the process of examining the requirements document to ensure that it defines the right software (that is, the software that the users expect). [Kot00]

6.1. Requirements Reviews
[Kot00; Som05; Tha97]

Perhaps the most common means of validation is by inspection or reviews of the requirements document(s). A group of reviewers is assigned a brief to look for errors, mistaken assumptions, lack of clarity, and deviation from standard practice. The composition of the group that conducts the review is important (at least one representative of the customer should be included for a customer-driven project, for example), and it may help to provide guidance on what to look for in the form of checklists.

Reviews may be constituted on completion of the system definition document, the system specification document, the software requirements specification document, the baseline specification for a new release, or at any other step in the process. IEEE Std 1028 provides guidance on conducting such reviews. (IEEE1028-97) Reviews are also covered in the Software Quality KA, topic 2.3 Reviews and Audits.

6.2. Prototyping
[Dav93; Kot00; Som05; Tha97]

Prototyping is commonly a means for validating the software engineer's interpretation of the software requirements, as well as for eliciting new requirements. As with elicitation, there is a range of prototyping techniques and a number of points in the process where prototype validation may be appropriate. The advantage of prototypes is that they can make it easier to interpret the software engineer's assumptions and, where needed, give useful feedback on why they are wrong. For example, the dynamic behavior of a user interface can be better understood through an animated prototype than through textual description or graphical models. There are also disadvantages, however. These include the danger of users' attention being distracted from the core underlying functionality by cosmetic issues or quality problems with the prototype. For this reason, several people recommend prototypes which avoid software, such as flip-chart-based mockups. Prototypes may be costly to develop. However, if they avoid the wastage of resources caused by trying to satisfy erroneous requirements, their cost can be more easily justified.

6.3. Model Validation
[Dav93; Kot00; Tha97]

It is typically necessary to validate the quality of the models developed during analysis. For example, in object models, it is useful to perform a static analysis to verify that communication paths exist between objects which, in the stakeholders' domain, exchange data. If formal specification notations are used, it is possible to use formal reasoning to prove specification properties.
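A lightweight way to automate the static check described above is to compare the pairs of objects that exchange data in the domain against the communication paths present in the object model. The graph representation, class names, and helper function below are hypothetical, intended only to sketch the idea.

```python
# Hypothetical object model: adjacency sets of communication paths between classes.
model_paths = {
    "Order": {"Customer", "Inventory"},
    "Customer": {"Order"},
    "Inventory": {"Order"},
    "Invoice": set(),
}

# Pairs that, in the stakeholders' domain, must exchange data.
domain_exchanges = [("Order", "Customer"), ("Invoice", "Customer")]

def missing_paths(paths, exchanges):
    """Return domain data exchanges with no corresponding path in the model."""
    missing = []
    for a, b in exchanges:
        if b not in paths.get(a, set()) and a not in paths.get(b, set()):
            missing.append((a, b))
    return missing

print(missing_paths(model_paths, domain_exchanges))  # [('Invoice', 'Customer')]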
6.4. Acceptance Tests
[Dav93]

An essential property of a software requirement is that it should be possible to validate that the finished product satisfies it. Requirements which cannot be validated are really just "wishes." An important task is therefore planning how to verify each requirement. In most cases, designing acceptance tests does this.

Identifying and designing acceptance tests may be difficult for non-functional requirements (see topic 1.3 Functional and Non-functional Requirements). To be validated, they must first be analyzed to the point where they can be expressed quantitatively.

Additional information can be found in the Software Testing KA, sub-topic 2.2.4 Conformance Testing.
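For instance, a vague statement such as "the system shall respond quickly" might be refined to "95% of queries shall complete within 2 seconds under a given load," which can then be checked by an automated acceptance test. The sketch below assumes the response times come from some load-test run; the helper name and thresholds are invented for the example, not a prescribed test design.

```python
import math

def acceptance_test_response_time(response_times_s, percentile=0.95, limit_s=2.0):
    """Pass if the given percentile of observed response times is within the limit."""
    ordered = sorted(response_times_s)
    index = math.ceil(percentile * len(ordered)) - 1  # nearest-rank percentile
    return ordered[index] <= limit_s

# In practice, response_times_s would come from a load-test harness.
sample = [0.8, 1.1, 1.4, 1.9, 2.5]
assert acceptance_test_response_time(sample) is False  # 95th-percentile time (2.5 s) exceeds 2 s
```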
7. Practical Considerations

The first level of decomposition of subareas presented in this KA may seem to describe a linear sequence of activities. This is a simplified view of the process. [Dav93]

The requirements process spans the whole software life cycle. Change management and the maintenance of the requirements in a state which accurately mirrors the software to be built, or that has been built, are key to the success of the software engineering process. [Kot00; Lou95]

Not every organization has a culture of documenting and managing requirements. It is frequent in dynamic start-up companies, driven by a strong "product vision" and limited resources, to view requirements documentation as an unnecessary overhead. Most often, however, as these companies expand, as their customer base grows, and as their product starts to evolve, they discover that they need to recover the requirements that motivated product features in order to assess the impact of proposed changes. Hence, requirements documentation and change management are key to the success of any requirements process.

7.1. Iterative Nature of the Requirements Process
[Kot00; You01]

There is general pressure in the software industry for ever shorter development cycles, and this is particularly pronounced in highly competitive market-driven sectors. Moreover, most projects are constrained in some way by their environment, and many are upgrades to, or revisions of, existing software where the architecture is a given. In practice, therefore, it is almost always impractical to implement the requirements process as a linear, deterministic process in which software requirements are elicited from the stakeholders, baselined, allocated, and handed over to the software development team. It is certainly a myth that the requirements for large software projects are ever perfectly understood or perfectly specified. [Som97]

Instead, requirements typically iterate towards a level of quality and detail which is sufficient to permit design and procurement decisions to be made. In some projects, this may result in the requirements being baselined before all their properties are fully understood. This risks expensive rework if problems emerge late in the software engineering process. However, software engineers are necessarily constrained by project management plans and must therefore take steps to ensure that the "quality" of the requirements is as high as possible given the available resources. They should, for example, make explicit any assumptions which underpin the requirements, as well as any known problems.

In almost all cases, requirements understanding continues to evolve as design and development proceeds. This often leads to the revision of requirements late in the life cycle. Perhaps the most crucial point in understanding requirements engineering is that a significant proportion of the requirements will change. This is sometimes due to errors in the analysis, but it is frequently an inevitable consequence of change in the "environment": for example, the customer's operating or business environment, or the market into which software must sell. Whatever the cause, it is important to recognize the inevitability of change and take steps to mitigate its effects. Change has to be managed by ensuring that proposed changes go through a defined review and approval process, and by applying careful requirements tracing, impact analysis, and software configuration management (see the Software Configuration Management KA). Hence, the requirements process is not merely a front-end task in software development, but spans the whole software life cycle. In a typical project, the software requirements activities evolve over time from elicitation to change management.

7.2. Change Management
[Kot00]

Change management is central to the management of requirements. This topic describes the role of change management, the procedures that need to be in place, and the analysis that should be applied to proposed changes. It has strong links to the Software Configuration Management KA.

7.3. Requirements Attributes
[Kot00]

Requirements should consist not only of a specification of what is required, but also of ancillary information which helps manage and interpret the requirements. This should include the various classification dimensions of the requirement (see topic 4.1 Requirements Classification) and the verification method or acceptance test plan. It may also include additional information such as a summary rationale for each requirement, the source of each requirement, and a change history. The most important requirements attribute, however, is an identifier which allows the requirements to be uniquely and unambiguously identified.

7.4. Requirements Tracing
[Kot00]

Requirements tracing is concerned with recovering the source of requirements and predicting the effects of requirements. Tracing is fundamental to performing impact analysis when requirements change. A requirement should be traceable backwards to the requirements and stakeholders which motivated it (from a software requirement back to the system requirement(s) that it helps satisfy, for example). Conversely, a requirement should be traceable forwards into the requirements and design entities that satisfy it (for example, from a system requirement into the software requirements that have been elaborated from it, and on into the code modules that implement it). The requirements tracing for a typical project will form a complex directed acyclic graph (DAG) of requirements.

7.5. Measuring Requirements

As a practical matter, it is typically useful to have some concept of the "volume" of the requirements for a particular software product. This number is useful in evaluating the "size" of a change in requirements, in estimating the cost of a development or maintenance task, or simply for use as the denominator in other measurements. Functional Size Measurement (FSM) is a technique for evaluating the size of a body of functional requirements. IEEE Std 14143.1 defines the concept of FSM. [IEEE14143.1-00] Standards from ISO/IEC and other sources describe particular FSM methods. Additional information on size measurement and standards will be found in the Software Engineering Process KA.
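To make topics 7.3 to 7.5 concrete, the sketch below stores requirements as records carrying an identifier, source, rationale, and trace links, walks the resulting DAG to answer a simple impact-analysis question, and uses the number of records as a crude volume figure. The record fields and requirement IDs are invented for illustration; real projects typically rely on a requirements management tool and a standardized FSM method for this.

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str                      # unique, unambiguous identifier (topic 7.3)
    text: str
    source: str = "unspecified"      # stakeholder or document that motivated it
    rationale: str = ""
    derived_from: list = field(default_factory=list)  # backward trace links (topic 7.4)

reqs = {
    "SYS-1": Requirement("SYS-1", "The system shall support online payment.", source="customer"),
    "SW-10": Requirement("SW-10", "The payment service shall validate card numbers.",
                         derived_from=["SYS-1"]),
    "SW-11": Requirement("SW-11", "Payment records shall be retained for 7 years.",
                         derived_from=["SYS-1"]),
}

def forward_trace(req_id, requirements):
    """Return requirements elaborated (directly or indirectly) from req_id: crude impact analysis."""
    impacted, frontier = set(), {req_id}
    while frontier:
        frontier = {r.req_id for r in requirements.values()
                    if set(r.derived_from) & frontier} - impacted
        impacted |= frontier
    return impacted

print(forward_trace("SYS-1", reqs))  # {'SW-10', 'SW-11'}
print("volume:", len(reqs))          # crude requirements count, cf. topic 7.5
```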
MATRIX OF TOPICS VS. REFERENCE MATERIAL

(The matrix cross-references each topic in the breakdown, from 1.1 Definition of a Software Requirement through 7.5 Measuring Requirements, against the relevant chapters and sections of [Dav93], [Gog93], [IEEE830-98], [IEEE14143.1-00], [Kot00], [Lou95], [Pfl01], [Rob99], [Som97], [Som05], [Tha97], and [You01].)
RECOMMENDED REFERENCES FOR SOFTWARE REQUIREMENTS

[Dav93] A.M. Davis, Software Requirements: Objects, Functions and States, Prentice Hall, 1993.
[Gog93] J. Goguen and C. Linde, "Techniques for Requirements Elicitation," presented at International Symposium on Requirements Engineering, 1993.
[IEEE830-98] IEEE Std 830-1998, IEEE Recommended Practice for Software Requirements Specifications, IEEE, 1998.
[IEEE14143.1-00] IEEE Std 14143.1-2000//ISO/IEC 14143-1:1998, Information Technology—Software Measurement—Functional Size Measurement—Part 1: Definitions of Concepts, IEEE, 2000.
[Kot00] G. Kotonya and I. Sommerville, Requirements Engineering: Processes and Techniques, John Wiley & Sons, 2000.
[Lou95] P. Loucopoulos and V. Karakostas, Systems Requirements Engineering, McGraw-Hill, 1995.
[Pfl01] S.L. Pfleeger, Software Engineering: Theory and Practice, second ed., Prentice Hall, 2001, Chap. 4.
[Rob99] S. Robertson and J. Robertson, Mastering the Requirements Process, Addison-Wesley, 1999.
[Som97] I. Sommerville and P. Sawyer, Requirements Engineering: A Good Practice Guide, John Wiley & Sons, 1997, Chap. 1-2.
[Som05] I. Sommerville, Software Engineering, seventh ed., Addison-Wesley, 2005.
[Tha97] R.H. Thayer and M. Dorfman, eds., Software Requirements Engineering, IEEE Computer Society Press, 1997, pp. 176-205, 389-404.
[You01] R.R. Young, Effective Requirements Practices, Addison-Wesley, 2001.
APPENDIX A. LIST OF FURTHER READINGS

(Ale02) I. Alexander and R. Stevens, Writing Better Requirements, Addison-Wesley, 2002.
(Ard97) M. Ardis, "Formal Methods for Telecommunication System Requirements: A Survey of Standardized Languages," Annals of Software Engineering, vol. 3, 1997.
(Ber97) V. Berzins et al., "A Requirements Evolution Model for Computer Aided Prototyping," presented at Ninth IEEE International Conference on Software Engineering and Knowledge Engineering, Knowledge Systems Institute, 1997.
(Bey95) H. Beyer and K. Holtzblatt, "Apprenticing with the Customer," Communications of the ACM, vol. 38, iss. 5, May 1995, pp. 45-52.
(Bru95) G. Bruno and R. Agarwal, "Validating Software Requirements Using Operational Models," presented at Second Symposium on Software Quality Techniques and Acquisition Criteria, 1995.
(Bry94) E. Bryne, "IEEE Standard 830: Recommended Practice for Software Requirements Specification," presented at IEEE International Conference on Requirements Engineering, 1994.
(Buc94) G. Bucci et al., "An Object-Oriented Dual Language for Specifying Reactive Systems," presented at IEEE International Conference on Requirements Engineering, 1994.
(Bus95) D. Bustard and P. Lundy, "Enhancing Soft Systems Analysis with Formal Modeling," presented at Second International Symposium on Requirements Engineering, 1995.
(Che94) M. Chechik and J. Gannon, "Automated Verification of Requirements Implementation," presented at the International Symposium on Software Testing and Analysis, special issue, 1994.
(Chu95) L. Chung and B. Nixon, "Dealing with Non-Functional Requirements: Three Experimental Studies of a Process-Oriented Approach," presented at Seventeenth IEEE International Conference on Software Engineering, 1995.
(Cia97) P. Ciancarini et al., "Engineering Formal Requirements: An Analysis and Testing Method for Z Documents," Annals of Software Engineering, vol. 3, 1997.
(Cre94) R. Crespo, "We Need to Identify the Requirements of the Statements of Non-Functional Requirements," presented at International Workshop on Requirements Engineering: Foundations of Software Quality, 1994.
(Cur94) P. Curran et al., "BORIS-R Specification of the Requirements of a Large-Scale Software Intensive System," presented at Requirements Elicitation for Software-Based Systems, 1994.
(Dar97) R. Darimont and J. Souquieres, "Reusing Operational Requirements: A Process-Oriented Approach," presented at IEEE International Symposium on Requirements Engineering, 1997.
(Dav94) A. Davis and P. Hsia, "Giving Voice to Requirements Engineering: Guest Editors' Introduction," IEEE Software, vol. 11, iss. 2, March 1994, pp. 12-16.
(Def94) J. DeFoe, "Requirements Engineering Technology in Industrial Education," presented at IEEE International Conference on Requirements Engineering, 1994.
(Dem97) E. Demirors, "A Blackboard Framework for Supporting Teams in Software Development," presented at Ninth IEEE International Conference on Software Engineering and Knowledge Engineering, Knowledge Systems Institute, 1997.
(Die95) M. Diepstraten, "Command and Control System Requirements Analysis and System Requirements Specification for a Tactical System," presented at First IEEE International Conference on Engineering of Complex Computer Systems, 1995.
(Dob94) J. Dobson and R. Strens, "Organizational Requirements Definition for Information Technology," presented at IEEE International Conference on Requirements Engineering, 1994.
(Duf95) D. Duffy et al., "A Framework for Requirements Analysis Using Automated Reasoning," presented at Seventh International Conference on Advanced Information Systems Engineering, 1995.
(Eas95) S. Easterbrook and B. Nuseibeh, "Managing Inconsistencies in an Evolving Specification," presented at Second International Symposium on Requirements Engineering, 1995.
(Edw95) M. Edwards et al., "RECAP: A Requirements Elicitation, Capture, and Analysis Process Prototype Tool for Large Complex Systems," presented at First IEEE International Conference on Engineering of Complex Computer Systems, 1995.
(ElE95) K. El-Emam and N. Madhavji, "Requirements Engineering Practices in Information Systems Development: A Multiple Case Study," presented at Second International Symposium on Requirements Engineering, 1995.
(Fai97) R. Fairley and R. Thayer, "The Concept of Operations: The Bridge from Operational Requirements to Technical Specifications," Annals of Software Engineering, vol. 3, 1997.
(Fic95) S. Fickas and M. Feather, "Requirements Monitoring in Dynamic Environments," presented at Second International Symposium on Requirements Engineering, 1995.
(Fie95) R. Fields et al., "A Task-Centered Approach to Analyzing Human Error Tolerance Requirements," presented at Second International Symposium on Requirements Engineering, 1995.
(Gha94) J. Ghajar-Dowlatshahi and A. Varnekar, "Rapid Prototyping in Requirements Specification Phase of Software Systems," presented at Fourth International Symposium on Systems Engineering, National Council on Systems Engineering, 1994.
(Gib95) M. Gibson, "Domain Knowledge Reuse During Requirements Engineering," presented at Seventh International Conference on Advanced Information Systems Engineering (CAiSE '95), 1995.
(Gol94) L. Goldin and D. Berry, "AbstFinder: A Prototype Abstraction Finder for Natural Language Text for Use in Requirements Elicitation: Design, Methodology and Evaluation," presented at IEEE International Conference on Requirements Engineering, 1994.
(Got97) O. Gotel and A. Finkelstein, "Extending Requirements Traceability: Lessons Learned from an Industrial Case Study," presented at IEEE International Symposium on Requirements Engineering, 1997.
(Hei96) M. Heimdahl, "Errors Introduced during the TACS II Requirements Specification Effort: A Retrospective Case Study," presented at Eighteenth IEEE International Conference on Software Engineering, 1996.
(Hei96a) C. Heitmeyer et al., "Automated Consistency Checking of Requirements Specifications," ACM Transactions on Software Engineering and Methodology, vol. 5, iss. 3, July 1996, pp. 231-261.
(Hol95) K. Holtzblatt and H. Beyer, "Requirements Gathering: The Human Factor," Communications of the ACM, vol. 38, iss. 5, May 1995, pp. 31-32.
(Hud96) E. Hudlicka, "Requirements Elicitation with Indirect Knowledge Elicitation Techniques: Comparison of Three Methods," presented at Second IEEE International Conference on Requirements Engineering, 1996.
(Hug94) K. Hughes et al., "A Taxonomy for Requirements Analysis Techniques," presented at IEEE International Conference on Requirements Engineering, 1994.
(Hug95) J. Hughes et al., "Presenting Ethnography in the Requirements Process," presented at Second IEEE International Symposium on Requirements Engineering, 1995.
(Hut94) A.T.F. Hutt, ed., Object Analysis and Design - Comparison of Methods. Object Analysis and Design - Description of Methods, John Wiley & Sons, 1994.
(INCOSE00) INCOSE, How To: Guide for all Engineers, Version 2, International Council on Systems Engineering, 2000.
(Jac95) M. Jackson, Software Requirements and Specifications, Addison-Wesley, 1995.
(Jac97) M. Jackson, "The Meaning of Requirements," Annals of Software Engineering, vol. 3, 1997.
(Jon96) S. Jones and C. Britton, "Early Elicitation and Definition of Requirements for an Interactive Multimedia Information System," presented at Second IEEE International Conference on Requirements Engineering, 1996.
(Kir96) T. Kirner and A. Davis, "Nonfunctional Requirements for Real-Time Systems," Advances in Computers, 1996.
(Kle97) M. Klein, "Handling Exceptions in Collaborative Requirements Acquisition," presented at IEEE International Symposium on Requirements Engineering, 1997.
(Kos97) R. Kosman, "A Two-Step Methodology to Reduce Requirements Defects," Annals of Software Engineering, vol. 3, 1997.
(Kro95) J. Krogstie et al., "Towards a Deeper Understanding of Quality in Requirements Engineering," presented at Seventh International Conference on Advanced Information Systems Engineering (CAiSE '95), 1995.
(Lal95) V. Lalioti and B. Theodoulidis, "Visual Scenarios for Validation of Requirements Specification," presented at Seventh International Conference on Software Engineering and Knowledge Engineering, Knowledge Systems Institute, 1995.
(Lam95) A. v. Lamsweerde et al., "Goal-Directed Elaboration of Requirements for a Meeting Scheduler: Problems and Lessons Learnt," presented at Second International Symposium on Requirements Engineering, 1995.
(Lei97) J. Leite et al., "Enhancing a Requirements Baseline with Scenarios," presented at IEEE International Symposium on Requirements Engineering, 1997.
(Ler97) F. Lerch et al., "Using Simulation-Based Experiments for Software Requirements Engineering," Annals of Software Engineering, vol. 3, 1997.
(Lev94) N. Leveson et al., "Requirements Specification for Process-Control Systems," IEEE Transactions on Software Engineering, vol. 20, iss. 9, September 1994, pp. 684-707.
(Lut96a) R. Lutz and R. Woodhouse, "Contributions of SFMEA to Requirements Analysis," presented at Second IEEE International Conference on Requirements Engineering, 1996.
(Lut97) R. Lutz and R. Woodhouse, "Requirements Analysis Using Forward and Backward Search," Annals of Software Engineering, vol. 3, 1997.
(Mac96) L. Macaulay, Requirements Engineering, Springer-Verlag, 1996.
(Mai95) N. Maiden et al., "Computational Mechanisms for Distributed Requirements Engineering," presented at Seventh International Conference on Software Engineering and Knowledge Engineering, Knowledge Systems Institute, 1995.
(Mar94) B. Mar, "Requirements for Development of Software Requirements," presented at Fourth International Symposium on Systems Engineering, 1994.
(Mas97) P. Massonet and A. v. Lamsweerde, "Analogical Reuse of Requirements Frameworks," presented at IEEE International Symposium on Requirements Engineering, 1997.
(McF95) I. McFarland and I. Reilly, "Requirements Traceability in an Integrated Development Environment," presented at Second International Symposium on Requirements Engineering, 1995.
(Mea94) N. Mead, "The Role of Software Architecture in Requirements Engineering," presented at IEEE International Conference on Requirements Engineering, 1994.
(Mos95) D. Mostert and S. v. Solms, "A Technique to Include Computer Security, Safety, and Resilience Requirements as Part of the Requirements Specification," Journal of Systems and Software, vol. 31, iss. 1, October 1995, pp. 45-53.
(Myl95) J. Mylopoulos et al., "Multiple Viewpoints Analysis of Software Specification Process," IEEE Transactions on Software Engineering, 1995.
(Nis92) K. Nishimura and S. Honiden, "Representing and Using Non-Functional Requirements: A Process-Oriented Approach," IEEE Transactions on Software Engineering, December 1992.
(Nis97) H. Nissen et al., "View-Directed Requirements Engineering: A Framework and Metamodel," presented at Ninth IEEE International Conference on Software Engineering and Knowledge Engineering, Knowledge Systems Institute, 1997.
(OBr96) L. O'Brien, "From Use Case to Database: Implementing a Requirements Tracking System," Software Development, vol. 4, iss. 2, February 1996, pp. 43-47.
(UML04) Object Management Group, Unified Modeling Language, www.uml.org, 2004.
(Opd94) A. Opdahl, "Requirements Engineering for Software Performance," presented at International Workshop on Requirements Engineering: Foundations of Software Quality, 1994.
(Pin96) F. Pinheiro and J. Goguen, "An Object-Oriented Tool for Tracing Requirements," IEEE Software, vol. 13, iss. 2, March 1996, pp. 52-64.
(Pla96) G. Playle and C. Schroeder, "Software Requirements Elicitation: Problems, Tools, and Techniques," Crosstalk: The Journal of Defense Software Engineering, vol. 9, iss. 12, December 1996, pp. 19-24.
(Poh94) K. Pohl et al., "Applying AI Techniques to Requirements Engineering: The NATURE Prototype," presented at IEEE Workshop on Research Issues in the Intersection Between Software Engineering and Artificial Intelligence, 1994.
(Por95) A. Porter et al., "Comparing Detection Methods for Software Requirements Inspections: A Replicated Experiment," IEEE Transactions on Software Engineering, vol. 21, iss. 6, June 1995, pp. 563-575.
(Pot95) C. Potts et al., "An Evaluation of Inquiry-Based Requirements Analysis for an Internet Server," presented at Second International Symposium on Requirements Engineering, 1995.
(Pot97) C. Potts and I. Hsi, "Abstraction and Context in Requirements Engineering: Toward a Synthesis," Annals of Software Engineering, vol. 3, 1997.
(Pot97a) C. Potts and W. Newstetter, "Naturalistic Inquiry and Requirements Engineering: Reconciling Their Theoretical Foundations," presented at IEEE International Symposium on Requirements Engineering, 1997.
(Ram95) B. Ramesh et al., "Implementing Requirements Traceability: A Case Study," presented at Second International Symposium on Requirements Engineering, 1995.
(Reg95) B. Regnell et al., "Improving the Use Case Driven Approach to Requirements Engineering," presented at Second IEEE International Symposium on Requirements Engineering, 1995.
(Reu94) H. Reubenstein, "The Role of Software Architecture in Software Requirements Engineering," presented at IEEE International Conference on Requirements Engineering, 1994.
(Rob94) J. Robertson and S. Robertson, Complete Systems Analysis, Vol. 1 and 2, Prentice Hall, 1994.
(Rob94a) W. Robinson and S. Fickas, "Supporting Multi-Perspective Requirements Engineering," presented at IEEE International Conference on Requirements Engineering, 1994.
(Ros98) L. Rosenberg, T.F. Hammer, and L.L. Huffman, "Requirements, Testing and Metrics," presented at 15th Annual Pacific Northwest Software Quality Conference, 1998.
(Sch94) W. Schoening, "The Next Big Step in Systems Engineering Tools: Integrating Automated Requirements Tools with Computer Simulated Synthesis and Test," presented at Fourth International Symposium on Systems Engineering, 1994.
(She94) M. Shekaran, "The Role of Software Architecture in Requirements Engineering," presented at IEEE International Conference on Requirements Engineering, 1994.
(Sid97) J. Siddiqi et al., "Towards Quality Requirements Via Animated Formal Specifications," Annals of Software Engineering, vol. 3, 1997.
(Span97) G. Spanoudakis and A. Finkelstein, "Reconciling Requirements: A Method for Managing Interference, Inconsistency, and Conflict," Annals of Software Engineering, vol. 3, 1997.
(Ste94) R. Stevens, "Structured Requirements," presented at Fourth International Symposium on Systems Engineering, 1994.
(Vin90) W.G. Vincenti, What Engineers Know and How They Know It: Analytical Studies from Aeronautical History, Johns Hopkins University Press, 1990.
(Wei03) K. Wiegers, Software Requirements, second ed., Microsoft Press, 2003.
(Whi95) S. White and M. Edwards, "A Requirements Taxonomy for Specifying Complex Systems," presented at First IEEE International Conference on Engineering of Complex Computer Systems, 1995.
(Wil99) B. Wiley, Essential System Requirements: A Practical Guide to Event-Driven Methods, Addison-Wesley, 1999.
(Wyd96) T. Wyder, "Capturing Requirements With Use Cases," Software Development, vol. 4, iss. 2, February 1996, pp. 36-40.
(Yen97) J. Yen and W. Tiao, "A Systematic Tradeoff Analysis for Conflicting Imprecise Requirements," presented at IEEE International Symposium on Requirements Engineering, 1997.
(Yu97) E. Yu, "Towards Modeling and Reasoning Support for Early-Phase Requirements Engineering," presented at IEEE International Symposium on Requirements Engineering, 1997.
(Zav96) P. Zave and M. Jackson, "Where Do Operations Come From? A Multiparadigm Specification Technique," IEEE Transactions on Software Engineering, vol. 22, iss. 7, July 1996, pp. 508-528.
APPENDIX B. LIST OF STANDARDS

(IEEE830-98) IEEE Std 830-1998, IEEE Recommended Practice for Software Requirements Specifications, IEEE, 1998.
(IEEE1028-97) IEEE Std 1028-1997 (R2002), IEEE Standard for Software Reviews, IEEE, 1997.
(IEEE1233-98) IEEE Std 1233-1998, IEEE Guide for Developing System Requirements Specifications, IEEE, 1998.
(IEEE1320.1-98) IEEE Std 1320.1-1998, IEEE Standard for Functional Modeling Language—Syntax and Semantics for IDEF0, IEEE, 1998.
(IEEE1320.2-98) IEEE Std 1320.2-1998, IEEE Standard for Conceptual Modeling Language—Syntax and Semantics for IDEF1X97 (IDEFobject), IEEE, 1998.
(IEEE1362-98) IEEE Std 1362-1998, IEEE Guide for Information Technology—System Definition—Concept of Operations (ConOps) Document, IEEE, 1998.
(IEEE1465-98) IEEE Std 1465-1998//ISO/IEC 12119:1994, IEEE Standard Adoption of International Standard ISO/IEC 12119:1994(E), Information Technology—Software Packages—Quality Requirements and Testing, IEEE, 1998.
(IEEEP1471-00) IEEE Std 1471-2000, IEEE Recommended Practice for Architectural Description of Software-Intensive Systems, Architecture Working Group of the Software Engineering Standards Committee, 2000.
(IEEE12207.0-96) IEEE/EIA 12207.0-1996//ISO/IEC 12207:1995, Industry Implementation of Int. Std. ISO/IEC 12207:95, Standard for Information Technology—Software Life Cycle Processes, IEEE, 1996.
(IEEE14143.1-00) IEEE Std 14143.1-2000//ISO/IEC 14143-1:1998, Information Technology—Software Measurement—Functional Size Measurement—Part 1: Definitions of Concepts, IEEE, 2000.
CHAPTER 3
SOFTWARE DESIGN

ACRONYMS
ADL Architecture Description Languages
CRC Class Responsibility Collaborator card
ERD Entity-Relationship Diagram
IDL Interface Description Language
DFD Data Flow Diagram
PDL Pseudo-code and Program Design Language
CBD Component-Based Design

INTRODUCTION

Design is defined in [IEEE610.12-90] as both "the process of defining the architecture, components, interfaces, and other characteristics of a system or component" and "the result of [that] process." Viewed as a process, software design is the software engineering life cycle activity in which software requirements are analyzed in order to produce a description of the software's internal structure that will serve as the basis for its construction. More precisely, a software design (the result) must describe the software architecture—that is, how software is decomposed and organized into components—and the interfaces between those components. It must also describe the components at a level of detail that enables their construction.

Software design plays an important role in developing software: it allows software engineers to produce various models that form a kind of blueprint of the solution to be implemented. We can analyze and evaluate these models to determine whether or not they will allow us to fulfill the various requirements. We can also examine and evaluate various alternative solutions and trade-offs. Finally, we can use the resulting models to plan the subsequent development activities, in addition to using them as input and the starting point of construction and testing.

In a standard listing of software life cycle processes such as IEEE/EIA 12207 Software Life Cycle Processes [IEEE12207.0-96], software design consists of two activities that fit between software requirements analysis and software construction:
- Software architectural design (sometimes called top-level design): describing software's top-level structure and organization and identifying the various components
- Software detailed design: describing each component sufficiently to allow for its construction

Concerning the scope of the Software Design Knowledge Area (KA), the current KA description does not discuss every topic the name of which contains the word "design." In Tom DeMarco's terminology (DeM99), the KA discussed in this chapter deals mainly with D-design (decomposition design, mapping software into component pieces). However, because of its importance in the growing field of software architecture, we will also address FP-design (family pattern design, whose goal is to establish exploitable commonalities in a family of software). By contrast, the Software Design KA does not address I-design (invention design, usually performed during the software requirements process with the objective of conceptualizing and specifying software to satisfy discovered needs and requirements), since this topic should be considered part of requirements analysis and specification.

The Software Design KA description is related specifically to Software Requirements, Software Construction, Software Engineering Management, Software Quality, and Related Disciplines of Software Engineering.

BREAKDOWN OF TOPICS FOR SOFTWARE DESIGN

1. Software Design Fundamentals

The concepts, notions, and terminology introduced here form an underlying basis for understanding the role and scope of software design.

1.1. General Design Concepts

Software is not the only field where design is involved. In the general sense, we can view design as a form of problem-solving. [Bud03:c1] For example, the concept of a wicked problem (a problem with no definitive solution) is interesting in terms of understanding the limits of design. [Bud04:c1] A number of other notions and concepts are also of interest in understanding design in its general sense: goals, constraints, alternatives, representations, and solutions. [Smi93]

1.2. Context of Software Design

To understand the role of software design, it is important to understand the context in which it fits, the software engineering life cycle. Thus, it is important to understand the major characteristics of software requirements analysis vs. software design vs. software construction vs. software testing. [IEEE12207.0-96; Lis01:c11; Mar02; Pfl01:c2; Pre04:c2]
1.3. Software Design Process

Software design is generally considered a two-step process: [Bas03; Dor02:v1c4s2; Fre83:I; IEEE12207.0-96; Lis01:c13; Mar02:D]

1.3.1. Architectural design
Architectural design describes how software is decomposed and organized into components (the software architecture). [IEEE1471-00]

1.3.2. Detailed design
Detailed design describes the specific behavior of these components.

The output of this process is a set of models and artifacts that record the major decisions that have been taken. [Bud04:c2; IEEE1016-98; Lis01:c13; Pre04:c9]

1.4. Enabling Techniques

According to the Oxford English Dictionary, a principle is "a basic truth or a general law … that is used as a basis of reasoning or a guide to action." Software design principles, also called enabling techniques [Bus96], are key notions considered fundamental to many different software design approaches and concepts. The enabling techniques are the following: [Bas98:c6; Bus96:c6; IEEE1016-98; Jal97:c5,c6; Lis01:c1,c3; Pfl01:c5; Pre04:c9]

1.4.1. Abstraction
Abstraction is "the process of forgetting information so that things that are different can be treated as if they were the same." [Lis01] In the context of software design, two key abstraction mechanisms are parameterization and specification. Abstraction by specification leads to three major kinds of abstraction: procedural abstraction, data abstraction, and control (iteration) abstraction. [Bas98:c6; Jal97:c5,c6; Lis01:c1,c2,c5,c6; Pre04:c1]

1.4.2. Coupling and cohesion
Coupling is defined as the strength of the relationships between modules, whereas cohesion is defined by how the elements making up a module are related. [Bas98:c6; Jal97:c5; Pfl01:c5; Pre04:c9]

1.4.3. Decomposition and modularization
Decomposing and modularizing large software into a number of smaller, independent components, usually with the goal of placing different functionalities or responsibilities in different components. [Bas98:c6; Bus96:c6; Jal97:c5; Pfl01:c5; Pre04:c9]

1.4.4. Encapsulation/information hiding
Encapsulation/information hiding means grouping and packaging the elements and internal details of an abstraction and making those details inaccessible. [Bas98:c6; Bus96:c6; Jal97:c5; Pfl01:c5; Pre04:c9]

1.4.5. Separation of interface and implementation
Separating interface and implementation involves defining a component by specifying a public interface, known to the clients, separate from the details of how the component is realized. [Bas98:c6; Bos00:c10; Lis01:c1,c9]

1.4.6. Sufficiency, completeness, and primitiveness
Achieving sufficiency, completeness, and primitiveness means ensuring that a software component captures all the important characteristics of an abstraction, and nothing more. [Bus96:c6; Lis01:c5]
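A minimal sketch of how several of these enabling techniques look in code, using Python's abc module: the public interface is separated from the implementation, and the internal representation is hidden behind it. The class and function names are invented for illustration.

```python
from abc import ABC, abstractmethod

class TemperatureSensor(ABC):
    """Public interface known to clients (separation of interface and implementation)."""

    @abstractmethod
    def read_celsius(self) -> float:
        ...

class FileBackedSensor(TemperatureSensor):
    """One possible realization; its internals are hidden from clients (information hiding)."""

    def __init__(self, path: str):
        self._path = path  # leading underscore: internal detail, not part of the interface

    def read_celsius(self) -> float:
        with open(self._path) as f:
            return float(f.read().strip())

def report(sensor: TemperatureSensor) -> str:
    # The client depends only on the abstraction, which keeps coupling low.
    return f"{sensor.read_celsius():.1f} degrees C"
```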
2. Key Issues in Software Design

A number of key issues must be dealt with when designing software. Some are quality concerns that all software must address—for example, performance. Another important issue is how to decompose, organize, and package software components. This is so fundamental that all design approaches must address it in one way or another (see topic 1.4 Enabling Techniques and subarea 6 Software Design Strategies and Methods). In contrast, other issues "deal with some aspect of software's behavior that is not in the application domain, but which addresses some of the supporting domains." [Bos00] Such issues, which often cross-cut the system's functionality, have been referred to as aspects: "[aspects] tend not to be units of software's functional decomposition, but rather to be properties that affect the performance or semantics of the components in systemic ways" (Kic97). A number of these key, cross-cutting issues are the following (presented in alphabetical order):

2.1. Concurrency
How to decompose the software into processes, tasks, and threads and deal with related efficiency, atomicity, synchronization, and scheduling issues. [Bos00:c5; Mar02:CSD; Mey97:c30; Pre04:c9]

2.2. Control and Handling of Events
How to organize data and control flow, and how to handle reactive and temporal events through various mechanisms such as implicit invocation and call-backs. [Bas98:c5; Mey97:c32; Pfl01:c5]

2.3. Distribution of Components
How to distribute the software across the hardware, how the components communicate, and how middleware can be used to deal with heterogeneous software. [Bas03:c16; Bos00:c5; Bus96:c2; Mar94:DD; Mey97:c30; Pre04:c30]

2.4. Error and Exception Handling and Fault Tolerance
How to prevent and tolerate faults and deal with exceptional conditions. [Lis01:c4; Mey97:c12; Pfl01:c5]

2.5. Interaction and Presentation
How to structure and organize the interactions with users and the presentation of information (for example, separation of presentation and business logic using the Model-View-Controller approach). [Bas98:c6; Bos00:c5; Bus96:c2; Lis01:c13; Mey97:c32] It is to be noted that this topic is not about specifying user interface details, which is the task of user interface design (a part of Software Ergonomics); see Related Disciplines of Software Engineering.

2.6. Data Persistence
How long-lived data are to be handled. [Bos00:c5; Mey97:c31]
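As a small illustration of the event-handling issue (topic 2.2), the sketch below uses callback registration, a simple form of implicit invocation: the event source does not know which components react to its events. The event names and handlers are invented for the example and are not drawn from any cited reference.

```python
class EventBus:
    """Minimal implicit-invocation mechanism: publishers do not know their subscribers."""

    def __init__(self):
        self._handlers = {}

    def subscribe(self, event_name, callback):
        self._handlers.setdefault(event_name, []).append(callback)

    def publish(self, event_name, payload):
        for callback in self._handlers.get(event_name, []):
            callback(payload)

bus = EventBus()
bus.subscribe("temperature_changed", lambda value: print(f"logging: {value} degrees C"))
bus.subscribe("temperature_changed", lambda value: value > 80 and print("alarm raised"))
bus.publish("temperature_changed", 85)  # both handlers run via call-backs
```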
3. Software Structure and Architecture

In its strict sense, a software architecture is "a description of the subsystems and components of a software system and the relationships between them." (Bus96:c6) Architecture thus attempts to define the internal structure—according to the Oxford English Dictionary, "the way in which something is constructed or organized"—of the resulting software. During the mid-1990s, however, software architecture started to emerge as a broader discipline involving the study of software structures and architectures in a more generic way [Sha96]. This gave rise to a number of interesting ideas about software design at different levels of abstraction. Some of these concepts can be useful during the architectural design (for example, architectural style) of specific software, as well as during its detailed design (for example, lower-level design patterns). But they can also be useful for designing generic systems, leading to the design of families of programs (also known as product lines). Interestingly, most of these concepts can be seen as attempts to describe, and thus reuse, generic design knowledge.

3.1. Architectural Structures and Viewpoints

Different high-level facets of a software design can and should be described and documented. These facets are often called views: "A view represents a partial aspect of a software architecture that shows specific properties of a software system" [Bus96:c6]. These distinct views pertain to distinct issues associated with software design—for example, the logical view (satisfying the functional requirements) vs. the process view (concurrency issues) vs. the physical view (distribution issues) vs. the development view (how the design is broken down into implementation units). Other authors use different terminologies, like behavioral vs. functional vs. structural vs. data modeling views. In summary, a software design is a multi-faceted artifact produced by the design process and generally composed of relatively independent and orthogonal views. [Bas03:c2; Boo99:c31; Bud04:c5; Bus96:c6; IEEE1016-98; IEEE1471-00]

3.2. Architectural Styles (macroarchitectural patterns)

An architectural style is "a set of constraints on an architecture [that] defines a set or family of architectures that satisfies them" [Bas03:c2]. An architectural style can thus be seen as a meta-model which can provide software's high-level organization (its macroarchitecture). Various authors have identified a number of major architectural styles. [Bas03:c5; Boo99:c28; Bos00:c6; Bus96:c1,c6; Pfl01:c5]
- General structure (for example, layers, pipes and filters, blackboard)
- Distributed systems (for example, client-server, three-tiers, broker)
- Interactive systems (for example, Model-View-Controller, Presentation-Abstraction-Control)
- Adaptable systems (for example, micro-kernel, reflection)
- Others (for example, batch, interpreters, process control, rule-based)
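To make the pipes-and-filters style concrete, here is a small sketch in which each filter is an ordinary function and the pipe is function composition. The particular filters (tokenize, normalize, count) are arbitrary examples chosen for illustration, not taken from any cited reference.

```python
def tokenize(lines):
    for line in lines:
        yield from line.split()

def normalize(tokens):
    for token in tokens:
        yield token.strip(".,").lower()

def count(tokens):
    totals = {}
    for token in tokens:
        totals[token] = totals.get(token, 0) + 1
    return totals

# The "pipe": the output of one filter feeds the input of the next.
source = ["Design patterns, like styles,", "describe reusable design knowledge."]
print(count(normalize(tokenize(source))))
```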
3.3. Design Patterns (microarchitectural patterns)

Succinctly described, a pattern is "a common solution to a common problem in a given context." (Jac99) While architectural styles can be viewed as patterns describing the high-level organization of software (their macroarchitecture), other design patterns can be used to describe details at a lower, more local level (their microarchitecture). [Bas98:c13; Boo99:c28; Bus96:c1; Mar02:DP]
- Creational patterns (for example, builder, factory, prototype, and singleton)
- Structural patterns (for example, adapter, bridge, composite, decorator, façade, flyweight, and proxy)
- Behavioral patterns (for example, command, interpreter, iterator, mediator, memento, observer, state, strategy, template, visitor)

3.4. Families of Programs and Frameworks

One possible approach to allow the reuse of software designs and components is to design families of software, also known as software product lines. This can be done by identifying the commonalities among members of such families and by using reusable and customizable components to account for the variability among family members. [Bos00:c7,c10; Bas98:c15; Pre04:c30]

In OO programming, a key related notion is that of the framework: a partially complete software subsystem that can be extended by appropriately instantiating specific plug-ins (also known as hot spots). [Bos00:c11; Boo99:c28; Bus96:c6]
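As one concrete instance of a structural pattern from the list above, the sketch below shows an adapter that lets a client written against one interface reuse an existing class with an incompatible one. The class and method names are made up for the example.

```python
class LegacyThermometer:
    """Existing component with an interface the client cannot use directly."""
    def fahrenheit(self) -> float:
        return 98.6

class CelsiusAdapter:
    """Adapter: presents the interface the client expects, delegating to the adaptee."""
    def __init__(self, adaptee: LegacyThermometer):
        self._adaptee = adaptee

    def read_celsius(self) -> float:
        return (self._adaptee.fahrenheit() - 32) * 5 / 9

sensor = CelsiusAdapter(LegacyThermometer())
print(round(sensor.read_celsius(), 1))  # 37.0
```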
Certain notations are used mostly duringobtaining a software design of good quality—various architectural design and others mainly during detailed“ilities” (maintainability, portability, testability, design, although some notations can be used in both steps.traceability), various “nesses” (correctness, robustness), In addition, some notations are used mostly in the contextincluding “fitness of purpose.” [Bos00:c5; Bud04:c4; of specific methods (see the Software Design StrategiesBus96:c6; ISO9126.1-01; ISO15026-98; Mar94:D; and Methods subarea). Here, they are categorized into nota-Mey97:c3; Pfl01:c5] An interesting distinction is the one tions for describing the structural (static) view vs. thebetween quality attributes discernable at run-time behavioral (dynamic) view.(performance, security, availability, functionality, 5.1. Structural Descriptions (static view)usability), those not discernable at run-time (modifiability,portability, reusability, integrability, and testability), and The following notations, mostly (but not always) graphical,those related to the architecture’s intrinsic qualities (con- describe and represent the structural aspects of a softwareceptual integrity, correctness, and completeness, design—that is, they describe the major components andbuildability). [Bas03:c4] how they are interconnected (static view):4.2. Quality Analysis and Evaluation Techniques Architecture description languages (ADLs): textual, often formal, languages used to describe a softwareVarious tools and techniques can help ensure a software architecture in terms of components and connectorsdesign’s quality. [Bas03:c12] Software design reviews: informal or semiformal, often group-based, techniques to verify and ensure the 3–4 © IEEE – 2004 Version
  • 55. Class and object diagrams: used to represent a set of Sequence diagrams: used to show the interactions classes (and objects) and their interrelationships among a group of objects, with emphasis on the time- [Boo99:c8,c14; Jal97:c5,c6] ordering of messages [Boo99:c18] Component diagrams: used to represent a set of State transition and statechart diagrams: used to show components (“physical and replaceable part[s] of a the control flow from state to state in a state machine system that [conform] to and [provide] the realization [Boo99:c24; Bud04:c6; Mar02:DR; Jal97:c7] of a set of interfaces” [Boo99]) and their Formal specification languages: textual languages that interrelationships [Boo99:c12,c31] use basic notions from mathematics (for example, Class responsibility collaborator cards (CRCs): used logic, set, sequence) to rigorously and abstractly define to denote the names of components (class), their software component interfaces and behavior, often in responsibilities, and their collaborating components’ terms of pre- and post-conditions [Bud04:c18; names [Boo99:c4; Bus96] Dor02:v1c6s5; Mey97:c11] Deployment diagrams: used to represent a set of Pseudocode and program design languages (PDLs): (physical) nodes and their interrelationships, and, thus, structured-programming-like languages used to to model the physical aspects of a system [Boo99:c30] describe, generally at the detailed design stage, the Entity-relationship diagrams (ERDs): used to represent behavior of a procedure or method [Bud04:c6; conceptual models of data stored in information Fre83:VII; Jal97:c7; Pre04:c8, c11] systems [Bud04:c6; Dor02:v1c5; Mar02:DR] 6. Software Design Strategies and Methods Interface description languages (IDLs): programming- There exist various general strategies to help guide the like languages used to define the interfaces (names and design process. [Bud04:c9, Mar02:D] In contrast with types of exported operations) of software components general strategies, methods are more specific in that they [Bas98:c8; Boo99:c11] generally suggest and provide a set of notations to be used Jackson structure diagrams: used to describe the data with the method, a description of the process to be used structures in terms of sequence, selection, and iteration when following the method and a set of guidelines in using [Bud04:c6; Mar02:DR] the method. [Bud04:c8] Such methods are useful as a means of transferring knowledge and as a common Structure charts: used to describe the calling structure framework for teams of software engineers. [Bud03:c8] See of programs (which module calls, and is called by, also the Software Engineering Tools and Methods KA. which other module) [Bud04:c6; Jal97:c5; Mar02:DR; Pre04:c10] 6.1. General Strategies5.2. Behavioral Descriptions (dynamic view) Some often-cited examples of general strategies useful in the design process are divide-and-conquer and stepwiseThe following notations and languages, some graphical and refinement [Bud04:c12; Fre83:V], top-down vs. bottom-upsome textual, are used to describe the dynamic behavior of strategies [Jal97:c5; Lis01:c13], data abstraction and infor-software and components. Many of these notations are mation hiding [Fre83:V], use of heuristics [Bud04:c8], useuseful mostly, but not exclusively, during detailed design. of patterns and pattern languages [Bud04:c10; Bus96:c5], Activity diagrams: used to show the control flow from use of an iterative and incremental approach. [Pfl01:c2] activity (“ongoing non-atomic execution within a state 6.2. 
Function-Oriented (Structured) Design machine”) to activity [Boo99:c19] [Bud04:c14; Dor02:v1c6s4; Fre83:V; Jal97:c5; Collaboration diagrams: used to show the interactions that occur among a group of objects, where the Pre04:c9, c10] emphasis is on the objects, their links, and the This is one of the classical methods of software design, messages they exchange on these links [Boo99:c18] where decomposition centers on identifying the major Data flow diagrams (DFDs): used to show data flow software functions and then elaborating and refining them among a set of processes [Bud04:c6; Mar02:DR; in a top-down manner. Structured design is generally used Pre04:c8] after structured analysis, thus producing, among other things, data flow diagrams and associated process Decision tables and diagrams: used to represent descriptions. Researchers have proposed various strategies complex combinations of conditions and actions (for example, transformation analysis, transaction analysis) [Pre04:c11] and heuristics (for example, fan-in/fan-out, scope of effect Flowcharts and structured flowcharts: used to vs. scope of control) to transform a DFD into a software represent the flow of control and the associated actions architecture generally represented as a structure chart. to be performed [Fre83:VII; Mar02:DR; Pre04:c11]© IEEE – 2004 Version 3–5
  • 56. 6.3. Object-Oriented Design software engineer first describes the input and output data [Bud0:c16; Dor02:v1:c6s2,s3; Fre83:VI; Jal97:c6; structures (using Jackson’s structure diagrams, for instance) and then develops the program’s control structure based on Mar02:D; Pre04:c9] these data structure diagrams. Various heuristics have beenNumerous software design methods based on objects have proposed to deal with special cases—for example, whenbeen proposed. The field has evolved from the early object- there is a mismatch between the input and output structures.based design of the mid-1980s (noun = object; verb = 6.5. Component-Based Design (CBD)method; adjective = attribute) through OO design, whereinheritance and polymorphism play a key role, to the field A software component is an independent unit, having well-of component-based design, where meta-information can be defined interfaces and dependencies that can be composeddefined and accessed (through reflection, for example). and deployed independently. Component-based designAlthough OO design’s roots stem from the concept of data addresses issues related to providing, developing, andabstraction, responsibility-driven design has also been integrating such components in order to improve reuse.proposed as an alternative approach to OO design. [Bud04:c11]6.4. Data-Structure-Centered Design 6.6. Other Methods [Bud04:c15; Fre83:III,VII; Mar02:D] Other interesting but less mainstream approaches also exist: formal and rigorous methods [Bud04:c18; Dor02:c5; Fre83;Data-structure-centered design (for example, Jackson, Mey97:c11; Pre04:c29] and transformational methods.Warnier-Orr) starts from the data structures a program [Pfl98:c2]manipulates rather than from the function it performs. The 3–6 © IEEE – 2004 Version
  • 57. MATRIX OF TOPICS VS. REFERENCE MATERIAL 96] [Pfl01] [Jal97] [Lis01] [Fre83] [Pre04] [Bas03] [Bos00] [Bus96] [Boo99] [Dor02] [Smi93] {Bas98} [Bud03] [Mey97] {Mar94} [Mar02]* [ISO9126-01] [ISO15026-98] [IEEE12207.0- [IEEE1016-98] [IEEE1028-97] [IEEE1471-00]1. Software DesignFundamentals1.1 General Design C1 *Concepts1.2 The Context of * c11s1 D c2s2 c2Software Design1.3 The Software c2s1, c13s1, C2 v1c4s2 2-22 * * D c9Design Process c2s4 * c13s2 c1s1,c1s2, c3s1-c3s3, c5s1,1.4 Enabling 77-85, c5s2, {c6s1} c10s3 c6s3 * c5s2, c9Techniques c5s8, c5s5 c6s2 125-128, c9s1-c9s32. Key Issues inSoftware Design2.1 Concurrency c5s4.1 CSD c30 c92.2 Control and c32s4, {c5s2} c5s3Handling of Events c32s5 c16s3,2.3 Distribution of c16s4 c5s4.1 c2s3 {DD} c30 c30Components2.4 Error andException Handling c4s3-c4s5 c12 c5s5and Fault Tolerance2.5 Interaction and {c6s2} c5s4.1 c2s4 c13s3 c32s2Presentation2.6 Data Persistence c5s4.1 c31 * see the next section© IEEE – 2004 Version 3–7
  • 58. 96] [Pfl01] [Jal97] [Lis01] [Fre83] [Pre04] [Bas03] [Bos00] [Bus96] [Boo99] [Dor02] [Smi93] {Bas98} [Bud03] [Mey97] {Mar94} [Mar02]* [ISO9126-01] [ISO15026-98] [IEEE12207.0- [IEEE1016-98] [IEEE1028-97] [IEEE1471-00]3. SoftwareStructure andArchitecture3.1 ArchitecturalStructures and c2s5 c31 c5 c6s1 * *Viewpoints c1s1-3.2 Architectural c5s9 c28 c6s3.1 c1s3, c5s3Styles c6s2 {c13s3 c1s1-3.3 Design Patterns c28 DP } c1s3 c7s1,3.4 Families of c7s2, {c15s1, c10s2-Programs and c28 c6s2 C30 c15s3} c10s4,Frameworks c11s2, c11s44. Software DesignQuality Analysis andEvaluation4.1 Quality Attributes c4s2 c5s2.3 c4 c6s4 {D} c3 c5s5 * * c11s3,4.2 Quality Analysis c5s2.1, {c9s1, c5s6, c5s2.2, 542- c5s5,and Evaluation c9s2, c4 v1c4s2 c14s1 c5s7, c5s3, 576 c7s3Techniques c10s2, * c11s5 c5s4 c10s3} c5s6,4.3 Measures c6s5, c15 c7s4 3–8 © IEEE – 2004 Version
  • 59. 96] [Pfl01] [Jal97] [Lis01] [Fre83] [Pre04] [Bas03] [Bos00] [Bus96] [Boo99] [Dor02] [Smi93] {Bas98} [Bud03] [Mey97] {Mar94} [Mar02]* [ISO9126-01] [ISO15026-98] [IEEE12207.0- [IEEE1016-98] [IEEE1028-97] [IEEE1471-00]5. Software DesignNotations c4, c8,5.1 Structural c11, {c8s4} c12, c5s3,cDescriptions c12s1, c6 429 DR c10 c14, 6s3(Static View) c12s2 c30, c315.2 Behavioral 485- c18, 490,Descriptions c19, c6, c18 v1c5 c7s2 DR c11 c8, c11 506-(Dynamic View) c24 5136. Software DesignStrategies andMethods 304- c8, c5s1- 320,6.1 General Strategies c10, c5s1.4 c13s13 c2s2 c5s4 533- c12 5396.2 Function-Oriented 328- c14 v1c6s4 c5s4 c9, c10(Structured) Design 352 v1c6s26.3 Object-Oriented 420- c16 , c6s4 D c9Design 436 v1c6s3 201-6.4 Data-Structure- 210, C15 DCentered Design 514- 5326.5 Component-Based c11Design (CBD) 181- 395-6.6 Other Methods c18 c11 c2s2 C29 192 407© IEEE – 2004 Version 3–9
RECOMMENDED REFERENCES FOR SOFTWARE DESIGN
[Bas98] L. Bass, P. Clements, and R. Kazman, Software Architecture in Practice, Addison-Wesley, 1998.
[Bas03] L. Bass, P. Clements, and R. Kazman, Software Architecture in Practice, second ed., Addison-Wesley, 2003.
[Boo99] G. Booch, J. Rumbaugh, and I. Jacobson, The Unified Modeling Language User Guide, Addison-Wesley, 1999.
[Bos00] J. Bosch, Design & Use of Software Architectures: Adopting and Evolving a Product-Line Approach, first ed., ACM Press, 2000.
[Bud04] D. Budgen, Software Design, second ed., Addison-Wesley, 2004.
[Bus96] F. Buschmann et al., Pattern-Oriented Software Architecture: A System of Patterns, John Wiley & Sons, 1996.
[Dor02] M. Dorfman and R.H. Thayer, eds., Software Engineering (Vol. 1 & Vol. 2), IEEE Computer Society Press, 2002.
[Fre83] P. Freeman and A.I. Wasserman, Tutorial on Software Design Techniques, fourth ed., IEEE Computer Society Press, 1983.
[IEEE610.12-90] IEEE Std 610.12-1990 (R2002), IEEE Standard Glossary of Software Engineering Terminology, IEEE, 1990.
[IEEE1016-98] IEEE Std 1016-1998, IEEE Recommended Practice for Software Design Descriptions, IEEE, 1998.
[IEEE1028-97] IEEE Std 1028-1997 (R2002), IEEE Standard for Software Reviews, IEEE, 1997.
[IEEE1471-00] IEEE Std 1471-2000, IEEE Recommended Practice for Architectural Description of Software-Intensive Systems, Architecture Working Group of the Software Engineering Standards Committee, 2000.
[IEEE12207.0-96] IEEE/EIA 12207.0-1996//ISO/IEC 12207:1995, Industry Implementation of Int. Std. ISO/IEC 12207:95, Standard for Information Technology—Software Life Cycle Processes, IEEE, 1996.
[ISO9126-01] ISO/IEC 9126-1:2001, Software Engineering—Product Quality—Part 1: Quality Model, ISO and IEC, 2001.
[ISO15026-98] ISO/IEC 15026-1998, Information Technology—System and Software Integrity Levels, ISO and IEC, 1998.
[Jal97] P. Jalote, An Integrated Approach to Software Engineering, second ed., Springer-Verlag, 1997.
[Lis01] B. Liskov and J. Guttag, Program Development in Java: Abstraction, Specification, and Object-Oriented Design, Addison-Wesley, 2001.
[Mar94] J.J. Marciniak, Encyclopedia of Software Engineering, J. Wiley & Sons, 1994. The references to the Encyclopedia are as follows: CBD = Component-Based Design; D = Design; DD = Design of the Distributed System; DR = Design Representation.
[Mar02] J.J. Marciniak, Encyclopedia of Software Engineering, second ed., J. Wiley & Sons, 2002.
[Mey97] B. Meyer, Object-Oriented Software Construction, second ed., Prentice-Hall, 1997.
[Pfl01] S.L. Pfleeger, Software Engineering: Theory and Practice, second ed., Prentice-Hall, 2001.
[Pre04] R.S. Pressman, Software Engineering: A Practitioner's Approach, sixth ed., McGraw-Hill, 2004.
[Smi93] G. Smith and G. Browne, "Conceptual Foundations of Design Problem-Solving," IEEE Transactions on Systems, Man and Cybernetics, vol. 23, iss. 5, 1209-1219, Sep.-Oct. 1993.
APPENDIX A. LIST OF FURTHER READINGS
(Boo94a) G. Booch, Object Oriented Analysis and Design with Applications, second ed., The Benjamin/Cummings Publishing Company, 1994.
(Coa91) P. Coad and E. Yourdon, Object-Oriented Design, Yourdon Press, 1991.
(Cro84) N. Cross, Developments in Design Methodology, John Wiley & Sons, 1984.
(DSo99) D.F. D'Souza and A.C. Wills, Objects, Components, and Frameworks with UML—The Catalysis Approach, Addison-Wesley, 1999.
(Dem99) T. DeMarco, "The Paradox of Software Architecture and Design," Stevens Prize Lecture, Aug. 1999.
(Fen98) N.E. Fenton and S.L. Pfleeger, Software Metrics: A Rigorous & Practical Approach, second ed., International Thomson Computer Press, 1998.
(Fow99) M. Fowler, Refactoring: Improving the Design of Existing Code, Addison-Wesley, 1999.
(Fow03) M. Fowler, Patterns of Enterprise Application Architecture, Addison-Wesley, 2003.
(Gam95) E. Gamma et al., Design Patterns: Elements of Reusable Object-Oriented Software, Addison-Wesley, 1995.
(Hut94) A.T.F. Hutt, Object Analysis and Design—Comparison of Methods. Object Analysis and Design—Description of Methods, John Wiley & Sons, 1994.
(Jac99) I. Jacobson, G. Booch, and J. Rumbaugh, The Unified Software Development Process, Addison-Wesley, 1999.
(Kic97) G. Kiczales et al., "Aspect-Oriented Programming," presented at ECOOP '97—Object-Oriented Programming, 1997.
(Kru95) P.B. Kruchten, "The 4+1 View Model of Architecture," IEEE Software, vol. 12, iss. 6, 42-50, 1995.
(Lar98) C. Larman, Applying UML and Patterns: An Introduction to Object-Oriented Analysis and Design, Prentice-Hall, 1998.
(McC93) S. McConnell, Code Complete: A Practical Handbook of Software Construction, Microsoft Press, 1993.
(Pag00) M. Page-Jones, Fundamentals of Object-Oriented Design in UML, Addison-Wesley, 2000.
(Pet92) H. Petroski, To Engineer Is Human: The Role of Failure in Successful Design, Vintage Books, 1992.
(Pre95) W. Pree, Design Patterns for Object-Oriented Software Development, Addison-Wesley and ACM Press, 1995.
(Rie96) A.J. Riel, Object-Oriented Design Heuristics, Addison-Wesley, 1996.
(Rum91) J. Rumbaugh et al., Object-Oriented Modeling and Design, Prentice-Hall, 1991.
(Sha96) M. Shaw and D. Garlan, Software Architecture: Perspectives on an Emerging Discipline, Prentice-Hall, 1996.
(Som05) I. Sommerville, Software Engineering, seventh ed., Addison-Wesley, 2005.
(Wie98) R. Wieringa, "A Survey of Structured and Object-Oriented Software Specification Methods and Techniques," ACM Computing Surveys, vol. 30, iss. 4, 1998, pp. 459-527.
(Wir90) R. Wirfs-Brock, B. Wilkerson, and L. Wiener, Designing Object-Oriented Software, Prentice-Hall, 1990.
APPENDIX B. LIST OF STANDARDS
(IEEE610.12-90) IEEE Std 610.12-1990 (R2002), IEEE Standard Glossary of Software Engineering Terminology, IEEE, 1990.
(IEEE1016-98) IEEE Std 1016-1998, IEEE Recommended Practice for Software Design Descriptions, IEEE, 1998.
(IEEE1028-97) IEEE Std 1028-1997 (R2002), IEEE Standard for Software Reviews, IEEE, 1997.
(IEEE1471-00) IEEE Std 1471-2000, IEEE Recommended Practice for Architectural Descriptions of Software-Intensive Systems, Architecture Working Group of the Software Engineering Standards Committee, 2000.
(IEEE12207.0-96) IEEE/EIA 12207.0-1996//ISO/IEC 12207:1995, Industry Implementation of Int. Std. ISO/IEC 12207:95, Standard for Information Technology—Software Life Cycle Processes, IEEE, 1996.
(ISO9126-01) ISO/IEC 9126-1:2001, Software Engineering—Product Quality—Part 1: Quality Model, ISO and IEC, 2001.
(ISO15026-98) ISO/IEC 15026-1998, Information Technology—System and Software Integrity Levels, ISO and IEC, 1998.
CHAPTER 4
SOFTWARE CONSTRUCTION

ACRONYMS
OMG Object Management Group
UML Unified Modeling Language

INTRODUCTION
The term software construction refers to the detailed creation of working, meaningful software through a combination of coding, verification, unit testing, integration testing, and debugging.
The Software Construction Knowledge Area is linked to all the other KAs, most strongly to Software Design and Software Testing. This is because the software construction process itself involves significant software design and test activity. It also uses the output of design and provides one of the inputs to testing, both design and testing being the activities, not the KAs in this case. Detailed boundaries between design, construction, and testing (if any) will vary depending upon the software life cycle processes that are used in a project.
Although some detailed design may be performed prior to construction, much design work is performed within the construction activity itself. Thus the Software Construction KA is closely linked to the Software Design KA.
Throughout construction, software engineers both unit-test and integration-test their work. Thus, the Software Construction KA is closely linked to the Software Testing KA as well.
Software construction typically produces the highest volume of configuration items that need to be managed in a software project (source files, content, test cases, and so on). Thus, the Software Construction KA is also closely linked to the Software Configuration Management KA.
Since software construction relies heavily on tools and methods and is probably the most tool-intensive of the KAs, it is linked to the Software Engineering Tools and Methods KA.
While software quality is important in all the KAs, code is the ultimate deliverable of a software project, and thus Software Quality is also closely linked to Software Construction.
Among the Related Disciplines of Software Engineering, the Software Construction KA is most akin to computer science in its use of knowledge of algorithms and of detailed coding practices, both of which are often considered to belong to the computer science domain. It is also related to project management, insofar as the management of construction can present considerable challenges.

BREAKDOWN OF TOPICS FOR SOFTWARE CONSTRUCTION
The breakdown of the Software Construction KA is presented below, together with brief descriptions of the major topics associated with it. Appropriate references are also given for each of the topics. Figure 1 gives a graphical representation of the top-level decomposition of the breakdown for this KA.

1. Software Construction Fundamentals
The fundamentals of software construction include:
Minimizing complexity
Anticipating change
Constructing for verification
Standards in construction
The first three concepts apply to design as well as to construction. The following sections define these concepts and describe how they apply to construction.

1.1. Minimizing Complexity
[Bec99; Ben00; Hun00; Ker99; Mag93; McC04]
A major factor in how people convey intent to computers is the severely limited ability of people to hold complex structures and information in their working memories, especially over long periods of time. This leads to one of the strongest drivers in software construction: minimizing complexity. The need to reduce complexity applies to essentially every aspect of software construction, and is particularly critical to the process of verification and testing of software constructions.
In software construction, reduced complexity is achieved through emphasizing the creation of code that is simple and readable rather than clever.
Minimizing complexity is accomplished through making use of standards, which is discussed in topic 1.4 Standards in Construction, and through numerous specific techniques which are summarized in topic 3.3 Coding. It is also supported by the construction-focused quality techniques summarized in topic 3.6 Construction Quality.
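As a purely illustrative sketch (an invented example, not taken from the referenced texts), the two Java methods below are behaviorally equivalent for scores between 0 and 100; the second trades compactness for the simple, readable style that minimizing complexity favors.

    // Illustrative sketch (invented example): two equivalent ways to classify an
    // exam score. The readable version states each range explicitly and is easier
    // to review, test, and modify than the compact one.
    public class ComplexityExample {

        // "Clever": compact but hard to verify at a glance.
        static char gradeTerse(int s) {
            return "FFFFFFDCBAA".charAt(Math.min(s, 100) / 10);
        }

        // Simple and readable: the intent is visible in the code itself.
        static char gradeReadable(int score) {
            if (score >= 90) return 'A';
            if (score >= 80) return 'B';
            if (score >= 70) return 'C';
            if (score >= 60) return 'D';
            return 'F';
        }

        public static void main(String[] args) {
            for (int score : new int[] {45, 60, 75, 89, 100}) {
                System.out.println(score + ": " + gradeTerse(score) + " / " + gradeReadable(score));
            }
        }
    }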
1.2. Anticipating Change
[Ben00; Ker99; McC04]
Most software will change over time, and the anticipation of change drives many aspects of software construction. Software is unavoidably part of changing external environments, and changes in those outside environments affect software in diverse ways.
Anticipating change is supported by many specific techniques summarized in topic 3.3 Coding.

Figure 1. Breakdown of topics for the Software Construction KA: Software Construction Fundamentals (Minimizing Complexity, Anticipating Change, Constructing for Verification, Standards in Construction); Managing Construction (Construction Models, Construction Planning, Construction Measurement); Practical Considerations (Construction Design, Construction Languages, Coding, Construction Testing, Reuse, Construction Quality, Integration).

1.3. Constructing for Verification
[Ben00; Hun00; Ker99; Mag93; McC04]
Constructing for verification means building software in such a way that faults can be ferreted out readily by the software engineers writing the software, as well as during independent testing and operational activities. Specific techniques that support constructing for verification include following coding standards to support code reviews, unit testing, organizing code to support automated testing, and restricted use of complex or hard-to-understand language structures, among others.

1.4. Standards in Construction
[IEEE12207-95; McC04]
Standards that directly affect construction issues include:
Communication methods (for example, standards for document formats and contents)
Programming languages (for example, language standards for languages like Java and C++)
Platforms (for example, programmer interface standards for operating system calls)
Tools (for example, diagrammatic standards for notations like UML (Unified Modeling Language))
Use of external standards. Construction depends on the use of external standards for construction languages, construction tools, technical interfaces, and interactions between Software Construction and other KAs. Standards come from numerous sources, including hardware and software interface specifications such as the Object Management Group (OMG) and international organizations such as the IEEE or ISO.
Use of internal standards. Standards may also be created on an organizational basis at the corporate level or for use on specific projects. These standards support coordination of group activities, minimizing complexity, anticipating change, and constructing for verification.
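A minimal sketch of constructing for verification (topic 1.3), using an invented example rather than one drawn from the references: the computation is isolated in a small, side-effect-free method that an automated test can drive, and an explicit precondition check makes faulty inputs fail fast instead of producing silently wrong results.

    // Illustrative sketch (invented example): code organized so that faults can be
    // exposed readily by automated checks.
    public class VerifiableRounding {

        // Rounds a price to the nearest cent; rejects invalid input explicitly.
        static long toCents(double price) {
            if (price < 0.0 || Double.isNaN(price)) {
                throw new IllegalArgumentException("price must be a non-negative number: " + price);
            }
            return Math.round(price * 100.0);
        }

        public static void main(String[] args) {
            // Tiny self-checks a test harness could run on every build
            // (run with the -ea flag to enable Java assertions).
            assert toCents(19.99) == 1999;
            assert toCents(0.0) == 0;
            System.out.println("toCents(19.99) = " + toCents(19.99));
        }
    }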
2. Managing Construction

2.1. Construction Models
[Bec99; McC04]
Numerous models have been created to develop software, some of which emphasize construction more than others.
Some models are more linear from the construction point of view, such as the waterfall and staged-delivery life cycle models. These models treat construction as an activity which occurs only after significant prerequisite work has been completed—including detailed requirements work, extensive design work, and detailed planning. The more linear approaches tend to emphasize the activities that precede construction (requirements and design), and tend to create more distinct separations between the activities. In these models, the main emphasis of construction may be coding.
Other models are more iterative, such as evolutionary prototyping, Extreme Programming, and Scrum. These approaches tend to treat construction as an activity that occurs concurrently with other software development activities, including requirements, design, and planning, or overlaps them. These approaches tend to mix design, coding, and testing activities, and they often treat the combination of activities as construction.
Consequently, what is considered to be "construction" depends to some degree on the life cycle model used.

2.2. Construction Planning
[Bec99; McC04]
The choice of construction method is a key aspect of the construction planning activity. The choice of construction method affects the extent to which construction prerequisites are performed, the order in which they are performed, and the degree to which they are expected to be completed before construction work begins.
The approach to construction affects the project's ability to reduce complexity, anticipate change, and construct for verification. Each of these objectives may also be addressed at the process, requirements, and design levels—but they will also be influenced by the choice of construction method.
Construction planning also defines the order in which components are created and integrated, the software quality management processes, the allocation of task assignments to specific software engineers, and the other tasks, according to the chosen method.

2.3. Construction Measurement
[McC04]
Numerous construction activities and artifacts can be measured, including code developed, code modified, code reused, code destroyed, code complexity, code inspection statistics, fault-fix and fault-find rates, effort, and scheduling. These measurements can be useful for purposes of managing construction, ensuring quality during construction, improving the construction process, as well as for other reasons. See the Software Engineering Process KA for more on measurements.

3. Practical Considerations
Construction is an activity in which the software has to come to terms with arbitrary and chaotic real-world constraints, and to do so exactly. Due to its proximity to real-world constraints, construction is more driven by practical considerations than some other KAs, and software engineering is perhaps most craft-like in the construction area.

3.1. Construction Design
[Bec99; Ben00; Hun00; IEEE12207-95; Mag93; McC04]
Some projects allocate more design activity to construction; others to a phase explicitly focused on design. Regardless of the exact allocation, some detailed design work will occur at the construction level, and that design work tends to be dictated by immovable constraints imposed by the real-world problem that is being addressed by the software.
Just as construction workers building a physical structure must make small-scale modifications to account for unanticipated gaps in the builder's plans, software construction workers must make modifications on a smaller or larger scale to flesh out details of the software design during construction.
The details of the design activity at the construction level are essentially the same as described in the Software Design KA, but they are applied on a smaller scale.

3.2. Construction Languages
[Hun00; McC04]
Construction languages include all forms of communication by which a human can specify an executable problem solution to a computer.
The simplest type of construction language is a configuration language, in which software engineers choose from a limited set of predefined options to create new or custom software installations. The text-based configuration files used in both the Windows and Unix operating systems are examples of this, and the menu-style selection lists of some program generators constitute another.
Toolkit languages are used to build applications out of toolkits (integrated sets of application-specific reusable parts), and are more complex than configuration languages. Toolkit languages may be explicitly defined as application programming languages (for example, scripts), or may simply be implied by the set of interfaces of a toolkit.
Programming languages are the most flexible type of construction languages. They also contain the least amount of information about specific application areas and development processes, and so require the most training and skill to use effectively.
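The configuration-language idea can be sketched as follows. This is an invented example (the option names are hypothetical); it uses the standard java.util.Properties class to interpret a small key=value configuration text of the kind found on both Windows and Unix systems.

    import java.io.StringReader;
    import java.util.Properties;

    // Illustrative sketch (invented example): the engineer only selects among
    // predefined options; the program interprets the choices rather than being modified.
    public class ConfigLanguageExample {
        public static void main(String[] args) throws Exception {
            String config = "log.level=DEBUG\nmax.connections=25\ncache.enabled=true\n";

            Properties options = new Properties();
            options.load(new StringReader(config));   // parse the key=value text

            int maxConnections = Integer.parseInt(options.getProperty("max.connections", "10"));
            boolean cacheEnabled = Boolean.parseBoolean(options.getProperty("cache.enabled", "false"));

            System.out.println("log level: " + options.getProperty("log.level"));
            System.out.println("max connections: " + maxConnections);
            System.out.println("cache enabled: " + cacheEnabled);
        }
    }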
There are three general kinds of notation used for programming languages, namely:
Linguistic
Formal
Visual
Linguistic notations are distinguished in particular by the use of word-like strings of text to represent complex software constructions, and the combination of such word-like strings into patterns that have a sentence-like syntax. Properly used, each such string should have a strong semantic connotation providing an immediate intuitive understanding of what will happen when the underlying software construction is executed.
Formal notations rely less on intuitive, everyday meanings of words and text strings and more on definitions backed up by precise, unambiguous, and formal (or mathematical) definitions. Formal construction notations and formal methods are at the heart of most forms of system programming, where accuracy, time behavior, and testability are more important than ease of mapping into natural language. Formal constructions also use precisely defined ways of combining symbols that avoid the ambiguity of many natural language constructions.
Visual notations rely much less on the text-oriented notations of both linguistic and formal construction, and instead rely on direct visual interpretation and placement of visual entities that represent the underlying software. Visual construction tends to be somewhat limited by the difficulty of making "complex" statements using only movement of visual entities on a display. However, it can also be a powerful tool in cases where the primary programming task is simply to build and "adjust" a visual interface to a program, the detailed behavior of which has been defined earlier.

3.3. Coding
[Ben00; IEEE12207-95; McC04]
The following considerations apply to the software construction coding activity:
Techniques for creating understandable source code, including naming and source code layout
Use of classes, enumerated types, variables, named constants, and other similar entities
Use of control structures
Handling of error conditions—both planned errors and exceptions (input of bad data, for example)
Prevention of code-level security breaches (buffer overruns or array index overflows, for example)
Resource usage via use of exclusion mechanisms and discipline in accessing serially reusable resources (including threads or database locks)
Source code organization (into statements, routines, classes, packages, or other structures)
Code documentation
Code tuning
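Some of the coding considerations listed above, namely input validation, handling of planned error conditions and exceptions, and bounds checking to prevent array index overflows, can be sketched as follows (an invented example; the method and parameter names are hypothetical).

    // Illustrative sketch (invented example) of defensive coding practices.
    public class DefensiveCodingExample {

        // Returns the n-th configured server, or a fallback when the request is out of
        // range (a planned error condition reported to the caller, not an exception).
        static String serverAt(String[] servers, int index, String fallback) {
            if (servers == null || index < 0 || index >= servers.length) {
                return fallback;                       // explicit bounds check
            }
            return servers[index];
        }

        // Parses a port number, translating bad data into a well-defined exception.
        static int parsePort(String text) {
            try {
                int port = Integer.parseInt(text.trim());
                if (port < 1 || port > 65535) {
                    throw new IllegalArgumentException("port out of range: " + port);
                }
                return port;
            } catch (NumberFormatException e) {
                throw new IllegalArgumentException("not a number: " + text, e);
            }
        }

        public static void main(String[] args) {
            String[] servers = {"alpha", "beta"};
            System.out.println(serverAt(servers, 5, "default"));  // prints "default"
            System.out.println(parsePort(" 8080 "));              // prints 8080
        }
    }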
3.4. Construction Testing
[Bec99; Hun00; Mag93; McC04]
Construction involves two forms of testing, which are often performed by the software engineer who wrote the code:
Unit testing
Integration testing
The purpose of construction testing is to reduce the gap between the time at which faults are inserted into the code and the time those faults are detected. In some cases, construction testing is performed after code has been written. In other cases, test cases may be created before code is written.
Construction testing typically involves a subset of types of testing, which are described in the Software Testing KA. For instance, construction testing does not typically include system testing, alpha testing, beta testing, stress testing, configuration testing, usability testing, or other, more specialized kinds of testing.
Two standards have been published on the topic: IEEE Std 829-1998, IEEE Standard for Software Test Documentation, and IEEE Std 1008-1987, IEEE Standard for Software Unit Testing.
See also the corresponding sub-topics in the Software Testing KA, 2.1.1 Unit Testing and 2.1.2 Integration Testing, for more specialized reference material.

3.5. Reuse
[IEEE1517-99; Som05]
As stated in the introduction of (IEEE1517-99): "Implementing software reuse entails more than creating and using libraries of assets. It requires formalizing the practice of reuse by integrating reuse processes and activities into the software life cycle." However, reuse is important enough in software construction that it is included here as a topic.
The tasks related to reuse in software construction during coding and testing are:
The selection of the reusable units, databases, test procedures, or test data
The evaluation of code or test reusability
The reporting of reuse information on new code, test procedures, or test data

3.6. Construction Quality
[Bec99; Hun00; IEEE12207-95; Mag93; McC04]
Numerous techniques exist to ensure the quality of code as it is constructed. The primary techniques used for construction include:
Unit testing and integration testing (as mentioned in topic 3.4 Construction Testing)
Test-first development (see also the Software Testing KA, topic 2.2 Objectives of Testing)
Code stepping
Use of assertions
Debugging
Technical reviews (see also the Software Quality KA, sub-topic 2.3.2 Technical Reviews)
Static analysis (IEEE1028) (see also the Software Quality KA, topic 2.3 Reviews and Audits)
The specific technique or techniques selected depend on the nature of the software being constructed, as well as on the skill set of the software engineers performing the construction.
Construction quality activities are differentiated from other quality activities by their focus. Construction quality activities focus on code and on artifacts that are closely related to code (small-scale designs), as opposed to other artifacts that are less directly connected to the code, such as requirements, high-level designs, and plans.

3.7. Integration
[Bec99; IEEE12207-95; McC04]
A key activity during construction is the integration of separately constructed routines, classes, components, and subsystems. In addition, a particular software system may need to be integrated with other software or hardware systems.
Concerns related to construction integration include planning the sequence in which components will be integrated, creating scaffolding to support interim versions of the software, determining the degree of testing and quality work performed on components before they are integrated, and determining points in the project at which interim versions of the software are tested.
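As a hedged illustration of two of the techniques above, unit testing and test-first development, the following sketch assumes JUnit 4 is available on the class path; the class and method names are invented, and in a test-first style the tests would be written before the discountedCents method they exercise.

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    // Illustrative sketch (invented example, assuming JUnit 4 on the class path):
    // the tests below are written first, then the simplest code that makes them pass.
    public class PriceCalculatorTest {

        // The unit under test, nested here only to keep the sketch self-contained.
        static class PriceCalculator {
            // Applies a percentage discount and rounds to whole cents.
            static long discountedCents(long cents, int percentOff) {
                return Math.round(cents * (100 - percentOff) / 100.0);
            }
        }

        @Test
        public void tenPercentOffRoundsToNearestCent() {
            assertEquals(899, PriceCalculator.discountedCents(999, 10));
        }

        @Test
        public void zeroDiscountLeavesPriceUnchanged() {
            assertEquals(999, PriceCalculator.discountedCents(999, 0));
        }
    }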
Matrix of Topics vs. Reference Material
(The matrix maps each topic of the Software Construction breakdown, 1.1 Minimizing Complexity through 3.7 Integration, to the chapters and sections of [Bec99], [Ben00], [Hun00], [IEEE1517-99], [IEEE12207.0-96], [Ker99], [Mag93], [McC04], and [Som05] in which that topic is covered; see the original table for the detailed chapter- and section-level mappings.)
RECOMMENDED REFERENCES FOR SOFTWARE CONSTRUCTION
[Bec99] K. Beck, Extreme Programming Explained: Embrace Change, Addison-Wesley, 1999, Chap. 10, 12, 15, 16-18, 21.
[Ben00] J. Bentley, Programming Pearls, second ed., Addison-Wesley, 2000, Chap. 2-4, 6-11, 13, 14, pp. 175-176.
[Hun00] A. Hunt and D. Thomas, The Pragmatic Programmer, Addison-Wesley, 2000, Chap. 7, 8, 12, 14-21, 23, 33, 34, 36-40, 42, 43.
[IEEE1517-99] IEEE Std 1517-1999, IEEE Standard for Information Technology—Software Life Cycle Processes—Reuse Processes, IEEE, 1999.
[IEEE12207.0-96] IEEE/EIA 12207.0-1996//ISO/IEC 12207:1995, Industry Implementation of Int. Std. ISO/IEC 12207:95, Standard for Information Technology—Software Life Cycle Processes, IEEE, 1996.
[Ker99] B.W. Kernighan and R. Pike, The Practice of Programming, Addison-Wesley, 1999, Chap. 2, 3, 5, 6, 9.
[Mag93] S. Maguire, Writing Solid Code: Microsoft's Techniques for Developing Bug-Free C Software, Microsoft Press, 1993, Chap. 2-7.
[McC04] S. McConnell, Code Complete: A Practical Handbook of Software Construction, second ed., Microsoft Press, 2004.
[Som05] I. Sommerville, Software Engineering, seventh ed., Addison-Wesley, 2005.
APPENDIX A. LIST OF FURTHER READINGS
(Bar98) T.T. Barker, Writing Software Documentation: A Task-Oriented Approach, Allyn & Bacon, 1998.
(Bec02) K. Beck, Test-Driven Development: By Example, Addison-Wesley, 2002.
(Fow99) M. Fowler et al., Refactoring: Improving the Design of Existing Code, Addison-Wesley, 1999.
(How02) M. Howard and D.C. Leblanc, Writing Secure Code, Microsoft Press, 2002.
(Hum97b) W.S. Humphrey, Introduction to the Personal Software Process, Addison-Wesley, 1997.
(Mey97) B. Meyer, Object-Oriented Software Construction, second ed., Prentice Hall, 1997, Chap. 6, 10, 11.
(Set96) R. Sethi, Programming Languages: Concepts & Constructs, second ed., Addison-Wesley, 1996, Parts II-V.
APPENDIX B. LIST OF STANDARDS
(IEEE829-98) IEEE Std 829-1998, IEEE Standard for Software Test Documentation, IEEE, 1998.
(IEEE1008-87) IEEE Std 1008-1987 (R2003), IEEE Standard for Software Unit Testing, IEEE, 1987.
(IEEE1028-97) IEEE Std 1028-1997 (R2002), IEEE Standard for Software Reviews, IEEE, 1997.
(IEEE1517-99) IEEE Std 1517-1999, IEEE Standard for Information Technology—Software Life Cycle Processes—Reuse Processes, IEEE, 1999.
(IEEE12207.0-96) IEEE/EIA 12207.0-1996//ISO/IEC 12207:1995, Industry Implementation of Int. Std. ISO/IEC 12207:95, Standard for Information Technology—Software Life Cycle Processes, IEEE, 1996.
CHAPTER 5
SOFTWARE TESTING

ACRONYM
SRET Software Reliability Engineered Testing

INTRODUCTION
Testing is an activity performed for evaluating product quality, and for improving it, by identifying defects and problems.
Software testing consists of the dynamic verification of the behavior of a program on a finite set of test cases, suitably selected from the usually infinite executions domain, against the expected behavior.
In the above definition, italicized words correspond to key issues in identifying the Knowledge Area of Software Testing. In particular:
Dynamic: This term means that testing always implies executing the program on (valued) inputs. To be precise, the input value alone is not always sufficient to determine a test, since a complex, nondeterministic system might react to the same input with different behaviors, depending on the system state. In this KA, though, the term "input" will be maintained, with the implied convention that its meaning also includes a specified input state, in those cases in which it is needed. Different from testing and complementary to it are static techniques, as described in the Software Quality KA.
Finite: Even in simple programs, so many test cases are theoretically possible that exhaustive testing could require months or years to execute. This is why in practice the whole test set can generally be considered infinite. Testing always implies a trade-off between limited resources and schedules on the one hand and inherently unlimited test requirements on the other.
Selected: The many proposed test techniques differ essentially in how they select the test set, and software engineers must be aware that different selection criteria may yield vastly different degrees of effectiveness. How to identify the most suitable selection criterion under given conditions is a very complex problem; in practice, risk analysis techniques and test engineering expertise are applied.
Expected: It must be possible, although not always easy, to decide whether the observed outcomes of program execution are acceptable or not, otherwise the testing effort would be useless. The observed behavior may be checked against user expectations (commonly referred to as testing for validation), against a specification (testing for verification), or, finally, against the anticipated behavior from implicit requirements or reasonable expectations. See, in the Software Requirements KA, topic 6.4 Acceptance Tests.
The view of software testing has evolved towards a more constructive one. Testing is no longer seen as an activity which starts only after the coding phase is complete, with the limited purpose of detecting failures. Software testing is now seen as an activity which should encompass the whole development and maintenance process and is itself an important part of the actual product construction. Indeed, planning for testing should start with the early stages of the requirement process, and test plans and procedures must be systematically and continuously developed, and possibly refined, as development proceeds. These test planning and designing activities themselves constitute useful input for designers in highlighting potential weaknesses (like design oversights or contradictions, and omissions or ambiguities in the documentation).
It is currently considered that the right attitude towards quality is one of prevention: it is obviously much better to avoid problems than to correct them. Testing must be seen, then, primarily as a means for checking not only whether the prevention has been effective, but also for identifying faults in those cases where, for some reason, it has not been effective. It is perhaps obvious but worth recognizing that, even after successful completion of an extensive testing campaign, the software could still contain faults. The remedy for software failures experienced after delivery is provided by corrective maintenance actions. Software maintenance topics are covered in the Software Maintenance KA.
In the Software Quality KA (see topic 3.3 Software Quality Management Techniques), software quality management techniques are notably categorized into static techniques (no code execution) and dynamic techniques (code execution). Both categories are useful. This KA focuses on dynamic techniques.
Software testing is also related to software construction (see topic 3.4 Construction Testing in that KA). Unit and integration testing are intimately related to software construction, if not part of it.

BREAKDOWN OF TOPICS
The breakdown of topics for the Software Testing KA is shown in Figure 1.
The first subarea describes Software Testing Fundamentals. It covers the basic definitions in the field of software testing, the basic terminology and key issues, and its relationship with other activities.
The second subarea, Test Levels, consists of two (orthogonal) topics: 2.1 lists the levels in which the testing of large software is traditionally subdivided, and 2.2 considers testing for specific conditions or properties and is referred to as objectives of testing. Not all types of testing apply to every software product, nor has every possible type been listed.
The test target and test objective together determine how the test set is identified, both with regard to its consistency—how much testing is enough for achieving the stated objective—and its composition—which test cases should be selected for achieving the stated objective (although usually the "for achieving the stated objective" part is left implicit and only the first part of the two italicized questions above is posed). Criteria for addressing the first question are referred to as test adequacy criteria, while those addressing the second question are the test selection criteria.
Several Test Techniques have been developed in the past few decades, and new ones are still being proposed. Generally accepted techniques are covered in subarea 3.
Test-related Measures are dealt with in subarea 4.
Finally, issues relative to Test Process are covered in subarea 5.

Figure 1. Breakdown of topics for the Software Testing KA: Software Testing Fundamentals (Testing-Related Terminology; Key Issues; Relationships of Testing to Other Activities); Test Levels (The Target of the Test; Objectives of Testing); Test Techniques (Based on the Tester's Intuition and Experience; Specification-based; Code-based; Fault-based; Usage-based; Based on the Nature of the Application; Selecting and Combining Techniques); Test-Related Measures (Evaluation of the Program Under Test; Evaluation of the Tests Performed); Test Process (Practical Considerations; Test Activities).

1. Software Testing Fundamentals

1.1. Testing-related terminology

1.1.1. Definitions of testing and related terminology
[Bei90:c1; Jor02:c2; Lyu96:c2s2.2] (IEEE610.12-90)
A comprehensive introduction to the Software Testing KA is provided in the recommended references.

1.1.2. Faults vs. Failures
[Jor02:c2; Lyu96:c2s2.2; Per95:c1; Pfl01:c8] (IEEE610.12-90; IEEE982.1-88)
Many terms are used in the software engineering literature to describe a malfunction, notably fault, failure, error, and several others. This terminology is precisely defined in IEEE Standard 610.12-1990, Standard Glossary of Software Engineering Terminology (IEEE610-90), and is also discussed in the Software Quality KA. It is essential to clearly distinguish between the cause of a malfunction, for which the term fault or defect will be used here, and an undesired effect observed in the system's delivered service, which will be called a failure. Testing can reveal failures, but it is the faults that can and must be removed.
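The distinction can be made concrete with a small invented example (not drawn from the references): the incorrect loop bound below is a fault, while a failure is only observed when the program is executed on an input for which the skipped last element changes the result.

    // Illustrative sketch (invented example) of the fault/failure distinction.
    public class FaultVsFailure {

        // Intended to sum every element, but the fault skips the last one.
        static int sum(int[] values) {
            int total = 0;
            for (int i = 0; i < values.length - 1; i++) {   // fault: should be i < values.length
                total += values[i];
            }
            return total;
        }

        public static void main(String[] args) {
            System.out.println(sum(new int[] {2, 3, 0}));  // prints 5: no failure observed
            System.out.println(sum(new int[] {2, 3, 4}));  // prints 5 instead of 9: a failure
        }
    }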
However, it should be recognized that the cause of a failure cannot always be unequivocally identified. No theoretical criteria exist to definitively determine what fault caused the observed failure. It might be said that it was the fault that had to be modified to remove the problem, but other modifications could have worked just as well. To avoid ambiguity, some authors prefer to speak of failure-causing inputs (Fra98) instead of faults—that is, those sets of inputs that cause a failure to appear.

1.2. Key issues

1.2.1. Test selection criteria/Test adequacy criteria (or stopping rules)
[Pfl01:c8s7.3; Zhu97:s1.1] (Wey83; Wey91; Zhu97)
A test selection criterion is a means of deciding what a suitable set of test cases should be. A selection criterion can be used for selecting the test cases or for checking whether a selected test suite is adequate—that is, to decide whether the testing can be stopped. See also the sub-topic Termination, under topic 5.1 Practical considerations.

1.2.2. Testing effectiveness/Objectives for testing
[Bei90:c1s1.4; Per95:c21] (Fra98)
Testing is the observation of a sample of program executions. Sample selection can be guided by different objectives: it is only in light of the objective pursued that the effectiveness of the test set can be evaluated.

1.2.3. Testing for defect identification
[Bei90:c1; Kan99:c1]
In testing for defect identification, a successful test is one which causes the system to fail. This is quite different from testing to demonstrate that the software meets its specifications or other desired properties, in which case testing is successful if no (significant) failures are observed.

1.2.4. The oracle problem
[Bei90:c1] (Ber96, Wey83)
An oracle is any (human or mechanical) agent which decides whether a program behaved correctly in a given test, and accordingly produces a verdict of "pass" or "fail." There exist many different kinds of oracles, and oracle automation can be very difficult and expensive.

1.2.5. Theoretical and practical limitations of testing
[Kan99:c2] (How76)
Testing theory warns against ascribing an unjustified level of confidence to a series of passed tests. Unfortunately, most established results of testing theory are negative ones, in that they state what testing can never achieve as opposed to what it actually achieved. The most famous quotation in this regard is the Dijkstra aphorism that "program testing can be used to show the presence of bugs, but never to show their absence." The obvious reason is that complete testing is not feasible in real software. Because of this, testing must be driven based on risk and can be seen as a risk management strategy.

1.2.6. The problem of infeasible paths
[Bei90:c3]
Infeasible paths, the control flow paths that cannot be exercised by any input data, are a significant problem in path-oriented testing, and particularly in the automated derivation of test inputs for code-based testing techniques.

1.2.7. Testability
[Bei90:c3, c13] (Bac90; Ber96a; Voa95)
The term "software testability" has two related but different meanings: on the one hand, it refers to the degree to which it is easy for software to fulfill a given test coverage criterion, as in (Bac90); on the other hand, it is defined as the likelihood, possibly measured statistically, that the software will expose a failure under testing, if it is faulty, as in (Voa95, Ber96a). Both meanings are important.

1.3. Relationships of testing to other activities
Software testing is related to but different from static software quality management techniques, proofs of correctness, debugging, and programming. However, it is informative to consider testing from the point of view of software quality analysts and of certifiers.
Testing vs. Static Software Quality Management techniques. See also the Software Quality KA, subarea 2, Software Quality Management Processes. [Bei90:c1; Per95:c17] (IEEE1008-87)
Testing vs. Correctness Proofs and Formal Verification [Bei90:c1s5; Pfl01:c8]
Testing vs. Debugging. See also the Software Construction KA, topic 3.4 Construction Testing [Bei90:c1s2.1] (IEEE1008-87)
Testing vs. Programming. See also the Software Construction KA, topic 3.4 Construction Testing [Bei90:c1s2.3]
Testing and Certification (Wak99)
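To make the oracle notion of sub-topic 1.2.4 concrete, the sketch below (an invented example; the conversion function and tolerance are hypothetical) compares each observed output with an expected value and produces a pass or fail verdict for the selected test cases.

    // Illustrative sketch (invented example): a very simple automated oracle.
    public class SimpleOracle {

        // Program under test: converts kilometres to miles.
        static double kmToMiles(double km) {
            return km * 0.621371;
        }

        // The oracle: compares observed and expected output within a tolerance.
        static void check(String testId, double observed, double expected) {
            boolean pass = Math.abs(observed - expected) < 0.001;
            System.out.println(testId + ": " + (pass ? "pass" : "fail"));
        }

        public static void main(String[] args) {
            check("T1", kmToMiles(0.0), 0.0);
            check("T2", kmToMiles(10.0), 6.21371);
            check("T3", kmToMiles(42.195), 26.219);   // expected value rounded to three decimals
        }
    }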
2. Test Levels

2.1. The target of the test
Software testing is usually performed at different levels along the development and maintenance processes. That is to say, the target of the test can vary: a single module, a group of such modules (related by purpose, use, behavior, or structure), or a whole system. [Bei90:c1; Jor02:c13; Pfl01:c8] Three big test stages can be conceptually distinguished, namely Unit, Integration, and System. No process model is implied, nor are any of those three stages assumed to have greater importance than the other two.

2.1.1. Unit testing
[Bei90:c1; Per95:c17; Pfl01:c8s7.3] (IEEE1008-87)
Unit testing verifies the functioning in isolation of software pieces which are separately testable. Depending on the context, these could be the individual subprograms or a larger component made of tightly related units. A test unit is defined more precisely in the IEEE Standard for Software Unit Testing (IEEE1008-87), which also describes an integrated approach to systematic and documented unit testing. Typically, unit testing occurs with access to the code being tested and with the support of debugging tools, and might involve the programmers who wrote the code.

2.1.2. Integration testing
[Jor02:c13, c14; Pfl01:c8s7.4]
Integration testing is the process of verifying the interaction between software components. Classical integration testing strategies, such as top-down or bottom-up, are used with traditional, hierarchically structured software.
Modern systematic integration strategies are rather architecture-driven, which implies integrating the software components or subsystems based on identified functional threads. Integration testing is a continuous activity, at each stage of which software engineers must abstract away lower-level perspectives and concentrate on the perspectives of the level they are integrating. Except for small, simple software, systematic, incremental integration testing strategies are usually preferred to putting all the components together at once, which is pictorially called "big bang" testing.

2.1.3. System testing
[Jor02:c15; Pfl01:c9]
System testing is concerned with the behavior of a whole system. The majority of functional failures should already have been identified during unit and integration testing. System testing is usually considered appropriate for comparing the system to the non-functional system requirements, such as security, speed, accuracy, and reliability. External interfaces to other applications, utilities, hardware devices, or the operating environment are also evaluated at this level. See the Software Requirements KA for more information on functional and non-functional requirements.
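A common way to exercise a unit in isolation before the real components are integrated is to replace its dependencies with stubs. The following Java sketch is an invented example (ExchangeRateService and InvoiceConverter are hypothetical names); the stubbed rate makes the unit's behavior deterministic, and a later integration test would exercise the converter together with the real service.

    // Illustrative sketch (invented example): unit-level testing with a stub.
    public class StubExample {

        interface ExchangeRateService {
            double rate(String from, String to);   // the real service might query a remote system
        }

        static class InvoiceConverter {
            private final ExchangeRateService rates;
            InvoiceConverter(ExchangeRateService rates) { this.rates = rates; }
            double convert(double amount, String from, String to) {
                return amount * rates.rate(from, to);
            }
        }

        public static void main(String[] args) {
            // Stub returning a fixed rate so the converter can be verified deterministically.
            ExchangeRateService stub = (from, to) -> 1.25;
            InvoiceConverter converter = new InvoiceConverter(stub);

            double result = converter.convert(100.0, "EUR", "USD");
            System.out.println(result == 125.0 ? "unit test passed" : "unit test FAILED, got " + result);
        }
    }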
2.2. Objectives of Testing
[Per95:c8; Pfl01:c9s8.3]
Testing is conducted in view of a specific objective, which is stated more or less explicitly, and with varying degrees of precision. Stating the objective in precise, quantitative terms allows control to be established over the test process.
Testing can be aimed at verifying different properties. Test cases can be designed to check that the functional specifications are correctly implemented, which is variously referred to in the literature as conformance testing, correctness testing, or functional testing. However, several other nonfunctional properties may be tested as well, including performance, reliability, and usability, among many others.
Other important objectives for testing include (but are not limited to) reliability measurement, usability evaluation, and acceptance, for which different approaches would be taken. Note that the test objective varies with the test target; in general, different purposes are addressed at different levels of testing.
References recommended above for this topic describe the set of potential test objectives. The sub-topics listed below are those most often cited in the literature. Note that some kinds of testing are more appropriate for custom-made software packages, installation testing, for example; and others for generic products, like beta testing.

2.2.1. Acceptance/qualification testing
[Per95:c10; Pfl01:c9s8.5] (IEEE12207.0-96:s5.3.9)
Acceptance testing checks the system behavior against the customer's requirements, however these may have been expressed; the customers undertake, or specify, typical tasks to check that their requirements have been met, or that the organization has identified these for the target market for the software. This testing activity may or may not involve the developers of the system.

2.2.2. Installation testing
[Per95:c9; Pfl01:c9s8.6]
Usually after completion of software and acceptance testing, the software can be verified upon installation in the target environment. Installation testing can be viewed as system testing conducted once again according to hardware configuration requirements. Installation procedures may also be verified.

2.2.3. Alpha and beta testing
[Kan99:c13]
Before the software is released, it is sometimes given to a small, representative set of potential users for trial use, either in-house (alpha testing) or external (beta testing). These users report problems with the product. Alpha and beta use is often uncontrolled, and is not always referred to in a test plan.

2.2.4. Conformance testing/Functional testing/Correctness testing
[Kan99:c7; Per95:c8] (Wak99)
Conformance testing is aimed at validating whether or not the observed behavior of the tested software conforms to its specifications.

2.2.5. Reliability achievement and evaluation
[Lyu96:c7; Pfl01:c9s8.4] (Pos96)
In helping to identify faults, testing is a means to improve reliability. By contrast, by randomly generating test cases according to the operational profile, statistical measures of reliability can be derived. Using reliability growth models, both objectives can be pursued together (see also sub-topic 4.1.4 Life test, reliability evaluation).

2.2.6. Regression testing
[Kan99:c7; Per95:c11, c12; Pfl01:c9s8.1] (Rot96)
According to (IEEE610.12-90), regression testing is the "selective retesting of a system or component to verify that modifications have not caused unintended effects..." In practice, the idea is to show that software which previously passed the tests still does. Beizer (Bei90) defines it as any repetition of tests intended to show that the software's behavior is unchanged, except insofar as required. Obviously a trade-off must be made between the assurance given by regression testing every time a change is made and the resources required to do that.
Regression testing can be conducted at each of the test levels described in topic 2.1 The target of the test and may apply to functional and nonfunctional testing.

2.2.7. Performance testing
[Per95:c17; Pfl01:c9s8.3] (Wak99)
This is specifically aimed at verifying that the software meets the specified performance requirements, for instance, capacity and response time. A specific kind of performance testing is volume testing (Per95:p185, p487; Pfl01:p401), in which internal program or system limitations are tried.

2.2.8. Stress testing
[Per95:c17; Pfl01:c9s8.3]
Stress testing exercises software at the maximum design load, as well as beyond it.

2.2.9. Back-to-back testing
A single test set is performed on two implemented versions of a software product, and the results are compared.

2.2.10. Recovery testing
[Per95:c17; Pfl01:c9s8.3]
Recovery testing is aimed at verifying software restart capabilities after a "disaster."

2.2.11. Configuration testing
[Kan99:c8; Pfl01:c9s8.3]
In cases where software is built to serve different users, configuration testing analyzes the software under the various specified configurations.

2.2.12. Usability testing
[Per95:c8; Pfl01:c9s8.3]
This process evaluates how easy it is for end-users to use and learn the software, including user documentation; how effectively the software functions in supporting user tasks; and, finally, its ability to recover from user errors.

2.2.13. Test-driven development
[Bec02]
Test-driven development is not a test technique per se; it promotes the use of tests as a surrogate for a requirements specification document rather than as an independent check that the software has correctly implemented the requirements.

3. Test Techniques
One of the aims of testing is to reveal as much potential for failure as possible, and many techniques have been developed to do this, which attempt to "break" the program, by running one or more tests drawn from identified classes of executions deemed equivalent. The leading principle underlying such techniques is to be as systematic as possible in identifying a representative set of program behaviors; for instance, considering subclasses of the input domain, scenarios, states, and dataflow.
It is difficult to find a homogeneous basis for classifying all techniques, and the one used here must be seen as a compromise. The classification is based on how tests are generated from the software engineer's intuition and experience, the specifications, the code structure, the (real or artificial) faults to be discovered, the field usage, or, finally, the nature of the application. Sometimes these techniques are classified as white-box, also called glass-box, if the tests rely on information about how the software has been designed or coded, or as black-box if the test cases rely only on the input/output behavior. One last category deals with combined use of two or more techniques. Obviously, these techniques are not used equally often by all practitioners. Included in the list are those that a software engineer should know.

3.1. Based on the software engineer's intuition and experience

3.1.1. Ad hoc testing
[Kan99:c1]
Perhaps the most widely practiced technique remains ad hoc testing: tests are derived relying on the software engineer's skill, intuition, and experience with similar programs. Ad hoc testing might be useful for identifying special tests, those not easily captured by formalized techniques.

3.1.2. Exploratory testing
Exploratory testing is defined as simultaneous learning, test design, and test execution; that is, the tests are not defined in advance in an established test plan, but are dynamically designed, executed, and modified. The effectiveness of exploratory testing relies on the software engineer's knowledge, which can be derived from various sources: observed product behavior during testing, familiarity with the application, the platform, the failure process, the type of possible faults and failures, the risk associated with a particular product, and so on. [Kan01:c3]

3.2. Specification-based techniques

3.2.1. Equivalence partitioning
[Jor02:c7; Kan99:c7]
The input domain is subdivided into a collection of subsets, or equivalence classes, which are deemed equivalent according to a specified relation, and a representative set of tests (sometimes only one) is taken from each class.
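A small invented example of equivalence partitioning follows: the input domain of the age classifier below is divided into four classes assumed to be treated uniformly, and one representative value is tested from each class. Boundary-value analysis, described next, would additionally select values at and near the class boundaries, such as 0, 12, 13, 64, and 65.

    // Illustrative sketch (invented example): one representative test per equivalence class.
    public class EquivalencePartitioningExample {

        // Program under test: child (< 13), adult (13-64), senior (65 and over); negatives rejected.
        static String category(int age) {
            if (age < 0) return "invalid";
            if (age < 13) return "child";
            if (age < 65) return "adult";
            return "senior";
        }

        public static void main(String[] args) {
            check(-5, "invalid");   // class 1: negative ages
            check(8, "child");      // class 2: 0..12
            check(30, "adult");     // class 3: 13..64
            check(70, "senior");    // class 4: 65 and over
        }

        static void check(int age, String expected) {
            String observed = category(age);
            System.out.println("age " + age + " -> " + observed
                    + (observed.equals(expected) ? " (pass)" : " (FAIL, expected " + expected + ")"));
        }
    }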
3.2.2. Boundary-value analysis
[Jor02:c6; Kan99:c7]
Test cases are chosen on and near the boundaries of the input domain of variables, with the underlying rationale that many faults tend to concentrate near the extreme values of inputs. An extension of this technique is robustness testing, wherein test cases are also chosen outside the input domain of variables, to test program robustness to unexpected or erroneous inputs.

3.2.3. Decision table
[Bei90:c10s3] (Jor02)
Decision tables represent logical relationships between conditions (roughly, inputs) and actions (roughly, outputs). Test cases are systematically derived by considering every possible combination of conditions and actions. A related technique is cause-effect graphing. [Pfl01:c9]

3.2.4. Finite-state machine-based
[Bei90:c11; Jor02:c8]
By modeling a program as a finite state machine, tests can be selected in order to cover states and transitions on it.

3.2.5. Testing from formal specifications
[Zhu97:s2.2] (Ber91; Dic93; Hor95)
Giving the specifications in a formal language allows for automatic derivation of functional test cases, and, at the same time, provides a reference output, an oracle, for checking test results. Methods exist for deriving test cases from model-based (Dic93, Hor95) or algebraic specifications (Ber91).

3.2.6. Random testing
[Bei90:c13; Kan99:c7]
Tests are generated purely at random, not to be confused with statistical testing from the operational profile as described in sub-topic 3.5.1 Operational profile. This form of testing falls under the heading of the specification-based entry, since at least the input domain must be known, to be able to pick random points within it.

3.3. Code-based techniques

3.3.1. Control-flow-based criteria
[Bei90:c3; Jor02:c10] (Zhu97)
Control-flow-based coverage criteria are aimed at covering all the statements or blocks of statements in a program, or specified combinations of them. Several coverage criteria have been proposed, like condition/decision coverage. The strongest of the control-flow-based criteria is path testing, which aims to execute all entry-to-exit control flow paths in the flowgraph. Since path testing is generally not feasible because of loops, other less stringent criteria tend to be used in practice, such as statement testing, branch testing, and condition/decision testing. The adequacy of such tests is measured in percentages; for example, when all branches have been executed at least once by the tests, 100% branch coverage is said to have been achieved.

3.3.2. Data flow-based criteria
[Bei90:c5] (Jor02; Zhu97)
In data-flow-based testing, the control flowgraph is annotated with information about how the program variables are defined, used, and killed (undefined). The strongest criterion, all definition-use paths, requires that, for each variable, every control flow path segment from a definition of that variable to a use of that definition is executed. In order to reduce the number of paths required, weaker strategies such as all-definitions and all-uses are employed.

3.3.3. Reference models for code-based testing (flowgraph, call graph)
[Bei90:c3; Jor02:c5]
Although not a technique in itself, the control structure of a program is graphically represented using a flowgraph in code-based testing techniques. A flowgraph is a directed graph the nodes and arcs of which correspond to program elements. For instance, nodes may represent statements or uninterrupted sequences of statements, and arcs the transfer of control between nodes.

3.4. Fault-based techniques
(Mor90)
With different degrees of formalization, fault-based testing techniques devise test cases specifically aimed at revealing categories of likely or predefined faults.

3.4.1. Error guessing
[Kan99:c7]
In error guessing, test cases are specifically designed by software engineers trying to figure out the most plausible faults in a given program. A good source of information is the history of faults discovered in earlier projects, as well as the software engineer's expertise.

3.4.2. Mutation testing
[Per95:c17; Zhu97:s3.2-s3.3]
A mutant is a slightly modified version of the program under test, differing from it by a small, syntactic change. Every test case exercises both the original and all generated mutants: if a test case is successful in identifying the difference between the program and a mutant, the latter is said to be "killed." Originally conceived as a technique to evaluate a test set (see 4.2), mutation testing is also a testing criterion in itself: either tests are randomly generated until enough mutants have been killed, or tests are specifically designed to kill surviving mutants. In the latter case, mutation testing can also be categorized as a code-based technique. The underlying assumption of mutation testing, the coupling effect, is that by looking for simple syntactic faults, more complex but real faults will be found. For the technique to be effective, a large number of mutants must be automatically derived in a systematic way.
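A minimal invented illustration of mutation testing: the mutant below differs from the program under test by a single syntactic change, and a test case kills the mutant only if the two versions produce different results for it. The ratio of killed mutants to generated mutants is the mutation score discussed in sub-topic 4.2.3.

    // Illustrative sketch (invented example): a mutant and the test cases that do or
    // do not kill it.
    public class MutationExample {

        static int max(int a, int b) { return a > b ? a : b; }        // program under test
        static int maxMutant(int a, int b) { return a < b ? a : b; }  // mutant: ">" changed to "<"

        public static void main(String[] args) {
            // This test case does NOT kill the mutant: both versions return 7.
            System.out.println("max(7, 7): original=" + max(7, 7) + ", mutant=" + maxMutant(7, 7));

            // This test case kills the mutant: the original returns 9, the mutant returns 2.
            System.out.println("max(9, 2): original=" + max(9, 2) + ", mutant=" + maxMutant(9, 2));
        }
    }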
as closely as possible. The idea is to infer, from the observed test results, the future reliability of the software when in actual use. To do this, inputs are assigned a probability distribution, or profile, according to their occurrence in actual operation.

3.5.2. Software Reliability Engineered Testing
[Lyu96:c6]
Software Reliability Engineered Testing (SRET) is a testing method encompassing the whole development process, whereby testing is "designed and guided by reliability objectives and expected relative usage and criticality of different functions in the field."

3.6. Techniques based on the nature of the application
The above techniques apply to all types of software. However, for some kinds of applications, some additional know-how is required for test derivation. A list of a few specialized testing fields is provided here, based on the nature of the application under test:
- Object-oriented testing [Jor02:c17; Pfl01:c8s7.5] (Bin00)
- Component-based testing
- Web-based testing
- GUI testing [Jor02]
- Testing of concurrent programs (Car91)
- Protocol conformance testing (Pos96; Boc94)
- Testing of real-time systems (Sch94)
- Testing of safety-critical systems (IEEE1228-94)

3.7. Selecting and combining techniques

3.7.1. Functional and structural
[Bei90:c1s2.2; Jor02:c2, c9, c12; Per95:c17] (Pos96)
Specification-based and code-based test techniques are often contrasted as functional vs. structural testing. These two approaches to test selection are not to be seen as alternative but rather as complementary; in fact, they use different sources of information and have proved to highlight different kinds of problems. They could be used in combination, depending on budgetary considerations.

3.7.2. Deterministic vs. random
(Ham92; Lyu96:p541-547)
Test cases can be selected in a deterministic way, according to one of the various techniques listed, or randomly drawn from some distribution of inputs, such as is usually done in reliability testing. Several analytical and empirical comparisons have been conducted to analyze the conditions that make one approach more effective than the other.

4. Test-related measures
Sometimes, test techniques are confused with test objectives. Test techniques are to be viewed as aids which help to ensure the achievement of test objectives. For instance, branch coverage is a popular test technique. Achieving a specified branch coverage measure should not be considered the objective of testing per se: it is a means to improve the chances of finding failures by systematically exercising every program branch out of a decision point. To avoid such misunderstandings, a clear distinction should be made between test-related measures, which provide an evaluation of the program under test based on the observed test outputs, and those which evaluate the thoroughness of the test set. Additional information on measurement programs is provided in the Software Engineering Management KA, subarea 6, Software engineering measurement. Additional information on measures can be found in the Software Engineering Process KA, subarea 4, Process and product measurement.

Measurement is usually considered instrumental to quality analysis. Measurement may also be used to optimize the planning and execution of the tests. Test management can use several process measures to monitor progress. Measures relative to the test process for management purposes are considered in topic 5.1 Practical considerations.

4.1. Evaluation of the program under test (IEEE982.1-88)

4.1.1. Program measurements to aid in planning and designing testing
[Bei90:c7s4.2; Jor02:c9] (Ber96; IEEE982.1-88)
Measures based on program size (for example, source lines of code or function points) or on program structure (like complexity) are used to guide testing. Structural measures can also include measurements among program modules in terms of the frequency with which modules call each other.

4.1.2. Fault types, classification, and statistics
[Bei90:c2; Jor02:c2; Pfl01:c8] (Bei90; IEEE1044-93; Kan99; Lyu96)
The testing literature is rich in classifications and taxonomies of faults. To make testing more effective, it is important to know which types of faults could be found in the software under test, and the relative frequency with which these faults have occurred in the past. This information can be very useful in making quality predictions, as well as for process improvement. More information can be found in the Software Quality KA, topic 3.2 Defect characterization. An IEEE standard exists on how to classify software "anomalies" (IEEE1044-93).

4.1.3. Fault density
[Per95:c20] (IEEE982.1-88; Lyu96:c9)
A program under test can be assessed by counting and classifying the discovered faults by their types. For each fault class, fault density is measured as the ratio between the number of faults found and the size of the program.
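As a small numerical illustration of the fault density measure in sub-topic 4.1.3, a Python sketch follows; the fault counts, class names, and program size are invented for the example and are not taken from the Guide.

# Fault density (sub-topic 4.1.3): faults found per unit of program size,
# here expressed per KLOC. All numbers are hypothetical.
def fault_density(faults_found: int, size_kloc: float) -> float:
    return faults_found / size_kloc

faults_by_class = {"logic": 14, "interface": 6, "data handling": 4}
size_kloc = 48.0  # measured size of the program under test, in KLOC

for fault_class, count in faults_by_class.items():
    print(f"{fault_class}: {fault_density(count, size_kloc):.2f} faults/KLOC")
print(f"overall: {fault_density(sum(faults_by_class.values()), size_kloc):.2f} faults/KLOC")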
4.1.4. Life test, reliability evaluation
[Pfl01:c9] (Pos96:p146-154)
A statistical estimate of software reliability, which can be obtained by reliability achievement and evaluation (see sub-topic 2.2.5), can be used to evaluate a product and decide whether or not testing can be stopped.

4.1.5. Reliability growth models
[Lyu96:c7; Pfl01:c9] (Lyu96:c3, c4)
Reliability growth models provide a prediction of reliability based on the failures observed under reliability achievement and evaluation (see sub-topic 2.2.5). They assume, in general, that the faults that caused the observed failures have been fixed (although some models also accept imperfect fixes), and thus, on average, the product's reliability exhibits an increasing trend. There now exist dozens of published models. Many are laid down on some common assumptions, while others differ. Notably, these models are divided into failure-count and time-between-failure models.

4.2. Evaluation of the tests performed

4.2.1. Coverage/thoroughness measures
[Jor02:c9; Pfl01:c8] (IEEE982.1-88)
Several test adequacy criteria require that the test cases systematically exercise a set of elements identified in the program or in the specifications (see subarea 3). To evaluate the thoroughness of the executed tests, testers can monitor the elements covered, so that they can dynamically measure the ratio between covered elements and their total number. For example, it is possible to measure the percentage of covered branches in the program flowgraph, or that of the functional requirements exercised among those listed in the specifications document. Code-based adequacy criteria require appropriate instrumentation of the program under test.

4.2.2. Fault seeding
[Pfl01:c8] (Zhu97:s3.1)
Some faults are artificially introduced into the program before test. When the tests are executed, some of these seeded faults will be revealed, and possibly some faults which were already there will be as well. In theory, depending on which of the artificial faults are discovered, and how many, testing effectiveness can be evaluated, and the remaining number of genuine faults can be estimated. In practice, statisticians question the distribution and representativeness of seeded faults relative to genuine faults and the small sample size on which any extrapolations are based. Some also argue that this technique should be used with great care, since inserting faults into software involves the obvious risk of leaving them there.

4.2.3. Mutation score
[Zhu97:s3.2-s3.3]
In mutation testing (see sub-topic 3.4.2), the ratio of killed mutants to the total number of generated mutants can be a measure of the effectiveness of the executed test set.

4.2.4. Comparison and relative effectiveness of different techniques
[Jor02:c9, c12; Per95:c17; Zhu97:s5] (Fra93; Fra98; Pos96:p64-72)
Several studies have been conducted to compare the relative effectiveness of different test techniques. It is important to be precise as to the property against which the techniques are being assessed; what, for instance, is the exact meaning given to the term "effectiveness"? Possible interpretations are: the number of tests needed to find the first failure, the ratio of the number of faults found through testing to all the faults found during and after testing, or how much reliability was improved. Analytical and empirical comparisons between different techniques have been conducted according to each of the notions of effectiveness specified above.

5. Test Process
Testing concepts, strategies, techniques, and measures need to be integrated into a defined and controlled process which is run by people. The test process supports testing activities and provides guidance to testing teams, from test planning to test output evaluation, in such a way as to provide justified assurance that the test objectives will be met cost-effectively.

5.1. Practical considerations

5.1.1. Attitudes/Egoless programming
[Bei90:c13s3.2; Pfl01:c8]
A very important component of successful testing is a collaborative attitude towards testing and quality assurance activities. Managers have a key role in fostering a generally favorable reception towards failure discovery during development and maintenance; for instance, by preventing a mindset of code ownership among programmers, so that they will not feel responsible for failures revealed by their code.

5.1.2. Test guides
[Kan01]
The testing phases could be guided by various aims, for example: in risk-based testing, which uses the product risks to prioritize and focus the test strategy; or in scenario-based testing, in which test cases are defined based on specified software scenarios.
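Sub-topics 4.2.2 and 4.2.3 both describe ratio-based evaluations of a test set. The sketch below shows a common textbook formulation of the seeding-based estimate together with the mutation score; the estimator is an assumption of this example rather than a formula prescribed by the Guide, and all numbers are hypothetical.

# Fault seeding (4.2.2): if tests reveal s of the S seeded faults and n genuine
# faults, a crude estimate of the total genuine faults is n * S / s, so the
# estimated number still remaining is n * S / s - n.
def estimated_remaining_faults(seeded_total: int, seeded_found: int, genuine_found: int) -> float:
    if seeded_found == 0:
        raise ValueError("no seeded faults found; the estimate is undefined")
    estimated_total_genuine = genuine_found * seeded_total / seeded_found
    return estimated_total_genuine - genuine_found

# Mutation score (4.2.3): ratio of killed mutants to generated mutants.
def mutation_score(killed: int, generated: int) -> float:
    return killed / generated

print(estimated_remaining_faults(seeded_total=25, seeded_found=20, genuine_found=12))  # 3.0
print(mutation_score(killed=168, generated=200))  # 0.84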
5.1.3. Test process management
[Bec02:III; Per95:c1-c4; Pfl01:c9] (IEEE1074-97; IEEE12207.0-96:s5.3.9, s5.4.2, s6.4, s6.5)
Test activities conducted at different levels (see subarea 2. Test levels) must be organized, together with people, tools, policies, and measurements, into a well-defined process which is an integral part of the life cycle. In IEEE/EIA Standard 12207.0, testing is not described as a stand-alone process, but principles for testing activities are included along with both the five primary life cycle processes and the supporting process. In IEEE Std 1074, testing is grouped with other evaluation activities as integral to the entire life cycle.

5.1.4. Test documentation and work products
[Bei90:c13s5; Kan99:c12; Per95:c19; Pfl01:c9s8.8] (IEEE829-98)
Documentation is an integral part of the formalization of the test process. The IEEE Standard for Software Test Documentation (IEEE829-98) provides a good description of test documents and of their relationship with one another and with the testing process. Test documents may include, among others, Test Plan, Test Design Specification, Test Procedure Specification, Test Case Specification, Test Log, and Test Incident or Problem Report. The software under test is documented as the Test Item. Test documentation should be produced and continually updated, to the same level of quality as other types of documentation in software engineering.

5.1.5. Internal vs. independent test team
[Bei90:c13s2.2-c13s2.3; Kan99:c15; Per95:c4; Pfl01:c9]
Formalization of the test process may involve formalizing the test team organization as well. The test team can be composed of internal members (that is, on the project team, involved or not in software construction), of external members, in the hope of bringing in an unbiased, independent perspective, or, finally, of both internal and external members. Considerations of costs, schedule, maturity levels of the involved organizations, and criticality of the application may determine the decision.

5.1.6. Cost/effort estimation and other process measures
[Per95:c4, c21] (Per95:Appendix B; Pos96:p139-145; IEEE982.1-88)
Several measures related to the resources spent on testing, as well as to the relative fault-finding effectiveness of the various test phases, are used by managers to control and improve the test process. These test measures may cover such aspects as number of test cases specified, number of test cases executed, number of test cases passed, and number of test cases failed, among others.

Evaluation of test phase reports can be combined with root-cause analysis to evaluate test process effectiveness in finding faults as early as possible. Such an evaluation could be associated with the analysis of risks. Moreover, the resources that are worth spending on testing should be commensurate with the use/criticality of the application: different techniques have different costs and yield different levels of confidence in product reliability.

5.1.7. Termination
[Bei90:c2s2.4; Per95:c2]
A decision must be made as to how much testing is enough and when a test stage can be terminated. Thoroughness measures, such as achieved code coverage or functional completeness, as well as estimates of fault density or of operational reliability, provide useful support, but are not sufficient in themselves. The decision also involves considerations about the costs and risks incurred by the potential for remaining failures, as opposed to the costs implied by continuing to test. See also sub-topic 1.2.1 Test selection criteria/Test adequacy criteria.

5.1.8. Test reuse and test patterns
[Bei90:c13s5]
To carry out testing or maintenance in an organized and cost-effective way, the means used to test each part of the software should be reused systematically. This repository of test materials must be under the control of software configuration management, so that changes to software requirements or design can be reflected in changes to the scope of the tests conducted.

The test solutions adopted for testing some application types under certain circumstances, with the motivations behind the decisions taken, form a test pattern which can itself be documented for later reuse in similar projects.

5.2. Test Activities
Under this topic, a brief overview of test activities is given; as often implied by the following description, successful management of test activities strongly depends on the Software Configuration Management process.

5.2.1. Planning
[Kan99:c12; Per95:c19; Pfl01:c8s7.6] (IEEE829-98:s4; IEEE1008-87:s1-s3)
Like any other aspect of project management, testing activities must be planned. Key aspects of test planning include coordination of personnel, management of available test facilities and equipment (which may include magnetic media, test plans and procedures), and planning for possible undesirable outcomes. If more than one baseline of the software is being maintained, then a major planning consideration is the time and effort needed to ensure that the test environment is set to the proper configuration.

5.2.2. Test-case generation
[Kan99:c7] (Pos96:c2; IEEE1008-87:s4, s5)
Generation of test cases is based on the level of testing to be performed and the particular testing techniques. Test
cases should be under the control of software configuration management and include the expected results for each test.

5.2.3. Test environment development
[Kan99:c11]
The environment used for testing should be compatible with the software engineering tools. It should facilitate development and control of test cases, as well as logging and recovery of expected results, scripts, and other testing materials.

5.2.4. Execution
[Bei90:c13; Kan99:c11] (IEEE1008-87:s6, s7)
Execution of tests should embody a basic principle of scientific experimentation: everything done during testing should be performed and documented clearly enough that another person could replicate the results. Hence, testing should be performed in accordance with documented procedures using a clearly defined version of the software under test.

5.2.5. Test results evaluation
[Per95:c20,c21] (Pos96:p18-20, p131-138)
The results of testing must be evaluated to determine whether or not the test has been successful. In most cases, "successful" means that the software performed as expected and did not have any major unexpected outcomes. Not all unexpected outcomes are necessarily faults, however, but could be judged to be simply noise. Before a failure can be removed, an analysis and debugging effort is needed to isolate, identify, and describe it. When test results are particularly important, a formal review board may be convened to evaluate them.

5.2.6. Problem reporting/Test log
[Kan99:c5; Per95:c20] (IEEE829-98:s9-s10)
Testing activities can be entered into a test log to identify when a test was conducted, who performed the test, what software configuration was the basis for testing, and other relevant identification information. Unexpected or incorrect test results can be recorded in a problem-reporting system, the data of which form the basis for later debugging and for fixing the problems that were observed as failures during testing. Also, anomalies not classified as faults could be documented in case they later turn out to be more serious than first thought. Test reports are also an input to the change management request process (see the Software Configuration Management KA, subarea 3, Software configuration control).

5.2.7. Defect tracking
[Kan99:c6]
Failures observed during testing are most often due to faults or defects in the software. Such defects can be analyzed to determine when they were introduced into the software, what kind of error caused them to be created (poorly defined requirements, incorrect variable declaration, memory leak, programming syntax error, for example), and when they could have been first observed in the software. Defect-tracking information is used to determine what aspects of software engineering need improvement and how effective previous analyses and testing have been.
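Sub-topics 5.1.6, 5.2.6, and 5.2.7 list the identification data a test log should capture and the counts managers track. A minimal sketch of such a record and of the derived process measures follows; the field names and values are hypothetical, not prescribed by the Guide or by IEEE 829.

from dataclasses import dataclass
from datetime import date

@dataclass
class TestLogEntry:
    test_case_id: str
    executed_on: date
    tester: str
    software_configuration: str   # baseline/version the test was run against
    passed: bool
    problem_report_id: str = ""   # set when a failure is recorded in the problem-reporting system

def summarize(log):
    """Basic process measures: tests executed, passed, failed, and pass rate."""
    executed = len(log)
    passed = sum(1 for entry in log if entry.passed)
    return {"executed": executed, "passed": passed, "failed": executed - passed,
            "pass rate": passed / executed if executed else 0.0}

log = [
    TestLogEntry("TC-014", date(2004, 5, 3), "j.doe", "build 1.2.0", True),
    TestLogEntry("TC-015", date(2004, 5, 3), "j.doe", "build 1.2.0", False, "PR-101"),
]
print(summarize(log))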
MATRIX OF TOPICS VS. REFERENCE MATERIAL

[Matrix of topics vs. reference material for the Software Testing KA: the columns are the recommended references [Bec02], [Bei90], [Jor02], [Kan99], [Kan01], [Lyu96], [Per95], [Pfl01], and [Zhu97]; the rows are the topics and sub-topics of subareas 1. Software Testing Fundamentals, 2. Test Levels, 3. Test Techniques, 4. Test-Related Measures, and 5. Test Process. Each cell lists the chapters or sections of the reference that cover the row's topic, matching the bracketed citations given with each topic in the text above.]
RECOMMENDED REFERENCES FOR SOFTWARE TESTING

[Bec02] K. Beck, Test-Driven Development by Example, Addison-Wesley, 2002.

[Bei90] B. Beizer, Software Testing Techniques, International Thomson Press, 1990, Chap. 1-3, 5, 7s4, 10s3, 11, 13.

[Jor02] P.C. Jorgensen, Software Testing: A Craftsman's Approach, second ed., CRC Press, 2004, Chap. 2, 5-10, 12-15, 17, 20.

[Kan99] C. Kaner, J. Falk, and H.Q. Nguyen, Testing Computer Software, second ed., John Wiley & Sons, 1999, Chap. 1, 2, 5-8, 11-13, 15.

[Kan01] C. Kaner, J. Bach, and B. Pettichord, Lessons Learned in Software Testing, Wiley Computer Publishing, 2001.

[Lyu96] M.R. Lyu, Handbook of Software Reliability Engineering, McGraw-Hill/IEEE, 1996, Chap. 2s2.2, 5-7.

[Per95] W. Perry, Effective Methods for Software Testing, John Wiley & Sons, 1995, Chap. 1-4, 9, 10-12, 17, 19-21.

[Pfl01] S.L. Pfleeger, Software Engineering: Theory and Practice, second ed., Prentice Hall, 2001, Chap. 8, 9.

[Zhu97] H. Zhu, P.A.V. Hall, and J.H.R. May, "Software Unit Test Coverage and Adequacy," ACM Computing Surveys, vol. 29, iss. 4 (Sections 1, 2.2, 3.2, 3.3), Dec. 1997, pp. 366-427.
APPENDIX A. LIST OF FURTHER READINGS

(Bac90) R. Bache and M. Müllerburg, "Measures of Testability as a Basis for Quality Assurance," Software Engineering Journal, vol. 5, March 1990, pp. 86-92.

(Bei90) B. Beizer, Software Testing Techniques, International Thomson Press, second ed., 1990.

(Ber91) G. Bernot, M.C. Gaudel, and B. Marre, "Software Testing Based On Formal Specifications: a Theory and a Tool," Software Engineering Journal, Nov. 1991, pp. 387-405.

(Ber96) A. Bertolino and M. Marrè, "How Many Paths Are Needed for Branch Testing?" Journal of Systems and Software, vol. 35, iss. 2, 1996, pp. 95-106.

(Ber96a) A. Bertolino and L. Strigini, "On the Use of Testability Measures for Dependability Assessment," IEEE Transactions on Software Engineering, vol. 22, iss. 2, Feb. 1996, pp. 97-108.

(Bin00) R.V. Binder, Testing Object-Oriented Systems: Models, Patterns, and Tools, Addison-Wesley, 2000.

(Boc94) G.V. Bochmann and A. Petrenko, "Protocol Testing: Review of Methods and Relevance for Software Testing," presented at ACM Proc. Int'l Symp. on Software Testing and Analysis (ISSTA '94), Seattle, Wash., 1994.

(Car91) R.H. Carver and K.C. Tai, "Replay and Testing for Concurrent Programs," IEEE Software, March 1991, pp. 66-74.

(Dic93) J. Dick and A. Faivre, "Automating the Generation and Sequencing of Test Cases from Model-Based Specifications," presented at FME '93: Industrial-Strength Formal Methods, LNCS 670, Springer-Verlag, 1993.

(Fran93) P. Frankl and E. Weyuker, "A Formal Analysis of the Fault Detecting Ability of Testing Methods," IEEE Transactions on Software Engineering, vol. 19, iss. 3, March 1993, p. 202.

(Fran98) P. Frankl, D. Hamlet, B. Littlewood, and L. Strigini, "Evaluating Testing Methods by Delivered Reliability," IEEE Transactions on Software Engineering, vol. 24, iss. 8, August 1998, pp. 586-601.

(Ham92) D. Hamlet, "Are We Testing for True Reliability?" IEEE Software, July 1992, pp. 21-27.

(Hor95) H. Horcher and J. Peleska, "Using Formal Specifications to Support Software Testing," Software Quality Journal, vol. 4, 1995, pp. 309-327.

(How76) W.E. Howden, "Reliability of the Path Analysis Testing Strategy," IEEE Transactions on Software Engineering, vol. 2, iss. 3, Sept. 1976, pp. 208-215.

(Jor02) P.C. Jorgensen, Software Testing: A Craftsman's Approach, second ed., CRC Press, 2004.

(Kan99) C. Kaner, J. Falk, and H.Q. Nguyen, Testing Computer Software, second ed., John Wiley & Sons, 1999.

(Lyu96) M.R. Lyu, Handbook of Software Reliability Engineering, McGraw-Hill/IEEE, 1996.

(Mor90) L.J. Morell, "A Theory of Fault-Based Testing," IEEE Transactions on Software Engineering, vol. 16, iss. 8, August 1990, pp. 844-857.

(Ost88) T.J. Ostrand and M.J. Balcer, "The Category-Partition Method for Specifying and Generating Functional Tests," Communications of the ACM, vol. 31, iss. 3, June 1988, pp. 676-686.

(Ost98) T. Ostrand, A. Anodide, H. Foster, and T. Goradia, "A Visual Test Development Environment for GUI Systems," presented at ACM Proc. Int'l Symp. on Software Testing and Analysis (ISSTA '98), Clearwater Beach, Florida, 1998.

(Per95) W. Perry, Effective Methods for Software Testing, John Wiley & Sons, 1995.

(Pfl01) S.L. Pfleeger, Software Engineering: Theory and Practice, second ed., Prentice-Hall, 2001, Chap. 8, 9.

(Pos96) R.M. Poston, Automating Specification-Based Software Testing, IEEE, 1996.

(Rot96) G. Rothermel and M.J. Harrold, "Analyzing Regression Test Selection Techniques," IEEE Transactions on Software Engineering, vol. 22, iss. 8, Aug. 1996, p. 529.

(Sch94) W. Schütz, "Fundamental Issues in Testing Distributed Real-Time Systems," Real-Time Systems Journal, vol. 7, iss. 2, Sept. 1994, pp. 129-157.

(Voa95) J.M. Voas and K.W. Miller, "Software Testability: The New Verification," IEEE Software, May 1995, pp. 17-28.

(Wak99) S. Wakid, D.R. Kuhn, and D.R. Wallace, "Toward Credible IT Testing and Certification," IEEE Software, July-Aug. 1999, pp. 39-47.

(Wey82) E.J. Weyuker, "On Testing Non-testable Programs," The Computer Journal, vol. 25, iss. 4, 1982, pp. 465-470.

(Wey83) E.J. Weyuker, "Assessing Test Data Adequacy through Program Inference," ACM Trans. on Programming Languages and Systems, vol. 5, iss. 4, October 1983, pp. 641-655.

(Wey91) E.J. Weyuker, S.N. Weiss, and D. Hamlet, "Comparison of Program Test Strategies," presented at Proc. Symp. on Testing, Analysis and Verification (TAV 4), Victoria, British Columbia, 1991.

(Zhu97) H. Zhu, P.A.V. Hall, and J.H.R. May, "Software Unit Test Coverage and Adequacy," ACM Computing Surveys, vol. 29, iss. 4, Dec. 1997, pp. 366-427.
APPENDIX B. LIST OF STANDARDS

(IEEE610.12-90) IEEE Std 610.12-1990 (R2002), IEEE Standard Glossary of Software Engineering Terminology, IEEE, 1990.

(IEEE829-98) IEEE Std 829-1998, Standard for Software Test Documentation, IEEE, 1998.

(IEEE982.1-88) IEEE Std 982.1-1988, IEEE Standard Dictionary of Measures to Produce Reliable Software, IEEE, 1988.

(IEEE1008-87) IEEE Std 1008-1987 (R2003), IEEE Standard for Software Unit Testing, IEEE, 1987.

(IEEE1044-93) IEEE Std 1044-1993 (R2002), IEEE Standard for the Classification of Software Anomalies, IEEE, 1993.

(IEEE1228-94) IEEE Std 1228-1994, Standard for Software Safety Plans, IEEE, 1994.

(IEEE12207.0-96) IEEE/EIA 12207.0-1996 // ISO/IEC 12207:1995, Industry Implementation of Int. Std. ISO/IEC 12207:95, Standard for Information Technology - Software Life Cycle Processes, IEEE, 1996.
CHAPTER 6
SOFTWARE MAINTENANCE

ACRONYMS
CMMI  Capability Maturity Model Integration
ICSM  International Conference on Software Maintenance
SCM   Software Configuration Management
SQA   Software Quality Assurance
V&V   Verification and Validation
Y2K   Year 2000

INTRODUCTION
Software development efforts result in the delivery of a software product which satisfies user requirements. Accordingly, the software product must change or evolve. Once in operation, defects are uncovered, operating environments change, and new user requirements surface. The maintenance phase of the life cycle begins following a warranty period or post-implementation support delivery, but maintenance activities occur much earlier.

Software maintenance is an integral part of a software life cycle. However, it has not, historically, received the same degree of attention that the other phases have. Historically, software development has had a much higher profile than software maintenance in most organizations. This is now changing, as organizations strive to squeeze the most out of their software development investment by keeping software operating as long as possible. Concerns about the Year 2000 (Y2K) rollover focused significant attention on the software maintenance phase, and the Open Source paradigm has brought further attention to the issue of maintaining software artifacts developed by others.

In the Guide, software maintenance is defined as the totality of activities required to provide cost-effective support to software. Activities are performed during the pre-delivery stage, as well as during the post-delivery stage. Pre-delivery activities include planning for post-delivery operations, for maintainability, and for logistics determination for transition activities. Post-delivery activities include software modification, training, and operating or interfacing to a help desk.

The Software Maintenance KA is related to all other aspects of software engineering. Therefore, this KA description is linked to all other chapters of the Guide.

BREAKDOWN OF TOPICS FOR SOFTWARE MAINTENANCE
The Software Maintenance KA breakdown of topics is shown in Figure 1.

1. Software Maintenance Fundamentals
This first section introduces the concepts and terminology that form an underlying basis to understanding the role and scope of software maintenance. The topics provide definitions and emphasize why there is a need for maintenance. Categories of software maintenance are critical to understanding its underlying meaning.

1.1. Definitions and Terminology
[IEEE1219-98:s3.1.12; IEEE12207.0-96:s3.1,s5.5; ISO14764-99:s6.1]
Software maintenance is defined in the IEEE Standard for Software Maintenance, IEEE 1219, as the modification of a software product after delivery to correct faults, to improve performance or other attributes, or to adapt the product to a modified environment. The standard also addresses maintenance activities prior to delivery of the software product, but only in an information appendix of the standard.

The IEEE/EIA 12207 standard for software life cycle processes essentially depicts maintenance as one of the primary life cycle processes, and describes maintenance as the process of a software product undergoing "modification to code and associated documentation due to a problem or the need for improvement. The objective is to modify the existing software product while preserving its integrity." ISO/IEC 14764, the international standard for software maintenance, defines software maintenance in the same terms as IEEE/EIA 12207 and emphasizes the pre-delivery aspects of maintenance, planning, for example.

1.2. Nature of Maintenance
[Pfl01:c11s11.2]
Software maintenance sustains the software product throughout its operational life cycle. Modification requests are logged and tracked, the impact of proposed changes is determined, code and other software artifacts are modified, testing is conducted, and a new version of the software product is released. Also, training and daily support are provided to users. Pfleeger [Pfl01] states that "maintenance has a broader scope, with more to track and control" than development.
A maintainer is defined by IEEE/EIA 12207 as an organization which performs maintenance activities [IEEE12207.0-96]. In this KA, the term will sometimes refer to individuals who perform those activities, contrasting them with the developers.

IEEE/EIA 12207 identifies the primary activities of software maintenance as: process implementation; problem and modification analysis; modification implementation; maintenance review/acceptance; migration; and retirement. These activities are discussed in topic 3.2 Maintenance Activities.

Maintainers can learn from the developer's knowledge of the software. Contact with the developers and early involvement by the maintainer helps reduce the maintenance effort. In some instances, the software engineer cannot be reached or has moved on to other tasks, which creates an additional challenge for the maintainers. Maintenance must take the products of the development, code, or documentation, for example, and support them immediately and evolve/maintain them progressively over the software life cycle.

[Figure 1. Breakdown of topics for the Software Maintenance KA: Software Maintenance Fundamentals (Definitions and Terminology; Nature of Maintenance; Need for Maintenance; Majority of Maintenance Costs; Evolution of Software; Categories of Maintenance), Key Issues in Software Maintenance (Technical Issues; Management Issues; Maintenance Cost Estimation; Software Maintenance Measurement), Maintenance Process (Maintenance Processes; Maintenance Activities), and Techniques for Maintenance (Program Comprehension; Re-engineering; Reverse Engineering).]

1.3. Need for Maintenance
[Pfl01:c11s11.2; Pig97:c2s2.3; Tak97:c1]
Maintenance is needed to ensure that the software continues to satisfy user requirements. Maintenance is applicable to software developed using any software life cycle model (for example, spiral). The system changes due to corrective and non-corrective software actions. Maintenance must be performed in order to:
- Correct faults
- Improve the design
- Implement enhancements
- Interface with other systems
- Adapt programs so that different hardware, software, system features, and telecommunications facilities can be used
- Migrate legacy software
- Retire software

The maintainer's activities comprise four key characteristics, according to Pfleeger [Pfl01]:
- Maintaining control over the software's day-to-day functions
- Maintaining control over software modification
- Perfecting existing functions
- Preventing software performance from degrading to unacceptable levels

1.4. Majority of Maintenance Costs
[Abr93:63-90; Pfl01:c11s11.3; Pig97:c3; Pre01:c30s2.1,c30s2.2]
Maintenance consumes a major share of software life cycle financial resources. A common perception of software maintenance is that it merely fixes faults. However, studies and surveys over the years have indicated that the majority, over 80%, of the software maintenance effort is used for non-corrective actions. [Abr93, Pig97, Pre01] Jones (Jon91) describes the way in which software maintenance managers often group enhancements and corrections together in their management reports. This inclusion of enhancement requests with problem reports contributes to some of the misconceptions regarding the high cost of corrections. Understanding the categories of software maintenance helps to understand the structure of software maintenance costs. Also, understanding the factors that influence the maintainability of a system can help to contain costs. Pfleeger [Pfl01] presents some of the technical and non-technical factors affecting software maintenance costs, as follows:
- Application type
- Software novelty
- Software maintenance staff availability
- Software life span
- Hardware characteristics
- Quality of software design, construction, documentation, and testing

1.5. Evolution of Software
[Art88:c1s1.0,s1.1,s1.2,c11s1.1,s1.2; Leh97:108-124], (Bel72)
Lehman first addressed software maintenance and evolution of systems in 1969. Over a period of twenty years, his research led to the formulation of eight "Laws of Evolution". [Leh97] Key findings include the fact that maintenance is evolutionary development, and that maintenance decisions are aided by understanding what happens to systems (and software) over time. Others state that maintenance is continued development, except that there is an extra input (or constraint): existing large software is never complete and continues to evolve. As it evolves, it grows more complex unless some action is taken to reduce this complexity.

Since software demonstrates regular behavior and trends, these can be measured. Attempts to develop predictive models to estimate maintenance effort have been made, and, as a result, useful management tools have been developed. [Art88], (Bel72)

1.6. Categories of Maintenance
[Art88:c1s1.2; Lie78; Dor02:v1c9s1.5; IEEE1219-98:s3.1.1,s3.1.2,s3.1.7,A.1.7; ISO14764-99:s4.1,s4.3,s4.10,s4.11,s6.2; Pig97:c2s2.3]
Lientz & Swanson initially defined three categories of maintenance: corrective, adaptive, and perfective. [Lie78; IEEE1219-98] This definition was later updated in the Standard for Software Engineering-Software Maintenance, ISO/IEC 14764, to include four categories, as follows:
- Corrective maintenance: Reactive modification of a software product performed after delivery to correct discovered problems
- Adaptive maintenance: Modification of a software product performed after delivery to keep a software product usable in a changed or changing environment
- Perfective maintenance: Modification of a software product after delivery to improve performance or maintainability
- Preventive maintenance: Modification of a software product after delivery to detect and correct latent faults in the software product before they become effective faults

ISO/IEC 14764 classifies adaptive and perfective maintenance as enhancements. It also groups together the corrective and preventive maintenance categories into a correction category, as shown in Table 1. Preventive maintenance, the newest category, is most often performed on software products where safety is critical.

Table 1: Software maintenance categories
             Correction    Enhancement
Proactive    Preventive    Perfective
Reactive     Corrective    Adaptive

2. Key Issues in Software Maintenance
A number of key issues must be dealt with to ensure the effective maintenance of software. It is important to
understand that software maintenance provides unique technical and management challenges for software engineers. Trying to find a fault in software containing 500K lines of code that the software engineer did not develop is a good example. Similarly, competing with software developers for resources is a constant battle. Planning for a future release, while coding the next release and sending out emergency patches for the current release, also creates a challenge. The following section presents some of the technical and management issues related to software maintenance. They have been grouped under the following topic headings:
- Technical issues
- Management issues
- Cost estimation, and
- Measures

2.1. Technical Issues

2.1.1. Limited understanding
[Dor02:v1c9s1.11.4; Pfl01:c11s11.3; Tak97:c3]
Limited understanding refers to how quickly a software engineer can understand where to make a change or a correction in software which this individual did not develop. Research indicates that some 40% to 60% of the maintenance effort is devoted to understanding the software to be modified. Thus, the topic of software comprehension is of great interest to software engineers. Comprehension is more difficult in text-oriented representation, in source code, for example, where it is often difficult to trace the evolution of software through its releases/versions if changes are not documented and when the developers are not available to explain it, which is often the case. Thus, software engineers may initially have a limited understanding of the software, and much has to be done to remedy this.

2.1.2. Testing
[Art88:c9; Pfl01:c11s11.3]
The cost of repeating full testing on a major piece of software can be significant in terms of time and money. Regression testing, the selective retesting of a software or component to verify that the modifications have not caused unintended effects, is important to maintenance. As well, finding time to test is often difficult. There is also the challenge of coordinating tests when different members of the maintenance team are working on different problems at the same time. [Pfl01] When software performs critical functions, it may be impossible to bring it offline to test. The Software Testing KA provides additional information and references on the matter in its sub-topic 2.2.6 Regression testing.

2.1.3. Impact analysis
[Art88:c3; Dor02:v1c9s1.10; Pfl01:c11s11.5]
Impact analysis describes how to conduct, cost effectively, a complete analysis of the impact of a change in existing software. Maintainers must possess an intimate knowledge of the software's structure and content [Pfl01]. They use that knowledge to perform impact analysis, which identifies all systems and software products affected by a software change request and develops an estimate of the resources needed to accomplish the change. [Art88] Additionally, the risk of making the change is determined. The change request, sometimes called a modification request (MR) and often called a problem report (PR), must first be analyzed and translated into software terms. [Dor02] It is performed after a change request enters the software configuration management process. Arthur [Art88] states that the objectives of impact analysis are:
- Determination of the scope of a change in order to plan and implement work
- Development of accurate estimates of resources needed to perform the work
- Analysis of the cost/benefits of the requested change
- Communication to others of the complexity of a given change

The severity of a problem is often used to decide how and when a problem will be fixed. The software engineer then identifies the affected components. Several potential solutions are provided and then a recommendation is made as to the best course of action.

Software designed with maintainability in mind greatly facilitates impact analysis. More information can be found in the Software Configuration Management KA.

2.1.4. Maintainability
[ISO14764-99:s6.8,s6.8.1; Pfl01:c9s9.4; Pig97:c16]
How does one promote and follow up on maintainability issues during development? The IEEE [IEEE610.12-90] defines maintainability as the ease with which software can be maintained, enhanced, adapted, or corrected to satisfy specified requirements. ISO/IEC defines maintainability as one of the quality characteristics (ISO9126-01). Maintainability sub-characteristics must be specified, reviewed, and controlled during the software development activities in order to reduce maintenance costs. If this is done successfully, the maintainability of the software will improve. This is often difficult to achieve because the maintainability sub-characteristics are not an important focus during the software development process. The developers are preoccupied with many other things and often disregard the maintainer's requirements. This in turn can, and often does, result in a lack of system documentation, which is a leading cause of difficulties in program comprehension and impact analysis. It has also been observed that the presence of systematic and mature
processes, techniques, and tools helps to enhance the maintainability of a system.

2.2. Management Issues

2.2.1. Alignment with organizational objectives
[Ben00:c6sa; Dor02:v1c9s1.6]
Organizational objectives describe how to demonstrate the return on investment of software maintenance activities. Bennett [Ben00] states that "initial software development is usually project-based, with a defined time scale and budget. The main emphasis is to deliver on time and within budget to meet user needs. In contrast, software maintenance often has the objective of extending the life of software for as long as possible. In addition, it may be driven by the need to meet user demand for software updates and enhancements. In both cases, the return on investment is much less clear, so that the view at senior management level is often of a major activity consuming significant resources with no clear quantifiable benefit for the organization."

2.2.2. Staffing
[Dek92:10-17; Dor02:v1c9s1.6; Par86:c4s8-c4s11] (Lie81)
Staffing refers to how to attract and keep software maintenance staff. Maintenance is often not viewed as glamorous work. Deklava provides a list of staffing-related problems based on survey data. [Dek92] As a result, software maintenance personnel are frequently viewed as "second-class citizens" (Lie81) and morale therefore suffers. [Dor02]

2.2.3. Process
[Pau93; Ben00:c6sb; Dor02:v1c9s1.3]
Software process is a set of activities, methods, practices, and transformations which people use to develop and maintain software and the associated products. [Pau93] At the process level, software maintenance activities share much in common with software development (for example, software configuration management is a crucial activity in both). [Ben00] Maintenance also requires several activities which are not found in software development (see section 3.2 on unique activities for details). These activities present challenges to management. [Dor02]

2.2.4. Organizational aspects of maintenance
[Pfl01:c12s12.1-c12s12.3; Par86:c4s7; Pig97:c2s2.5; Tak97:c8]
Organizational aspects describe how to identify which organization and/or function will be responsible for the maintenance of software. The team that develops the software is not necessarily assigned to maintain the software once it is operational.

In deciding where the software maintenance function will be located, software engineering organizations may, for example, stay with the original developer or go to a separate team (or maintainer). Often, the maintainer option is chosen to ensure that the software runs properly and evolves to satisfy changing user needs. Since there are many pros and cons to each of these options [Par86, Pig97], the decision should be made on a case-by-case basis. What is important is the delegation or assignment of the maintenance responsibility to a single group or person [Pig97], regardless of the organization's structure.

2.2.5. Outsourcing
[Dor02:v1c9s1.7; Pig97:c9s9.1,s9.2] (Car94; McC02)
Outsourcing of maintenance is becoming a major industry. Large corporations are outsourcing entire portfolios of software systems, including software maintenance. More often, the outsourcing option is selected for less mission-critical software, as companies are unwilling to lose control of the software used in their core business. Carey (Car94) reports that some will outsource only if they can find ways of maintaining strategic control. However, control measures are hard to find. One of the major challenges for the outsourcers is to determine the scope of the maintenance services required and the contractual details. McCracken (McC02) states that 50% of outsourcers provide services without any clear service-level agreement. Outsourcing companies typically spend a number of months assessing the software before they will enter into a contractual relationship. [Dor02] Another challenge identified is the transition of the software to the outsourcer. [Pig97]

2.3. Maintenance Cost Estimation
Software engineers must understand the different categories of software maintenance, discussed above, in order to address the question of estimating the cost of software maintenance. For planning purposes, estimating costs is an important aspect of software maintenance.

2.3.1. Cost estimation
[Art88:c3; Boe81:c30; Jon98:c27; Pfl01:c11s11.3; Pig97:c8]
It was mentioned in sub-topic 2.1.3, Impact Analysis, that impact analysis identifies all systems and software products affected by a software change request and develops an estimate of the resources needed to accomplish that change. [Art88]

Maintenance cost estimates are affected by many technical and non-technical factors. ISO/IEC 14764 states that "the two most popular approaches to estimating resources for software maintenance are the use of parametric models and the use of experience" [ISO14764-99:s7.4.1]. Most often, a combination of these is used.

2.3.2. Parametric models
[Ben00:s7; Boe81:c30; Jon98:c27; Pfl01:c11s11.3]
Some work has been undertaken in applying parametric cost modeling to software maintenance. [Boe81, Ben00] Of
significance is that data from past projects are needed in order to use the models. Jones [Jon98] discusses all aspects of estimating costs, including function points (IEEE14143.1-00), and provides a detailed chapter on maintenance estimation.

2.3.3. Experience
[ISO14764-00:s7,s7.2,s7.2.1,s7.2.4; Pig97:c8; Sta94]
Experience, in the form of expert judgment (using the Delphi technique, for example), analogies, and a work breakdown structure, are several approaches which should be used to augment data from parametric models. Clearly the best approach to maintenance estimation is to combine empirical data and experience. These data should be provided as a result of a measurement program.

2.4. Software Maintenance Measurement
[IEEE1061-98:A.2; Pig97:c14s14.6; Gra87; Tak97:c6s6.1-c6s6.3]
Grady and Caswell [Gra87] discuss establishing a corporate-wide software measurement program, in which software maintenance measurement forms and data collection are described. The Practical Software and Systems Measurement (PSM) project describes an issue-driven measurement process that is used by many organizations and is quite practical. [McG01]

There are software measures that are common to all endeavors, the following categories of which the Software Engineering Institute (SEI) has identified: size; effort; schedule; and quality. [Pig97] These measures constitute a good starting point for the maintainer. Discussion of process and product measurement is presented in the Software Engineering Process KA. The software measurement program is described in the Software Engineering Management KA.

2.4.1. Specific Measures
[Car90:s2-s3; IEEE1219-98:Table3; Sta94:p239-249]
Abran [Abr93] presents internal benchmarking techniques to compare different internal maintenance organizations. The maintainer must determine which measures are appropriate for the organization in question. [IEEE1219-98; ISO9126-01; Sta94] suggests measures which are more specific to software maintenance measurement programs. That list includes a number of measures for each of the four sub-characteristics of maintainability:
- Analyzability: Measures of the maintainer's effort or resources expended in trying to diagnose deficiencies or causes of failure, or in identifying parts to be modified
- Changeability: Measures of the maintainer's effort associated with implementing a specified modification
- Stability: Measures of the unexpected behavior of software, including that encountered during testing
- Testability: Measures of the maintainer's and users' effort in trying to test the modified software

Certain measures of the maintainability of software can be obtained using available commercial tools. (Lag96; Apr00)

3. Maintenance Process
The Maintenance Process subarea provides references and standards used to implement the software maintenance process. The Maintenance Activities topic differentiates maintenance from development and shows its relationship to other software engineering activities.

The need for software engineering process is well documented. CMMI models apply to software maintenance processes, and are similar to the developers' processes. [SEI01] Software Maintenance Capability Maturity models which address the unique processes of software maintenance are described in (Apr03, Nie02, Kaj01).

3.1. Maintenance Processes
[IEEE1219-98:s4; ISO14764-99:s8; IEEE12207.0-96:s5.5; Par86:c7s1; Pig97:c5; Tak97:c2]
Maintenance processes provide needed activities and detailed inputs/outputs to those activities, and are described in software maintenance standards IEEE 1219 and ISO/IEC 14764.

The maintenance process model described in the Standard for Software Maintenance (IEEE1219) starts with the software maintenance effort during the post-delivery stage and discusses items such as planning for maintenance. That process is depicted in Figure 2.
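Sub-topic 2.1.3 describes impact analysis as identifying everything affected by a change request, and sub-topics 2.3.1 and 2.3.2 note that parametric models turn such data into resource estimates. The following sketch ties the two ideas together; the dependency graph, module names, and calibration constant are invented for the example and are not taken from the Guide or from any published model.

# Impact analysis as reachability over a "depends on" graph, followed by a
# naive parametric effort estimate calibrated from past projects.
def impact_set(reverse_deps: dict, changed: str) -> set:
    """All modules that directly or transitively depend on the changed module."""
    affected, frontier = set(), [changed]
    while frontier:
        module = frontier.pop()
        for dependent in reverse_deps.get(module, ()):
            if dependent not in affected:
                affected.add(dependent)
                frontier.append(dependent)
    return affected

# reverse_deps[m] = modules that depend on m (hypothetical system).
reverse_deps = {
    "billing-core": {"invoice-ui", "reports"},
    "invoice-ui": {"customer-portal"},
    "reports": set(),
    "customer-portal": set(),
}

affected = impact_set(reverse_deps, "billing-core")
EFFORT_PER_MODULE_PD = 3.5  # person-days per affected module, calibrated from historical data (invented here)
print(affected)                              # {'invoice-ui', 'reports', 'customer-portal'}
print(len(affected) * EFFORT_PER_MODULE_PD)  # 10.5 person-days, in addition to the change itself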
[Figure 2. The IEEE1219-98 Maintenance Process Activities.]

ISO/IEC 14764 [ISO14764-99] is an elaboration of the IEEE/EIA 12207.0-96 maintenance process. The activities of the ISO/IEC maintenance process are similar to those of the IEEE, except that they are aggregated a little differently. The maintenance process activities developed by ISO/IEC are shown in Figure 3.

[Figure 3. ISO/IEC 14764-00 Software Maintenance Process: Process Implementation; Problem and Modification Analysis; Modification Implementation; Maintenance Review/Acceptance; Migration; Retirement.]

Each of the ISO/IEC 14764 primary software maintenance activities is further broken down into tasks, as follows:
- Process Implementation
- Problem and Modification Analysis
- Modification Implementation
- Maintenance Review/Acceptance
- Migration
- Software Retirement

Takang & Grubb [Tak97] provide a history of maintenance process models leading up to the development of the IEEE and ISO/IEC process models. Parikh [Par86] also gives a good overview of a generic maintenance process. Recently, agile methodologies have been emerging which promote light processes. This requirement emerges from the ever-increasing demand for fast turn-around of maintenance services. Some experiments with Extreme maintenance are presented in (Poo01).

3.2. Maintenance Activities
As already noted, many maintenance activities are similar to those of software development. Maintainers perform analysis, design, coding, testing, and documentation. They must track requirements in their activities just as is done in development, and update documentation as baselines change. ISO/IEC 14764 recommends that, when a maintainer refers to a similar development process, he must adapt it to meet his specific needs [ISO14764-99:s8.3.2.1, 2]. However, for software maintenance, some activities involve processes unique to software maintenance.

3.2.1. Unique activities
[Art88:c3; Dor02:v1c9s1.9.1; IEEE1219-98:s4.1,s4.2; ISO14764-99:s8.2.2.1,s8.3.2.1; Pfl01:c11s11.2]
There are a number of processes, activities, and practices that are unique to software maintenance, for example:
- Transition: a controlled and coordinated sequence of activities during which software is transferred progressively from the developer to the maintainer [Dek92, Pig97]
- Modification Request Acceptance/Rejection: modification request work over a certain size/effort/complexity may be rejected by maintainers and rerouted to a developer [Dor02], (Apr01)
- Modification Request and Problem Report Help Desk: an end-user support function that triggers the assessment, prioritization, and costing of modification requests [Ben00]
- Impact Analysis (see section 2.1.3 for details)
- Software Support: help and advice to users concerning a request for information (for example, business rules, validation, data meaning, and ad-hoc requests/reports) (Apr03)
- Service Level Agreements (SLAs) and specialized (domain-specific) maintenance contracts which are the responsibility of the maintainers (Apr01)

3.2.2. Supporting activities
[IEEE1219-98:A.7,A.11; IEEE12207.0-96:c6,c7; ITI01; Pig97:c10s10.2,c18]; (Kaj01)
Maintainers may also perform supporting activities, such as software maintenance planning, software configuration management, verification and validation, software quality assurance, reviews, audits, and user training.

Another supporting activity, maintainer training, is also needed. [Pig97; IEEE12207.0-96] (Kaj01)

3.2.3. Maintenance planning activity
[IEEE1219-98:A.3; ISO14764-99:s7; ITI01; Pig97:c7,c8]
An important activity for software maintenance is planning, and maintainers must address the issues associated with a number of planning perspectives:
- Business planning (organizational level)
- Maintenance planning (transition level)
- Release/version planning (software level)
- Individual software change request planning (request level)

At the individual request level, planning is carried out during the impact analysis (refer to sub-topic 2.1.3 Impact Analysis for details). The release/version planning activity requires that the maintainer [ITI01]:
- Collect the dates of availability of individual requests
- Agree with users on the content of subsequent releases/versions
- Identify potential conflicts and develop alternatives
- Assess the risk of a given release and develop a back-out plan in case problems should arise
- Inform all the stakeholders

Whereas software development projects can typically last from some months to a few years, the maintenance phase usually lasts for many years. Making estimates of resources is a key element of maintenance planning. Those resources should be included in the developers' project planning budgets. Software maintenance planning should begin with the decision to develop a new system and should consider quality objectives (IEEE1061-98). A concept document should be developed, followed by a maintenance plan.

The concept document for maintenance [ISO14764-99:s7.2] should address:
- The scope of the software maintenance
- Adaptation of the software maintenance process
- Identification of the software maintenance organization
- An estimate of software maintenance costs

The next step is to develop a corresponding software maintenance plan. This plan should be prepared during software development, and should specify how users will request software modifications or report problems. Software maintenance planning [Pig97] is addressed in IEEE 1219 [IEEE1219-98] and ISO/IEC 14764. [ISO14764-99] ISO/IEC 14764 provides guidelines for a maintenance plan.

Finally, at the highest level, the maintenance organization will have to conduct business planning activities (budgetary, financial, and human resources) just like all the other divisions of the organization. The management knowledge required to do so can be found in the Related Disciplines of Software Engineering chapter.

3.2.4. Software configuration management
[Art88:c2,c10; IEEE1219-98:A.11; IEEE12207.0-96:s6.2; Pfl01:c11s11.5; Tak97:c7]
The IEEE Standard for Software Maintenance, IEEE 1219 [IEEE1219-98], describes software configuration management as a critical element of the maintenance process. Software configuration management procedures should provide for the verification, validation, and audit of each step required to identify, authorize, implement, and release the software product.

It is not sufficient to simply track Modification Requests or Problem Reports. The software product and any changes made to it must be controlled. This control is established by implementing and enforcing an approved software configuration management (SCM) process. The Software Configuration Management KA provides details of SCM and discusses the process by which software change requests are submitted, evaluated, and approved. SCM for software maintenance is different from SCM for software
development in the number of small changes that must be controlled on operational software. The SCM process is implemented by developing and following a configuration management plan and operating procedures. Maintainers participate in Configuration Control Boards to determine the content of the next release/version.

3.2.5. Software quality
[Art98:c7s4; IEEE12207.0-96:s6.3; IEEE1219-98:A.7; ISO14764-99:s5.5.3.2]
It is not sufficient, either, to simply hope that increased quality will result from the maintenance of software. It must be planned and processes implemented to support the maintenance process. The activities and techniques for Software Quality Assurance (SQA), V&V, reviews, and audits must be selected in concert with all the other processes to achieve the desired level of quality. It is also recommended that the maintainer adapt the software development processes, techniques, and deliverables, for instance testing documentation, and test results. [ISO14764-99]

More details can be found in the Software Quality KA.

4. Techniques for Maintenance
This subarea introduces some of the generally accepted techniques used in software maintenance.

4.1. Program Comprehension
[Arn92:c14; Dor02:v1c9s1.11.4; Tak97:c3]
Programmers spend considerable time in reading and understanding programs in order to implement changes. Code browsers are key tools for program comprehension. Clear and concise documentation can aid in program comprehension.

4.2. Reengineering
[Arn92:c1,c3-c6; Dor02:v1c9s1.11.4; IEEE1219-98:B.2], (Fow99)
Reengineering is defined as the examination and alteration of software to reconstitute it in a new form, and includes the subsequent implementation of the new form. Dorfman and Thayer [Dor02] state that reengineering is the most radical (and expensive) form of alteration. Others believe that reengineering can be used for minor changes. It is often not undertaken to improve maintainability, but to replace aging legacy software. Arnold [Arn92] provides a comprehensive compendium of topics, for example: concepts, tools and techniques, case studies, and risks and benefits associated with reengineering.

4.3. Reverse engineering
[Arn92:c12; Dor02:v1c9s1.11.3; IEEE1219-98:B.3; Tak97:c4, Hen01]
Reverse engineering is the process of analyzing software to identify the software's components and their inter-relationships and to create representations of the software in another form or at higher levels of abstraction. Reverse engineering is passive; it does not change the software, or result in new software. Reverse engineering efforts produce call graphs and control flow graphs from source code. One type of reverse engineering is redocumentation. Another type is design recovery [Dor02]. Refactoring is program transformation which reorganizes a program without changing its behavior, and is a form of reverse engineering that seeks to improve program structure. (Fow99)

Finally, data reverse engineering has gained in importance over the last few years, where logical schemas are recovered from physical databases. (Hen01)
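Topic 4.3 notes that reverse engineering efforts produce call graphs from source code. A minimal sketch of that step for Python source, using the standard ast module, follows; the sample functions and the approximations made (only calls to plain names are recorded) are illustrative assumptions, not part of the Guide.

import ast
from collections import defaultdict

def call_graph(source: str) -> dict:
    """Map each function name to the names it appears to call (a static approximation)."""
    tree = ast.parse(source)
    graph = defaultdict(set)
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            for inner in ast.walk(node):
                if isinstance(inner, ast.Call) and isinstance(inner.func, ast.Name):
                    graph[node.name].add(inner.func.id)
    return dict(graph)

if __name__ == "__main__":
    sample = (
        "def parse(data):\n"
        "    return clean(data)\n"
        "def clean(data):\n"
        "    return data.strip()\n"
        "def main():\n"
        "    print(parse(' x '))\n"
    )
    print(call_graph(sample))  # e.g. {'parse': {'clean'}, 'main': {'print', 'parse'}}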
MATRIX OF TOPICS VS. REFERENCE MATERIAL

[Matrix (original pages 6-10 and 6-11): each topic in the Software Maintenance breakdown (1. Software Maintenance Fundamentals; 2. Key Issues in Software Maintenance; 3. Maintenance Process; 4. Techniques for Maintenance; and their subtopics) is mapped to the relevant chapters and sections of [Abr93], [Arn92], [Art88], [Ben00], [Boe81], [Car90], [Dek92], [Dor97], [Hen01], [IEEE610.12-90], [IEEE1061-98], [IEEE1219-98], [IEEE12207.0-96], [ISO14764-00], [Jon98], [Leh97], [Par86], [Pfl01], [Pig97], [Pre04], [Sta94], and [Tak97].]
RECOMMENDED REFERENCES FOR SOFTWARE MAINTENANCE

[Abr93] A. Abran and H. Nguyenkim, “Measurement of the Maintenance Process from a Demand-Based Perspective,” Journal of Software Maintenance: Research and Practice, vol. 5, iss. 2, 1993, pp. 63-90.
[Arn93] R.S. Arnold, Software Reengineering, IEEE Computer Society Press, 1993.
[Art88] L.J. Arthur, Software Evolution: The Software Maintenance Challenge, John Wiley & Sons, 1988.
[Ben00] K.H. Bennett, “Software Maintenance: A Tutorial,” in Software Engineering, M. Dorfman and R. Thayer, eds., IEEE Computer Society Press, 2000.
[Boe81] B.W. Boehm, Software Engineering Economics, Prentice Hall, 1981.
[Car90] D.N. Card and R.L. Glass, Measuring Software Design Quality, Prentice Hall, 1990.
[Dek92] S. Dekleva, “Delphi Study of Software Maintenance Problems,” presented at the International Conference on Software Maintenance, 1992.
[Dor02] M. Dorfman and R.H. Thayer, eds., Software Engineering (Vol. 1 & Vol. 2), IEEE Computer Society Press, 2002.
[Gra87] R.B. Grady and D.L. Caswell, Software Metrics: Establishing a Company-Wide Program, Prentice Hall, 1987.
[Hen01] J. Henrard and J.-L. Hainaut, “Data Dependency Elicitation in Database Reverse Engineering,” Proc. of the 5th European Conference on Software Maintenance and Reengineering (CSMR 2001), IEEE Computer Society Press, 2001.
[IEEE610.12-90] IEEE Std 610.12-1990 (R2002), IEEE Standard Glossary of Software Engineering Terminology, IEEE, 1990.
[IEEE1061-98] IEEE Std 1061-1998, IEEE Standard for a Software Quality Metrics Methodology, IEEE, 1998.
[IEEE1219-98] IEEE Std 1219-1998, IEEE Standard for Software Maintenance, IEEE, 1998.
[IEEE12207.0-96] IEEE/EIA 12207.0-1996//ISO/IEC 12207:1995, Industry Implementation of Int. Std. ISO/IEC 12207:95, Standard for Information Technology - Software Life Cycle Processes, IEEE, 1996.
[ISO9126-01] ISO/IEC 9126-1:2001, Software Engineering - Product Quality - Part 1: Quality Model, ISO and IEC, 2001.
[ISO14764-99] ISO/IEC 14764-1999, Software Engineering - Software Maintenance, ISO and IEC, 1999.
[ITI01] IT Infrastructure Library, “Service Delivery and Service Support,” The Stationery Office, Office of Government Commerce, 2001.
[Jon98] T.C. Jones, Estimating Software Costs, McGraw-Hill, 1998.
[Leh97] M.M. Lehman, “Laws of Software Evolution Revisited,” presented at EWSPT96, 1997.
[Lie78] B.P. Lientz, E.B. Swanson, and G.E. Tompkins, “Characteristics of Applications Software Maintenance,” Communications of the ACM, vol. 21, 1978.
[Par86] G. Parikh, Handbook of Software Maintenance, John Wiley & Sons, 1986.
[Pfl01] S.L. Pfleeger, Software Engineering: Theory and Practice, second ed., Prentice Hall, 2001.
[Pig97] T.M. Pigoski, Practical Software Maintenance: Best Practices for Managing Your Software Investment, first ed., John Wiley & Sons, 1997.
[Pre04] R.S. Pressman, Software Engineering: A Practitioner’s Approach, sixth ed., McGraw-Hill, 2004.
[SEI01] Software Engineering Institute, “Capability Maturity Model Integration, v1.1,” CMU/SEI-2002-TR-002, ESC-TR-2002-002, December 2001.
[Sta94] G.E. Stark, L.C. Kern, and C.V. Vowell, “A Software Metric Set for Program Maintenance Management,” Journal of Systems and Software, vol. 24, iss. 3, March 1994.
[Tak97] A. Takang and P. Grubb, Software Maintenance Concepts and Practice, International Thomson Computer Press, 1997.
  • 101. APPENDIX A. LIST OF FURTHER READINGS Press, 2002. (Fow99) M. Fowler et al., Refactoring: Improving the(Abr93) A. Abran, “Maintenance Productivity & Quality Design of Existing Code, Addison-Wesley, 1999.Studies: Industry Feedback on Benchmarking,” presentedat the Software Maintenance Conference (ICSM93), 1993. (Gra87) R.B. Grady and D.L. Caswell, Software Metrics: Establishing a Company-Wide Program, Prentice Hall,(Apr00) A. April and D. Al-Shurougi, “Software Product 1987.Measurement for Supplier Evaluation,” presented atFESMA2000, 2000. (Gra92) R.B. Grady, Practical Software Metrics for Project Management and Process Management, Prentice Hall,(Apr01) A. April, J. Bouman, A. Abran, and D. Al- 1992.Shurougi, “Software Maintenance in a Service LevelAgreement: Controlling the Customer’s Expectations,” (Jon91) C. Jones, Applied Software Measurement,presented at European Software Measurement Conference, McGraw-Hill, 1991.2001. (Kaj01) M. Kajko-Mattson, “Motivating the Corrective(Apr03) A. April, A. Abran, and R. Dumke, “Software Maintenance Maturity Model (Cm3),” presented at SeventhMaintenance Capability Maturity Model (SM-CMM): International Conference on Engineering of ComplexProcess Performance Measurement,” presented at 13th Systems, 2001.International Workshop on Software Measurement (IWSM (Kaj01a) M. Kajko-Mattson, S. Forssander, and U. Olsson,2003), 2003. “Corrective Maintenance Maturity Model: Maintainer’s(Bas85) V.R. Basili, “Quantitative Evaluation of Software Education and Training,” presented at InternationalMethodology,” presented at First Pan-Pacific Computer Conference on Software Engineering, 2001.Conference, 1985. (Kho95) T.M. Khoshgoftaar, R.M. Szabo, and J.M. Voas,(Bel72) L. Belady and M.M. Lehman, “An Introduction to “Detecting Program Module with Low Testability,”Growth Dynamics,” Statistical Computer Performance presented at the International Conference on SoftwareEvaluation, W. Freiberger, ed., Academic Press, 1972. Maintenance-1995, 1995.(Ben00) K.H. Bennett and V.T. Rajlich, “Software (Lag96) B. Laguë and A. April, “Mapping for the ISO9126Maintenance and Evolution: A Roadmap,” The Future of Maintainability Internal Metrics to an Industrial ResearchSoftware Engineering, A. Finklestein, ed., ACM Press, Tool,” presented at SESS, 1996.2000. (Leh85) M.M. Lehman and L.A. Belady, Program(Bol95) C. Boldyreff, E. Burd, R. Hather, R. Mortimer, M. Evolution: Processes of Software Change, Academic Press,Munro, and E. Younger, “The AMES Approach to 1985.Application Understanding: A Case Study,” presented at (Leh97) M.M. Lehman, “Laws of Software Evolutionthe International Conference on Software Maintenance, Revisited,” presented at EWSPT96, 1997.1995. (Lie81) B.P. Lientz and E.B. Swanson, “Problems in(Boo94) G. Booch and D. Bryan, Software Engineering Application Software Mainteanance,” Communications ofwith Ada, third ed., Benjamin/Cummings, 1994. the ACM, vol. 24, iss. 11, 1981, pp. 763-769.(Cap94) M.A. Capretz and M. Munro, “Software (McC02) B. McCracken, “Taking Control of ITConfiguration Management Issues in the Maintenance of Performance,” presented at InfoServer LLC, 2002.Existing Systems,” Journal of Software Maintenance: (Nie02) F. Niessink, V. Clerk, and H. v. Vliet, “The ITResearch and Practice, vol. 6, iss. 2, 1994. Capability Maturity Model,” release L2+3-0.3 draft, 2002,(Car92) J. 
Cardow, “You Can’t Teach Software available at http://www.itservicecmm.org/doc/itscmm-123-Maintenance!” presented at the Sixth Annual Meeting and 0.3.pdf.Conference of the Software Management Association, (Oma91) P.W. Oman, J. Hagemeister, and D. Ash, “A1992. Definition and Taxonomy for Software Maintainability,”(Car94) D. Carey, “Executive Round-Table on Business University of Idaho, Software Engineering Test LabIssues in Outsourcing - Making the Decision,” CIO Technical Report 91-08 TR, November 1991.Canada, June/July 1994. (Oma92) P. Oman and J. Hagemeister, “Metrics for(Dor97) M. Dorfman and R.H. Thayer, eds., “Software Assessing Software System Maintainability,” presented atEngineering,” IEEE Computer Society Press, 1997. the International Conference on Software Maintenance ’92,(Dor02) M. Dorfman and R.H. Thayer, eds., Software 1992. (Pig93) T.M. Pigoski, “Maintainable Software: Why YouEngineering (Vol. 1 & Vol. 2), IEEE Computer Society Want It and How to Get It,” presented at the Third© IEEE – 2004 Version 6-13
  • 102. Software Engineering Research Forum - November 1993, (Sch99) S.R. Schach, Classical and Object-OrientedUniversity of West Florida Press, 1993. Software Engineering with UML and C++, McGraw-Hill,(Pig94) T.M. Pigoski, “Software Maintenance,” 1999.Encyclopedia of Software Engineering, John Wiley & (Sch97) S.L. Schneberger, Client/Server SoftwareSons, 1994. Maintenance, McGraw-Hill, 1997.(Pol03) M. Polo, M. Piattini, and F. Ruiz, eds., “Advances (Sch87) N.F. Schneidewind, “The State of Softwarein Software Maintenance Management: Technologies and Maintenance,” Proceedings of the IEEE, 1987.Solutions,” Idea Group Publishing, 2003. (Som96) I. Sommerville, Software Engineering, fifth ed.,(Poo00) C. Poole and W. Huisman, “Using Extreme Addison-Wesley, 1996.Programming in a Maintenance Environment,” IEEE (Val94) J.D. Vallett, S.E. Condon, L. Briand, Y.M. Kim,Software, vol. 18, iss. 6, November/December 2001, pp. and V.R. Basili, “Building on Experience Factory for42-50. Maintenance,” presented at the Software Engineering(Put97) L.H. Putman and W. Myers, “Industrial Strength Workshop, Software Engineering Laboratory, 1994.Software - Effective Management Using Measurement,”1997. 6-14 © IEEE – 2004 Version
APPENDIX B. LIST OF STANDARDS

(IEEE610.12-90) IEEE Std 610.12-1990 (R2002), IEEE Standard Glossary of Software Engineering Terminology, IEEE, 1990.
(IEEE1061-98) IEEE Std 1061-1998, IEEE Standard for a Software Quality Metrics Methodology, IEEE, 1998.
(IEEE1219-98) IEEE Std 1219-1998, IEEE Standard for Software Maintenance, IEEE, 1998.
(IEEE12207.0-96) IEEE/EIA 12207.0-1996//ISO/IEC 12207:1995, Industry Implementation of Int. Std. ISO/IEC 12207:95, Standard for Information Technology - Software Life Cycle Processes, IEEE, 1996.
(IEEE14143.1-00) IEEE Std 14143.1-2000//ISO/IEC 14143-1:1998, Information Technology - Software Measurement - Functional Size Measurement - Part 1: Definitions of Concepts, IEEE, 2000.
(ISO9126-01) ISO/IEC 9126-1:2001, Software Engineering - Product Quality - Part 1: Quality Model, ISO and IEC, 2001.
(ISO14764-99) ISO/IEC 14764-1999, Software Engineering - Software Maintenance, ISO and IEC, 1999.
(ISO15271-98) ISO/IEC TR 15271:1998, Information Technology - Guide for ISO/IEC 12207 (Software Life Cycle Process), ISO and IEC, 1998.
CHAPTER 7

SOFTWARE CONFIGURATION MANAGEMENT

ACRONYMS
CCB Configuration Control Board
CM Configuration Management
FCA Functional Configuration Audit
MTBF Mean Time Between Failures
PCA Physical Configuration Audit
SCCB Software Configuration Control Board
SCI Software Configuration Item
SCM Software Configuration Management
SCMP Software Configuration Management Plan
SCR Software Change Request
SCSA Software Configuration Status Accounting
SEI/CMMI Software Engineering Institute’s Capability Maturity Model Integration
SQA Software Quality Assurance
SRS Software Requirement Specification
USNRC U.S. Nuclear Regulatory Commission

INTRODUCTION

A system can be defined as a collection of components organized to accomplish a specific function or set of functions (IEEE 610.12-90). The configuration of a system is the functional and/or physical characteristics of hardware, firmware, or software, or a combination of these, as set forth in technical documentation and achieved in a product. (Buc96) It can also be thought of as a collection of specific versions of hardware, firmware, or software items combined according to specific build procedures to serve a particular purpose. Configuration management (CM), then, is the discipline of identifying the configuration of a system at distinct points in time for the purpose of systematically controlling changes to the configuration, and maintaining the integrity and traceability of the configuration throughout the system life cycle.
(Ber97) It is formally defined (IEEE610.12-90) as “A discipline applying technical and administrative direction and surveillance to: identify and document the functional and physical characteristics of a configuration item, control changes to those characteristics, record and report change processing and implementation status, and verify compliance with specified requirements.”

Software configuration management (SCM) is a supporting software life cycle process (IEEE12207.0-96) which benefits project management, development and maintenance activities, assurance activities, and the customers and users of the end product.

The concepts of configuration management apply to all items to be controlled, although there are some differences in implementation between hardware CM and software CM.

SCM is closely related to the software quality assurance (SQA) activity. As defined in the Software Quality KA, SQA processes provide assurance that the software products and processes in the project life cycle conform to their specified requirements by planning, enacting, and performing a set of activities to provide adequate confidence that quality is being built into the software. SCM activities help in accomplishing these SQA goals. In some project contexts (see, for example, IEEE730-02), specific SQA requirements prescribe certain SCM activities.

The SCM activities are: management and planning of the SCM process, software configuration identification, software configuration control, software configuration status accounting, software configuration auditing, and software release management and delivery. Figure 1 shows a stylized representation of these activities.

Figure 1. SCM Activities

The Software Configuration Management KA is related to all the other KAs, since the object of configuration management is the artifact produced and used throughout the software engineering process.
  • 106. program. Managing nonconforming items is usually theBREAKDOWN OF TOPICS FOR SCM responsibility of the quality assurance activity; however, SCM might assist with tracking and reporting on software configuration items falling into this category.1. Management of the SCM Process Perhaps the closest relationship is with the softwareSCM controls the evolution and integrity of a product by development and maintenance organizations.identifying its elements, managing and controlling change,and verifying, recording, and reporting on configuration It is within this context that many of the softwareinformation. From the software engineer’s perspective, configuration control tasks are conducted. Frequently, theSCM facilitates development and change implementation same tools support development, maintenance, and SCMactivities. A successful SCM implementation requires purposes.careful planning and management. This, in turn, requires an 1.2. Constraints and Guidance for the SCM Processunderstanding of the organizational context for, and the [Ber92:c5; IEEE828-98:c4s1,c4s2.3; Moo98]constraints placed on, the design and implementation of the Constraints affecting, and guidance for, the SCM processSCM process. come from a number of sources. Policies and procedures1.1. Organizational Context for SCM set forth at corporate or other organizational levels might [Ber92 :c4; Dar90:c2; IEEE828-98:c4s2.1] influence or prescribe the design and implementation of theTo plan an SCM process for a project, it is necessary to SCM process for a given project. In addition, the contractunderstand the organizational context and the relationships between the acquirer and the supplier might containamong the organizational elements. SCM interacts with provisions affecting the SCM process. For example, certainseveral other activities or organizational elements. configuration audits might be required, or it might be specified that certain items be placed under CM. WhenThe organizational elements responsible for the software software products to be developed have the potential toengineering supporting processes may be structured in affect public safety, external regulatory bodies may imposevarious ways. Although the responsibility for performing constraints (see, for example, USNRC1.169-97). Finally,certain SCM tasks might be assigned to other parts of the the particular software life cycle process chosen for aorganization such as the development organization, the software project and the tools selected to implement theoverall responsibility for SCM often rests with a distinct software affect the design and implementation of the SCMorganizational element or designated individual. process. [Ber92]Software is frequently developed as part of a larger system Guidance for designing and implementing an SCM processcontaining hardware and firmware elements. In this case, can also be obtained from “best practice,” as reflected inSCM activities take place in parallel with hardware and the standards on software engineering issued by the variousfirmware CM activities, and must be consistent with standards organizations. Moore [Moo98] provides asystem-level CM. Buckley [Buc96:c2] describes SCM roadmap to these organizations and their standards. Bestwithin this context. Note that firmware contains hardware practice is also reflected in process improvement andand software, therefore both hardware and software CM process assessment models such as the Softwareconcepts are applicable. 
Engineering Institute’s Capability Maturity ModelSCM might interface with an organization’s quality Integration (SEI/CMMI) (SEI01) and ISO/IEC15504assurance activity on issues such as records management Software Engineering–Process Assessment (ISO/IECand non-conforming items. Regarding the former, some 15504-98).items under SCM control might also be project recordssubject to provisions of the organization’s quality assurance 7–2 © IEEE – 2004 Version
  • 107. Software Configuration Management Software Software Management Software Software Software Configuration Release of the SCM Configuration Configuration Configuration Status Management Process Identification Control Auditing Accounting and Delivery Organizational Identifying Requesting, Software Software Software Context for SCM Items to be Evaluating and Configuration Functional Building Controlled Approving Status Configuration Constraints and Software Software Information Audit Software Guidance for Configuration Changes Release Software Software SCM Process Software Software Management Configuration Configuration Physical Planning for Configuration Status Items Control Board Configuration SCM Reporting Software Audit SCM Configuration Software Change Organization and In-Process Item Request Process Responsibilities Audits of a Relationships Implementing Software SCM Resources Software and Schedules Versions Software Baseline Changes Tool Selection Baseline and Acquiring Deviations and Implementation Waivers Software Vendor/ Configuration Subcontractor Items Control Software Interface Control Library Software Configuration Management Plan Surveillance of SCM SCM Measures and Measurement In-Process Audits of SCM Figure 2 Breakdown of topics for the Software Configuration Management KA1.3. Planning for SCM typically considered. The results of the planning activity are [Dar90:c2; IEEE12207.0-96 :c6.s2.1; recorded in an SCM Plan (SCMP), which is typically Som01:c29] subject to SQA review and audit.The planning of an SCM process for a given project should 1.3.1. SCM organization and responsibilitiesbe consistent with the organizational context, applicable [Ber92:c7; Buc96:c3; IEEE828-98:c4s2]constraints, commonly accepted guidance, and the nature of To prevent confusion about who will perform given SCMthe project (for example, size and criticality). The major activities or tasks, organizations to be involved in the SCMactivities covered are: Software Configuration process need to be clearly identified. SpecificIdentification, Software Configuration Control, Software responsibilities for given SCM activities or tasks also needConfiguration Status Accounting, Software Configuration to be assigned to organizational entities, either by title or byAuditing, and Software Release Management and Delivery. organizational element. The overall authority and reportingIn addition, issues such as organization and responsibilities, channels for SCM should also be identified, although thisresources and schedules, tool selection and implementation, might be accomplished at the project management orvendor and subcontractor control, and interface control are quality assurance planning stage.© IEEE – 2004 Version 7–3
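To make the responsibility assignments described in 1.3.1 concrete, SCM planning results can be recorded in a simple table and checked before the SCMP is approved. The Python sketch below is only an illustration; the activity and role names are invented for the example and are not taken from the Guide or from any standard.

```python
# Hypothetical responsibility assignments recorded during SCM planning.
SCM_ACTIVITIES = [
    "configuration identification",
    "configuration control",
    "status accounting",
    "configuration auditing",
    "release management and delivery",
]

RESPONSIBLE = {
    "configuration identification": "SCM group",
    "configuration control": "SCCB",
    "status accounting": "SCM group",
    "configuration auditing": "SQA",
    # "release management and delivery" deliberately left unassigned for the demo
}

def unassigned_activities(activities, responsible):
    """Return planned SCM activities that have no responsible organizational element yet."""
    return [activity for activity in activities if activity not in responsible]

if __name__ == "__main__":
    missing = unassigned_activities(SCM_ACTIVITIES, RESPONSIBLE)
    if missing:
        print("SCMP review finding: no responsibility assigned for:", missing)
    else:
        print("Every SCM activity has an assigned responsibility.")
```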
  • 108. 1.3.2. SCM resources and schedules Code Mgmt Baselines, CCBs DBMS, Code Mgmt Systems [Ber92:c7; Buc96:c3; IEEE828-98:c4s4; c4s5] Systems Libraries, SCRsPlanning for SCM identifies the staff and tools involved in Planning Control Status Release Auditingcarrying out SCM activities and tasks. It addresses Accounting Management Management and Deliveryscheduling questions by establishing necessary sequencesof SCM tasks and identifying their relationships to the SCMP Development Teamproject schedules and milestones established at the projectmanagement planning stage. Any training requirements Configuration Identificationnecessary for implementing the plans and training new staffmembers are also specified. Change Change Release Audit1.3.3. Tool selection and implementation Implementation Evaluation & Approval Authorization & Preparation Procedures [Ber92:c15; Con98:c6; Pre01:c31]Different types of tool capabilities, and procedures for their Figure 3 Characterization of SCM Tools and relateduse, support SCM activities. Depending on the situation, proceduresthese tool capabilities can be made available with somecombination of manual tools, automated tools providing a In this example, code management systems support thesingle SCM capability, automated tools integrating a rangeof SCM (and perhaps other) capabilities, or integrated tool operation of software libraries by controlling access to library elements, coordinating the activities of multipleenvironments which serve the needs of multiple users, and helping to enforce operating procedures. Otherparticipants in the software engineering process (forexample, SCM, development, V&V). Automated tool tools support the process of building software and releasesupport becomes increasingly important, and increasingly documentation from the software elements contained in the libraries. Tools for managing software change requestsdifficult to establish, as projects grow in size and as projectenvironments become more complex. These tool support the change control procedures applied to controlledcapabilities provide support for: software items. Other tools can provide database management and reporting capabilities for management, the SCM Library development, and quality assurance activities. As the software change request (SCR) and approval mentioned above, the capabilities of several tool types procedures might be integrated into SCM systems, which in turn are closely coupled to various other software activities. code (and related work products) and change management tasks In planning, the software engineer picks SCM tools fit for the job. Planning considers issues that might arise in the reporting software configuration status and collecting implementation of these tools, particularly if some form of SCM measurements culture change is necessary. An overview of SCM systems software configuration auditing and selection considerations is given in [Dar90:c3,AppA], managing and tracking software documentation and a case study on selecting an SCM system is given in [Mid97]. Complementary information on SCM tools can be performing software builds found in the Software Engineering Tools and Methods KA. managing and tracking software releases and their delivery 1.3.4. Vendor/Subcontractor Control [Ber92:c13; Buc96:c11; IEEE828-98:c4s3.6]The tools used in these areas can also providemeasurements for process improvement. 
Royce [Roy98] A software project might acquire or make use of purchaseddescribes seven core measures of value in managing software products, such as compilers or other tools. SCMsoftware engineering processes. Information available from planning considers if and how these items will be takenthe various SCM tools relates to Royce’s Work and under configuration control (for example, integrated intoProgress management indicator and to his quality indicators the project libraries) and how changes or updates will beof Change Traffic and Stability, Breakage and Modularity, evaluated and managed.Rework and Adaptability, and MTBF (mean time between Similar considerations apply to subcontracted software. Infailures) and Maturity. Reporting on these indicators can be this case, the SCM requirements to be imposed on theorganized in various ways, such as by software subcontractor’s SCM process as part of the subcontract andconfiguration item or by type of change requested. the means for monitoring compliance also need to beFigure 3 shows a representative mapping of tool established. The latter includes consideration of what SCMcapabilities and procedures to SCM Activities. information must be available for effective compliance monitoring. 7–4 © IEEE – 2004 Version
  • 109. 1.3.5. Interface control processes and procedures. This could involve an SCM [IEEE828-98:c4s3.5] authority ensuring that those with the assigned responsibility perform the defined SCM tasks correctly.When a software item will interface with another software The software quality assurance authority, as part of aor hardware item, a change to either item can affect the compliance auditing activity, might also perform thisother. The planning for the SCM process considers how the surveillance.interfacing items will be identified and how changes to theitems will be managed and communicated. The SCM role The use of integrated SCM tools with process controlmay be part of a larger, system-level process for interface capability can make the surveillance task easier. Some toolsspecification and control, and may involve interface facilitate process compliance while providing flexibility forspecifications, interface control plans, and interface control the software engineer to adapt procedures. Other toolsdocuments. In this case, SCM planning for interface control enforce process, leaving the software engineer with lesstakes place within the context of the system-level process. flexibility. Surveillance requirements and the level ofA discussion of the performance of interface control flexibility to be provided to the software engineer areactivities is given in [Ber92:c12]. important considerations in tool selection.1.4. SCM Plan [Ber92:c7; Buc96:c3; Pau93:L2-81] 1.5.1. SCM measures and measurementThe results of SCM planning for a given project are [Buc96:c3; Roy98]recorded in a Software Configuration Management Plan SCM measures can be designed to provide specific(SCMP), a “living document” which serves as a reference information on the evolving product or to provide insightfor the SCM process. It is maintained (that is, updated and into the functioning of the SCM process. A related goal ofapproved) as necessary during the software life cycle. In monitoring the SCM process is to discover opportunitiesimplementing the SCMP, it is typically necessary to for process improvement. Measurements of SCM processesdevelop a number of more detailed, subordinate procedures provide a good means for monitoring the effectiveness ofdefining how specific requirements will be carried out SCM activities on an ongoing basis. These measurementsduring day-to-day activities. are useful in characterizing the current state of the process,Guidance on the creation and maintenance of an SCMP, as well as in providing a basis for making comparisons overbased on the information produced by the planning activity, time. Analysis of the measurements may produce insightsis available from a number of sources, such as [IEEE828- leading to process changes and corresponding updates to98:c4]. This reference provides requirements for the the SCMP.information to be contained in an SCMP. It also defines and Software libraries and the various SCM tool capabilitiesdescribes six categories of SCM information to be included provide sources for extracting information about thein an SCMP: characteristics of the SCM process (as well as providing Introduction (purpose, scope, terms used) project and management information). 
For example, information about the time required to accomplish various SCM Management (organization, responsibilities, types of changes would be useful in an evaluation of the authorities, applicable policies, directives, and criteria for determining what levels of authority are optimal procedures) for authorizing certain types of changes. SCM Activities (configuration identification, Care must be taken to keep the focus of the surveillance on configuration control, and so on) the insights that can be gained from the measurements, not SCM Schedules (coordination with other project on the measurements themselves. Discussion of process activities) and product measurement is presented in the Software SCM Resources (tools, physical resources, and human Engineering Process KA. The software measurement resources) program is described in the Software Engineering Management KA. SCMP Maintenance1.5. Surveillance of Software Configuration 1.5.2. In-process audits of SCM Management [Buc96:c15] [Pau93:L2-87] Audits can be carried out during the software engineeringAfter the SCM process has been implemented, some degree process to investigate the current status of specific elementsof surveillance may be necessary to ensure that the of the configuration or to assess the implementation of theprovisions of the SCMP are properly carried out (see, for SCM process. In-process auditing of SCM provides a moreexample [Buc96]). There are likely to be specific SQA formal mechanism for monitoring selected aspects of therequirements for ensuring compliance with specified SCM process and may be coordinated with the SQA function. See also subarea 5 Software Configuration Auditing.© IEEE – 2004 Version 7–5
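As a concrete illustration of the measurement discussed in 1.5.1, the Python sketch below computes the time required to accomplish changes, grouped by change type, from a log of closed change requests. The record layout is invented for the example and is not prescribed by the Guide or by any SCM tool.

```python
from datetime import date
from statistics import mean
from collections import defaultdict

# Hypothetical log of closed software change requests (SCRs).
CLOSED_SCRS = [
    {"id": "SCR-1", "type": "defect",      "opened": date(2004, 1, 5), "closed": date(2004, 1, 12)},
    {"id": "SCR-2", "type": "enhancement", "opened": date(2004, 1, 7), "closed": date(2004, 2, 20)},
    {"id": "SCR-3", "type": "defect",      "opened": date(2004, 2, 1), "closed": date(2004, 2, 4)},
]

def days_to_close_by_type(scrs):
    """Average calendar days from submission to closure, per change type."""
    durations = defaultdict(list)
    for scr in scrs:
        durations[scr["type"]].append((scr["closed"] - scr["opened"]).days)
    return {change_type: mean(days) for change_type, days in durations.items()}

if __name__ == "__main__":
    for change_type, avg_days in days_to_close_by_type(CLOSED_SCRS).items():
        print(f"{change_type}: {avg_days:.1f} days on average")
```

The point of such a report is the insight it supports (for example, whether approval levels are set appropriately for each change type), not the numbers themselves.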
  • 110. 2. Software Configuration Identification map the identified items to the software structure, as well as [IEEE12207.0-96:c6s2.2] the need to support the evolution of the software items andThe software configuration identification activity identifies their relationships.items to be controlled, establishes identification schemes 2.1.4. Software versionfor the items and their versions, and establishes the tools [Bab86:c2]and techniques to be used in acquiring and managingcontrolled items. These activities provide the basis for the Software items evolve as a software project proceeds. Aother SCM activities. version of a software item is a particular identified and specified item. It can be thought of as a state of an evolving2.1. Identifying Items to Be Controlled item. [Con98:c3-c5] A revision is a new version of an item [Ber92:c8; IEEE828-98:c4s3.1; Pau93:L2-83; that is intended to replace the old version of the item. A Som05:c29] variant is a new version of an item that will be added to theA first step in controlling change is to identify the software configuration without replacing the old version.items to be controlled. This involves understanding thesoftware configuration within the context of the system 2.1.5. Baselineconfiguration, selecting software configuration items, [Bab86:c5; Buc96:c4; Pre04:c27]developing a strategy for labeling software items and A software baseline is a set of software configuration itemsdescribing their relationships, and identifying the baselines formally designated and fixed at a specific time during theto be used, along with the procedure for a baseline’s software life cycle. The term is also used to refer to aacquisition of the items. particular version of a software configuration item that has2.1.1. Software configuration been agreed on. In either case, the baseline can only be [Buc96:c4; c6, Pre04:c27] changed through formal change control procedures. A baseline, together with all approved changes to theA software configuration is the set of functional and baseline, represents the current approved configuration.physical characteristics of software as set forth in the Commonly used baselines are the functional, allocated,technical documentation or achieved in a product developmental, and product baselines (see, for example,(IEEE610.12-90). It can be viewed as a part of an overall [Ber92]). The functional baseline corresponds to thesystem configuration. reviewed system requirements. The allocated baseline2.1.2. Software configuration item corresponds to the reviewed software requirements [Buc96:c4;c6; Con98:c2; Pre04:c27] specification and software interface requirements specification. The developmental baseline represents theA software configuration item (SCI) is an aggregation of evolving software configuration at selected times during thesoftware designated for configuration management and is software life cycle. Change authority for this baselinetreated as a single entity in the SCM process (IEEE610.12- typically rests primarily with the development organization,90). A variety of items, in addition to the code itself, is but may be shared with other organizations (for example,typically controlled by SCM. Software items with potential SCM or Test). The product baseline corresponds to theto become SCIs include plans, specifications and design completed software product delivered for systemdocumentation, testing materials, software tools, source and integration. 
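The distinctions drawn in 2.1.4 and 2.1.5 between versions, revisions, variants, and baselines can be made concrete with a small data model. The Python sketch below is a simplified illustration under the assumptions of this section only (names such as ItemVersion, Baseline, and apply_approved_changes are hypothetical); it treats a baseline as a frozen set of specific item versions, evolved only by recording approved changes against it.

```python
from dataclasses import dataclass
from typing import Dict, FrozenSet, Tuple

@dataclass(frozen=True)
class ItemVersion:
    """One identified state of a software configuration item."""
    item: str        # e.g. "SRS", "payroll.c"
    version: str     # a revision replaces a version; a variant is added alongside it

@dataclass(frozen=True)
class Baseline:
    """A set of item versions formally designated and fixed at a point in the life cycle."""
    name: str                              # e.g. "allocated", "developmental", "product"
    contents: FrozenSet[ItemVersion]

def apply_approved_changes(baseline: Baseline,
                           replacements: Dict[str, str],
                           new_name: str) -> Baseline:
    """Current approved configuration = baseline plus approved changes.
    `replacements` maps item name -> newly approved version identifier."""
    updated: Tuple[ItemVersion, ...] = tuple(
        ItemVersion(iv.item, replacements.get(iv.item, iv.version))
        for iv in baseline.contents
    )
    return Baseline(new_name, frozenset(updated))

if __name__ == "__main__":
    allocated = Baseline("allocated", frozenset({
        ItemVersion("SRS", "B"), ItemVersion("SDD", "A"), ItemVersion("Code", "A"),
    }))
    # An approved change produces revision "B" of the code without touching the SRS.
    developmental = apply_approved_changes(allocated, {"Code": "B"}, "developmental")
    print(sorted((iv.item, iv.version) for iv in developmental.contents))
```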
The baselines to be used for a given project,executable code, code libraries, data and data dictionaries, along with their associated levels of authority needed forand documentation for installation, maintenance, change approval, are typically identified in the SCMP.operations, and software use.Selecting SCIs is an important process in which a balance 2.1.6. Acquiring software configuration itemsmust be achieved between providing adequate visibility for [Buc96:c4]project control purposes and providing a manageable Software configuration items are placed under SCM controlnumber of controlled items. A list of criteria for SCI at different times; that is, they are incorporated into aselection is given in [Ber92]. particular baseline at a particular point in the software life2.1.3. Software configuration item relationships cycle. The triggering event is the completion of some form [Con98:c2; Pre04:c27] of formal acceptance task, such as a formal review. Figure 2 characterizes the growth of baselined items as the lifeThe structural relationships among the selected SCIs, and cycle proceeds. This figure is based on the waterfall modeltheir constituent parts, affect other SCM activities or tasks, for purposes of illustration only; the subscripts used in thesuch as software building or analyzing the impact of figure indicate versions of the evolving items. The softwareproposed changes. Proper tracking of these relationships is change request (SCR) is described in topic 3.1 Requesting,also important for supporting traceability. The design of the Evaluating, and Approving Software Changes.identification scheme for SCIs should consider the need to 7–6 © IEEE – 2004 Version
  • 111. Requirements Design Test Readiness Acceptance approving certain changes, support for the implementation Review Review Review of those changes, and the concept of formal deviations from project requirements, as well as waivers of them. SRSA SRSB SRSC SRSD Information derived from these activities is useful in measuring change traffic and breakage, and aspects of SDDB SDDC SDDA rework. SCR control CodeA CodeB 3.1. Requesting, Evaluating, and Approving Software of SRS Test Test Changes models [IEEE828-98:c4s3.2; Pre04:c27; Som05:c29] SCR control PlansA PlansB of SRS, SDD The first step in managing changes to controlled items is models User ManualA determining what changes to make. The software change SCR control of SRS, SDD request process (see Figure 5) provides formal procedures Code, Test Regression for submitting and recording change requests, evaluating Plans Test DBA the potential cost and impact of a proposed change, and Figure 4 Acquisition of items accepting, modifying, or rejecting the proposed change. Requests for changes to software configuration items mayFollowing the acquisition of an SCI, changes to the item be originated by anyone at any point in the software lifemust be formally approved as appropriate for the SCI and cycle and may include a suggested solution and requestedthe baseline involved, as defined in the SCMP. Following priority. One source of change requests is the initiation ofapproval, the item is incorporated into the software baseline corrective action in response to problem reports. Regardlessaccording to the appropriate procedure. of the source, the type of change (for example, defect or2.2. Software Library enhancement) is usually recorded on the SCR. [Bab86:c2; c5; Buc96:c4; IEEE828- 98:c4s3.1; Pau93:L2-82; Som01:c29] Need for Change Preliminary InvestigationA software library is a controlled collection of software andrelated documentation designed to aid in software Changedevelopment, use, and maintenance (IEEE610.12-90). It is identified for controlled item CCB Review Rejected Inform Requesteralso instrumental in software release management anddelivery activities. Several types of libraries might be used, SCR generated Approvedeach corresponding to a particular level of maturity of the or updated Assign to Software ‘Emergency Path’software item. For example, a working library could Engineer usually also exists.support coding and a project support library could support Schedule, Changes can be implemented with SCR evaluated incompletetesting, while a master library could be used for finished design, test, complete change change process performed afterward completeproducts. An appropriate level of SCM control (associatedbaseline and level of authority for change) is associated Figure 5 Flow of a Change Control Processwith each library. Security, in terms of access control and This provides an opportunity for tracking defects andthe backup facilities, is a key aspect of library management. collecting change activity measurements by change type.A model of a software library is described in [Ber92:c14]. Once an SCR is received, a technical evaluation (alsoThe tool(s) used for each library must support the SCM known as an impact analysis) is performed to determine thecontrol needs for that library, both in terms of controlling extent of the modifications that would be necessary shouldSCIs and controlling access to the library. At the working the change request be accepted. 
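The flow of Figure 5 can be read as a small state machine: a request is recorded, investigated, reviewed by the CCB, and then either rejected (with the requester informed) or approved, assigned, scheduled, implemented, and verified. The Python sketch below encodes that reading only as an illustration; the state names and transition table are a paraphrase of the figure made for this example, not a normative process definition.

```python
# Allowed transitions of a software change request (SCR), paraphrasing Figure 5.
TRANSITIONS = {
    "submitted":           {"under_investigation"},
    "under_investigation": {"ccb_review"},
    "ccb_review":          {"approved", "rejected"},
    "rejected":            {"requester_informed"},
    "approved":            {"assigned"},
    "assigned":            {"scheduled"},
    "scheduled":           {"implemented"},        # design, code, and test the change
    "implemented":         {"verified"},
    "verified":            {"closed"},
}

class SCR:
    def __init__(self, scr_id: str):
        self.scr_id = scr_id
        self.state = "submitted"
        self.history = ["submitted"]

    def advance(self, new_state: str) -> None:
        """Move to a new state only if the change control process allows it."""
        if new_state not in TRANSITIONS.get(self.state, set()):
            raise ValueError(f"{self.scr_id}: illegal transition {self.state} -> {new_state}")
        self.state = new_state
        self.history.append(new_state)

if __name__ == "__main__":
    scr = SCR("SCR-42")
    for step in ["under_investigation", "ccb_review", "approved",
                 "assigned", "scheduled", "implemented", "verified", "closed"]:
        scr.advance(step)
    print(" -> ".join(scr.history))
```

An "emergency path" such as the one noted in the figure would simply be a second, shorter transition table applied under defined conditions, with the full change process performed afterward.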
A good understanding oflibrary level, this is a code management capability serving the relationships among software (and possibly, hardware)developers, maintainers, and SCM. It is focused on items is important for this task. Finally, an establishedmanaging the versions of software items while supporting authority, commensurate with the affected baseline, the SCIthe activities of multiple developers. At higher levels of involved, and the nature of the change, will evaluate thecontrol, access is more restricted and SCM is the primary technical and managerial aspects of the change request anduser. either accept, modify, reject, or defer the proposed change.These libraries are also an important source of information 3.1.1. Software Configuration Control Boardfor measurements of work and progress. [Ber92:c9; Buc96:c9,c11; Pre04:c27]3. Software Configuration Control The authority for accepting or rejecting proposed changes rests with an entity typically known as a Configuration [IEEE12207.0-96:c6s2.3; Pau93:L2-84] Control Board (CCB). In smaller projects, this authoritySoftware configuration control is concerned with managing may actually reside with the leader or an assignedchanges during the software life cycle. It covers the process individual rather than a multi-person board. There can befor determining what changes to make, the authority for multiple levels of change authority depending on a variety© IEEE – 2004 Version 7–7
  • 112. of criteria, such as the criticality of the item involved, the activities might contain provisions which cannot benature of the change (for example, impact on budget and satisfied at the designated point in the life cycle. Aschedule), or the current point in the life cycle. The deviation is an authorization to depart from a provisioncomposition of the CCBs used for a given system varies prior to the development of the item. A waiver is andepending on these criteria (an SCM representative would authorization to use an item, following its development,always be present). All stakeholders, appropriate to the that departs from the provision in some way. In these cases,level of the CCB, are represented. When the scope of a formal process is used for gaining approval for deviationsauthority of a CCB is strictly software, it is known as a from, or waivers of, the provisions.Software Configuration Control Board (SCCB). Theactivities of the CCB are typically subject to software 4. Software Configuration Status Accountingquality audit or review. [IEEE12207.0-96:c6s2.4; Pau93:L2-85; Pre04:c27; Som05:c29]3.1.2. Software change request process [Buc96:c9,c11; Pre04:c27] Software configuration status accounting (SCSA) is the recording and reporting of information needed for effectiveAn effective software change request (SCR) process management of the software configuration.requires the use of supporting tools and procedures rangingfrom paper forms and a documented procedure to an 4.1. Software Configuration Status Informationelectronic tool for originating change requests, enforcing [Buc96:c13; IEEE828-98:c4s3.3]the flow of the change process, capturing CCB decisions, The SCSA activity designs and operates a system for theand reporting change process information. A link between capture and reporting of necessary information as the lifethis tool capability and the problem-reporting system can cycle proceeds. As in any information system, thefacilitate the tracking of solutions for reported problems. configuration status information to be managed for theChange process descriptions and supporting forms evolving configurations must be identified, collected, and(information) are given in a variety of references, for maintained. Various information and measurements areexample [Ber92:c9]. needed to support the SCM process and to meet the3.2. Implementing Software Changes configuration status reporting needs of management, [Bab86:c6; Ber92:c9; Buc96:c9,c11; IEEE828- software engineering, and other related activities. The types 98:c4s3.2.4; Pre04:c27; Som05:c29] of information available include the approved configuration identification, as well as the identification and currentApproved SCRs are implemented using the defined implementation status of changes, deviations, and waivers.software procedures in accordance with the applicable A partial list of important data elements is given inschedule requirements. Since a number of approved SCRs [Ber92:c10].might be implemented simultaneously, it is necessary toprovide a means for tracking which SCRs are incorporated Some form of automated tool support is necessary tointo particular software versions and baselines. As part of accomplish the SCSA data collection and reporting tasks.the closure of the change process, completed changes may This could be a database capability, or it could be a stand-undergo configuration audits and software quality alone tool or a capability of a larger, integrated toolverification. 
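Because change authority can sit at several levels depending on criteria such as the criticality of the item, the nature of the change, and the current point in the life cycle, projects sometimes write those routing rules down explicitly. The Python sketch below shows one hypothetical way to encode such rules; the thresholds, criteria, and board names are illustrative only and are not taken from the Guide or any standard.

```python
def change_authority(criticality: str, affects_budget_or_schedule: bool,
                     baseline: str) -> str:
    """Pick the level of change authority for a software change request.
    All rules below are illustrative assumptions, not prescribed practice."""
    if baseline == "product" or criticality == "safety-critical":
        return "system-level CCB"            # highest authority for delivered or critical items
    if affects_budget_or_schedule:
        return "project CCB"
    if baseline == "developmental":
        return "SCCB"                         # software-only board
    return "team lead"                        # small changes on work not yet baselined

if __name__ == "__main__":
    examples = [
        ("safety-critical", False, "developmental"),
        ("routine", True, "developmental"),
        ("routine", False, "developmental"),
        ("routine", False, "product"),
    ]
    for criticality, money, baseline in examples:
        print(criticality, money, baseline, "->",
              change_authority(criticality, money, baseline))
```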
This includes ensuring that only approved environment.changes have been made. The change request process 4.2. Software Configuration Status Reportingdescribed above will typically document the SCM (and [Ber92:c10; Buc96:c13]other) approval information for the change. Reported information can be used by various organizationalThe actual implementation of a change is supported by and project elements, including the development team, thethe library tool capabilities, which provide version maintenance team, project management, and softwaremanagement and code repository support. At a minimum, quality activities. Reporting can take the form of ad hocthese tools provide check-in/out and associated version queries to answer specific questions or the periodiccontrol capabilities. More powerful tools can support production of predesigned reports. Some informationparallel development and geographically distributed produced by the status accounting activity during theenvironments. These tools may be manifested as separate course of the life cycle might become quality assurancespecialized applications under the control of an independent records.SCM group. They may also appear as an integrated part of In addition to reporting the current status of thethe software engineering environment. Finally, they may be configuration, the information obtained by the SCSA canas elementary as a rudimentary change control system serve as a basis for various measurements of interest toprovided with an operating system. management, development, and SCM. Examples include3.3. Deviations and Waivers the number of change requests per SCI and the average [Ber92:c9; Buc96:c12] time needed to implement a change request.The constraints imposed on a software engineering effort orthe specifications produced during the development 7–8 © IEEE – 2004 Version
  • 113. well as distribution to customers. When different versions5. Software Configuration Auditing of a software item are available for delivery, such as [IEEE828-98:c4s3.4; IEEE12207.0-96:c6s2.5; versions for different platforms or versions with varying Pau93:L2-86; Pre04:c26c27] capabilities, it is frequently necessary to recreate specific versions and package the correct materials for delivery ofA software audit is an activity performed to independently the version. The software library is a key element inevaluate the conformance of software products and accomplishing release and delivery tasks.processes to applicable regulations, standards, guidelines,plans, and procedures (IEEE1028-97). Audits are 6.1. Software Buildingconducted according to a well-defined process consisting of [Bab86:c6; Som05:c29]various auditor roles and responsibilities. Consequently, Software building is the activity of combining the correcteach audit must be carefully planned. An audit can require versions of software configuration items, using thea number of individuals to perform a variety of tasks over a appropriate configuration data, into an executable programfairly short period of time. Tools to support the planning for delivery to a customer or other recipient, such as theand conduct of an audit can greatly facilitate the process. testing activity. For systems with hardware or firmware, theGuidance for conducting software audits is available in executable program is delivered to the system-buildingvarious references, such as [Ber92:c11; Buc96:c15] and activity. Build instructions ensure that the proper build(IEEE1028-97). steps are taken and in the correct sequence. In addition toThe software configuration auditing activity determines the building software for new releases, it is usually alsoextent to which an item satisfies the required functional and necessary for SCM to have the capability to reproducephysical characteristics. Informal audits of this type can be previous releases for recovery, testing, maintenance, orconducted at key points in the life cycle. Two types of additional release purposes.formal audits might be required by the governing contract Software is built using particular versions of supporting(for example, in contracts covering critical software): the tools, such as compilers. It might be necessary to rebuild anFunctional Configuration Audit (FCA) and the Physical exact copy of a previously built software configurationConfiguration Audit (PCA). Successful completion of item. In this case, the supporting tools and associated buildthese audits can be a prerequisite for the establishment of instructions need to be under SCM control to ensurethe product baseline. Buckley [Buc96:c15] contrasts the availability of the correct versions of the tools.purposes of the FCA and PCA in hardware versus software A tool capability is useful for selecting the correct versionscontexts, and recommends careful evaluation of the need of software items for a given target environment and forfor a software FCA and PCA before performing them. automating the process of building the software from the5.1. Software Functional Configuration Audit selected versions and appropriate configuration data. ForThe purpose of the software FCA is to ensure that the large projects with parallel development or distributedaudited software item is consistent with its governing development environments, this tool capability isspecifications. The output of the software verification and necessary. 
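One mechanical piece of a physical configuration audit, checking that the as-built product matches the items recorded for the baseline, can be automated. The Python sketch below compares file checksums against a baseline manifest; it is a hedged illustration (the manifest format and file names are invented here) rather than a description of how an FCA or PCA is actually conducted.

```python
import hashlib
from pathlib import Path

def checksum(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def physical_audit(baseline_manifest: dict, build_dir: str) -> list:
    """Report items whose as-built files are missing or differ from the baseline.
    `baseline_manifest` maps a relative file name to its expected sha256 digest."""
    findings = []
    root = Path(build_dir)
    for rel_name, expected in baseline_manifest.items():
        built = root / rel_name
        if not built.exists():
            findings.append((rel_name, "missing from as-built product"))
        elif checksum(built) != expected:
            findings.append((rel_name, "differs from baselined item"))
    return findings

if __name__ == "__main__":
    # Tiny self-contained demonstration: baseline one file, then tamper with it.
    root = Path("demo_build")
    root.mkdir(exist_ok=True)
    (root / "module.c").write_text("int main(void) { return 0; }\n")
    manifest = {"module.c": checksum(root / "module.c")}
    (root / "module.c").write_text("int main(void) { return 1; }\n")   # unapproved edit
    print(physical_audit(manifest, "demo_build"))
```

The judgment in an audit, deciding whether a discrepancy matters and whether the documentation still describes the product, remains a human activity; the script only supplies evidence.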
Most software engineering environmentsvalidation activities is a key input to this audit. provide this capability. These tools vary in complexity from requiring the software engineer to learn a specialized5.2. Software Physical Configuration Audit scripting language to graphics-oriented approaches thatThe purpose of the software physical configuration audit hide much of the complexity of an “intelligent” build(PCA) is to ensure that the design and reference facility.documentation is consistent with the as-built software The build process and products are often subject toproduct. software quality verification. Outputs of the build process5.3. In-process Audits of a Software Baseline might be needed for future reference and may becomeAs mentioned above, audits can be carried out during the quality assurance records.development process to investigate the current status of 6.2. Software Release Managementspecific elements of the configuration. In this case, an audit [Som05:c29]could be applied to sampled baseline items to ensure that Software release management encompasses theperformance is consistent with specifications or to ensure identification, packaging, and delivery of the elements of athat evolving documentation continues to be consistent with product, for example, executable program, documentation,the developing baseline item. release notes, and configuration data. Given that product changes can occur on a continuing basis, one concern for6. Software Release Management and Delivery release management is determining when to issue a release. [IEEE12207.0-96:c6s2.6] The severity of the problems addressed by the release andThe term “release” is used in this context to refer to the measurements of the fault densities of prior releases affectdistribution of a software configuration item outside the this decision. (Som01) The packaging task must identifydevelopment activity. This includes internal releases as which product items are to be delivered, and then select the© IEEE – 2004 Version 7–9
  • 114. correct variants of those items, given the intended distribution of the product to various customers or targetapplication of the product. The information documenting systems. An example would be a case where the supplierthe physical contents of a release is known as a version was required to notify a customer of newly reporteddescription document. The release notes typically describe problems.new capabilities, known problems, and platform A tool capability is needed for supporting these releaserequirements necessary for proper product operation. The management functions. It is useful to have a connectionpackage to be released also contains installation or with the tool capability supporting the change requestupgrading instructions. The latter can be complicated by the process in order to map release contents to the SCRs thatfact that some current users might have versions that are have been received. This tool capability might alsoseveral releases old. Finally, in some cases, the release maintain information on various target platforms and onmanagement activity might be required to track the various customer environments. 7–10 © IEEE – 2004 Version
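Drawing the building and release threads together, the Python sketch below selects the item versions recorded for a release, runs a placeholder build step, and emits a simple version description document listing the delivered contents, the incorporated change requests, and known problems. Everything in it (file names, fields, the release record) is hypothetical and only illustrates the bookkeeping described above.

```python
import json
from datetime import date
from pathlib import Path

RELEASE = {
    "release_id": "2.3.0",
    "items": {"payroll.c": "C", "user_manual.pdf": "B"},   # item name -> baselined version
    "incorporated_scrs": ["SCR-40", "SCR-42"],
    "known_problems": ["SCR-47 deferred to the next release"],
    "platforms": ["win32"],
}

def build(release: dict) -> None:
    """Placeholder build step; a real project would invoke its pinned toolchain here,
    using the exact tool versions kept under SCM control so the build is reproducible."""
    print(f"building release {release['release_id']} from baselined item versions:",
          release["items"])

def write_version_description(release: dict, out_dir: str = "dist") -> Path:
    """Record what is being delivered: contents, incorporated changes, known problems."""
    document = {
        "release": release["release_id"],
        "date": date.today().isoformat(),
        "contents": release["items"],
        "incorporated_change_requests": release["incorporated_scrs"],
        "known_problems": release["known_problems"],
        "platforms": release["platforms"],
    }
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    path = out / f"version_description_{release['release_id']}.json"
    path.write_text(json.dumps(document, indent=2))
    return path

if __name__ == "__main__":
    build(RELEASE)
    print("wrote", write_version_description(RELEASE))
```

Mapping release contents back to the change requests they incorporate is also what lets a supplier notify the right customers when newly reported problems affect an older release.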
MATRIX OF TOPICS VS. REFERENCE MATERIAL

[Matrix (original page 7-11): each topic in the Software Configuration Management breakdown (1. Management of the SCM Process; 2. Software Configuration Identification; 3. Software Configuration Control; 4. Software Configuration Status Accounting; 5. Software Configuration Auditing; 6. Software Release Management and Delivery; and their subtopics) is mapped to the relevant chapters and sections of [Bab86], [Ber92], [Buc96], [Con98], [Dar90], [IEEE828-98], [IEEE12207.0-96], [Mid97], [Moo98], [Pau93], [Pre04], [Roy98], and [Som05].]
RECOMMENDED REFERENCES FOR SCM

[Bab86] W.A. Babich, Software Configuration Management: Coordination for Team Productivity, Addison-Wesley, 1986.
[Ber92] H.R. Berlack, Software Configuration Management, John Wiley & Sons, 1992.
[Buc96] F.J. Buckley, Implementing Configuration Management: Hardware, Software, and Firmware, second ed., IEEE Computer Society Press, 1996.
[Con98] R. Conradi and B. Westfechtel, “Version Models for Software Configuration Management,” ACM Computing Surveys, vol. 30, iss. 2, June 1998.
[Dar90] S.A. Dart, Spectrum of Functionality in Configuration Management Systems, Software Engineering Institute, Carnegie Mellon University, 1990.
[IEEE828-98] IEEE Std 828-1998, IEEE Standard for Software Configuration Management Plans, IEEE, 1998.
[IEEE12207.0-96] IEEE/EIA 12207.0-1996//ISO/IEC 12207:1995, Industry Implementation of Int. Std. ISO/IEC 12207:95, Standard for Information Technology - Software Life Cycle Processes, IEEE, 1996.
[Mid97] A.K. Midha, “Software Configuration Management for the 21st Century,” Bell Labs Technical Journal, vol. 2, iss. 1, Winter 1997, pp. 154-165.
[Moo98] J.W. Moore, Software Engineering Standards: A User’s Roadmap, IEEE Computer Society, 1998.
[Pau93] M.C. Paulk et al., “Key Practices of the Capability Maturity Model, Version 1.1,” technical report CMU/SEI-93-TR-025, Software Engineering Institute, Carnegie Mellon University, 1993.
[Pre04] R.S. Pressman, Software Engineering: A Practitioner’s Approach, sixth ed., McGraw-Hill, 2004.
[Roy98] W. Royce, Software Project Management: A Unified Framework, Addison-Wesley, 1998.
[Som05] I. Sommerville, Software Engineering, seventh ed., Addison-Wesley, 2005.
APPENDIX B. LIST OF STANDARDS

(IEEE730-02) IEEE Std 730-2002, IEEE Standard for Software Quality Assurance Plans, IEEE, 2002.
(IEEE828-98) IEEE Std 828-1998, IEEE Standard for Software Configuration Management Plans, IEEE, 1998.
(IEEE1028-97) IEEE Std 1028-1997 (R2002), IEEE Standard for Software Reviews, IEEE, 1997.
(IEEE12207.0-96) IEEE/EIA 12207.0-1996//ISO/IEC 12207:1995, Industry Implementation of Int. Std. ISO/IEC 12207:95, Standard for Information Technology - Software Life Cycle Processes, IEEE, 1996.
(IEEE12207.1-96) IEEE/EIA 12207.1-1996, Industry Implementation of Int. Std. ISO/IEC 12207:95, Standard for Information Technology - Software Life Cycle Processes - Life Cycle Data, IEEE, 1996.
(IEEE12207.2-97) IEEE/EIA 12207.2-1997, Industry Implementation of Int. Std. ISO/IEC 12207:95, Standard for Information Technology - Software Life Cycle Processes - Implementation Considerations, IEEE, 1997.
(ISO15846-98) ISO/IEC TR 15846:1998, Information Technology - Software Life Cycle Processes - Configuration Management, ISO and IEC, 1998.
CHAPTER 8
SOFTWARE ENGINEERING MANAGEMENT

ACRONYMS
PMBOK  Guide to the Project Management Body of Knowledge
SQA    Software Quality Assurance

INTRODUCTION
Software Engineering Management can be defined as the application of management activities—planning, coordinating, measuring, monitoring, controlling, and reporting—to ensure that the development and maintenance of software is systematic, disciplined, and quantified (IEEE610.12-90).
The Software Engineering Management KA therefore addresses the management and measurement of software engineering. While measurement is an important aspect of all KAs, it is here that the topic of measurement programs is presented.
While it is true to say that in one sense it should be possible to manage software engineering in the same way as any other (complex) process, there are aspects specific to software products and the software life cycle processes which complicate effective management—just a few of which are as follows:
* The perception of clients is such that there is often a lack of appreciation for the complexity inherent in software engineering, particularly in relation to the impact of changing requirements.
* It is almost inevitable that the software engineering processes themselves will generate the need for new or changed client requirements.
* As a result, software is often built in an iterative process rather than a sequence of closed tasks.
* Software engineering necessarily incorporates aspects of creativity and discipline—maintaining an appropriate balance between the two is often difficult.
* The degree of novelty and complexity of software is often extremely high.
* There is a rapid rate of change in the underlying technology.
With respect to software engineering, management activities occur at three levels: organizational and infrastructure management, project management, and measurement program planning and control. The last two are covered in detail in this KA description. However, this is not to diminish the importance of organizational management issues.
Since the link to the related disciplines—obviously management—is important, it will be described in more detail than in the other KA descriptions. Aspects of organizational management are important in terms of their impact on software engineering—on policy management, for instance: organizational policies and standards provide the framework in which software engineering is undertaken. These policies may need to be influenced by the requirements of effective software development and maintenance, and a number of software engineering-specific policies may need to be established for effective management of software engineering at an organizational level. For example, policies are usually necessary to establish specific organization-wide processes or procedures for such software engineering tasks as designing, implementing, estimating, tracking, and reporting. Such policies are essential to effective long-term software engineering management, by establishing a consistent basis on which to analyze past performance and implement improvements, for example.
Another important aspect of management is personnel management: policies and procedures for hiring, training, and motivating personnel and mentoring for career development are important not only at the project level but also to the longer-term success of an organization. Software engineering personnel may present unique training or personnel management challenges (for example, maintaining currency in a context where the underlying technology undergoes continuous and rapid change). Communication management is also often mentioned as an overlooked but major aspect of the performance of individuals in a field where precise understanding of user needs and of complex requirements and designs is necessary. Finally, portfolio management, which is the capacity to have an overall vision not only of the set of software under development but also of the software already in use in an organization, is necessary. Furthermore, software reuse is a key factor in maintaining and improving productivity and competitiveness. Effective reuse requires a strategic vision that reflects the unique power and requirements of this technique.
In addition to understanding the aspects of management that are uniquely influenced by software, software engineers must have some knowledge of the more general aspects, even in the first four years after graduation that is targeted in the Guide.
Organizational culture and behavior, and functional enterprise management in terms of procurement, supply chain management, marketing, sales, and distribution, all have an influence, albeit indirectly, on an organization’s software engineering process.
Relevant to this KA is the notion of project management, as “the construction of useful software artifacts” is normally managed in the form of (perhaps programs of) individual projects. In this regard, we find extensive support in the Guide to the Project Management Body of Knowledge (PMBOK) (PMI00), which itself includes the following project management KAs: project integration management, project scope management, project time management, project cost management, project quality management, project human resource management, and project communications management. Clearly, all these topics have direct relevance to the Software Engineering Management KA. To attempt to duplicate the content of the Guide to the PMBOK here would be both impossible and inappropriate. Instead, we suggest that the reader interested in project management beyond what is specific to software engineering projects consult the PMBOK itself. Project management is also found in the Related Disciplines of Software Engineering chapter.
The Software Engineering Management KA consists of both the software project management process, in its first five subareas, and software engineering measurement in the last subarea. While these two subjects are often regarded as being separate, and indeed they do possess many unique aspects, their close relationship has led to their combined treatment in this KA. Unfortunately, a common perception of the software industry is that it delivers products late, over budget, and of poor quality and uncertain functionality. Measurement-informed management — an assumed principle of any true engineering discipline — can help to turn this perception around. In essence, management without measurement, qualitative and quantitative, suggests a lack of rigor, and measurement without management suggests a lack of purpose or context. In the same way, however, management and measurement without expert knowledge is equally ineffectual, so we must be careful to avoid over-emphasizing the quantitative aspects of Software Engineering Management (SEM). Effective management requires a combination of both numbers and experience.
The following working definitions are adopted here:
* Management process refers to the activities that are undertaken in order to ensure that the software engineering processes are performed in a manner consistent with the organization’s policies, goals, and standards.
* Measurement refers to the assignment of values and labels to aspects of software engineering (products, processes, and resources as defined by [Fen98]) and the models that are derived from them, whether these models are developed using statistical, expert knowledge, or other techniques.
The software engineering project management subareas make extensive use of the software engineering measurement subarea.
Not unexpectedly, this KA is closely related to others in the Guide to the SWEBOK, and reading the following KA descriptions in conjunction with this one would be particularly useful.
* Software Requirements, where some of the activities to be performed during the Initiation and Scope definition phase of the project are described
* Software Configuration Management, as this deals with the identification, control, status accounting, and audit of the software configuration along with software release management and delivery
* Software Engineering Process, because processes and projects are closely related (this KA also describes process and product measurement)
* Software Quality, as quality is constantly a goal of management and is an aim of many activities that must be managed

BREAKDOWN OF TOPICS FOR SOFTWARE ENGINEERING MANAGEMENT
As the Software Engineering Management KA is viewed here as an organizational process which incorporates the notion of process and project management, we have created a breakdown that is both topic-based and life cycle-based. However, the primary basis for the top-level breakdown is the process of managing a software engineering project. There are six major subareas. The first five subareas largely follow the IEEE/EIA 12207 Management Process. The six subareas are:
* Initiation and scope definition, which deals with the decision to initiate a software engineering project
* Software project planning, which addresses the activities undertaken to prepare for successful software engineering from a management perspective
* Software project enactment, which deals with generally accepted software engineering management activities that occur during software engineering
* Review and evaluation, which deal with assurance that the software is satisfactory
* Closure, which addresses the post-completion activities of a software engineering project
* Software engineering measurement, which deals with the effective development and implementation of measurement programs in software engineering organizations (IEEE12207.0-96)
The breakdown of topics for the Software Engineering Management KA is shown in Figure 1.
1. Initiation and Scope Definition
The focus of this set of activities is on the effective determination of software requirements via various elicitation methods and the assessment of the project’s feasibility from a variety of standpoints. Once feasibility has been established, the remaining task within this process is the specification of requirements validation and change procedures (see also the Software Requirements KA).

1.1. Determination and Negotiation of Requirements
[Dor02: v2c4; Pfl01: c4; Pre04: c7; Som05: c5]
Software requirement methods for requirements elicitation (for example, observation), analysis (for example, data modeling, use-case modeling), specification, and validation (for example, prototyping) must be selected and applied, taking into account the various stakeholder perspectives. This leads to the determination of project scope, objectives, and constraints. This is always an important activity, as it sets the visible boundaries for the set of tasks being undertaken, and is particularly so where the novelty of the undertaking is high. Additional information can be found in the Software Requirements KA.
1.2. Feasibility Analysis (Technical, Operational, Financial, Social/Political)
[Pre04: c6; Som05: c6]
Software engineers must be assured that adequate capability and resources are available in the form of people, expertise, facilities, infrastructure, and support (either internally or externally) to ensure that the project can be successfully completed in a timely and cost-effective manner (using, for example, a requirement-capability matrix). This often requires some “ballpark” estimation of effort and cost based on appropriate methods (for example, expert-informed analogy techniques).

1.3. Process for the Review and Revision of Requirements
Given the inevitability of change, it is vital that agreement among stakeholders is reached at this early point as to the means by which scope and requirements are to be reviewed and revised (for example, via agreed change management procedures). This clearly implies that scope and requirements will not be “set in stone” but can and should be revisited at predetermined points as the process unfolds (for example, at design reviews, management reviews). If changes are accepted, then some form of traceability analysis and risk analysis (see topic 2.5 Risk Management) should be used to ascertain the impact of those changes. A managed-change approach should also be useful when it comes time to review the outcome of the project, as the scope and requirements should form the basis for the evaluation of success. [Som05: c6] See also the software configuration control subarea of the Software Configuration Management KA.

2. Software Project Planning
The iterative planning process is informed by the scope and requirements and by the establishment of feasibility. At this point, software life cycle processes are evaluated and the most appropriate (given the nature of the project, its degree of novelty, its functional and technical complexity, its quality requirements, and so on) is selected. Where relevant, the project itself is then planned in the form of a hierarchical decomposition of tasks, the associated deliverables of each task are specified and characterized in terms of quality and other attributes in line with stated requirements, and detailed effort, schedule, and cost estimation is undertaken. Resources are then allocated to tasks so as to optimize personnel productivity (at individual, team, and organizational levels), equipment and materials utilization, and adherence to schedule. Detailed risk management is undertaken and the “risk profile” of the project is discussed among, and accepted by, all relevant stakeholders. Comprehensive software quality management processes are determined as part of the planning process in the form of procedures and responsibilities for software quality assurance, verification and validation, reviews, and audits (see the Software Quality KA). As an iterative process, it is vital that the processes and responsibilities for ongoing plan management, review, and revision are also clearly stated and agreed.

2.1. Process Planning
Selection of the appropriate software life cycle model (for example, spiral, evolutionary prototyping) and the adaptation and deployment of appropriate software life cycle processes are undertaken in light of the particular scope and requirements of the project. Relevant methods and tools are also selected. [Dor02: v1c6,v2c8; Pfl01: c2; Pre04: c2; Rei02: c1,c3,c5; Som05: c3; Tha97: c3] At the project level, appropriate methods and tools are used to decompose the project into tasks, with associated inputs, outputs, and completion conditions (for example, work breakdown structure). [Dor02: v2c7; Pfl01: c3; Pre04: c21; Rei02: c4,c5; Som05: c4; Tha97: c4,c6] This in turn influences decisions on the project’s high-level schedule and organization structure.

2.2. Determine Deliverables
The product(s) of each task (for example, architectural design, inspection report) are specified and characterized. [Pfl01: c3; Pre04: c24; Tha97: c4] Opportunities to reuse software components from previous developments or to utilize off-the-shelf software products are evaluated. Use of third parties and procured software are planned and suppliers are selected.

2.3. Effort, Schedule, and Cost Estimation
Based on the breakdown of tasks, inputs, and outputs, the expected effort range required for each task is determined using a calibrated estimation model based on historical size-effort data where available and relevant, or other methods like expert judgment. Task dependencies are established and potential bottlenecks are identified using suitable methods (for example, critical path analysis). Bottlenecks are resolved where possible, and the expected schedule of tasks with projected start times, durations, and end times is produced (for example, PERT chart). Resource requirements (people, tools) are translated into cost estimates. [Dor02: v2c7; Fen98: c12; Pfl01: c3; Pre04: c23,c24; Rei02: c5,c6; Som05: c4,c23; Tha97: c5] This is a highly iterative activity which must be negotiated and revised until consensus is reached among affected stakeholders (primarily engineering and management).

2.4. Resource Allocation
[Pfl01: c3; Pre04: c24; Rei02: c8,c9; Som05: c4; Tha97: c6,c7]
Equipment, facilities, and people are associated with the scheduled tasks, including the allocation of responsibilities for completion (using, for example, a Gantt chart). This activity is informed and constrained by the availability of resources and their optimal use under these circumstances, as well as by issues relating to personnel (for example, productivity of individuals/teams, team dynamics, organizational and team structures).
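As a purely illustrative aid to topic 2.3, the critical path calculation it mentions can be sketched as a forward pass over the task dependency graph. The sketch below is not part of the Guide; the task names, durations, and dependencies are hypothetical, and durations would in practice come from the calibrated size-effort models or expert judgment described above.

    # Minimal sketch of critical path analysis over a task breakdown (topic 2.3).
    # Task names, durations (person-days), and dependencies are hypothetical.
    from collections import deque

    tasks = {
        "requirements": {"duration": 10, "depends_on": []},
        "design":       {"duration": 15, "depends_on": ["requirements"]},
        "coding":       {"duration": 25, "depends_on": ["design"]},
        "test_plan":    {"duration": 8,  "depends_on": ["requirements"]},
        "testing":      {"duration": 12, "depends_on": ["coding", "test_plan"]},
    }

    def critical_path(tasks):
        # Forward pass in topological order: earliest finish of each task.
        indegree = {t: len(spec["depends_on"]) for t, spec in tasks.items()}
        successors = {t: [] for t in tasks}
        for t, spec in tasks.items():
            for dep in spec["depends_on"]:
                successors[dep].append(t)
        earliest_finish, predecessor = {}, {}
        ready = deque(t for t, d in indegree.items() if d == 0)
        while ready:
            t = ready.popleft()
            deps = tasks[t]["depends_on"]
            start = max((earliest_finish[d] for d in deps), default=0)
            earliest_finish[t] = start + tasks[t]["duration"]
            predecessor[t] = max(deps, key=earliest_finish.get) if deps else None
            for s in successors[t]:
                indegree[s] -= 1
                if indegree[s] == 0:
                    ready.append(s)
        # Walk back from the task finishing last to recover the critical path.
        end = max(earliest_finish, key=earliest_finish.get)
        path, node = [], end
        while node is not None:
            path.append(node)
            node = predecessor[node]
        return list(reversed(path)), earliest_finish[end]

    path, duration = critical_path(tasks)
    print("Critical path:", " -> ".join(path), "| minimum schedule:", duration, "days")

Tasks on the resulting path are the schedule bottlenecks; everything else has slack, which is what the negotiation and revision described in topic 2.3 works with.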
2.5. Risk Management
Risk identification and analysis (what can go wrong, how and why, and what are the likely consequences), critical risk assessment (which are the most significant risks in terms of exposure, which can we do something about in terms of leverage), risk mitigation and contingency planning (formulating a strategy to deal with risks and to manage the risk profile) are all undertaken. Risk assessment methods (for example, decision trees and process simulations) should be used in order to highlight and evaluate risks. Project abandonment policies should also be determined at this point in discussion with all other stakeholders. [Dor02: v2c7; Pfl01: c3; Pre04: c25; Rei02: c11; Som05: c4; Tha97: c4] Software-unique aspects of risk, such as software engineers’ tendency to add unwanted features or the risks attendant in software’s intangible nature, must influence the project’s risk management.

2.6. Quality Management
[Dor02: v1c8,v2c3-c5; Pre04: c26; Rei02: c10; Som05: c24,c25; Tha97: c9,c10]
Quality is defined in terms of pertinent attributes of the specific project and any associated product(s), perhaps in both quantitative and qualitative terms. These quality characteristics will have been determined in the specification of detailed software requirements. See also the Software Requirements KA.
Thresholds for adherence to quality are set for each indicator as appropriate to stakeholder expectations for the software at hand. Procedures relating to ongoing SQA throughout the process and for product (deliverable) verification and validation are also specified at this stage (for example, technical reviews and inspections) (see also the Software Quality KA).

2.7. Plan Management
[Som05: c4; Tha97: c4]
How the project will be managed and how the plan will be managed must also be planned. Reporting, monitoring, and control of the project must fit the selected software engineering process and the realities of the project, and must be reflected in the various artifacts that will be used for managing it. But, in an environment where change is an expectation rather than a shock, it is vital that plans are themselves managed. This requires that adherence to plans be systematically directed, monitored, reviewed, reported, and, where appropriate, revised. Plans associated with other management-oriented support processes (for example, documentation, software configuration management, and problem resolution) also need to be managed in the same manner.

3. Software Project Enactment
The plans are then implemented, and the processes embodied in the plans are enacted. Throughout, there is a focus on adherence to the plans, with an overriding expectation that such adherence will lead to the successful satisfaction of stakeholder requirements and achievement of the project objectives. Fundamental to enactment are the ongoing management activities of measuring, monitoring, controlling, and reporting.

3.1. Implementation of Plans
[Pfl01: c3; Som05: c4]
The project is initiated and the project activities are undertaken according to the schedule. In the process, resources are utilized (for example, personnel effort, funding) and deliverables are produced (for example, architectural design documents, test cases).

3.2. Supplier Contract Management
[Som05: c4]
Prepare and execute agreements with suppliers, monitor supplier performance, and accept supplier products, incorporating them as appropriate.

3.3. Implementation of Measurement Process
[Fen98: c13,c14; Pre04: c22; Rei02: c10,c12; Tha97: c3,c10]
The measurement process is enacted alongside the software project, ensuring that relevant and useful data are collected (see also topics 6.2 Plan the Measurement Process and 6.3 Perform the Measurement Process).

3.4. Monitor Process
[Dor02: v1c8,v2c2-c5,c7; Rei02: c10; Som05: c25; Tha97: c3,c9]
Adherence to the various plans is assessed continually and at predetermined intervals. Outputs and completion conditions for each task are analyzed. Deliverables are evaluated in terms of their required characteristics (for example, via reviews and audits). Effort expenditure, schedule adherence, and costs to date are investigated, and resource usage is examined. The project risk profile is revisited, and adherence to quality requirements is evaluated.
Measurement data are modeled and analyzed. Variance analysis based on the deviation of actual from expected outcomes and values is undertaken. This may be in the form of cost overruns, schedule slippage, and the like. Outlier identification and analysis of quality and other measurement data are performed (for example, defect density analysis). Risk exposure and leverage are recalculated, and decision trees, simulations, and so on are rerun in the light of new data. These activities enable problem detection and exception identification based on exceeded thresholds. Outcomes are reported as needed and certainly where acceptable thresholds are surpassed.
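The variance analysis, risk exposure recalculation, and threshold-based exception identification named in topic 3.4 can be pictured with a minimal sketch. It is illustrative only: the plan figures, risks, and tolerance are hypothetical, and the exposure formula used (probability times loss) is the commonly applied one rather than a definition taken from the Guide.

    # Minimal sketch of the monitoring calculations named in topic 3.4: variance of
    # actuals against plan, recalculated risk exposure, and threshold-based flags.
    # All figures, risks, and tolerances are hypothetical.

    plan   = {"effort_person_days": 120, "cost_keur": 300}
    actual = {"effort_person_days": 150, "cost_keur": 320}
    TOLERANCE = 0.15  # maximum acceptable relative deviation agreed with stakeholders

    for item, planned in plan.items():
        deviation = (actual[item] - planned) / planned
        if abs(deviation) > TOLERANCE:
            print(f"exception: {item} deviates {deviation:+.0%} from plan")

    risks = [
        # (risk, current probability estimate, loss in person-days if it occurs)
        ("key developer leaves", 0.20, 80),
        ("requirements churn exceeds 30%", 0.45, 50),
    ]
    for name, probability, loss in risks:
        exposure = probability * loss  # expected loss, used to re-rank the risk profile
        print(f"risk exposure of '{name}': {exposure:.1f} person-days")

Flagged exceptions and re-ranked risks are exactly the inputs that the control process (topic 3.5, below) acts upon.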
3.5. Control Process
[Dor02: v2c7; Rei02: c10; Tha97: c3,c9]
The outcomes of the process monitoring activities provide the basis on which action decisions are taken. Where appropriate, and where the impact and associated risks are modeled and managed, changes can be made to the project. This may take the form of corrective action (for example, retesting certain components), it may involve the incorporation of contingencies so that similar occurrences are avoided (for example, the decision to use prototyping to assist in software requirements validation), and/or it may entail the revision of the various plans and other project documents (for example, requirements specification) to accommodate the unexpected outcomes and their implications.
In some instances, it may lead to abandonment of the project. In all cases, change control and software configuration management procedures are adhered to (see also the Software Configuration Management KA), decisions are documented and communicated to all relevant parties, plans are revisited and revised where necessary, and relevant data is recorded in the central database (see also topic 6.3 Perform the Measurement Process).

3.6. Reporting
[Rei02: c10; Tha97: c3,c10]
At specified and agreed periods, adherence to the plans is reported, both within the organization (for example to the project portfolio steering committee) and to external stakeholders (for example, clients, users). Reports of this nature should focus on overall adherence as opposed to the detailed reporting required frequently within the project team.

4. Review and Evaluation
At critical points in the project, overall progress towards achievement of the stated objectives and satisfaction of stakeholder requirements are evaluated. Similarly, assessments of the effectiveness of the overall process to date, the personnel involved, and the tools and methods employed are also undertaken at particular milestones.

4.1. Determining Satisfaction of Requirements
[Rei02: c10; Tha97: c3,c10]
Since attaining stakeholder (user and customer) satisfaction is one of our principal aims, it is important that progress towards this aim be formally and periodically assessed. This occurs on achievement of major project milestones (for example, confirmation of software design architecture, software integration technical review). Variances from expectations are identified and appropriate action is taken. As in the control process activity above (see topic 3.5 Control Process), in all cases change control and software configuration management procedures are adhered to (see the Software Configuration Management KA), decisions are documented and communicated to all relevant parties, plans are revisited and revised where necessary, and relevant data are recorded in the central database (see also topic 6.3 Perform the Measurement Process). More information can also be found in the Software Testing KA, in topic 2.2 Objectives of Testing, and in the Software Quality KA, in topic 2.3 Reviews and Audits.

4.2. Reviewing and Evaluating Performance
[Dor02: v1c8,v2c3,c5; Pfl01: c8,c9; Rei02: c10; Tha97: c3,c10]
Periodic performance reviews for project personnel provide insights as to the likelihood of adherence to plans as well as possible areas of difficulty (for example, team member conflicts). The various methods, tools, and techniques employed are evaluated for their effectiveness and appropriateness, and the process itself is systematically and periodically assessed for its relevance, utility, and efficacy in the project context. Where appropriate, changes are made and managed.

5. Closure
The project reaches closure when all the plans and embodied processes have been enacted and completed. At this stage, the criteria for project success are revisited. Once closure is established, archival, post mortem, and process improvement activities are performed.

5.1. Determining Closure
[Dor02: v1c8,v2c3,c5; Rei02: c10; Tha97: c3,c10]
The tasks as specified in the plans are complete, and satisfactory achievement of completion criteria is confirmed. All planned products have been delivered with acceptable characteristics. Requirements are checked off and confirmed as satisfied, and the objectives of the project have been achieved. These processes generally involve all stakeholders and result in the documentation of client acceptance and any remaining known problem reports.

5.2. Closure Activities
[Pfl01: c12; Som05: c4]
After closure has been confirmed, archival of project materials takes place in line with stakeholder-agreed methods, location, and duration. The organization’s measurement database is updated with final project data and post-project analyses are undertaken. A project post mortem is undertaken so that issues, problems, and opportunities encountered during the process (particularly via review and evaluation, see subarea 4 Review and evaluation) are analyzed, and lessons are drawn from the process and fed into organizational learning and improvement endeavors (see also the Software Engineering Process KA).

6. Software Engineering Measurement
[ISO15939-02]
The importance of measurement and its role in better management practices is widely acknowledged, and so its importance can only increase in the coming years. Effective measurement has become one of the cornerstones of organizational maturity.
Key terms on software measures and measurement methods have been defined in [ISO15939-02] on the basis of the ISO international vocabulary of metrology [ISO93]. Nevertheless, readers will encounter terminology differences in the literature; for example, the term “metrics” is sometimes used in place of “measures.”
This topic follows the international standard ISO/IEC 15939, which describes a process which defines the activities and tasks necessary to implement a software measurement process and includes, as well, a measurement information model.

6.1. Establish and Sustain Measurement Commitment
* Accept requirements for measurement. Each measurement endeavor should be guided by organizational objectives and driven by a set of measurement requirements established by the organization and the project. For example, an organizational objective might be “first-to-market with new products.” [Fen98: c3,c13; Pre04: c22] This in turn might engender a requirement that factors contributing to this objective be measured so that projects might be managed to meet this objective.
  - Define scope of measurement. The organizational unit to which each measurement requirement is to be applied must be established. This may consist of a functional area, a single project, a single site, or even the whole enterprise. All subsequent measurement tasks related to this requirement should be within the defined scope. In addition, the stakeholders should be identified.
  - Commitment of management and staff to measurement. The commitment must be formally established, communicated, and supported by resources (see next item).
* Commit resources for measurement. The organization’s commitment to measurement is an essential factor for success, as evidenced by assignment of resources for implementing the measurement process. Assigning resources includes allocation of responsibility for the various tasks of the measurement process (such as user, analyst, and librarian) and providing adequate funding, training, tools, and support to conduct the process in an enduring fashion.

6.2. Plan the Measurement Process
* Characterize the organizational unit. The organizational unit provides the context for measurement, so it is important to make this context explicit and to articulate the assumptions that it embodies and the constraints that it imposes. Characterization can be in terms of organizational processes, application domains, technology, and organizational interfaces. An organizational process model is also typically an element of the organizational unit characterization [ISO15939-02: 5.2.1].
* Identify information needs. Information needs are based on the goals, constraints, risks, and problems of the organizational unit. They may be derived from business, organizational, regulatory, and/or product objectives. They must be identified and prioritized. Then, a subset to be addressed must be selected and the results documented, communicated, and reviewed by stakeholders [ISO15939-02: 5.2.2].
* Select measures. Candidate measures must be selected, with clear links to the information needs. Measures must then be selected based on the priorities of the information needs and other criteria such as cost of collection, degree of process disruption during collection, ease of analysis, ease of obtaining accurate, consistent data, and so on [ISO15939-02: 5.2.3 and Appendix C].
* Define data collection, analysis, and reporting procedures. This encompasses collection procedures and schedules, storage, verification, analysis, reporting, and configuration management of data [ISO15939-02: 5.2.4].
* Define criteria for evaluating the information products. Criteria for evaluation are influenced by the technical and business objectives of the organizational unit. Information products include those associated with the product being produced, as well as those associated with the processes being used to manage and measure the project [ISO15939-02: 5.2.5 and Appendices D, E].
* Review, approve, and provide resources for measurement tasks.
  - The measurement plan must be reviewed and approved by the appropriate stakeholders. This includes all data collection procedures, storage, analysis, and reporting procedures; evaluation criteria; schedules; and responsibilities. Criteria for reviewing these artifacts should have been established at the organizational unit level or higher and should be used as the basis for these reviews. Such criteria should take into consideration previous experience, availability of resources, and potential disruptions to projects when changes from current practices are proposed. Approval demonstrates commitment to the measurement process [ISO15939-02: 5.2.6.1 and Appendix F].
  - Resources should be made available for implementing the planned and approved measurement tasks. Resource availability may be staged in cases where changes are to be piloted before widespread deployment. Consideration should be paid to the resources necessary for successful deployment of new procedures or measures [ISO15939-02: 5.2.6.2].
* Acquire and deploy supporting technologies. This includes evaluation of available supporting technologies, selection of the most appropriate technologies, acquisition of those technologies, and deployment of those technologies [ISO15939-02: 5.2.7].

6.3. Perform the Measurement Process
* Integrate measurement procedures with relevant processes. The measurement procedures, such as data collection, must be integrated into the processes they are measuring. This may involve changing current processes to accommodate data collection or generation activities. It may also involve analysis of current processes to minimize additional effort and evaluation of the effect on employees to ensure that the measurement procedures will be accepted. Morale issues and other human factors need to be considered. In addition, the measurement procedures must be communicated to those providing the data, training may need to be provided, and support must typically be provided. Data analysis and reporting procedures must typically be integrated into organizational and/or project processes in a similar manner [ISO15939-02: 5.3.1].
* Collect data. The data must be collected, verified, and stored [ISO15939-02: 5.3.2].
* Analyze data and develop information products. Data may be aggregated, transformed, or recoded as part of the analysis process, using a degree of rigor appropriate to the nature of the data and the information needs. The results of this analysis are typically indicators such as graphs, numbers, or other indications that must be interpreted, resulting in initial conclusions to be presented to stakeholders. The results and conclusions must be reviewed, using a process defined by the organization (which may be formal or informal). Data providers and measurement users should participate in reviewing the data to ensure that they are meaningful and accurate, and that they can result in reasonable actions [ISO15939-02: 5.3.3 and Appendix G].
* Communicate results. Information products must be documented and communicated to users and stakeholders [ISO15939-02: 5.3.4].

6.4. Evaluate Measurement
* Evaluate information products. Evaluate information products against specified evaluation criteria and determine strengths and weaknesses of the information products. This may be performed by an internal process or an external audit and should include feedback from measurement users. Record lessons learned in an appropriate database [ISO15939-02: 5.4.1 and Appendix D].
* Evaluate the measurement process. Evaluate the measurement process against specified evaluation criteria and determine the strengths and weaknesses of the process. This may be performed by an internal process or an external audit and should include feedback from measurement users. Record lessons learned in an appropriate database [ISO15939-02: 5.4.1 and Appendix D].
* Identify potential improvements. Such improvements may be changes in the format of indicators, changes in units measured, or reclassification of categories. Determine the costs and benefits of potential improvements and select appropriate improvement actions. Communicate proposed improvements to the measurement process owner and stakeholders for review and approval. Also communicate lack of potential improvements if the analysis fails to identify improvements [ISO15939-02: 5.4.2].
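To make the "analyze data and develop information products" step of topic 6.3 concrete, the following minimal sketch derives a defect-density indicator from collected base measures and checks it against an evaluation threshold of the kind agreed during planning (topic 6.2). It is illustrative only: the module names, sizes, defect counts, and threshold are hypothetical, and defects per KLOC is simply a commonly used indicator, not one prescribed by the Guide.

    # Minimal sketch: deriving a defect-density indicator (topic 6.3) and flagging
    # modules against a planning-stage threshold (topic 6.2). All data hypothetical.

    collected = [
        # (module, size in KLOC, defects found in inspection and testing)
        ("billing",   12.0, 30),
        ("reporting",  8.5, 34),
        ("interface",  5.0,  4),
    ]

    THRESHOLD = 3.0  # defects per KLOC considered acceptable in this sketch

    def defect_density(defects, kloc):
        return defects / kloc

    indicators = {module: defect_density(defects, kloc) for module, kloc, defects in collected}
    for module, density in sorted(indicators.items(), key=lambda item: -item[1]):
        flag = "EXCEEDS THRESHOLD" if density > THRESHOLD else "ok"
        print(f"{module:10s} {density:5.2f} defects/KLOC  {flag}")

    # Flagged modules would be reported to stakeholders, fed into the control
    # process (topic 3.5), and used to evaluate the measures themselves (topic 6.4).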
MATRIX OF TOPICS VS. REFERENCE MATERIAL

Reference columns: [Dor02] [ISO15939-02] [Fen98] [Pfl01] [Pre04] [Rei02] [Som05] [Tha97]

1. Initiation and scope definition
1.1 Determination and negotiation of requirements: v2c4 c4 c7 c5
1.2 Feasibility analysis: c6 c6
1.3 Process for the review and revision of requirements: c6
2. Software Project Planning
2.1 Process planning: v1c6,v2c7,v2c8 c2,c3 c2,c21 c1,c3,c5 c3,c4 c3,c4,c6
2.2 Determine deliverables: c3 c24 c4
2.3 Effort, schedule and cost estimation: v2c7 c12 c3 c23,c24 c5,c6 c4,c23 c5
2.4 Resource allocation: c3 c24 c8,c9 c4 c6,c7
2.5 Risk management: v2c7 c3 c25 c11 c4 c4
2.6 Quality management: v1c8,v2c3-c5 c26 c10 c24,c25 c9,c10
2.7 Plan management: c4 c4
3. Software Project Enactment
3.1 Implementation of plans: c3 c4
3.2 Supplier contract management: c4
3.3 Implementation of measurement process: c13,c14 c22 c10,c12 c3,c10
3.4 Monitor process: v1c8,v2c2-c5,c7 c10 c25 c3,c9
3.5 Control process: v2c7 c10 c3,c9
3.6 Reporting: c10 c3,c10
4. Review and evaluation
4.1 Determining satisfaction of requirements: c10 c3,c10
4.2 Reviewing and evaluating performance: v1c8,v2c3,c5 c8,c9 c10 c3,c10
5. Closure
5.1 Determining closure: v1c8,v2c3,c5 c10 c3,c10
5.2 Closure activities: c12 c4
6. Software Engineering Measurement: *
6.1 Establish and sustain measurement commitment: c3,c13 c22
6.2 Plan the measurement process: c5,C,D,E,F
6.3 Perform the measurement process: c5,G
6.4 Evaluate measurement: c5,D
RECOMMENDED REFERENCES FOR SOFTWARE ENGINEERING MANAGEMENT

[Dor02] M. Dorfman and R.H. Thayer, eds., Software Engineering, IEEE Computer Society Press, 2002, Vol. 1, Chap. 6, 8, Vol. 2, Chap. 3, 4, 5, 7, 8.
[Fen98] N.E. Fenton and S.L. Pfleeger, Software Metrics: A Rigorous & Practical Approach, second ed., International Thomson Computer Press, 1998, Chap. 1-14.
[ISO15939-02] ISO/IEC 15939:2002, Software Engineering — Software Measurement Process, ISO and IEC, 2002.
[Pfl01] S.L. Pfleeger, Software Engineering: Theory and Practice, second ed., Prentice Hall, 2001, Chap. 2-4, 8, 9, 12, 13.
[Pre04] R.S. Pressman, Software Engineering: A Practitioner’s Approach, sixth ed., McGraw-Hill, 2004, Chap. 2, 6, 7, 22-26.
[Rei02] D.J. Reifer, ed., Software Management, IEEE Computer Society Press, 2002, Chap. 1-6, 7-12, 13.
[Som05] I. Sommerville, Software Engineering, seventh ed., Addison-Wesley, 2005, Chap. 3-6, 23-25.
[Tha97] R.H. Thayer, ed., Software Engineering Project Management, IEEE Computer Society Press, 1997, Chap. 1-10.
APPENDIX B. LIST OF STANDARDS

(IEEE610.12-90) IEEE Std 610.12-1990 (R2002), IEEE Standard Glossary of Software Engineering Terminology, IEEE, 1990.
(IEEE12207.0-96) IEEE/EIA 12207.0-1996//ISO/IEC 12207:1995, Industry Implementation of Int. Std. ISO/IEC 12207:95, Standard for Information Technology - Software Life Cycle Processes, IEEE, 1996.
(ISO15939-02) ISO/IEC 15939:2002, Software Engineering - Software Measurement Process, ISO and IEC, 2002.
(PMI00) Project Management Institute Standards Committee, A Guide to the Project Management Body of Knowledge (PMBOK), Project Management Institute, 2000.
CHAPTER 9
SOFTWARE ENGINEERING PROCESS

ACRONYMS
CMMI   Capability Maturity Model Integration
EF     Experience Factory
FP     Function Point
HRM    Human Resources Management
IDEAL  Initiating-Diagnosing-Establishing-Acting-Learning (model)
OMG    Object Management Group
QIP    Quality Improvement Paradigm
SCAMPI CMM Based Appraisal for Process Improvement using the CMMI
SCE    Software Capability Evaluation
SEPG   Software Engineering Process Group

INTRODUCTION
The Software Engineering Process KA can be examined on two levels. The first level encompasses the technical and managerial activities within the software life cycle processes that are performed during software acquisition, development, maintenance, and retirement. The second is the meta-level, which is concerned with the definition, implementation, assessment, measurement, management, change, and improvement of the software life cycle processes themselves. The first level is covered by the other KAs in the Guide. This KA is concerned with the second.
The term “software engineering process” can be interpreted in different ways, and this may cause confusion.
* One meaning, where the word the is used, as in the software engineering process, could imply that there is only one right way of performing software engineering tasks. This meaning is avoided in the Guide, because no such process exists. Standards such as IEEE 12207 speak of software engineering processes, meaning that there are many processes involved, such as Development Process or Configuration Management Process.
* A second meaning refers to the general discussion of processes related to software engineering. This is the meaning intended in the title of this KA, and the one most often intended in the KA description.
* Finally, a third meaning could signify the actual set of activities performed within an organization, which could be viewed as one process, especially from within the organization. This meaning is used in the KA in a very few instances.
This KA applies to any part of the management of software life cycle processes where procedural or technological change is being introduced for process or product improvement.
Software engineering process is relevant not only to large organizations. On the contrary, process-related activities can, and have been, performed successfully by small organizations, teams, and individuals.
The objective of managing software life cycle processes is to implement new or better processes in actual practices, be they individual, project, or organizational.
This KA does not explicitly address human resources management (HRM), for example, as embodied in the People CMM (Cur02), and systems engineering processes [ISO15288-02; IEEE 1220-98].
It should also be recognized that many software engineering process issues are closely related to other disciplines, such as management, albeit sometimes using a different terminology.

BREAKDOWN OF TOPICS FOR SOFTWARE ENGINEERING PROCESS
Figure 1 shows the breakdown of topics in this KA.

1. Process Implementation and Change
This subarea focuses on organizational change. It describes the infrastructure, activities, models, and practical considerations for process implementation and change.
Described here is the situation in which processes are deployed for the first time (for example, introducing an inspection process within a project or a method covering the complete life cycle), and where current processes are changed (for example, introducing a tool, or optimizing a procedure). This can also be termed process evolution. In both instances, existing practices have to be modified. If the modifications are extensive, then changes in the organizational culture may also be necessary.
[Figure 1. Breakdown of topics for the Software Engineering Process KA — the diagram shows four subareas: Process Implementation and Change, Process Definition, Process Assessment, and Process and Product Measurement, each decomposed into the topics described below.]

1.1. Process Infrastructure
[IEEE12207.0-96; ISO15504-98; SEL96]
This topic includes the knowledge related to the software engineering process infrastructure.
To establish software life cycle processes, it is necessary to have an appropriate infrastructure in place, meaning that the resources must be available (competent staff, tools, and funding) and the responsibilities assigned. When these tasks have been completed, it is an indication of management’s commitment to, and ownership of, the software engineering process effort. Various committees may have to be established, such as a steering committee to oversee the software engineering process effort.
A description of an infrastructure for process improvement in general is provided in [McF96]. Two main types of infrastructure are used in practice: the Software Engineering Process Group and the Experience Factory.

1.1.1. Software Engineering Process Group (SEPG)
The SEPG is intended to be the central focus of software engineering process improvement, and it has a number of responsibilities in terms of initiating and sustaining it. These are described in [Fow90].
1.1.2. Experience Factory (EF)
The concept of the EF separates the project organization (the software development organization, for example) from the improvement organization. The project organization focuses on the development and maintenance of software, while the EF is concerned with software engineering process improvement.
The EF is intended to institutionalize the collective learning of an organization by developing, updating, and delivering to the project organization experience packages (for example, guides, models, and training courses), also referred to as process assets. The project organization offers the EF their products, the plans used in their development, and the data gathered during development and operation. Examples of experience packages are presented in [Bas92].

1.2. Software Process Management Cycle
[Bas92; Fow90; IEEE12207.0-96; ISO15504-98; McF96; SEL96]
The management of software processes consists of four activities sequenced in an iterative cycle allowing for continuous feedback and improvement of the software process:
* The Establish Process Infrastructure activity consists of establishing commitment to process implementation and change (including obtaining management buy-in) and putting in place an appropriate infrastructure (resources and responsibilities) to make it happen.
* The goal of the Planning activity is to understand the current business objectives and process needs of the individual, project, or organization, to identify its strengths and weaknesses, and to make a plan for process implementation and change.
* The goal of Process Implementation and Change is to execute the plan, deploy new processes (which may involve, for example, the deployment of tools and training of staff), and/or change existing processes.
* Process Evaluation is concerned with finding out how well the implementation and change went, whether or not the expected benefits materialized. The results are then used as input for subsequent cycles.

1.3. Models For Process Implementation And Change
Two general models that have emerged for driving process implementation and change are the Quality Improvement Paradigm (QIP) [SEL96] and the IDEAL model [McF96]. The two paradigms are compared in [SEL96]. Evaluation of process implementation and change outcomes can be qualitative or quantitative.

1.4. Practical Considerations
Process implementation and change constitute an instance of organizational change. Most successful organizational change efforts treat the change as a project in its own right, with appropriate plans, monitoring, and review.
Guidelines about process implementation and change within software engineering organizations, including action planning, training, management sponsorship, commitment, and the selection of pilot projects, and which cover both processes and tools, are given in [Moi98; San98; Sti99]. Empirical studies on success factors for process change are reported in (ElE99a).
The role of change agents in this activity is discussed in (Hut94). Process implementation and change can also be seen as an instance of consulting (either internal or external).
One can also view organizational change from the perspective of technology transfer (Rog83). Software engineering articles which discuss technology transfer and the characteristics of recipients of new technology (which could include process-related technologies) are (Pfl99; Rag89).
There are two ways of approaching the evaluation of process implementation and change, either in terms of changes to the process itself or in terms of changes to the process outcomes (for example, measuring the return on investment from making the change). A pragmatic look at what can be achieved from such evaluation studies is given in (Her98).
Overviews of how to evaluate process implementation and change, and examples of studies that do so, can be found in [Gol99], (Kit98; Kra99; McG94).

2. Process Definition
A process definition can be a procedure, a policy, or a standard. Software life cycle processes are defined for a number of reasons, including increasing the quality of the product, facilitating human understanding and communication, supporting process improvement, supporting process management, providing automated process guidance, and providing automated execution support. The types of process definitions required will depend, at least partially, on the reason for the definition.
It should also be noted that the context of the project and organization will determine the type of process definition that is most useful. Important variables to consider include the nature of the work (for example, maintenance or development), the application domain, the life cycle model, and the maturity of the organization.

2.1. Software Life Cycle Models
[Pfl01: c2; IEEE12207.0-96]
Software life cycle models serve as a high-level definition of the phases that occur during development. They are not aimed at providing detailed definitions but at highlighting the key activities and their interdependencies.
2.1. Software Life Cycle Models
[Pfl01:c2; IEEE12207.0-96]
Software life cycle models serve as a high-level definition of the phases that occur during development. They are not aimed at providing detailed definitions but at highlighting the key activities and their interdependencies. Examples of software life cycle models are the waterfall model, the throwaway prototyping model, evolutionary development, incremental/iterative delivery, the spiral model, the reusable software model, and automated software synthesis. Comparisons of these models are provided in [Com97], (Dav88), and a method for selecting among many of them in (Ale91).

2.2. Software Life Cycle Processes
Definitions of software life cycle processes tend to be more detailed than software life cycle models. However, software life cycle processes do not attempt to order their processes in time. This means that, in principle, the software life cycle processes can be arranged to fit any of the software life cycle models. The main reference in this area is IEEE/EIA 12207.0: Information Technology - Software Life Cycle Processes [IEEE12207.0-96].
The IEEE 1074:1997 standard on developing life cycle processes also provides a list of processes and activities for software development and maintenance [IEEE1074-97], as well as a list of life cycle activities which can be mapped into processes and organized in the same way as any of the software life cycle models. In addition, it identifies and links other IEEE software standards to these activities. In principle, IEEE Std 1074 can be used to build processes conforming to any of the life cycle models. Standards which focus on maintenance processes are IEEE Std 1219-1998 and ISO 14764:1998 [IEEE1219-98].
Other important standards providing process definitions include:
- IEEE Std 1540: Software Risk Management (IEEE1540-01)
- IEEE Std 1517: Software Reuse Processes (IEEE1517-99)
- ISO/IEC 15939: Software Measurement Process [ISO15939-02]; see also the Software Engineering Management KA for a detailed description of this process.
In some situations, software engineering processes must be defined taking into account the organizational processes for quality management. ISO 9001 [ISO9001-00] provides requirements for quality management processes, and ISO/IEC 90003 interprets those requirements for organizations developing software (ISO90003-04).
Some software life cycle processes emphasize rapid delivery and strong user participation, namely agile methods such as Extreme Programming [Bec99]. A form of the selection problem concerns the choice along the agile versus plan-driven method axis. A risk-based approach to making that decision is described in (Boe03a).

2.3. Notations for Process Definitions
Processes can be defined at different levels of abstraction (for example, generic definitions vs. adapted definitions, descriptive vs. prescriptive vs. proscriptive) [Pfl01]. Various elements of a process can be defined, for example, activities, products (artifacts), and resources. Detailed frameworks which structure the types of information required to define processes are described in (Mad94).
There are a number of notations being used to define processes (SPC92). A key difference between them is in the type of information the frameworks mentioned above define, capture, and use. The software engineer should be aware of the following approaches: data flow diagrams, definitions in terms of process purpose and outcomes [ISO15504-98], lists of processes decomposed into constituent activities and tasks defined in natural language [IEEE12207.0-96], Statecharts (Har98), ETVX (Rad85), Actor-Dependency modeling (Yu94), SADT notation (Mcg93), Petri nets (Ban95), IDEF0 (IEEE 1320.1-98), and rule-based notations (Bar95). More recently, a process modeling standard has been published by the OMG which is intended to harmonize modeling notations. This is termed the SPEM (Software Process Engineering Meta-Model) specification [OMG02].

2.4. Process Adaptation
[IEEE12207.0-96; ISO15504-98; Joh99]
It is important to note that predefined processes (even standardized ones) must be adapted to local needs, for example, organizational context, project size, regulatory requirements, industry practices, and corporate cultures. Some standards, such as IEEE/EIA 12207, contain mechanisms and recommendations for accomplishing the adaptation.

2.5. Automation
Automated tools either support the execution of the process definitions or they provide guidance to humans performing the defined processes. In cases where process analysis is performed, some tools allow different types of simulations (for example, discrete event simulation). In addition, there are tools which support each of the above process definition notations. Moreover, these tools can execute the process definitions to provide automated support to the actual processes, or to fully automate them in some instances. An overview of process-modeling tools can be found in [Fin94] and of process-centered environments in (Gar96). Work on applying the Internet to the provision of real-time process guidance is described in (Kel98).
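None of the notations above needs elaborate tooling to be useful as a starting point; even a plain data structure can capture the essentials that automation then builds on. The sketch below is a minimal, hypothetical illustration in the spirit of an ETVX-style definition (it does not reproduce SPEM, IDEF0, or any other cited notation): a single activity is encoded with entry criteria, tasks, verification steps, and exit criteria so that a small script can check whether the activity may start. All activity and criterion names are invented.

    from dataclasses import dataclass, field

    @dataclass
    class Activity:
        """A single process activity in an ETVX-like form (illustrative only)."""
        name: str
        entry: list = field(default_factory=list)         # criteria that must hold to start
        tasks: list = field(default_factory=list)         # work to be performed
        verification: list = field(default_factory=list)  # checks applied to the work
        exit: list = field(default_factory=list)          # criteria that must hold to finish

    def unmet(criteria, project_state):
        """Return the criteria not yet satisfied in the given project state."""
        return [c for c in criteria if not project_state.get(c, False)]

    # A hypothetical "code inspection" activity definition.
    inspection = Activity(
        name="Code inspection",
        entry=["code compiles", "checklist available"],
        tasks=["individual preparation", "inspection meeting", "log defects"],
        verification=["defect log reviewed"],
        exit=["all major defects resolved"],
    )

    state = {"code compiles": True, "checklist available": False}
    print("Cannot start; missing:", unmet(inspection.entry, state))

A process-centered environment would keep such definitions in a repository and use them to guide or enact the process, as the tools cited above do at much larger scale.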
3. Process Assessment
Process assessment is carried out using both an assessment model and an assessment method. In some instances, the term "appraisal" is used instead of assessment, and the term "capability evaluation" is used when the appraisal is for the purpose of awarding a contract.

3.1. Process Assessment Models
An assessment model captures what is recognized as good practices. These practices may pertain to technical software engineering activities only, or may also refer to, for example, management, systems engineering, and human resources management activities as well.
ISO/IEC 15504 [ISO15504-98] defines an exemplar assessment model and conformance requirements on other assessment models. Specific assessment models available and in use are SW-CMM (SEI95), CMMI [SEI01], and Bootstrap [Sti99]. Many other capability and maturity models have been defined, for example, for design, documentation, and formal methods, to name a few. ISO 9001 is another common assessment model which has been applied by software organizations (ISO9001-00).
A maturity model for systems engineering has also been developed, which would be useful where a project or organization is involved in the development and maintenance of systems, including software (EIA/IS731-99).
The applicability of assessment models to small organizations is addressed in [Joh99; San98].
There are two general architectures for an assessment model that make different assumptions about the order in which processes must be assessed: continuous and staged (Pau94). They are very different, and should be evaluated by the organization considering them to determine which would be the most pertinent to its needs and objectives.

3.2. Process Assessment Methods
[Gol99]
In order to perform an assessment, a specific assessment method needs to be followed to produce a quantitative score which characterizes the capability of the process (or the maturity of the organization).
The CBA-IPI assessment method, for example, focuses on process improvement (Dun96), and the SCE method focuses on evaluating the capability of suppliers (Bar95). Both of these were developed for the SW-CMM. Requirements on both types of methods which reflect what are believed to be good assessment practices are provided in [ISO15504-98], (Mas95). The SCAMPI methods are geared toward CMMI assessments [SEI01]. The activities performed during an assessment, the distribution of effort on these activities, as well as the atmosphere during an assessment are different when they are for improvement than when they are for a contract award.
There have been criticisms of process assessment models and methods, for example (Fay97; Gra98). Most of these criticisms have been concerned with the empirical evidence supporting the use of assessment models and methods. However, since the publication of these articles, there has been some systematic evidence supporting the efficacy of process assessments (Cla97; Ele00; Ele00a; Kri99).
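Whatever the method, the common thread is that judgments about individual practices are rolled up into a score per process. The sketch below is a deliberately simplified, invented scoring scheme; it does not reproduce the CBA-IPI, SCAMPI, or ISO/IEC 15504 rating rules, and the practice names and thresholds are made up purely to show the shape of such a roll-up.

    # Invented ratings on a 0-3 scale (0 = not performed ... 3 = fully performed).
    practice_ratings = {
        "Project Planning": {"estimates documented": 3, "schedule tracked": 2, "risks reviewed": 1},
        "Configuration Management": {"baselines identified": 3, "changes controlled": 3, "audits held": 2},
    }

    def process_score(ratings):
        """Average the practice ratings for one process (simplistic roll-up)."""
        return sum(ratings.values()) / len(ratings)

    def capability_label(score):
        """Map an average score onto an invented, coarse label."""
        if score >= 2.5:
            return "largely achieved"
        if score >= 1.5:
            return "partially achieved"
        return "not achieved"

    for process, ratings in practice_ratings.items():
        s = process_score(ratings)
        print(f"{process}: {s:.2f} ({capability_label(s)})")

Real assessment methods add evidence collection, corroboration rules, and assessor judgment on top of any such arithmetic, which is why the cited method descriptions matter more than the formula.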
4. Process and Product Measurement
While the application of measurement to software engineering can be complex, particularly in terms of modeling and analysis methods, there are several aspects of software engineering measurement which are fundamental and which underlie many of the more advanced measurement and analysis processes. Furthermore, the achievement of process and product improvement efforts can only be assessed if a set of baseline measures has been established.
Measurement can be performed to support the initiation of process implementation and change, to evaluate the consequences of process implementation and change, or it can be performed on the product itself.
Key terms on software measures and measurement methods have been defined in ISO/IEC 15939 on the basis of the ISO international vocabulary of metrology [VIM93]. ISO/IEC 15939 also provides a standard process for measuring both process and product characteristics. Nevertheless, readers will encounter terminological differences in the literature; for example, the term "metric" is sometimes used in place of "measure."

4.1. Process Measurement
[ISO15939-02]
The term "process measurement" as used here means that quantitative information about the process is collected, analyzed, and interpreted. Measurement is used to identify the strengths and weaknesses of processes and to evaluate processes after they have been implemented and/or changed.
Process measurement may serve other purposes as well. For example, process measurement is useful for managing a software engineering project. Here, the focus is on process measurement for the purpose of process implementation and change.
The path diagram in Figure 2 illustrates an important assumption made in most software engineering projects, which is that the process usually has an impact on project outcomes. The context affects the relationship between the process and the process outcomes; that is, the process-to-outcome relationship depends on the context.
Not every process will have a positive impact on all outcomes. For example, the introduction of software inspections may reduce testing effort and cost, but may increase elapsed time if each inspection introduces long delays due to the scheduling of large inspection meetings (Vot93). Therefore, it is preferable to use multiple process outcome measures which are important to the organization's business.
While some effort can be made to assess the utilization of tools and hardware, the primary resource that needs to be managed in software engineering is personnel. As a result, the main measures of interest are those related to the productivity of teams or processes (for example, using a measure of function points produced per unit of person-effort) and their associated levels of experience in software engineering in general, and perhaps in particular technologies. [Fen98: c3, c11; Som05: c25]
Process outcomes could, for example, be product quality (faults per KLOC (kilo lines of code) or per function point (FP)), maintainability (the effort to make a certain type of change), productivity (LOC (lines of code) or function points per person-month), time-to-market, or customer satisfaction (as measured through a customer survey). This relationship depends on the particular context (for example, the size of the organization or the size of the project).
In general, we are most concerned about process outcomes. However, in order to achieve the process outcomes that we desire (for example, better quality, better maintainability, greater customer satisfaction), we have to implement the appropriate process. Of course, it is not only the process that has an impact on outcomes. Other factors, such as the capability of the staff and the tools that are used, play an important role. When evaluating the impact of a process change, for example, it is important to factor out these other influences. Furthermore, the extent to which the process is institutionalized (that is, process fidelity) is important, as it may explain why "good" processes do not always give the desired outcomes in a given situation.

Figure 2 Path diagram showing the relationship between process and outcomes (results): the process influences process outcomes, and context affects both the process and that relationship.
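As a concrete (and entirely invented) numerical illustration of the baseline outcome measures just mentioned, the sketch below computes defect density in faults per KLOC and productivity in function points per person-month for a handful of hypothetical projects; the figures carry no empirical meaning.

    # Hypothetical project data: size, delivered function points, effort, and field faults.
    projects = [
        {"name": "A", "kloc": 42.0, "function_points": 310, "person_months": 28.0, "faults": 57},
        {"name": "B", "kloc": 15.5, "function_points": 120, "person_months": 11.0, "faults": 9},
        {"name": "C", "kloc": 88.0, "function_points": 640, "person_months": 70.0, "faults": 160},
    ]

    for p in projects:
        defect_density = p["faults"] / p["kloc"]                  # faults per KLOC
        productivity = p["function_points"] / p["person_months"]  # FP per person-month
        print(f"Project {p['name']}: "
              f"{defect_density:.1f} faults/KLOC, {productivity:.1f} FP/person-month")

Keeping such baselines per project is what later allows a process change to be judged against something other than impressions.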
4.2. Software Products Measurement
[ISO9126-01]
Software product measurement includes, notably, the measurement of product size, product structure, and product quality.

4.2.1. Size measurement
Software product size is most often assessed by measures of length (for example, lines of source code in a module, pages in a software requirements specification document) or of functionality (for example, function points in a specification). The principles of functional size measurement are provided in IEEE Std 14143.1. International standards for functional size measurement methods include ISO/IEC 19761, 20926, and 20968. [IEEE14143.1-00; ISO19761-03; ISO20926-03; ISO20968-02]

4.2.2. Structure measurement
A diverse range of measures of software product structure may be applied to both high- and low-level design and code artifacts to reflect control flow (for example, the cyclomatic number, code knots), data flow (for example, measures of slicing), nesting (for example, the nesting polynomial measure, the BAND measure), control structures (for example, the vector measure, the NPATH measure), and modular structure and interaction (for example, information flow, tree-based measures, coupling and cohesion). [Fen98: c8; Pre04: c15]
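Structure measures such as the cyclomatic number are normally collected by tools rather than by hand. The sketch below is a rough approximation rather than a faithful implementation of McCabe's measure: it uses Python's standard ast module to count branching constructs in each function of a source fragment and adds one, which matches the cyclomatic number for simple, single-entry functions. The sample function is invented.

    import ast

    BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp, ast.IfExp)

    def approximate_cyclomatic(func_node):
        """Count branch points inside one function definition and add one."""
        return 1 + sum(isinstance(n, BRANCH_NODES) for n in ast.walk(func_node))

    SAMPLE = '''
    def classify(x):
        if x < 0:
            return "negative"
        for _ in range(3):
            if x == 0 or x == 1:
                return "small"
        return "other"
    '''

    tree = ast.parse(SAMPLE)
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            print(node.name, "~cyclomatic:", approximate_cyclomatic(node))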
4.2.3. Quality measurement
As a multi-dimensional attribute, quality is less straightforward to measure than size or structure. Furthermore, some of the dimensions of quality are likely to require measurement in qualitative rather than quantitative form. A more detailed discussion of software quality measurement is provided in the Software Quality KA, topic 3.4. ISO models of software product quality and of related measurements are described in ISO 9126, parts 1 to 4 [ISO9126-01]. [Fen98: c9, c10; Pre04: c15; Som05: c24]

4.3. Quality of Measurement Results
The quality of the measurement results (accuracy, reproducibility, repeatability, convertibility, random measurement errors) is essential for measurement programs to provide effective and bounded results. Key characteristics of measurement results and of the related quality of measuring instruments have been defined in the ISO International Vocabulary of Metrology [VIM93].
The theory of measurement establishes the foundation on which meaningful measurements can be made. The theory of measurement and scale types is discussed in [Kan02]. Measurement is defined in the theory as "the assignment of numbers to objects in a systematic way to represent properties of the object."
An appreciation of software measurement scales and the implications of each scale type in relation to the subsequent selection of data analysis methods is especially important [Abr96; Fen98: c2; Pfl01: c11]. Meaningful scales are related to a classification of scales, for which measurement theory provides a succession of more and more constrained ways of assigning the measures. If the numbers assigned merely provide labels to classify the objects, they are called nominal. If they are assigned in a way that ranks the objects (for example, good, better, best), they are called ordinal. If they deal with magnitudes of the property relative to a defined measurement unit, they are called interval (and the intervals are uniform between the numbers unless otherwise specified, and are therefore additive). Measurements are at the ratio level if they have an absolute zero point, so that ratios of distances to the zero point are meaningful.

4.4. Software Information Models
As data are collected and the measurement repository is populated, we become able to build models using both data and knowledge. These models exist for the purposes of analysis, classification, and prediction. Such models need to be evaluated to ensure that their levels of accuracy are sufficient and that their limitations are known and understood. The refinement of models, which takes place both during and after projects are completed, is another important activity.

4.4.1. Model building
Model building includes both calibration and evaluation of the model. The goal-driven approach to measurement informs the model-building process to the extent that models are constructed to answer relevant questions and achieve software improvement goals. This process is also influenced by the implied limitations of particular measurement scales in relation to the choice of analysis method. The models are calibrated (by using particularly relevant observations, for example, recent projects or projects using similar technology) and their effectiveness is evaluated (for example, by testing their performance on holdout samples). [Fen98: c4, c6, c13; Pfl01: c3, c11, c12; Som05: c25]

4.4.2. Model implementation
Model implementation includes both interpretation and refinement of models: the calibrated models are applied to the process, their outcomes are interpreted and evaluated in the context of the process/project, and the models are then refined where appropriate. [Fen98: c6; Pfl01: c3, c11, c12; Pre04: c22; Som05: c25]
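A minimal illustration of the calibrate-then-evaluate step described under model building follows. An invented one-parameter effort model of the form effort = a x size is fitted by least squares on a few "recent" projects and then judged on a holdout sample using the mean magnitude of relative error (MMRE); real models, data sets, and evaluation designs are of course richer than this.

    # Hypothetical (size in KLOC, effort in person-months) observations.
    calibration = [(10, 14), (22, 30), (35, 47), (50, 71)]
    holdout     = [(28, 41), (12, 15)]

    # Calibrate effort = a * size by least squares (closed form for one parameter).
    a = sum(s * e for s, e in calibration) / sum(s * s for s, _ in calibration)

    def predict(size):
        return a * size

    # Evaluate on the holdout sample with the mean magnitude of relative error.
    errors = [abs(predict(s) - e) / e for s, e in holdout]
    mmre = sum(errors) / len(errors)

    print(f"calibrated a = {a:.2f}")
    print(f"holdout MMRE = {mmre:.2%}")

The point of the holdout set is exactly the one made in the text: a model that merely fits the projects it was calibrated on says little about how it will perform on the next project.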
4.5. Process Measurement Techniques
Measurement techniques may be used to analyze software engineering processes and to identify strengths and weaknesses. This can be performed to initiate process implementation and change, or to evaluate the consequences of process implementation and change.
The quality of measurement results, such as accuracy, repeatability, and reproducibility, is an issue in the measurement of software engineering processes, since there are both instrument-based and judgmental measurements, as, for example, when assessors assign scores to a particular process. A discussion and method for achieving quality of measurement are presented in [Gol99].
Process measurement techniques have been classified into two general types: analytic and benchmarking. The two types of techniques can be used together since they are based on different types of information (Car91).

4.5.1. Analytical techniques
The analytical techniques are characterized as relying on "quantitative evidence to determine where improvements are needed and whether an improvement initiative has been successful." The analytical type is exemplified by the Quality Improvement Paradigm (QIP), consisting of a cycle of understanding, assessing, and packaging [SEL96]. The techniques presented next are intended as other examples of analytical techniques, and reflect what is done in practice. [Fen98; Mus99], (Lyu96; Wei93; Zel98) Whether or not a specific organization uses all these techniques will depend, at least partially, on its maturity.
- Experimental Studies: Experimentation involves setting up controlled or quasi-experiments in the organization to evaluate processes (McG94). Usually, a new process is compared with the current process to determine whether or not the former has better process outcomes. Another type of experimental study is process simulation. This type of study can be used to analyze process behavior, explore process improvement potentials, predict process outcomes if the current process is changed in a certain way, and control process execution. Initial data about the performance of the current process need to be collected, however, as a basis for the simulation.
- Process Definition Review is a means by which a process definition (either a descriptive or a prescriptive one, or both) is reviewed, and deficiencies and potential process improvements are identified. Typical examples of this are presented in (Ban95; Kel98). An easy operational way to analyze a process is to compare it to an existing standard (national, international, or professional body), such as IEEE/EIA 12207.0 [IEEE12207.0-96]. With this approach, quantitative data are not collected on the process, or, if they are, they play a supportive role. The individuals performing the analysis of the process definition use their knowledge and capabilities to decide what process changes would potentially lead to desirable process outcomes. Observational studies can also provide useful feedback for identifying process improvements (Agr99).
- Orthogonal Defect Classification is a technique which can be used to link faults found with potential causes. It relies on a mapping between fault types and fault triggers (Chi92; Chi96). The IEEE standard on the classification of faults (or anomalies) may be useful in this context (IEEE Std 1044-93, IEEE Standard for the Classification of Software Anomalies).
- Root Cause Analysis is another common analytical technique which is used in practice. It involves tracing back from detected problems (for example, faults) to identify the process causes, with the aim of changing the process to avoid these problems in the future. Examples for different types of processes are described in (Col93; Ele97; Nak91). The Orthogonal Defect Classification technique described above can be used to find the categories in which many problems exist, at which point they can be analyzed; Orthogonal Defect Classification is thus a technique used to make a quantitative selection of where to apply Root Cause Analysis.
- Statistical Process Control is an effective way to identify stability, or the lack of it, in the process through the use of control charts and their interpretations; a small numerical illustration follows after this list. A good introduction to SPC in the context of software engineering is presented in (Flo99).
- The Personal Software Process defines a series of improvements to an individual's development practices in a specified order [Hum95]. It is "bottom-up" in the sense that it stipulates personal data collection and improvements based on the data interpretations.
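As a small, invented illustration of the control-chart idea behind Statistical Process Control, the sketch below computes c-chart limits (mean plus or minus three times the square root of the mean) for defect counts found in a series of comparable inspections and flags any count outside those limits. The data are made up, and this is no substitute for the treatment in (Flo99).

    import math

    # Hypothetical defects found in each of a series of comparable inspections.
    defects = [7, 5, 9, 6, 8, 7, 19, 6, 5, 8]

    mean = sum(defects) / len(defects)
    ucl = mean + 3 * math.sqrt(mean)             # upper control limit (c-chart)
    lcl = max(0.0, mean - 3 * math.sqrt(mean))   # lower control limit, floored at zero

    print(f"mean = {mean:.1f}, LCL = {lcl:.1f}, UCL = {ucl:.1f}")
    for i, count in enumerate(defects, start=1):
        if count > ucl or count < lcl:
            print(f"inspection {i}: {count} defects -> out of control, investigate")

A point outside the limits signals an assignable cause worth investigating (here, the seventh inspection); points inside the limits are treated as common-cause variation of a stable process.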
4.5.2. Benchmarking techniques
The second type of technique, benchmarking, "depends on identifying an 'excellent' organization in a field and documenting its practices and tools." Benchmarking assumes that if a less-proficient organization adopts the practices of the excellent organization, it will also become excellent. Benchmarking involves assessing the maturity of an organization or the capability of its processes. It is exemplified by the software process assessment work. A general introductory overview of process assessments and their application is provided in (Zah98).

MATRIX OF TOPICS VS. REFERENCE MATERIAL
The matrix relates the topics of this KA to the recommended reference material cited under each topic; chapter and section numbers are given where the text cites them.
- 1.1 Process Infrastructure: [Bas92], [Fow90], [McF96], [SEL96]
- 1.2 Software Process Management Cycle: [Bas92], [Fow90], [McF96], [SEL96]
- 1.3 Models for Process Implementation and Change: [McF96], [SEL96]
- 1.4 Practical Considerations: [Gol99], [Moi98], [San98], [Sti99]
- 2.1 Software Life Cycle Models: [Com97], [Pfl01: c2]
- 2.2 Software Life Cycle Processes: [Bec99], [Boe03]
- 2.3 Notations for Process Definitions: [OMG02], [Pfl01]
- 2.4 Process Adaptation: [Joh99]
- 2.5 Automation: [Fin94]
- 3.1 Process Assessment Models: [Joh99], [San98], [SEI01], [Sti99]
- 3.2 Process Assessment Methods: [Gol99], [SEI01]
- 4.1 Process Measurement: [Fen98: c3, c11], [Som05: c25]
- 4.2 Software Products Measurement: [Fen98: c8-c10], [Pre04: c15], [Som05: c24]
- 4.3 Quality of Measurement Results: [Abr96], [Fen98: c2], [Pfl01: c11]
- 4.4 Software Information Models: model building [Fen98: c4, c6, c13], [Pfl01: c3, c11, c12], [Som05: c25]; model implementation [Fen98: c6], [Pfl01: c3, c11, c12], [Pre04: c22], [Som05: c25]
- 4.5 Process Measurement Techniques: [Fen98], [Gol99], [Mus99], [SEL96]
The matrix also relates the topics of this KA to the standards listed in Appendix B (ISO 9001, ISO 9126, ISO VIM, ISO/IEC 14764, ISO/IEC 15288, ISO/IEC 15504, ISO/IEC 15939, ISO/IEC 19761, ISO/IEC 20926, ISO/IEC 20968, ISO 9000-3, IEEE 1044, IEEE 1061, IEEE 1074, IEEE 1219, IEEE 1517, IEEE 1540, IEEE 12207, and IEEE 14143.1). The standards cited under each topic are:
- 1.1 Process Infrastructure and 1.2 Software Process Management Cycle: IEEE 12207, ISO/IEC 15504
- 2.1 Software Life Cycle Models: IEEE 12207
- 2.2 Software Life Cycle Processes: IEEE 12207, IEEE 1074, IEEE 1219, IEEE 1517, IEEE 1540, ISO/IEC 14764, ISO/IEC 15939, ISO 9001, ISO/IEC 90003
- 2.3 Notations for Process Definitions: IEEE 12207, ISO/IEC 15504
- 2.4 Process Adaptation: IEEE 12207, ISO/IEC 15504
- 3.1 Process Assessment Models: ISO/IEC 15504, ISO 9001
- 3.2 Process Assessment Methods: ISO/IEC 15504
- 4.1 Process Measurement: ISO/IEC 15939, ISO VIM
- 4.2 Software Products Measurement: IEEE 14143.1, ISO/IEC 19761, ISO/IEC 20926, ISO/IEC 20968, ISO 9126
- 4.3 Quality of Measurement Results: ISO VIM
- 4.5 Process Measurement Techniques: IEEE 1044, IEEE 12207
RECOMMENDED REFERENCES
[Abr96] A. Abran and P.N. Robillard, "Function Points Analysis: An Empirical Study of its Measurement Processes," IEEE Transactions on Software Engineering, vol. 22, 1996, pp. 895-909.
[Bas92] V. Basili et al., "The Software Engineering Laboratory — An Operational Software Experience Factory," presented at the International Conference on Software Engineering, 1992.
[Bec99] K. Beck, Extreme Programming Explained: Embrace Change, Addison-Wesley, 1999.
[Boe03] B. Boehm and R. Turner, "Using Risk to Balance Agile and Plan-Driven Methods," Computer, June 2003, pp. 57-66.
[Com97] E. Comer, "Alternative Software Life Cycle Models," presented at the International Conference on Software Engineering, 1997.
[ElE99] K. El-Emam and N. Madhavji, Elements of Software Process Assessment and Improvement, IEEE Computer Society Press, 1999.
[Fen98] N.E. Fenton and S.L. Pfleeger, Software Metrics: A Rigorous & Practical Approach, second ed., International Thomson Computer Press, 1998.
[Fin94] A. Finkelstein, J. Kramer, and B. Nuseibeh, Software Process Modeling and Technology, Research Studies Press Ltd., 1994.
[Fow90] P. Fowler and S. Rifkin, Software Engineering Process Group Guide, Software Engineering Institute, Technical Report CMU/SEI-90-TR-24, 1990, available at http://www.sei.cmu.edu/pub/documents/90.reports/pdf/tr24.90.pdf.
[Gol99] D. Goldenson et al., "Empirical Studies of Software Process Assessment Methods," presented at Elements of Software Process Assessment and Improvement, 1999.
[IEEE1074-97] IEEE Std 1074-1997, IEEE Standard for Developing Software Life Cycle Processes, IEEE, 1997.
[IEEE12207.0-96] IEEE/EIA 12207.0-1996//ISO/IEC 12207:1995, Industry Implementation of Int. Std ISO/IEC 12207:95, Standard for Information Technology - Software Life Cycle Processes, IEEE, 1996.
[ISO9126-01] ISO/IEC 9126-1:2001, Software Engineering - Product Quality - Part 1: Quality Model, ISO and IEC, 2001.
[ISO15504-98] ISO/IEC TR 15504:1998, Information Technology - Software Process Assessment (parts 1-9), ISO and IEC, 1998.
[ISO15939-02] ISO/IEC 15939:2002, Software Engineering - Software Measurement Process, ISO and IEC, 2002.
[Joh99] D. Johnson and J. Brodman, "Tailoring the CMM for Small Businesses, Small Organizations, and Small Projects," presented at Elements of Software Process Assessment and Improvement, 1999.
[McF96] B. McFeeley, IDEAL: A User's Guide for Software Process Improvement, Software Engineering Institute, CMU/SEI-96-HB-001, 1996, available at http://www.sei.cmu.edu/pub/documents/96.reports/pdf/hb001.96.pdf.
[Moi98] D. Moitra, "Managing Change for Software Process Improvement Initiatives: A Practical Experience-Based Approach," Software Process - Improvement and Practice, vol. 4, iss. 4, 1998, pp. 199-207.
[Mus99] J. Musa, Software Reliability Engineering: More Reliable Software, Faster Development and Testing, McGraw-Hill, 1999.
[OMG02] Object Management Group, "Software Process Engineering Metamodel Specification," 2002, available at http://www.omg.org/docs/formal/02-11-14.pdf.
[Pfl01] S.L. Pfleeger, Software Engineering: Theory and Practice, second ed., Prentice Hall, 2001.
[Pre04] R.S. Pressman, Software Engineering: A Practitioner's Approach, sixth ed., McGraw-Hill, 2004.
[San98] M. Sanders, "The SPIRE Handbook: Better, Faster, Cheaper Software Development in Small Organisations," European Commission, 1998.
[SEI01] Software Engineering Institute, "Capability Maturity Model Integration, v1.1," 2001, available at http://www.sei.cmu.edu/cmmi/cmmi.html.
[SEL96] Software Engineering Laboratory, Software Process Improvement Guidebook, NASA/GSFC, Technical Report SEL-95-102, April 1996, available at http://sel.gsfc.nasa.gov/website/documents/online-doc/95-102.pdf.
[Som05] I. Sommerville, Software Engineering, seventh ed., Addison-Wesley, 2005.
[Sti99] H. Stienen, "Software Process Assessment and Improvement: 5 Years of Experiences with Bootstrap," Elements of Software Process Assessment and Improvement, K. El-Emam and N. Madhavji, eds., IEEE Computer Society Press, 1999.
[VIM93] ISO VIM, International Vocabulary of Basic and General Terms in Metrology, ISO, 1993.
Appendix A. List of Further Readings
(Agr99) W. Agresti, "The Role of Design and Analysis in Process Improvement," presented at Elements of Software Process Assessment and Improvement, 1999.
(Ale91) L. Alexander and A. Davis, "Criteria for Selecting Software Process Models," presented at COMPSAC '91, 1991.
(Ban95) S. Bandinelli et al., "Modeling and Improving an Industrial Software Process," IEEE Transactions on Software Engineering, vol. 21, iss. 5, 1995, pp. 440-454.
(Bar95) N. Barghouti et al., "Two Case Studies in Modeling Real, Corporate Processes," Software Process - Improvement and Practice, Pilot Issue, 1995, pp. 17-32.
(Boe03a) B. Boehm and R. Turner, Balancing Agility and Discipline: A Guide for the Perplexed, Addison-Wesley, 2003.
(Bur99) I. Burnstein et al., "A Testing Maturity Model for Software Test Process Assessment and Improvement," Software Quality Professional, vol. 1, iss. 4, 1999, pp. 8-21.
(Chi92) R. Chillarege et al., "Orthogonal Defect Classification - A Concept for In-Process Measurement," IEEE Transactions on Software Engineering, vol. 18, iss. 11, 1992, pp. 943-956.
(Chi96) R. Chillarege, "Orthogonal Defect Classification," Handbook of Software Reliability Engineering, M. Lyu, ed., IEEE Computer Society Press, 1996.
(Col93) J. Collofello and B. Gosalia, "An Application of Causal Analysis to the Software Production Process," Software Practice and Experience, vol. 23, iss. 10, 1993, pp. 1095-1105.
(Cur02) B. Curtis, W. Hefley, and S. Miller, The People Capability Maturity Model: Guidelines for Improving the Workforce, Addison-Wesley, 2002.
(Dav88) A. Davis, E. Bersoff, and E. Comer, "A Strategy for Comparing Alternative Software Development Life Cycle Models," IEEE Transactions on Software Engineering, vol. 14, iss. 10, 1988, pp. 1453-1461.
(Dun96) D. Dunaway and S. Masters, "CMM-Based Appraisal for Internal Process Improvement (CBA IPI): Method Description," Software Engineering Institute, CMU/SEI-96-TR-007, 1996, available at http://www.sei.cmu.edu/pub/documents/96.reports/pdf/tr007.96.pdf.
(EIA/IS731-99) EIA, "EIA/IS 731 Systems Engineering Capability Model," 1999, available at http://www.geia.org/eoc/G47/index.html.
(ElE-97) K. El-Emam, D. Holtje, and N. Madhavji, "Causal Analysis of the Requirements Change Process for a Large System," presented at the International Conference on Software Maintenance, 1997.
(ElE-99a) K. El-Emam, B. Smith, and P. Fusaro, "Success Factors and Barriers in Software Process Improvement: An Empirical Study," Better Software Practice for Business Benefit: Principles and Experiences, R. Messnarz and C. Tully, eds., IEEE Computer Society Press, 1999.
(ElE-00a) K. El-Emam and A. Birk, "Validating the ISO/IEC 15504 Measures of Software Development Process Capability," Journal of Systems and Software, vol. 51, iss. 2, 2000, pp. 119-149.
(ElE-00b) K. El-Emam and A. Birk, "Validating the ISO/IEC 15504 Measures of Software Requirements Analysis Process Capability," IEEE Transactions on Software Engineering, vol. 26, iss. 6, June 2000, pp. 541-566.
(Fay97) M. Fayad and M. Laitinen, "Process Assessment: Considered Wasteful," Communications of the ACM, vol. 40, iss. 11, November 1997.
(Flo99) W. Florac and A. Carleton, Measuring the Software Process: Statistical Process Control for Software Process Improvement, Addison-Wesley, 1999.
(Gar96) P. Garg and M. Jazayeri, "Process-Centered Software Engineering Environments: A Grand Tour," Software Process, A. Fuggetta and A. Wolf, eds., John Wiley & Sons, 1996.
(Gra97) R. Grady, Successful Software Process Improvement, Prentice Hall, 1997.
(Gra98) E. Gray and W. Smith, "On the Limitations of Software Process Assessment and the Recognition of a Required Re-Orientation for Global Process Improvement," Software Quality Journal, vol. 7, 1998, pp. 21-34.
(Har98) D. Harel and M. Politi, Modeling Reactive Systems with Statecharts: The Statemate Approach, McGraw-Hill, 1998.
(Her98) J. Herbsleb, "Hard Problems and Hard Science: On the Practical Limits of Experimentation," IEEE TCSE Software Process Newsletter, vol. 11, 1998, pp. 18-21, available at http://www.seg.iit.nrc.ca/SPN.
(Hum95) W. Humphrey, A Discipline for Software Engineering, Addison-Wesley, 1995.
(Hum99) W. Humphrey, An Introduction to the Team Software Process, Addison-Wesley, 1999.
(Hut94) D. Hutton, The Change Agent's Handbook: A Survival Guide for Quality Improvement Champions, Irwin, 1994.
(Kan02) S.H. Kan, Metrics and Models in Software Quality Engineering, second ed., Addison-Wesley, 2002.
(Kel98) M. Kellner et al., "Process Guides: Effective Guidance for Process Participants," presented at the 5th International Conference on the Software Process, 1998.
(Kit98) B. Kitchenham, "Selecting Projects for Technology Evaluation," IEEE TCSE Software Process Newsletter, iss. 11, 1998, pp. 3-6, available at http://www.seg.iit.nrc.ca/SPN.
(Kra99) H. Krasner, "The Payoff for Software Process Improvement: What It Is and How to Get It," presented at Elements of Software Process Assessment and Improvement, 1999.
(Kri99) M.S. Krishnan and M. Kellner, "Measuring Process Consistency: Implications for Reducing Software Defects," IEEE Transactions on Software Engineering, vol. 25, iss. 6, November/December 1999, pp. 800-815.
(Lyu96) M.R. Lyu, Handbook of Software Reliability Engineering, McGraw-Hill/IEEE, 1996.
(Mad94) N. Madhavji et al., "Elicit: A Method for Eliciting Process Models," presented at the Third International Conference on the Software Process, 1994.
(Mas95) S. Masters and C. Bothwell, "CMM Appraisal Framework - Version 1.0," Software Engineering Institute, CMU/SEI-TR-95-001, 1995, available at http://www.sei.cmu.edu/pub/documents/95.reports/pdf/tr001.95.pdf.
(McG94) F. McGarry et al., "Software Process Improvement in the NASA Software Engineering Laboratory," Software Engineering Institute, CMU/SEI-94-TR-22, 1994, available at http://www.sei.cmu.edu/pub/documents/94.reports/pdf/tr22.94.pdf.
(McG01) J. McGarry et al., Practical Software Measurement: Objective Information for Decision Makers, Addison-Wesley, 2001.
(Mcg93) C. McGowan and S. Bohner, "Model Based Process Assessments," presented at the International Conference on Software Engineering, 1993.
(Nak91) T. Nakajo and H. Kume, "A Case History Analysis of Software Error Cause-Effect Relationship," IEEE Transactions on Software Engineering, vol. 17, iss. 8, 1991.
(Pau94) M. Paulk and M. Konrad, "Measuring Process Capability Versus Organizational Process Maturity," presented at the 4th International Conference on Software Quality, 1994.
(Pfl99) S.L. Pfleeger, "Understanding and Improving Technology Transfer in Software Engineering," Journal of Systems and Software, vol. 47, 1999, pp. 111-124.
(Pfl01) S.L. Pfleeger, Software Engineering: Theory and Practice, second ed., Prentice Hall, 2001.
(Rad85) R. Radice et al., "A Programming Process Architecture," IBM Systems Journal, vol. 24, iss. 2, 1985, pp. 79-90.
(Rag89) S. Raghavan and D. Chand, "Diffusing Software-Engineering Methods," IEEE Software, July 1989, pp. 81-90.
(Rog83) E. Rogers, Diffusion of Innovations, Free Press, 1983.
(Sch99) E. Schein, Process Consultation Revisited: Building the Helping Relationship, Addison-Wesley, 1999.
(SEI95) Software Engineering Institute, The Capability Maturity Model: Guidelines for Improving the Software Process, Addison-Wesley, 1995.
(SEL96) Software Engineering Laboratory, Software Process Improvement Guidebook, NASA/GSFC, Technical Report SEL-95-102, April 1996, available at http://sel.gsfc.nasa.gov/website/documents/online-doc/95-102.pdf.
(SPC92) Software Productivity Consortium, Process Definition and Modeling Guidebook, SPC-92041-CMC, 1992.
(Som97) I. Sommerville and P. Sawyer, Requirements Engineering: A Good Practice Guide, John Wiley & Sons, 1997.
(Vot93) L. Votta, "Does Every Inspection Need a Meeting?" ACM Software Engineering Notes, vol. 18, iss. 5, 1993, pp. 107-114.
(Wei93) G.M. Weinberg, Quality Software Management, vol. 2: First-Order Measurement (Ch. 8, Measuring Cost and Value), 1993.
(Yu94) E. Yu and J. Mylopolous, "Understanding 'Why' in Software Process Modeling, Analysis, and Design," presented at the 16th International Conference on Software Engineering, 1994.
(Zah98) S. Zahran, Software Process Improvement: Practical Guidelines for Business Success, Addison-Wesley, 1998.
(Zel98) M.V. Zelkowitz and D.R. Wallace, "Experimental Models for Validating Technology," Computer, vol. 31, iss. 5, 1998, pp. 23-31.
Appendix B. List of Standards
(IEEE1044-93) IEEE Std 1044-1993 (R2002), IEEE Standard for the Classification of Software Anomalies, IEEE, 1993.
(IEEE1061-98) IEEE Std 1061-1998, IEEE Standard for a Software Quality Metrics Methodology, IEEE, 1998.
(IEEE1074-97) IEEE Std 1074-1997, IEEE Standard for Developing Software Life Cycle Processes, IEEE, 1997.
(IEEE1219-98) IEEE Std 1219-1998, IEEE Standard for Software Maintenance, IEEE, 1998.
(IEEE1220-98) IEEE Std 1220-1998, IEEE Standard for the Application and Management of the Systems Engineering Process, IEEE, 1998.
(IEEE1517-99) IEEE Std 1517-1999, IEEE Standard for Information Technology - Software Life Cycle Processes - Reuse Processes, IEEE, 1999.
(IEEE1540-01) IEEE Std 1540-2001//ISO/IEC 16085:2003, IEEE Standard for Software Life Cycle Processes - Risk Management, IEEE, 2001.
(IEEE12207.0-96) IEEE/EIA 12207.0-1996//ISO/IEC 12207:1995, Industry Implementation of Int. Std ISO/IEC 12207:95, Standard for Information Technology - Software Life Cycle Processes, IEEE, 1996.
(IEEE12207.1-96) IEEE/EIA 12207.1-1996, Industry Implementation of Int. Std ISO/IEC 12207:95, Standard for Information Technology - Software Life Cycle Processes - Life Cycle Data, IEEE, 1996.
(IEEE12207.2-97) IEEE/EIA 12207.2-1997, Industry Implementation of Int. Std ISO/IEC 12207:95, Standard for Information Technology - Software Life Cycle Processes - Implementation Considerations, IEEE, 1997.
(IEEE14143.1-00) IEEE Std 14143.1-2000//ISO/IEC 14143-1:1998, Information Technology - Software Measurement - Functional Size Measurement - Part 1: Definitions of Concepts, IEEE, 2000.
(ISO9001-00) ISO 9001:2000, Quality Management Systems - Requirements, ISO, 2000.
(ISO9126-01) ISO/IEC 9126-1:2001, Software Engineering - Product Quality - Part 1: Quality Model, ISO and IEC, 2001.
(ISO14764-99) ISO/IEC 14764:1999, Information Technology - Software Maintenance, ISO and IEC, 1999.
(ISO15288-02) ISO/IEC 15288:2002, Systems Engineering - System Life Cycle Processes, ISO and IEC, 2002.
(ISO15504-98) ISO/IEC TR 15504:1998, Information Technology - Software Process Assessment (parts 1-9), ISO and IEC, 1998.
(ISO15939-02) ISO/IEC 15939:2002, Software Engineering - Software Measurement Process, ISO and IEC, 2002.
(ISO19761-03) ISO/IEC 19761:2003, Software Engineering - COSMIC-FFP - A Functional Size Measurement Method, ISO and IEC, 2003.
(ISO20926-03) ISO/IEC 20926:2003, Software Engineering - IFPUG 4.1 Unadjusted Functional Size Measurement Method - Counting Practices Manual, ISO and IEC, 2003.
(ISO20968-02) ISO/IEC 20968:2002, Software Engineering - MK II Function Point Analysis - Counting Practices Manual, ISO and IEC, 2002.
(ISO90003-04) ISO/IEC 90003:2004, Software and Systems Engineering - Guidelines for the Application of ISO 9001:2000 to Computer Software, ISO and IEC, 2004.
(VIM93) ISO VIM, International Vocabulary of Basic and General Terms in Metrology, ISO, 1993.
CHAPTER 10
SOFTWARE ENGINEERING TOOLS AND METHODS

ACRONYM
CASE: Computer Assisted Software Engineering

INTRODUCTION
Software development tools are the computer-based tools that are intended to assist the software life cycle processes. Tools allow repetitive, well-defined actions to be automated, reducing the cognitive load on the software engineer, who is then free to concentrate on the creative aspects of the process. Tools are often designed to support particular software engineering methods, reducing any administrative load associated with applying the method manually. Like software engineering methods, they are intended to make software engineering more systematic, and they vary in scope from supporting individual tasks to encompassing the complete life cycle.
Software engineering methods impose structure on the software engineering activity with the goal of making the activity systematic and ultimately more likely to be successful. Methods usually provide a notation and vocabulary, procedures for performing identifiable tasks, and guidelines for checking both the process and the product. They vary widely in scope, from a single life cycle phase to the complete life cycle. The emphasis in this KA is on software engineering methods encompassing multiple life cycle phases, since phase-specific methods are covered by other KAs.
While there are detailed manuals on specific tools and numerous research papers on innovative tools, generic technical writings on software engineering tools are relatively scarce. One difficulty is the high rate of change in software tools in general. Specific details alter regularly, making it difficult to provide concrete, up-to-date examples.
The Software Engineering Tools and Methods KA covers the complete life cycle processes, and is therefore related to every KA in the Guide.

Figure 1 Breakdown of topics in the Software Engineering Tools and Methods KA (two subareas: Software Engineering Tools, topics 1.1 to 1.10, and Software Engineering Methods, topics 2.1 to 2.3)
BREAKDOWN OF TOPICS FOR SOFTWARE ENGINEERING TOOLS AND METHODS

1. Software Engineering Tools
The first five topics in the Software Engineering Tools subarea correspond to the first five KAs of the Guide (Software Requirements, Software Design, Software Construction, Software Testing, and Software Maintenance). The next four topics correspond to the remaining KAs (Software Configuration Management, Software Engineering Management, Software Engineering Process, and Software Quality). An additional topic, Miscellaneous, addresses areas such as tool integration techniques which are potentially applicable to all classes of tools.

1.1. Software Requirements Tools
[Dor97, Dor02]
Tools for dealing with software requirements have been classified into two categories: modeling and traceability tools.
- Requirements modeling tools. These tools are used for eliciting, analyzing, specifying, and validating software requirements.
- Requirements traceability tools. [Dor02] These tools are becoming increasingly important as the complexity of software grows. Since they are also relevant in other life cycle processes, they are presented separately from the requirements modeling tools.

1.2. Software Design Tools
[Dor02]
This topic covers tools for creating and checking software designs. There are a variety of such tools, with much of this variety being a consequence of the diversity of software design notations and methods. In spite of this variety, no compelling divisions for this topic have been found.

1.3. Software Construction Tools
[Dor02, Rei96]
This topic covers software construction tools. These tools are used to produce and translate program representations (for instance, source code) which are sufficiently detailed and explicit to enable machine execution.
- Program editors. These tools are used for the creation and modification of programs, and possibly the documents associated with them. They can be general-purpose text or document editors, or they can be specialized for a target language.
- Compilers and code generators. Traditionally, compilers have been noninteractive translators of source code, but there has been a trend to integrate compilers and program editors to provide integrated programming environments. This topic also covers preprocessors, linker/loaders, and code generators.
- Interpreters. These tools provide software execution through emulation. They can support software construction activities by providing a more controllable and observable environment for program execution.
- Debuggers. These tools are considered a separate category since they support the software construction process, but they are different from program editors and compilers.

1.4. Software Testing Tools
[Dor02, Pfl01, Rei96]
- Test generators. These tools assist in the development of test cases.
- Test execution frameworks. These tools enable the execution of test cases in a controlled environment where the behavior of the object under test is observed (a minimal illustration follows after this list).
- Test evaluation tools. These tools support the assessment of the results of test execution, helping to determine whether or not the observed behavior conforms to the expected behavior.
- Test management tools. These tools provide support for all aspects of the software testing process.
- Performance analysis tools. [Rei96] These tools are used for measuring and analyzing software performance, which is a specialized form of testing where the goal is to assess performance behavior rather than functional behavior (correctness).
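Most mainstream languages ship with at least one test execution framework; Python's built-in unittest module is one widely available example, used below. The parse_version function under test is invented purely for the illustration.

    import unittest

    def parse_version(text):
        """Split a 'major.minor' version string into a tuple of ints (toy example)."""
        major, minor = text.split(".")
        return int(major), int(minor)

    class ParseVersionTests(unittest.TestCase):
        def test_simple_version(self):
            self.assertEqual(parse_version("2.7"), (2, 7))

        def test_rejects_garbage(self):
            with self.assertRaises(ValueError):
                parse_version("not-a-version")

    if __name__ == "__main__":
        unittest.main()  # discovers and runs the test cases, reporting pass/fail results

The framework supplies the controlled environment described above: it isolates each test case, observes the behavior of the object under test, and reports the outcome, which is the part that test evaluation and test management tools then build on.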
1.5. Software Maintenance Tools
[Dor02, Pfl01]
This topic encompasses tools which are particularly important in software maintenance, where existing software is being modified. Two categories are identified: comprehension tools and reengineering tools.
- Comprehension tools. [Rei96] These tools assist in the human comprehension of programs. Examples include visualization tools such as animators and program slicers.
- Reengineering tools. In the Software Maintenance KA, reengineering is defined as the examination and alteration of the subject software to reconstitute it in a new form, and includes the subsequent implementation of the new form. Reengineering tools support that activity.
Reverse engineering tools assist the process by working backwards from an existing product to create artifacts such as specification and design descriptions, which can then be transformed to generate a new product from an old one.
1.6. Software Configuration Management Tools
[Dor02, Rei96, Som05]
Tools for configuration management have been divided into three categories: tracking, version management, and release tools.
- Defect, enhancement, issue, and problem-tracking tools. These tools are used in connection with the problem-tracking issues associated with a particular software product.
- Version management tools. These tools are involved in the management of multiple versions of a product.
- Release and build tools. These tools are used to manage the tasks of software release and build. The category includes installation tools, which have become widely used for configuring the installation of software products.
Additional information is given in the Software Configuration Management KA, topic 1.3 Planning for SCM.

1.7. Software Engineering Management Tools
[Dor02]
Software engineering management tools are subdivided into three categories: project planning and tracking, risk management, and measurement.
- Project planning and tracking tools. These tools are used in software project effort measurement and cost estimation, as well as project scheduling.
- Risk management tools. These tools are used in identifying, estimating, and monitoring risks.
- Measurement tools. The measurement tools assist in performing the activities related to the software measurement program.

1.8. Software Engineering Process Tools
[Dor02, Som05]
Software engineering process tools are divided into modeling tools, management tools, and software development environments.
- Process modeling tools. [Pfl01] These tools are used to model and investigate software engineering processes.
- Process management tools. These tools provide support for software engineering management.
- Integrated CASE environments. [Rei96, Som05] (ECMA55-93, ECMA69-94, IEEE1209-92, IEEE1348-95, Mul96) Integrated computer-aided software engineering tools or environments covering multiple phases of the software engineering life cycle belong in this subtopic. Such tools perform multiple functions and hence potentially interact with the software life cycle process being executed.
- Process-centered software engineering environments. [Rei96] (Gar96) These environments explicitly incorporate information on the software life cycle processes and guide and monitor the user according to the defined process.

1.9. Software Quality Tools
[Dor02]
Quality tools are divided into two categories: inspection and analysis tools.
- Review and audit tools. These tools are used to support reviews and audits.
- Static analysis tools. [Cla96, Pfl01, Rei96] These tools are used to analyze software artifacts, such as syntactic and semantic analyzers, as well as data, control flow, and dependency analyzers. Such tools are intended for checking software artifacts for conformance or for verifying desired properties (a toy illustration follows after this list).
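Static analysis tools range from full data-flow and dependency analyzers to simple conformance checkers. As a toy illustration of the latter (and not a model of any cited tool), the sketch below uses Python's standard ast module to flag functions that lack a docstring or that use a bare except clause; the sample code being analyzed is invented.

    import ast

    SAMPLE = '''
    def load(path):
        try:
            return open(path).read()
        except:
            return None

    def save(path, data):
        """Write data to path."""
        with open(path, "w") as fh:
            fh.write(data)
    '''

    tree = ast.parse(SAMPLE)
    findings = []
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef) and ast.get_docstring(node) is None:
            findings.append(f"line {node.lineno}: function '{node.name}' has no docstring")
        if isinstance(node, ast.ExceptHandler) and node.type is None:
            findings.append(f"line {node.lineno}: bare 'except:' hides error types")

    for f in findings:
        print(f)

Production static analyzers work the same way in outline (parse, walk a representation of the program, report rule violations) but with far richer program representations and rule sets.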
based approaches, and prototyping methods dealing with Integrated CASE environments. [Rei96, Som05] software engineering approaches based on various forms of (ECMA55-93, ECMA69-94, IEEE1209-92, prototyping. These three topics are not disjoint; rather they IEEE1348-95, Mul96) Integrated computer-aided represent distinct concerns. For example, an object-oriented software engineering tools or environments covering method may incorporate formal techniques and rely on multiple phases of the software engineering life cycle prototyping for verification and validation. Like software belong in this subtopic. Such tools perform multiple engineering tools, methodologies continuously evolve. functions and hence potentially interact with the Consequently, the KA description avoids as far as possible software life cycle process being executed. naming particular methodologies.© IEEE – 2004 Version 10–3
2.1. Heuristic Methods
[Was96]
This topic contains four categories: structured, data-oriented, object-oriented, and domain-specific. The domain-specific category includes specialized methods for developing systems which involve real-time, safety, or security aspects.
- Structured methods. [Dor02, Pfl01, Pre04, Som05] The system is built from a functional viewpoint, starting with a high-level view and progressively refining this into a more detailed design.
- Data-oriented methods. [Dor02, Pre04] Here, the starting points are the data structures that a program manipulates rather than the function it performs.
- Object-oriented methods. [Dor02, Pfl01, Pre04, Som05] The system is viewed as a collection of objects rather than functions.

2.2. Formal Methods
[Dor02, Pre04, Som05]
This subsection deals with mathematically based software engineering methods, and is subdivided according to the various aspects of formal methods.
- Specification languages and notations. [Cla96, Pfl01, Pre04] This topic concerns the specification notation or language used. Specification languages can be classified as model-oriented, property-oriented, or behavior-oriented.
- Refinement. [Pre04] This topic deals with how the method refines (or transforms) the specification into a form which is closer to the desired final form of an executable program.
- Verification/proving properties. [Cla96, Pfl01, Som05] This topic covers the verification properties that are specific to the formal approach, including both theorem proving and model checking (a toy illustration of explicit-state exploration follows after this list).
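Theorem proving and model checking need dedicated tools in practice, but the core idea of explicit-state model checking (exhaustively exploring the reachable states of a model and checking a property in each) fits in a few lines for a toy model. The sketch below checks mutual exclusion for a deliberately simplified two-process lock protocol invented for this illustration; it does not represent any cited method, notation, or tool.

    from collections import deque

    def successors(state):
        """All states reachable in one step of a toy two-process lock protocol."""
        procs, lock = state
        for i in (0, 1):
            p = list(procs)
            if procs[i] == "idle":
                p[i] = "waiting"
                yield (tuple(p), lock)
            elif procs[i] == "waiting" and lock is None:
                p[i] = "critical"
                yield (tuple(p), i)          # acquire the lock
            elif procs[i] == "critical":
                p[i] = "idle"
                yield (tuple(p), None)       # release the lock

    def mutual_exclusion(state):
        procs, _ = state
        return procs.count("critical") <= 1

    initial = (("idle", "idle"), None)
    seen, queue, violations = {initial}, deque([initial]), []
    while queue:                              # breadth-first exploration of the state space
        current = queue.popleft()
        if not mutual_exclusion(current):
            violations.append(current)
        for nxt in successors(current):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)

    print(f"explored {len(seen)} states; violations: {violations or 'none'}")

Real model checkers apply the same reachability idea to vastly larger state spaces, using symbolic representations and temporal-logic properties rather than a hand-written invariant.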
2.3. Prototyping Methods
[Pre04, Som05, Was96]
This subsection covers methods involving software prototyping and is subdivided into prototyping styles, targets, and evaluation techniques.
- Prototyping styles. [Dor02, Pfl01, Pre04] (Pom96) The prototyping styles topic identifies the various approaches: throwaway, evolutionary, and executable specification.
- Prototyping target. [Dor97] (Pom96) Examples of the targets of a prototyping method may be the requirements, the architectural design, or the user interface.
- Prototyping evaluation techniques. This topic covers the ways in which the results of a prototype exercise are used.

MATRIX OF TOPICS VS. REFERENCE MATERIAL
The matrix relates the topics of this KA to the recommended reference material ([Cla96], [Dor02], [Pfl01], [Pre04], [Rei96], [Som05], [Was96], together with {Dor97} and {Pfl98}); the references cited under each topic are:
- 1.1 Software Requirements Tools: [Dor97], [Dor02]
- 1.2 Software Design Tools: [Dor02]
- 1.3 Software Construction Tools: [Dor02], [Rei96]
- 1.4 Software Testing Tools: [Dor02], [Pfl01], [Rei96]
- 1.5 Software Maintenance Tools: [Dor02], [Pfl01], [Rei96]
- 1.6 Software Configuration Management Tools: [Dor02], [Rei96], [Som05]
- 1.7 Software Engineering Management Tools: [Dor02]
- 1.8 Software Engineering Process Tools: [Dor02], [Pfl01], [Rei96], [Som05]
- 1.9 Software Quality Tools: [Cla96], [Dor02], [Pfl01], [Rei96]
- 1.10 Miscellaneous Tool Issues: [Dor02], [Pfl01], [Rei96], [Som05]
- 2.1 Heuristic Methods: [Dor02], [Pfl01], [Pre04], [Som05], [Was96]
- 2.2 Formal Methods: [Cla96], [Dor02], [Pfl01], [Pre04], [Som05]
- 2.3 Prototyping Methods: [Dor97], [Dor02], [Pfl01], [Pre04], [Som05], [Was96]
• 153.
RECOMMENDED REFERENCES FOR SOFTWARE ENGINEERING TOOLS AND METHODS
[Cla96] E.M. Clarke et al., “Formal Methods: State of the Art and Future Directions,” ACM Computing Surveys, vol. 28, iss. 4, 1996, pp. 626-643.
[Dor97] M. Christensen, M. Dorfman, and R.H. Thayer, eds., Software Engineering, IEEE Computer Society Press, 1997.
[Dor02] M. Christensen, M. Dorfman, and R.H. Thayer, eds., Software Engineering, Vol. 1 & Vol. 2, IEEE Computer Society Press, 2002.
[Pfl01] S.L. Pfleeger, Software Engineering: Theory and Practice, second ed., Prentice Hall, 2001.
[Pre04] R.S. Pressman, Software Engineering: A Practitioner’s Approach, sixth ed., McGraw-Hill, 2004.
[Rei96] S.P. Reiss, Software Tools and Environments, in The Computer Science and Engineering Handbook, CRC Press, 1996.
[Som05] I. Sommerville, Software Engineering, seventh ed., Addison-Wesley, 2005.
[Was96] A.I. Wasserman, “Toward a Discipline of Software Engineering,” IEEE Software, vol. 13, iss. 6, November 1996, pp. 23-31.
• 154.
APPENDIX A. LIST OF FURTHER READINGS
(Ber93) E.V. Berard, Essays on Object-Oriented Software Engineering, Prentice Hall, 1993.
(Bis92) W. Bischofberger and G. Pomberger, Prototyping-Oriented Software Development: Concepts and Tools, Springer-Verlag, 1992.
(Bro94) A.W. Brown et al., Principles of CASE Tool Integration, Oxford University Press, 1994.
(Car95) D.J. Carney and A.W. Brown, “On the Necessary Conditions for the Composition of Integrated Software Engineering Environments,” presented at Advances in Computers, 1995.
(Col94) D. Coleman et al., Object-Oriented Development: The Fusion Method, Prentice Hall, 1994.
(Cra95) D. Craigen, S. Gerhart, and T. Ralston, “Formal Methods Reality Check: Industrial Usage,” IEEE Transactions on Software Engineering, vol. 21, iss. 2, February 1995, pp. 90-98.
(Fin00) A. Finkelstein, ed., The Future of Software Engineering, ACM, 2000.
(Gar96) P.K. Garg and M. Jazayeri, Process-Centered Software Engineering Environments, IEEE Computer Society Press, 1996.
(Har00) W. Harrison, H. Ossher, and P. Tarr, “Software Engineering Tools and Environments: A Roadmap,” 2000.
(Jar98) S. Jarzabek and R. Huang, “The Case for User-Centered CASE Tools,” Communications of the ACM, vol. 41, iss. 8, August 1998, pp. 93-99.
(Kit95) B. Kitchenham, L. Pickard, and S.L. Pfleeger, “Case Studies for Method and Tool Evaluation,” IEEE Software, vol. 12, iss. 4, July 1995, pp. 52-62.
(Lam00) A. v. Lamsweerde, “Formal Specification: A Roadmap,” The Future of Software Engineering, A. Finkelstein, ed., ACM, 2000, pp. 149-159.
(Mey97) B. Meyer, Object-Oriented Software Construction, second ed., Prentice Hall, 1997.
(Moo98) J.W. Moore, Software Engineering Standards, A User’s Roadmap, IEEE Computer Society Press, 1998.
(Mos92) V. Mosley, “How to Assess Tools Efficiently and Quantitatively,” IEEE Software, vol. 9, iss. 3, May 1992, pp. 29-32.
(Mül96) H.A. Müller, R.J. Norman, and J. Slonim, eds., “Computer Aided Software Engineering,” special issue of Automated Software Engineering, vol. 3, iss. 3/4, Kluwer, 1996.
(Mül00) H. Müller et al., “Reverse Engineering: A Roadmap,” The Future of Software Engineering, A. Finkelstein, ed., ACM, 2000, pp. 49-60.
(Pom96) G. Pomberger and G. Blaschek, Object-Orientation and Prototyping in Software Engineering, Prentice Hall, 1996.
(Pos96) R.M. Poston, Automating Specification-based Testing, IEEE Press, 1996.
(Ric92) C. Rich and R.C. Waters, “Knowledge Intensive Software Engineering Tools,” IEEE Transactions on Knowledge and Data Engineering, vol. 4, iss. 5, October 1992, pp. 424-430.
(Son92) X. Song and L.J. Osterweil, “Towards Objective, Systematic Design-Method Comparisons,” IEEE Software, vol. 9, iss. 3, May 1992, pp. 43-53.
(Tuc96) A.B. Tucker, The Computer Science and Engineering Handbook, CRC Press, 1996.
(Val97) L.A. Valaer and R.C.B. II, “Choosing a User Interface Development Tool,” IEEE Software, vol. 14, iss. 4, 1997, pp. 29-39.
(Vin90) W.G. Vincenti, What Engineers Know and How They Know It — Analytical Studies from Aeronautical History, Johns Hopkins University Press, 1990.
(Wie98) R. Wieringa, “A Survey of Structured and Object-Oriented Software Specification Methods and Techniques,” ACM Computing Surveys, vol. 30, iss. 4, 1998, pp. 459-527.
• 155.
APPENDIX B. LIST OF STANDARDS
(ECMA55-93) ECMA, TR/55 Reference Model for Frameworks of Software Engineering Environments, third ed., 1993.
(ECMA69-94) ECMA, TR/69 Reference Model for Project Support Environments, 1994.
(IEEE1175.1-02) IEEE Std 1175.1-2002, IEEE Guide for CASE Tool Interconnections—Classification and Description, IEEE Press, 2002.
(IEEE1209-92) IEEE Std 1209-1992, Recommended Practice for the Evaluation and Selection of CASE Tools, (ISO/IEC 14102, 1995), IEEE Press, 1992.
(IEEE1348-95) IEEE Std 1348-1995, Recommended Practice for the Adoption of CASE Tools, (ISO/IEC 14471), IEEE Press, 1995.
(IEEE12207.0-96) IEEE/EIA 12207.0-1996//ISO/IEC 12207:1995, Industry Implementation of Int. Std. ISO/IEC 12207:95, Standard for Information Technology—Software Life Cycle Processes, IEEE Press, 1996.
• 157.
CHAPTER 11
SOFTWARE QUALITY
ACRONYMS
CMMI  Capability Maturity Model Integrated
COTS  Commercial Off-the-Shelf Software
PDCA  Plan, Do, Check, Act
SQA  Software Quality Assurance
SQM  Software Quality Management
TQM  Total Quality Management
V&V  Verification and Validation
INTRODUCTION
What is software quality, and why is it so important that it be pervasive in the SWEBOK Guide? Over the years, authors and organizations have defined the term “quality” differently. To Phil Crosby (Cro79), it was “conformance to user requirements.” Watts Humphrey (Hum89) refers to it as “achieving excellent levels of fitness for use,” while IBM coined the phrase “market-driven quality,” which is based on achieving total customer satisfaction. The Baldrige criteria for organizational quality (NIST03) use a similar phrase, “customer-driven quality,” and include customer satisfaction as a major consideration. More recently, quality has been defined in (ISO9001-00) as “the degree to which a set of inherent characteristics fulfills requirements.”
This chapter deals with software quality considerations which transcend the life cycle processes. Software quality is a ubiquitous concern in software engineering, and so it is also considered in many of the KAs. In summary, the SWEBOK Guide describes a number of ways of achieving software quality. In particular, this KA will cover static techniques, those which do not require the execution of the software being evaluated, while dynamic techniques are covered in the Software Testing KA.
BREAKDOWN OF SOFTWARE QUALITY TOPICS
1. Software Quality Fundamentals
Agreement on quality requirements, as well as clear communication to the software engineer on what constitutes quality, require that the many aspects of quality be formally defined and discussed.
A software engineer should understand the underlying meanings of quality concepts and characteristics and their value to the software under development or to maintenance. The important concept is that the software requirements define the required quality characteristics of the software and influence the measurement methods and acceptance criteria for assessing these characteristics.
1.1. Software Engineering Culture and Ethics
Software engineers are expected to share a commitment to software quality as part of their culture. A healthy software engineering culture is described in [Wie96].
Ethics can play a significant role in software quality, the culture, and the attitudes of software engineers. The IEEE Computer Society and the ACM [IEEE99] have developed a code of ethics and professional practice based on eight principles to help software engineers reinforce attitudes related to quality and to the independence of their work.
1.2. Value and Costs of Quality
[Boe78; NIST03; Pre04; Wei93]
The notion of “quality” is not as simple as it may seem. For any engineered product, there are many desired qualities relevant to a particular perspective of the product, to be discussed and determined at the time that the product requirements are set down. Quality characteristics may be required or not, or may be required to a greater or lesser degree, and trade-offs may be made among them. [Pfl01]
The cost of quality can be differentiated into prevention cost, appraisal cost, internal failure cost, and external failure cost. [Hou99]
A motivation behind a software project is the desire to create software that has value, and this value may or may not be quantified as a cost. The customer will have some maximum cost in mind, in return for which it is expected that the basic purpose of the software will be fulfilled. The customer may also have some expectation as to the quality of the software. Sometimes customers may not have thought through the quality issues or their related costs. Is the characteristic merely decorative, or is it essential to the software? If the answer lies somewhere in between, as is almost always the case, it is a matter of making the customer a part of the decision process and fully aware of both costs and benefits. Ideally, most of these decisions will be made in the software requirements process (see the Software Requirements KA), but these issues may arise throughout the software life cycle. There is no definite rule as to how these decisions should be made, but the software engineer should be able to present quality alternatives and their costs. A discussion concerning cost and the value of quality requirements can be found in [Jon96:c5; Wei96:c11].
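As a purely illustrative aid, not taken from the Guide and using hypothetical figures, the four cost-of-quality categories mentioned above can be tallied to show where a project's quality-related spending goes:

    # Hypothetical figures only: tallying the four cost-of-quality categories.
    costs = {
        "prevention": 12_000,        # e.g., training, process definition
        "appraisal": 18_000,         # e.g., reviews, inspections, testing effort
        "internal failure": 25_000,  # rework on defects found before release
        "external failure": 40_000,  # defects reaching the customer
    }
    total = sum(costs.values())
    for category, amount in costs.items():
        print(f"{category:17s} {amount:7,d}  ({amount / total:5.1%} of cost of quality)")
    print(f"{'total':17s} {total:7,d}")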
• 158.
[Figure 1. Breakdown of topics for the Software Quality KA. The figure shows three subareas: Software Quality Fundamentals (Software Engineering Culture and Ethics; Value and Costs of Quality; Models and Quality Characteristics; Quality Improvement), Software Quality Management Processes (Software Quality Assurance; Verification and Validation; Reviews and Audits), and Practical Considerations (Application Quality Requirements; Defect Characterization; Software Quality Management Techniques; Software Quality Measurement).]
1.3. Models and Quality Characteristics
[Dac01; Kia95; Lap91; Lew92; Mus99; NIST; Pre01; Rak97; Sei02; Wal96]
Terminology for software quality characteristics differs from one taxonomy (or model of software quality) to another, each model perhaps having a different number of hierarchical levels and a different total number of characteristics. Various authors have produced models of software quality characteristics or attributes which can be useful for discussing, planning, and rating the quality of software products. [Boe78; McC77] ISO/IEC has defined three related models of software product quality (internal quality, external quality, and quality in use) (ISO9126-01) and a set of related parts (ISO14598-98).
1.3.1. Software engineering process quality
Software quality management and software engineering process quality have a direct bearing on the quality of the software product.
Models and criteria which evaluate the capabilities of software organizations are primarily project organization and management considerations, and, as such, are covered in the Software Engineering Management and Software Engineering Process KAs.
Of course, it is not possible to completely distinguish the quality of the process from the quality of the product.
Process quality, discussed in the Software Engineering Process KA of this Guide, influences the quality characteristics of software products, which in turn affect quality-in-use as perceived by the customer.
Two important quality standards are TickIT [Llo03] and the ISO9001-00 standard, which has an impact on software quality, along with its guidelines for application to software [ISO90003-04].
Another industry standard on software quality is CMMI [SEI02], also discussed in the Software Engineering Process KA. CMMI intends to provide guidance for improving processes. Specific process areas related to quality
• 159.
management are (a) process and product quality assurance, (b) process verification, and (c) process validation. CMMI classifies reviews and audits as methods of verification, and not as specific processes like (IEEE12207.0-96).
There was initially some debate over whether ISO9001 or CMMI should be used by software engineers to ensure quality. This debate is widely published, and, as a result, the position has been taken that the two are complementary and that having ISO9001 certification can help greatly in achieving the higher maturity levels of the CMMI. [Dac01]
1.3.2. Software product quality
The software engineer needs, first of all, to determine the real purpose of the software. In this regard, it is of prime importance to keep in mind that the customer’s requirements come first and that they include quality requirements, not just functional requirements. Thus, the software engineer has a responsibility to elicit quality requirements which may not be explicit at the outset and to discuss their importance as well as the level of difficulty in attaining them. All processes associated with software quality (for example, building, checking, and improving quality) will be designed with these requirements in mind, and they carry additional costs.
Standard (ISO9126-01) defines, for two of its three models of quality, the related quality characteristics and sub-characteristics, and measures which are useful for assessing software product quality. (Sur03)
The meaning of the term “product” is extended to include any artifact which is the output of any process used to build the final software product. Examples of a product include, but are not limited to, an entire system requirements specification, a software requirements specification for a software component of a system, a design module, code, test documentation, or reports produced as a result of quality analysis tasks. While most treatments of quality are described in terms of the final software and system performance, sound engineering practice requires that intermediate products relevant to quality be evaluated throughout the software engineering process.
1.4. Quality Improvement
[NIST03; Pre04; Wei96]
The quality of software products can be improved through an iterative process of continuous improvement which requires management control, coordination, and feedback from many concurrent processes: (1) the software life cycle processes, (2) the process of error/defect detection, removal, and prevention, and (3) the quality improvement process. (Kin92)
The theory and concepts behind quality improvement, such as building in quality through the prevention and early detection of errors, continuous improvement, and customer focus, are pertinent to software engineering. These concepts are based on the work of experts in quality who have stated that the quality of a product is directly linked to the quality of the process used to create it. (Cro79, Dem86, Jur89)
Approaches such as the Total Quality Management (TQM) process of Plan, Do, Check, and Act (PDCA) are tools by which quality objectives can be met. Management sponsorship supports process and product evaluations and the resulting findings. Then, an improvement program is developed identifying detailed actions and improvement projects to be addressed in a feasible time frame. Management support implies that each improvement project has enough resources to achieve the goal defined for it. Management sponsorship must be solicited frequently by implementing proactive communication activities. The involvement of work groups, as well as middle-management support and resources allocated at project level, are discussed in the Software Engineering Process KA.
2. Software Quality Management Processes
Software quality management (SQM) applies to all perspectives of software processes, products, and resources. It defines processes, process owners, and requirements for those processes, measurements of the process and its outputs, and feedback channels. (Art93)
Software quality management processes consist of many activities. Some may find defects directly, while others indicate where further examination may be valuable. The former are also referred to as direct-defect-finding activities. Many activities often serve as both.
Planning for software quality involves:
(1) Defining the required product in terms of its quality characteristics (described in more detail in, for instance, the Software Engineering Management KA).
(2) Planning the processes to achieve the required product (described in, for instance, the Software Design and the Software Construction KAs).
These aspects differ from, for instance, the planning SQM processes themselves, which assess planned quality characteristics versus actual implementation of those plans. The software quality management processes must address how well software products will, or do, satisfy customer and stakeholder requirements, provide value to the customers and other stakeholders, and provide the software quality needed to meet software requirements.
SQM can be used to evaluate the intermediate products as well as the final product.
Some of the specific SQM processes are defined in standard (IEEE12207.0-96):
Quality assurance process
Verification process
Validation process
Review process
• 160.
Audit process
These processes encourage quality and also find possible problems. But they differ somewhat in their emphasis.
SQM processes help ensure better software quality in a given project. They also provide, as a by-product, general information to management, including an indication of the quality of the entire software engineering process. The Software Engineering Process and Software Engineering Management KAs discuss quality programs for the organization developing the software. SQM can provide relevant feedback for these areas.
SQM processes consist of tasks and techniques to indicate how software plans (for example, management, development, configuration management) are being implemented and how well the intermediate and final products are meeting their specified requirements. Results from these tasks are assembled in reports for management before corrective action is taken. The management of an SQM process is tasked with ensuring that the results of these reports are accurate.
As described in this KA, SQM processes are closely related; they can overlap and are sometimes even combined. They seem largely reactive in nature because they address the processes as practiced and the products as produced; but they have a major role at the planning stage in being proactive in terms of the processes and procedures needed to attain the quality characteristics and degree of quality needed by the stakeholders in the software.
Risk management can also play an important role in delivering quality software. Incorporating disciplined risk analysis and management techniques into the software life cycle processes can increase the potential for producing a quality product (Cha89). Refer to the Software Engineering Management KA for related material on risk management.
2.1. Software Quality Assurance
[Ack02; Ebe94; Fre98; Gra92; Hor03; Pfl01; Pre04; Rak97; Sch99; Som05; Voa99; Wal89; Wal96]
SQA processes provide assurance that the software products and processes in the project life cycle conform to their specified requirements by planning, enacting, and performing a set of activities to provide adequate confidence that quality is being built into the software. This means ensuring that the problem is clearly and adequately stated and that the solution’s requirements are properly defined and expressed. SQA seeks to maintain the quality throughout the development and maintenance of the product by the execution of a variety of activities at each stage which can result in early identification of problems, an almost inevitable feature of any complex activity. The role of SQA with respect to process is to ensure that planned processes are appropriate and later implemented according to plan, and that relevant measurement processes are provided to the appropriate organization.
The SQA plan defines the means that will be used to ensure that software developed for a specific product satisfies the user’s requirements and is of the highest quality possible within project constraints. In order to do so, it must first ensure that the quality target is clearly defined and understood. It must consider management, development, and maintenance plans for the software. Refer to standard (IEEE730-98) for details.
The specific quality activities and tasks are laid out, with their costs and resource requirements, their overall management objectives, and their schedule in relation to those objectives in the software engineering management, development, or maintenance plans. The SQA plan should be consistent with the software configuration management plan (refer to the Software Configuration Management KA). The SQA plan identifies documents, standards, practices, and conventions governing the project and how they will be checked and monitored to ensure adequacy and compliance. The SQA plan also identifies measures, statistical techniques, procedures for problem reporting and corrective action, resources such as tools, techniques, and methodologies, security for physical media, training, and SQA reporting and documentation. Moreover, the SQA plan addresses the software quality assurance activities of any other type of activity described in the software plans, such as procurement of supplier software to the project or commercial off-the-shelf software (COTS) installation, and service after delivery of the software. It can also contain acceptance criteria as well as reporting and management activities which are critical to software quality.
2.2. Verification & Validation
[Fre98; Hor03; Pfl01; Pre04; Som05; Wal89; Wal96]
For purposes of brevity, Verification and Validation (V&V) are treated as a single topic in this Guide rather than as two separate topics as in the standard (IEEE12207.0-96). “Software V&V is a disciplined approach to assessing software products throughout the product life cycle. A V&V effort strives to ensure that quality is built into the software and that the software satisfies user requirements” (IEEE1059-93).
V&V addresses software product quality directly and uses testing techniques which can locate defects so that they can be addressed. It also assesses the intermediate products, however, and, in this capacity, the intermediate steps of the software life cycle processes.
The V&V process determines whether or not products of a given development or maintenance activity conform to the requirement of that activity, and whether or not the final software product fulfills its intended purpose and meets user requirements. Verification is an attempt to ensure that the product is built correctly, in the sense that the output products of an activity meet the specifications
• 161.
imposed on them in previous activities. Validation is an attempt to ensure that the right product is built, that is, the product fulfills its specific intended purpose. Both the verification process and the validation process begin early in the development or maintenance phase. They provide an examination of key product features in relation both to the product’s immediate predecessor and to the specifications it must meet.
The purpose of planning V&V is to ensure that each resource, role, and responsibility is clearly assigned. The resulting V&V plan documents and describes the various resources and their roles and activities, as well as the techniques and tools to be used. An understanding of the different purposes of each V&V activity will help in the careful planning of the techniques and resources needed to fulfill their purposes. Standards (IEEE1012-98:s7 and IEEE1059-93: Appendix A) specify what ordinarily goes into a V&V plan.
The plan also addresses the management, communication, policies, and procedures of the V&V activities and their interaction, as well as defect reporting and documentation requirements.
2.3. Reviews and Audits
For purposes of brevity, reviews and audits are treated as a single topic in this Guide, rather than as two separate topics as in (IEEE12207.0-96). The review and audit process is broadly defined in (IEEE12207.0-96) and in more detail in (IEEE1028-97). Five types of reviews or audits are presented in the IEEE1028-97 standard:
Management reviews
Technical reviews
Inspections
Walk-throughs
Audits
2.3.1. Management reviews
“The purpose of a management review is to monitor progress, determine the status of plans and schedules, confirm requirements and their system allocation, or evaluate the effectiveness of management approaches used to achieve fitness for purpose” [IEEE1028-97]. They support decisions about changes and corrective actions that are required during a software project. Management reviews determine the adequacy of plans, schedules, and requirements and monitor their progress or inconsistencies. These reviews may be performed on products such as audit reports, progress reports, V&V reports, and plans of many types, including risk management, project management, software configuration management, software safety, and risk assessment, among others. Refer to the Software Engineering Management and to the Software Configuration Management KAs for related material.
2.3.2. Technical reviews
[Fre98; Hor03; Lew92; Pfl01; Pre04; Som05; Voa99; Wal89; Wal96]
“The purpose of a technical review is to evaluate a software product to determine its suitability for its intended use. The objective is to identify discrepancies from approved specifications and standards. The results should provide management with evidence confirming (or not) that the product meets the specifications and adheres to standards, and that changes are controlled” (IEEE1028-97).
Specific roles must be established in a technical review: a decision-maker, a review leader, a recorder, and technical staff to support the review activities. A technical review requires that mandatory inputs be in place in order to proceed:
Statement of objectives
A specific software product
The specific project management plan
The issues list associated with this product
The technical review procedure
The team follows the review procedure. A technically qualified individual presents an overview of the product, and the examination is conducted during one or more meetings. The technical review is completed once all the activities listed in the examination have been completed.
2.3.3. Inspections
[Ack02; Fre98; Gil93; Rad02; Rak97]
“The purpose of an inspection is to detect and identify software product anomalies” (IEEE1028-97). Two important differentiators of inspections as opposed to reviews are as follows:
1. An individual holding a management position over any member of the inspection team shall not participate in the inspection.
2. An inspection is to be led by an impartial facilitator who is trained in inspection techniques.
Software inspections always involve the author of an intermediate or final product, while other reviews might not. Inspections also include an inspection leader, a recorder, a reader, and a few (2 to 5) inspectors. The members of an inspection team may possess different expertise, such as domain expertise, design method expertise, or language expertise. Inspections are usually conducted on one relatively small section of the product at a time. Each team member must examine the software product and other review inputs prior to the review
• 162.
meeting, perhaps by applying an analytical technique (refer to section 3.3.3) to a small section of the product, or to the entire product with a focus only on one aspect, for example, interfaces. Any anomaly found is documented and sent to the inspection leader. During the inspection, the inspection leader conducts the session and verifies that everyone has prepared for the inspection. A checklist, with anomalies and questions germane to the issues of interest, is a common tool used in inspections. The resulting list often classifies the anomalies (refer to IEEE1044-93 for details) and is reviewed for completeness and accuracy by the team. The inspection exit decision must correspond to one of the following three criteria:
1. Accept with no or at most minor reworking
2. Accept with rework verification
3. Reinspect
Inspection meetings typically last a few hours, whereas technical reviews and audits are usually broader in scope and take longer.
2.3.4. Walk-throughs
[Fre98; Hor03; Pfl01; Pre04; Som05; Wal89; Wal96]
“The purpose of a walk-through is to evaluate a software product. A walk-through may be conducted for the purpose of educating an audience regarding a software product.” (IEEE1028-97) The major objectives are to [IEEE1028-97]:
Find anomalies
Improve the software product
Consider alternative implementations
Evaluate conformance to standards and specifications
The walk-through is similar to an inspection but is typically conducted less formally. The walk-through is primarily organized by the software engineer to give his teammates the opportunity to review his work, as an assurance technique.
2.3.5. Audits
[Fre98; Hor03; Pfl01; Pre01; Som05; Voa99; Wal89; Wal96]
“The purpose of a software audit is to provide an independent evaluation of the conformance of software products and processes to applicable regulations, standards, guidelines, plans, and procedures” [IEEE1028-97]. The audit is a formally organized activity, with participants having specific roles, such as lead auditor, another auditor, a recorder, or an initiator, and includes a representative of the audited organization. The audit will identify instances of nonconformance and produce a report requiring the team to take corrective action.
While there may be many formal names for reviews and audits such as those identified in the standard (IEEE1028-97), the important point is that they can occur on almost any product at any stage of the development or maintenance process.
3. Practical Considerations
3.1. Software Quality Requirements
[Hor03; Lew92; Rak97; Sch99; Wal89; Wal96]
3.1.1. Influence factors
Various factors influence planning, management, and selection of SQM activities and techniques, including:
The domain of the system in which the software will reside (safety-critical, mission-critical, business-critical)
System and software requirements
The commercial (external) or standard (internal) components to be used in the system
The specific software engineering standards applicable
The methods and software tools to be used for development and maintenance and for quality evaluation and improvement
The budget, staff, project organization, plans, and scheduling of all the processes
The intended users and use of the system
The integrity level of the system
Information on these factors influences how the SQM processes are organized and documented, how specific SQM activities are selected, what resources are needed, and which will impose bounds on the efforts.
3.1.2. Dependability
In cases where system failure may have extremely severe consequences, overall dependability (hardware, software, and human) is the main quality requirement over and above basic functionality. Software dependability includes such characteristics as fault tolerance, safety, security, and usability. Reliability is also a criterion which can be defined in terms of dependability (ISO9126).
The body of literature for systems that must be highly dependable (“high confidence” or “high integrity” systems) has imported terminology from traditional mechanical and electrical systems, which may not include software, for discussing threats or hazards, risks, system integrity, and related concepts; this terminology may be found in the references cited for this section.
• 163.
3.1.3. Integrity levels of software
The integrity level is determined based on the possible consequences of failure of the software and the probability of failure. For software in which safety or security is important, techniques such as hazard analysis for safety or threat analysis for security may be used to develop a planning activity which would identify where potential trouble spots lie. The failure history of similar software may also help in identifying which techniques will be most useful in detecting faults and assessing quality. Integrity levels (for example, gradation of integrity) are proposed in (IEEE1012-98).
3.2. Defect Characterization
[Fri95; Hor03; Lew92; Rub94; Wak99; Wal89]
SQM processes find defects. Characterizing those defects leads to an understanding of the product, facilitates corrections to the process or the product, and informs project management or the customer of the status of the process or product. Many defect (fault) taxonomies exist, and, while attempts have been made to gain consensus on a fault and failure taxonomy, the literature indicates that there are quite a few in use [Bei90, Chi96, Gra92], (IEEE1044-93). Defect (anomaly) characterization is also used in audits and reviews, with the review leader often presenting a list of anomalies provided by team members for consideration at a review meeting.
As new design methods and languages evolve, along with advances in overall software technologies, new classes of defects appear, and a great deal of effort is required to interpret previously defined classes. When tracking defects, the software engineer is interested in not only the number of defects but also the types. Information alone, without some classification, is not really of any use in identifying the underlying causes of the defects, since specific types of problems need to be grouped together in order for determinations to be made about them. The point is to establish a defect taxonomy that is meaningful to the organization and to the software engineers.
SQM discovers information at all stages of software development and maintenance. Typically, where the word “defect” is used, it refers to a “fault” as defined below. However, different cultures and standards may use somewhat different meanings for these terms, which have led to attempts to define them. Partial definitions taken from standard (IEEE610.12-90) are:
Error: “A difference…between a computed result and the correct result”
Fault: “An incorrect step, process, or data definition in a computer program”
Failure: “The [incorrect] result of a fault”
Mistake: “A human action that produces an incorrect result”
(These terms are related in a short illustrative sketch following section 3.3.2 below.)
Failures found in testing as a result of software faults are included as defects in the discussion in this section. Reliability models are built from failure data collected during software testing or from software in service, and thus can be used to predict future failures and to assist in decisions on when to stop testing. [Mus89]
One probable action resulting from SQM findings is to remove the defects from the product under examination. Other actions enable the achievement of full value from the findings of SQM activities. These actions include analyzing and summarizing the findings, and using measurement techniques to improve the product and the process as well as to track the defects and their removal. Process improvement is primarily discussed in the Software Engineering Process KA, with the SQM process being a source of information.
Data on the inadequacies and defects found during the implementation of SQM techniques may be lost unless they are recorded. For some techniques (for example, technical reviews, audits, inspections), recorders are present to set down such information, along with issues and decisions. When automated tools are used, the tool output may provide the defect information. Data about defects may be collected and recorded on an SCR (software change request) form and may subsequently be entered into some type of database, either manually or automatically, from an analysis tool. Reports about defects are provided to the management of the organization.
3.3. Software Quality Management Techniques
[Bas94; Bei90; Con86; Chi96; Fen97; Fri95; Lev95; Mus89; Pen93; Sch99; Wak99; Wei93; Zel98]
SQM techniques can be categorized in many ways: static, people-intensive, analytical, dynamic.
3.3.1. Static techniques
Static techniques involve examination of the project documentation and software, and other information about the software products, without executing them. These techniques may include people-intensive activities (as defined in 3.3.2) or analytical activities (as defined in 3.3.3) conducted by individuals, with or without the assistance of automated tools.
3.3.2. People-intensive techniques
The setting for people-intensive techniques, including reviews and audits, may vary from a formal meeting to an informal gathering or a desk-check situation, but (usually, at least) two or more people are involved. Preparation ahead of time may be necessary. Resources other than the items under examination may include checklists and results from analytical techniques and testing. These activities are discussed in (IEEE1028-97) on reviews and audits, and in [Fre98, Hor03] and [Jon96, Rak97].
• 164.
3.3.3. Analytical techniques
A software engineer generally applies analytical techniques. Sometimes several software engineers use the same technique, but each applies it to different parts of the product. Some techniques are tool-driven; others are manual. Some may find defects directly, but they are typically used to support other techniques. Some also include various assessments as part of overall quality analysis. Examples of such techniques include complexity analysis, control flow analysis, and algorithmic analysis.
Each type of analysis has a specific purpose, and not all types are applied to every project. An example of a support technique is complexity analysis, which is useful for determining whether or not the design or implementation is too complex to develop correctly, to test, or to maintain. The results of a complexity analysis may also be used in developing test cases. Defect-finding techniques, such as control flow analysis, may also be used to support another activity. For software with many algorithms, algorithmic analysis is important, especially when an incorrect algorithm could cause a catastrophic result. There are too many analytical techniques to list them all here. The list and references provided may offer insights into the selection of a technique, as well as suggestions for further reading.
Other, more formal, types of analytical techniques are known as formal methods. They are used to verify software requirements and designs. Proof of correctness applies to critical parts of software. They have mostly been used in the verification of crucial parts of critical systems, such as specific security and safety requirements. (Nas97)
3.3.4. Dynamic techniques
Different kinds of dynamic techniques are performed throughout the development and maintenance of software. Generally, these are testing techniques, but techniques such as simulation, model checking, and symbolic execution may be considered dynamic. Code reading is considered a static technique, but experienced software engineers may execute the code as they read through it. In this sense, code reading may also qualify as a dynamic technique. This discrepancy in categorizing indicates that people with different roles in the organization may consider and apply these techniques differently.
Some testing may thus be performed in the development process, SQA process, or V&V process, again depending on project organization. Because SQM plans address testing, this section includes some comments on testing. The Software Testing KA provides discussion and technical references to theory, techniques for testing, and automation.
3.3.5. Testing
The assurance processes described in SQA and V&V examine every output relative to the software requirement specification to ensure the output’s traceability, consistency, completeness, correctness, and performance. This confirmation also includes the outputs of the development and maintenance processes, collecting, analyzing, and measuring the results. SQA ensures that appropriate types of tests are planned, developed, and implemented, and V&V develops test plans, strategies, cases, and procedures.
Testing is discussed in detail in the Software Testing KA.
Two types of testing may fall under the headings SQA and V&V, because of their responsibility for the quality of the materials used in the project:
Evaluation and test of tools to be used on the project (IEEE1462-98)
Conformance test (or review of conformance test) of components and COTS products to be used in the product; there now exists a standard for software packages (IEEE1465-98)
Sometimes an independent V&V organization may be asked to monitor the test process and sometimes to witness the actual execution to ensure that it is conducted in accordance with specified procedures. Again, V&V may be called upon to evaluate the testing itself: adequacy of plans and procedures, and adequacy and accuracy of results.
Another type of testing that may fall under the heading of V&V organization is third-party testing. The third party is not the developer, nor is it in any way associated with the development of the product. Instead, the third party is an independent facility, usually accredited by some body of authority. Their purpose is to test a product for conformance to a specific set of requirements.
3.4. Software Quality Measurement
[Gra92]
The models of software product quality often include measures to determine the degree of each quality characteristic attained by the product.
If they are selected properly, measures can support software quality (among other aspects of the software life cycle processes) in multiple ways. They can help in the management decision-making process. They can find problematic areas and bottlenecks in the software process; and they can help the software engineers assess the quality of their work for SQA purposes and for longer-term process quality improvement.
With the increasing sophistication of software, questions of quality go beyond whether or not the software works to how well it achieves measurable quality goals.
There are a few more topics where measurement supports SQM directly. These include assistance in deciding when to stop testing. For this, reliability models and benchmarks, both using fault and failure data, are useful.
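To make the complexity analysis mentioned in section 3.3.3 a little more concrete, here is a minimal, simplified sketch; it is not from the Guide or its references. It approximates McCabe's cyclomatic complexity of a Python function by counting branching constructs in its source and adding one; a production tool would count decision points more carefully.

    # Illustrative sketch: a much-simplified complexity analysis in the spirit
    # of section 3.3.3 (decisions + 1, counted from the abstract syntax tree).
    import ast

    SOURCE = """
    def classify(n):
        if n < 0:
            return "negative"
        elif n == 0:
            return "zero"
        return "positive"
    """

    BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp, ast.IfExp)

    def cyclomatic_complexity(source: str) -> int:
        """Estimate complexity as the number of branching nodes plus one."""
        import textwrap
        tree = ast.parse(textwrap.dedent(source))
        decisions = sum(isinstance(node, BRANCH_NODES) for node in ast.walk(tree))
        return decisions + 1

    print(cyclomatic_complexity(SOURCE))  # 3 under this counting scheme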
• 165.
The cost of SQM processes is an issue which is almost always raised in deciding how a project should be organized. Often, generic models of cost are used, which are based on when a defect is found and how much effort it takes to fix the defect relative to finding the defect earlier in the development process. Project data may give a better picture of cost. Discussion on this topic can be found in [Rak97: pp. 39-50]. Related information can be found in the Software Engineering Process and Software Engineering Management KAs.
Finally, the SQM reports themselves provide valuable information not only on these processes, but also on how all the software life cycle processes can be improved. Discussions on these topics are found in [McC04] and (IEEE1012-98).
While the measures for quality characteristics and product features may be useful in themselves (for example, the number of defective requirements or the proportion of defective requirements), mathematical and graphical techniques can be applied to aid in the interpretation of the measures. These fit into the following categories and are discussed in [Fen97, Jon96, Kan02, Lyu96, Mus99].
Statistically based (for example, Pareto analysis, run charts, scatter plots, normal distribution)
Statistical tests (for example, the binomial test, chi-squared test)
Trend analysis
Prediction (for example, reliability models)
The statistically based techniques and tests often provide a snapshot of the more troublesome areas of the software product under examination. The resulting charts and graphs are visualization aids which the decision-makers can use to focus resources where they appear most needed. Results from trend analysis may indicate that a schedule has not been respected, such as in testing, or that certain classes of faults will become more intense unless some corrective action is taken in development. The predictive techniques assist in planning test time and in predicting failure. More discussion on measurement in general appears in the Software Engineering Process and Software Engineering Management KAs. More specific information on testing measurement is presented in the Software Testing KA.
References [Fen97, Jon96, Kan02, Pfl01] provide discussion on defect analysis, which consists of measuring defect occurrences and then applying statistical methods to understanding the types of defects that occur most frequently, that is, answering questions in order to assess their density. They also aid in understanding the trends and how well detection techniques are working, and how well the development and maintenance processes are progressing. Measurement of test coverage helps to estimate how much test effort remains to be done, and to predict possible remaining defects. From these measurement methods, defect profiles can be developed for a specific application domain. Then, for the next software system within that organization, the profiles can be used to guide the SQM processes, that is, to expend the effort where the problems are most likely to occur. Similarly, benchmarks, or defect counts typical of that domain, may serve as one aid in determining when the product is ready for delivery.
Discussion on using data from SQM to improve development and maintenance processes appears in the Software Engineering Management and the Software Engineering Process KAs.
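A minimal sketch of the Pareto-style defect analysis discussed above; the category names and counts below are invented for illustration and are not from the Guide or its references.

    # Illustrative sketch with hypothetical counts: a simple Pareto-style
    # summary of defect categories, of the kind used to focus quality effort
    # where problems are most likely to occur.
    from collections import Counter

    defects = Counter({
        "interface": 42,
        "logic": 31,
        "data handling": 14,
        "documentation": 8,
        "performance": 5,
    })

    total = sum(defects.values())
    cumulative = 0
    for category, count in defects.most_common():
        cumulative += count
        print(f"{category:15s} {count:3d}  cumulative {cumulative / total:5.1%}")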
• 166.
MATRIX OF TOPICS VS. REFERENCE MATERIAL
Reference columns: [Boe78] [Dac01] [Hou99] [IEEE99] [ISO9001-00] [ISO90003-04] [Jon96] [Kia95] [Lap91] [Lew92] [Llo03] [McC77] [Mus99] [NIST03] [Pfl01] [Pre04] [Rak97] [Sei02] [Wal96] [Wei93] [Wei96]
1. Software Quality Fundamentals
1.1 Software Engineering Culture and Ethics: * *
1.2 Value and Cost of Quality: * * * * * * * *
1.3 Models and Quality Characteristics: * * * * * * * * * * * * * * *
1.4 Software Quality Improvement: * *