Software Implementation & Testing: Presentation Transcript

  • 1. Software Implementation & Testing
  • 2. Software Implementation
      • Six major activities
        – Coding
        – Testing
        – Installation
        – Documentation
        – Training
        – Support
      • Purpose
        – To convert final physical system specifications into working and reliable software
        – To document the work that has been done
        – To provide help for current and future users
  • 3. The Process of Coding, Testing and Installation
      • Coding
        – Physical design specifications are turned into working computer code
      • Testing
        – Tests are performed using various strategies
        – Testing can be performed in parallel with coding
      • Installation
        – The process during which the current system is replaced by the new system
  • 4. The Process of Documenting the System, Training Users and Supporting Users
      • Two audiences for documentation
        – The personnel who will maintain the system throughout its productive life
        – The people who will use the system as part of their daily lives
      • Deliverables
        – Documentation
          • System documentation
          • User documentation
        – User training plan
          • Classes
          • Tutorials
        – User training modules
          • Training materials
          • Computer-based training aids
        – User support plan
          • Help desk
          • On-line help
          • Bulletin boards and other support mechanisms
  • 5. Software Application Testing
      • A test plan is developed during the analysis phase
      • During the design phase, a unit test plan and a system test plan are developed
      • The actual testing is done during implementation
      • Test plans provide improved communication among all parties involved in testing
        – They serve as checklists
  • 6. The Testing Process
      [Flow diagram: design test cases → prepare test data → run program with test data → compare results to test cases; artifacts produced along the way: test cases, test data, test results, test reports]
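
    A minimal sketch of that run-and-compare loop in Python; the function under test and the test data are illustrative assumptions, not from the slides:

    ```python
    def price_with_tax(price: float, rate: float = 0.20) -> float:
        """Hypothetical system under test: add sales tax to a price."""
        return round(price * (1 + rate), 2)

    # Test cases pair prepared test data with expected results.
    test_cases = [
        ((100.0,), 120.0),
        ((0.0,), 0.0),
        ((19.99,), 23.99),
    ]

    # Run the program with the test data, compare results to the test
    # cases, and emit a simple test report.
    for args, expected in test_cases:
        actual = price_with_tax(*args)
        status = "PASS" if actual == expected else "FAIL"
        print(f"{status}: price_with_tax{args} = {actual} (expected {expected})")
    ```
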
  • 7. Software Application Testing: Types of Testing
      • Inspection
        – A testing technique in which participants examine program code for predictable language-specific errors
      • Walkthrough
        – A peer group review of any product created during the systems development process; also called a structured walkthrough
      • Desk Checking
        – A testing technique in which the program code is sequentially executed manually by the reviewer
  • 8. Software Application Testing: Types of Testing
      • Unit Testing
        – Each module is tested alone in an attempt to discover any errors in its code; also called module testing
      • Integration Testing
        – The process of bringing together all of the modules that a program comprises for testing purposes; modules are typically integrated in a top-down, incremental fashion
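
    For illustration, a minimal unit test of a single module in isolation, using Python's built-in unittest; the module under test (a word counter) is an invented example:

    ```python
    import unittest

    def count_words(text: str) -> int:
        """Module under test: count whitespace-separated words."""
        return len(text.split())

    class TestCountWords(unittest.TestCase):
        def test_simple_sentence(self):
            self.assertEqual(count_words("the quick brown fox"), 4)

        def test_empty_string(self):
            self.assertEqual(count_words(""), 0)

        def test_extra_whitespace(self):
            self.assertEqual(count_words("  spaced   out  "), 2)

    if __name__ == "__main__":
        unittest.main()
    ```
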
  • 9. Software Application Testing: Types of Testing
      • System Testing
        – The bringing together of all the programs that a system comprises for testing purposes; programs are typically integrated in a top-down, incremental fashion
      • Stub Testing
        – A technique used in testing, especially where modules are written and tested in a top-down fashion, in which a few lines of code are used to substitute for subordinate modules
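
    A sketch of stub testing under top-down development: the top-level module is exercised while a few lines of code stand in for a subordinate module that has not been written yet. All names here are hypothetical:

    ```python
    def fetch_exchange_rate_stub(currency: str) -> float:
        """Stub for a not-yet-written rate-lookup module: returns a
        canned value so the caller can be tested before it exists."""
        return 1.10  # fixed rate is sufficient for top-down testing

    def convert(amount: float, currency: str,
                rate_lookup=fetch_exchange_rate_stub) -> float:
        """Top-level module under test; depends on a subordinate lookup."""
        return round(amount * rate_lookup(currency), 2)

    # Exercises convert() through the stub.
    assert convert(100.0, "EUR") == 110.0
    ```
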
  • 10. Software Application Testing: The Testing Process
      1. The purpose of the testing is to confirm that the system satisfies requirements
      2. Testing must be planned
      • Test Case
        – A specific scenario of transactions, queries or navigation paths that represents a typical, critical or abnormal use of the system
        – Test cases and results should be thoroughly documented so they can be repeated for each revision of an application
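
    One possible way to document test cases so they can be repeated for each revision is to record them as structured data; the fields below are an illustrative assumption, not a prescribed format:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class TestCase:
        case_id: str
        scenario: str          # transactions, queries, or navigation path
        category: str          # "typical", "critical", or "abnormal"
        expected: str
        results: list = field(default_factory=list)  # one entry per revision

    tc = TestCase(
        case_id="TC-017",
        scenario="Submit order with an empty shopping cart",
        category="abnormal",
        expected="Order rejected with a validation message",
    )
    tc.results.append(("v1.2", "pass"))  # documented result for revision 1.2
    ```
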
  • 11. Black-box Testing
      • An approach to testing in which the program is considered as a 'black box'
      • The program test cases are based on the system specification
      • Test planning can begin early in the software process
  • 12. Black-box Testing
      [Diagram: input test data Ie → System → output test results Oe; inputs causing anomalous behaviour produce outputs that reveal the presence of defects]
  • 13. Equivalence Partitioning
      • Input data and output results often fall into different classes where all members of a class are related
      • Each of these classes is an equivalence partition where the program behaves in an equivalent way for each class member
      • Test cases should be chosen from each partition
  • 14. Equivalence Partitioning
      [Diagram: the input space is divided into invalid and valid input partitions, each mapped by the system to a class of outputs]
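
    A minimal sketch of equivalence partitioning, assuming a hypothetical routine that validates exam marks in the range 0-100. The partitions are invalid-below-range, valid, and invalid-above-range; representatives are drawn from each, including the boundaries:

    ```python
    def is_valid_mark(mark: int) -> bool:
        """Hypothetical routine under test: accept marks 0..100."""
        return 0 <= mark <= 100

    # At least one test case per equivalence partition, plus boundaries.
    partition_cases = {
        "invalid, below range": [(-5, False), (-1, False)],
        "valid range":          [(0, True), (50, True), (100, True)],
        "invalid, above range": [(101, False), (150, False)],
    }

    for partition, cases in partition_cases.items():
        for mark, expected in cases:
            assert is_valid_mark(mark) == expected, (partition, mark)
    ```
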
  • 15. Structural Testing
      • Sometimes called white-box testing
      • Derivation of test cases according to program structure; knowledge of the program is used to identify additional test cases
      • The objective is to exercise all program statements (not all path combinations)
  • 16. White-box Testing
      [Diagram: the component code is used to derive test data; running the tests produces test outputs]
  • 17. Path Testing
      • The objective of path testing is to ensure that the set of test cases is such that each path through the program is executed at least once
      • The starting point for path testing is a program flow graph, which shows nodes representing program decisions and arcs representing the flow of control
      • Statements with conditions are therefore nodes in the flow graph
  • 18. Program Flow Graphs
      • Describe the program control flow; each branch is shown as a separate path, and loops are shown by arrows looping back to the loop condition node
      • Used as a basis for computing the cyclomatic complexity
      • Cyclomatic complexity = Number of edges - Number of nodes + 2 (equivalently, the number of regions in the graph)
  • 19. Binary Search Flow Graph
      [Flow graph of binary search with nodes 1-9; decision nodes: while bottom <= top, if (elemArray[mid] == key), if (elemArray[mid] < key); the loop exits when bottom > top]
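
    For reference, a Python rendering of the binary search whose flow graph the slide sketches, followed by the slide's complexity formula. The node count is read off the graph; eleven edges is an assumption consistent with its three decision nodes:

    ```python
    def binary_search(elem_array, key):
        bottom, top = 0, len(elem_array) - 1
        while bottom <= top:                # decision node
            mid = (bottom + top) // 2
            if elem_array[mid] == key:      # decision node
                return mid
            if elem_array[mid] < key:       # decision node
                bottom = mid + 1
            else:
                top = mid - 1
        return -1                           # reached when bottom > top

    # Cyclomatic complexity by the slide's formula:
    edges, nodes = 11, 9
    print(edges - nodes + 2)  # 4 -> at least 4 independent paths to test
    ```
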
  • 20. Software Application Testing: Acceptance Testing by Users
      • The process whereby actual users test a completed software system, the end result of which is the users' acceptance of it
      • Alpha Testing
        – User testing of a completed software system using simulated data
        – Recovery testing
          • Forces the software (or environment) to fail in order to verify that recovery is properly performed
        – Security testing
          • Verifies that protection mechanisms built into the system will protect it from improper penetration
        – Stress testing
          • Tries to break the system
        – Performance testing
          • Determines how the system performs on the range of possible environments in which it may be used
  • 21. Software Application Testing: Acceptance Testing by Users
      • Beta Testing
        – User testing of a completed information system using real data in the real user environment
  • 22. Verification vs. Validation
      • Verification: "Are we building the product right?"
        – The software should conform to its specification
      • Validation: "Are we building the right product?"
        – The software should do what the user really requires
  • 23. Installation
      • The organizational process of changing over from the current information system to a new one
      • Four approaches
        – Direct Installation
          • Changing over from the old information system to a new one by turning off the old system when the new one is turned on
        – Parallel Installation
          • Running the old information system and the new one at the same time until management decides the old system can be turned off
  • 24. Installation
        – Single-location Installation
          • Trying out an information system at one site and using the experience to decide if and how the new system should be deployed throughout the organization
        – Phased Installation
          • Changing from the old information system to the new one incrementally, starting with one or a few functional components and then gradually extending the installation to cover the whole new system
  • 25. Planning Installation
      • Considerations
        – Data conversion
          • Error correction
          • Loading from the current system
        – Planned system shutdown
        – Business cycle of the organization
  • 26. Documenting the System
      • System documentation
        – Detailed information about a system's design specifications, its internal workings and its functionality
        – Internal documentation
          • System documentation that is part of the program source code or is generated at compile time
        – External documentation
          • System documentation that includes the outcome of structured diagramming techniques such as data flow and entity-relationship diagrams
  • 27. Documenting the System
      • User Documentation
        – Written or other visual information about an application system, how it works, and how to use it
      • Preparing user documentation
        – The traditional source has been the information systems department
        – Application-oriented documentation is now often supplied by vendors and by users themselves
  • 28. Standard Documentation
      Specific Area                  Standard Developer   Standard Number
      Quality plans                  IEEE, IEEE/EIA       730, 730.1, 1498/IS 640
      Requirements specifications    IEEE, ISO/IEC        830, 12207
      Design specifications          IEEE, ISO/IEC        1016, 1016.1, 12207
      User documentation             IEEE                 1063
  • 29. Training Information System Users
      • Potential training topics
        – Use of the system
        – General computer concepts
        – Information system concepts
        – Organizational concepts
        – System management
        – System installation
  • 30. Training Information System Users
      • Training methods
        – Resident expert
        – Computer-aided instruction
        – Formal courses
        – Software help components
        – Tutorials
        – Interactive training manuals
        – External sources, such as vendors
      • Electronic performance support system (EPSS)
        – A component of a software package or application in which training and educational information is embedded
  • 31. Supporting Information System Users
      • Support is extremely important to users
        – A J.D. Power and Associates survey found user support to be the number-one criterion contributing to user satisfaction with personal computing
      • Most organizations provide support by two means
        – Information center
        – Help desk
  • 32. Supporting Information System Users: Information Center
      • An organizational unit whose mission is to support users in exploiting information technology
      • Staff might perform the following tasks
        – Install new hardware or software and set up user accounts
        – Consult with users writing programs in fourth-generation languages
        – Extract data from organizational databases onto personal computers
        – Answer basic on-demand questions
        – Provide a demonstration site for viewing hardware and software
        – Work with users to submit system change requests
  • 33. Supporting Information System Users: Help Desk
      • A single point of contact for all user inquiries and problems about a particular information system, or for all users in a particular department
  • 34. Why Implementation Sometimes Fails
      • Two conditions are necessary for a successful implementation
        – Management support of the system under development
        – Involvement of users in the development process
  • 35. Why Implementation Sometimes Fails
      • Insights about the implementation process
        – Risk
        – Commitment to the project
        – Commitment to change
        – Extent of project definition and planning
        – Realistic user expectations
      • Implementation success factors
        – Extent to which the system is used
        – Users' satisfaction with the system
  • 36. Electronic Commerce Application: Pine Valley Furniture
      • System implementation and operation of an Internet-based electronic commerce project is no different from other projects
      • Develop test cases
        – Simple functionality
        – Multiple functionality
        – Function chains
        – Elective function
        – Emergency/crisis
      • Bug tracking and system evolution
      • Alpha and beta testing the WebStore
      • WebStore installation
  • 37. Project Close-Down
      • Evaluate the team
        – Reassign members to other projects
      • Notify all affected parties that the development project is ending and that you are switching to operation and maintenance mode
      • Conduct post-project reviews
      • Close out the customer contract
        – Formal signoff
  • 38. Software Quality
      • What is software quality and how do we measure it?
        – Customer's viewpoint: meets specifications
        – Developer's viewpoint: easy to maintain, test, ...
      • Other attributes of software that affect its quality:
        – safety, security, reliability, resilience, robustness
        – understandability, testability, adaptability, modularity, complexity
        – portability, usability, reusability, efficiency, learnability
      • We need to select critical quality attributes early in the development process and plan for how to achieve them
      • Software quality is not just about meeting specifications and removing defects!
  • 39. Principles of SQA
      1. We have a set of standards and quality attributes that a software product must meet.
         There is a goal to achieve.
      2. We can measure the quality of a software product.
         There is a way to determine how well the product conforms to the standards and the quality attributes.
      3. We track the values of the quality attributes.
         It is possible to assess how well we are doing.
      4. We use information about software quality to improve the quality of future software products.
         There is feedback into the software development process.
  • 40. Software Standards
      • We need software standards because they:
        1. Encapsulate best (or most appropriate) practices
           – acquired after much trial and error, which helps avoid previous mistakes
        2. Provide a framework around which to implement the SQA process
           – ensures that best practices are properly followed
        3. Assist in ensuring continuity of project work
           – reduces learning effort when starting new work
      • Product standards: define the characteristics all product artifacts should exhibit so as to have quality
      • Process standards: define how the software process should be conducted to ensure quality software
      • Each project needs to decide which standards should be: ignored; used as is; modified; created
  • 41. Software Metrics
      • Metric: any type of measurement that relates to a software system, process or related artifact
      • Control metrics: used to control the development process (e.g., effort expended, elapsed time, disk usage)
      • Predictor metrics: used to predict an associated product quality (e.g., cyclomatic complexity can predict ease of maintenance)
      • External attribute: something we can only discover after the software has been put into use (e.g., ease of maintenance)
      • Internal attribute: something we can measure directly from the software itself (e.g., cyclomatic complexity)
  • 42. Software Metrics (2)
      • We want to use internal attributes to predict the value of external attributes
      • It is hard to formulate and validate relationships between internal and external attributes
      • Software metrics must be collected, calibrated, and interpreted
      [Diagram: internal attributes (# of parameters, cyclomatic complexity, lines of code, # of error messages, length of user manual) mapped to external attributes (maintainability, reliability, portability, usability)]
  • 43. Product Quality: Design Quality Metrics
      • For a design, the key quality attribute is maintainability
      • For design components, maintainability is related to:
        – Cohesion: how closely related is the functionality of the component?
        – Coupling: how independent is the component?
        – Understandability: how easy is it to understand what the component does?
        – Adaptability: how easy is it to change the component?
      • Most of these cannot be measured directly, but it is reasonable to infer that there is a relationship between these attributes and the "complexity" of a component, so we measure complexity
  • 44. Product Quality: Design Quality Metrics (2)
      a) Structural fan-in/fan-out
        – fan-in: the number of calls to a component
        – fan-out: the number of components called
        – high fan-in => high coupling (complexity)
        – high fan-out => the calling component has high complexity
      b) Informational fan-in/fan-out
        – Also considers the number of parameters passed plus access to shared data structures
        – complexity = length x (fan-in x fan-out)^2
        – It has been validated using the Unix system
        – It is a useful predictor of the effort required for implementation
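
    A worked example of the informational complexity formula from the slide; the component figures below are made-up illustration values:

    ```python
    def informational_complexity(length: int, fan_in: int, fan_out: int) -> int:
        """Informational fan-in/fan-out: length x (fan-in x fan-out)^2."""
        return length * (fan_in * fan_out) ** 2

    # e.g. a 100-line component called from 3 places (fan-in)
    # that itself calls 2 subordinates (fan-out):
    print(informational_complexity(length=100, fan_in=3, fan_out=2))  # 3600
    ```
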
  • 45. Product Quality: Design Quality Metrics (3)
      c) IEEE Standard 982.1-1988
        – Looks at subsystem properties (number of subsystems and degree of coupling) and database properties (number of attributes and classes)
        – Compute a design structure quality index, DSQI (0-1)
        – Used to compare with past designs; if DSQI is too low, further design work and review may be required
      • We can also consider changes made throughout the lifetime of the software and compute how stable the product is (i.e., how many changes in subsystems in the current release)
        – Define a software maturity index (SMI)
        – As SMI approaches 1, the product begins to stabilize
  • 46. IEEE Standard 982.1-1988
      S1 = the total number of subsystems defined in the program architecture
      S2 = the number of subsystems whose correct function depends on the source of data input or that produce data to be used elsewhere
      S3 = the number of subsystems whose correct function depends on prior processing
      S4 = the number of database items (includes data objects and all attributes that define objects)
      S5 = the total number of unique database items
      S6 = the number of database segments (different records or individual objects)
      S7 = the number of subsystems with a single entry and exit
  • 47. IEEE Standard 982.1-1988 (2)
      • Program structure: D1 = 1 if the architecture was developed using a distinct method, D1 = 0 otherwise
      • Subsystem independence: D2 = 1 - (S2/S1)
      • Subsystems not dependent on prior processing: D3 = 1 - (S3/S1)
      • Database size: D4 = 1 - (S5/S4)
      • Database compartmentalization: D5 = 1 - (S6/S4)
      • Subsystem entrance/exit characteristic: D6 = 1 - (S7/S1)
      • DSQI = Σ wi Di, where wi is the relative weighting of each Di
  • 48. IEEE Standard 982.1-1988 (3)
      SMI = [MT - (Fa + Fc + Fd)] / MT
      MT = the number of subsystems in the current release
      Fc = the number of subsystems in the current release that have been changed
      Fa = the number of subsystems in the current release that have been added
      Fd = the number of subsystems in the preceding release that were deleted in the current release
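
    A sketch computing DSQI and SMI exactly as slides 47 and 48 define them; the S-counts, weights, and subsystem counts are made-up illustration values:

    ```python
    def dsqi(s, weights):
        """s = dict of S1..S7 counts; weights = w1..w6 (here uniform)."""
        d = [
            1.0,                    # D1: a distinct design method was used
            1 - s["S2"] / s["S1"],  # D2: subsystem independence
            1 - s["S3"] / s["S1"],  # D3: independence from prior processing
            1 - s["S5"] / s["S4"],  # D4: database size
            1 - s["S6"] / s["S4"],  # D5: database compartmentalization
            1 - s["S7"] / s["S1"],  # D6: entrance/exit characteristic
        ]
        return sum(w * di for w, di in zip(weights, d))

    def smi(mt, fa, fc, fd):
        """Software maturity index: [MT - (Fa + Fc + Fd)] / MT."""
        return (mt - (fa + fc + fd)) / mt

    counts = {"S1": 20, "S2": 4, "S3": 2, "S4": 120, "S5": 100, "S6": 10, "S7": 6}
    print(round(dsqi(counts, [1 / 6] * 6), 3))   # e.g. 0.747
    print(smi(mt=20, fa=2, fc=3, fd=1))          # 0.7 -> not yet stable
    ```
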
  • 49. Product Quality: Program Quality Metrics
      • For a component, the key quality attributes are correctness and difficulty of implementation
      a) Halstead's Software Science
        – Looks at operators and operands in a component and calculates values for component volume, V, component difficulty, D, and effort, E, required to implement the component
        – n1 = number of unique operators in a component
        – n2 = number of unique operands in a component
        – N1 = the total number of operators
        – N2 = the total number of operands
        – L = N1 + N2 (component length)
        – V = L * log2(n1 + n2) (component volume in bits)
        – D = (n1/2) * (N2/n2) (difficulty of implementing the component)
        – E = V * D (effort required to implement the component)
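
    Halstead's measures computed as the slide defines them; the operator and operand counts below are made-up illustration values for a small component:

    ```python
    import math

    def halstead(n1, n2, N1, N2):
        L = N1 + N2                  # component length
        V = L * math.log2(n1 + n2)   # component volume in bits
        D = (n1 / 2) * (N2 / n2)     # difficulty of implementation
        E = V * D                    # effort required to implement
        return L, V, D, E

    # e.g. 10 unique operators used 50 times, 8 unique operands used 40 times:
    L, V, D, E = halstead(n1=10, n2=8, N1=50, N2=40)
    print(f"length={L}, volume={V:.1f}, difficulty={D:.1f}, effort={E:.1f}")
    ```
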
  • 50. Product Quality: Program Quality Metrics (2)
      b) McCabe's Complexity Metric
        – Looks at control flow in a component
        – Cyclomatic complexity measures a component's logical complexity, an indication of the testing difficulty of the component
        – Studies have shown a distinct relationship between the cyclomatic complexity of a component and the number of errors found in the source code, as well as the time required to find and fix errors
      c) Other program quality metrics
        – Length of code
        – Length of identifiers
        – Depth of conditional nesting
      • Standards can be established to avoid complex components and/or highlight problem components
  • 51. Project Quality: Reviews
      • Reviews are the principal method of validating the quality of a project
      [Diagram: each phase has a matching review: requirements capture → requirements walkthroughs; analysis → analysis walkthroughs; design → design walkthroughs; implementation → code walkthroughs; testing → test plan review]
      • Reviews lead to early discovery of defects
      • Analysis and design introduce 50-60% of all defects; formal technical reviews can uncover 75% of these!