Extending Function Point Estimation for Testing MDM Applications

Several factors create the need for a different type of approach for estimating the effort of testing MDM applications.

Cognizant 20-20 Insights | August 2011

Executive Summary

Effort estimation of testing has been a much debated topic. A variety of techniques are used — ranging from percentage of the development effort to more refined approaches based on use case and test case points — depending on functional and technological complexity. Underlying testing normally focuses on end-user functionality.

Testing of master data management (MDM) applications is different. As such, it requires a different approach when estimating effort. In an MDM testing project, there are specific factors that impact estimation. They include:

• The effort needed to prepare scenario-specific test data and loading scripts.
• Script execution and data loading time.
• Availability of a separate MDM hub.

This white paper analyzes the impact of such factors, as well as the approach that should be adopted to estimate the effort needed for testing MDM solutions.

Estimation Approach

System and integration testing in MDM focus on verifying the system functions, data quality, exception handling and integration of business functions across the enterprise. The approach comprises the following steps:

• Collect input specifications.
• Compute MDM application size (this includes the ETL and MDM parts of testing) in function points.
• Determine the number of test cases for MDM testing (including ETL test cases).

The MDM test estimation approach highlighted in this document is aligned with the International Function Point User Group's (IFPUG) guidelines for function point analysis (FPA).

Steps of the Estimation Process Flow

Size Estimation

The input and output interfaces of the MDM application are counted, and the following general considerations are applied while calculating the function points:

• Step 1: Identify the Application Boundary for the MDM Project.

  The application boundary determines the function points that need to be counted as part of the MDM application (including the ETL part). The application boundary indicates the border between the software being measured (in terms of testing) and the user and other applications that integrate with the MDM application.
Figure 1 depicts the application boundary and counting scope of an MDM project. It contains the following:

  > ETL layer functionalities.
  > MDM and publish layer functionalities.
  > End-to-end application functionalities, including the ETL, MDM and publish layers.

[Figure 1: Identifying Application Boundary and Testing Scope. Source systems feed the ETL layer (landing area, staging, data cleansing and transformation, rejection handling), which loads the MDM hub (cleansing, standardization, match, merge, cross-reference tables, data validation and harmonization) and publishes data to the target MDM system and services; the ETL application boundary and the MDM application boundary are marked.]

• Step 2: Determine the Unadjusted Function Point Count.

  The unadjusted function point count (UFPC) reflects the specific countable MDM and ETL functionality provided to the user by the project or application. The user functionality is evaluated in terms of what is to be delivered by the application, not how it is to be delivered. Only user-requested and user-defined components are counted.

  The UFPC can be counted by identifying and mapping different user-requested MDM functionalities using function point elementary processes. For example, an MDM testing requirement can be stated as, "Verification of customer master PKEY_SRC_ID formation as per business rule." This requirement can be identified and mapped with the function point elementary process "External Output" (EO), as it involves querying and deriving data using business logic and, hence, fulfilling the necessary conditions for EO.
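To illustrate Step 2 concretely, the sketch below tallies a UFPC from requirements that have already been mapped to elementary processes. The requirement list and complexity ratings are illustrative only, and the weights are the commonly published IFPUG weights per elementary process type and complexity; an actual count would rate each function's complexity from its data element and record/file type counts.

```python
# A minimal sketch of tallying the unadjusted function point count (UFPC) once each
# MDM/ETL requirement has been mapped to an elementary process and rated for
# complexity. Requirements and ratings here are illustrative placeholders.
IFPUG_WEIGHTS = {
    # (elementary process, complexity): weight
    ("EI", "low"): 3, ("EI", "average"): 4, ("EI", "high"): 6,
    ("EO", "low"): 4, ("EO", "average"): 5, ("EO", "high"): 7,
    ("EQ", "low"): 3, ("EQ", "average"): 4, ("EQ", "high"): 6,
    ("ILF", "low"): 7, ("ILF", "average"): 10, ("ILF", "high"): 15,
    ("EIF", "low"): 5, ("EIF", "average"): 7, ("EIF", "high"): 10,
}

# Each entry: (requirement, elementary process, rated complexity).
mapped_requirements = [
    ("Verify customer master PKEY_SRC_ID formation as per business rule", "EO", "average"),
    ("Source to landing data loading (land process)", "EI", "high"),
    ("Customer address cleansing and standardization check", "EQ", "low"),
]

def unadjusted_function_points(requirements):
    """Sum the weight of every mapped requirement to obtain the UFPC."""
    return sum(IFPUG_WEIGHTS[(process, complexity)]
               for _, process, complexity in requirements)

print("UFPC:", unadjusted_function_points(mapped_requirements))  # UFPC: 14
```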
Applying Size Estimation Technique in MDM Testing Projects

When it comes to testing types, the following options are considered for an MDM testing project:

1. Option A: Database-intensive testing deliverables with data flow requirements for:

   > Source to landing data loading (i.e., land process).
   > Landing to staging data loading (i.e., stage process).
   > Staging to base object data loading (i.e., load process).

   Database-intensive testing needs to be performed in each layer of data staging, as mentioned above. For example:

   > Data standardization and cleansing to be verified for the stage process.
   > Auto match and merge of data as per business rule to be verified for the load process.

2. Option B: UI console-based testing deliverables with data steward-specific requirements, such as:

   > Manual match and merge of records as per business rule.
   > Trust rule verification for data from different sources.
   > Ability to create/edit new and existing records, etc.

Activities related to each of the above sections can be mapped directly to the elementary processes of function point analysis. For example, consider the following data standardization and cleansing requirement: "Customer address records should be free from junk characters (#, &, ^, %, !), and 'Street' should be displayed as 'STRT.'"

A simple SQL query will be implemented in the test steps in order to verify the above requirement. The query doesn't need to perform any logical data derivation (e.g., concatenation or selecting a sub-string from the record) or mathematical calculation in order to verify the cleansing requirement; it only needs to fetch the record as it is stored in the database, per the conditions stated in the requirement.
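Such a test step might look like the sketch below, which runs a plain lookup against a hypothetical CUST_ADDRESS staging table with CUST_ID and ADDRESS_LINE columns. The table, column and database names are assumptions for illustration (SQLite is used only so the snippet is self-contained); the real query would run against the project's staging database.

```python
# A minimal sketch of a database-intensive cleansing check: fetch the records, as
# stored, that still violate the standardization rules. Table, column and database
# names are hypothetical.
import sqlite3

JUNK_CHARACTERS = ["#", "&", "^", "%", "!"]

def fetch_cleansing_violations(conn: sqlite3.Connection) -> list:
    """Return address records that contain junk characters or an unstandardized 'Street'."""
    junk_conditions = " OR ".join(
        "instr(ADDRESS_LINE, ?) > 0" for _ in JUNK_CHARACTERS
    )
    # A plain fetch with no derivation or calculation, per the requirement.
    query = (
        "SELECT CUST_ID, ADDRESS_LINE FROM CUST_ADDRESS "
        f"WHERE {junk_conditions} OR instr(ADDRESS_LINE, 'Street') > 0"
    )
    return conn.execute(query, JUNK_CHARACTERS).fetchall()

# The test step passes when no violating records are returned, e.g.:
# violations = fetch_cleansing_violations(sqlite3.connect("mdm_stage.db"))
# assert not violations, f"Cleansing rule violated by: {violations}"
```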
Hence, this functionality can be mapped against the FP elementary process "External Query" (EQ). Figure 2 provides a pictorial view of how to identify the elementary processes of function point analysis in such data migration activities.

[Figure 2: Identifying Elementary Processes for MDM Data Flow. Within the integrated MDM application boundary, source systems feed the landing area (land process), staging (stage process) and the MDM hub with match and merge (load process), on to the target MDM system and services; the points in the flow corresponding to the External Query (EQ) and External Output elementary processes are marked.]

• Step 3: Determine the Value Adjustment Factor.

  The value adjustment factor (VAF) indicates the general functionality provided to the user of the application. The VAF comprises general system characteristics (GSC) that assess the general functionality of the application. Examples of such characteristics are:

  > Distributed data processing.
  > Performance objective of the MDM hub.
  > Online data entry on the downstream applications.

  The VAF can vary between 0.65 and 1.35.

• Step 4: Calculate the Adjusted Function Point Count (AFPC).

  The adjusted function point count is calculated using the following formula:

  AFPC = VAF * UFPC

• Step 5: Normalize Using Test Cases.

  Once the size of the application is obtained in terms of FP, the number of normalized (manual) test cases befitting the application is calculated using a formula proposed by Capers Jones and based on his historical data:

  Number of Normalized Test Cases = (AFPC) ^ a

  Note: 'a' is a factor whose value varies with the AFPC.

Effort Estimation

The effort estimation for an MDM testing project is computed on the basis of Organizational Baseline Productivity (OBP) figures for MDM testing projects. The total effort required by the project, based on productivity figures, is as follows:

  Total Effort in Person Hours (PH) = Number of Normalized Test Cases / Productivity (in Normalized Test Cases per PH)

A productivity baselining exercise needs to be conducted within the organization, using essential data from closed testing projects — namely, actual project size and effort data from the key members of closed projects. The final size is established in terms of normalized test cases and the effort in PH. The effort for test design and test execution needs to be captured separately in order to derive the productivity figure for each case. This yields the productivity data point for each case and project. The median value of these data points gives us the OBP for test design and execution.

Common Factors for MDM Testing Projects: These factors always increase the effort required.

  Factors affecting effort:
  > Project management (strategy, planning, monitoring & reporting).
  > Quality assurance.
  > Retesting, reworking & defect tracking.
  > Training effort.
  > Environment setup and integration with the test management tool.
  > Test data preparation.

Project-Specific Factors for MDM Testing Projects: The impact of these factors varies from project to project. Based on the situation, these factors may increase or decrease effort.

Beyond the total effort, a percentage for common factors and project-specific factors must be added in order to arrive at the final adjusted effort:

  Final Adjusted Effort = Total Effort + Total Effort * (% of Common Factors + % of Project-Specific Factors)

Factors such as initiation and planning, closure, number of iterations, etc. need to be considered separately and added to the above figure.
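To tie the formulas above together, here is a minimal end-to-end sketch of the calculation, from UFPC to final adjusted effort. The VAF is computed with the standard IFPUG formula (0.65 plus 0.01 times the total degrees of influence of the 14 general system characteristics, each rated 0 to 5), which is consistent with the 0.65 to 1.35 range cited in Step 3; the exponent 'a', the productivity figure and the factor percentages are illustrative placeholders, since in practice they come from the project and the organizational baseline.

```python
# A minimal sketch of the end-to-end effort calculation described above. The GSC
# ratings, exponent 'a', productivity and factor percentages are placeholders.

def value_adjustment_factor(gsc_ratings):
    """Standard IFPUG formula: VAF = 0.65 + 0.01 * total degrees of influence.
    With 14 characteristics rated 0-5, VAF stays within the 0.65-1.35 range."""
    return 0.65 + 0.01 * sum(gsc_ratings)

def normalized_test_cases(afpc, a):
    """Step 5: Number of Normalized Test Cases = (AFPC) ^ a."""
    return afpc ** a

def final_adjusted_effort(ufpc, gsc_ratings, a, productivity_tc_per_ph,
                          common_factor_pct, project_factor_pct):
    vaf = value_adjustment_factor(gsc_ratings)
    afpc = vaf * ufpc                                       # Step 4: AFPC = VAF * UFPC
    test_cases = normalized_test_cases(afpc, a)
    total_effort_ph = test_cases / productivity_tc_per_ph   # Total Effort in PH
    # Final Adjusted Effort = Total Effort + Total Effort * (% common + % project-specific)
    return total_effort_ph * (1 + common_factor_pct + project_factor_pct)

# Example with placeholder values: 14 GSCs rated 3 give VAF = 1.07.
effort = final_adjusted_effort(
    ufpc=200,
    gsc_ratings=[3] * 14,
    a=1.2,                       # illustrative exponent
    productivity_tc_per_ph=1.5,  # illustrative normalized test cases per person hour
    common_factor_pct=0.15,
    project_factor_pct=0.05,
)
print(f"Final adjusted effort: {effort:.0f} person hours")
```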
Challenges

Having outlined the approach, it is still important to highlight that — unlike UI-intensive application testing — effort estimation for testing MDM applications is still a new concept. Estimation has many challenges, a few of which include:

1. Non-availability of industry-standard productivity values for MDM technologies.
2. Non-availability of detailed requirement specifications at the estimation stage.
3. The need for skilled function point counters for consistent size estimation, especially people with sufficient training and practice with counting rules.
4. The availability of subject matter experts for the application in order to get a logical view of the application.

Final Notes

Based on the estimation approach highlighted in this paper, we have built a tool for MDM testing estimation. This tool not only provides simple interfaces to capture user inputs, but also implements the calculations for effort estimation. Additionally, it addresses the majority of the challenges mentioned above by making realistic assumptions based on our rich experience with MDM application testing.

About the Author

Prabuddha Samaddar is a consultant who leads Cognizant's MDM Testing Team within its Customer Solution Testing Practice. He functions as Cognizant's MDM testing subject matter expert. Prabuddha has in-depth knowledge of different estimation techniques, such as function point analysis, and rich experience developing estimation models, writing white papers on estimation and presenting estimation capabilities to clients. He can be reached at Prabuddha.Samaddar@cognizant.com.

About Cognizant

Cognizant (NASDAQ: CTSH) is a leading provider of information technology, consulting, and business process outsourcing services, dedicated to helping the world's leading companies build stronger businesses. Headquartered in Teaneck, New Jersey (U.S.), Cognizant combines a passion for client satisfaction, technology innovation, deep industry and business process expertise, and a global, collaborative workforce that embodies the future of work. With over 50 delivery centers worldwide and approximately 118,000 employees as of June 30, 2011, Cognizant is a member of the NASDAQ-100, the S&P 500, the Forbes Global 2000, and the Fortune 500, and is ranked among the top performing and fastest growing companies in the world. Visit us online at www.cognizant.com or follow us on Twitter: Cognizant.