Testing data warehouse applications by Kirti Bhushan

Data Warehouse Testing Strategy, Ver 1.0
Kirti Bhushan
Table of Contents

Introduction
    About Data Warehousing
    Need for Data Warehouse testing
    Challenges to Data Warehouse testing
Functional Testing Model
Data Warehouse Testing Model
    Project Definition Phase
    Test Design Phase
    Test Development Phase
    Test Execution Phase
    Acceptance
Data Warehouse Testing Architecture
Goals of Data Warehouse Testing
    Data Completeness Testing
    Data Transformation Testing
    Data Quality Testing
    Non-Functional Testing
        Error log and Audit log
        Backup and Recovery Testing
    Security Testing
        System Security
        Database Security
        Application Security
        ETL Tool Security
        BI Tool Security
    Performance and Scalability
    Integration Testing
    Reports Testing
    User Acceptance Testing
    Regression Testing
Scope of Testing
Roles and Responsibilities
Artifacts / Deliverables
    Software Project Plan (SPP)
    System Test Plan
    System Test Cases/Test Plan and Scripts (TPS)
    Tools and Automation in Data Warehousing
References
Introduction

About Data Warehousing
A data warehouse is the main repository of an organization's historical data, its corporate memory. It contains the raw material for management's decision support system. The critical factor leading to the use of a data warehouse is that a data analyst can perform complex queries and analysis, such as data mining, on the information without slowing down the operational systems. A data warehouse can be formally defined in the following terms:
• Subject-oriented, meaning that the data in the database is organized so that all the data elements relating to the same real-world event or object are linked together;
• Time-variant, meaning that changes to the data in the database are tracked and recorded so that reports can be produced showing changes over time;
• Non-volatile, meaning that data in the database is never over-written or deleted; once committed, the data is static, read-only, and retained for future reporting; and
• Integrated, meaning that the database contains data from most or all of an organization's operational applications, and that this data is made consistent.

Need for Data Warehouse testing
Businesses are increasingly focusing on the collection and organization of data for strategic decision-making. The ability to review historical trends and monitor near real-time operational data has become a key competitive advantage. The cost of finding software defects rises sharply the later they are found in the development lifecycle. In data warehousing, this is compounded by the additional business cost of using incorrect data to make critical business decisions. Given the importance of early detection of software defects, we need a strategy to effectively test an ETL application.

Challenges to Data Warehouse testing
Challenges to data warehouse testing can be summarized as below:
• Data selection from multiple source systems, and the analysis that follows, pose a great challenge.
• Volume and complexity of the data.
• Inconsistent and redundant data in the data warehouse.
• Loss of data during the ETL process.
• Non-availability of a comprehensive test bed.
• Critical data for business.
• Data quality not assured at source.
• Very high cost of quality, because any defect slippage translates into significantly high costs for the organization.
• 100% data verification is not always feasible, so more stress is placed on the ETL components to ensure data behaves as expected within these modules.

Data Warehouse Testing is Different from Traditional Testing
Data warehouse testing differs from traditional testing in many ways. Data warehouse population is done mostly through batch runs, so the testing is different from what is done in transaction systems. Unlike a typical transaction system, data warehouse testing differs on the following counts:

User-Triggered vs. System-Triggered
Most production/source system testing covers the processing of individual transactions, which are driven by some input from the users (application form, servicing request, etc.). Very few test cycles cover system-triggered scenarios (like billing or valuation). In a data warehouse, most of the testing is system-triggered, as per the scripts for ETL (Extraction, Transformation and Loading), the view refresh scripts, etc. Therefore data warehouse testing is typically divided into two parts: 'back-end' testing, where the source system data is compared to the end-result data in the loaded area, and 'front-end' testing, where the user checks the data by comparing their MIS with the data displayed by end-user tools such as OLAP.

Batch vs. Online Gratification
This is something which makes it a challenge to retain users' interest. A transaction system provides instant, or at least overnight, gratification to users when they enter a transaction, which is either processed online or at most via an overnight batch.
In the case of a data warehouse, most of the action happens in the back end, and users have to trace individual transactions to the MIS and views produced by the OLAP tools. This is the same challenge as asking users to test the mammoth month-end reports and financial statements churned out by the transaction systems.

Volume of Test Data
The test data in a transaction system is a very small sample of the overall production data. Typically, to keep matters simple, we include as many test
cases as are needed to comprehensively cover all possible test scenarios in a limited set of test data. A data warehouse typically has large test data, as one tries to fill up the maximum possible combinations and permutations of dimensions and facts. For example, if you are testing the location dimension, you would like the location-wise sales revenue report to have some revenue figures for most of the 100 cities and the 44 states. This means you have to have thousands of sales transaction records at the sales office level (assuming that sales office is the lowest level of granularity for the location dimension).

Possible Scenarios / Test Cases
If a transaction system has, say, a hundred different scenarios, the valid and possible combinations of those scenarios will not be unlimited. However, in the case of a data warehouse, the permutations and combinations one can possibly test are virtually unlimited, because the core objective of a data warehouse is to allow all possible views of data. In other words, 'you can never fully test a data warehouse.' Therefore one has to be creative in designing the test scenarios to gain a high level of confidence.

Test Data Preparation
This is linked to the points about possible test scenarios and volume of data. Given that a data warehouse needs lots of both, the effort required to prepare them is much greater.

Programming for Testing Challenge
In the case of transaction systems, users/business analysts typically test the output of the system. However, in the case of a data warehouse, as most of the action happens at the back end, most of the data warehouse data quality testing and ETL testing is done by running separate stand-alone scripts. These scripts compare pre-transformation data to post-transformation data, say a comparison of aggregates, and flag the pilferages. Users' roles come into play when their help is needed to analyze the discrepancies (if designers or business analysts are not able to figure them out).
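The stand-alone comparison scripts described above can be sketched roughly as follows; the staging/warehouse table names, the measure column, and the in-memory database are assumptions for illustration only:

```python
import sqlite3

def compare_aggregates(conn, source_table, target_table, amount_col):
    """Compare row counts and amount totals between a pre-transformation
    (staging) table and a post-transformation (warehouse) table."""
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*), TOTAL({amount_col}) FROM {source_table}")
    src_count, src_total = cur.fetchone()
    cur.execute(f"SELECT COUNT(*), TOTAL({amount_col}) FROM {target_table}")
    tgt_count, tgt_total = cur.fetchone()
    issues = []
    if src_count != tgt_count:
        issues.append(f"row count mismatch: {src_count} vs {tgt_count}")
    if abs(src_total - tgt_total) > 0.005:  # small tolerance for rounding
        issues.append(f"amount total mismatch: {src_total} vs {tgt_total}")
    return issues

# Tiny self-contained demo with an in-memory database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_sales (id INTEGER, amount REAL);
    CREATE TABLE dw_sales  (id INTEGER, amount REAL);
    INSERT INTO stg_sales VALUES (1, 100.0), (2, 250.5), (3, 75.25);
    INSERT INTO dw_sales  VALUES (1, 100.0), (2, 250.5);  -- one row lost in ETL
""")
print(compare_aggregates(conn, "stg_sales", "dw_sales", "amount"))
```

In practice the two queries would run against the staging area and the warehouse, and any mismatches would be handed to the designers or business analysts for analysis, as described above.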
Functional Testing Model
The diagram below shows the phases of a conventional functional testing model. The phases in this model are as follows:
1. Project Definition Phase
2. Functional Test Design Phase
3. Functional Test Case Preparation Phase
4. Functional Test Execution Phase
5. Functional Test Acceptance Phase
Data Warehouse Testing Model
The conventional functional testing model is tailored below to suit any data warehouse testing project. The Data Warehouse Testing model has similar phases, but the activities within each phase are different and are relevant to data warehouse projects.

Project Definition Phase

Purpose
• Identify the project scope and understand the customer requirements
• Request infrastructure and human resources for the project
• Define the Software Project Plan

Entry Criteria
Statement of Work, High Level Business Requirement Document
Tasks

Project Initiation
• Allocate Project Id
• Understand project requirements
• Prepare Project Kick-off request
• Submit Project Kick-off request for approval
• Prepare Work Order
• Submit Work Order for approval

Software Project Plan
• List the assumptions made
• Identify the deliverables of the project
• Define the development process and verification activities, and specify deviations from the standard process, if any
• Identify the project organization depending on the size of the project
• Identify the risks and the risk management plans for the project
• Review the estimated effort, schedule and milestones for the project provided in the estimation worksheet
• Identify the project management process
• Prepare the training plan for the project
• Identify the hardware, software, and other project-specific requirements for the project
• Identify the quality goals for the project
• List the verification activities and mention the deviations, if any
• List the invoicing schedule for the project
• Identify the project metrics to be collected
• Define the organization for Configuration Management activities
• Identify the configurable items, the libraries to store them, and the version numbering scheme
• Define configuration control mechanisms
• Define configuration status accounting mechanisms
• Plan for configuration audits
• Prepare a Software Project Plan document
• Submit the Software Project Plan for review

Exit Criteria
Reviewed and Approved Software Project Plan (SPP)

Test Design Phase

Purpose
• Define the test approach and develop a test design specification
• Prepare the associated test data

Entry Criteria
Approved Software Project Plan (SPP), Approved SOW, and Approved Functional Specification or detailed Business Requirement Document (BRD)

Tasks

Prepare Test Design Specification and Master Test Plan
• Derive the detailed Master Test Plan
• Prepare the Master Test Plan based on the estimation
• Review the functional specification
• Identify, at a high level, the modules and their integration
• Identify major business (functional) processes
• Create positive, negative and destructive test scenarios
• Create test data requirements
• Define the timeline and resource schedule
• Resource loading for the activities defined in the project plan
• Assign a unique identifier to each test scenario and trace it to requirements (this may be a part of the TPS)
• Peer review of the Master Test Plan
• Review of the Master Test Plan by team members
• QA management review of the Master Test Plan

Exit Criteria
Reviewed and Approved Test Design Specification and Software Test Plan (STP)

Test Development Phase

Purpose
• Identify and prepare test cases

Entry Criteria
Functional specification, ETL specification, ER diagram, Report Design Document, Approved Test Design Document and Software Test Plan (STP)

Tasks

Prepare Test Cases
• Create detailed test cases for all identified types of testing
• Create test cases for the functionality to be tested, covering all scenarios captured during the Test Design Phase
• Assign a unique identifier to each test case and trace it to requirements
• Map test cases to the corresponding high-level scenarios
• Perform peer review of the test cases
• Have the test cases reviewed by the team members

Rework of Test Cases
• Modify test cases in case of any defects found during peer review
• Update the Traceability Matrix

Exit Criteria
Reviewed and Approved System Test Cases/Test Plan and Scripts (TPS) document

Test Execution Phase

Purpose
• Test the product
• Prepare the test report
• Package and release the test deliverables

Entry Criteria
Approved Test Cases, Approved Test Data and Unit-Tested Source Code

Tasks

Execute Tests for all identified testing types
• Execute the test cases as per the test case specification
• Record defects
• Prepare a test report for the test failures and defects found
• Prepare the test deliverable package release note

Prepare Test Report
• Prepare a test report for each testing phase
• Prepare a test summary report based on the test incident reports

Package
• Package the test data and test results
• Prepare the delivery note
• Prepare the release notes

Release
• Verify the test results package and release it

Exit Criteria
Approved Test Results and Approved Test Summary

Acceptance

Purpose
• Verify the test deliverables
• Obtain customer approval and sign-off

Entry Criteria
Approved Test Deliverable Package

Tasks
• Support acceptance testing
• Package and deliver

Exit Criteria
Customer sign-off
Data Warehouse Testing Architecture
The architecture below depicts the various types of testing that can be performed for any data warehouse testing project.
Goals of Data Warehouse Testing
Listed below are the different types of testing needed to ensure the quality of the data warehouse.
• Data Completeness testing – to ensure that all expected data is loaded.
• Data Transformation testing – to ensure that all data is transformed correctly according to business rules and/or design specifications.
• Data Quality testing – to ensure that the ETL application correctly rejects, substitutes default values, corrects or ignores, and reports invalid data.
• Non-Functional testing – to ensure that an application or entire system can successfully recover from a variety of hardware, software or network malfunctions without loss of data or data integrity. It also involves verifying log files such as the audit log and error log.
• Security testing – to ensure that only those users granted access to the system can access the applications, and only through the appropriate gateways.
• Performance and Scalability testing – to ensure that data loads and queries perform within expected time frames and that the technical architecture is scalable.
• Integration testing – to ensure that the ETL process functions well with other upstream and downstream processes.
• Reports testing – to ensure consistency and accuracy of the data reported.
• User Acceptance testing – to ensure the solution meets users' current expectations and anticipates their future expectations.
• Regression testing – to ensure existing functionality remains intact each time a new release of code is completed.

Data Completeness Testing
One of the most basic tests of data completeness is to verify that all expected data loads into the data warehouse. This includes validating that all records, all fields and the full contents of each field are loaded. Test strategies to consider for data completeness testing include:
• Comparing record counts between source data, data loaded to the warehouse, and rejected records.
• Comparing unique values of key fields between source data and data loaded to the warehouse. This is a valuable technique that points out a variety of possible data errors without doing a full validation on all fields.
• Utilizing a data profiling tool that shows the range and value distributions of fields in a data set. This can be used during testing and in production to compare source and target data sets and point out any data anomalies from source systems that may be missed even when the data movement is correct.
• Populating the full contents of each field to validate that no truncation occurs at any step in the process. For example, if the source data field is a string(30), make sure to test it with 30 characters.
• Testing the boundaries of each field to find any database limitations. For example, for a decimal(3) field include values of -99 and 999, and for date fields include the entire range of dates expected. Depending on the type of database and how it is indexed, it is possible that the range of values the database accepts is too small.

Data Transformation Testing
Validating that data is transformed correctly based on business rules can be the most complex part of testing an ETL application with significant transformation logic. One typical method is to pick some sample records and "stare and compare" to validate data transformations manually. This can be useful, but it requires manual testing steps and testers who understand the ETL logic. A combination of automated data profiling and automated data movement validations is a better long-term strategy. Here are some simple automated data movement techniques:
• Create a spreadsheet of scenarios of input data and expected results and validate these with the business customer. This is a good requirements elicitation exercise during design and can also be used during testing.
• Create test data that includes all scenarios.
• Enlist the help of an ETL developer to automate the process of populating data sets from the scenario spreadsheet to allow for flexibility, because scenarios will change.
• Utilize data profiling results to compare the range and distribution of values in each field between source and target data.
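Several of the completeness checks described earlier (record counts, unique key values, truncation) lend themselves to a small automated script. A minimal sketch follows; the field names and sample records are assumptions for illustration:

```python
def completeness_checks(source_rows, target_rows, key_field, text_field):
    """Run basic data completeness checks between source and target
    record sets (each a list of dicts)."""
    findings = []
    # 1. Record counts should match (after accounting for rejected rows).
    if len(source_rows) != len(target_rows):
        findings.append(f"count: {len(source_rows)} source vs {len(target_rows)} target")
    # 2. The set of unique key values should match on both sides.
    missing = {r[key_field] for r in source_rows} - {r[key_field] for r in target_rows}
    if missing:
        findings.append(f"keys missing in target: {sorted(missing)}")
    # 3. No truncation: the longest target text should be as long as the source's.
    src_len = max((len(r[text_field]) for r in source_rows), default=0)
    tgt_len = max((len(r[text_field]) for r in target_rows), default=0)
    if tgt_len < src_len:
        findings.append(f"possible truncation: max length {src_len} -> {tgt_len}")
    return findings

source = [{"id": 1, "city": "San Francisco"}, {"id": 2, "city": "Austin"}]
target = [{"id": 1, "city": "San Franci"}]   # one row lost, city truncated
print(completeness_checks(source, target, "id", "city"))
```

The same three checks can, of course, be expressed as SQL against the staging and warehouse tables; the in-memory version above just shows the logic.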
• Validate correct processing of ETL-generated fields such as surrogate keys.
• Validate that data types in the warehouse are as specified in the design and/or the data model.
• Set up data scenarios that test referential integrity between tables. For example, what happens when the data contains foreign key values not in the parent table?
• Validate parent-to-child relationships in the data. Set up data scenarios that test how orphaned child records are handled.

Data Quality Testing
Here, data quality is defined as 'how the ETL system handles data rejection, substitution, correction and notification without modifying data.' To ensure success in testing data quality, include as many data scenarios as possible. Typically, data quality rules are defined during design, for example:
• Reject the record if a certain decimal field has non-numeric data.
• Substitute null if a certain decimal field has non-numeric data.
• Validate and correct the state field if necessary, based on the ZIP code.
• Compare the product code to values in a lookup table, and if there is no match, load the record anyway but report it to users.
Depending on the data quality rules of the application being tested, scenarios to test might include null key values, duplicate records in source data, and invalid data types in fields (e.g., alphabetic characters in a decimal field). Review the detailed test scenarios with business users and technical designers to ensure that all are on the same page. Data quality rules applied to the data will usually be invisible to the users once the application is in production; users will only see what's loaded to the database. For this reason, it is important to ensure that what is done with invalid data is reported to the users. These data quality reports present valuable data that sometimes reveals systematic issues with source data. In some cases, it may be beneficial to populate the "before" data in the database for users to view.

Non-Functional Testing
Error Log and Audit Log
Error log and audit log testing ensures the following:
• Error messages are in the proper format, as per the ETL specification.
• The log file name is the same as the session name, with a timestamp and a .log extension.
• There is a test case to check whether mail is sent to the customer in case there is a failure in loading the data.

Backup and Recovery Testing
Backup and recovery testing ensures that an application or entire system can successfully recover from a variety of hardware, software or network malfunctions without loss of data or data integrity. To ensure maximum system availability and uptime, a proper backup plan must be prepared. The plan must include backup frequency, media and storage. All backup systems must be able to be restored easily and properly "take over" for the failed system without loss of data, transactions, or valuable downtime. BI tools, which are mostly repository based (i.e., metadata stored in a relational database management system), pose a different challenge when it comes to recovery for configuration management purposes. Recovery testing is an antagonistic test process in which the application or system is exposed to extreme conditions (or simulated conditions) such as device I/O failures or invalid database pointers/keys. Recovery processes are invoked, and the application/system is monitored and/or inspected to verify that proper application, system and data recovery has been achieved. Both the database administrator and the system administrator plan and execute any such testing.

Security Testing
When it comes to the security of a system, there are three aspects to it: authentication, access control and privileges. Security testing focuses on the following two key areas:
• System security – This covers authentication and access control, including logging into and remote access to the system.
• Application security – This includes access to the data or business functions and the privileges thereof.
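The log-file naming convention mentioned above (session name plus timestamp plus a .log extension) is easy to verify automatically. The exact timestamp format below is an assumption for illustration; the actual format would come from the ETL specification:

```python
import re

# Assumed convention: <session_name>_<YYYYMMDD_HHMMSS>.log
LOG_NAME_PATTERN = re.compile(r"^(?P<session>.+)_(?P<ts>\d{8}_\d{6})\.log$")

def check_log_name(filename, session_name):
    """Return True if the log file name follows the expected
    '<session>_<timestamp>.log' convention for the given session."""
    m = LOG_NAME_PATTERN.match(filename)
    return bool(m) and m.group("session") == session_name

print(check_log_name("s_load_sales_20240131_235959.log", "s_load_sales"))  # True
print(check_log_name("s_load_sales.log", "s_load_sales"))                  # False: no timestamp
```

A similar pattern check can be applied to the error message format rule, again driven by whatever format the ETL specification prescribes.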
System Security
System security ensures that only those users granted access to the system can access the applications, and only through the appropriate gateways. Obviously, this is governed by the overall enterprise-wide security policy. Each individual user is assigned a unique user ID and password, which is subject to the governing password-changing policy. Generally, such security is implemented through technologies such as Windows NT authentication and/or Lightweight Directory Access Protocol (LDAP) at the OS and network level, together with database security. The network administrator(s) and the system administrator(s) are responsible for setting up and governing such system security. A customized homegrown security implementation is also not uncommon, although these are giving way to the industry choices of OS/LDAP authentication.

A typical requirement that comes up in a DW/BI application is what is known as single sign-on (SSO) capability. While the requirement is very simple, "users, once logged on to their desktop using the local area network (LAN) user ID and password, do not want to enter yet another user ID and password (which may be the same as or different from the LAN ID)", implementing this feature calls for tight integration between the various DW/BI tools and the security application. The tools come with out-of-the-box integration features for NT/LDAP authentication. However, integration with homegrown security is always a challenge and needs to be tested thoroughly.

Database Security
The databases used in a DW/BI project have basically three different types of users:
• Database administrators – responsible for creating other users as well as creating and maintaining the database objects for the application.
• Developers – responsible for developing the data warehouse application, using ETL tools and/or BI tools.
• Dedicated application users – responsible for the database connection in the n-tier architecture for both ETL and BI tools within the production environment.
Database security testing will concentrate on testing the authentication and/or privileges of the above-referenced types of users.

Application Security
This type of security ensures that, based upon the desired restrictions, users are limited to specific functions and data only after they have been authenticated successfully. In a DW/BI project, application security includes that found in the ETL tool and in the BI application, as described below.
ETL Tool Security
When it comes to an ETL tool, the following repository privileges will be tested:
• Server administrator
• Repository administrator
• Session operator
• Using the designer
• Browsing the repository
• Creating sessions and batches
ETL security testing will concentrate on testing the ability to perform the expected tasks while particular repository privileges are granted to an ETL tool user.

BI Tool Security
When it comes to a BI tool, application security involves the privileges of a given user. The privileges can be based on the user's expertise profile (e.g., power user versus basic user) and/or on functional subject area (access to sales reports versus marketing reports, or access to dashboard reports versus operational reports). Data-level access restrictions are also not uncommon (e.g., the East Region Manager should see East region data, whereas the West Region Manager should see West region data only, while running the same report). Generally, two separate user profiles are created in the BI tool and implemented by using security role objects:
• Basic users – those users who will access the application using the Web environment. They will mostly run canned reports and are not allowed to create their own reports. The security testing will validate this.
• Power users – those users who can access the application both from the Web and from a client-server desktop environment. They not only can execute the canned reports but can also create their own reports based on pre-existing reusable objects like metrics, attributes and filters. Some expert data stewards are given privileges to create new reusable objects.
Functional subject area-based access restrictions are implemented in a BI tool by using what is known as a security group. The access control list (ACL) of the various reusable objects in a BI tool has entries for the specific security groups that should be granted access, with read/write/execute privileges. Individual end users are in turn assigned to these groups, thereby allowing or restricting access to specific objects.
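The group-based ACL model described above can be sketched as a small reference implementation, which is also useful for generating expected results in access-control test cases. The group names, objects and users below are assumptions for illustration:

```python
# Assumed example data: each object's ACL maps security groups to privileges,
# and each user is assigned to one or more groups.
acls = {
    "sales_dashboard": {"sales_users": {"read", "execute"},
                        "report_developers": {"read", "write", "execute"}},
    "marketing_report": {"marketing_users": {"read", "execute"}},
}
user_groups = {"alice": {"sales_users"}, "bob": {"marketing_users"}}

def has_privilege(user, obj, privilege):
    """A user holds a privilege on an object if any of the user's
    security groups appears in the object's ACL with that privilege."""
    return any(privilege in acls.get(obj, {}).get(group, set())
               for group in user_groups.get(user, set()))

print(has_privilege("alice", "sales_dashboard", "execute"))  # True
print(has_privilege("alice", "marketing_report", "read"))    # False: wrong group
print(has_privilege("bob", "sales_dashboard", "write"))      # False
```

Note the negative cases: as the document stresses, it is at least as important to verify that users cannot reach objects they were never granted.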
  19. 19. The data level access restrictions are implemented in a BI tool by what is called “security filters.” Security filters are associated with either an individual user or a security group. When these individuals and/or the groups execute a report, the BI tool will automatically append the security filter condition to all SQL generated by the tool against the database, if applicable. BI tool security testing will concentrate on testing the privileges of the basic user and the power user. It will also check subject area-specific access control based on various group level accesses. The testing will also check for password change policy. It is more important to test that end users are not able to access those reports which they are not given grants to. Apart from the two user profiles listed above, there is also a BI tool administrator and a developer role. Developers will have privileges for using the BI tool architect and agent software via the desktop. The test cases used for testing the power user’s security access can also be used to test the developer’s security access as well. The BI tool administrator has all-inclusive privileges. No separate testing is required. Performance and Scalability As the volume of data in a data warehouse grows, ETL load times can be expected to increase and performance of queries can be expected to degrade. This can be mitigated by having a solid technical architecture and good ETL design. The aim of the performance testing is to point out any potential weaknesses in the ETL design, such as reading a file multiple times or creating unnecessary intermediate files. The following strategies will help discover performance issues: • Load the database with peak expected production volumes to ensure that this volume of data can be loaded by the ETL process within the agreedupon window. • Compare these ETL loading times to loads performed with a smaller amount of data to anticipate scalability issues. 
• Compare the ETL processing times component by component to point out any areas of weakness.
• Monitor the timing of the reject process and consider how large volumes of rejected data will be handled.
• Perform simple and multiple-join queries to validate query performance against large database volumes. Work with business users to develop sample queries and acceptable performance criteria for each query.
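The volume-comparison strategy above can be sketched in a few lines. This is a minimal, hypothetical example (table and column names are invented for illustration) that times the same load at two data volumes; if elapsed time grows much faster than volume, the process is unlikely to scale to production loads.

```python
# Hypothetical sketch: timing an ETL-style load at two volumes to spot
# scalability problems early. Table/column names are illustrative only.
import sqlite3
import time

def timed_load(conn, n_rows):
    """Load n_rows into a staging table and return elapsed seconds."""
    conn.execute("DROP TABLE IF EXISTS stg_sales")
    conn.execute("CREATE TABLE stg_sales (id INTEGER, amount REAL)")
    rows = ((i, i * 1.5) for i in range(n_rows))
    start = time.perf_counter()
    conn.executemany("INSERT INTO stg_sales VALUES (?, ?)", rows)
    conn.commit()
    return time.perf_counter() - start

conn = sqlite3.connect(":memory:")
small = timed_load(conn, 10_000)
large = timed_load(conn, 100_000)

# If time grows much faster than volume, the load is not scaling linearly.
print(f"10x volume took {large / small:.1f}x the time")
```

The same pattern applies component by component: timing each ETL step separately at both volumes localizes the weakness rather than just detecting it.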
Integration Testing

Typically, system testing covers only the ETL application itself: the endpoints for system testing are the input and output of the ETL code being tested. Integration testing shows how the application fits into the overall flow of all upstream and downstream applications. When creating integration test scenarios, consider how the overall process can break, and focus on touch points between applications rather than within one application. Consider how process failures at each step would be handled and how data would be recovered or deleted if necessary. Most issues found during integration testing are either data related or result from false assumptions about the design of another application. It is therefore important to run integration tests with production-like data. Real production data is ideal, but depending on its contents there may be privacy or security concerns that require certain fields to be randomized before it is used in a test environment. As always, do not forget the importance of good communication between the testing and design teams of all systems involved. To help bridge this communication gap, gather team members from all systems together to formulate test scenarios and discuss what could go wrong in production. Run the overall process end to end in the same order and with the same dependencies as in production. Integration testing should be a combined effort, not solely the responsibility of the team testing the ETL application.

Reports Testing

End-user reporting is the final component of a DW/BI project. Reports provide the visual output to the end consumer. Typically, BI reports are developed using an Online Analytical Processing (OLAP) tool such as Cognos, Business Objects, Hyperion or MicroStrategy.
The reports run aggregate SQL queries against the data stored in the data mart and/or the DW and display the results in a suitable format, either in a Web browser or in a client application interface. Once the initial view is rendered, the reporting tool interface provides various ways of manipulating the information, such as sorting, pivoting, adding subtotals and adding view filters to slice and dice the information further. Keep in mind some special considerations while testing the reports:

• The ETL process should be complete and the data mart must be populated.
• BI tools generally have an SQL engine that generates the SQL based on how the dimension and fact tables are mapped in the tool. Additionally, there may be some global or report-specific parameters set to handle very large database (VLDB)-related optimization
requirements. As such, testing of the BI tool concentrates on validating the generated SQL; this in turn validates the dimensional model and the report specification vis-à-vis the design.
• Unit testing of the BI reports is recommended to test the layout format against the design mockup, the style sheets, the prompts and filters, and the attributes and metrics on the report. Unit testing should be executed in both the desktop and Web environments.
• System testing of the BI reports should concentrate on the various report manipulation techniques, such as the drilling, sorting and export functions of the reports in the Web environment.
• Testing of the reports requires an initial load of data followed by two incremental loads.
• Dashboard reports and/or documents need special consideration, because they are high-visibility reports used by top management and because they contain various charts, gauges and data points that give visual insight into the performance of the organization in question.
• There may be trending reports, more specifically called comp reports, that compare the performance of an organizational unit over two time periods. Testing these reports needs special consideration, especially if a fiscal calendar is used instead of an English calendar for the time-period comparison.
• For reports containing derived metrics (for example, "cost per click," defined as the sum of cost divided by the sum of clicks), special focus should be paid to any subtotals. The subtotal row should use a "smart total," i.e., do the aggregation first and then the division, instead of adding up the individual cost per click of each row in the report.
• Reports with "nonaggregateable" metrics (e.g., inventory on hand) also need special attention in the subtotal row. A subtotal should not, for example, add up the inventory for each week and show it as the inventory for the month.
• During unit testing, all data formats should be verified against the standard.
For example, metrics with monetary value should show the proper currency symbol, decimal-point precision (at least two places) and the appropriate sign; negative numbers, for example, should be shown in red and enclosed in parentheses.
• During system testing, while testing the drill-down capability of reports, take care to verify that the subtotal in the drill-down report matches the corresponding row of the summary report. At times it is desirable to carry the parent attribute into the drill-down report; verify the requirements for this.
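The "smart total" rule for derived metrics such as cost per click can be made concrete with a small sketch (the figures here are invented for illustration): aggregate the components first and then divide, rather than summing the per-row ratios.

```python
# Hypothetical example of the "smart total" rule for a derived metric
# (cost per click = sum of cost / sum of clicks).
rows = [
    {"cost": 100.0, "clicks": 50},   # per-row cpc = 2.0
    {"cost": 300.0, "clicks": 100},  # per-row cpc = 3.0
]

# Wrong: summing the per-row ratios gives 2.0 + 3.0 = 5.0
naive_total = sum(r["cost"] / r["clicks"] for r in rows)

# Right: aggregate first, then divide -> 400 / 150
smart_total = sum(r["cost"] for r in rows) / sum(r["clicks"] for r in rows)

print(round(naive_total, 2), round(smart_total, 2))
```

A test case for the subtotal row should therefore assert the aggregated-then-divided value, never the sum of the row-level metric.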
• When testing a report containing conditional metrics, take care to check the "outer join condition," i.e., that the nonexistence of one condition is reflected appropriately alongside the existence of the other.
• Reports with multilevel sorting need special attention in testing, especially if the multilevel sorting includes both attributes and metrics to be sorted.
• Reports containing metrics at different dimensionality, and with percent-to-total metrics and/or cumulative metrics, need special attention to check that the subtotals are hierarchy-aware (i.e., they "break," or reinitialize, at the appropriate levels).

User Acceptance Testing

The main reason for building a data warehouse application is to make data available to business users. Users know the data best, and their participation in the testing effort is a key component of the success of a data warehouse implementation. User acceptance testing (UAT) typically focuses on the data loaded into the data warehouse and any views created on top of the tables, not the mechanics of how the ETL application works. Consider the following strategies:

• Use data that is either from production or as near to production data as possible. Users typically find issues once they see the "real" data, sometimes leading to design changes.
• Test database views by comparing view contents to what is expected. It is important that users sign off on and clearly understand how the views are created.
• Plan for the system test team to support users during UAT. The users will likely have questions about how the data is populated and will need to understand details of how the ETL works.
• Consider how the users will require the data to be loaded during UAT, and negotiate how often the data will be refreshed.
Regression Testing

Regression testing is the revalidation of existing functionality with each new release of code. When building test cases, remember that they will likely be executed multiple times as new releases are created due to defect fixes, enhancements or upstream system changes. Building automation during system testing will make the process of regression testing much smoother. Test cases should be prioritized by risk to help determine which ones need to be rerun for each new release. A simple but effective and efficient strategy for retesting basic functionality is to store the source data sets and results from successful runs of the code and compare new test results with those previous runs. When doing a regression test, it is much quicker to compare results to a previous execution than to repeat an entire data validation.
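The store-and-compare strategy above can be sketched as follows. This is a minimal, hypothetical example (file and table names are invented): a baseline result set from a known-good run is saved, and later runs are diffed against it instead of being revalidated from scratch.

```python
# Hypothetical sketch: diff a new result set against a stored baseline
# from a previous successful run, instead of re-validating all the data.
import json
import os
import tempfile

def save_baseline(path, rows):
    """Persist a known-good result set as the regression baseline."""
    with open(path, "w") as f:
        json.dump(rows, f, sort_keys=True)

def compare_to_baseline(path, rows):
    """Return rows missing from, or unexpected in, the new result set."""
    with open(path) as f:
        baseline = json.load(f)
    base_set = {tuple(r) for r in baseline}
    new_set = {tuple(r) for r in rows}
    return {"missing": base_set - new_set, "unexpected": new_set - base_set}

path = os.path.join(tempfile.gettempdir(), "dim_customer_baseline.json")
save_baseline(path, [[1, "Alice"], [2, "Bob"]])
diff = compare_to_baseline(path, [[1, "Alice"], [2, "Bobby"]])
print(diff)
```

An empty diff means the release preserved existing behavior; any entries point directly at the rows that regressed.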
Scope of Testing

The section "Goals of Data Warehouse Testing" above details all the areas that can possibly be tested in a data warehouse project. For any specific project, the testing approach details the selected areas and types of testing to be performed, which will be a subset of those goals. After the testing approach is defined, the assumptions for the project are documented, followed by the in-scope and out-of-scope testing activities.
Roles and Responsibilities

The roles and responsibilities of the testers and of the subject matter experts (SMEs) from the BI team should be clearly defined as part of the test approach. The involvement of the SMEs at the various phases of the testing project should also be defined, and their roles and responsibilities charted out clearly.
Artifacts / Deliverables

Software Project Plan (SPP)

The Software Project Plan (SPP) will follow the applicable RBS IDC template. The high-level areas covered in the SPP are:
• Scope
• Project Planning
• Quality Planning
• Software Configuration Management Planning

System Test Plan

The System Test Plan will follow the applicable RBS IDC template. The high-level areas covered in the Test Plan are:
• Purpose
• Project Synopsis
• Testing Environment
• Schedule for System Testing
• Scope of System Testing
• Test Data
• Test Cycles
• Out of Scope of System Testing
• Constraints of System Testing
• Entry Criteria of System Testing
• Suspension Criteria of System Testing
• Exit Criteria of System Testing
• Tools Used

System Test Cases/Test Plan and Scripts (TPS)

The System Test Cases/Scripts will follow the applicable RBS IDC template. The high-level areas covered in the TPS are:
• High-Level Scenarios
• Detailed Test Cases
• Predictions
• Test Execution Scripts (SQL)
• Updated Requirements Traceability Matrix
Tools and Automation in Data Warehousing

There are no standard guidelines on the tools that can be used for data warehouse testing. The majority of testing teams go with the tool that was used for the data warehouse implementation. A drawback of this approach is redundancy: the same transformation logic has to be developed once for the DWH implementation and again for its testing. One option is to select an independent tool for testing the DWH; for example, transformation logic implemented in tool 'X' can be tested by reproducing the same logic in another tool, say 'Y'. Tool selection also depends on the test strategy, e.g., exhaustive verification, sampling or aggregation. Reusability and scalability of the test suite being developed are very important factors to consider. Tools with built-in test strategies help in deskilling. One should also explore areas of automation in data warehousing and use other tools to automate them; an example is the automation of Web applications used for test data generation.

ETL tools and automation

ETL technology, which helps automate the process of data loading, comes in two types: one that produces code, and another that produces runtime modules that can be parameterized. To get the real benefit of a data warehouse, you need to go through the pain of automating data loading into it from the various sources. ETL software can help you automate this process of loading data from the operational environment into the data warehouse environment.
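The independent-tool idea can be illustrated with a small sketch. This is a hypothetical example, with SQL standing in for tool 'X' and plain Python standing in for tool 'Y'; the table and data are invented. The same transformation (a grouped sum) is implemented twice, independently, and the two outputs are compared.

```python
# Hypothetical sketch of dual-implementation testing: transformation logic
# built in tool 'X' (here, SQL) is reproduced independently in tool 'Y'
# (plain Python) and the outputs are compared.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (region TEXT, amount REAL)")
conn.executemany("INSERT INTO src VALUES (?, ?)",
                 [("N", 10.0), ("N", 20.0), ("S", 5.0)])

# Implementation X: the transformation as deployed (SQL aggregation).
sql_result = dict(conn.execute(
    "SELECT region, SUM(amount) FROM src GROUP BY region"))

# Implementation Y: the same logic rebuilt independently in Python.
py_result = {}
for region, amount in conn.execute("SELECT region, amount FROM src"):
    py_result[region] = py_result.get(region, 0.0) + amount

print(sql_result == py_result)
```

Because the two implementations share no code, a bug in one is unlikely to be masked by the same bug in the other, which is the whole point of using an independent tool for testing.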
Automation Plan for Data Warehouse Testing

• Spend as much time as you can in the analysis phase itself, and understand as much as you can about the source system.
• Because the database and table structures differ between source and destination, create source-side queries that produce the result the user expects from the target data warehouse. Get this result set certified by the customer for their needs.
• Based on the data loading and data transformations done for the target data warehouse, create target-side queries that yield the same result set obtained in the step above. This seems very easy to do but in practice is not: when working on the schema, it is easy to miss the various fields required to join tables in queries. If you are able to fetch the same result set from the target data warehouse, you can get your design and data certified by the customer.
• You can use any tool, based on the language your ETLs are written in and the RDBMS your data warehouse is hosted on. For example, C# with SQL Server Integration Services (SSIS) is a reasonable choice if your ETLs are designed for SQL Server and the data warehouse is hosted on SQL Server.

To conclude on the tools and automation scope in data warehouse testing: in data warehouse systems, ETLs are the tools that pull data from operational systems into the data warehouse, which is needed for various regulatory compliance and audit purposes. Test automation of these ETLs can help you save a lot of time in data analysis and in the recurring monthly,
quarterly, half-yearly and yearly testing efforts, depending on the data loading frequency.

Data warehouse testing: Best practices

Focus on data quality: If the data quality is ascertained, then testing the ETL logic is fairly straightforward; one need not even test 100% of the data. The ETL logic alone can be tested by pitching it against all possible data sets. However, signing off on data quality is no easy task, simply because of the sheer volume of data.

Identify critical business scenarios: A sampling technique often has to be adopted in data warehouse testing. However, what constitutes a sample: 10%, 20% or 60%? Identifying the critical business scenarios and including more test data for them is a route to success.

Automate: Automate as much as possible! The data warehouse test suite will be used time and again as the database is periodically updated. Hence a regression suite should be built and be available for use at any time. This will save much time.
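The data-quality sign-off described above usually starts with a handful of mechanical checks. This is a minimal, hypothetical sketch (table, columns and data are invented): row counts, nulls in key columns and duplicate keys, which can be run automatically before any ETL logic testing begins.

```python
# Hypothetical sketch of basic data-quality checks to run before ETL
# logic testing: row count, nulls in key columns, duplicate keys.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_product (product_id INTEGER, name TEXT)")
conn.executemany("INSERT INTO dim_product VALUES (?, ?)",
                 [(1, "Widget"), (2, "Gadget"), (2, "Gadget"), (3, None)])

checks = {
    "row_count": conn.execute(
        "SELECT COUNT(*) FROM dim_product").fetchone()[0],
    "null_names": conn.execute(
        "SELECT COUNT(*) FROM dim_product WHERE name IS NULL").fetchone()[0],
    "duplicate_keys": conn.execute(
        "SELECT COUNT(*) FROM (SELECT product_id FROM dim_product "
        "GROUP BY product_id HAVING COUNT(*) > 1)").fetchone()[0],
}
print(checks)
```

Checks like these are cheap to run on every load, so they belong in the regression suite mentioned above rather than being executed once and forgotten.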
