How can a quality engineering and assurance consultancy keep you ahead of others? (greyaudrina)
Quality Engineering and Assurance has undergone a drastic change due to digital transformation. A business that deploys automation is better positioned to handle the competition and offer a world-class customer experience.
Testing the Data Warehouse—Big Data, Big Problems (TechWell)
Data warehouses are critical systems for collecting, organizing, and making information readily available for strategic decision making. The ability to review historical trends and monitor near real-time operational data is a key competitive advantage for many organizations. Yet the methods for assuring the quality of these valuable assets are quite different from those of transactional systems. Ensuring that appropriate testing is performed is a major challenge for many enterprises. Geoff Horne has led numerous data warehouse testing projects in both the telecommunications and ERP sectors. Join Geoff as he shares his approaches and experiences, focusing on the key “uniques” of data warehouse testing: methods for assuring data completeness, monitoring data transformations, measuring quality, and more. Geoff explores the opportunities for test automation as part of the data warehouse process, describing how you can harness automation tools to streamline the work and minimize overhead.
Testing a Big Data application is more about verifying its data processing than testing individual features. It demands a high level of testing skill, as the processing is very fast.
DATPROF Test Data Management (data privacy & data subsetting) - English (DATPROF)
The possibilities of DATPROF Subset and DATPROF Privacy: for subsetting databases and masking databases, to improve software testing and to comply with data privacy regulations.
Data Summit Connect Fall 2020 - Rise of DataOps (Ryan Gross)
Data governance teams attempt to apply manual controls at various points to ensure the consistency and quality of the data. By thinking of our machine learning data pipelines as compilers that convert data into executable functions, and by leveraging data version control, data governance and engineering teams can engineer the data together: filing bugs against data versions, applying quality control checks to the data compilers, and more. This talk illustrates how these innovations are poised to drive process and cultural changes to data governance, leading to order-of-magnitude improvements.
How to Test Big Data Systems | QualiTest Group (Qualitest)
Big Data is perceived as a huge amount of data and information, but it is a lot more than this. Big Data may be said to be a whole set of approaches, tools and methods for processing large volumes of unstructured as well as structured data. The three parameters on which Big Data is defined, i.e., Volume, Variety and Velocity, describe how you have to process an enormous amount of data in different formats at different rates.
QualiTest is the world’s second-largest pure-play software testing and QA company. Testing and QA is all that we do! Visit us at: www.QualiTestGroup.com
What is a Data Warehouse and How Do I Test It? (RTTS)
ETL Testing: A primer for Testers on Data Warehouses, ETL, Business Intelligence and how to test them.
Are you hearing and reading about Big Data, Enterprise Data Warehouses (EDW), the ETL Process and Business Intelligence (BI)? The software markets for EDW and BI are quickly approaching $22 billion, according to Gartner, and Big Data is growing at an exponential pace.
Are you being tasked to test these environments or would you like to learn about them and be prepared for when you are asked to test them?
RTTS, the Software Quality Experts, provided this groundbreaking webinar, based upon our many years of experience in providing software quality solutions for more than 400 companies.
You will learn the answer to the following questions:
• What is Big Data and what does it mean to me?
• What are the business reasons for building a Data Warehouse and for using Business Intelligence software?
• How do Data Warehouses, Business Intelligence tools and ETL work from a technical perspective?
• Who are the primary players in this software space?
• How do I test these environments?
• What tools should I use?
This slide deck is geared towards:
QA Testers
Data Architects
Business Analysts
ETL Developers
Operations Teams
Project Managers
...and anyone else who (a) is new to the EDW space, (b) wants to learn the business and technical sides and (c) wants to understand how to test these environments.
Most organisations think that they have poor data quality but don’t know how to measure it or what to do about it. Teams of data scientists, analysts, and ETL developers are either blindly taking a “garbage in -> garbage out” approach or, worse still, “cleansing” data to fit their limited perspectives. DataOps is a systematic approach to measuring data quality and planning mitigations for bad data.
Data Warehousing in Pharma: How to Find Bad Data while Meeting Regulatory Req... (RTTS)
In the U.S., pharmaceutical firms must meet electronic record-keeping regulations set by the Food and Drug Administration (FDA). The regulation is Title 21 CFR Part 11, commonly known as Part 11.
Part 11 requires regulated firms to implement controls for software and systems involved in processing many forms of data as part of business operations and product development.
Enterprise data warehouses are used by the pharmaceutical and medical device industries for storing data covered by Part 11. QuerySurge, the only test tool designed specifically for automating the testing of data warehouses and the ETL process, is the market leader in testing data warehouses used by Part 11-governed companies.
For more on QuerySurge and Pharma, please visit
http://www.querysurge.com/solutions/pharmaceutical-industry
Daniel is a Project Leader at Datayaan.
He has designed and implemented innovative solutions for complex business problems and has helped companies with digital transformation.
Telehealth, Transport Logistics, and Telecom are some of the key areas his work covers.
On the tech side, he has widespread knowledge and experience in Microservices, IoT and Cloud.
He is going to talk about his approach to transforming an organization to leverage data-driven decision making.
For this he presents Transport Logistics as a use case and walks us through an overview of how the transformation takes place:
how the data is collected and processed, what can be done with the collected data, and how the organization benefits.
He is also going to shed some light on how IoT can be used to automate data collection, which is crucial for building an effective data-driven business model.
Creating a Data Validation and Testing Strategy (RTTS)
Creating A Data Validation & Testing Strategy
Are you struggling with formulating a strategy for how to validate the massive amount of data continuously entering your data warehouse or data lake?
We can help you!
Learn how RTTS’ Data Validation Assessment provides:
- an evaluation of your current data validation process
- recommendations on how to improve your process and
- a proposal for successful implementation
This slide deck addresses the following issues:
- How do I find out if I have bad data?
- How do I ensure I am testing the proper data permutations?
- How much of my data needs to be validated and automated?
- Which critical data endpoints need to be tested?
- How do I test data in my cloud environments?
And much more!
For more information, visit:
https://www.rttsweb.com/services/solutions/data-validation-assessment
Completing the Data Equation: Test Data + Data Validation = Success (RTTS)
Completing the Data Equation
In this presentation, we tackle 2 major challenges to assuring your data quality:
1) Test Data Generation
2) Data Validation
We illustrate how GenRocket and QuerySurge, used in conjunction, can solve these challenges. Also see how they can be easily integrated into your Continuous Integration/Continuous Delivery pipeline.
Session Overview
- Primary challenges organizations are facing with their data projects
- Key success factors for data validation & testing
- How to set up a workflow around test data generation and data validation using GenRocket & QuerySurge
- How to automate this workflow in your CI/CD DataOps pipeline
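As a rough illustration of the last step, here is a minimal, generic Python sketch of a CI/CD data validation gate. It does not use the GenRocket or QuerySurge APIs; the table names and checks are hypothetical placeholders.

```python
# ci_data_gate.py -- a minimal, generic sketch of a CI/CD data validation gate.
# It does NOT use the GenRocket or QuerySurge APIs; table and query names are
# hypothetical placeholders for illustration only.
import sqlite3
import sys

# Hypothetical validation checks: each pairs a name with a SQL query that
# must return zero rows for the check to pass.
CHECKS = {
    "no_null_customer_ids": "SELECT * FROM customers WHERE id IS NULL",
    "no_duplicate_orders":
        "SELECT order_id FROM orders GROUP BY order_id HAVING COUNT(*) > 1",
}

def run_gate(db_path: str) -> int:
    conn = sqlite3.connect(db_path)
    failures = 0
    for name, sql in CHECKS.items():
        rows = conn.execute(sql).fetchall()
        status = "PASS" if not rows else f"FAIL ({len(rows)} offending rows)"
        print(f"{name}: {status}")
        failures += bool(rows)
    conn.close()
    return failures

if __name__ == "__main__":
    # A non-zero exit code fails the CI/CD pipeline stage.
    sys.exit(run_gate(sys.argv[1] if len(sys.argv) > 1 else "warehouse.db"))
```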
To see the video, go to https://www.youtube.com/embed/Zy25i74l-qo?autoplay=1&showinfo=0
Smarter Analytics: Supporting the Enterprise with Automation (Inside Analysis)
The Briefing Room with Barry Devlin and WhereScape
Live Webcast on June 10, 2014
Watch the archive:
https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=5230c31ab287778c73b56002bc2c51a
The data warehouse is intended to support analysis by making the right data available to the right people in a timely fashion. But conditions change all the time, and when data doesn’t keep up with the business, analysts quickly turn to workarounds. This leads to ungoverned and largely un-managed side projects, which trade short-term wins for long-term trouble. One way to keep everyone happy is by creating an integrated environment that pulls data from all sources, and is capable of automating both the model development and delivery of analyst-ready data.
Register for this episode of The Briefing Room to hear data warehousing pioneer and Analyst Barry Devlin as he explains the critical components of a successful data warehouse environment, and how traditional approaches must be augmented to keep up with the times. He’ll be briefed by WhereScape CEO Michael Whitehead, who will showcase his company’s data warehousing automation solutions. He’ll discuss how a fast, well-managed and automated infrastructure is the key to empowering faster, smarter, repeatable decision making.
Visit InsideAnalysis.com for more information.
Implementing Azure DevOps with your Testing Project (RTTS)
Implementing Azure DevOps With Your Testing Project
Are you challenged with different teams working on different platforms making it difficult to get insight into another team’s work?
Is your team seeking ways to automate the code deployments so you can spend more time developing new features and writing more tests, and spend less time deploying and running manual tests?
RTTS, a Microsoft Gold DevOps Partner, will take you through solving these challenges with Azure DevOps.
Tuesday, June 16th 2020 @11am ET
Session Overview
------------------------------------
During the webinar, we will walk you through the following process of utilizing Azure DevOps:
- The challenges that inspired the Azure DevOps solution, which you may be experiencing as well
- The strategy for implementing Azure DevOps
- Solutions in our everyday processes to increase efficiency and save time
- A demo of an Azure DevOps environment for testing teams
To see a recording of the webinar, please visit:
https://www.youtube.com/watch?v=2vIic3wxaS4
To learn more about RTTS, please visit:
https://www.rttsweb.com
Whitepaper: Volume Testing Thick Clients and Databases (RTTS)
Even in the current age of cloud computing there are still endless benefits of developing thick client software: non-dependency on browser version, offline support, low hosting fees, and utilizing existing end user hardware, to name a few.
It's more than likely that your organization is utilizing at least a few thick client applications. Now consider this: as your user base grows, does your thick client's back-end server need to grow as well? How quickly? How do you ensure that you provide the correct amount of additional capacity without overstepping and unnecessarily eating into your profits? The answer is volume testing.
Read how RTTS does this with IBM Rational Performance Tester.
Testing Big Data: Automated Testing of Hadoop with QuerySurge (RTTS)
Are You Ready? Stepping Up To The Big Data Challenge In 2016 - Learn why Testing is pivotal to the success of your Big Data Strategy.
According to a new report by analyst firm IDG, 70% of enterprises have either deployed or are planning to deploy big data projects and programs this year due to the increase in the amount of data they need to manage.
The growing variety of new data sources is pushing organizations to look for streamlined ways to manage complexities and get the most out of their data-related investments. The companies that do this correctly are realizing the power of big data for business expansion and growth.
Learn why testing your enterprise's data is pivotal for success with big data and Hadoop. Learn how to increase your testing speed, boost your testing coverage (up to 100%), and improve the level of quality within your data - all with one data testing tool.
Data Warehouse Testing in the Pharmaceutical Industry (RTTS)
In the U.S., pharmaceutical firms and medical device manufacturers must meet electronic record-keeping regulations set by the Food and Drug Administration (FDA). The regulation is Title 21 CFR Part 11, commonly known as Part 11.
Part 11 requires regulated firms to implement controls for software and systems involved in processing many forms of data as part of business operations and product development.
Enterprise data warehouses are used by the pharmaceutical and medical device industries for storing data covered by Part 11 (for example, Safety Data and Clinical Study project data). QuerySurge, the only test tool designed specifically for automating the testing of data warehouses and the ETL process, has been effective in testing data warehouses used by Part 11-governed companies. The purpose of QuerySurge is to assure that your warehouse is not populated with bad data.
In industry surveys, bad data has been found in every database and data warehouse studied and is estimated to cost firms on average $8.2 million annually, according to analyst firm Gartner. Most firms test far less than 10% of their data, leaving at risk the rest of the data they are using for critical audits and compliance reporting. QuerySurge can test up to 100% of your data and help assure your organization that this critical information is accurate.
QuerySurge not only helps in eliminating bad data, but is also designed to support Part 11 compliance.
Learn more at www.QuerySurge.com
The State of Logistics Outsourcing: 2010 Third-Party Logistics Study (Dennis Wereldsma)
This 2010 15th Annual Third-Party Logistics Study, based on research conducted in mid-2010, examines the current state of the global market for 3PL services and explores in depth the issues surrounding total landed cost calculation. The report also considers supply chain issues, including the role of 3PLs in two vertical markets, Life Sciences and Fast-Moving Consumer Goods.
Reboarding the Sales Force: Leveraging Onboarding Principles to Keep Seasoned... (QstreamInc)
Enlightened sales organizations understand the need to invest in onboarding: a structured training and development effort to speed new hires up the learning curve. But given the incredible pace of change in today’s B2B sales environment, traditional onboarding is no longer enough. Sales enablement pros must find new approaches for optimizing salesforce performance and productivity throughout their tenure, essentially “reboarding” their sales teams as conditions demand. In this webcast, co-presented with the Sales Management Association, we explore the concept of “reboarding,” explain why it’s so critical for fast-growing enterprises, and share data-driven best practices for keeping novice and veteran sales reps alike engaged and ready to win.
Deliver Trusted Data by Leveraging ETL Testing (Cognizant)
We explore how extract, transform and load (ETL) testing with SQL scripting is crucial to data validation and show how to test data on a large scale in a streamlined manner with an Informatica ETL testing tool.
Leveraging Automated Data Validation to Reduce Software Development Timeline... (Cognizant)
Our enterprise solution for automating data validation - called dataTestPro - facilitates quality assurance (QA) by managing heterogeneous data testing, improving test scheduling, increasing data testing speed and drastically reducing data-validation errors.
Building a Robust Big Data QA Ecosystem to Mitigate Data Integrity Challenges (Cognizant)
With big data growing exponentially, the need to test semi-structured and unstructured data has risen; we offer several strategies for big data quality assurance (QA), taking into account data security, scalability and performance issues. Our recommendations center around data warehouse testing, performance testing and test data management.
Automate Data Warehouse ETL Testing and Migration Testing the Agile Way (Torana, Inc.)
Data warehouse, ETL and migration projects are exposed to huge financial risks due to a lack of QA automation. At iCEDQ, we suggest an agile, rules-based testing approach for all data integration projects.
Query Wizards - Data Testing Made Easy - No Programming (RTTS)
Fast and easy. No Programming needed. The latest QuerySurge release introduces the new Query Wizards. The Wizards allow both novice and experienced team members to validate their organization's data quickly with no SQL programming required.
The Wizards provide an immediate ROI through their ease-of-use and ensure that minimal time and effort are required for developing tests and obtaining results. Even novice testers are productive as soon as they start using the Wizards!
According to a recent survey of Data Architects and other data experts on LinkedIn, approximately 80% of columns in a data warehouse have no transformations, meaning the Wizards can test all of these columns quickly and easily. (The columns with transformations can be tested using the QuerySurge Design Library with custom SQL coding.)
There are three types of automated data comparisons:
- Column-Level Comparison
- Table-Level Comparison
- Row Count Comparison
There are also automated features for filtering (‘Where’ clause) and sorting (‘Order By’ clause).
The Wizards provide both novices and non-technical team members with a fast & easy way to be productive immediately and speed up testing for team members skilled in SQL.
Trial our software either as a download or in the cloud at www.QuerySurge.com. The trial comes with a built-in tutorial and sample data.
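To make the three comparison types concrete, here is a rough Python sketch of what each comparison amounts to under the hood. This is not QuerySurge's implementation; the database and table names are made up for illustration.

```python
# compare_sketch.py -- a rough illustration of the three comparison types
# described above (not QuerySurge's implementation; names are hypothetical).
import sqlite3

def row_count_comparison(src, tgt, table):
    q = f"SELECT COUNT(*) FROM {table}"
    return src.execute(q).fetchone()[0] == tgt.execute(q).fetchone()[0]

def column_level_comparison(src, tgt, table, column, key):
    # Compare one column's values, aligned by key ('Order By' keeps rows aligned).
    q = f"SELECT {key}, {column} FROM {table} ORDER BY {key}"
    return src.execute(q).fetchall() == tgt.execute(q).fetchall()

def table_level_comparison(src, tgt, table, key, where="1=1"):
    # Compare whole rows, optionally filtered with a 'Where' clause.
    q = f"SELECT * FROM {table} WHERE {where} ORDER BY {key}"
    return src.execute(q).fetchall() == tgt.execute(q).fetchall()

if __name__ == "__main__":
    source = sqlite3.connect("source.db")   # hypothetical source database
    target = sqlite3.connect("target.db")   # hypothetical target warehouse
    print("row count:", row_count_comparison(source, target, "customers"))
    print("column:   ", column_level_comparison(source, target, "customers", "email", "id"))
    print("table:    ", table_level_comparison(source, target, "customers", "id"))
```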
Testing (manual or automated) depends a lot on the test data being used. In fast-paced, dynamic agile development, the quality of the data used for testing is paramount for success.
How to Automate your Enterprise Application / ERP Testing (RTTS)
Your organization has a major system that is central to running its business.
- Maybe it’s an ERP system running SAP, Oracle or Lawson, or a CRM system running Salesforce or Microsoft Dynamics,
- or it’s a banking or trading system at a bank or other financial institution,
- or an HR system running payroll through PeopleSoft or Workday
Whatever the system is, it is constantly sending or receiving data feeds (generally in XML or flat file formats) to or from a customer, vendor, or another internal system.
These major data interfaces are present in companies across every industry — from Financials to Pharmaceuticals, and Retail to Utilities — and they are handling data that is crucial to each business. As systems become more complex, it becomes more difficult for you to catch bad records or major data defects effectively before they reach their target system.
Catch those "hard-to-find" data defects
Your systems could be sending/receiving hundreds of feeds from different applications or data sources and each with different owners. In these circumstances, you may have little to no control over the format or quality of the data. Now this data needs to be integrated, mapped, and transformed into your systems. Can your existing manual testing process handle this task?
The challenges you’re facing:
Business: You’re working under time and resource constraints, so you need to speed up testing yet still increase coverage of data tested
Technology: There is no easy way to natively test flat files, XML files, databases or Excel against any other data format
Resources: You do not have enough people to test all of the data from the data feeds all of the time
You know that this data needs to be consistently accurate and reliable — and catching any bad data or data defects seems almost impossible.
Solve your Data Interface testing challenges
QuerySurge is built to automate the testing for any movement of data, testing simple or complex transformations (ETL), as well as data movement without any transformation.
- Test across different platforms, whether Big Data, data warehouse, database(s), NoSQL document store, flat files, json, web services or xml.
- Automate the testing effort from the kickoff of tests to the data comparison to auto-emailing the results.
- Speed up data testing and validation by as much as 1,000 times.
- Schedule tests to run immediately, on a recurring basis (say, every Tuesday at 2:00 am), or when an event such as an ETL job triggers them.
- Utilize the Data Analytics Dashboard and Data Intelligence Reports to analyze your data testing.
- Get 100% coverage with a dramatic decrease in testing time
It will allow you to quickly compare file to file, file to XML, and XML/files to a database without having to import your files into a database first (it also compares database to database).
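As a loose illustration of the file-to-database case, the following Python sketch validates an incoming flat-file feed against a target table without importing the file first. This is not the QuerySurge implementation; the file, table and column names are hypothetical.

```python
# feed_check.py -- a minimal sketch of validating a flat-file feed against a
# target table without first importing the file (names are examples only).
import csv
import sqlite3

def load_feed(path):
    """Read a delimited feed into a set of (id, amount) tuples."""
    with open(path, newline="") as f:
        return {(row["id"], row["amount"]) for row in csv.DictReader(f)}

def load_table(conn, table):
    rows = conn.execute(f"SELECT id, amount FROM {table}").fetchall()
    return {(str(i), str(a)) for i, a in rows}

if __name__ == "__main__":
    feed = load_feed("vendor_feed.csv")       # incoming data feed
    conn = sqlite3.connect("erp.db")          # hypothetical target system
    table = load_table(conn, "staged_invoices")
    missing = feed - table   # records in the feed that never reached the target
    extra = table - feed     # records in the target with no source record
    print(f"missing from target: {len(missing)}, unexpected in target: {len(extra)}")
```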
As Europe's leading economic powerhouse and the fourth-largest economy globally, Germany stands at the forefront of innovation and industrial might. Renowned for its precision engineering and high-tech sectors, Germany's economic structure is heavily supported by a robust service industry, accounting for approximately 68% of its GDP. This economic clout and strategic geopolitical stance position Germany as a focal point in the global cyber threat landscape.
In the face of escalating global tensions, particularly those emanating from geopolitical disputes with nations like Russia and China, Germany has witnessed a significant uptick in targeted cyber operations. Our analysis indicates a marked increase in the sophistication of cyberattacks aimed at critical infrastructure and key industrial sectors. These attacks range from ransomware campaigns to Advanced Persistent Threats (APTs), threatening national security and business integrity.
🔑 Key findings include:
🔍 Increased frequency and complexity of cyber threats.
🔍 Escalation of state-sponsored and criminally motivated cyber operations.
🔍 Active dark web exchanges of malicious tools and tactics.
Our comprehensive report delves into these challenges, using a blend of open-source and proprietary data collection techniques. By monitoring activity on critical networks and analyzing attack patterns, our team provides a detailed overview of the threats facing German entities.
This report aims to equip stakeholders across public and private sectors with the knowledge to enhance their defensive strategies, reduce exposure to cyber risks, and reinforce Germany's resilience against cyber threats.
Opendatabay - Open Data Marketplace.pptx (Opendatabay)
Opendatabay.com unlocks the power of data for everyone. The Open Data Marketplace fosters a collaborative hub for data enthusiasts to explore, share, and contribute to a vast collection of datasets.
It is the first open hub for data enthusiasts to collaborate and innovate: a platform to explore, share, and contribute to a vast collection of datasets. Through robust quality control and innovative technologies like blockchain verification, Opendatabay ensures the authenticity and reliability of datasets, empowering users to make data-driven decisions with confidence. It leverages cutting-edge AI technologies to enhance the data exploration, analysis, and discovery experience.
From intelligent search and recommendations to automated data productisation and quotation, Opendatabay's AI-driven features streamline the data workflow. Finding the data you need shouldn't be complex. Opendatabay simplifies the data acquisition process with an intuitive interface and robust search tools. Effortlessly explore, discover, and access the data you need, allowing you to focus on extracting valuable insights. Opendatabay breaks new ground with dedicated, AI-generated synthetic datasets.
Leverage these privacy-preserving datasets for training and testing AI models without compromising sensitive information. Opendatabay prioritizes transparency by providing detailed metadata, provenance information, and usage guidelines for each dataset, ensuring users have a comprehensive understanding of the data they're working with. By leveraging a powerful combination of distributed ledger technology and rigorous third-party audits, Opendatabay ensures the authenticity and reliability of every dataset. Security is at the core of Opendatabay: the marketplace implements stringent security measures, including encryption, access controls, and regular vulnerability assessments, to safeguard your data and protect your privacy.
Deliver Trusted Data by Leveraging ETL Testing
Data-rich organizations seeking to assure data quality can systemize the validation process by leveraging automated testing to increase coverage, accuracy and competitive advantage, thus boosting credibility with end users.
Executive Summary
All quality assurance teams use the process of extract, transform and load (ETL) testing with SQL scripting in conjunction with eyeballing the data on Excel spreadsheets. This process can take a huge amount of time and can be error-prone due to human intervention. It is also tedious: to validate the data, the same test SQL scripts need to be executed repeatedly, which can lead to defect leakage given the assorted and voluminous data involved. To test the data effectively, the tester needs advanced database skills that include writing complex join queries and creating stored procedures, triggers and SQL packages.

Manual methods of data validation can also impact project schedules and undermine end-user confidence regarding data delivery (i.e., delivering data to users via flat files or on Web sites). Moreover, data quality issues can undercut competitive advantage and have an indirect impact on the long-term viability of a company and its products.

Organizations can overcome these challenges by mechanizing the data validation process. But that raises an important question: How can this be done without spending extra money? The answer led us to consider Informatica's ETL testing tool.

This white paper demonstrates how Informatica can be used to automate the data testing process. It also illustrates how this tool can help QE&A teams reduce the number of hours spent on their activities, increase coverage and achieve 100% accuracy in validating the data. This means that organizations can deliver complete, repeatable, auditable and trustable test coverage in less time without extending basic SQL skill sets.
Data Validation Challenges
Consistency in the data received for ETL is a perennial challenge. Typically, data received from various sources lacks commonality in how it is formatted and provided. And big data only makes the issue more pressing. Just a few years ago, 10 million records of data was considered a big deal. Today, the volume of data stored by enterprises can be in the range of billions and trillions of records.
Quick Take
Addressing Organizational Data Quality Issues
Figure 1: Data Release Cycle. Day 1: the PMO, functional managers, QA team and ETL/DB team prepare the data update, receive the data and apply ETL. Day 2: test the data in the QA environment, with functional data validation, and sign off. Day 3: test the data in the production environment (UAT), with functional data validation, sign off, and release to production.
Our experimentation with automated data validation with a U.S.-based client revealed that by mechanizing the data validation process, data quality issues can be completely eradicated. Automating the data validation process brings the following value additions:
• Provides a data validation platform that is workable and sustainable for the long term.
• Delivers a tailored, project-specific framework for data quality testing.
• Reduces the turnaround time of each test execution cycle.
• Simplifies the test management process by simplifying the test approach.
• Increases test coverage along with greater accuracy of validation.
The Data Release Cycle and Internal Challenges
This client releases product data sets on a periodic basis, typically monthly. As a result, the data volume in each release is huge. One product suite has seven different products under its umbrella, and data is released in three phases per month. Each phase has more than 50 million records to be processed from each product within the suite. Due to manual testing, the turnaround time for each phase used to be three to five days, depending on the number of tasks involved in each phase.

Production release of the quality data is a huge undertaking by the QE&A team, and it was a big challenge to make business owners happy by reducing time-to-market (i.e., the time from processing the data once it is received to releasing it to the market). By using various automation methods, we were able to reduce time-to-market from between three and five days to between one and three days (see Figure 1).
Reasons for the accretion of voluminous data include:
• Executive management's need to focus on data-driven decision-making by using business intelligence tools.
• Company-wide infrastructural changes such as data center migrations.
• Mergers and acquisitions among data-producing companies.
• Business owners' need to gain greater insight into streamlining production, reducing time-to-market and increasing product quality.

If the data is abundant and comes from multiple sources, there is a chance junk data is present. The odds are there is also excessive duplication, along with null sets and redundant data in the assortment. And due to mishandling, data can be lost.
However, organizations must overcome these challenges by having appropriate solutions in place to avoid credibility issues. Thus, for data warehousing and migration initiatives, data validation plays a vital role in ensuring overall operational effectiveness. But operational improvements are never without their challenges, including:
• Data validation is significantly different from conventional ways of testing. It requires more advanced scripting skills across multiple SQL servers such as Microsoft SQL Server 2008, Sybase IQ, Vertica, Netezza, etc.
• Heterogeneity in the data sources leads to mishandling of the interrelationships between multiple data source formats.
• During application upgrades, it must be ensured that older application repository data is the same as the data in the new repository.
• SQL query execution is tedious and cumbersome because of the repetitious execution of queries.
• Test scenarios can be missed due to manual execution of queries.
• Total accuracy may not always be possible.
• The time taken for execution varies from one person to another.
• Strict supervision is required with each test.
• The ETL process entails numerous stages; it can be difficult to adopt a testing schedule given the manual effort required.
• The quality assurance team needs progressive elaboration (i.e., continuous improvement of key processes) to standardize the process due to complex architectures and multilayered designs.
A Business-Driven Approach to Data Validation
To meet the business demand for data validation, we have developed a surefire and comprehensive solution that can be utilized in various areas such as data warehousing, data extraction, transformation, loading, database testing and flat-file validation.

The Informatica tool that is used for the ETL process can also be used as a validation tool to verify the business rules associated with the data. This tool has the capability to significantly reduce manual effort and increase ETL productivity by lowering costs, thereby improving the bottom line.
Our Data Validation Procedures as a Framework
There are four methods required to implement a one-stop solution for addressing data quality issues (see Figure 2).

Figure 2: Data Validation Methods. The four methods are Informatica data validation, database stored procedures, macros and Selenium.
Each method has its own adoption procedures. High-level details include the following:
Informatica Data Validation
The following activities are required to create an Informatica data validation framework (see Figure 3):
• Accrue business rules from product/business owners based on their expectations.
• Convert business rules into test scenarios and test cases.
• Derive the expected results of each test case associated with each scenario.
• Write a SQL query for each of the test cases.
• Update the SQL test cases in input files (test case basic info, SQL query).
• Create Informatica workflows to execute the queries and update the results in the respective SQL tables.
• Trigger Informatica workflows for executing jobs and send e-mail notifications with validation results.
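The paper implements this with Informatica workflows; as a hedged stand-in, the following Python sketch shows the same shape of the framework: read SQL test cases from an input file, execute them, update a results table and e-mail the validation results. All file, table and mail-server names here are assumptions for illustration, not the paper's actual configuration.

```python
# validation_harness.py -- a stand-in sketch of the workflow above (the paper
# uses Informatica workflows; file, table and SMTP details are hypothetical).
import csv
import smtplib
import sqlite3
from email.message import EmailMessage

def run_test_cases(conn, input_file):
    """Each input row holds: test case id, SQL query, expected scalar result."""
    results = []
    with open(input_file, newline="") as f:
        for case in csv.DictReader(f):
            actual = conn.execute(case["sql"]).fetchone()[0]
            status = "PASS" if str(actual) == case["expected"] else "FAIL"
            results.append((case["id"], status))
    # Update the results in a results table, as the workflow step describes.
    conn.executemany("INSERT INTO test_results(test_id, status) VALUES (?, ?)", results)
    conn.commit()
    return results

def email_results(results, to_addr="qa-team@example.com"):
    msg = EmailMessage()
    msg["Subject"] = "Data validation results"
    msg["To"] = to_addr
    msg.set_content("\n".join(f"{tid}: {status}" for tid, status in results))
    with smtplib.SMTP("localhost") as smtp:  # assumes a local mail relay
        smtp.send_message(msg)

if __name__ == "__main__":
    conn = sqlite3.connect("warehouse.db")
    conn.execute("CREATE TABLE IF NOT EXISTS test_results(test_id TEXT, status TEXT)")
    email_results(run_test_cases(conn, "test_cases.csv"))
```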
Validate Comprehensive Data with Stored Procedures
The following steps are required for data validation using stored procedures (see Figure 4):
• Prepare validation test scenarios.
• Convert test scenarios into test cases.
• Derive the expected results for all test cases.
• Write stored procedure-compatible SQL queries that represent each test case.
• Compile all SQL queries as a package or test build.
• Store all validation Transact-SQL statements in a single execution plan, called a "stored procedure."
• Execute the stored procedure whenever any data validation is carried out.
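A minimal sketch of what this can look like, assuming a reachable SQL Server instance: the validation queries are compiled into one stored procedure and executed on demand from Python. The DSN, schema and checks are illustrative assumptions, not the paper's actual test build.

```python
# stored_proc_sketch.py -- packaging validation queries into a single stored
# procedure and executing it on demand. Assumes a reachable SQL Server DSN;
# the checks inside the procedure are illustrative only.
import pyodbc

CREATE_PROC = """
CREATE OR ALTER PROCEDURE qa.run_validations AS
BEGIN
    -- Each SELECT is one compiled test case; results land in one results table.
    INSERT INTO qa.test_results (test_id, status)
    SELECT 'Test_01_Accuracy',
           CASE WHEN COUNT(*) = 0 THEN 'Pass' ELSE 'Fail' END
    FROM fact_sales WHERE amount < 0;

    INSERT INTO qa.test_results (test_id, status)
    SELECT 'Test_02_MinPeriod',
           CASE WHEN MIN(period_id) = 1 THEN 'Pass' ELSE 'Fail' END
    FROM fact_sales;
END
"""

conn = pyodbc.connect("DSN=warehouse;UID=qa;PWD=secret")  # hypothetical DSN
cursor = conn.cursor()
cursor.execute(CREATE_PROC)                 # compile the whole test build once
cursor.execute("EXEC qa.run_validations")   # run it whenever data is validated
conn.commit()
for test_id, status in cursor.execute("SELECT test_id, status FROM qa.test_results"):
    print(test_id, status)
```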
Figure 3: A Data Validation Framework: Pictorial View. Source files flow through transformation rules into a staging area (SQL Server), are exported as flat files and loaded via ETL into the Sybase IQ warehouse, and are published to production for external and internal end users via the Web. In parallel, QA validates test cases with expected results against QA database tables, updates the pass/fail test results, and e-mails a test case validation report.
Figure 4: Validating with Stored Procedures. A stored procedure executes the validation test build against the production fact table in Sybase IQ and records a pass/fail result per test case (e.g., Test_01 Accuracy: Pass; Test_02 Min Period: Fail; Test_03 Max Period: Pass; Test_04 Time Check: Pass; Test_05 Data Period Check: Pass; Test_06 Change Values: Pass).
One-to-One Data Comparison Using Macros
The following activities are required to handle data validation (see Figure 5):
• Prepare validation test scenarios.
• Convert test scenarios into test cases.
• Derive a list of expected results for each test scenario.
• Specify input parameters for a given test scenario, if any.
• Write a macro to carry out the validation work for one-to-one data comparisons.

Figure 5: Applying Macro Magic to Data Validation. A macro added to the quality assurance publication files executes against the marketing files and the Sybase IQ data warehouse, comparing DATA 1 with DATA 2 and reporting pass or fail.
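The paper's comparison is done with a spreadsheet macro; as an approximation, here is a small Python sketch of the same one-to-one comparison between two extracts, with hypothetical file names.

```python
# one_to_one_compare.py -- a Python sketch of the macro's one-to-one data
# comparison between two extracts (file and key names are examples).
import csv

def read_rows(path, key_field):
    """Index each row of a delimited extract by its key field."""
    with open(path, newline="") as f:
        return {row[key_field]: row for row in csv.DictReader(f)}

def one_to_one_compare(file_a, file_b, key_field="id"):
    a, b = read_rows(file_a, key_field), read_rows(file_b, key_field)
    mismatches = [k for k in a.keys() & b.keys() if a[k] != b[k]]
    only_a, only_b = a.keys() - b.keys(), b.keys() - a.keys()
    return "PASS" if not (mismatches or only_a or only_b) else "FAIL"

print(one_to_one_compare("data1.csv", "data2.csv"))  # DATA 1 vs. DATA 2
```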
Selenium Functional Testing Automation
The following are required to perform data validation (see Figure 6):
• Prepare validation test scenarios.
• Convert test scenarios into test cases.
• Derive an expected result for each test case.
• Specify input parameters for a given test scenario.
• Derive test configuration data for setting up the QA environment.
• Build a test suite that contains multiple test builds according to test scenarios.
• Have a framework containing multiple test suites.
• Execute the automation test suite per the validation requirement.
• Analyze the test results and share those results with project stakeholders.

Figure 6: Facilitating Functional Test Automation. Selenium automation performs functional validation against the Web-based application GUI used by production GUI users, reporting pass or fail.
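For flavor, here is a minimal sketch of one Selenium test case such a suite might contain, written with Selenium's Python bindings; the URL, locators and expected value are hypothetical.

```python
# selenium_validation.py -- one illustrative Selenium test case; the URL and
# element locators are hypothetical placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By

def test_product_count(expected_count: int) -> str:
    driver = webdriver.Chrome()  # assumes chromedriver is installed
    try:
        driver.get("https://qa.example.com/products")  # QA environment URL
        rows = driver.find_elements(By.CSS_SELECTOR, "table#products tr.data-row")
        return "PASS" if len(rows) == expected_count else "FAIL"
    finally:
        driver.quit()

if __name__ == "__main__":
    # Expected result derived from the test case definition.
    print("product count check:", test_product_count(expected_count=50))
```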
Salient Solution Features, Benefits Secured
The following features and benefits of our framework were reinforced by a recent client engagement (see sidebar).
Core Features
• Compatible with all database servers.
• Zero manual intervention for the execution of validation queries.
• 100% efficiency in validating larger-scale data.
• Reconciliation of production activities with the help of automation.
• Reduces the level of effort and resources required to perform ETL testing.
• Comprehensive test coverage, ensuring lower business risk and greater confidence in data quality.
• Remote scheduling of test activities.
[Figure 5: Applying Macro Magic to Data Validation. A macro compares marketing flat files (DATA 1) one-to-one against the Sybase IQ data warehouse (DATA 2) under quality assurance, flagging each comparison as pass or fail before publication.]
Benefits Reaped
• Test case validation results in user-friendly formats such as .csv, .xlsx and HTML.
• Validation results stored for future reference.
• Test cases and SQL scripts reusable for regression testing.
• No scope for human error.
• No supervision required while executing test cases.
• 100% accuracy in test case execution at all times.
• Easy maintenance of SQL scripts and related test cases.
• Consistent execution time across test case runs.
• 100% reliability of testing and its coverage.
The Bottom Line
As the digital age proceeds, it is increasingly important for organizations to refine their processes with reliable information and insight to drive business success. Hence, business data collected from various operational sources is cleansed and consolidated per the business requirement to separate signals from noise. This data is then stored in a protected environment for an extended period of time.
Fine-tuning this data helps facilitate performance management, as well as tactical and strategic decision-making, for business advantage. Well-organized business data enables and empowers business owners to make well-informed decisions, and those decisions can drive competitive advantage for an organization. On average, organizations lose $8.2 million annually due to poor data quality, according to industry research on the subject. A study by B2B research firm SiriusDecisions shows that by following data quality best practices, a company can boost its revenue by 66%.1 And market research by InformationWeek found that 46% of those surveyed believe data quality is a barrier that undermines business intelligence mandates.2
Hence, it is safe to assume that poor data quality is undercutting many enterprises, and few have taken the necessary steps to avoid jeopardizing their businesses. By implementing the types of data testing frameworks discussed above, companies can improve their processes by reducing the time taken for ETL. This, in turn, will dramatically reduce their time-to-market turnaround and support the management mantra of "under-promise and over-deliver." Moreover, organizations rarely need to spend extra money on these frameworks, since they run on existing infrastructure. This has a direct positive impact on a company's bottom line, as no additional overhead is required to hire new staff or add infrastructure.
[Figure 6: Facilitating Functional Test Automation. Selenium automation performs functional validation of the web-based GUI application used by production GUI users, flagging each check as pass or fail.]
Looking Ahead
In the Internet age, where data is considered a business currency, organizations must maximize the return on their data investments to maintain their competitive edge. Hence, data quality plays a pivotal role in strategic decision-making.
The impact of poor information quality on a business can be measured along four dimensions: increased costs, decreased revenues, decreased confidence and increased risk. It is therefore crucial for any organization to implement a robust solution, ideally one in which a company uses its own product to validate the quality and capabilities of that product; in other words, to adopt an "eat your own dog food" philosophy.
That said, any data-driven business must stay focused on data quality, as poor quality can quickly become the major bottleneck.
Quick Take: Fixing an Information Services Data Quality Issue
The client is a leading U.S.-based provider of financial services for real estate professionals, offering comprehensive data, analytical and related services. The powerful insight gained from these services provides the perspective necessary to identify, understand and act decisively on key business challenges.
Challenges Faced
The client faced major challenges in the end-to-end quality assurance of its data, as the majority of the company's test procedures were manual. Because of these manual methods, the turnaround time, or time-to-market, of its data was longer than that of its competitors. The client therefore wanted a long-term solution.
The Solution
Our QE&A team offered various solutions, focused on database and flat file validations. As explained above, database testing was automated using Informatica, complemented by conventional methods such as stored procedures and macros, which were used for validating the flat files.
Benefits
• 50% reduction in time-to-market.
• 100% test coverage.
• 82% automation capability.
• Significantly improved data quality.
Figure 7 illustrates the breakout of each method used and its contribution to the entire QE&A process.
[Figure 7: Contribution of each method to the overall QE&A process: Manual 18%, DB – Stored Procedure 38%, Selenium 11%, Macro 3%, DB – Informatica 30%.]
Footnotes
1 "Data Quality Best Practices Boost Revenue by 66 Percent," http://www.destinationcrm.com/Articles/CRM-News/Daily-News/Data-Quality-Best-Practices-Boost-Revenue-by-66-Percent-52324.aspx.
2 Douglas Henschen, "Research: 2012 BI and Information Management," http://reports.informationweek.com/abstract/81/8574/Business-Intelligence-and-Information-Management/research-2012-bi-and-information-management.html.