The document discusses InfoSphere Optim Test Data Management and its capabilities for managing test data across the information lifecycle. It highlights the risks of poor test data management, such as privacy exposure and inefficient processes. InfoSphere Optim provides capabilities such as masking sensitive data, extracting targeted test data subsets, and automating test data refresh. It also supports test data management for IMS databases on z/OS.
With the advent of Web 2.0 and Service Oriented Application architectures, file encapsulation is rapidly becoming the pervasive means for storing, distributing, and managing business data by policy. However, until very recently, a unified, scalable, lightweight, policy-based file management architecture was not available. This presentation explores how the file area network (FAN) leverages file virtualization, network-based policy enforcement, and globally distributed access, in an inclusive and open approach that not only "allows" but "calls for" multiple vendor technologies to interoperate.
Saksham Sarode - Building Effective Test Data Management in Distributed Envir... (TEST Huddle)
EuroSTAR Software Testing Conference 2010 presentation on Building Effective test Data Management in Distributed Environment by Saksham Sarode. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
Managing and streamlining test data is vitally important, and test data management remains a critical component of the testing life cycle for software and apps.
Test data management, or TDM, provisions test data during the various phases of a software development life cycle. The data consumed, tested, and modified is put to use throughout the complete software cycle.
The evolution of test data management into a comprehensive service ensures that the need for relevant data during the various phases of the software life cycle is met, enabling faster go-to-market times.
Get More Insight at:
http://softwaretestingsolution.com/blog/test-data-management-managed-service-software-quality-assurance/
How Can Test Data Management Overcome Mainframe Testing Challenges? (CA Technologies)
How a sophisticated, end-to-end test data management strategy can be used to reduce infrastructure costs and mitigate risk while providing testers with all the data they need, when they need it.
For more information, please visit http://cainc.to/Nv2VOe
Test Data Management 101—Featuring a Tour of CA Test Data Manager (Formerly G... (CA Technologies)
Ever wonder exactly how Test Data Manager (TDM) works and how you can maximize your TDM investment? In this session we will cover:
- What value does TDM provide organizations?
- What can CA Test Data Manager do to help?
This session will teach how you can maximize your investment.
For more information, please visit http://cainc.to/Nv2VOe
While companies make use of oceans of information and derive profits from the data they store, they suffer from it at the same time. Clearly, no company can cope with data growth just by increasing hardware capacity; companies need smart solutions for this inevitable growth.
Narrowing the subject to testing, we observe that IT organizations are focusing deeply on collecting and organizing data for their testing processes. The ability to control this process and use test data well has become a key competitive advantage for these organizations, because the benefits of such mechanisms outweigh their trade-offs. Ultimately, test data management plays a vital role in any software development project, and unstructured processes may lead organizations to:
• Do inadequate testing (poor quality of product)
• Be unresponsive (increased time-to-market)
• Do redundant operations and rework (increased costs)
• Be non-compliant with regulatory norms (especially on data confidentiality and usage)
No matter which approach you choose to tackle the challenges of test data management, the basic requirements for success are a combination of good test cases and good test data, along with the proper use of tools that help you automate the extraction, transformation, and governance of the data being used.
Test Data Management
One of the most important factors determining the effectiveness of software testing is the test data set used. Running tests with a narrow test data set leads to:
- reduced test coverage
- incorrect test results
- unexpected defects in production
There are two critical success factors for building test data sets with the right data at an optimum level.
1- Using internationally recognized test techniques to select, from millions of test records, a test data set that achieves a given level of test coverage (a small code sketch follows this list):
- Equivalence partitioning test technique
- Boundary value test technique
- Pairwise test technique
- Combinatorial test technique
- ....
2- Choosing the right test data management tool:
- Tools that create test data by masking production data
- Tools that generate random test data matching the entered data types
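As a rough illustration of the techniques above, here is a minimal Python sketch (the numeric field and its 0-120 range are illustrative assumptions, not from Keytorc's material) that derives equivalence-class and boundary-value inputs for one field:

import sys

# Minimal sketch: derive equivalence-class and boundary-value test inputs
# for a numeric field. The range (0-120, e.g. an "age" field) is assumed.

def equivalence_classes(lo, hi):
    """One representative per class: below range, in range, above range."""
    return [lo - 1, (lo + hi) // 2, hi + 1]

def boundary_values(lo, hi):
    """Values at and adjacent to each boundary."""
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

if __name__ == "__main__":
    lo, hi = 0, 120
    print("equivalence:", equivalence_classes(lo, hi))  # [-1, 60, 121]
    print("boundary:   ", boundary_values(lo, hi))      # [-1, 0, 1, 119, 120, 121]

Pairwise and combinatorial selection work similarly but operate across several fields at once, picking a small set of rows that still covers every pair (or tuple) of values.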
For more information about test data management:
Click to see the presentation describing our approach to test data management: http://www.slideshare.net/keytorc
To get in touch with Keytorc's expert test data management team: www.keytorc.com or blogs.keytorc.com
This presentation provides a technical overview of IBM Optim and its benefits.
Three areas of focus:
Mitigate Risk: Much of the "data related" risk an organization carries involves keeping sensitive data private, preventing data breaches, and safely storing and retiring data that is no longer required on the online systems. Companies must comply with regulations and policies, and a lack of proper data protection can lead to penalties, including damage to a company's reputation.
Deal with Data Growth: Another challenge is dealing with the explosive data growth of many applications. Without properly managing data volume, companies will see the impact on the performance of their systems over time. This is particularly a problem when service level agreements (SLAs) are in place that mandate set response times.
Control Costs: The costs of managing data span from the initial design of the data structure through all lifecycle phases, until the data is ultimately retired. IT staff are under constant pressure to deliver more for less. Major costs of managing data include storage hardware, storage management (archiving, storing, retrieving, etc.), and protecting the data per compliance regulations.
Qiagram is a collaborative visual data exploration environment that enables investigator-initiated, hypothesis-driven analysis, allowing business users as well as IT professionals to easily ask complex questions of complex data sets.
Anzo Smart Data Lake 4.0 - a Data Lake Platform for the Enterprise Informatio... (Cambridge Semantics)
Only with a rich and interactive semantic layer can your data and analytics stack deliver true on-demand access to data, answers and insights - weaving data together from across the enterprise into an information fabric. In this webinar we introduce Anzo Smart Data Lake 4.0, which provides that rich and interactive semantic layer to your data.
Organizational compliance and security SQL 2012-2019 (George Walters)
The compliance and security aspects of SQL Server, and the greater platform, are covered here. This goes through CTP 2.3 of SQL 2019. I start with the history of security in SQL Server, from the changes in SQL 2005, then through SQL 2008, 2008 R2, 2012, 2014, 2016, and 2017. We cover the requirements for installation, auditing, encryption, compliance, and so forth.
Modern management of data pipelines made easier (CloverDX)
From data discovery, classification and cataloging to governance, anonymization and better management of data over its lifetime.
- How to make data discovery and classification easier and faster at scale with smart algorithms
- Best practices for standardization of data structures and semantics across organizations
- What’s driving the paradigm shift from development to declaration of data pipelines
- How to meet regulatory and audit requirements more easily with better transparency of data processes
You might think you know what’s in your data, but at enterprise scale, it’s almost impossible. Just because you have a column called ‘last name’, that’s not necessarily what it contains.
Automating data discovery by using data matching algorithms to identify and classify all your data – wherever it sits – can make the process vastly more efficient, as well as helping identify all the PII (Personally Identifiable Information) across your organization.
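To make the idea concrete, here is a minimal Python sketch of one simple form of such matching: sample a column's values and flag the column when most values fit a known PII pattern. The patterns, the 80% threshold, and the example data are illustrative assumptions, not CloverDX's actual algorithms.

import re

# Pattern-based PII classification: flag a column when most of its sampled
# values match a known PII pattern. Patterns and threshold are illustrative.
PII_PATTERNS = {
    "ssn":   re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "phone": re.compile(r"^\(\d{3}\) \d{3}-\d{4}$"),
}

def classify_column(values, threshold=0.8):
    for label, pattern in PII_PATTERNS.items():
        hits = sum(1 for v in values if pattern.match(str(v)))
        if values and hits / len(values) >= threshold:
            return label
    return None

# A column named 'last_name' that actually holds email addresses:
print(classify_column(["a@x.com", "b@y.org", "c@z.net"]))  # -> 'email'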
These slides originally accompanied a webinar that described some ways in which you can better manage modern data pipelines. You can watch the full video here: https://www.cloverdx.com/webinars/modern-management-of-data-pipelines-made-easier
Enterprise Data and Analytics Architecture Overview for Electric Utility (Prajesh Bhattacharya)
How would you go about creating an enterprise data and analytics architecture for an electric utility that 1) will be relevant in the long run, 2) will be easy to implement, and 3) will start bringing value to the organization fairly quickly? What will be the components? Who will be the users? The operation of electric utilities will change significantly by 2025. How will you future-proof the architecture?
Big Data – Shining the Light on Enterprise Dark Data (Hitachi Vantara)
Content stored for a business purpose often lacks the structure or metadata required to determine its original purpose. With Hitachi Data Discovery Suite and Hitachi Content Platform, businesses can uncover dark data that could be leveraged for better business insight, and surface compliance issues before they become business risks. View this session and learn: What is enterprise dark data? How can enterprise dark data impact business decisions? How can you augment your underutilized data and deliver more value? How can you decrease the headaches and challenges created by dark data? For more information please visit: http://www.hds.com/products/file-and-content/
Kelly Technologies is a data science training institute in Hyderabad. We deliver our training through industry experts with real-time experience, so our students learn the technology currently used in the market.
Organizational compliance and security in Microsoft SQL 2012-2016 (George Walters)
Organizational compliance and security in Microsoft SQL 2012-2016. This covers encryption at rest and in transit, securing data, application design considerations, Audit, and T-SQL to help you get compliant.
Standardization of "Safety Drug" Reporting Applications (halleyzand)
Proposes an information technology infrastructure model that gives drug providers' IT organizations a strategic perspective on how to computerize their safety drug reporting activity. It introduces software development concepts, methods, techniques, and tools for collecting data from multiple platforms and generating reports from them by scripting queries.
DevOps and Testing slides at DASA Connect (Kari Kakkonen)
Slides by me and Rik Marselis from the DASA Connect conference on 30.5.2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps looks like. We closed with a lovely workshop in which participants explored different ways to think about quality and testing in different parts of the DevOps infinity loop.
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024 (Tobias Schneck)
As AI technology pushes into IT, I found myself wondering, as an "infrastructure container Kubernetes guy", how this fancy AI technology gets managed from an infrastructure operations view. Is it possible to apply our lovely cloud-native principles as well? What benefits could the two technologies bring to each other?
Let me take these questions and lead you on a short journey through existing deployment models and use cases for AI software. Using practical examples, we discuss what cloud/on-premise strategy we may need to apply AI to our own infrastructure and make it work from an enterprise perspective. I give an overview of infrastructure requirements and technologies, and of what could benefit or limit your AI use cases in an enterprise environment. An interactive demo provides some insights into the approaches I have already gotten working for real.
Generating a custom Ruby SDK for your web service or Rails API using Smithy (g2nightmarescribd)
Have you ever wanted a Ruby client API to communicate with your web service? Smithy is a protocol-agnostic language for defining services and SDKs. Smithy Ruby is an implementation of Smithy that generates a Ruby SDK using a Smithy model. In this talk, we will explore Smithy and Smithy Ruby to learn how to generate custom feature-rich SDKs that can communicate with any web service, such as a Rails JSON API.
UiPath Test Automation using UiPath Test Suite series, part 4 (DianaGray10)
Welcome to the UiPath Test Automation using UiPath Test Suite series, part 4. In this session, we will cover a Test Manager overview along with the SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimizing testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Epistemic Interaction - tuning interfaces to provide information for AI support (Alan Dix)
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Key Trends Shaping the Future of Infrastructure.pdf (Cheryl Hung)
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud, and open source: exploring how these areas are likely to mature and develop over the short and long term, and considering how organisations can position themselves to adapt and thrive.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo... (James Anderson)
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. A constant focus on speed to release software to market, combined with traditionally slow and manual security checks, has created gaps in continuous security, an important piece of the software supply chain. Today, organizations feel more susceptible to external and internal cyber threats due to the vast attack surface of their application supply chains and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for technology and making things work, along with a knack for helping others understand how things work. He brings around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Mission to Decommission: Importance of Decommissioning Products to Increase E...
Optim test data management for IMS 2011
1. InfoSphere Optim Test Data Management Solution – IMS Focus. Peter Costigan – Product Line Manager, Optim Solutions. 9/28/2011
3. Mastering information across the Information Supply Chain (diagram: transactional & collaborative applications, business analytics applications, and external information sources feed data, content, and streaming information, which is managed, integrated, and analyzed across master data, content, data warehouses, cubes, streams, big data, and content analytics; information governance spans the chain, covering quality, security & privacy, lifecycle, and standards)
4. Requirements to manage data across its lifecycle (diagram; lifecycle stages: Discover & Define, Develop & Test, Optimize, Archive & Access, Consolidate & Retire; tasks include: discover where data resides, classify & define data and relationships, define policies, develop database structures & code, create & refresh test data, validate test results, manage data growth, enhance performance, report & retrieve archived data, integrate into a single data source, move only the needed information, enable compliance with retention & e-discovery; framed by the Information Governance Core Disciplines: Lifecycle Management)
7. Optim Captures Complete Business Objects: business data is related across a wide variety of data sources
11. Sensitive Production Data: What's the risk?
- Hundreds of thousands of secret reports on the US wars in Iraq and Afghanistan published on WikiLeaks. December 2010: a private in the US military downloaded top-secret military documents and passed them to journalists for publication, putting US national security at risk as well as the lives of those named in the reports.
- Unprotected test data sent to and used by test/development teams as well as third-party consultants. February 2009: an FAA server used for application development and testing was breached, exposing the personally identifiable information of 45,000+ employees.
- SQL injection is fast becoming one of the biggest and most high-profile web security threats. April 2011: a mass SQL injection attack that initially compromised 28,000 websites shows no sign of slowing down. Known as LizaMoon, this malicious code is after anything stored in a database.
- Hackers obtained personal information on 70 million subscribers. April 2011: malicious outsiders stole names, addresses (city, state, zip), countries, email addresses, birth dates, PlayStation Network/Qriocity passwords and logins, handles/PSN online IDs, and possibly credit card numbers from 70 million Sony PlayStation users.
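Slide 11 names SQL injection among the biggest threats; as general background (not from the deck itself), the standard mitigation is to bind user input as parameters rather than concatenate it into SQL. A minimal Python/sqlite3 sketch:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

user_input = "alice' OR '1'='1"  # a classic injection payload

# Vulnerable pattern: string concatenation lets the payload rewrite the query.
# conn.execute("SELECT * FROM users WHERE name = '" + user_input + "'")

# Safe pattern: a bound parameter is treated as data, never as SQL.
rows = conn.execute("SELECT * FROM users WHERE name = ?", (user_input,))
print(rows.fetchall())  # [] - the payload matches no real name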
19. Requirements to manage data across its lifecycle (repeat of the slide 4 lifecycle framework diagram)
21. InfoSphere Discovery Speeds Understanding Data
The Discovery Engine analyzes data values to automatically discover the columns that relate rows across data sources, and the columns which contain sensitive data. IBM InfoSphere Discovery hit rate: 98%.
Table 1:
Row | Member | SS # | Age | Phone | Sex
1 | 595846226 | 123-45-6789 | 15 | (123) 456-7890 | M
2 | 567472596 | 138-27-1604 | 8 | (138) 271-6037 | F
3 | 540450092 | 154-86-4196 | 22 | (154) 864-1961 | M
4 | 514714372 | 173-44-7900 | 55 | (173) 447-8996 | F
5 | 490204164 | 194-26-1648 | 4 | (194) 261-6476 | F
6 | 466861109 | 217-57-3046 | 66 | (217) 573-0453 | M
987,623 | 444629628 | 243-68-1812 | 25 | (243) 681-8107 | F
987,624 | 423456789 | 272-92-3629 | 87 | (272) 923-6280 | M
Table 25:
ID | Demo
595846226 | 0
567472596 | 1
540450091 | 2
514714372 | 3
490204164 | 1
466861109 | 0
444629628 | 3
423456789 | 2
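The slide's idea can be sketched in a few lines: for each pair of columns across two tables, compute the fraction of one column's values that also appear in the other (the "hit rate"); high overlap suggests a join key. The data below is a tiny illustrative excerpt, not the Discovery Engine's actual method:

# Value-overlap analysis: high overlap between two columns suggests they
# relate rows across data sources. Data and threshold are illustrative.
def hit_rate(col_a, col_b):
    a, b = set(col_a), set(col_b)
    return len(a & b) / len(a) if a else 0.0

table1 = {"member": [595846226, 567472596, 540450092],
          "age": [15, 8, 22]}
table25 = {"id": [595846226, 567472596, 540450091],
           "demo": [0, 1, 2]}

for name_a, col_a in table1.items():
    for name_b, col_b in table25.items():
        rate = hit_rate(col_a, col_b)
        if rate >= 0.5:
            print(f"candidate key: {name_a} <-> {name_b} ({rate:.0%})")
# -> candidate key: member <-> id (67%)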
Editor's Notes
This presentation is the Essentials of Test Data Management part of the InfoSphere Information Lifecycle Management Solutions
We are going to cover the following:
- Information Governance: Review
- What is Test Data Management
- Role of Test Data Management in the Testing Discipline
- Risks and Challenges of Poor Test Data Management
- Best Practices in Test Data Management
- Data Privacy Concerns with Test Data
- IBM InfoSphere Optim Test Data Management Solution
- Conclusion
You have seen this slide in the Information Lifecycle Management presentation. There are typically hundreds or even thousands of different systems throughout an organization. Information can come in from many places (transaction systems, operational systems, document repositories, external information sources) and in many formats (data, content, streaming). Wherever it comes from, there are often meaningful relationships between the various sources of data. We manage all this information in our systems, integrate it to build warehouses, master the data to get single views, and analyze it to make business decisions. This is a supply chain of information flowing throughout the organization. Integrating information, ensuring its quality, and interpreting it correctly are crucial to using it to make better decisions. Information must be turned into a trusted asset and governed to maintain its quality over its lifecycle.
We went through the requirements for information lifecycle management. Here we focus on Develop & Test: efficiently creating the test and development environments (and protecting the sensitive data within them), effectively validating test results, and quickly and securely deploying the application.
How are enterprises creating test data today? Mostly manually, or by cloning their entire production database to obtain a test database. The downside of cloning all of production is that you now have a data growth problem and consume significant storage. In addition, you have a privacy problem, because you have exposed sensitive data to the developers and testers using production data for testing.
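To illustrate the alternative to cloning, here is a minimal Python/sqlite3 sketch of subsetting: copy a slice of parent rows, then copy only the child rows that reference them, so the extract stays referentially intact. The schema and the two-customer slice are illustrative assumptions, not Optim's extract logic:

import sqlite3

# Subsetting instead of cloning: take a slice of customers, then only the
# orders that reference them, keeping referential integrity in the subset.
src = sqlite3.connect(":memory:")
src.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1,'a'),(2,'b'),(3,'c');
    INSERT INTO orders VALUES (10,1,9.5),(11,3,2.0),(12,2,4.2);
""")

subset_ids = [r[0] for r in src.execute(
    "SELECT id FROM customers ORDER BY id LIMIT 2")]
marks = ",".join("?" * len(subset_ids))
customers = src.execute(
    f"SELECT * FROM customers WHERE id IN ({marks})", subset_ids).fetchall()
orders = src.execute(
    f"SELECT * FROM orders WHERE customer_id IN ({marks})", subset_ids).fetchall()
print(customers)  # [(1, 'a'), (2, 'b')]
print(orders)     # [(10, 1, 9.5), (12, 2, 4.2)] - no orphaned child rows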
The business benefits of test data management:
- More time for testing: in many organizations, 30-40% of test script execution time is spent manufacturing new test data, and much of this is done manually today. Automating test data management reduces the time spent creating new data, allowing more tests to be executed.
- Reduce cost: maximize allocated disk space.
- Catch errors earlier in the testing cycle, because you now have realistic test data to test with; shift errors from production to test.
- Increase data quality.
- Enforce data ownership: test data management offers role-driven security to support segmentation of the development and testing teams.
- Reduce data dependencies across test sets: multiple test sets often use the same data, and different tests can negatively impact other tests using that data. Test data management allows the creation of an unlimited number of test data sets and can create unique IDs each time to ensure clean data is used when testing (sketched below).
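A tiny sketch of the unique-ID point from the last bullet: stamping each generated test data set with a distinct run prefix keeps parallel test sets from colliding on keys. The names here are illustrative, not a product feature:

import uuid

# Stamp every generated test data set with a unique run prefix so that
# concurrently used test sets never share row IDs.
def new_test_set(base_rows):
    run_id = uuid.uuid4().hex[:8]
    return [{**row, "id": f"{run_id}-{i}"} for i, row in enumerate(base_rows)]

rows = [{"name": "alice"}, {"name": "bob"}]
print(new_test_set(rows))  # e.g. [{'name': 'alice', 'id': '3f2c1a9b-0'}, ...]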
Why is it important to mask sensitive information? Some examples:
- Hackers obtained personal information on 70 million Sony PlayStation subscribers: http://online.wsj.com/article/SB10001424052748704587004576245131531712342.html
- 'LizaMoon' mass SQL injection attack escalates out of control: http://www.eweek.com/c/a/Security/LizaMoon-Mass-SQL-Injection-Attack-Escalates-Out-of-Control-378108/
- Federal Aviation Administration exposes unprotected test data to a third party: http://fcw.com/articles/2009/02/10/faa-data-breach.aspx
- Release of thousands of classified documents by WikiLeaks founder Julian Assange jeopardizes U.S. national security; the US Army launches an investigation: http://www.mcclatchydc.com/2010/12/23/105763/army-wikileaks-probe-could-lead.html
Ever since the inception of information technology (aka electronic data processing), it has been commonly accepted to allow a certain percentage of IT staff to have access to the production environment. These "trusted employees" were carefully screened and usually in close proximity to executive management, due to the confidentiality of critical sensitive corporate data. Originally this was a practical matter, voluntarily implemented by the enterprise. Over the years, the onslaught of international data privacy legislation has made it a compliance matter as well. Today's large multi-national enterprise faces numerous cross-border data privacy exposures. Additionally, with the deployment of third-party contractors, there is further separation from the traditional "trusted employee".
Data masking provides development teams with meaningful test data without exposing sensitive private information. Static data masking is the most common and most traditional approach: it extracts rows from production databases and conceals data values, which are then physically stored in the columns of the target test databases. Dynamic data masking (a term coined by Gartner) is an emerging technology that performs data obfuscation at the presentation layer in real time. Implemented at the SQL protocol layer, operating as a database listener, in-bound SQL from any application is inspected and dynamically rewritten to include the appropriate masking function. The result is data masking at the presentation layer without having to change the underlying database or the application source code.
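As a concrete illustration of static masking, here is a minimal Python sketch of one common approach, deterministic format-preserving substitution: the same input always yields the same fictitious value, so masked keys still join across tables. This is an assumption-level sketch, not a description of Optim's actual transformation library:

import hashlib

# Deterministically replace an SSN with a fictitious, format-preserving
# value. Identical inputs map to identical outputs, preserving joins.
def mask_ssn(ssn: str, secret: str = "demo-secret") -> str:
    digest = hashlib.sha256((secret + ssn).encode()).hexdigest()
    digits = "".join(c for c in digest if c.isdigit())[:9].ljust(9, "0")
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:]}"

print(mask_ssn("123-45-6789"))                             # same output every run
print(mask_ssn("123-45-6789") == mask_ssn("123-45-6789"))  # True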
We went through the requirements for information lifecycle management. We are focusing on Develop & Test: efficiently creating the test and development environments (and protecting the sensitive data within them), effectively validating test results, and quickly and securely deploying the application.
Most companies are still struggling with the first step: understanding their complex, heterogeneous data landscapes for test data management, with a resulting impact on the overall quality of applications. Some of the challenges are knowing what data is needed for test cases, a lack of understanding of where data is located and how it is related, and limited understanding of the confidential data elements. Manual analysis and hand coding are cost prohibitive.
- Test data management allows development teams to accelerate testing activities on a project
- Test data management exploits production data while ensuring the security of confidential data
- Providing testers and developers with access to test data can improve operational efficiency and optimize resources on a project
- A comprehensive test data management solution is needed to minimize cost and shorten development cycles
You want to point customers to the InfoSphere Optim ibm.com page, solution sheet, whitepaper and case study on test data management.