Data Lifecycle
Management
© 2012 IBM Corporation
Information Management
Managing the Data Lifecycle from
Requirement to Retirement
InfoSphere Optim
What we'll discuss
• Challenges managing the lifecycle of application data
• What's at stake
• Leveraging an information governance approach
• Optimizing the data lifecycle: Discover & Define; Develop & Test; Optimize & Archive; Consolidate & Retire
• IBM InfoSphere solutions for data lifecycle management
Success requires governance across the "Information Supply Chain"
[Diagram: external information sources flow through Integrate, Manage (master data, content, data warehouses, big data, streaming information, cubes) and Analyze (business analytics, content analytics) layers to serve transactional & collaborative applications and business analytics applications; Govern spans quality, security & privacy, lifecycle, and standards across the whole chain.]
InfoSphere
Your trusted platform for managing trusted information
• Comprehensive – the most mature and comprehensive capabilities across all of IIG
• Integrated – integrated capabilities designed to address enterprise use cases
• Intelligent – prebuilt, automated, proactive
Use cases: transform enterprise business processes & applications with trusted information; deliver trusted information for data warehousing and business analytics; build and manage a single view; integrate & govern big data; make enterprise applications more efficient; consolidate and retire applications; secure enterprise data & ensure compliance.
InfoSphere is the market leader in every category of Information Integration and Governance.
Data lifecycle management challenges
"Forrester estimates that 75% of data stored in large OLTP applications is typically inactive."
– Noel Yuhanna, Forrester Research Inc., Your Enterprise Data Archiving Strategy, February 2011
• New application functionality to meet business needs is not deployed on schedule
• Disclosure of confidential data kept in test/development environments
• Application defects or database errors are discovered after deployment
• Increased operational and infrastructure costs impact the IT budget
Making applications more efficient remains a major concern
Data volume growth remains a major concern for most.
"What are the key database management issues you see as challenging your organization?"

Issue                                 Challenging   Not challenging   Don't know
Delivering improved performance           62%            35%              3%
Delivering higher availability            60%            37%              3%
High data volume growth                   59%            37%              4%
Increasing data management costs          58%            35%              7%
Database upgrades                         56%            41%              3%
Securing private data                     55%            42%              3%
Data integration issues                   53%            42%              5%
Database migrations                       50%            45%              5%
Lack of DBA resources                     49%            45%              6%
Too many security patches to deploy       47%            48%              5%

Base: 194 enterprise database management professionals
Source: November 2010 Global Database Management Systems Online Survey; Forrester Research, Inc.
The real organizational impact of ignoring application efficiency
High Capital Expenditures
• Increased capital expenditures for data storage – server CPUs, disk storage systems, database software, power, space, management…
Decreased Productivity
• Business users lose productivity because systems are not available
• Database administrators spend large amounts of time improving performance instead of adding new capabilities
Missed Service Level Agreements
• Extended times and failures when backing up databases for recovery efforts
• Inability to meet deadlines (end-of-quarter results, updates, upgrades)
Ad Hoc Test Data Deployment
• Lengthy setup and deployment of test environments
• Deploying applications that are only partially tested
Requirements for managing data across its lifecycle
Discover & Define → Develop & Test → Optimize & Archive → Consolidate & Retire
• Discover where data resides
• Classify & define data and relationships
• Define policies
• Develop & test database structures/code
• Create & refresh test data
• Capture & replay production workloads
• Manage data growth
• Enhance performance
• Move only the needed information
• Report & retrieve archived data
• Enable compliance with retention & e-discovery
• Integrate into a single data source
Information Governance Core Disciplines: Lifecycle Management
You can’t govern what you
don’t understand
• Complex data relationships
within and across sources
• Historical and reference data
for archiving
• Test data needed to satisfy test
cases
• Sensitive data identification
Discover & define business objects across heterogeneous databases & applications
DBA view: referentially-intact subsets of data across related tables & applications, including metadata.
Business view: an overall historical "snapshot" of business activity, representing an application data record – e.g. payment, invoice, customer.
Federated access to related business objects across the enterprise: CRM on an Oracle database, ERP/Financials on DB2, custom inventory management on DB2.
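The "referentially-intact subset" idea above can be sketched in a few lines: starting from selected rows in a driving table, follow foreign keys so that every extracted row's parent rows come along too. This is a minimal illustration under assumed data, not Optim's implementation; the tables, foreign-key map, and starting selection are all hypothetical.

```python
# Sketch: extract a referentially intact subset of a "business object".
# Starting from selected rows in a driving table, follow foreign keys so
# every child row's parent is included. Schema and FK map are hypothetical.

tables = {
    "customer": [{"id": 1, "name": "Acme"}, {"id": 2, "name": "Globex"}],
    "invoice":  [{"id": 10, "customer_id": 1}, {"id": 11, "customer_id": 2}],
    "payment":  [{"id": 100, "invoice_id": 10}, {"id": 101, "invoice_id": 11}],
}
# child table -> (fk column, parent table)
fk_map = {"invoice": ("customer_id", "customer"),
          "payment": ("invoice_id", "invoice")}

def extract_subset(start_table, keep_ids):
    """Collect rows from start_table plus every parent row they reference."""
    subset = {t: [] for t in tables}

    def pull(table, ids):
        for row in tables[table]:
            if row["id"] in ids and row not in subset[table]:
                subset[table].append(row)
                if table in fk_map:  # recurse to the referenced parent table
                    col, parent = fk_map[table]
                    pull(parent, {row[col]})

    pull(start_table, keep_ids)
    return subset

subset = extract_subset("payment", {100})
# The subset holds payment 100, its invoice 10, and customer 1 - and nothing
# from the other business object, so referential integrity is preserved.
```

A real tool would read the FK graph from catalog metadata (or discover implied relationships), but the traversal principle is the same.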
Organizations continue to be challenged with building quality applications
• Time to Market – lack of realistic test data and inadequate environments
• Increasing Risk – mandatory to protect data and comply with regulations
• Increasing Costs – defects are caught late in the cycle
Organizations continue to be challenged with building quality applications
Increasing Risk
• 62% of companies use actual customer data to test applications (a)
• 45,000+ sensitive records exposed to a third party during testing (c)
Time to Market
• 37% are satisfied with the speed of software development (f)
• 30–50% of testing teams' time is spent setting up test environments instead of testing (b)
Increasing Costs
• $300 billion in annual costs of software-related downtime (d)
• 32% success rate for software projects (e)
a. The Ponemon Institute, The Insecurity of Test Data: The Unseen Crisis
b. NIST, Planning Report: The Economic Impacts of Inadequate Infrastructure for Software Testing
c. Federal Aviation Administration exposes unprotected test data to a third party, http://fcw.com/articles/2009/02/10/faa-data-breach.aspx
d. The Standish Group, Comparative Economic Normalization Technology Study, CHAOS Chronicles v12.3.9, June 30, 2008
e. The Standish Group, Chaos Report, April 2009
f. Forrester Research, "Corporate Software Development Fails To Satisfy On Speed Or Quality", 2005
The impact of inefficient test practices
Challenges in Test Data Management
“We did not want to create ‘fake’ or unrealistic test data. All test
data had to be created and set up manually. So it would take us a
month to setup test data for 30 or more accounts.”
-- Large US Healthcare Insurer
“We needed to improve efficiencies in development and testing
environments, as well as our production environments. We can
create realistic test environments that use much less disk space
than we would by cloning the production database.”
-- Allianz Seguros
“Our staff wanted to implement more efficient and cost-effective
testing processes that would shorten the time for creating and
managing multiple test environments.”
-- Cetelem
Employ effective test data management practices
[Diagram: a 2 TB production database or production clone is subset & masked into right-sized environments (e.g. 100 GB, 50 GB, 25 GB) for development, unit test, training and integration test, which are then compared and refreshed as needed.]
• Create targeted, right-sized test environments
• Substitute sensitive data with fictionalized yet contextually accurate data
• Easily refresh, reset and maintain test environments
• Compare data to pinpoint and resolve application defects faster
• Accelerate release schedules
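"Fictionalized yet contextually accurate" masking can be sketched as deterministic, format-preserving substitution: the same input always yields the same fictional value (so joins across tables stay consistent), and the replacement keeps the shape of the original so application validation still passes. This is a toy illustration under assumed field formats, not Optim's actual masking transforms.

```python
import hashlib

# Sketch: deterministic, format-preserving masking. The same input always
# produces the same fictional output, so masked values stay join-consistent
# across tables. The name list and digit rule are hypothetical examples.
FIRST_NAMES = ["Alex", "Sam", "Jordan", "Casey", "Robin"]

def _bucket(value, n):
    # Stable hash -> bucket index, so masking is repeatable across runs.
    return int(hashlib.sha256(value.encode()).hexdigest(), 16) % n

def mask_name(name):
    # Replace a real name with a fictional one, chosen deterministically.
    return FIRST_NAMES[_bucket(name, len(FIRST_NAMES))]

def mask_digits(value):
    # Replace each digit with a derived digit; punctuation and length
    # survive, so e.g. "123-45-6789" still looks like a national ID.
    digest = hashlib.sha256(value.encode()).hexdigest()
    digits = [c for c in digest if c.isdigit()]
    out, i = [], 0
    for ch in value:
        if ch.isdigit():
            out.append(digits[i % len(digits)])
            i += 1
        else:
            out.append(ch)
    return "".join(out)

masked = mask_digits("123-45-6789")
# Format is preserved: 11 characters with dashes in the same positions.
```

Production masking tools add lookup tables of realistic values, referential propagation, and per-column policies, but the preserve-the-format principle shown here is the core requirement.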
Improve application quality and delivery efficiency with the InfoSphere Optim Test Data Management Solution
• Understand what test data is needed for test cases
• Create "right-sized" test data by subsetting
• Ensure masked data is contextually appropriate to the data it replaced, so as not to impede testing
• Let developers and testers easily refresh & maintain test environments
• Automate test result comparisons to identify hidden errors
• Support custom & packaged ERP applications in heterogeneous environments
Benefits
• Speed Delivery – refresh test data, speeding testing and application delivery
• Reduce Risk – mask sensitive information for compliance and protection
• Reduce Cost – automate creation of realistic "right-sized" test data to reduce the size of test environments
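The "automate test result comparisons" idea above can be sketched as a keyed diff of two result sets: match rows by primary key, then report what was added, removed, or changed. A simplified illustration with hypothetical data and key column, not the product's comparison engine.

```python
# Sketch: compare a baseline query result set against a rerun, keyed by
# primary key, and report rows added, removed, or changed - the kind of
# "hidden error" a manual spot-check misses. Data and key are hypothetical.

def compare_results(before, after, key="id"):
    b = {row[key]: row for row in before}
    a = {row[key]: row for row in after}
    return {
        "added":   sorted(a.keys() - b.keys()),
        "removed": sorted(b.keys() - a.keys()),
        "changed": sorted(k for k in a.keys() & b.keys() if a[k] != b[k]),
    }

baseline = [{"id": 1, "total": 100}, {"id": 2, "total": 250}]
rerun    = [{"id": 1, "total": 100}, {"id": 2, "total": 260}, {"id": 3, "total": 75}]

diff = compare_results(baseline, rerun)
# -> {'added': [3], 'removed': [], 'changed': [2]}
```

Row 2's changed total and the unexpected row 3 would both pass a test that only checks "the query returned rows"; a keyed diff surfaces them immediately.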
Organizations continue to be challenged with planning enterprise changes
[Diagram: channels (direct, call centers, internet, business partners) and business units (finance, administration, sales & marketing, product management, consulting services, new business development) each rely on their own mix of databases, data warehouses, CRM, ERP, custom applications and core systems – so a single enterprise change touches many overlapping systems.]
Consequences of poorly planned enterprise changes
• Loss of customer satisfaction – inability to deliver required functionality; missed service level agreements; inability to process transactions
• Higher costs – time-consuming rollback of changes; inability to identify the source of the problem; manual modification of test scripts
• Delays due to extra testing cycles – no methods for fully testing database workloads
Mitigate challenges with realistic testing
• Coordinate analysts, developers, DBAs and testers – collaborate in context using production workloads
• Align teams on project plans – which requirements are in the development plan? Are we including data lifecycle events?
• Track the impact of data lifecycle events – what database changes are we making? Does the test team have realistic test cases built from production workloads?
• Assess impact with reports – transparency across teams; analysis of data-related changes to understand problems
Use actual production workloads for testing, NOT simulated user scenarios. Develop a broad approach to testing that covers the full data lifecycle, and include database workload record and replay as part of existing testing procedures.
IBM InfoSphere Optim Query Capture and Replay
Fully assess change impact before production deployment
• Limit laborious database test script creation and load simulators by leveraging actual production workloads for testing
• Deploy a single database testing solution across heterogeneous systems
• Accelerate project delivery by identifying and correcting potential problems sooner
• Develop more streamlined, accurate database tests
• Complement existing database tools to pinpoint database problems faster and improve performance
• Solve problems more rapidly with deep diagnostics and validation reports
Benefits
• Improve customer satisfaction – anticipate and correct potential problems sooner
• Reduce the cost of change – establish consistent database testing processes
• Meet SLAs – ensure well-tuned, high-performing workloads before deployment
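The capture-and-replay concept can be sketched as recording each statement together with its production result, then re-running the workload against the changed system and flagging divergences. This is a toy illustration of the idea, not the product's mechanism; the two "execute" callables and the workload stand in for real database connections and SQL.

```python
# Sketch: capture a workload (statement + observed result) against one
# system, replay it against another, and report divergences. The dict
# lookups stand in for real database connections; data is hypothetical.

def capture(execute, statements):
    """Run the workload once and record each statement with its result."""
    return [(stmt, execute(stmt)) for stmt in statements]

def replay(execute, workload):
    """Re-run a captured workload; return statements whose results differ."""
    return [stmt for stmt, expected in workload if execute(stmt) != expected]

prod_results = {"SELECT count(*) FROM orders": 42,
                "SELECT status FROM batch":    "ok"}
test_results = {"SELECT count(*) FROM orders": 42,
                "SELECT status FROM batch":    "failed"}

workload = capture(prod_results.get, list(prod_results))
divergent = replay(test_results.get, workload)
# -> ['SELECT status FROM batch']
```

A real replay solution also reproduces timing, concurrency, and transaction boundaries, but even this reduced form shows why replaying a production workload catches regressions that hand-written test scripts miss.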
Organizations have been increasingly challenged with successfully managing data growth
• Increasing Costs – buying more storage is not a "cheap" fix once you add the operational burden
• Poor Application Performance – business users & customers wait for application response; DBAs spend the majority of their time fixing performance issues
• Manage Risk & Compliance – the "keep everything" strategy can impact disaster recovery and data retention & disposal compliance
Organizations have been increasingly challenged with successfully managing data growth
• 3–10x – cost of managing storage relative to the cost to procure it (a)
• $1.1 billion – amount organizations will have spent on storage in 2011 (b)
• 80% – share of time DBAs spend weekly on disk capacity issues (c)
• 250 hours – time needed to run "daily" batch processes (d)
• 50% of firms retain structured data for 7+ years (e)
• 57% of firms use back-up for data retention needs (e)
(a) Merv Adrian, IT Market Strategies, "Data Growth Challenges Demand Proactive Data Management", November 2009
(b) IDC, "Worldwide Archival Storage Solutions 2011–2015 Forecast: Archiving Needs Thrive in an Information-Thirsty World", October 2011
(c) Simple-Talk, "Managing Data Growth in SQL Server", January 2010
(d) IBM client case study: Toshiba TEC Europe; archiving reduced batch process time by 75%
(e) IDC Quick Poll Survey, "Data Management for IT Optimization and Compliance", November 2011
Archive historical data for data growth management
[Diagram: current data remains in production; historical data moves to data archives holding historical and reference data; archived records can be selectively restored or retrieved, with universal access to application data via the application, ODBC/JDBC, XML and report writers.]
Data archiving is an intelligent process for moving inactive or infrequently accessed data that still has value, while providing the ability to search and retrieve the data.
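The archiving process in the definition above can be sketched with SQLite: move rows older than a retention cutoff from a production table into an archive table, keep the archive queryable, and selectively restore a single record on demand. A minimal sketch under an assumed schema and policy, not Optim's archive engine.

```python
import sqlite3

# Sketch: policy-driven archiving - move orders placed before a cutoff date
# from production into an archive table, then selectively restore one
# record. Schema, data, and the cutoff policy are hypothetical.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, placed TEXT, total REAL)")
db.execute("CREATE TABLE orders_archive (id INTEGER PRIMARY KEY, placed TEXT, total REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?, ?)",
               [(1, "2009-03-01", 10.0), (2, "2011-06-15", 20.0), (3, "2012-01-05", 30.0)])

CUTOFF = "2012-01-01"  # retention policy: archive anything placed before this

def archive_inactive(conn, cutoff):
    # Copy then delete inside one transaction so no row is lost or duplicated.
    with conn:
        conn.execute("INSERT INTO orders_archive SELECT * FROM orders WHERE placed < ?", (cutoff,))
        conn.execute("DELETE FROM orders WHERE placed < ?", (cutoff,))

def restore(conn, order_id):
    # Selective restore: bring one archived record back into production.
    with conn:
        conn.execute("INSERT INTO orders SELECT * FROM orders_archive WHERE id = ?", (order_id,))
        conn.execute("DELETE FROM orders_archive WHERE id = ?", (order_id,))

archive_inactive(db, CUTOFF)
# Production now holds only order 3; orders 1 and 2 remain queryable in the archive.
restore(db, 2)  # an audit request for order 2: restore just that record
```

The production table shrinks (helping backups and query performance) while archived records stay searchable and individually restorable, which is the essence of the process described above.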
When it's time to retire or consolidate applications
• The application portfolio has redundant systems acquired via mergers and acquisitions
• A line of business is divested; the application is no longer needed
• Legacy technologies are not compatible with the current IT direction – old database and/or application versions are no longer supported by the manufacturer
• Required technical skills or application knowledge are no longer available
• Budget pressures – do more with less
In almost ALL cases, access to legacy data MUST be retained while the application and database are eliminated.
Retire redundant and legacy applications
[Diagram: before retirement, each user group runs its own application and database; after consolidation, an archive engine preserves the data in a single archive that users can still access.]
Preserve application data in its business context
• Capture all related data, including transaction details, reference data & associated metadata
• Capture any related reference data that may reside in other application databases
Retire out-of-date packaged applications as well as legacy custom applications
• Leverage out-of-box support for packaged applications to quickly identify & extract the complete business object
Shut down legacy systems without a replacement
• Provide fast and easy retrieval of data for research and reporting, as well as audits and e-discovery requests
Effectively archive and manage data growth with InfoSphere Optim
• Discover & identify data record types to archive across heterogeneous environments
• Intelligently archive data to improve application performance and support data retention
• Capture & store historical data in its original business context
• Define & maintain data retention policies consistently across the enterprise
• Ensure long-term, application-independent access to archived data via multiple access methods
• Support custom & packaged ERP applications in heterogeneous environments
• Leverage a comprehensive solution for application consolidation and retirement across your enterprise
Benefits
• Reduce Costs – reduce hardware, software, storage & maintenance costs of enterprise applications
• Improve Performance – improve application performance & streamline back-ups and upgrades
• Minimize Risk – support data retention regulations & safely retire legacy/redundant applications
What is the market saying?
"IBM's Optim product line led the database archiving and ILM segment in 2010 with a 52.3% share, and showed nearly 18% growth in 2010."
– IDC, Worldwide Database Development & Management Tools 2010 Vendor & Segment Analysis, December 2011
"Today, IBM continues to lead the industry with the most comprehensive data archiving solution and the largest installed base… IBM's customers spoke highly of the Optim solution's reliability and strong performance."
– Forrester Research, "Your Enterprise Data Archiving Strategy", N. Yuhanna, February 2011
"Organizations can realize benefits in the form of improved operational and capital cost savings, improved IT and end user efficiency, as well as higher levels of data protection and application performance [with InfoSphere Optim]."
– Forrester Research, Total Economic Impact Study for InfoSphere Optim, October 2009
Gartner key findings
"We believe the market for database archiving and application retirement is vibrant and dynamic, and will see continued solid growth over the next five years."
"Organizations are looking to database archiving vendors that offer packaged and custom application support in order to control storage growth, improve application performance, and support compliance, audit and e-discovery activities."
[Charts: vendor market share by 2010 revenue and by total number of customers.]
Source: Gartner, Inc., "Market Trends: Worldwide Database Archiving Market Continues Rapid Growth, 2011", S. Childs & A. Dayley, September 2011
IBM provides the expertise to manage the data lifecycle, improving application efficiency
"The top challenge for 43% of CFOs is improving governance, controls, and risk management."
– CFO Survey: Current State & Future Direction, IBM Business Consulting Services
• Reduce the cost of data storage, software and hardware
• Improve application efficiency and performance
• Reduce risk and support compliance with retention requirements
• Speed time to market and improve quality
Allianz Seguros case study: reduces record retrieval time from days to hours
The need:
Allianz Seguros often had at least three development and testing mainframe environments in use at the same time. The quality of these environments degraded quickly because they were used for multiple tests. Every few months, the development team had to use an in-house "subsetting" program to copy test data from its large application production environment, which comprised about 700 GB with tables containing over 200 million rows.
The solution:
IBM InfoSphere Optim's subsetting capabilities help create realistic test environments that preserve data integrity, even when combined with data masking techniques. Allianz also uses InfoSphere Optim to provide comprehensive archiving capabilities for managing data growth and faster, more efficient methods for archiving, accessing and retrieving archived records.
The benefits:
• Reduced the resources needed for application development and testing by creating smaller, more robust testing environments
• Applied proven data masking techniques to protect privacy and support compliance with LOPD regulations
• Reduced the time to retrieve archived insurance records from days to hours
Solution components:
• IBM InfoSphere Optim Data Growth Solution
• IBM InfoSphere Optim Data Masking Solution
• IBM InfoSphere Optim Test Data Management Solution
"We knew that Optim would provide the capabilities we needed to automate the process of retrieving archived records. So far, we have shortened the retrieval process from several days to a few hours."
– Xavier Mascaró, Senior DBA, Allianz Seguros
Arek Oy case study: deploys a pension earnings and accrual system in 30 months
The need:
Pension laws (TyEL) in Finland changed radically in 2007. In response, Arek Oy had to develop and deliver a tested and reliable pension earnings and accrual system within 30 months. Arek Oy had to protect confidential employee salary and pension information in multiple non-production (development and testing) environments. Failure to satisfy requirements would result in loss of customer goodwill and future business opportunities.
The solution:
Using IBM InfoSphere Optim subsetting capabilities rather than cloning large production databases made it possible for Arek Oy staff to create robust, realistic test databases that supported faster iterative testing cycles. In addition, InfoSphere Optim offered proven capabilities for performing complex data masking routines while preserving the integrity of the pension data for development and testing purposes.
The benefits:
• Improved development and testing efficiencies, enabling Arek Oy to promote faster deployment of new pension application functionality and enhancements
• Protected confidential data to strengthen public confidence and support TyEL compliance requirements
Solution components:
• IBM InfoSphere Optim Data Masking Solution
• IBM InfoSphere Optim Test Data Management Solution
"We see Optim as an integral part of our development solution set. Optim's data masking capabilities help ensure that we can protect privacy in our development and testing environments."
– Katri Savolainen, Project Manager, Arek Oy
Toshiba TEC Europe case study: reduces database size by 30%, resulting in a 75% increase in application availability
The need:
Proactively manage application data growth to support business expansion and the deployment of Oracle E-Business Suite across business units. Increase application availability by reducing the time to complete 19,000 daily batch processing jobs exceeding 250 hours. Integrate and consolidate data and processes with the other Toshiba European entities to improve service levels and operational efficiencies.
The solution:
IBM InfoSphere Optim provides comprehensive database archiving capabilities to address data growth issues and integrates with Oracle E-Business Suite to improve overall business processes. Toshiba TEC Europe implemented InfoSphere Optim to archive historical transactions, reducing batch processing time for 19,000 jobs from 250 hours to only 65 hours. As a result, Toshiba TEC was able to satisfy business unit requirements and support continued business growth.
The benefits:
• Managed continued data growth by archiving historical transactions
• Deployed Oracle E-Business Suite across all business units
• Reduced database size by 30%
• Increased application availability by 75% by archiving historical transactions
• Improved service levels and operations by ensuring access to current and historical transactions
Solution components:
• IBM InfoSphere Optim Data Growth Solution for Oracle E-Business Suite
Virginia Community College System (VCCS) case study: retains access to student data and deploys policy-driven archiving
The need:
The Virginia Community College System wanted an out-of-the-box archiving solution for PeopleSoft Enterprise Campus Solutions that would help manage data growth without expensive server upgrades, support compliance requirements, and reduce the time spent on performance tuning and related issues.
The solution:
Using IBM InfoSphere Optim software, VCCS can archive complete historical student records in batches for students who have graduated or been inactive for at least 10 years; access archived data for reporting and analysis; process requests for transcripts against archived student data without having to restore the data to the production environment; and selectively restore a complete record for a single student on demand.
The benefits:
• Effectively manages data growth to improve service levels
• Offers the flexibility to archive 10 or more years of inactive student data
• Enables staff to selectively restore student records as needed
• Lowered infrastructure costs by eliminating frequent, expensive server upgrades
Solution components:
• IBM InfoSphere Optim Data Growth Solution for PeopleSoft Enterprise Campus Solutions
"We anticipate that the initial payback from implementing Optim will be that we have successfully stopped that vicious cycle of running out and purchasing high-performance hardware, which can cost hundreds of thousands of dollars."
– Andy Clark, Technical Lead for HR/CS, Virginia Community College System
Source: VCCS case study approved for external use.
Learn more
• Product Family Webpage
• Solution Sheet: InfoSphere Optim Solutions for
Data Lifecycle Management
• Quick ROI: Self Service BVA for Data Growth
• Case Studies: InfoSphere Optim
Contact your IBM representative for more
information on InfoSphere Optim data lifecycle
management solutions!
Thank you
IBM InfoSphere Optim Solutions
[Diagram: across production, dev/test and archive environments, Optim discovers, understands and classifies data; captures, archives, retires and replays production data and SQL; and subsets, masks, compares and refreshes test data.]
Discover
• Accelerate data management projects and reduce risk by understanding complex data relationships within & across systems
Archive
• Reduce hardware, software, storage & maintenance costs for enterprise applications
• Improve application performance & streamline back-ups and upgrades
• Support data retention regulations & safely retire legacy/redundant applications
Test
• Reduce cost, reduce risk & speed application delivery by maintaining right-sized test environments
• Ensure compliance and privacy by masking
• Improve customer satisfaction, reduce the cost of change and meet SLAs by using production workloads for testing
Mask
• Ensure compliance by masking on demand
IBM InfoSphere Optim supports the heterogeneous enterprise
Single, scalable, heterogeneous information lifecycle management solution provides a central point to
deploy policies to extract, archive, subset, and protect application data records from creation to deletion
Discover | Test Data Management | Data Masking | Manage Data Growth | Application Retirement
Partner-delivered Solutions
InfoSphere Optim Packages
• InfoSphere Optim Data Privacy
• InfoSphere Optim Test Data Management (Core + SAP)
• InfoSphere Optim Archive
Each of the above is offered in Workgroup and Enterprise Editions.
• InfoSphere Optim Enterprise Edition for Oracle Applications
• InfoSphere Optim Query Capture & Replay
IBM InfoSphere Discovery
Accelerate project deployment by automating discovery of your distributed
data landscape.
Requirements
• Define business objects for archival and test data applications
• Discover data transformation rules and heterogeneous relationships
• Identify hidden sensitive data for privacy
Benefits
• Automation of manual activities accelerates time to value
• Business insight into data relationships reduces project risk
• Provides consistency across information agenda projects
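The core idea behind automated relationship discovery can be sketched in a few lines: profile the value overlap between columns of different tables to surface candidate foreign-key links. This is only an illustration of the technique, not the product's algorithm, and the table and column names are hypothetical.

```python
# Minimal sketch: propose candidate parent->child relationships by
# measuring how many child-column values appear in a parent column.
# Purely illustrative -- real discovery tools do this at scale, across
# heterogeneous sources, with far more sophisticated heuristics.

def candidate_keys(parent_rows, child_rows, threshold=0.95):
    """Return (parent_col, child_col) pairs where at least `threshold`
    of the child values appear in the parent column."""
    pairs = []
    parent_cols = list(parent_rows[0].keys()) if parent_rows else []
    child_cols = list(child_rows[0].keys()) if child_rows else []
    for p in parent_cols:
        p_values = {row[p] for row in parent_rows}
        for c in child_cols:
            c_values = [row[c] for row in child_rows]
            if not c_values:
                continue
            hits = sum(1 for v in c_values if v in p_values)
            if hits / len(c_values) >= threshold:
                pairs.append((p, c))
    return pairs

customers = [{"cust_id": 1, "name": "A"}, {"cust_id": 2, "name": "B"}]
orders = [{"order_id": 10, "cust_ref": 1}, {"order_id": 11, "cust_ref": 2}]
print(candidate_keys(customers, orders))  # → [('cust_id', 'cust_ref')]
```

A real engine would also weed out coincidental overlaps (e.g. two unrelated integer columns) using cardinality and data-type checks; the threshold parameter here is a stand-in for that tuning.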
IBM InfoSphere Optim Test Data Management Solution
Create “right-sized”, production-like environments for application testing.
Requirements
• Create referentially intact, “right-sized” test databases
• Automate test result comparisons to identify hidden errors
• Protect confidential data used in test, training & development
• Shorten iterative testing cycles and accelerate time to market
Benefits
• Deploy new functionality more quickly and with improved quality
• Easily refresh & maintain test environments
• Protect sensitive information from misuse & fraud with data masking
• Accelerate delivery of test data through refresh
(Diagram: a 2 TB production database or production clone is subset and
masked into much smaller copies — roughly 100 GB, 50 GB and 25 GB — for
development, unit test, training and integration test, which can then be
compared and refreshed.)
InfoSphere Optim TDM supports data on distributed platforms (LUW) and z/OS.
Out-of-the-box subset support for packaged ERP/CRM applications as well as
other solutions.
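The subsetting step pictured above — extracting a referentially intact slice of production — can be sketched minimally as follows. The customer/order tables are hypothetical; Optim itself derives such relationships from the data model rather than from hard-coded column names.

```python
# Minimal sketch of referentially intact subsetting: keep a slice of
# parent rows, then keep only the child rows that reference them, so no
# child row in the subset points at a missing parent.
# Table and column names are illustrative, not Optim's API.

def subset(customers, orders, keep_ids):
    kept_customers = [c for c in customers if c["cust_id"] in keep_ids]
    kept_orders = [o for o in orders if o["cust_id"] in keep_ids]
    return kept_customers, kept_orders

# 100 customers, 5 orders each (cust_id cycles 1..100).
customers = [{"cust_id": i} for i in range(1, 101)]
orders = [{"order_id": i, "cust_id": (i % 100) + 1} for i in range(1, 501)]

small_customers, small_orders = subset(customers, orders, keep_ids={1, 2, 3})
print(len(small_customers), len(small_orders))  # → 3 15

# Referential integrity holds: every order's customer is in the subset.
kept = {c["cust_id"] for c in small_customers}
assert all(o["cust_id"] in kept for o in small_orders)
```

In practice the traversal runs over an entire graph of related tables (and across databases), but the invariant is the same: the subset must be closed under the discovered relationships.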
IBM InfoSphere Optim Data Masking Solution
De-identify sensitive information with realistic but fictional data for
testing & development purposes.
Personally identifiable information is masked with realistic but fictional
data for testing & development purposes (e.g., JASON MICHAELS becomes
ROBERT SMITH).
Requirements
• Protect confidential data used in test, training & development systems
• Implement proven data masking techniques
• Support compliance with privacy regulations
• Solution supports custom & packaged ERP applications
Benefits
• Protect sensitive information from misuse and fraud
• Prevent data breaches and associated fines
• Achieve better data governance
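The JASON MICHAELS → ROBERT SMITH substitution above can be sketched as a deterministic masking function: the same real name always maps to the same fictional one, so masked values stay consistent across tables and test refreshes. The salt and lookup lists below are purely illustrative; this is the general technique, not Optim's implementation.

```python
import hashlib

# Illustrative lookup lists of fictional names; a real masking tool
# ships much larger, locale-aware dictionaries.
FIRST_NAMES = ["ROBERT", "MARIA", "JAMES", "LINDA", "DAVID", "SUSAN"]
LAST_NAMES = ["SMITH", "GARCIA", "CHEN", "JONES", "PATEL", "MUELLER"]

def mask_name(real_name: str, salt: str = "demo-salt") -> str:
    """Deterministically replace a real name with a realistic but
    fictional one. Hashing (rather than random choice) keeps the mapping
    repeatable, so the same input masks to the same value everywhere."""
    digest = hashlib.sha256((salt + real_name).encode("utf-8")).digest()
    first = FIRST_NAMES[digest[0] % len(FIRST_NAMES)]
    last = LAST_NAMES[digest[1] % len(LAST_NAMES)]
    return f"{first} {last}"

masked = mask_name("JASON MICHAELS")
print(masked)                                 # realistic but fictional
assert masked == mask_name("JASON MICHAELS")  # consistent across calls
assert masked != "JASON MICHAELS"             # original value never leaks
```

Keeping the salt secret matters: without it, an attacker who knows the dictionaries could precompute the mapping and reverse it for common names.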
IBM InfoSphere Optim Data Growth Solution
Manage data growth and improve performance by intelligently archiving
historical data.
Requirements
• Archive, manage and retain application & warehouse data according to
business policies
• Minimize downtime during application upgrades
• Consolidate application portfolio and retire legacy applications
Benefits
• Reduce hardware, storage and maintenance costs
• Streamline application upgrades and improve application performance
• Safely retire legacy & redundant applications while retaining the data
(Diagram: current data stays in production while historical and reference
data is archived and can be retrieved on demand, with universal access to
application data via XML, report writers, applications and ODBC/JDBC.)
InfoSphere Optim Data Growth supports data on distributed platforms (LUW)
and z/OS. Out-of-the-box archiving support for packaged applications and
data warehouses includes IBM Netezza®, Teradata® and others.
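At its core, policy-driven archiving of this kind moves rows that fall outside the active window into an archive store and removes them from production. A minimal SQLite sketch, with a hypothetical `orders` schema and retention cutoff (this illustrates the pattern, not Optim's archive format):

```python
import sqlite3

def archive_old_orders(conn: sqlite3.Connection, cutoff_year: int) -> int:
    """Move orders older than cutoff_year into orders_archive, then
    delete them from production. Returns the number of rows archived."""
    cur = conn.cursor()
    # Create the archive table with the same shape as production.
    cur.execute("CREATE TABLE IF NOT EXISTS orders_archive "
                "AS SELECT * FROM orders WHERE 0")
    cur.execute("INSERT INTO orders_archive "
                "SELECT * FROM orders WHERE order_year < ?", (cutoff_year,))
    archived = cur.rowcount
    cur.execute("DELETE FROM orders WHERE order_year < ?", (cutoff_year,))
    conn.commit()
    return archived

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, order_year INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 2005), (2, 2008), (3, 2011), (4, 2012)])
print(archive_old_orders(conn, cutoff_year=2010))                  # → 2
print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])   # → 2
```

A production tool does much more — it archives complete business objects across related tables, preserves referential integrity, and keeps the archive independently queryable (the "universal access" above) — but the insert-then-delete transaction is the essential move.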
IBM InfoSphere Optim Solution for Application Retirement
Manage and provide access to application data as part of an application
retirement project.
(Diagram: before retirement, each legacy application maintains its own
database and user interface; after consolidation, users access the
archived data of all retired applications through a single archive
engine.)
Requirements
• Archive, manage and retain application data according to data retention
policies
• Provide application-independent access to archived data
• Consolidate application portfolio and retire legacy applications
Benefits
• Ensure compliance and access to valuable data
• Safely retire legacy & redundant applications while retaining the data
• Reduce hardware, storage and maintenance costs
InfoSphere Optim System Analyzer for SAP® Applications (SAP Certified)
Automatically identify SAP system changes & understand their impact to
minimize risk.
Optim System Analyzer supports both distributed (LUW) and z/OS platforms.
Requirements
• Provide automation to help solve on-going SAP lifecycle activities
• Pre-built templates to support major SAP lifecycle events and
maintenance
• Detailed impact analysis, precise diagnostics & advanced reporting
Benefits
• Reduce time, cost, complexity & risk for SAP problem solving
• Provide sophisticated real-time data to help solve any SAP lifecycle
initiative
• Eliminate manual effort of retrieving, comparing & reporting on SAP data
InfoSphere Optim Business Process Analyzer for SAP Applications (SAP
Certified)
Establish traceability between your SAP business processes and data
structures.
Requirements
• Automatically capture business processes from your SAP data structures
• Visualize how changes will impact the business process
• Manage business processes in single- and multi-instance SAP landscapes
Benefits
• Improve collaboration between business users and technical users
• Ensure better visibility of the impact of changes to the SAP business
process
• Mitigate risk early in the SAP application lifecycle
InfoSphere Optim Test Data Management Solution for SAP® Applications (SAP
Certified)
Create manageable, real-world SAP data scenarios to improve the quality of
development, testing & training.
Optim TDM for SAP supports both distributed (LUW) and z/OS platforms.
Requirements
• Create targeted, “right-sized” subsets for test, development and
training environments
• Leverage pre-built business objects and user-defined criteria for
extracting data
• Invoked within the SAP GUI, featuring an easy point-and-click
environment
Benefits
• Improve operational efficiencies by shortening iterative testing cycles
• Extract data with no client copy downtime or performance degradation
• Reduce the time, cost & risk across SAP lifecycle events
InfoSphere Optim Application Repository Analyzer
Analyze application metadata to identify relationships & customizations
within the Oracle family of packaged applications.
Requirements
• Analyze application metadata to identify data models, relationships &
customizations
• Compare data model structures across application versions & releases
• Integrate with InfoSphere Optim solutions
Benefits
• Quickly identify application customizations to speed data lifecycle
projects
• Reduce time and improve quality of application updates
• Easily export the complete business object for archive, subsetting &
masking projects
IBM InfoSphere Optim Query Capture and Replay
Capture production workloads and replay them in testing environments: SQL
is recorded against the source database and played back against a test
database.
Requirements
• Use actual production workloads for testing rather than fabricated
scenarios
• Extend quality testing efforts to include the data layer
Benefits
• Minimize unexpected production problems
• Shorten testing cycles
• Develop more realistic database testing scenarios
• Identify database problems sooner with validation reports and
performance tuning
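The record-and-play flow can be sketched as two steps: capture each statement with a time offset, then replay the log against a test target while preserving the original relative timing. The executor below is a stand-in for a real database driver, and none of this reflects Optim's actual capture format.

```python
import time

def capture(workload):
    """Record each SQL statement with its offset from the capture start."""
    t0 = time.monotonic()
    return [(time.monotonic() - t0, sql) for sql in workload]

def replay(log, execute, speedup=1.0):
    """Re-execute a captured log against a test target, preserving the
    original relative timing (optionally compressed by `speedup`)."""
    t0 = time.monotonic()
    results = []
    for offset, sql in log:
        # Sleep until this statement's (scaled) original offset.
        delay = offset / speedup - (time.monotonic() - t0)
        if delay > 0:
            time.sleep(delay)
        results.append(execute(sql))
    return results

log = capture(["SELECT COUNT(*) FROM orders", "SELECT 1"])
print(replay(log, execute=lambda sql: f"ran: {sql}", speedup=1000.0))
# → ['ran: SELECT COUNT(*) FROM orders', 'ran: SELECT 1']
```

Preserving relative timing is what makes the replayed workload realistic for performance testing; comparing `results` between the source and test runs is the validation step the slide's reports refer to.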
IBM InfoSphere Optim
Top market leader in database archiving & information lifecycle management
“IBM’s Optim product line led the database archiving and ILM segment
in 2010 with a 52.3% share, and showed nearly 18% growth in 2010.”
Source: IDC, Worldwide Database Development & Management Tools 2010 Vendor & Segment Analysis, December 2011
Client Value Engagement (CVE)
Helping our customers achieve their business and technical objectives
The engagement moves through five stages: Identify Technical & Business
Challenges → Determine Current (As-Is) Costs → Determine Future (To-Be)
Costs → Technical Solution Blueprint → CVE Final Results.
• Define & identify technical & business problems / challenges
• Identify current process and costs (As-Is)
• Identify future process & costs with the recommended solution (To-Be)
• Measure the difference between As-Is & To-Be
• CVE engagement summary & final analysis
InfoSphere Optim CVE Offerings:
Data Growth, Application Retirement, Test Data Management & Data Masking
InfoSphere Optim Services Offerings
• On Demand Consulting Services Offering
• Expert Review Services Offering
• InfoSphere Optim Install Package
• InfoSphere Optim Implementation & Configuration (custom): Bronze,
Silver, Gold
• InfoSphere Optim Data Growth Offering
• InfoSphere Optim Decommissioning Foundation Services
• InfoSphere Optim Health Check
• InfoSphere Optim Upgrade
• InfoSphere Discovery JumpStart
Offerings span the lifecycle: pre-sales strategic planning, education &
assessment, analyze & design, start-up & configure, deploy & operate, and
monitor.

IBM InfoSphere Optim Solutions - Highlights

  • 1. Data Lifecycle Management © 2012 IBM Corporation Information Management Managing the Data Lifecycle from Requirement to Retirement InfoSphere Optim
  • 2. © 2012 IBM Corporation2 Information Management What we’ll discuss Challenges managing the lifecycle of application data What’s at Stake Leveraging an Information Governance Approach Optimizing the data lifecycle • Discover & Define • Develop & Test • Optimize & Archive • Consolidate & Retire IBM InfoSphere Solutions for Data Lifecycle Management
  • 3. © 2012 IBM Corporation3 Information Management Information Governance Govern Quality Security & Privacy Lifecycle Standards Transactional & Collaborative Applications Business Analytics Applications External Information Sources Success requires governance across the “Information Supply Chain” Analyze Integrate Manage Cubes Big Data Master Data Content Data Streaming Information Data Warehouses Content Analytics
  • 4. © 2012 IBM Corporation4 Information Management InfoSphere Your trusted platform for managing trusted information Comprehensive Most mature and comprehensive capabilities across all of IIG Integrated Integrated capabilities designed to address enterprise use cases Intelligent Prebuilt, Automated, Proactive Transform Enterprise Business Processes & Applications with Trusted Information Deliver Trusted Information for Data Warehousing and Business Analytics Build and Manage Single View Integrate & Govern Big DataMake Enterprise Applications more Efficient Consolidate and Retire Applications Secure Enterprise Data & Ensure Compliance InfoSphere is the market leader in every category of Information Integration and Governance 4
  • 5. © 2012 IBM Corporation5 Information Management Data lifecycle management challenges Forrester estimates that 75% of data stored in large OLTP applications is typically inactive. “ Source: Noel Yuhanna, Forrester Research Inc., Your Enterprise Data Archiving Strategy, February 2011 New application functionality to meet business needs is not deployed on schedule Disclosure of confidential data kept in test/ development environments Application defects or database errors are discovered after deployment Increased operational and infrastructure costs impact IT budget
  • 6. © 2012 IBM Corporation6 Information Management Making applications more efficient remains a major concern Data volume growth remains a major concern for most “What are the key database management issues you see as challenging your organization?” Data volume growth remains a major concern for most “What are the key database management issues you see as challenging your organization?” Base: 194 enterprise database management professionals Source: November 2010 Global Database Management Systems Online Survey; Forrester Research, Inc. Delivering improved performance Delivering higher availability High data volume growth Increasing data management costs Database upgrades Securing private data Data integration issues Database migrations Lack of DBA resources Too many security patches to deploy 47% 48% 5% 49% 45% 6% 50% 45% 5% 53% 42% 5% 55% 42% 3% 56% 41% 3% 58% 35% 7% 59% 37% 4% 60% 37% 3% 62% 35% 3% Challenging Not challenging Don’t know
  • 7. © 2012 IBM Corporation7 Information Management The real organizational impact of ignoring application efficiency High Capital Expenditures • Increased capital expenditures for data storage – Server CPU’s, disk storage systems, database software, power, space, management… Decreased Productivity • Business users losing productivity because systems are not available • Large amount of time by database administrators spent improving performance vs. adding new capabilities Missed Service Level Agreements • Extended time and failures when backing up databases for recovery efforts • Unable to meet deadlines (end of quarter results, updates, upgrades) Ad Hoc Test Data Deployment • Lengthy time to setup and deploy test environments • Deploying applications that partially tested
  • 8. © 2012 IBM Corporation8 Information Management Requirements for managing data across its lifecycle Capture & replay production workloads Define policies Report & retrieve archived data Enable compliance with retention & e-discovery Move only the needed information Integrate into single data source Create & refresh test data Manage data growth Classify & define data and relationships Develop & test database structures/ code Enhance performance Discover where data resides Develop &Develop & TestTest Discover &Discover & DefineDefine Optimize &Optimize & ArchiveArchive Consolidate &Consolidate & RetireRetire Information Governance Core Disciplines Lifecycle Management
  • 9. © 2012 IBM Corporation9 Information Management You can’t govern what you don’t understand Discover & Define Develop & Test Optimize & Archive Consolidate & Retire Information Governance Core Disciplines Lifecycle Management ? ?? ? ? ? ? ? ? ?? ? ? ? ? ? ? ? ?? ? ? ? ? ?? ? ? ? ? • Complex data relationships within and across sources • Historical and reference data for archiving • Test data needed to satisfy test cases • Sensitive data identification
  • 10. © 2012 IBM Corporation10 Information Management Discover & define business objects across heterogeneous databases & applications Referentially-intact subsets of data across related tables & applications, including metadata. DBADBA ViewView Overall historical “snapshot” of business activity, representing an application data record – e.g. payment, invoice, customer BusinessBusiness ViewView Federated access to related business objects across the enterprise CRM on Oracle database ERP / Financials on DB2 Custom Inventory Mgmt on DB2 Discover & Define Develop & Test Optimize & Archive Consolidate & Retire Information Governance Core Disciplines Lifecycle Management
  • 11. © 2012 IBM Corporation11 Information Management Requirements for managing data across its lifecycle Capture & replay production workloads Define policies Report & retrieve archived data Enable compliance with retention & e-discovery Move only the needed information Integrate into single data source Create & refresh test data Manage data growth Classify & define data and relationships Develop & test database structures/code Enhance performance Discover where data resides Develop & Test Discover & Define Optimize & Archive Consolidate & Retire Information Governance Core Disciplines Lifecycle Management
  • 12. © 2012 IBM Corporation12 Information Management Increasing Risk Mandatory to protect data and comply with regulations Increasing Costs Defects are caught late in the cycle Organizations continue to be challenged with building quality applications Time to Market Lack of realistic test data and inadequate environments Discover & Define Optimize & Archive Consolidate & Retire Develop & Test Information Governance Core Disciplines Lifecycle Management
  • 13. © 2012 IBM Corporation13 Information Management Increasing Risk 45,000+ number of sensitive records exposed to a 3rd party during testing (c) 62% of companies use actual customer data to test applications (a) Time to Market 37% satisfied with speed of software development (f) 30-50% of testing teams’ time spent on setting up test environments instead of testing (b) Increasing Costs $300 billion annual cost of software-related downtime (d) 32% low success rate for software projects (e) Organizations continue to be challenged with building quality applications a. The Ponemon Institute, The Insecurity of Test Data: The Unseen Crisis b. NIST, Planning Report: The Economic Impacts of Inadequate Infrastructure for Software Testing c. Federal Aviation Administration: exposed unprotected test data to a third party, http://fcw.com/articles/2009/02/10/faa-data-breach.aspx d. The Standish Group, Comparative Economic Normalization Technology Study, CHAOS Chronicles v12.3.9, June 30, 2008 e. The Standish Group, Chaos Report, April 2009 f. Forrester Research, “Corporate Software Development Fails To Satisfy On Speed Or Quality”, 2005 Discover & Define Optimize & Archive Consolidate & Retire Develop & Test Information Governance Core Disciplines Lifecycle Management
  • 14. © 2012 IBM Corporation14 Information Management The impact of inefficient test practices Challenges in Test Data Management “We did not want to create ‘fake’ or unrealistic test data. All test data had to be created and set up manually. So it would take us a month to set up test data for 30 or more accounts.” -- Large US Healthcare Insurer “We needed to improve efficiencies in development and testing environments, as well as our production environments. We can create realistic test environments that use much less disk space than we would by cloning the production database.” -- Allianz Seguros “Our staff wanted to implement more efficient and cost-effective testing processes that would shorten the time for creating and managing multiple test environments.” -- Cetelem Discover & Define Optimize & Archive Consolidate & Retire Develop & Test Information Governance Core Disciplines Lifecycle Management
  • 15. © 2012 IBM Corporation15 Information Management Employ effective test data management practices • Create targeted, right-sized test environments • Substitute sensitive data with fictionalized yet contextually accurate data • Easily refresh, reset and maintain test environments • Compare data to pinpoint and resolve application defects faster • Accelerate release schedules [Diagram: a 2 TB production database or production clone is subset & masked into right-sized development, unit test, training and integration test environments of 100 GB, 50 GB and 25 GB, which are then compared and refreshed] Discover & Define Optimize & Archive Consolidate & Retire Develop & Test Information Governance Core Disciplines Lifecycle Management
  • 16. © 2012 IBM Corporation16 Information Management Speed Delivery Refresh test data, speeding testing and application delivery Reduce Risk Mask sensitive information for compliance and protection Reduce Cost Automate creation of realistic “right-sized” test data to reduce the size of test environments • Understand what test data is needed for test cases • Create “right-sized” test data by subsetting • Ensure masked data is contextually appropriate to the data it replaced, so as not to impede testing • Easily refresh & maintain test environments by developers and testers • Automate test result comparisons to identify hidden errors • Support for custom & packaged ERP applications in heterogeneous environments Improve application quality and delivery efficiency with InfoSphere Optim Test Data Management Solution Discover & Define Optimize & Archive Consolidate & Retire Develop & Test Information Governance Core Disciplines Lifecycle Management
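The subset-and-mask flow described above can be sketched in a few lines. This is an illustrative sketch only, not Optim's implementation; the tables, fake-name list and masking format are hypothetical:

```python
import random

# Hypothetical production data: customers and their orders (fields are
# illustrative, not Optim's actual data model).
customers = [
    {"id": 1, "name": "Alice Jones", "ssn": "123-45-6789"},
    {"id": 2, "name": "Bob Smith",   "ssn": "987-65-4321"},
    {"id": 3, "name": "Carol White", "ssn": "555-12-3456"},
]
orders = [
    {"order_id": 10, "customer_id": 1, "amount": 250.0},
    {"order_id": 11, "customer_id": 1, "amount": 75.5},
    {"order_id": 12, "customer_id": 3, "amount": 120.0},
]

def subset_and_mask(customer_ids):
    """Extract a referentially intact subset and mask sensitive fields."""
    random.seed(42)  # repeatable masking, so test environments can be refreshed
    fake_names = ["JASON MICHAELS", "ROBERT SMITH", "MARIA GARCIA"]
    subset_customers = []
    for c in customers:
        if c["id"] in customer_ids:
            masked = dict(c)
            # Fictional but contextually accurate replacements: still a name,
            # still a correctly formatted SSN, so applications keep working.
            masked["name"] = random.choice(fake_names)
            masked["ssn"] = "%03d-%02d-%04d" % (
                random.randint(100, 999), random.randint(10, 99),
                random.randint(1000, 9999))
            subset_customers.append(masked)
    # Pull only the child rows that reference the selected parents,
    # preserving referential integrity in the subset.
    subset_orders = [o for o in orders if o["customer_id"] in customer_ids]
    return subset_customers, subset_orders

test_customers, test_orders = subset_and_mask({1})
```

The key point is that child rows are selected by their parent keys, so the extracted subset stays referentially intact even after masking.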
  • 17. © 2012 IBM Corporation17 Information Management Organizations continue to be challenged with planning enterprise changes [Diagram: channels (direct, call centers, Internet, business partners) and business units (finance, administration, sales & marketing, product management, consulting services, new business development), each backed by its own mix of data stores – databases, data warehouses, CRM and application data – and systems such as core systems, ERP and CRM] Discover & Define Optimize & Archive Consolidate & Retire Develop & Test Information Governance Core Disciplines Lifecycle Management
  • 18. © 2012 IBM Corporation18 Information Management Consequences of poorly planned enterprise changes Loss of customer satisfaction Higher costs Delays due to extra testing cycles Time-consuming process of rolling back changes; inability to identify the source of the problem; manual process of modifying test scripts No methods for fully testing database workloads Inability to deliver required functionality; missed service level agreements; inability to process transactions Discover & Define Optimize & Archive Consolidate & Retire Develop & Test Information Governance Core Disciplines Lifecycle Management
  • 19. © 2012 IBM Corporation19 Information Management Mitigate challenges with realistic testing Coordinate analysts, developers, DBAs and testers Align teams on project plans Track the impact of data lifecycle events Assess impact with reports • Collaborate in context using production workloads • Which requirements are in the development plan? • Are we including data lifecycle events? • What database changes are we making? • Does the test team have realistic test cases built from production workloads? • Transparency across teams • Analysis of data-related changes to understand problems Use actual production workloads for testing, NOT simulated user scenarios Develop a broad approach to testing covering the full data lifecycle Include database workload record and replay as part of existing testing procedures Discover & Define Optimize & Archive Consolidate & Retire Develop & Test Information Governance Core Disciplines Lifecycle Management
  • 20. © 2012 IBM Corporation20 Information Management IBM InfoSphere Optim Query Capture and Replay Fully assess change impact before production deployment • Limit the use of laborious database test script creation and load simulators by leveraging actual production workloads for testing • Deploy a single database testing solution across heterogeneous systems • Accelerate project delivery by identifying and correcting potential problems sooner • Develop more streamlined, accurate database tests • Complement existing database tools to pinpoint database problems faster and improve performance • Solve problems more rapidly with deep diagnostics and validation reports Improve customer satisfaction Anticipate and correct potential problems sooner Reduce cost of change Establish consistent database testing processes Meet SLAs Ensure well-tuned, high-performing workloads before deployment Discover & Define Optimize & Archive Consolidate & Retire Develop & Test Information Governance Core Disciplines Lifecycle Management
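A minimal sketch of the capture-and-replay idea described above, using SQLite in-memory databases as stand-ins. The schema and workload are hypothetical, and Optim's actual capture mechanism works at the database level rather than in application code; this only shows the shape of the technique:

```python
import sqlite3

captured = []  # the workload log: (sql, params) pairs recorded from "production"

def capture(conn, sql, params=()):
    """Run a statement and record it for later replay."""
    captured.append((sql, params))
    return conn.execute(sql, params).fetchall()

def replay(conn):
    """Re-run the captured workload and collect each statement's results."""
    return [conn.execute(sql, params).fetchall() for sql, params in captured]

# "Production" database: run and capture the real workload.
prod = sqlite3.connect(":memory:")
prod.execute("CREATE TABLE accounts (id INTEGER, balance REAL)")
prod.execute("INSERT INTO accounts VALUES (1, 100.0), (2, 200.0)")
baseline = capture(prod, "SELECT id, balance FROM accounts WHERE balance > ?", (150.0,))

# Test database with the same data (e.g. after a proposed schema or index
# change): replay the captured workload instead of hand-written test scripts.
test = sqlite3.connect(":memory:")
test.execute("CREATE TABLE accounts (id INTEGER, balance REAL)")
test.execute("INSERT INTO accounts VALUES (1, 100.0), (2, 200.0)")
replayed = replay(test)

# Validation report: do replayed results match the production baseline?
matches = replayed[0] == baseline
```

Replaying the recorded workload rather than simulated scenarios is what makes the before/after comparison meaningful.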
  • 21. © 2012 IBM Corporation21 Information Management Requirements for managing data across its lifecycle Capture & replay production workloads Define policies Report & retrieve archived data Enable compliance with retention & e-discovery Move only the needed information Integrate into single data source Create & refresh test data Manage data growth Classify & define data and relationships Develop & test database structures/code Enhance performance Discover where data resides Develop & Test Discover & Define Optimize & Archive Consolidate & Retire Information Governance Core Disciplines Lifecycle Management
  • 22. © 2012 IBM Corporation22 Information Management Organizations have been increasingly challenged with successfully managing data growth Increasing Costs Poor Application Performance Manage Risk & Compliance Business users & customers wait for application response; DBAs spend the majority of their time fixing performance issues The “keep everything” strategy can impact disaster recovery and data retention & disposal compliance Buying more storage is not a “cheap” fix when you add the operational burden Develop & Test Discover & Define Consolidate & Retire Optimize & Archive Information Governance Core Disciplines Lifecycle Management
  • 23. © 2012 IBM Corporation23 Information Management Organizations have been increasingly challenged with successfully managing data growth Increasing Costs Poor Application Performance Manage Risk & Compliance 3–10x: cost of managing storage over the cost to procure (a) $1.1 billion: amount organizations will have spent in 2011 on storage (b) 80%: the time DBAs spend weekly on disk capacity issues (c) 250 hours: the amount of time needed to run “daily” batch processes (d) 50% of firms retain structured data for 7+ years (e) 57% of firms use back-up for data retention needs (e) (a) Merv Adrian, IT Market Strategies, “Data Growth Challenges Demand Proactive Data Management”, November 2009 (b) IDC, “Worldwide Archival Storage Solutions 2011–2015 Forecast: Archiving Needs Thrive in an Information-Thirsty World”, October 2011 (c) Simple-Talk, “Managing Data Growth in SQL Server”, January 2010 (d) IBM client case study: Toshiba TEC Europe; archiving reduced batch process time by 75% (e) IDC Quick Poll Survey 2011, “Data Management for IT Optimization and Compliance”, November 2011 Develop & Test Discover & Define Consolidate & Retire Optimize & Archive Information Governance Core Disciplines Lifecycle Management
  • 24. © 2012 IBM Corporation24 Information Management Archive historical data for data growth management Data archiving is an intelligent process for moving inactive or infrequently accessed data that still has value, while providing the ability to search and retrieve the data. [Diagram: current data remains in production; historical and reference data move to data archives; archived records can be selectively retrieved and restored; universal access to application data via ODBC/JDBC, XML, report writers and the application itself] Develop & Test Discover & Define Consolidate & Retire Optimize & Archive Information Governance Core Disciplines Lifecycle Management
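The archive-and-retrieve cycle above can be illustrated with a toy retention policy. The table, the year-based cutoff and the restore-by-id helper are hypothetical; they only show the shape of policy-driven archiving with selective restore:

```python
import sqlite3

# In-memory stand-in for a production table plus its archive store.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, year INTEGER, total REAL);
    CREATE TABLE orders_archive (id INTEGER, year INTEGER, total REAL);
    INSERT INTO orders VALUES (1, 2005, 99.0), (2, 2008, 150.0), (3, 2011, 20.0);
""")

def archive_before(cutoff_year):
    """Apply the retention policy: move rows older than the cutoff to the archive."""
    conn.execute("INSERT INTO orders_archive SELECT * FROM orders WHERE year < ?",
                 (cutoff_year,))
    conn.execute("DELETE FROM orders WHERE year < ?", (cutoff_year,))

def restore(order_id):
    """Selectively restore a single archived record back into production."""
    conn.execute("INSERT INTO orders SELECT * FROM orders_archive WHERE id = ?",
                 (order_id,))
    conn.execute("DELETE FROM orders_archive WHERE id = ?", (order_id,))

archive_before(2010)   # production keeps only current data
active = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
archived = conn.execute("SELECT COUNT(*) FROM orders_archive").fetchone()[0]
restore(2)             # a single historical record is brought back on demand
restored = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
```

Production queries now scan a much smaller table, while the archived rows remain searchable and individually restorable.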
  • 25. © 2012 IBM Corporation25 Information Management Requirements for managing data across its lifecycle Capture & replay production workloads Define policies Report & retrieve archived data Enable compliance with retention & e-discovery Move only the needed information Integrate into single data source Create & refresh test data Manage data growth Classify & define data and relationships Develop & test database structures/code Enhance performance Discover where data resides Develop & Test Discover & Define Optimize & Archive Consolidate & Retire Information Governance Core Disciplines Lifecycle Management
  • 26. © 2012 IBM Corporation26 Information Management When it’s time to retire or consolidate applications Develop & Test Optimize & Archive Discover & Define Consolidate & Retire Information Governance Core Disciplines Lifecycle Management In almost ALL cases, access to legacy data MUST be retained while the application and database are eliminated Application portfolio has redundant systems acquired via mergers and acquisitions Line of business divested; application is no longer needed Legacy technologies not compatible with current IT direction • Old database and/or application versions no longer supported by manufacturer Required technical skills or application knowledge no longer available Budget pressures – do more with less
  • 27. © 2012 IBM Corporation27 Information Management Retire redundant and legacy applications [Diagram: infrastructure before retirement – each user group with its own application, database and application data; archived data after consolidation – all users served by a single archive engine and archive data store] Preserve application data in its business context • Capture all related data, including transaction details, reference data & associated metadata • Capture any related reference data that may reside in other application databases Retire out-of-date packaged applications as well as legacy custom applications • Leverage out-of-box support of packaged applications to quickly identify & extract the complete business object Shut down legacy systems without a replacement • Provide fast and easy retrieval of data for research and reporting, as well as audits and e-discovery requests Develop & Test Optimize & Archive Discover & Define Consolidate & Retire Information Governance Core Disciplines Lifecycle Management
  • 28. © 2012 IBM Corporation28 Information Management Effectively archive and manage data growth with InfoSphere Optim • Discover & identify data record types to archive across heterogeneous environments • Intelligently archive data to improve application performance and support data retention • Capture & store historical data in its original business context • Define & maintain data retention policies consistently across the enterprise • Ensure long-term, application-independent access to archived data via multiple access methods • Support for custom & packaged ERP applications in heterogeneous environments • Leverage a comprehensive solution for application consolidation and retirement across your enterprise Reduce Costs Reduce hardware, software, storage & maintenance costs of enterprise applications Improve Performance Improve application performance & streamline back-ups and upgrades Minimize Risk Support data retention regulations & safely retire legacy/redundant applications
  • 29. © 2012 IBM Corporation29 Information Management What is the market saying? “IBM’s Optim product line led the database archiving and ILM segment in 2010 with a 52.3% share, and showed nearly 18% growth in 2010. Today, IBM continues to lead the industry with the most comprehensive data archiving solution and the largest installed base …” “IBM’s customers spoke highly of the Optim solution’s reliability and strong performance.” “Organizations can realize benefits in the form of improved operational and capital cost savings, improved IT and end user efficiency, as well as higher levels of data protection and application performance [with InfoSphere Optim].” Source: IDC – Worldwide Database Development & Management Tools 2010 Vendor & Segment Analysis, December 2011 Source: Forrester Research – “Your Enterprise Data Archiving Strategy”, N. Yuhanna, February 2011 Source: Forrester Research – Total Economic Impact Study for InfoSphere Optim, October 2009
  • 30. © 2012 IBM Corporation30 Information Management Gartner key findings Source: Gartner, Inc., “Market Trends: Worldwide, Database Archiving Market Continues Rapid Growth, 2011”, S. Childs & A. Dayley, September 2011 Vendor Market Share by 2010 Revenue Vendor Market Share by Total Number of Customers “We believe the market for database archiving and application retirement is vibrant and dynamic, and will see continued solid growth over the next five years. Organizations are looking to database archiving vendors that offer packaged and custom application support in order to control storage growth, improve application performance, and support compliance, audit and e-discovery activities.”
  • 31. © 2012 IBM Corporation31 Information Management IBM provides the expertise to manage the data lifecycle improving application efficiency The top challenge for 43% of CFOs is improving governance, controls, and risk management CFO Survey: Current state & future direction, IBM Business Consulting Services Reduce the cost of data storage, software and hardware Improve application efficiency and performance Reduce risk and support compliance with retention requirements Speed time to market and improve quality
  • 32. © 2012 IBM Corporation32 Information Management Solution components: • IBM InfoSphere Optim Data Growth Solution • IBM InfoSphere Optim Data Masking Solution • IBM InfoSphere Optim Test Data Management Solution “We knew that Optim would provide the capabilities we needed to automate the process of retrieving archived records. So far, we have shortened the retrieval process from several days to a few hours.” —Xavier Mascaró, Senior DBA, Allianz Seguros Allianz Seguros Reduces record retrieval time from days to hours Allianz Seguros Case Study The need: Allianz Seguros often had at least three development and testing mainframe environments in use at the same time. The quality of these environments degraded quickly because they were used for multiple tests. Every few months, the development team had to use an in-house “subsetting” program to copy test data from its large application production environment, which comprised about 700 GB with tables that contained over 200 million rows. The solution: IBM InfoSphere Optim’s subsetting capabilities help create realistic test environments that preserve data integrity, even when combined with data masking techniques. Allianz also uses InfoSphere Optim to provide comprehensive archiving capabilities for managing data growth and faster, more efficient methods for archiving, accessing and retrieving archived records. The benefits: • Reduced resources needed for application development and testing by creating smaller, more robust testing environments • Applied proven data masking techniques to protect privacy and support compliance with LOPD regulations • Reduced time to retrieve archived insurance records from days to hours
  • 33. © 2012 IBM Corporation33 Information Management Arek Oy Deploys a pension earnings and accrual system in 30 months The benefits: • Improved development and testing efficiencies, enabling Arek Oy to promote faster deployment of new pension application functionality and enhancements • Protected confidential data to strengthen public confidence and support TyEL compliance requirements. The need: Pension laws (TyEL) in Finland changed radically in 2007. In response, Arek Oy had to develop and deliver a tested and reliable Pension Earnings and Accrual System within 30 months. Arek Oy had to protect confidential employee salary and pension information in multiple non-production (development and testing) environments. Failure to satisfy requirements would result in loss of customer good will and future business opportunities. The solution: Using IBM InfoSphere Optim subsetting capabilities rather than cloning large production databases made it possible for Arek Oy staff to create robust, realistic test databases that supported faster iterative testing cycles. In addition, InfoSphere Optim offered proven capabilities for performing complex data masking routines, while preserving the integrity of the pension data for development and testing purposes. “We see Optim as an integral part of our development solution set. Optim’s data masking capabilities help ensure that we can protect privacy in our development and testing environments.” — Katri Savolainen, Project Manager, Arek Oy Solution components: • IBM InfoSphere Optim Data Masking Solution • IBM InfoSphere Optim Test Data Management Solution Arek Oy Case Study
  • 34. © 2012 IBM Corporation34 Information Management Toshiba TEC Europe Reduces database size by 30% resulting in 75% increase in application availability Toshiba TEC Europe implemented InfoSphere Optim to archive historical transactions. This reduced batch processing time for 19,000 jobs from 250 hours to only 65 hours. As a result, Toshiba TEC was able to satisfy business unit requirements and support continued business growth. Solution components: • IBM InfoSphere Optim Data Growth Solution for Oracle E- Business Suite Toshiba TEC Europe Case Study The benefits: • Managed continued data growth by archiving historical transactions • Deployed Oracle E-Business Suite across all business units • Reduced database size by 30% • Increased application availability by 75% by archiving historical transactions • Improved service levels and operations by ensuring access to current and historical transactions. The need: Proactively manage application data growth to support business expansion and deployment of Oracle E-Business Suite across business units. Increase application availability by reducing the time to complete 19,000 daily batch processing jobs exceeding 250 hours. Integrate and consolidate data and processes with the other Toshiba European entities to improve service levels and operational efficiencies. The solution: IBM InfoSphere Optim provides comprehensive database archiving capabilities to address data growth issues and integrates with Oracle E-Business Suite to improve overall business processes.
  • 35. © 2012 IBM Corporation35 Information Management Virginia Community College System (VCCS) Retains access to student data and deploys policy-driven archiving Solution components: • IBM InfoSphere Optim Data Growth Solution for PeopleSoft Enterprise Campus Solutions “We anticipate that the initial payback from implementing Optim will be that we have successfully stopped that vicious cycle of running out and purchasing high-performance hardware, which can cost hundreds of thousands of dollars.” —Andy Clark, Technical Lead for HR/CS, Virginia Community College System Source: VCCS case study approved for external use. The benefits: • Effectively manages data growth to improve service levels • Offers flexibility to archive 10 or more years of inactive student data • Enables staff to selectively restore student records as needed • Lowered infrastructure costs by eliminating frequent expensive server upgrades The need: The Virginia Community College System wanted an out-of-the-box archiving solution for PeopleSoft Enterprise Campus Solutions that would help manage data growth without expensive server upgrades, support compliance requirements, and reduce the time spent on performance tuning and related issues. The solution: Using IBM® InfoSphere™ Optim™ software, VCCS can archive complete historical student records in batches for students who have graduated or been inactive for at least 10 years; access archived data for reporting and analysis; process requests for transcripts against archived student data without having to restore the data to the production environment; and selectively restore a complete record for a single student on demand.
  • 36. © 2012 IBM Corporation36 Information Management Learn more • Product Family Webpage • Solution Sheet: InfoSphere Optim Solutions for Data Lifecycle Management • Quick ROI: Self Service BVA for Data Growth • Case Studies: InfoSphere Optim Contact your IBM representative for more information on InfoSphere Optim data lifecycle management solutions!
  • 38. © 2012 IBM Corporation38 Information Management IBM InfoSphere Optim Solutions Production Dev/Test Archive Discover Understand Classify • Reduce hardware, software, storage & maintenance costs for enterprise applications • Improve application performance & streamline back-ups and upgrades • Support data retention regulations & safely retire legacy/redundant applications Archive • Reduce cost, reduce risk & speed application delivery by maintaining right-sized test environments • Ensure compliance and privacy by masking • Improve customer satisfaction, reduce the cost of change and meet SLAs by using production workloads for testing Test • Accelerate data management projects and reduce risk by understanding complex data relationships within & across systems Discover • Capture • Archive • Retire DATA • Replay SQL Application DATA • Subset • Mask • Compare • Refresh • Ensure compliance by masking on demand Mask
  • 39. © 2012 IBM Corporation39 Information Management IBM InfoSphere Optim supports the heterogeneous enterprise A single, scalable, heterogeneous information lifecycle management solution provides a central point to deploy policies to extract, archive, subset, and protect application data records from creation to deletion Manage Data Growth Data Masking Test Data Management Application Retirement Discover Partner-delivered Solutions
  • 40. © 2012 IBM Corporation40 Information Management InfoSphere Optim Packages InfoSphere Optim Data Privacy InfoSphere Optim Test Data Management (Core + SAP) InfoSphere Optim Archive InfoSphere Optim Enterprise Edition for Oracle Applications InfoSphere Optim Query Capture & Replay Enterprise Edition Workgroup Edition
  • 41. © 2012 IBM Corporation41 Information Management IBM InfoSphere Discovery Requirements Benefits • Automation of manual activities accelerates time to value • Business insight into data relationships reduces project risk • Provides consistency across information agenda projects • Define business objects for archival and test data applications • Discover data transformation rules and heterogeneous relationships • Identify hidden sensitive data for privacy Accelerate project deployment by automating discovery of your distributed data landscape Discovery
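One slice of what discovery tools do, identifying hidden sensitive data, can be sketched with simple pattern matching over column values. The patterns, sample table and 80% match threshold are illustrative assumptions; real discovery products also use metadata, classifications and value distributions:

```python
import re

# Hypothetical detectors for common sensitive-data formats.
PATTERNS = {
    "ssn": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "credit_card": re.compile(r"^\d{4}([ -]?\d{4}){3}$"),
}

def classify_columns(table):
    """Flag each column whose values mostly match a sensitive-data pattern."""
    findings = {}
    for column, values in table.items():
        for label, pattern in PATTERNS.items():
            hits = sum(1 for v in values if pattern.match(v))
            # Require a high match rate so incidental matches don't flag a column.
            if column not in findings and values and hits / len(values) >= 0.8:
                findings[column] = label
    return findings

# Sample column data; note the sensitive data hides under an innocuous name.
sample = {
    "customer_ref": ["C-1001", "C-1002", "C-1003"],
    "tax_id": ["123-45-6789", "987-65-4321", "555-12-3456"],
    "card_no": ["4111 1111 1111 1111", "5500-0000-0000-0004", "4012888888881881"],
}
flags = classify_columns(sample)
```

Columns flagged this way become candidates for masking policies in downstream test data management.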
  • 42. © 2012 IBM Corporation42 Information Management IBM InfoSphere Optim Test Data Management Solution Requirements Benefits • Deploy new functionality more quickly and with improved quality • Easily refresh & maintain test environments • Protect sensitive information from misuse & fraud with data masking • Accelerate delivery of test data through refresh • Create referentially intact, “right-sized” test databases • Automate test result comparisons to identify hidden errors • Protect confidential data used in test, training & development • Shorten iterative testing cycles and accelerate time to market Create “right-sized” production-like environments for application testing Test Data Management [Diagram: a 2 TB production database or production clone is subset and masked into 100 GB, 50 GB and 25 GB development, unit test, training and integration test environments, then compared and refreshed] InfoSphere Optim TDM supports data on distributed platforms (LUW) and z/OS. Out-of-the-box subset support for packaged ERP/CRM applications as well as others.
  • 43. © 2012 IBM Corporation43 Information Management IBM InfoSphere Optim Data Masking Solution Personally identifiable information is masked with realistic but fictional data for testing & development purposes. JASON MICHAELS ROBERT SMITH Requirements Benefits • Protect sensitive information from misuse and fraud • Prevent data breaches and associated fines • Achieve better data governance • Protect confidential data used in test, training & development systems • Implement proven data masking techniques • Support compliance with privacy regulations • Solution supports custom & packaged ERP applications De-identify sensitive information with realistic but fictional data for testing & development purposes Data Masking
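One property worth illustrating here: "realistic but fictional" masking is often paired with determinism, so the same input always masks to the same value and lookups stay consistent wherever that value appears. A hypothetical sketch (not Optim's algorithm; the name lists are made up):

```python
import hashlib

# Pools of fictional replacement values (illustrative only).
FIRST = ["JASON", "ROBERT", "MARIA", "LINDA"]
LAST = ["MICHAELS", "SMITH", "GARCIA", "CHEN"]

def mask_name(real_name):
    """Replace a real name with a repeatable, fictional one.

    Hashing the input (rather than picking randomly) makes the mapping
    deterministic: the same real name always yields the same fake name,
    so masked values stay consistent across tables and across refreshes.
    """
    digest = hashlib.sha256(real_name.encode()).digest()
    return "%s %s" % (FIRST[digest[0] % len(FIRST)],
                      LAST[digest[1] % len(LAST)])

masked = mask_name("ALICE JONES")
same = mask_name("ALICE JONES")   # identical on every call
other = mask_name("BOB BROWN")    # a different input gets its own mapping
```

The output is still a plausible name, so validation logic and screens keep working, but it reveals nothing about the original.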
  • 44. © 2012 IBM Corporation44 Information Management IBM InfoSphere Optim Data Growth Solution • Reduce hardware, storage and maintenance costs • Streamline application upgrades and improve application performance • Safely retire legacy & redundant applications while retaining the data • Archive, manage and retain application & warehouse data according to business policies • Minimize downtime during application upgrades • Consolidate application portfolio and retire legacy applications Manage data growth and improve performance by intelligently archiving historical data [Diagram: current data in production, historical and reference data in the archive; retrieve archived data with universal access via ODBC/JDBC, XML, report writers and the application] InfoSphere Optim Data Growth supports data on distributed platforms (LUW) and z/OS. Out-of-the-box archiving support for packaged applications and data warehouses includes IBM Netezza®, Teradata® and others. Data Growth Requirements Benefits
  • 45. © 2012 IBM Corporation45 Information Management IBM InfoSphere Optim Solution for Application Retirement [Diagram: infrastructure before retirement – each user with a separate application, database and application data; archived data after consolidation – all users served by a single archive engine and archive data store] Requirements Benefits • Ensure compliance and access to valuable data • Safely retire legacy & redundant applications while retaining the data • Reduce hardware, storage and maintenance costs • Archive, manage and retain application data according to data retention policies • Provide application-independent access to archived data • Consolidate application portfolio and retire legacy applications Manage and provide access to application data as part of an application retirement project Application Retirement
  • 46. © 2012 IBM Corporation46 Information Management InfoSphere Optim System Analyzer for SAP® Applications Automatically identify SAP system changes & understand their impact to minimize risk OSA for SAP SAP Certified Optim System Analyzer supports both distributed (LUW) and z/OS platforms. Requirements Benefits • Reduce time, cost, complexity & risk for SAP problem solving • Provide sophisticated real-time data to help solve any SAP lifecycle initiative • Eliminate manual effort of retrieving, comparing & reporting on SAP data • Provide automation to help solve on-going SAP lifecycle activities • Pre-built templates to support major SAP lifecycle events and maintenance • Detailed impact analysis, precise diagnostics & advanced reporting
  • 47. © 2012 IBM Corporation47 Information Management InfoSphere Optim Business Process Analyzer for SAP applications Establish traceability between your SAP business processes and data structures SAP Certified BPA for SAP Requirements Benefits • Improve collaboration between business user and technical user • Ensure better visibility of the impact of changes to the SAP business process • Mitigate risk early in the SAP application lifecycle • Automatically capture business process from your SAP data structures • Visualize how changes will impact the business process • Manage business processes in single- and multi-instance SAP landscapes
  • 48. © 2012 IBM Corporation48 Information Management InfoSphere Optim Test Data Management Solution for SAP® Applications Create manageable real-world SAP data scenarios to improve the quality of development, testing & training TDM for SAP SAP Certified Optim TDM for SAP supports both distributed (LUW) and z/OS platforms. Requirements Benefits • Improve operational efficiencies by shortening iterative testing cycles • Extract data with no client copy downtime or performance degradation • Reduce the time, cost & risk across SAP lifecycle events • Create targeted, “right-sized” subsets for test, development and training environments • Leverage pre-built business objects and user-defined criteria for extracting data • Invoked within SAP GUI featuring easy point-and-click environment
  • 49. © 2012 IBM Corporation49 Information Management InfoSphere Optim Application Repository Analyzer Analyze application metadata to identify relationships & customizations within the Oracle family of packaged applications ARA Requirements Benefits • Quickly identify application customizations to speed data lifecycle projects • Reduce time and improve quality of application updates • Easily export the complete business object for archive, subsetting & masking projects • Analyze application metadata to identify data models, relationships & customizations • Compare data model structures across application versions & releases • Integrate with InfoSphere Optim solutions
  • 50. © 2012 IBM Corporation Information Management
  IBM InfoSphere Optim Query Capture and Replay – Capture production workloads and replay them in testing environments
  Requirements & Benefits: • Minimize unexpected production problems • Shorten testing cycles • Develop more realistic database testing scenarios • Identify database problems sooner with validation reports and performance tuning • Use actual production workloads for testing rather than fabricated scenarios • Extend quality testing efforts to include the data layer
  Diagram: an Application issues SQL against the Source Database; InfoSphere Optim Query Capture and Replay records that SQL and plays it back against a Test Database
  • 51. © 2012 IBM Corporation Information Management IBM InfoSphere Optim Top market leader in database archiving & information lifecycle management “IBM’s Optim product line led the database archiving and ILM segment in 2010 with a 52.3% share, and showed nearly 18% growth in 2010.” Source: IDC – Worldwide Database Development & Management Tools 2010 Vendor & Segment Analysis, December 2011
  • 52. © 2012 IBM Corporation Information Management
  Client Value Engagement (CVE) – Helping our customers achieve their business and technical objectives
  Process: Identify Technical & Business Challenges → Determine Current (As-Is) Costs → Determine Future (To-Be) Costs → Technical Solution Blueprint → CVE Final Results
  Define & identify technical & business problems / challenges; identify current process and costs (As-Is); identify future process & costs with the recommended solution (To-Be); measure the difference between As-Is & To-Be; CVE engagement summary & final analysis
  InfoSphere Optim CVE Offerings: Data Growth, Application Retirement, Test Data Management & Data Masking
  • 53. © 2012 IBM Corporation53 Information Management InfoSphere Optim Services Offerings On Demand Consulting Services Offering Expert Review Services Offering • InfoSphere Optim Install Package InfoSphere Optim Implementation & Configuration (custom) • Bronze • Silver • Gold Analyze & Design Monitor InfoSphere Optim Health Check InfoSphere Optim Upgrade Education Assessment Start up Configure Deploy & Operate Pre-sales Strategic Planning • InfoSphere Optim Data Growth Offering • InfoSphere Optim Decommissioning Foundation Services InfoSphere Discovery JumpStart

Editor's Notes

  1. This is the sales enablement presentation for Information Lifecycle Management Solutions which covers InfoSphere Optim.
  2. Agenda: Challenges managing the lifecycle of application data; What’s at Stake; Leveraging an Information Governance Approach; Optimizing the data lifecycle (Discover & Define, Develop & Test, Optimize & Archive, Consolidate & Retire); IBM InfoSphere Solutions for Data Lifecycle Management
  3. This is the information supply chain slide, which many of you may be familiar with from the prior sessions. InfoSphere Optim is represented under “Lifecycle” in the red box. Additional notes explaining the slide: There are typically hundreds or even thousands of different systems throughout an organization. Information can come in from many places (transaction systems, operational systems, document repositories, external information sources), and in many formats (data, content, streaming). Wherever it comes from, there are often meaningful relationships between various sources of data. We manage all this information in our systems, integrate to build warehouses and master the data to get single views and analyze it to make business decisions. This is a supply chain of information, flowing throughout the organization. Unlike a traditional supply chain, an information supply chain is a many-to-many relationship. With information, the same data about a person can come from many places – he may be a customer, an employee and a partner – and that information can end up in many reports and applications. Different systems may define the information differently as well. This makes integrating information, ensuring its quality and interpreting it correctly crucial to using the information to make better decisions. Information must be turned into a trusted asset, and governed to maintain the quality over its lifecycle. The underlying systems must be cost effective, easy to maintain and perform well for the workloads they need to handle, even as information continues to grow at astronomical rates. These are the needs that have driven IBM’s strategy and investments in this area.
  4. InfoSphere is your trusted platform for managing trusted information that is comprehensive, integrated and intelligent. InfoSphere Optim is part of the InfoSphere Platform under Information Lifecycle Management.
  5. To stay competitive, organizations look to improve business processes and better manage their information. But there are roadblocks along the way. So, what are some of those challenges when managing this lifecycle? New application functionality to meet business needs is not deployed on schedule: organizations are often challenged by application releases. I like to say that software development and the airline industry have a lot in common – they are always on time, until they are late. One reason for these delays is the creation & management of test and development environments. Often this is done by simply cloning production data to create test data. But depending upon the size of production, this method can impede progress. How long does it take to create them? Refresh them? Are developers waiting around? In addition, challenges include: no understanding of relationships between data objects, which delays projects; greater data volumes that take longer to clone, test, validate and deploy, which equates to longer test cycles; inability to replicate production conditions in test; and disclosure of confidential data kept in test/development environments. And if you’re using production data to create the test/dev environments, how are you keeping track of sensitive data? Are you in compliance with industry regulations? Does the developer upgrading the HR/payroll system *really* need to see everyone’s salary information? Or can that person use realistic – but fictional – data to complete the work? Application defects or database errors are discovered after deployment. Are you able to easily validate test data to ensure application errors are caught in the test/dev/QA process? Or is much of the test & development time spent sifting through and fixing the data vs. the application?
Costs to resolve defects in production can be 10 – 100 times greater than those caught in the development environment. Inability to meet SLAs for responsiveness and availability. Increased operational and infrastructure costs impact the IT budget: as data volumes increase year after year – compounded by the cloning needed for test & development – do you have enough disk storage, or enough database licenses, to create the needed non-production environments? How does this impact your IT staff resources? How does this impact system uptime for your business users? Cloning databases requires more storage hardware; larger databases impact staff productivity and lead to additional license costs; load simulators and complex database test scripts require highly skilled staff. As this quote from Forrester indicates, many organizations today are keeping a lot of infrequently used data in production databases. And this can have a negative impact on both performance & IT costs, as we’ll see when we discuss more on the challenges of managing growing amounts of data.
  6. Most large enterprises have petabytes of data stored in various data repositories across the organization — and this is likely to grow exponentially in the coming years. As a result, enterprises store more data every year in production systems. With these increasing data volumes come increasing costs as well as increasing challenges in securing and managing online data to deliver high performance and availability. Forrester’s November 2010 Global Database Management Systems Online Survey evidenced this trend: Respondents indicated that the top five challenges they face are delivering improved performance, delivering higher availability, dealing with high data volume growth, increasing data management costs, and database upgrades.
  7. Some organizations take a “reactive” approach to managing the lifecycle of their data, taking action only after the “headaches” begin to impact application efficiency. From a recent IBM survey, here are some key areas that our clients identified as catalysts to their search for a better data lifecycle management approach. High capital expenditures: In a “reactive mode”, often the solution was to add more high-performance hardware to ensure there was enough storage and to improve the declining performance of their existing infrastructure. Decreased productivity: Poor application performance impacted the business users’ ability to perform daily tasks. Batch processes were creeping into working hours. IT staff was frantically trying to tune databases or add more storage in response. And this led to… Missed service level agreements: which can impact revenue and customer satisfaction if databases and applications are not responding as they should. Ad hoc performance management: Without a clear strategy in place to manage and optimize application performance, inefficient ad hoc fixes were leveraged, draining IT resources. As organizations strive to stay ahead of the competition, a more “proactive” approach to data lifecycle management is needed to ensure the data is accessible and trusted.
  8. Let’s take a closer look at the requirements needed to manage the data lifecycle across the information supply chain. There are four main areas organizations should focus on for streamlining management of the data lifecycle and making applications more efficient: Discover & Define: Understanding where data resides, what domains of information exist, how it’s related across the enterprise, and defining the policies and standards for management of it. Develop & Test: Creating the database structures and re-useable database code to enhance productivity & team collaboration, efficiently creating the test & development environments (and protecting sensitive data within), and leveraging actual production workloads through database workload capture and replay. Optimize & Archive: Ensuring optimal application performance, archiving historical data to manage data growth, and ensuring business users have effective access to the data – both production and archived. Consolidate & Retire: Rationalizing the application portfolio, consolidating and decommissioning applications that are redundant or no longer align with current IT technology – but maintaining access to the data per data retention rules, long after the application has been retired. DETAIL The core products for each stack are as follows: Discover & Define: InfoSphere Discovery, InfoSphere Business Glossary, InfoSphere Data Architect. Develop & Test: InfoSphere Optim Development Studio, InfoSphere Optim pureQuery, InfoSphere Optim Test Data Management Solutions, InfoSphere Optim Privacy (Masking) Solutions. Optimize, Archive & Access: InfoSphere Optim Performance Manager, InfoSphere Optim pureQuery, InfoSphere Optim Data Growth Solution. Consolidate & Retire: InfoSphere Optim Application Consolidation (including ERP systems)
  9. In order to define a governance strategy and a process to achieve your organization’s goal, you first have to understand what you have. Without this, you cannot create an effective plan that will support your organization. This process begins with understanding the web of information represented in your enterprise applications and databases. You must understand: where the data exists and what data elements there are; what complex relationships exist within and across sources; where the historical and reference data is for archiving; what test data is needed to satisfy the test cases; and where sensitive data is located. Many organizations rely on documentation (which is often out-dated) or on system/application experts for this information. Sometimes, this information is built into application logic, and the hidden relationships that might be enforced behind the scenes are not apparent to anyone. It’s all about time, cost and risk. Trying to manually understand this information (or using the ‘spot check’ approach) can lead you down the wrong path, resulting in many lost hours and potential delays in project deployment. More information for speaker: The solutions necessary for the process by which we locate and understand the data relationships: Locate and inventory the databases across the enterprise. Again, you can’t govern data if you don’t know where it resides. So ensure your solution can help you discover and document the data entities and the databases that reside in the enterprise. Define business objects* across heterogeneous databases & applications. Understand how data is related across the enterprise to better deploy new functionality and ensure that the complete business object is captured when archiving data.
Define enterprise-standard data models. For example, set up your data model to estimate database growth capacity to determine when to archive historical data. Understand transformation rules to discover data relationships. For example, if you ever were to retire an application, you need to understand the underlying business logic to ensure you capture the needed related data so your archived files make sense (See example of this in slide 15). Understand relationships required for identifying sensitive data – simple, embedded or compound. How is sensitive data related to other areas across the enterprise? Ensure it’s protected everywhere, consistently. Define and document the privacy & masking rules and propagate them to ensure sensitive data will be protected. How is that data going to be used? Who should have access to it and why? And as you mask sensitive data in one table, how do you ensure all related data elements are masked with the same information, keeping the referential integrity of the test data? Leverage a unified schema builder to create prototypes before deployment. When you think about managing data across its lifecycle, at some point, you may need to retire applications and consolidate the data. By pre-testing the data that needs to be consolidated, you can ensure developers can update and/or deploy applications or new functionality with confidence.
  10. What is a “complete business object?” Why is it important to capture a complete business object? Business objects represent the fundamental building blocks of your application data records. From a business perspective, for example, a business object could be a payment, invoice, paycheck or customer record. From a database perspective, a business object represents a group of related rows from related tables across one or more applications, together with its related “metadata” (information about the structure of the database and about the data itself). Data lifecycle management solutions that capture and process the complete business record thus create a valid snapshot of your business activity at the time the transaction took place – an “historical reference snapshot.” For example, when you archive a complete business object, you create a standalone repository of transaction history. If you are ever asked to provide proof of your business activity (for example, if you receive an audit or e-discovery request), your archive represents a “single version of the truth.” You can simply query the archive to locate information or generate reports. Another example: when you create test data, identifying complete business objects allows organizations to create right-sized, referentially preserved test environments. Federated object support means the ability to capture a complete business object from multiple related applications, databases and platforms. For example, Optim can extract a "customer" record from Siebel, together with related detail on purchased items from a legacy DB2 inventory management system. Federated data capture ensures that your data management operations accurately reflect a complete, end-to-end business process. Only Optim provides federated object support.
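The "complete business object" idea above can be sketched in a few lines. The schema below (customers, orders, payments) is hypothetical and SQLite stands in for the real databases; this is not Optim's extraction engine, just a minimal illustration of capturing a root row together with every related row across related tables.

```python
import sqlite3

# Hypothetical schema for illustration only. The "complete business object"
# is a root row (a customer) plus all related rows from related tables.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders    (id INTEGER PRIMARY KEY,
                            customer_id INTEGER REFERENCES customers(id),
                            total REAL);
    CREATE TABLE payments  (id INTEGER PRIMARY KEY,
                            order_id INTEGER REFERENCES orders(id),
                            amount REAL);
    INSERT INTO customers VALUES (1, 'Acme Corp');
    INSERT INTO orders    VALUES (10, 1, 250.0), (11, 1, 99.0);
    INSERT INTO payments  VALUES (100, 10, 250.0);
""")

def extract_business_object(con, customer_id):
    """Capture a customer together with every related order and payment."""
    cust = con.execute("SELECT id, name FROM customers WHERE id=?",
                       (customer_id,)).fetchone()
    orders = con.execute("SELECT id, total FROM orders WHERE customer_id=?",
                         (customer_id,)).fetchall()
    payments = con.execute("""SELECT p.id, p.amount FROM payments p
                              JOIN orders o ON p.order_id = o.id
                              WHERE o.customer_id=?""",
                           (customer_id,)).fetchall()
    # The snapshot travels as one unit, so it stays meaningful on its own.
    return {"customer": cust, "orders": orders, "payments": payments}

obj = extract_business_object(con, 1)
```

Archiving or subsetting the dictionary-shaped snapshot rather than individual tables is what keeps the captured history a self-contained "single version of the truth."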
  11. Let’s take a closer look at the requirements needed to manage the data lifecycle across the information supply chain. There are four main areas organizations should focus on for streamlining management of the data lifecycle: Discover & Define: Understanding where data resides, what domains of information exist, how it’s related across the enterprise, and defining the policies and standards for management of it. Develop & Test: Creating the database structures and re-useable database code to enhance productivity & team collaboration, efficiently creating the test & development environments (and protecting sensitive data within), and leveraging actual production workloads through database workload capture and replay. Optimize & Archive: Ensuring optimal application performance, archiving historical data to manage data growth, and ensuring business users have effective access to the data – both production and archived. Consolidate & Retire: Rationalizing the application portfolio, consolidating and decommissioning applications that are redundant or no longer align with current IT technology – but maintaining access to the data per data retention rules, long after the application has been retired. DETAIL The core products for each stack are as follows: Discover & Define: InfoSphere Discovery, InfoSphere Business Glossary, InfoSphere Data Architect. Develop & Test: InfoSphere Optim Development Studio, InfoSphere Optim pureQuery, InfoSphere Optim Test Data Management Solutions, InfoSphere Optim Privacy (Masking) Solutions. Optimize, Archive & Access: InfoSphere Optim Performance Manager, InfoSphere Optim pureQuery, InfoSphere Optim Data Growth Solution. Consolidate & Retire: InfoSphere Optim Application Consolidation (including ERP systems)
  12. Organizations continue to be challenged with building and delivering quality applications. Costs go spiraling up as defects are caught late in the cycle, where they are expensive to correct. They are challenged with increasing risks associated with protecting data and complying with regulations. Time to market for these applications is critical to business success, yet long testing cycles and limited resources often delay delivery of the software… inadequate test environments and a lack of realistic test data are contributing reasons for this challenge.
  13. The challenges are real… $300 billion is the annual cost of software-related downtime. An FAA server used for application development & testing was breached, exposing the personally identifiable information of 45,000+ employees. 62% of companies use actual customer data to test applications, exposing sensitive information to testers and developers. Testing teams spend 30-50% of their time setting up test environments instead of testing.
  14. Let’s look at some specific challenges related to the impact of inefficient test practices and what our customers are saying: creating realistic test data for their testing efforts; lack of insight into the data environment, so developers and testers don’t understand how to work with data; SLAs missed due to lack of development and DBA communication; simply cloning entire production creates duplicate copies of large test databases; data masking requirements are not addressed. Quote References: 1st Quote: http://www-01.ibm.com/software/success/cssdb.nsf/CS/JHAL-7ZLTW7?OpenDocument&Site=dmmain&cty=en_us 2nd Quote: http://www-01.ibm.com/software/success/cssdb.nsf/CS/LWIS-7E2S6V?OpenDocument&Site=dmmain&cty=en_us 3rd Quote: http://www-01.ibm.com/software/success/cssdb.nsf/CS/LWIS-7F66X2?OpenDocument&Site=default&cty=en_us
  15. For generating the test data, it’s critical to productivity to create “right sized” subsets for all your testing needs, allowing testers and developers to easily extract, refresh and create properly sized data sets. After running tests, relationally compare result sets from the new data set and the actual production data to see the exact differences – and only the differences. This can help resolve application defects faster. Part of effective test data management is the ability to protect the sensitive data within these non-production environments. Ensure sensitive test data is masked while maintaining the referential integrity of the data, while ensuring this data transformation is appropriate to the context of the application. That is, the results of data transformation have to make sense to the person reviewing the test results. For example, if an address is needed, you would like to use a street address that actually exists as opposed to using something meaningless like XXXXXX as a street name.
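One common way to keep masked data referentially intact is deterministic masking: the same real value always maps to the same fictional value, so related copies of that value still join after masking. The sketch below is illustrative only — the pseudonym scheme, sample street list and records are invented, and a real masking engine (Optim's included) is far more sophisticated about producing contextually realistic values.

```python
import hashlib

# Invented list of real-looking streets, so masked addresses make sense
# to a tester instead of reading "XXXXXX".
FAKE_STREETS = ["Main St", "Oak Ave", "Elm Dr", "Maple Ln"]

def mask_name(value: str) -> str:
    # Deterministic: hashing the input yields a stable pseudonym, so the
    # same real name masks to the same fictional name everywhere.
    digest = hashlib.sha256(value.encode()).hexdigest()
    return f"Customer-{digest[:8]}"

def mask_street(value: str) -> str:
    # Contextually appropriate: pick a street that could plausibly exist.
    idx = int(hashlib.sha256(value.encode()).hexdigest(), 16) % len(FAKE_STREETS)
    return f"{idx + 1} {FAKE_STREETS[idx]}"

customers = [{"id": 1, "name": "Jane Doe", "street": "42 Real Rd"}]
orders    = [{"id": 10, "customer_name": "Jane Doe"}]  # denormalized copy

masked_customers = [{**c, "name": mask_name(c["name"]),
                     "street": mask_street(c["street"])} for c in customers]
masked_orders    = [{**o, "customer_name": mask_name(o["customer_name"])}
                    for o in orders]

# The denormalized name in orders still matches the masked customer name,
# so the referential integrity of the test data is preserved.
assert masked_customers[0]["name"] == masked_orders[0]["customer_name"]
```

Note that plain hashing like this is only a sketch of the join-consistency idea; production masking also has to worry about format preservation, value distributions and reversibility policies.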
  16. With InfoSphere Optim Test Data Management Solution, organizations can make test data management a best practice, helping them to reduce costs, lower risk, and expedite delivery in three key ways: Automate the creation of realistic, “right-sized” test data to reduce the size of test environments Mask sensitive information for compliance and protection Refresh test data, thereby speeding up testing and application delivery. Key differentiators include: Understand what test data is needed for test cases Create “right-sized” test data by subsetting Ensure masked data is contextually appropriate to the data it replaced, so as not to impede testing Easily refresh & maintain test environments by developers and testers Automate test result comparisons to identify hidden errors Support for custom & packaged ERP applications in heterogeneous environments
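The "right-sized" subsetting idea can be reduced to a simple sketch: choose a slice of root records, then pull only the child rows that reference them, so every foreign key in the subset still resolves. The tables and sizes below are invented for illustration; this is not how Optim implements subsetting.

```python
# Invented production data: 100 customers, each with 3 orders.
customers = [{"id": i, "name": f"cust{i}"} for i in range(1, 101)]
orders = [{"id": 1000 + i, "customer_id": (i % 100) + 1} for i in range(300)]

def subset(customers, orders, keep_ids):
    """Keep a slice of root rows plus only the child rows that reference them."""
    kept_customers = [c for c in customers if c["id"] in keep_ids]
    kept_orders = [o for o in orders if o["customer_id"] in keep_ids]
    return kept_customers, kept_orders

# A 10%-sized test environment instead of a full production clone.
keep = set(range(1, 11))
sub_customers, sub_orders = subset(customers, orders, keep)

# Referential integrity check: every order's customer exists in the subset.
cust_ids = {c["id"] for c in sub_customers}
assert all(o["customer_id"] in cust_ids for o in sub_orders)
```

The payoff described in the notes follows directly: a subset built this way is small enough to refresh quickly, yet complete enough that application joins and constraints behave as they would against production.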
  17. How will a database configuration change affect the enterprise? What about an application change? With today’s complex enterprises, it’s not easy to anticipate if a change will disrupt business operations. Often organizations resort to finger pointing. Change windows are very tight and most shops squeeze in as many changes as they can during these short blocks of time. This makes it difficult to isolate a particular change as the problem source.
  18. Business Pains: inability to deliver required functionality to customers; missed service level agreements; loss of customer satisfaction; inability to process transactions, resulting in lost revenue; cost of adding additional HW or SW to relieve immediate pain; cost of issue remediation. IT Pains: time-consuming, labor-intensive process of rolling back changes; inability to identify and correct the source of the problem; tedious manual process of modifying test scripts. Disruptions: business applications become unavailable; business application response time degrades. Costs: business lost – unable to process business transactions; opportunity lost – unable to deliver competitive functionality to market; IT budget lost – extra costs needed to roll back changes and start over; revenue drain as employees wait for the system; hardware costs – new hardware needed to solve capacity issues.
  19. With more realistic testing scenarios, organizations can: Manage life-cycle events such as changes in hardware, workloads, databases or applications efficiently without production impact Develop accurate, streamlined tests to speed product and service delivery Skip laborious test script creation and load emulators Identify and quickly correct potential problems from enterprise changes Help ensure optimal SQL performance even as enterprise changes are deployed Complement existing regression, functional and performance tests with deeper analysis of the data layer Meet service-level agreements (SLAs) for application and database responsiveness and availability
  20. Reduce the cost of lifecycle changes (upgrades, migrations, consolidations, retirements, new application deployment or growth) Limit laborious database test script creation by leveraging actual production workloads for testing Limit load simulators with the capability to speed up and slow down the replay Deploy a single, repeatable process for database testing transparently across heterogeneous systems with minimal performance overhead Lower risk of lifecycle changes (upgrades, migrations, consolidations, retirements, new application deployment or growth) Develop more streamlined, accurate tests by leveraging actual production workloads for testing Identify and correct potential problems sooner with validation reports Accelerate project delivery with deep diagnostics of potential database problems Meet SLAs for availability, reliability and performance Ensure a well tuned and high performing workload before production deployment with performance tuning Extend quality testing efforts to include tailored database testing Integrate with existing database tools to get a complete view of production workloads Using actual production workloads for testing gives you insight into the best way to tune the database. This leads to better SQL performance. Using realistic testing give you real results. This is better than: • Guessing • Estimates • Rules of thumb When you know how actual production SQL is going to behave, you can tune the database better and be able to find and resolve problems sooner! We all know at the end of the day, better testing means: • Better user experience • Improved employee productivity • Allows for company growth with existing IT resources • Lower total cost of ownership
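The capture-and-replay concept described in the last two notes can be shown in miniature: record the statements actually issued against one database, replay them in order against a test copy, and compare the results. SQLite and the helper names below are assumptions for illustration, not the product's mechanism, which captures full production workloads with timing and concurrency.

```python
import sqlite3

captured = []  # the recorded "workload": (sql, params) pairs in order

def capture(con, sql, params=()):
    """Execute a statement and record it for later replay."""
    captured.append((sql, params))
    return con.execute(sql, params).fetchall()

def make_db():
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, v TEXT)")
    con.executemany("INSERT INTO t (v) VALUES (?)", [("a",), ("b",)])
    return con

# "Production": real queries flow through the capture wrapper.
prod = make_db()
prod_results = [capture(prod, "SELECT v FROM t WHERE id=?", (i,)) for i in (1, 2)]

# Replay the captured workload, in order, against a test database.
test = make_db()
test_results = [test.execute(sql, params).fetchall() for sql, params in captured]

# A validation report in miniature: did the test system behave the same?
assert prod_results == test_results
```

The value over fabricated test scripts is exactly this comparison step: the replayed workload is the real one, so any divergence in results or performance points at the change under test rather than at the test harness.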
  21. Let’s take a closer look at the requirements needed to manage the data lifecycle across the information supply chain. There are four main areas organizations should focus on for streamlining management of the data lifecycle: Discover & Define: Understanding where data resides, what domains of information exist, how it’s related across the enterprise, and defining the policies and standards for management of it. Develop & Test: Creating the database structures and re-useable database code to enhance productivity & team collaboration, efficiently creating the test & development environments (and protecting sensitive data within), and leveraging actual production workloads through database workload capture and replay. Optimize & Archive: Ensuring optimal application performance, archiving historical data to manage data growth, and ensuring business users have effective access to the data – both production and archived. Consolidate & Retire: Rationalizing the application portfolio, consolidating and decommissioning applications that are redundant or no longer align with current IT technology – but maintaining access to the data per data retention rules, long after the application has been retired. DETAIL The core products for each stack are as follows: Discover & Define: InfoSphere Discovery, InfoSphere Business Glossary, InfoSphere Data Architect. Develop & Test: InfoSphere Optim Development Studio, InfoSphere Optim pureQuery, InfoSphere Optim Test Data Management Solutions, InfoSphere Optim Privacy (Masking) Solutions. Optimize, Archive & Access: InfoSphere Optim Performance Manager, InfoSphere Optim pureQuery, InfoSphere Optim Data Growth Solution. Consolidate & Retire: InfoSphere Optim Application Consolidation (including ERP systems)
  22. Organizations today have been increasingly challenged with successfully managing data growth. They have large volumes of data stored in various data repositories across the organization — and this is likely to grow exponentially in the coming years. As a result of this rampant growth, these companies store more and more data every year in production systems. If your client has enterprise applications such as HR, Finance, and Customer Support systems – and they likely have many – then they’re dealing with the effects of increasing data volumes. Today’s organizations are met with 3 primary challenges: Increasing costs: As the volumes of data increase, the “natural” response is to simply buy more storage for the enterprise application. After all, “storage is cheap”, right? However, while the acquisition of storage hardware is cheap, the operational costs associated with it are often underestimated. Said differently, for every $1 spent on storage, organizations spend $4 on the operational elements of managing that stored data. Poor application performance: Over time, as data volumes increase, application performance will be an issue. And what often occurs is that your clients “self-diagnose” this performance issue as simply a hardware problem – “let’s buy faster systems”; or they may task their DBAs to tune and re-tune the database. However, without properly managing the data volumes, companies will see the impact in the performance of their system over time. This is particularly a problem when application performance impacts employee productivity. Risks associated with data retention and compliance: The “keep everything” approach in production systems can also create a risk associated with accessing this data for the long term. So, how can companies safely store data that is no longer needed in production systems, but that must be retained per data retention policies and for compliance purposes?
How can organizations store this data so that it’s audit-ready, easily accessible, and available for any e-discovery requests?
  23. And the challenges are real: The cost of *managing* storage can be 3-10 times the cost of procuring it, and IDC estimated that last year, organizations spent about $1.1 billion on storage costs. Ask the DBA how much time they’re spending each week on hardware capacity-related performance issues – it can be up to 80%. One InfoSphere Optim client had about 19,000 batch processes that took 250 hours to run – that’s more than 10 days. Archiving helped them reduce that time by over 75%. Talk to your clients about how long they’re keeping data in their production systems. How long are they *supposed* to keep it at all? At least 50% of companies are keeping data for 7 or more years. And 57% of companies are leveraging their back-up copies for “data retention”. That’s a lot of data to sift through if you have an e-discovery request.
  24. Let’s first define what we mean by archiving. It should be an intelligent process for MOVING inactive or infrequently accessed data that still has value (e.g., for data retention needs). [Recall Forrester’s statistic: 75% of data in large OLTP applications is typically inactive!] PLUS, the archive process should provide the ability to search and retrieve the data in the manner that functional users need to consume it. This is a typical example of a production environment prior to archiving. Initially, both active and inactive data is stored in the production environment, taking up most of the space on the production server. Safely move the inactive or historical data to an archive, capturing the complete business object for application-independent access. This data can then be stored in a variety of environments, and can be easily retrieved to an application environment when additional business processing is required. You then have universal access to this data through multiple methods, including report writers such as Cognos and Crystal Reports, XML, ODBC/JDBC, and application-based access (Oracle, Siebel, etc.).
  25. Let’s take a closer look at the requirements for managing the data lifecycle across the information supply chain. There are four main areas organizations should focus on to streamline management of the data lifecycle: Discover & Define: Understanding where data resides, what domains of information exist and how it’s related across the enterprise, and defining the policies and standards for managing it. Develop & Test: Creating the database structures and reusable database code to enhance productivity and team collaboration, efficiently creating the test and development environments (and protecting sensitive data within them), and leveraging actual production workloads through database workload capture and replay. Optimize & Archive: Ensuring optimal application performance, archiving historical data to manage data growth, and ensuring business users have effective access to the data — both production and archived. Consolidate & Retire: Rationalizing the application portfolio, consolidating and decommissioning applications that are redundant or no longer align with current IT technology — while maintaining access to the data per data retention rules, long after the application has been retired. DETAIL The core products for each stack are as follows: Discover & Define: InfoSphere Discovery, InfoSphere Business Glossary, InfoSphere Data Architect Develop & Test: InfoSphere Optim Development Studio, InfoSphere Optim pureQuery, InfoSphere Optim Test Data Management Solutions, InfoSphere Optim Privacy (Masking) Solutions Optimize, Archive & Access: InfoSphere Optim Performance Manager, InfoSphere Optim pureQuery, InfoSphere Optim Data Growth Solution Consolidate & Retire: InfoSphere Optim Application Consolidation (including ERP systems)
  26. We’ve come full circle now, talking about the requirements for your data and applications, how to deploy and optimize those applications, and how to access the data effectively. Now we come to the retirement phase — or to put it another way, application rationalization. Looking around the data center, there are likely one or two — or more — systems that are merely “kept alive” because there might be important data stored within — but how will you access it? Have you really looked at what’s in the data there? How old is that application? Is the system still supported? Does anyone know what it does? Is it redundant to more current systems? How much is this older application costing — licenses, power consumption, extended support agreements, data center footprint, etc.? If the solution is to consolidate or retire the application, the next big concern is what to do with the data. We don’t want to move it all into the consolidated application: that would cause performance issues, much of the data may not be appropriate in the new context, and it could grow our data volume too large — but for business, governance and regulatory reasons, we must keep it. So, the best scenario is to provide access to the data without incurring the cost of database software or servers, and without relying on the application to access it. All of that said, in most cases, we still need to get to the data. Analyst quote: Organizations facing application retirement projects should look…to provide a way to get data that must be retained into a format that can be accessed independently of the retired application. Source: Carolyn Dicenzo, Gartner, “Database-Archiving Products Are Gaining Market Traction”, October 2008
  27. So, the best scenario is to provide access to the data without incurring the cost of database software or servers, and without relying on the application to access it — that is, application-independent access to this data. Examples of application retirement benefits: Once data from similar business applications is consolidated and redundant applications are retired, a skilled DBA can redirect productive time toward implementing an ERP package, rather than maintaining a patchwork of databases that support outdated legacy applications. Another benefit: when you rationalize your infrastructure, you also reduce its complexity and therefore reduce business risk. For example, by consolidating a dozen homegrown general ledger applications into a packaged ERP solution, you can provide business-critical support and reduce the risk of missing key processing deadlines, such as a month-end close.
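To make the idea of application-independent access concrete, here is a minimal sketch — not Optim’s actual mechanism; the table, column names and helper function are invented for illustration — of exporting retained data from a soon-to-be-retired system into plain CSV, a format that stays readable long after the application itself is gone:

```python
import csv
import io
import sqlite3

def export_table(conn, table, out_stream):
    """Dump one table, including its column headers, to CSV so the
    retained data can be read without the original application."""
    cur = conn.execute(f"SELECT * FROM {table}")
    writer = csv.writer(out_stream)
    writer.writerow([col[0] for col in cur.description])  # header row
    writer.writerows(cur.fetchall())

# Demo: an in-memory database standing in for a legacy general ledger.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ledger (id INTEGER, account TEXT, amount REAL)")
conn.executemany("INSERT INTO ledger VALUES (?, ?, ?)",
                 [(1, "GL-100", 250.0), (2, "GL-200", 75.5)])

buf = io.StringIO()
export_table(conn, "ledger", buf)
print(buf.getvalue().strip())
```

A real retirement project would also preserve metadata and relationships (the complete business object), but the principle is the same: the output no longer depends on the retired application or its database licenses.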
  28. So, how can IBM help clients with these challenges? Through effective data growth management. InfoSphere Optim Data Growth Solution can help clients REDUCE COSTS, IMPROVE APPLICATION PERFORMANCE AND MINIMIZE THE RISKS associated with managing application data over its lifetime. Reduce costs: By archiving infrequently used data from production environments, that data can be stored on less expensive, tier-2 storage, and can be compressed to save even more storage space. Improve performance: With less data, application performance improves; searches and batch processes run faster; back-up processes run more efficiently. If your client is considering an application upgrade, archiving can streamline the associated data conversion process — the less data to convert to the upgraded version, the less time the application is offline. Minimize risks: Intelligently archiving data out of production systems allows for data retention compliance, and also supports a better long-term solution for storing the application data, providing application-independent access to it. InfoSphere Optim Data Growth Solution is the proven, market-leading solution that: With InfoSphere Discovery, discovers and identifies data record types to archive across heterogeneous environments; Intelligently archives data, capturing and storing historical data in its original business context; Defines and maintains data retention policies consistently across the enterprise; Ensures long-term, application-independent access to archived data via multiple access methods, including third-party reporting tools and Data Find, a web-based search engine; Supports custom and packaged ERP applications in heterogeneous environments
  29. These are some recent quotes from industry analysts on InfoSphere Optim. For example: “IBM’s Optim product line led the database archiving and ILM segment in 2010 with a 52.3% share, and showed nearly 18% growth in 2010.” Forrester said: “Today, IBM continues to lead the industry with the most comprehensive data archiving solution and the largest installed base … IBM’s customers spoke highly of the Optim solution’s reliability and strong performance. Organizations can realize benefits in the form of improved operational and capital cost savings, improved IT and end user efficiency, as well as higher levels of data protection and application performance [with InfoSphere Optim].”
  30. “IBM enjoys the largest market share of all vendors profiled in this research…” “[InfoSphere] Optim supports both mainframe and open-system environments, and many of IBM’s customers are using it in heterogeneous environments.” “Customers cite a ‘small company feel’ when asked to describe their interaction with [InfoSphere] Optim sales and support.”
  31. Governing the data lifecycle with IBM® InfoSphere software improves application efficiency by better managing data growth, managing test data and enabling efficient application upgrades, consolidation and retirement: Reduce the cost of data storage, software and hardware Improve application efficiency and performance Reduce risk and support compliance with retention requirements Speed time to market and improve quality
  32. Why IBM? IBM InfoSphere Optim provides proven test data management, data masking, and database archiving capabilities that enable organizations to improve application reliability, mitigate risk and control costs. About Allianz The Allianz Group provides its more than 60 million clients worldwide with a comprehensive range of services in the areas of property and casualty insurance, life and health insurance, asset management and banking. Its subsidiary, Allianz Seguros, ranks second in the market, with over 2.4 billion euros in premiums. The company offers a comprehensive range of insurance products and services, including individual and group life, health, home, casualty, auto, boating and more to meet the needs of nearly three million clients. Allianz Seguros has contributed to the “3 + One” business model, a successful strategic program initiated by the Allianz Group to achieve sustainable and profitable growth. Primary objectives of the “3 + One” program are to fortify the capital base, improve operations, reduce complexity and increase sustainable competitiveness and value. An increase in the number of insurance premiums, a reduction in the claims ratio and continuous management improvement have contributed to continued business growth. Allianz Seguros relies on several mission-critical mainframe insurance applications, developed in-house, to manage operations in all areas of its business activities. Insurance agents, claims representatives and accounting staff in all branch offices rely on these applications to manage information for policy management, claims processing and premium billing, among other activities. Delivering application enhancements presents challenges Allianz Seguros has its own internal application development and quality assurance teams that develop and enhance application functionality. The ability to deliver new insurance products and services is important to promote continued business growth, but doing so presented several challenges.
“Our primary challenge was to improve the efficiency of the development and quality assurance processes by reducing the size of the development and testing environments,” said Ramon Lasurt, Director of Development at Allianz Seguros. “Next, we wanted a way to ensure accuracy by preserving the integrity of the test data, and finally, we needed to mask client information in the development and testing environments to protect privacy.” There were often at least three development and testing mainframe environments in use at the same time. The quality of these environments degraded quickly because they were used for multiple tests. Every few months, the development team would use its in-house “subsetting” program to copy test data from its large application production environment, comprising about 700 GB with tables that contained over 200 million rows. “We would refresh these development and testing environments as needed,” said Xavier Mascaró, Senior DBA at Allianz Seguros. “However, the refresh process, using data cloned from large production databases, was very complex and time consuming, and the results often affected the integrity of the data. We estimated that reducing the testing environments to only 10 percent of the production environment would provide significant time and cost savings.” Because Allianz needed to protect confidential client information to comply with the Spanish Law of Protection of Personal Data (LOPD), privacy protection remained a high priority. In fact, recent revisions to the LOPD held individual staff members responsible for protecting client records. To address this need, the DBA team had to write special programs to mask client names, tax ID numbers (Número de Identificación Fiscal or NIF) and national identifiers (Documento Nacional de Identidad or DNI) and then move data into the application development and testing environments.
Preserving the integrity of the test data presented another challenge because the data structures that supported the insurance applications included dozens of complex relationships. Although the in-house subsetting program offered some of the needed functionality, to ensure valid test results, the development team needed a test data management solution that would accurately preserve the referential integrity of the data for even the most complex data relationships. InfoSphere Optim improves test data management After researching to find a solution, senior management at Allianz Seguros decided to evaluate IBM InfoSphere Optim. Members of the evaluation team included the Senior DBA, as well as the Chief of Technology and Production, the Director of Development and the Director of Systems, who both report to the Director General. “After attending a demonstration, the members of our evaluation team agreed that InfoSphere Optim provided the capabilities we needed to improve application development and testing processes and protect privacy,” said Xavier Mascaró. Immediately after purchasing InfoSphere Optim, the development team focused on defining the relationships and criteria for subsetting, based on their complex relational database environments. In addition to using relationships defined to the database, InfoSphere Optim offered flexibility for defining and managing the complex relationships defined within the application logic. Next, the DBA team implemented InfoSphere Optim in its integration testing environment. In-house archiving presents challenges A few years after the positive experience of implementing InfoSphere Optim for test data management, Allianz Seguros turned its attention to InfoSphere Optim’s database archiving capabilities. Applications, such as Vida (life insurance) and Contabilidad de Agentes (agent accounts), continued to collect historical records, and this information was never deleted. 
In Spain, there is no law that states that insurance records must be retained for a specific number of years. However, in the insurance business, a policy can be in effect for a lifetime. It was important to retain access to these historical records. Although the IT department had been using an in-house archiving program for years to manage data growth, their methods for accessing and retrieving archived information were time consuming. For example, if an agent requested specific archived insurance claims records, it was necessary to recover one or several files from a backup tape and send a printed copy to the requestor. This process could take between one and four days, which had a negative impact on policy management, claims processing and other insurance service activities. InfoSphere Optim ensures access to historical records InfoSphere Optim’s proven database archiving capabilities offered Allianz Seguros several advantages over its in-house archiving program. First, InfoSphere Optim archives application transaction records in complete business context, in effect creating historical reference snapshots of the business. Archives can be saved to a variety of storage media for easy retrieval. In addition, using InfoSphere Optim delivers more capabilities and eliminates the need to maintain the in-house archiving program.
  33. Challenges Improve development and testing strategies to deploy a new Pension Earnings and Accrual System within 30 months. Protect confidential employee salary and pension information in non-production (development, testing and training) environments to satisfy data privacy and TyEL compliance requirements. Why IBM? IBM InfoSphere Optim provides proven test data management and data privacy capabilities that support the Pension Earnings and Accrual System architecture and satisfy automated testing requirements. Solution IBM InfoSphere Optim Test Data Management Solution IBM InfoSphere Optim Data Masking Solution Benefits Improved development and testing efficiencies, enabling Arek Oy to promote faster deployment of new pension application functionality and enhancements. Protecting confidential data to strengthen public confidence and support TyEL compliance requirements. Headquartered in Finland, Arek Oy, Ltd. was established and is owned by the Finnish Centre for Pensions (ETK) and the country’s authorized pension insurance providers. Arek Oy manages the development of information systems and provides system services to the pension insurance community. Arek provides services to ETK and other pension providers, including Etera, Pension-Fennia, Ilmarinen, the Social Insurance Institution, the Central Church Fund, the Local Government Pensions Institution, the Seamen’s Pension Fund, Pensions-Alandia, Silta, Tapiola, the State Treasury, Varma and Veritas. All employment in Finland is covered by a statutory and compulsory earnings-related pension scheme that is funded by employer and employee contributions. Supporting a large-scale information management project TyEL dictates that it is the employer’s responsibility to arrange for pension insurance and to provide the insurance company with relevant information about employees, including identification, personal information, employment history and salary data. 
The pension insurance company, in turn, registers the data for employees and self-employed individuals, administers the funds and investments, and awards and pays the pensions. All data must be handled in the strictest confidence. Among its various information technology and application development projects, Arek Oy maintains the Pension Earnings and Accrual System that manages all the data that supports earnings-related pensions. This new system is considered Finland’s single largest information management project. Services connected to the pension application are available directly from ETK. All employment records are also available electronically both via pension provider Internet pages and via the pension portal, Tyoelake.fi, which is maintained by ETK. Business challenges and demanding deadlines Arek Oy’s primary business challenge was to manage one of the largest Java 2 Platform, Enterprise Edition (J2EE) custom development efforts in Finland. Specifically, Arek Oy had to develop and deliver a thoroughly tested and reliable Pension Earnings and Accrual System within 30 months. Earnings-related pensions are crucial for each citizen’s well-being and financial security. For an implementation project, failure to meet deadlines would result in sanctions and, at a minimum, countless customer complaints. For Arek Oy the impact would be reclamations and serious loss of both customer goodwill and future business opportunities. Losses could range in the millions of euros. Tasked with the deployment of the new enhancements and functionality for the Pension Earnings and Accrual System within release deadlines, Arek Oy had to improve the efficiency of the application development and testing processes and procedures to support that business initiative. Managing test data poses technical challenges The primary technical challenge was ensuring the applicability of chosen technology for the selected solution architecture and processing needs.
Arek Oy had to complete development efforts within a relatively short timeframe and had to ensure the expected application quality. This meant investing in solutions and methodologies that would enable the controlled delivery of 60,000 man-days within the given timeframe. Test data management and protecting privacy The Arek Oy development team needed a test data management solution with capabilities for creating realistic test data to satisfy specific application testing criteria. Processes for creating these testing environments had to be flexible and repeatable to ensure consistency and accuracy for system development projects. In addition, because of the nature and sensitivity of the personal pension information, Arek Oy needed a solution that would allow for de-identifying the personal data, such as names, addresses, national identifiers, salary and pension amounts, used in the development and testing environments. Next, the selected test data management solution had to support the proposed pension application architecture and satisfy automated testing requirements, as well as provide capabilities to enable developing the Service Oriented Architecture (SOA) and business applications concurrently. Because of the tight development deadlines, capabilities for coordinating the delivery of several concurrent projects were critical. That is, in addition to supporting concurrent development and testing processes, the selected solution had to ensure repeatability and transferability of tests and test data. Adapting a successful solution Since it was founded in 2004, Arek Oy had neither a previous solution nor the IT infrastructure to support the planned development activities. However, ETK had previously built an industry-wide Distributed Test Data Management solution for its own test data requirements. The existing ETK solution, called “Testimaha” was based on the IBM InfoSphere Optim Test Data Management Solution for the open systems environment. 
To speed the deployment of an effective solution, Arek Oy enlisted the expertise of Mainsoft Corporation. As an advanced IBM business partner, Mainsoft is a leading provider of cross-platform services and support. “The further utilization of the ETK concept was a natural solution for Arek Oy because we had to synchronize our test data with the data in the ETK data storage to produce a correct starting point for testing,” said Katri Savolainen, Project Manager at Arek Oy. “Since ETK was utilizing the mainframe and workstation versions of Optim, we knew that many of the test planners and testers would be familiar with Optim’s capabilities. Therefore, we decided to build our own test data management concept based on the one ETK was utilizing.” Arek Oy decided to implement a version of the ETK system using InfoSphere Optim and worked with professionals from Mainsoft to complete the implementation and ensure success. InfoSphere Optim integrated with the existing ETK solution, adapted easily to user-defined working principles, and was easy for the DBA to manage and support. Using InfoSphere Optim’s subsetting capabilities, rather than cloning large production databases, made it possible to create robust, realistic test databases that supported faster iterative testing cycles. In addition, InfoSphere Optim offered proven capabilities for performing complex data masking routines, while preserving the integrity of the pension data for development and testing purposes. Meeting these requirements would ensure accuracy and build confidence in the Pension Earnings and Accrual System, while protecting privacy in the development and testing environments. “We are currently in our second iteration of implementing Optim and we are very pleased with the quality of service and support provided by Mainsoft. At first, we completed and rolled out a mainframe version, in which we used a Java component to call Optim (mainframe). 
This was in production use within five months,” said Katri Savolainen, at Arek Oy. “In December 2006, we switched our production database server to AIX. On this occasion, we re-factored the previous solution into a UNIX script, which utilizes Optim.” Technical and business results Arek Oy has a Distributed Data Management Solution for Test Data, called TAHS, which is used throughout the application development cycle. TAHS supports all phases of application testing, including integration testing, system testing, acceptance testing and customer testing. TAHS is also used in conjunction with automated regression testing tools, which enable developers to prepare automated test scripts. “In June of 2007, we did a maintenance release, and using Optim, we were able to secure the availability of test data for development, even when the production database was undergoing a lengthy conversion,” said Katri Savolainen at Arek Oy. “Optim is now fully implemented and is used extensively by the in-house development team that built the integration to the ETK system and our DBA. In addition, there are a number of developers, test planners and testers who are using Optim through the TAHS application.” “We were able to give our development projects realistic, but masked test data, which satisfied requirements for providing appropriate test data and protecting privacy. We were also able to empower all system test and acceptance test planners to create and maintain their own test data,” said Katri Savolainen, at Arek Oy. “The test planners were able to concentrate on searching for the correct set of data and the best way of utilizing it instead of worrying about the technical transition of the data from our production environment to one of the several test environments.”
  34. Overview Toshiba TEC Europe is a leading subsidiary of the Toshiba Group, a world technology leader that manufactures a wide range of electronic and high-technology products for personal and institutional use. Business need: Proactively manage application data growth to support business expansion and deployment of Oracle® E-Business Suite across business units. Increase application availability by reducing the time to complete 19,000 daily batch processing jobs exceeding 250 hours. Integrate and consolidate data and processes with the other Toshiba European entities to improve service levels and operational efficiencies. Solution: IBM InfoSphere Optim Data Growth Solution for Oracle® E-Business Suite Benefits: Managed continued data growth and deployed Oracle E-Business Suite across business units by archiving to reduce database size by 30 percent. Increased application availability by archiving historical transactions to shorten time to complete 19,000 daily batch processes by 75 percent. Improved service levels and operations by implementing InfoSphere Optim to provide access to current and historical transactions.
  35. The need: The Virginia Community College System wanted an out-of-the box archiving solution for PeopleSoft Enterprise Campus Solutions that would help manage data growth without expensive server upgrades, support compliance requirements, and reduce the time spent on performance tuning and related issues. The solution: Using IBM® InfoSphere™ Optim™ software, VCCS can archive complete historical student records in batches for students who have graduated or been inactive for at least 10 years; access archived data for reporting and analysis; process requests for transcripts against archived student data without having to restore the data to the production environment; and selectively restore a complete record for a single student on demand. The benefits: Effectively manages data growth to improve service levels Offers flexibility to archive 10 or more years of inactive student data Enables staff to selectively restore student records as needed Lowered infrastructure costs by eliminating frequent expensive server upgrades
  36. You want to point customers to the InfoSphere Optim ibm.com page and solution sheet. But let your clients get their own statistics. There is a self-service business value assessment that a client — alone or with your guidance — can leverage to do a quick assessment of how InfoSphere Optim can help their business. We encourage you to leverage this link as you start to talk to your clients about business benefits.
  37. IBM InfoSphere Optim Solutions allow you to manage data through its lifecycle in heterogeneous environments. You may have a lot of data scattered around the organization — how do you find it? How do you know how it relates to other enterprise data? IBM InfoSphere Optim provides a solution to discover the data and its relationships as information comes into the enterprise. You need to develop applications and functionality that can best maintain your data — and you need to effectively test those applications. We provide a solution for DBAs, testers and developers to effectively create and manage right-sized test data while protecting sensitive test data in development and test environments. The day-to-day challenges of managing the lifecycle of your data are intensified by the growth of data volumes. IBM InfoSphere Optim provides intelligent archiving techniques so that infrequently accessed data does not impede application performance, while still providing access to that data. IBM InfoSphere Optim provides a Data Growth solution that helps reduce hardware, storage and maintenance costs. Over time, the applications managing your data will need to be upgraded, consolidated and eventually retired — but not the data. Many organizations today are overburdened with redundant or legacy applications — e.g., as organizations are merged or acquired, so are their IT systems. By leveraging InfoSphere Optim’s Application Retirement solution and archiving best practices you can ensure access to business-critical data for long-term data retention, long after an application’s life expectancy.
  38. Enterprises are generally made up of many technologies, and IBM InfoSphere Optim solutions span support for these technologies as well. We start with storage platforms, including online, near-line and offline. Depending on where the data is in its lifecycle, the data may be stored on these different platforms and be related across them. As we move up the chart, most organizations have systems that span multiple operating systems — whether z/Series, i/Series, Linux, UNIX or Windows — and InfoSphere Optim solutions span support across the different operating systems to manage data across them. Few organizations have just one database management system, either. They will generally have data that runs on the same database platform on multiple operating systems, but also on different DBMSs, and it is critical that a data lifecycle solution supports heterogeneous environments. In a similar light, most organizations have adopted a combination of ERP and CRM packaged systems like SAP, PeopleSoft, Siebel and others, along with custom applications. These systems do not stand alone either; they are integrated together, share information and need to be managed in a consistent manner. Lastly, you can see the different capabilities that we see as critical to managing data across the enterprise, and therefore the solutions available today from InfoSphere Optim: Discovery, Test Data Management, Data Masking, Data Growth Management and Application Retirement. With this, you can see and understand how IBM InfoSphere Optim is a single, scalable, heterogeneous information lifecycle management solution.
  39. Workgroup Edition: Targets the mid- to low-end market; Less functionality; Restricted to less than 6 terabytes of data and a single server; Trade-up available to Enterprise Edition. Enterprise Edition: Targets the mid- to high-end of the market; Unrestricted use.
  40. In order to govern information effectively, you have to understand where that information exists and how it’s related across the organization. Data discovery is the process of analyzing data values and data patterns to identify the relationships that link disparate data elements into logical units of information, or “business objects” (such as customer, patient or invoice). A business object represents a group of related attributes (columns and tables) of data from one or more applications, databases or data sources. Discovery is also used to identify the transformation rules that have been applied to a source system to populate a target such as a data warehouse or operational data store. Once accurately defined, these business objects and transformation rules provide the essential input into information-centric projects like data integration, MDM and archiving. IBM InfoSphere Discovery™ provides market-leading capabilities to automate the identification and definition of data relationships across the complex, heterogeneous environments prevalent in IT today. Covering every kind of data relationship, from simple to complex, InfoSphere Discovery provides a 360° view of data assets. InfoSphere Discovery analyzes the data values and patterns from one or more sources to capture these hidden correlations and bring them clearly into view. InfoSphere Discovery applies heuristics and sophisticated algorithms to perform a full range of data analysis techniques: single-source and cross-source data overlap and relationship analysis, advanced matching key discovery, transformation logic discovery, and more. It accommodates the widest range of enterprise data sources: relational databases, hierarchical databases, and any structured data source represented in text file format. InfoSphere Discovery’s automated capabilities accurately identify relationships and define business objects, speeding deployment of information-centric projects by as much as ten times.
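As a rough illustration of the value-overlap analysis described above — a drastically simplified stand-in for InfoSphere Discovery’s algorithms, with a hypothetical function and sample columns — comparing the distinct values of two columns can surface a candidate relationship between them:

```python
def value_overlap(source_values, target_values):
    """Fraction of distinct source values that also appear in the
    target column -- a simple signal that the columns may be related
    (e.g., an undeclared foreign key)."""
    source, target = set(source_values), set(target_values)
    if not source:
        return 0.0
    return len(source & target) / len(source)

# Hypothetical columns from two tables with no declared relationship.
orders_customer_id = [101, 102, 101, 104]
customers_id = [100, 101, 102, 103, 104]

overlap = value_overlap(orders_customer_id, customers_id)
print(f"overlap = {overlap:.0%}")  # 100%: every order points at a known customer
```

A high overlap ratio like this would flag `orders.customer_id` → `customers.id` as a candidate link worth confirming; real discovery tools combine many such signals (patterns, cardinality, names) rather than relying on one.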
  41. Creating realistic application development and testing environments is critical to delivering the right solutions for the business. However, cloning large production databases for development and testing purposes extends cycle times, increases the amount of data propagated across the organization, and significantly raises costs and governance control issues. The Optim Test Data Management Solution offers proven technology to optimize and automate processes that create and manage data in non-production (testing, development and training) environments. Development and testing teams can create realistic, “right-sized” test databases, made up of one or more business objects, for targeted test scenarios. The Optim Test Data Management Solution also allows teams to easily compare the data from “before” and “after” testing with speed and accuracy. Optim’s capabilities for creating and managing test data enable organizations to save valuable processing time, ensure consistency and reduce costs throughout the application lifecycle.
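A minimal sketch of what "right-sized" subsetting means in practice, assuming a hypothetical customers/orders business object (this is illustrative, not Optim's implementation): selected parent rows are extracted together with the child rows that reference them, so the subset stays referentially intact.

```python
# Illustrative sketch: build a "right-sized" test subset by selecting some
# parent rows plus all child rows that reference them, preserving
# referential integrity. Sample data is hypothetical.

customers = [
    {"customer_id": 1, "name": "Acme"},
    {"customer_id": 2, "name": "Globex"},
    {"customer_id": 3, "name": "Initech"},
]
orders = [
    {"order_id": 10, "customer_id": 1},
    {"order_id": 11, "customer_id": 2},
    {"order_id": 12, "customer_id": 2},
    {"order_id": 13, "customer_id": 3},
]

def subset_business_object(customers, orders, wanted_ids):
    """Select parent rows and every child row that references them."""
    parents = [c for c in customers if c["customer_id"] in wanted_ids]
    children = [o for o in orders if o["customer_id"] in wanted_ids]
    return parents, children

parents, children = subset_business_object(customers, orders, {2})
print(len(parents), len(children))  # 1 2
```

A real business object may span dozens of related tables across applications; the same principle applies, the extract follows the relationships so no child row is orphaned.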
  42. The InfoSphere Optim Data Masking Solution protects an organization’s data in non-production environments by de-identifying (or masking) sensitive or personally identifiable data. The InfoSphere Optim solution doesn’t keep the data from being stolen; rather, it renders the data unusable and of no value if stolen. This protects the business both financially and from loss of information, and provides IT with a simple-to-use solution that supports a common way of protecting data used in non-production (test, development) environments or by third-party contractors. The InfoSphere Optim Data Masking Solution includes a multitude of built-in masking functions, as well as the ability to define your own transformations. There is no longer any reason to needlessly expose sensitive data in your test environments.
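To illustrate the general idea of masking (Optim ships its own built-in masking functions; this sketch is not one of them), a deterministic, format-preserving routine replaces a sensitive value with a repeatable fake, so masked test data still joins consistently across tables while the real value is gone:

```python
# Illustrative sketch of deterministic, format-preserving masking.
# Not an Optim function; the salt and SSN format are example choices.
import hashlib

def mask_ssn(ssn: str, salt: str = "demo-salt") -> str:
    """Replace an SSN with a repeatable, format-preserving fake value.

    Deterministic: the same input always masks to the same output, so
    foreign-key joins across masked test tables still line up.
    """
    digest = hashlib.sha256((salt + ssn).encode()).hexdigest()
    digits = "".join(str(int(c, 16) % 10) for c in digest[:9])
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:9]}"

masked = mask_ssn("123-45-6789")
print(masked)  # deterministic for a given salt; exact digits depend on the hash
```

Note that hashing alone is not reversible anonymization theory-proof; production masking tools also offer lookup tables, shuffling, and format-preserving encryption for stronger guarantees.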
  43. The InfoSphere Optim Data Growth Solution solves the data growth problem at the source - by managing your enterprise application data. IBM Optim enables you to archive historical transaction records, controlling data growth and improving application performance. Historical data is archived securely and cost-effectively, and can be easily accessed for analysis or audit/e-discovery requests. And with less data to sift through, you speed reporting and complete mission-critical business processes on time, every time. Having a defined policy for managing the retention requirements for historical data is a requirement for enterprise governance frameworks to ensure compliance with regulatory mandates. As a recognized best practice, archiving segregates inactive application data from current activity and safely moves it to a secure archive. Streamlined databases reclaim capacity and help improve application performance and availability. With InfoSphere Optim, you can establish distinct service levels for each class of application data – for example, current data, reporting data and historical data – and consistently achieve performance targets. Policy-driven archive processes allow you to specify the business rules for archiving. For example, you may choose to archive all closed orders that are two years old or more. InfoSphere Optim identifies all transactions that meet these criteria and moves them into an accessible archive. InfoSphere Optim manages application data at the business object level. Business objects are comprised of a group of related columns and tables from one or more application databases, along with their associated metadata. By managing data at the business object level, InfoSphere Optim preserves both the relational integrity of the data and its original business context. Each archived record represents a historical reference snapshot of business activity, regardless of its originating application.
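The policy example above ("archive all closed orders that are two years old or more") can be sketched as a simple eligibility rule. The table layout and field names here are hypothetical, and Optim expresses such rules declaratively rather than in code:

```python
# Illustrative sketch of a policy-driven archive rule: select closed
# orders at least ~2 years old. Fields and data are hypothetical.
from datetime import date

orders = [
    {"id": 1, "status": "closed", "closed_on": date(2009, 3, 1)},
    {"id": 2, "status": "closed", "closed_on": date(2011, 6, 1)},
    {"id": 3, "status": "open",   "closed_on": None},
]

def eligible_for_archive(order, today, min_age_days=730):
    """An order qualifies if it is closed and at least ~two years old."""
    return (order["status"] == "closed"
            and order["closed_on"] is not None
            and (today - order["closed_on"]).days >= min_age_days)

today = date(2012, 1, 1)
to_archive = [o["id"] for o in orders if eligible_for_archive(o, today)]
print(to_archive)  # [1]
```

In the real product the qualifying rows would be moved, with their full business object, into a secure archive rather than merely listed.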
  44. The InfoSphere Optim solution for Application Retirement enables you to archive historical data securely and cost-effectively, and in a way that the data can be easily accessed for analysis or audit/e-discovery requests, long after the original application has been retired. InfoSphere Optim manages application data at the business object level. Business objects are comprised of a group of related columns and tables from one or more application databases, along with their associated metadata. By managing data at the business object level, Optim preserves both the relational integrity of the data and its original business context. Each archived record represents a historical reference snapshot of business activity, regardless of its originating application.
  45. SUMMARY: InfoSphere Optim System Analyzer for SAP Applications is a powerful web-based tool that automatically identifies SAP system changes for key application lifecycle events and shows how those systems are impacted. It provides critical impact analysis information before changes are applied to production; compares metadata across multiple systems, applications, and the data dictionary; presents results automatically in generated reports; provides further drill-down; and recommends testing executables and identifies testing gaps. End result: reduced time, cost, complexity, and risk for SAP application and system changes.
  46. And while System Analyzer looks at the impact of change from the data structure and customer code level, InfoSphere Optim Business Process Analyzer for SAP applications looks at these changes at the SAP business process level. InfoSphere Optim Business Process Analyzer is a component of InfoSphere Optim System Analyzer that automatically captures business processes from your SAP data structures and helps SAP business analysts visualize how changes will impact the business process. The picture depicted here is a great example – InfoSphere Optim Business Process Analyzer is analyzing the impact of change across this SAP module, leveraging the business process – or “flow chart”. The processes impacted are in red, but if you look at the inset, you can see that the proposed changes only impact a small portion of the module. Now testers know where to focus. And this business process view of change impact enables greater collaboration between the business analyst and technical manager on an SAP project team.
  47. SUMMARY: InfoSphere Optim Test Data Management Solution for SAP Applications offers proven technology to optimize and automate the process to create and manage data in non-production (testing, development and training) environments, with no performance impact to production systems. Development and testing teams can create realistic, “right-sized” test environments, made up of one or more business objects, for targeted test scenarios. This SAP-Certified solution is invoked within SAP, providing the user with a familiar interface and easy point-and-click environment. This solution includes pre-built business objects for key SAP modules that can be modified with user-defined criteria to extract the needed data. These extracts can then be saved as Variants (ABAP program routines) to be leveraged as a repeatable process, speeding the testing process. Optim’s capabilities for creating and managing test data enable organizations to save valuable processing time, ensure consistency and reduce costs throughout the application lifecycle.
  48. SUMMARY: InfoSphere Optim Application Repository Analyzer explores and analyzes your application repository information to identify complex customizations of the data model within your Oracle® applications, including Oracle E-Business Suite, PeopleSoft Enterprise and Siebel CRM. This helps you reduce the time and improve the quality of lifecycle events, such as archiving, sub-setting and masking projects. InfoSphere Optim can understand your custom-implemented modules to capture and identify the parent-child table relationships, minimizing the manual effort that would otherwise be needed and improving accuracy. InfoSphere Optim can also identify data model differences across application versions and releases by comparison, to assess and anticipate the impact to customizations in upgrade and enhancement projects.
  49. IBM® InfoSphere® Optim™ Query Capture and Replay enables organizations to fully assess the impact of life-cycle changes in the testing environment before production deployment. InfoSphere Optim Query Capture and Replay complements existing functional, regression and performance tests by giving IT teams deeper insight into the database layer. The result is a significantly streamlined and more realistic testing process. Compared to legacy testing techniques, enterprise changes can now be tested more rigorously, and replays can be tailored to meet a variety of test objectives, such as capacity planning, performance and function testing, and more. This approach ultimately helps shorten testing cycles and save precious resources. In addition to capturing and replaying the production workload, InfoSphere Optim Query Capture and Replay provides reports that allow IT teams to accurately analyze the impact of changes made. Comprehensive before-and-after reporting includes both high-level summaries and detailed drill-downs. As a result, IT teams gain deep insight into potential problem areas, enabling them to resolve issues before production deployment begins. Available reports include: • Summary Comparison—Provides a high-level look into the differences between the capture and replay, comparing average execution time, SQL exceptions, rows retrieved and SQL failures • Workload Aggregate Match—Aggregates statistics to allow quick comparisons of selected workloads • Workload Exceptions—Shows which SQL generated exceptions during replay • Workload Match—Provides a side-by-side comparison of each SQL statement and a statistical comparison between the workloads Unique product differentiators: • Robust capture of full production workloads including DBMS details, not just SQL • Minimal production impact across heterogeneous environments large and small • Transparent deployment • Adjustable replays for testing, capacity planning and problem diagnosis • Validation reports
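The Summary Comparison report described above can be pictured as a per-metric delta between the captured and replayed workloads. The metric names below mirror the report description, and the numbers are made up for illustration:

```python
# Illustrative sketch of a before/after workload summary comparison.
# Metric names follow the report description; values are hypothetical.

def summary_comparison(capture, replay):
    """Compute the replay-minus-capture delta for each workload metric."""
    return {metric: replay[metric] - capture[metric] for metric in capture}

capture = {"avg_exec_ms": 120.0, "sql_exceptions": 2,
           "rows_retrieved": 5000, "sql_failures": 0}
replay = {"avg_exec_ms": 95.0, "sql_exceptions": 2,
          "rows_retrieved": 5000, "sql_failures": 1}

# Negative avg_exec_ms delta means the replayed workload ran faster;
# a nonzero sql_failures delta flags a regression to investigate.
print(summary_comparison(capture, replay))
# {'avg_exec_ms': -25.0, 'sql_exceptions': 0, 'rows_retrieved': 0, 'sql_failures': 1}
```

The real report adds per-statement drill-downs (the Workload Match view); this sketch shows only the high-level summary idea.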
  50. Optim market share: This is IDC's recent analysis of the Database Archiving and Information Lifecycle Management segment. IBM's Optim product line led the database archiving and ILM segment in 2010 with a 52.3% share, and showed nearly 18% growth in 2010. IDC estimates market growth from 2009 to 2010 of almost 26%, the highest of all the categories in the database development & management market. Note: IDC includes subsetting, masking and test data generation in the database archiving and ILM category.
  51. What is a CVE? Programmatic approach: CVE provides clients a defined, structured and easy-to-follow process designed to help all key stakeholders evaluate IBM Information Management solutions. Economic Benefits: CVE determines the benefits in financial terms decision makers can understand, such as 5-Yr Summary, ROI, TCO and many other metrics to evaluate the proposed solution. Technical Solution Blueprint: CVE provides a tailored architectural blueprint comparing your current-state process/architecture against the recommended/proposed future state. CVE Final Results: CVE seeks to uncover all of the cost savings that can be realized as a result of an investment in the IBM recommended software, along with a solution blueprint. Why do a CVE Analysis? Lightweight Process: CVE is a flexible process that is not time or resource intensive. Clear Objectives: CVE is run programmatically, defining clear objectives, priorities and timelines. Speedy Time-to-Completion: CVE elapsed time is a few short weeks (Introduction to Final Presentation). Professional Deliverables: CVE final results are built by trained and qualified experts. Personalized Process: CVE is planned around giving each client an individual experience. A CVE is a value-added IBM Information Management program that provides clients the expertise, structure and programmatic process to help all key stakeholders understand and evaluate IBM Information Management solutions. Within the CVE we provide the business/technical experts and business case specialists who will facilitate the program. The CVE process uncovers the business/technical challenges, quantifies the economic benefits and constructs a detailed technical solution blueprint that compares the current state against the recommended IBM solution. The result of a CVE is a detailed business case that summarizes both technical and economic justifications, solution differentiators and architecture.
3 Steps for a Successful Info Lifecycle Management CVE: Step 1: We work with your key management staff to understand the technical and business challenges in managing the lifecycle of your database data. Step 2: We conduct in-depth interviews and data collection to determine the cost of operations for the in-scope systems and applications selected by you. Step 3: We develop a technical solution blueprint that describes how the IBM solution fits in your environment. Analysis Validation: Our CVE Lead will validate all findings to ensure their accuracy and completeness before finalizing. Final Presentation & Results: In financial and technical terms, we present and deliver to your organization a final CVE business case that provides the cost savings, strategy and solution blueprint. Sample ILM CVE Offerings: Data Growth: Calculate the value and benefits related to managing excessive data growth. Test Data Management: Calculate the value and benefits related to more effective management of your test data in relation to your development projects and processes. Application Retirement: Calculate the value and benefits of removing applications from your portfolio. Data Privacy: Risk reduction and TCO of de-identifying nonproduction data in your test data processes.
  52. Information Management Software Services brings the following to each and every engagement: deep product and industry expertise (the “heart surgeons” of a project); certified professionals; enablers, driving clients to be self-sufficient; a worldwide track record of project success; access to the Information Management Software Development Labs; skills, experience and standard practices critical for early adopters of our technology; and strategic partners.