IMS and Information Governance - IMS UG May 2013 Dallas


  1. IMS – Data Governance Overview
     Dennis Eichelberger, IT Specialist – IMS Advanced Technical Skills
     deichel@us.ibm.com
  2. IMS and Data Governance
     • Why discuss data governance?
     • What is data governance?
     • How does IMS implement data governance?
     • What are today's challenges?
  3. What happens when you're NOT in control of your business data…
     • Healthcare – "Dozens of women were told wrongly that their smear test had revealed a separate infection after a hospital error, an independent inquiry has found… Confusion arose because the hospital decided to use a code number to signify 'no infections', not realizing that it was already in use at the health authority, where it meant 'multiple infections'…"
     • Retail – "Hackers have stolen 4.2 million credit and debit card details from a US supermarket chain by swiping the data during payment authorization transmissions in stores…"
     • Banking – "A major US bank has lost computer data tapes containing personal information on up to 1.2 million federal employees, including some members of the U.S. Senate… The lost data includes Social Security numbers and account information that could make customers of a federal government charge card program vulnerable to identity theft…"
     • Banking – "A rogue trader accused of the world's biggest banking fraud was on the run last night after fake accounts with losses of £3.7 billion were uncovered… The trader used his inside knowledge of the bank's control procedures to hack into its computers and erase all traces of his alleged fraud. Mr Leeson said, 'Rogue trading is probably a daily occurrence within the financial markets. What shocked me was the size. I never believed it would get to this degree of loss.'"
     • Public Sector – "Two computer discs holding the personal details of all families in the UK with a child under 16 have gone missing… The Child Benefit data on them includes name, address, date of birth, National Insurance number and, where relevant, bank details of 25 million people…"
     • Washington – "FINRA announced today it has censured and fined a financial services company $370,000 for making hundreds of late disclosures to FINRA's Central Registration Depository (CRD) of information about its brokers, including customer complaints, regulatory actions and criminal disclosures. Investors, regulators and others rely heavily on the accuracy and completeness of the information in the CRD public reporting system – and, in turn, the integrity of that system depends on timely and accurate reporting by firms."
  4. … Resulting in a broad range of potentially life-threatening consequences
     The same incidents, mapped to their consequences:
     • Incorrect classification → life-threatening consequences
     • Ineffective security → brand damage, financial loss
     • Physical data loss → identity theft
     • Late disclosures, inaccurate data → heavy fines, legal implications
     • Physical loss of unprotected data → fraud on a massive scale
     • Poor internal controls → bankruptcy, financial ruin, penalties
     The conclusion: the information needs to be managed.
  5. What is Data Governance and Information Governance?
     • Data: structured; unstructured; metadata; video, audio, multi-media; print, email, and archived; software code; patents, IP; protocols, message streams
     • Information: data which has been processed and transformed in order to provide insight and answers to business questions
     This is Data Governance – effective management of data quality needs initiatives which:
     • span the whole organisation – not just the silos
     • get to the root of the problem – not just the symptoms
     • allocate clear, measurable responsibilities
     This is Information Governance – effective use of business information needs a framework which:
     • manages the underlying data assets effectively through Data Governance
     • matches the supply of information with its demand from the business
     • underpins the business requirements with a solid information architecture
     Information Governance: "The specification of decision rights and an accountability framework to encourage desirable behavior in the valuation, creation, storage, use, archival, and deletion of information. It includes the processes, roles, standards, and metrics that ensure the effective and efficient use of information in enabling an organization to achieve its goals." ~ Gartner Inc.
     As a business, we need to use these terms consistently.
  6. Data Governance Creates Order out of Data Chaos
     Data Governance is the exercise of decision rights to optimize, secure and leverage data as an enterprise asset. Governing the creation, management and usage of enterprise data is no longer optional. It is:
     • Expected by your customers
     • Demanded by the executives
     • Enforced by regulators/auditors
     Data Governance:
     • Orchestrates people, process and technology toward a common goal
     • Promotes collaboration
     • Derives maximum value from information
     • Leverages information as an enterprise asset to drive opportunities
     • Safeguards information
     • Ensures the highest quality
     • Manages information throughout its lifecycle
  7. IBM Information Governance approach
     A good Information Governance program supports compliance initiatives, reduces cost and minimizes risk to enable sustainable, profitable growth.
     • Validated by the Information Governance Council – top global companies, business partners and industry experts (http://www.infogovcommunity.com/)
     • Applied with a Unified Process – requirements driven, aligned with business goals to solve business problems
     • Accelerates deployment with the Council-built Maturity Model – a framework (disciplines, levels) as a starting point and for prioritizing actions
  8. Information Governance Domains
     • Outcomes: Business Outcomes / Reporting; Data Risk Management & Compliance; Value Creation; Business Intelligence & Advanced Analytics
     • Enablers: Organizational Structures & Awareness; Policy; Data Stewardship
     • Core Disciplines: Data Quality Management; Information Life-Cycle Management; Information Security and Privacy
     • Supporting Disciplines: Data Architecture; Classification & Metadata; Audit Information Logging & Reporting
     (The diagram shows the enablers requiring and supporting the core and supporting disciplines, which in turn support and enhance the business outcomes.)
  9. IBM Information Governance approach (diagram)
  10. Information Governance Domains (framework diagram, repeated from slide 8)
  11. IMS – All You Need in One
      • A z/OS middleware that inherits all the strengths of zEnterprise
      • A Messaging & Transaction Manager
        – Based on a messaging and queuing paradigm
        – High-volume, rapid-response transaction management for application programs accessing IMS and/or DB2 databases and MQ queues
        – "Universal" application connectivity for SOA integration
        – Integrated with Business Rules & Business Events
      • A Database Manager
        – Central point of control and access for the IMS databases
        – A hierarchical database model, used by companies needing high transaction rates, providing database recoverability
        – Now provides "Universal" database connectivity based on JDBC / DRDA, with many new features in that space (a JDBC sketch follows below)
      • A Batch Manager
        – Standalone z/OS batch support
        – Batch processing regions centrally managed by the IMS control region, managing the batch-oriented programs and providing checkpoint/restart services
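      As an illustration of the "Universal" JDBC connectivity mentioned above, here is a minimal sketch of a Java program querying an IMS database through the IMS Universal JDBC (type-4) driver. The host name, port, PSB name (MYPSB) and segment/table name (CUSTOMER) are hypothetical placeholders, and the exact driver jar, class name and URL syntax should be verified against the IMS documentation for your release.

      import java.sql.Connection;
      import java.sql.DriverManager;
      import java.sql.ResultSet;
      import java.sql.Statement;

      public class ImsJdbcSketch {
          public static void main(String[] args) throws Exception {
              // Type-4 connection through IMS Connect / ODBM; host, port and PSB name are placeholders.
              // The URL follows the general "jdbc:ims://host:port/psb" pattern of the IMS Universal JDBC
              // driver, but check the format for your release; the driver jar must be on the classpath.
              String url = "jdbc:ims://zoshost.example.com:5555/MYPSB";

              try (Connection conn = DriverManager.getConnection(url, "USERID", "PASSWORD");
                   Statement stmt = conn.createStatement();
                   // Segments are exposed as tables; CUSTOMER is a hypothetical root segment.
                   ResultSet rs = stmt.executeQuery(
                           "SELECT CUSTNO, CUSTNAME FROM MYPSB.CUSTOMER WHERE CUSTNO < 1000")) {
                  while (rs.next()) {
                      System.out.println(rs.getInt("CUSTNO") + " " + rs.getString("CUSTNAME"));
                  }
              }
          }
      }

      The same connection style is what tools such as Cognos (slide 30) use to reach IMS data without any change to the database itself.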
  12. Dynamics of an Information Ecosystem … with IMS in perspective
      The information supply chain: machine data, application data, transaction data, social media and content feed into "Manage, Integrate & Govern", which in turn feeds analytic applications, mobile/cloud applications and enterprise applications.
      Themes: reduce the cost of data; trust & protect information; analyze – new insights from big data.
      Govern: quality; security & privacy; lifecycle; standards.
  13. z/OS Database Manager Positioning
      • Hierarchical – IMS on z/OS, built for performance and recovery:
        – Operational data
        – Very high performance
        – Real-time, mission-critical work
        – Time-sensitive, response-oriented
        – Complex data structures with many levels
      • Relational – DB2 for z/OS, enhanced for business analytics:
        – Tabular data
        – Temporal data
        – Warehousing
        – Complex and/or ad hoc queries
        – Decision support
      (The diagram contrasts a CUSTOMER–BILL–COMMAND–ARTICLE–PRODUCT hierarchy in IMS with the equivalent relational tables in DB2.)
  14. Revolutionize your IMS with zEC12!
      IMS 12 on zEC12 provides superlative security, compliance, performance, efficiency, and industrial-strength transaction and database management.
      • Most secure system, with 99.999% reliability
      • Optimized data serving with the largest cache in the industry
      • Leadership in performance with z/OS using the 5.5 GHz 6-way processor chip
      • Ability to process terabytes of data quickly
      • Millions of transactions per day with sub-second response times
      • Faster problem determination with IBM zAware for improved availability
      • Java exploitation of Transactional Execution for increased parallelism and scalability
      • A 31% improvement to PL/I-based CPU-intensive applications, based on the new Enterprise PL/I for z/OS and updated C/C++ compilers
      • Increased performance through Flash Express and large pages via z/OS 1.13
      Additional gains include: XML hardware acceleration; streamlined and secure SOA applications with IBM WebSphere DataPower; centrally monitored, controlled and automated operations across heterogeneous environments with IBM Tivoli Omegamon.
      IMS 12 on zEC12 shows up to a 30% improvement in transaction rate.
  15. IMS DB in Perspective
      Native qualities of service:
      • High capacity – HALDB & DEDB
      • High availability – IMS data sharing
      • Performance without extra CPU cost – half the MIPS and half the DASD of relational
      Application development:
      • Multi-language AD support – COBOL, PL/I, C, … Java
      • XML support – decomposed or intact
      • Java SQL support (JDBC) – IMS Java
      • Access from CICS and IMS applications, and from batch – IMS since the early days
      • Open access and data integration – DRDA Universal Driver with IMS 11+ Open Database
      Data management:
      • Metadata management – IMS 12+ catalog
      • Advanced space management capabilities – DFSMS family
      • Health check – pointer validation & repair
      • Backup and recovery advanced solutions – IMS Tools
      • Reorganization for better performance – IMS Tools
      Enterprise data governance:
      • Compression and encryption – InfoSphere Guardium tools
      • Audit for every access – InfoSphere Guardium tools
      • Data masking – InfoSphere Optim family
      • Creation of test databases – InfoSphere Optim family
      • Data growth management – InfoSphere Optim family
      • Operational business analytics & reporting – Cognos & SPSS
      Information integration & data synchronization:
      • Fast integration in Web 2.0 applications – IMS Open Database
      • Data federation – InfoSphere Classic Federation
      • Replication to IMS, towards active/active solutions – InfoSphere IMS Replication
      • Replication to relational – InfoSphere Classic Replication Server & Classic CDC
      • Publication of DB changes – InfoSphere Classic Data Event Publisher
  16. IMS Explorer for Development – View Examples
      Much easier to understand the database structure; SQL & result sets; z/OS perspective.
  17. IMS Explorer for Development – View Examples
      Multiple logically related databases. Manufacturing example – assembly parts arrival time to assembly line.
  18. InfoSphere Optim solutions – managing data throughout its lifecycle in heterogeneous environments
      Lifecycle: Discover (understand, classify) → Production → Development / Test / Training (subset, mask, compare, refresh) → Archive → Retire.
      • Test Data Management: easily refresh & maintain right-sized non-production environments while reducing storage costs; improve application quality and deploy new functionality more quickly; speed understanding and project time through relationship discovery within and across data sources; understand sensitive data in order to protect and secure it.
      • Data Masking: protect sensitive information from misuse & fraud; prevent data breaches and associated fines.
      • Data Growth Management: reduce hardware, storage & maintenance costs; streamline application upgrades and improve application performance.
      • Application Retirement: safely retire legacy & redundant applications while retaining the data; ensure application-independent access to archive data.
  19. Managing Test Data in Non-Production – Optim Test Data Management
      • Create right-sized test environments, providing support across multiple applications, databases and operating systems
      • Deploy new functionality more quickly and with improved quality & customer satisfaction
      • Compare results during successive test runs to pinpoint defects and errors
      • On z/OS: support for DB2, IMS, VSAM
      A 1 TB production database (or production clone) is subsetted into 100 GB development, test, training and QA environments.
      http://www-01.ibm.com/software/data/data-management/optim/core/test-data-management-solution-zos
  20. Data Masking and Protection – Optim Data Masking Option
      Personally identifiable information (PII) is masked with realistic but fictional data for testing & development purposes.
      Reduce the risk of exposure during data theft:
      • Fines and lawsuits
      • Negative publicity
      • Customer loss
      • Loss of intellectual property
      Capabilities:
      • De-identify data for privacy protection
      • Deploy multiple masking algorithms
      • Provide consistency across environments and iterations
      • Leave no value for hackers
      • Enable off-shore testing
      • On z/OS: support for DB2, IMS DB, VSAM
      http://www-01.ibm.com/software/data/data-management/optim/core/data-privacy-solution-zos/
  21. InfoSphere Optim Data Masking Solution / Option
      Example 1 – referential integrity is maintained with key propagation:
      Personal Info Table (before): PersNbr 08054 Alice Bennett; 19101 Carl Davis; 27645 Elliot Flynn
      Personal Info Table (masked): PersNbr 10000 Jeanne Renoir; 10001 Claude Monet; 10002 Pablo Picasso
      Event Table (before): PersNbr 27645, event owner Elliot Flynn
      Event Table (masked): PersNbr 10002, event owner Pablo Picasso
      Example 2 – data is masked with contextually correct data to preserve the integrity of test data:
      Patient Information: Patient No. 112233, SSN 123-45-6789, Amanda Winters, 40 Bayberry Drive, Elgin IL 60123; Patient No. 123456, SSN 333-22-4444, Erica Schafer, 12 Murray Court, Austin TX 78704
      Data masking techniques include: string literal values, character substrings, random or sequential numbers, arithmetic expressions, concatenated expressions, date aging, lookup values, generic mask.
      Benefits: satisfy privacy regulations, reduce the risk of data breaches, maintain the value of test data. A masking sketch follows below.
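      To make the key-propagation idea above concrete, here is a minimal, hypothetical sketch of deterministic masking in Java. It is not the Optim implementation or API; it simply shows one way a person number can be mapped consistently (so foreign keys in an event table still line up) while names are replaced by lookup values. The lookup lists and column values are invented for illustration.

      import java.nio.charset.StandardCharsets;
      import java.security.MessageDigest;
      import java.util.List;

      /** Illustrative masking helpers: deterministic, so the same input always maps to the same
          output, which is what preserves referential integrity across parent and child tables. */
      public class MaskingSketch {

          // Hypothetical lookup values used to replace real names.
          private static final List<String> FIRST = List.of("Jeanne", "Claude", "Pablo", "Henri");
          private static final List<String> LAST  = List.of("Renoir", "Monet", "Picasso", "Matisse");

          /** Map a real person number into a fictional range, deterministically. */
          static int maskPersNbr(int persNbr) throws Exception {
              MessageDigest md = MessageDigest.getInstance("SHA-256");
              byte[] h = md.digest(Integer.toString(persNbr).getBytes(StandardCharsets.UTF_8));
              int bucket = Math.floorMod(((h[0] & 0xFF) << 16) | ((h[1] & 0xFF) << 8) | (h[2] & 0xFF), 90000);
              return 10000 + bucket;   // fictional but stable surrogate key
          }

          /** Replace a real name with a lookup value chosen from the masked key, so it stays consistent. */
          static String maskName(int maskedPersNbr) {
              return FIRST.get(maskedPersNbr % FIRST.size()) + " " + LAST.get(maskedPersNbr % LAST.size());
          }

          public static void main(String[] args) throws Exception {
              int original = 27645;            // key as it appears in the Personal Info table
              int masked = maskPersNbr(original);
              // An Event table row referencing 27645 would be masked with the same function,
              // so parent and child keys still match after masking.
              System.out.println(original + " -> " + masked + " (" + maskName(masked) + ")");
          }
      }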
  22. Managing Data Growth in Production – Optim Data Growth
      • Segregate historical data into a secure archive
      • Align performance to service-level targets
      • Reclaim under-utilized capacity
      • On z/OS: support for DB2, IMS DB, VSAM
      Current production data is selectively archived (data and metadata) as reporting, historical and reference data; selective retrieval and universal access to the application data is provided through the application itself, XML, or ODBC/JDBC.
  23. InfoSphere Optim Application Retirement
      • Preserve application data in its business context
      • Retire out-of-date packaged applications as well as legacy custom applications
      • Shut down legacy systems without a replacement
      The diagrams contrast the infrastructure before retirement (multiple users, applications and databases) with the consolidated archive afterwards (users accessing archive data through an archive engine).
  24. Secure & Protect High-Value Databases – Guardium Data Encryption
      Provides:
      • z/OS integrated software support for data encryption
      • An operating-system software API interface to cryptographic hardware (CEX2C/CEX3C/CEX4C hardware features)
      • Enhanced key management for key creation and distribution: public and private keys, secure and clear keys, master keys
      • Created keys are stored and accessed in the Cryptographic Key Data Set (CKDS) with a unique key label; the CKDS itself is secured via the security access facility
      http://www-01.ibm.com/software/data/guardium/
  25. Secure & Protect High-Value Databases – Guardium Data Encryption
      • Non-invasive architecture
      • Clear and secure keys
      • Hardware-enabled, so the performance impact is minimal
      • Supports DES, TDES & AES algorithms
      • Supports 56-, 128- & 256-bit encryption
      • Installed at the segment level
      • No DBMS or application changes
      (Diagram: clear text in the application, crypto text on disk.)
      http://www-01.ibm.com/software/data/guardium/
      A conceptual encryption sketch follows below.
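      For readers unfamiliar with what "encrypting at the segment level with a key referenced by label" means in practice, here is a conceptual Java sketch using the standard javax.crypto API. It only illustrates clear-key AES encryption of a segment's bytes; the actual Guardium Data Encryption tool works through z/OS ICSF and the CEX cryptographic hardware rather than this API, and the key label and segment contents are invented.

      import javax.crypto.Cipher;
      import javax.crypto.KeyGenerator;
      import javax.crypto.SecretKey;
      import javax.crypto.spec.GCMParameterSpec;
      import java.nio.charset.StandardCharsets;
      import java.security.SecureRandom;
      import java.util.Map;

      /** Conceptual only: AES-GCM encryption of one database segment's bytes, with the key
          looked up by a label, loosely mirroring the "key label in the CKDS" idea. */
      public class SegmentEncryptionSketch {
          public static void main(String[] args) throws Exception {
              // Stand-in for a key store keyed by label (the real CKDS lives under ICSF on z/OS).
              KeyGenerator kg = KeyGenerator.getInstance("AES");
              kg.init(256);
              Map<String, SecretKey> keyStore = Map.of("IMS.CUSTOMER.SEGKEY", kg.generateKey());

              byte[] segment = "A00123 SMITH JOHN 40 BAYBERRY DRIVE".getBytes(StandardCharsets.UTF_8);

              // Encrypt the segment with the key found under its label.
              SecretKey key = keyStore.get("IMS.CUSTOMER.SEGKEY");
              byte[] iv = new byte[12];
              new SecureRandom().nextBytes(iv);
              Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
              cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
              byte[] cryptoText = cipher.doFinal(segment);

              // Decrypt on the way back to the application: the application still sees clear text.
              cipher.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(128, iv));
              System.out.println(new String(cipher.doFinal(cryptoText), StandardCharsets.UTF_8));
          }
      }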
  26. Secure & Protect High-Value Databases – Guardium Real-Time Database Monitoring
      • Non-invasive architecture
      • Heterogeneous, cross-DBMS solution (DB2, DB2 for z/OS, IMS, VSAM and others)
      • Does not rely on native DBMS logs
      • Minimal performance impact
      • No DBMS or application changes
      • Activity logs cannot be erased by attackers or rogue DBAs
      • Automated compliance reporting, sign-offs & escalations (SOX, PCI, NIST, etc.)
      • Granular, real-time policies & auditing
      • Single point of monitoring across DBMSs
      http://www-01.ibm.com/software/data/guardium/
  27. Secure & Protect High-Value Databases – Guardium Real-Time Database Monitoring Architecture (diagram)
  28. Secure & Protect High-Value Databases – Guardium Real-Time Database Monitoring Sample Report
      The report shows an IMS BMP job that ran for two minutes. The job TSTCMDDC accessed database AUECCMDD; the user ID and the PSB used by the job are also shown. The IMS Context column lists, in sequence, the calls made to the database.
  29. Secure & Protect High-Value Databases – Guardium Real-Time Database Monitoring Sample Report (continued; the same BMP job example as slide 28)
  30. Operational Business Analytics on IMS Data – Cognos Reporting with IMS 12
      Benefits:
      • Ad hoc reporting access
      • Reports on data reflecting the most current state of the business
      • React faster to trusted data
      • Market-leading BI solution for IMS customers
      Roadmap for customers:
      • Cognos 10.2 & IMS 11: the IMS 11 JDBC driver is NOT certified with Cognos 10.2, even though the Open Database architecture is available.
      • Cognos 10.2 & IMS 12: the IMS 12 JDBC driver with the catalog is certified with Cognos 10.2. New functions allow enhanced predicates to be exploited by Cognos, and the IMS catalog provides centralized metadata management on z/OS.
      Architecture: Cognos BI (report authoring, published reports) and Cognos Framework Manager connect to IMS through the IMS Universal JDBC driver; the JDBC data store supplies the data model used by report consumers and authors. A metadata-discovery sketch follows below.
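      Tools such as Cognos Framework Manager build their data model from the metadata the JDBC driver exposes, which with IMS 12 comes from the IMS catalog. The sketch below shows how any JDBC client can walk that metadata using the standard java.sql.DatabaseMetaData interface; the connection URL, PSB name and schema filter are hypothetical placeholders, and which metadata is actually returned depends on the IMS release and catalog setup.

      import java.sql.Connection;
      import java.sql.DatabaseMetaData;
      import java.sql.DriverManager;
      import java.sql.ResultSet;

      public class ImsMetadataSketch {
          public static void main(String[] args) throws Exception {
              // Placeholder URL; the same Universal JDBC driver connection style as in the earlier sketch.
              String url = "jdbc:ims://zoshost.example.com:5555/MYPSB";
              try (Connection conn = DriverManager.getConnection(url, "USERID", "PASSWORD")) {
                  DatabaseMetaData md = conn.getMetaData();

                  // List the "tables" (IMS segments) visible through the PSB.
                  try (ResultSet tables = md.getTables(null, "MYPSB", "%", new String[] {"TABLE"})) {
                      while (tables.next()) {
                          String table = tables.getString("TABLE_NAME");
                          System.out.println("Segment/table: " + table);

                          // List the columns (fields) the catalog describes for each segment.
                          try (ResultSet cols = md.getColumns(null, "MYPSB", table, "%")) {
                              while (cols.next()) {
                                  System.out.println("  " + cols.getString("COLUMN_NAME")
                                          + " : " + cols.getString("TYPE_NAME"));
                              }
                          }
                      }
                  }
              }
          }
      }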
  31. Operational Business Analytics on IMS Data – Cognos Reporting with IMS 12
      Available with Cognos 10.2 (screenshot of a sample report).
  32. Technologies are in Place for Mainframe Application Extensibility
      Customers are here today with ASM, COBOL, PL/I and C applications on CICS & IMS, protecting their application investment. The technology is in place to go further: Java, WAS, MQ, connectors, data warehouse, XML, asset management, SOA, business processes, business rules, compliance, service management, zLinux & zBX, hybrid computing, workload optimization and analytics.
  33. Big Data and IMS Databases
      • IMS integration with the BigInsights application connectors – merge trusted OLTP data with the big data platform
      • Integrate IMS with the big data Machine Data Accelerator (MDA) – correlate log records from off-platform application servers with IMS log records
      Traditional approach (structured, analytical, logical): traditional sources – structured, repeatable, linear – feed the data warehouse with transaction data, internal application data, mainframe data and OLTP system data, with IMS operational data at the core.
      New approach (creative, holistic thought, intuition): new sources – unstructured, exploratory, iterative – such as web logs, social data, text data (emails), sensor data (images) and RFID are processed with Hadoop and Streams and linked through enterprise integration.
  34. An enterprise information hub on a single integrated platform
      • Best in OLTP & transactional solutions: industry-recognized leader for mission-critical transactional systems (OLTP transaction processing systems)
      • Best in analytics: industry-recognized leader in business analytics and data warehousing solutions (business analytics, predictive analytics)
      • zEnterprise: recognized leader in workload management with proven security, availability and recoverability
      • DB2 Analytics Accelerator for z/OS, powered by Netezza: recognized leader in cost-effective, high-speed deep analytics
      • Data mart consolidation: unprecedented mixed-workload flexibility and virtualization, providing the most options for cost-effective consolidation
      • Best in flexibility: the ability to start with your most critical business issues, quickly realize business value, and evolve without re-architecting
      • Best in batch workload: an efficient execution environment close to the data with first-class I/O technology
  35. Why Assess Information Maturity
      • Information maturity is typically assessed to provide a snapshot in time of an organisation's ability to manage information, as defined by the maturity model.
      • This is most often used to benchmark and compare maturity across time within an organisation and between organisations.
      • The aim is to improve the ability to manage information over time:
        – Reduce the time needed to access information
        – Reduce stored information complexity
        – Lower costs through an optimized infrastructure
        – Gain insight through analysis & discovery
        – Leverage information for business transformation
        – Gain control over master data
        – Manage risk and compliance via a single version of the truth
      • This implies that an information maturity assessment should be an ongoing activity.
  36. A Model for Information Maturity: An Evolution for our Clients
      The model plots the business value of information against information management maturity across five levels:
      1. Data to run the business – Data: structured content, static. Integration: disjointed, siloed, non-integrated solutions. Applications: stand-alone modules, application-dependent. Infrastructure: monolithic, platform-specific. Characteristics: basic reporting and spreadsheet-based; manual, ad hoc dependence; information overload; no version of the truth; hindsight-based.
      2. Information to manage the business – Data: structured content, organized. Integration: some integration, silos still remain. Applications: component-based. Infrastructure: layered architecture, platform-specific. Characteristics: basic search, query, reporting and analytics; some automation; disparate work environments; limited enterprise visibility; multiple versions of the truth.
      3. Information as a strategic asset – Data: standards-based, structured and some unstructured. Integration: integration of silos, virtualization of information. Applications: services-based. Infrastructure: component/emerging SOA, platform-specific. Characteristics: introduction of contextual, role-based work environments; enhanced levels of automation; enhancement of existing processes and applications; integrated business performance management; single version of the truth; insight through analytics, in real time.
      4. Information to enable innovation – Data: seamless and shared, information separated from process, full integration of structured and unstructured. Integration: information available as a service. Applications: process integration via services, in-line business applications. Infrastructure: resilient SOA, technology-neutral. Characteristics: role-based work environments commonplace; fully embedded capabilities within workflow, processes and systems; information-enabled process innovation; enhanced business process and operations management; foresight, predictive analytics.
      5. Information as a competitive differentiator – Data: all relevant internal and external information seamless and shared, additional sources easily added. Integration: virtualized information services. Applications: dynamic application assembly. Infrastructure: dynamically re-configurable, sense and respond. Characteristics: flexible, adaptive business environments across the enterprise and extraprise; enablement of strategic business innovation; optimization of business performance and operations; strategic insight.
  37. Data Governance Workshop Key Steps
      Workshop flow: conduct interviews of key IT/business leaders and the DG council → assess data governance maturity and target capabilities → identify the gap to the future state (18 months) → develop a roadmap for delivering capabilities → develop recommendations and next steps.
      Scope of services: assess the current state, determine the future state (in 12–18 months), and identify the required capabilities and initiatives (the capability gap).
      Maturity is assessed across the eleven categories – organizational structures and awareness, data stewardship, policy, value creation, data risk management & compliance, information security & privacy, data architecture, data quality management, classification & metadata, information life-cycle management, and audit information logging & reporting – against five levels: Initial, Managed, Defined, Quantitatively Managed, Optimizing.
      The implementation roadmap sequences initiatives over time, for example: communication and DG ownership; the DG council undertaking critical projects; establishing a center of excellence and execution committee; data stewards across business/IT areas; stewards clearly identified and defined; a pilot program across departments; data steward accountability; policy prioritization, detailing, communication, enforcement and compliance; developing a DG scorecard; selective LOB and cross-LOB projects using DG; a baseline assessment and data-centric security reference architecture; vulnerability assessment; data discovery (structured); activity monitoring of current privileged user access to systems; adjusting privileged access rights; a "sensitive data" policy; documenting the controls in place, mapped to requirements for data security and compliance; risk assessment of current controls; automated activity monitoring; establishing and mandating a de-identification program for non-production systems (test, QA, development); and aligning perimeter and identity controls with activity monitoring.
      Next steps:
      1. Communicate the workshop assessment results.
      2. Validate the data governance plan and objectives: align current business and IT initiatives with the IBM workshop assessment; prioritize data governance initiatives and integrate them with the planned project sequence, for the short term and the long term.
      3. Create a discovery roadmap with prioritized initiatives.
      4. Implement a Data Governance Project Management Office: obtain executive sponsorship; define structure and responsibilities and identify the core team; define quality metrics and reporting.
      5. Conduct detailed workshops and execute prioritized initiatives (e.g. data quality, classification and metadata management): adopt metadata-driven data governance in IT; acquire metadata management, analysis and quality tools; analyze current data quality; implement process improvement for data quality.
      6. Define the metrics that identify how the business realizes returns on its investment in the collection, production and use of data.
      7. Identify areas where additional consulting would accelerate the timeline.
      Key observations and opportunities: policy, standards and data definition (efficiency, data integrity); organizational awareness and enterprise solutions (communication, SME availability, level of influence); the value-creation process for the enterprise and LOBs (data and analytics optimization for the business, higher ROI / faster payback); monitored data quality for early risk identification (quantified risk; metrics for data quality and business impact); organizational effectiveness and data accountability/stewardship (ownership, efficiency, data quality); internal data access and sensitive data location/control (risk mitigation & compliance); unstructured content (cost avoidance, risk mitigation).
  38. Information Governance – Company's Assessment Results
      The assessment results are plotted on the same domain framework: outcomes (business outcomes / reporting; data risk management & compliance; value creation), enablers (organizational structures & awareness; policy; data stewardship), core disciplines (data quality management / discovery; information life-cycle management; information security and privacy) and supporting disciplines (data architecture; classification & metadata; audit information logging & reporting).
  39. Information Governance Maturity Assessment – Current and Target Mapping
      The chart maps the assessed current state against the planned future state and highlights the prioritized domains with a recommended action plan. Prioritization was made by the workshop attendees based on the largest gaps between current and to-be positions and on an evaluation of the organization's capacity and feasibility to adopt the changes.
  40. IMS and Data Governance – upcoming workshops
      • Palisades, New York: May 14–15, 2013
      • Chicago: May 21–22, 2013
      • São Paulo, BR: June 3, 2013
      • Costa Mesa, CA: June 4–5, 2013
      • Boeblingen, DE: June 5, 2013
      • Taipei, Taiwan: June 2013
      • United Kingdom: June 2013
      • Charlotte, NC: June 25–26, 2013 (tentative)
  41. IMS and Data Governance – summary
      • Data governance: regulation compliance, avoiding media embarrassment, competitive edge
      • IMS Enterprise Suite V2.2 Explorer
      • InfoSphere Optim for test data management
      • InfoSphere Optim for data and application retirement
      • InfoSphere Guardium for data protection: encryption of IMS data, S-TAP monitors
      • Data maturity assessment workshop
      • Information Governance Wildfire Workshops
  42. Data Governance for System z Workshop (DGSYSZ) – IBM Advanced Technical Skills Wildfire Workshop
      • Palisades, NY: May 14–15, 2013
      • Chicago, IL: May 21–22, 2013
      • Costa Mesa, CA: June 4–5, 2013
      This workshop, like all Wildfire Workshops, is offered at no fee to qualified customers.
      With the complexity of today's information ecosystems, organizations must improve the level of trust users have in information, ensure consistency of data, and establish safeguards over information. When information is trusted, business can optimize outcomes.
      Join us for one and a half days at the IBM Data Governance for System z Workshop. Meet with experts to understand the business and IT implications of data governance, real-time analytics, and operational data warehousing, and learn how the IBM System z platform can help you meet, simplify, and reduce the cost of meeting your data governance requirements.
      Workshop topics:
      • Drivers of Information Governance
      • Data & Information Governance – what are they?
      • Enablers of an enterprise data governance strategy: policy; data stewardship; organizational structure & awareness
      • Pillars of data governance: data quality management; information life cycle; security, privacy & compliance; master data management
      • Enterprise data governance on System z
      • Data architecture on System z
  43. IBM Smart Analytics System 9600 (5/22/13)
  Speaker notes from the notes pages of the deck:
  44. (Notes for slide 1) This is the "cover slide" for use at the start of a major section of the class. For example, this would be the first slide for DBRC, Operations, or another major topic.
  45. (Notes for slide 2) Messaging & Transaction Manager: an IMS control program receives a transaction request, stores the transaction on a message queue (in memory or in a shared structure), and then invokes its scheduler to start the business application program in a message processing region. The message processing region retrieves the transaction from the IMS message queue and processes it, reading and updating resources such as IMS databases, DB2 databases and WebSphere MQ queues while ensuring proper management of the transaction scope. The IMS application itself decides whether to send a response message back, to start another IMS transaction asynchronously, or to access a set of services synchronously.
      IMS Batch Manager: a very important advantage of the IMS environment is that it provides an embedded batch container available for both types of configuration, DBCTL and DCCTL. For batch workload, IMS plays the role of syncpoint manager and provides very important backup/restart capabilities with repositioning of resources at the latest checkpoint. IMS coordinates resource access while protecting data integrity, and allows parallel access between transactional and batch workload based on the efficient locking mechanisms provided by the resource managers. Batch processing regions are called BMPs for non-Java applications and JBPs for Java batch applications. For historical reasons, IMS still supports a standalone batch environment that runs in a single address space for batch applications or utilities; it is not covered here, even though it is still widely used by customers who never needed parallel processing between online and batch.
  46. (Notes for slide 3) These are a handful of incidents taken from the press – all public domain – that show what happens when you are not in control of your data. They range from incorrect classification of data and security breaches to false information and late disclosures. Data governance applies to every industry, irrespective of size and geography.
      In the healthcare item, a batch of women were incorrectly told that their cervical smear test had revealed multiple infections – not the kind of letter you want to receive in the post. The hospital lab had used a code to signify that a selection of smears had no infections and sent the details to the governing authority, which was using the same code to mean multiple infections. Potentially, numerous patients could have started quite nasty treatments unnecessarily.
      The banking item concerns a bank in Europe where a trader had used his inside knowledge of the back-office systems to try to remove any traces of his alleged fraud. Nick Leeson states that rogue trading is probably a daily occurrence, but he could not believe it had reached such a large figure before being noticed.
      In the retail item, a US supermarket had 4.2 million credit card details stolen during the card-swiping process. And in the public-sector item, two discs containing unencrypted details of families entitled to child benefit (any family with a child under 16) were sent in the post to another department but never arrived. I was one of the 25 million people affected and received a letter stating that our bank details and our daughter's information were on one of the discs that went missing.
  47. (Notes for slide 4) The next slide shows that there are wide-ranging consequences: incorrect classification leading to life-threatening consequences; poor internal controls leading to financial ruin and bankruptcy; ineffective security resulting in brand damage and financial loss; physical loss of data possibly resulting in fraud and identity theft; and late disclosures leading to fines, legal implications and resignations. Don't slip up – it can cost you and your organization a great deal of money and cause longer-term problems.
  51. (Notes for slide 8) The IBM Information Governance Council Maturity Model measures the information governance competencies of organizations based on the 11 crucial domains of information governance maturity, such as organizational awareness and risk lifecycle management. Illustrated here are those domains, grouped by their primary relationships: Outcomes, Enablers, Core Disciplines and Supporting Disciplines. For example, quality and security/privacy requirements for data need to be assessed and managed throughout the information lifecycle. Executive-level endorsement and sponsorship is an enabler for stewardship of information, which requires standardization across processes and functional boundaries. Consistency in practice can be enabled through stewardship when there are enterprise-level policies and standards in place for the information governance disciplines. These domains, or categories of the maturity model, are further refined into multiple sub-domains or sub-categories for assessing maturity.
      Source: "The IBM Information Governance Council Maturity Model: Building a roadmap for effective information governance," white paper, October 2007, http://www-01.ibm.com/software/sw-library/en_US/detail/Z992137B74662E40.html
  56. (Notes for slide 13) Database manager positioning is based on two fundamentally different types of data: operational and informational. Operational data is more application-oriented, constantly updated, and must support daily operations. Informational data is subject-oriented, non-volatile, and supports decision making. Hierarchical and relational databases were developed with inherently different characteristics: hierarchical is more efficient in data access and storage, with strict rules for access, while relational makes data access easier when it does not need to be defined in advance. Each therefore plays a different, critical role and is best at what it was designed for. Hierarchical is best for mission-critical work requiring the utmost in performance; relational is best for decision support, where application productivity is required. Both embrace hierarchical XML data for business-to-business data interchange, and both continue to be enhanced to address different application requirements, creating more overlap in their capabilities. IBM continues to invest in both, with complementary solutions. On XML: it is popular for messaging, where the metadata is encapsulated in the message, while XML as a data repository structure is still struggling to make its promise a reality.
  57. (Notes for slide 14) The upper-left box is the "IMS zEC12 mission statement." The box on the right lists the key zEC12-specific value statements that also apply to IMS, along with the three key features IMS can and will exploit: Java exploitation of Transactional Execution for increased parallelism and scalability (for IMS Java applications on zEC12, clients will see both enhanced performance and reduced TCO through specialty-engine offload); a 31% improvement to PL/I-based CPU-intensive applications based on the new Enterprise PL/I for z/OS and updated C/C++ compilers (for IMS PL/I and C/C++ applications on zEC12, a recompile will lead to increased performance); and increased performance through Flash Express and pageable large pages via z/OS 1.13 (IMS exploits this feature to improve performance and speed data access). The lower-left box shows additional gains clients can see by including specific hardware appliances or software. Finally, the current performance metric: IMS 12 on zEC12 shows up to a 30% improvement in transaction rate.
  58. 58. IMS DB in Perspective
• Native qualities of service
– High capacity: HALDB & DEDB
– High availability: IMS data sharing
– Performance without extra CPU cost: 1/2 the MIPS and 1/2 the DASD of relational
• Application development
– Multi-language AD support: COBOL, PL/I, C, … Java
– XML support: decomposed or intact
– Java SQL support (JDBC): IMS Java
– Access from CICS and IMS applications, and from batch: IMS since the early days
• Open access and data integration
– DRDA Universal Driver: IMS 11+ Open Database
• Data management
– Metadata management: IMS 12+ Catalog
– Advanced space management capabilities: DFSMS family
– Health check: pointer validation & repair
– Backup and recovery advanced solutions: IMS Tools
– Reorganization for better performance: IMS Tools
• Enterprise data governance
– Compression and encryption: InfoSphere Guardium tools
– Audit for every access: InfoSphere Guardium tools
– Data masking: InfoSphere Optim family
– Creation of test databases: InfoSphere Optim family
– Data growth management: InfoSphere Optim family
– Operational business analytics & reporting: Cognos & SPSS
• Information integration & data synchronization
– Fast integration in Web 2.0 applications: IMS Open Database
– Data federation: InfoSphere Classic Federation
– Replication to IMS, towards active/active solutions: InfoSphere IMS Replication
– Replication to relational: InfoSphere Classic Replication Server & Classic CDC
– Publication of DB changes: InfoSphere Classic Data Event Publisher
  59. 59. IMS Explorer for Development – View Examples
Much easier to understand the database structure; SQL & result sets; z/OS perspective.
  60. 60. IMS Explorer for Development – View Examples
Multiple logically related databases. Manufacturing example: assembly parts arrival time at the assembly line.
  61. 61. InfoSphere Optim solutions
Managing data throughout its lifecycle in heterogeneous environments
• Easily refresh & maintain right-sized non-production environments, while reducing storage costs
• Improve application quality and deploy new functionality more quickly
• Speed understanding and project time through relationship discovery within and across data sources
• Understand sensitive data to protect and secure it
[Diagram: Production, Training, Development, Test, Archive and Retire environments, with Discover / Understand / Classify up front and Subset / Mask / Compare / Refresh operations between them]
• Data growth management: reduce hardware, storage & maintenance costs; streamline application upgrades and improve application performance
• Test data management
• Data masking: protect sensitive information from misuse & fraud; prevent data breaches and associated fines
• Application retirement: safely retire legacy & redundant applications while retaining the data; ensure application-independent access to archive data
IBM InfoSphere Optim solutions allow you to manage data through its lifecycle in heterogeneous environments. You may have a lot of data scattered around the organization: how do you find it, and how do you know how it relates to other enterprise data? InfoSphere Optim provides a solution to discover the data and the relationships as information comes into the enterprise. You need to develop applications and functionality that can best maintain your data, and you need to effectively test those applications; Optim gives DBAs, testers and developers a way to create and manage right-sized test data while protecting sensitive test data in development and test environments. The day-to-day challenges of managing the lifecycle of your data are intensified by the growth of data volumes; Optim provides intelligent archiving techniques so that infrequently accessed data does not impede application performance while still remaining accessible, and its Data Growth solution helps reduce hardware, storage and maintenance costs. Over time, the applications managing your data will need to be upgraded, consolidated and eventually retired, but not the data. Many organizations today are over-burdened with redundant or legacy applications (for example, when organizations are merged or acquired, so are their IT systems). By leveraging InfoSphere Optim's Application Retirement solution and archiving best practices you can ensure access to business-critical data for long-term retention, long after an application's life expectancy.
  62. 62. Managing Test Data in Non-Production – Optim Test Data Management
• Create right-sized test environments, providing support across multiple applications, databases and operating systems
• Deploy new functionality more quickly and with improved quality & customer satisfaction
• Compare results during successive test runs to pinpoint defects and errors
• On z/OS: support for DB2, IMS, VSAM
[Diagram: a 1 TB production database or production clone is subset into 100 GB Development, Test, Training and QA environments]
http://www-01.ibm.com/software/data/data-management/optim/core/test-data-management-solution-zos
One piece of solution delivery is managing test data: the creation of non-production environments, including test, development and training environments, with the least amount of data in the fastest amount of time. In this example the production database holds over two terabytes of data; if we replicate it all over the place, we have to pay for software and hardware to support it, and the duplication effort is costly in time and difficulty. Effective test data management lets us build just what is needed, quickly, easily and significantly more cost-effectively. We begin with a production system or a clone of production. Optim extracts the desired data records, based on user specifications, and safely copies them to a compressed file. It loads the file into the target development, test or QA environment. After running tests (for DB2 only), it can compare the results against the baseline data to validate results and identify any errors, and the database can be refreshed simply by re-inserting the extract file, ensuring consistency.
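The extract-and-load flow described in these notes can be illustrated with a short, purely hypothetical Java/JDBC sketch. This is not how Optim works internally; the connection URLs, schemas, table and column names, and the MOD-based 1% sampling rule are all invented for illustration, and a real test-data tool also handles masking, compression and cross-DBMS relationships.

    import java.sql.*;

    public class SubsetSketch {

        public static void main(String[] args) throws Exception {
            // Hypothetical JDBC URLs and credentials; DB2 is assumed only because the slide lists it.
            try (Connection prod = DriverManager.getConnection(
                     "jdbc:db2://zhost.example.com:446/PRODCLN", "tester", "secret");
                 Connection test = DriverManager.getConnection(
                     "jdbc:db2://zhost.example.com:446/TESTDB", "tester", "secret")) {

                // Parent rows first: an arbitrary 1% slice of customers.
                copy(prod, test,
                     "SELECT CUST_ID, NAME, REGION FROM PROD.CUSTOMER WHERE MOD(CUST_ID, 100) = 0",
                     "INSERT INTO TEST.CUSTOMER (CUST_ID, NAME, REGION) VALUES (?, ?, ?)", 3);

                // Then only the child rows that belong to those customers,
                // so the subset stays referentially consistent.
                copy(prod, test,
                     "SELECT O.ORDER_ID, O.CUST_ID, O.AMOUNT FROM PROD.ORDERS O " +
                     "JOIN PROD.CUSTOMER C ON O.CUST_ID = C.CUST_ID WHERE MOD(C.CUST_ID, 100) = 0",
                     "INSERT INTO TEST.ORDERS (ORDER_ID, CUST_ID, AMOUNT) VALUES (?, ?, ?)", 3);
            }
        }

        // Streams rows from the source query into the target insert as one batch.
        static void copy(Connection from, Connection to, String select, String insert, int cols)
                throws SQLException {
            try (Statement s = from.createStatement();
                 ResultSet rs = s.executeQuery(select);
                 PreparedStatement ps = to.prepareStatement(insert)) {
                while (rs.next()) {
                    for (int i = 1; i <= cols; i++) ps.setObject(i, rs.getObject(i));
                    ps.addBatch();
                }
                ps.executeBatch();
            }
        }
    }

The JOIN in the second extract is the point of the exercise: child ORDERS rows are selected only for the CUSTOMER rows that made it into the subset, which is the kind of referential consistency a right-sized test environment needs.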
  63. 63. Data Masking and Protection - Optim Data Masking Option
• Reduce the risk of exposure during data theft
– Fines and lawsuits
– Avoid negative publicity
– Customer loss
– Loss of intellectual property
• Personally identifiable information (PII) is masked with realistic but fictional data for testing & development purposes
• De-identify for privacy protection
• Deploy multiple masking algorithms
• Provide consistency across environments and iterations
• No value to hackers
• Enable off-shore testing
• On z/OS: support for DB2, IMS DB, VSAM
http://www-01.ibm.com/software/data/data-management/optim/core/data-privacy-solution-zos/
Data privacy protects an organization's data in both production and non-production environments with encryption and masking technology. Remember, we do not keep the data from being stolen; instead we render the data unusable and of no value if it is stolen. This protects the business both financially and from loss of information, and gives IT a simple-to-use solution that supports a common way of protecting data across the enterprise. Here is an example of the context-aware data masking that IBM Optim Data Privacy can perform: a real credit card number is transformed into another seemingly valid credit card number that conforms to all the rules for a valid Visa card number (for example, it has 16 digits and starts with 4). Similarly, the first and last name of the actual person "Eugene V. Wheatley" is transformed to the fictional yet valid name "Sanford P. Briggs". Optim Data Privacy can perform this kind of de-identification of sensitive data automatically. Most importantly, it transforms data in a way that is appropriate to the context of the application, so the results still make sense to the person reviewing the test output: fields containing alphabetic characters are substituted with other alphabetic characters in the appropriate pattern, and the transformed data stays within the range of permissible values. For example, if your diagnostic codes are four digits long but only range from 0001 to 1000, a masked value of 2000 would not be appropriate; and if an address is needed, you would rather use a street address that actually exists than something meaningless like XXXXXX as a street name. This is called "context-aware" data transformation, or data masking, and it is a core capability of IBM Optim Data Privacy, which comes with a multitude of built-in masking functions as well as the ability to define your own transformations. There is no longer a reason to needlessly expose your sensitive data in your test environments.
  64. 64. InfoSphere Optim Data Masking Solution / Option
Example 1 – key propagation across related tables:
Personal Info Table (before): PersNbr / FirstName / LastName – 08054 Alice Bennett; 19101 Carl Davis; 27645 Elliot Flynn
Personal Info Table (masked): 10000 Jeanne Renoir; 10001 Claude Monet; 10002 Pablo Picasso
Event Table (before): PersNbr / FstNEvtOwn / LstNEvtOwn – 27645 Elliot Flynn
Event Table (masked): 10002 Pablo Picasso
Referential integrity is maintained with key propagation.
Example 2 – contextually correct masking:
Patient Information: Patient No. / SSN / Name / Address / City / State / Zip
112233, 123-45-6789, Amanda Winters, 40 Bayberry Drive, Elgin, IL, 60123
123456, 333-22-4444, Erica Schafer, 12 Murray Court, Austin, TX, 78704
Data is masked with contextually correct data to preserve the integrity of the test data.
Data masking techniques include: string literal values, character substrings, random or sequential numbers, arithmetic expressions, concatenated expressions, date aging, lookup values, generic mask.
Benefits: satisfy privacy regulations, reduce the risk of data breaches, maintain the value of test data.
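To make two of the listed techniques concrete, here is a minimal Java sketch of lookup-value substitution and deterministic key propagation. It is not Optim's masking engine: the surrogate-key hashing rule, the tiny lookup table and the sample person numbers are illustrative assumptions, and the output values will not match the slide's examples.

    import java.nio.charset.StandardCharsets;
    import java.security.MessageDigest;
    import java.util.HashMap;
    import java.util.Map;

    public class MaskingSketch {
        // Small lookup table of realistic but fictional replacement names.
        private static final String[][] LOOKUP = {
            {"Jeanne", "Renoir"}, {"Claude", "Monet"}, {"Pablo", "Picasso"}
        };

        // Deterministic key mapping: the same original PersNbr always yields the same
        // masked PersNbr, so related tables stay consistent (collisions ignored for brevity).
        static String maskKey(String persNbr) throws Exception {
            byte[] h = MessageDigest.getInstance("SHA-256")
                                    .digest(persNbr.getBytes(StandardCharsets.UTF_8));
            int n = (((h[0] & 0x7F) << 8) | (h[1] & 0xFF)) % 90000 + 10000; // five-digit surrogate
            return Integer.toString(n);
        }

        // Lookup-value substitution driven by the masked key.
        static String[] maskName(String persNbr) throws Exception {
            int idx = Integer.parseInt(maskKey(persNbr)) % LOOKUP.length;
            return LOOKUP[idx];
        }

        public static void main(String[] args) throws Exception {
            String[] persNbrs = {"08054", "19101", "27645"};
            Map<String, String> keyMap = new HashMap<>();
            for (String p : persNbrs) {
                keyMap.put(p, maskKey(p));
                String[] name = maskName(p);
                System.out.println(p + " -> " + keyMap.get(p) + " " + name[0] + " " + name[1]);
            }
            // The Event Table row owned by 27645 receives the same propagated key:
            System.out.println("Event row owner 27645 -> " + keyMap.get("27645"));
        }
    }

Because the surrogate key is derived deterministically from the original key, the masked Event Table row still joins to the masked Personal Info row, which is the key-propagation behaviour the slide illustrates.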
  65. 65. Managing Data Growth in Production - Optim Data Growth
• Segregate historical data to a secure archive
• Align performance to service-level targets
• Reclaim under-utilized capacity
• On z/OS: support for DB2, IMS DB, VSAM
[Diagram: selective archive moves historical and reference data out of current production into archived data/metadata; selective retrieval and universal access to application data are available through applications, XML, ODBC/JDBC and reporting]
Here is how Optim works. This is a typical example of a production environment prior to archiving: both active and inactive data are stored in production, taking up most of the space on the production server. Optim safely moves the inactive or historical data to an archive, which can be stored in a variety of environments. Optim then provides universal access to this data through multiple methods, including report writers such as Cognos and Crystal Reports, XML, ODBC/JDBC, application-based access (Oracle, Siebel, etc.) or even MS Excel. Finally, data can easily be retrieved back into an application environment when additional business processing is required. Optim's broad capabilities for enterprise data management give CIOs a comprehensive solution for dealing with data growth.
  66. 66. InfoSphere Optim Application Retirement
• Preserve application data in its business context
• Retire out-of-date packaged applications as well as legacy custom applications
• Shut down legacy systems without a replacement
[Diagram: infrastructure before retirement shows several user-facing applications, each with its own database of application data; after consolidation, users access archive data through a single archive engine]
  67. 67. Secure & Protect High Value Databases - Guardium Data Encryption
http://www-01.ibm.com/software/data/guardium/
Provides:
• z/OS integrated software support for data encryption
• Operating system software API interface to cryptographic hardware
− CEX2C/CEX3C/CEX4C hardware features
• Enhanced key management for key creation and distribution
− Public and private keys
− Secure and clear keys
− Master keys
• Created keys are stored and accessed in the Cryptographic Key Data Set (CKDS) with a unique key label
− The CKDS itself is secured via the Security Access Facility
zSeries encryption: using the CEX hardware accelerator has minimal impact on performance. Both clear keys and secure keys are supported. A clear key is not encrypted and can be found in a storage dump, which is not truly acceptable for security purposes; a secure key is encrypted by the system master key while at rest and will not be shown in storage or in a dump of the system. DES (Data Encryption Standard), TDES (Triple DES) and AES (Advanced Encryption Standard) are all supported encryption algorithms. DES supports 56-bit keys only and is not considered strong by today's standards; TDES and AES at 128 bits are considered acceptable, and AES also supports 192- and 256-bit encryption, with 256-bit considered strategic. An encryption exit is installed in the IMS database definition at the segment level and implemented during an unload/reload of the database. There are no changes to the application programs accessing the data.
  68. 68. Secure & Protect High Value Databases - Guardium Data Encryption
• Non-invasive architecture
• Clear and secure keys
• Hardware enabled = minimal performance impact
• Supports DES, TDES & AES algorithms
• Supports 56-, 128- & 256-bit encryption
• Installed at the segment level
• No DBMS or application changes
http://www-01.ibm.com/software/data/guardium/
[Diagram: clear text seen by the application, cryptotext stored in the database]
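As a rough illustration of the clear-text-in / cryptotext-out idea on the preceding two slides, the sketch below uses the standard Java javax.crypto API. It is deliberately not the product's mechanism: Guardium Data Encryption for IMS works through a segment-level exit, ICSF and CEX hardware with keys held in the CKDS, whereas this sketch generates a throwaway key in application storage and uses a hypothetical segment value.

    import javax.crypto.Cipher;
    import javax.crypto.KeyGenerator;
    import javax.crypto.SecretKey;
    import javax.crypto.spec.IvParameterSpec;
    import java.nio.charset.StandardCharsets;
    import java.security.SecureRandom;

    public class SegmentEncryptionSketch {
        public static void main(String[] args) throws Exception {
            // Stand-in for a key that would really live in the CKDS under a key label and be
            // used through ICSF; generating it here is purely illustrative.
            KeyGenerator kg = KeyGenerator.getInstance("AES");
            kg.init(256);                                 // 256-bit AES, the slide's "strategic" choice
            SecretKey key = kg.generateKey();

            byte[] clearSegment = "CUST0001 AMANDA WINTERS 40 BAYBERRY DRIVE"
                    .getBytes(StandardCharsets.UTF_8);

            byte[] iv = new byte[16];
            new SecureRandom().nextBytes(iv);
            Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");

            cipher.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
            byte[] cryptoText = cipher.doFinal(clearSegment);     // what would sit on DASD

            cipher.init(Cipher.DECRYPT_MODE, key, new IvParameterSpec(iv));
            byte[] decrypted = cipher.doFinal(cryptoText);        // what the application sees
            System.out.println(new String(decrypted, StandardCharsets.UTF_8));
        }
    }

In the product itself the notes above say the encryption exit sits in the IMS database definition at the segment level, which is why the application continues to see clear text with no program changes while the stored data is cryptotext.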
  69. 69. Secure & Protect High Value Databases - Guardium Real-Time Database Monitoring
• Non-invasive architecture
• Heterogeneous, cross-DBMS solution (DB2 & DB2 for z/OS, IMS, VSAM)
• Does not rely on native DBMS logs
• Minimal performance impact
• No DBMS or application changes
• Activity logs cannot be erased by attackers or rogue DBAs
• Automated compliance reporting, sign-offs & escalations (SOX, PCI, NIST, etc.)
• Granular, real-time policies & auditing
• Single point of monitoring across DBMSs
http://www-01.ibm.com/software/data/guardium/
Let's talk about our solution. It offers heterogeneous support for databases and applications. S-TAP agents are lightweight and cross-platform and require no changes to databases or applications; they also monitor direct access to databases by privileged users (such as SSH console access), which cannot be detected by solutions that only monitor at the switch level. Collectors handle the heavy lifting (continuous analysis, reporting and storage of audit data), which reduces the impact on the database server. The solution does not rely on log or native audit data: DBAs can (and sometimes have to) turn native auditing off, and logging greatly impacts performance on the database server as granularity increases. Alerting is real-time, not after the fact, and all access is monitored.
  70. 70. Secure & Protect High Value Databases - Guardium Real-Time Database Monitoring Architecture
  71. 71. Secure & Protect High Value Databases - Guardium Real-Time Database Monitoring Sample report
The report shows an IMS BMP job that ran for two minutes: jobname TSTCMDDC accessed database AUECCMDD. You can also see the user ID and the PSB used by the job. The IMS Context column lists, in sequence, the calls made to the database.
  72. 72. Secure & Protect High Value Databases - Guardium Real-Time Database Monitoring Sample report
  73. 73. Operational Business Analytics on IMS Data – Cognos Reporting with IMS 12
• Benefits:
– Ad hoc reporting access
– Report on data reflecting the most current state of the business
– React faster to trusted data
– Market-leading BI solution for IMS customers
• Roadmap for customers
– Cognos 10.2 & IMS 11: the IMS 11 JDBC driver is NOT certified with Cognos 10.2, even though the Open Database architecture is available.
– Cognos 10.2 & IMS 12: the IMS 12 JDBC driver with the catalog is certified with Cognos 10.2.
• New functions allow enhanced predicates to be exploited by Cognos
• The IMS catalog provides centralized metadata management on z/OS
[Diagram: an author uses Cognos Framework Manager to build the data model; Cognos BI report authoring publishes reports to consumers; Cognos reaches the IMS data store through JDBC using the IMS Universal JDBC driver]
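For orientation, this is roughly what a plain SQL query over IMS data through the IMS Universal JDBC driver looks like. Treat it as a sketch: the host name, DRDA port, PSB name, credentials, and the segment and field names are placeholders, and the exact driver class name and URL form should be confirmed against the IMS Universal drivers documentation.

    import java.sql.*;

    public class ImsJdbcReportQuery {
        public static void main(String[] args) throws Exception {
            // Hypothetical host, DRDA port and PSB name.
            String url = "jdbc:ims://zhost.example.com:5555/MYPSB";
            java.util.Properties props = new java.util.Properties();
            props.setProperty("user", "IMSUSER");
            props.setProperty("password", "secret");

            // IMS Universal JDBC driver (type 4); class name as documented for the Universal drivers.
            Class.forName("com.ibm.ims.jdbc.IMSDriver");

            try (Connection con = DriverManager.getConnection(url, props);
                 Statement stmt = con.createStatement();
                 // Assumed PCB, segment and field names; with the IMS 12 catalog the metadata
                 // needed to resolve them is kept inside IMS itself.
                 ResultSet rs = stmt.executeQuery(
                     "SELECT CUSTNO, CUSTNAME FROM PCB01.CUSTOMER WHERE REGION = 'TX'")) {
                while (rs.next()) {
                    System.out.println(rs.getString("CUSTNO") + " " + rs.getString("CUSTNAME"));
                }
            }
        }
    }

Cognos Framework Manager sits on the same path: it imports the metadata over this JDBC connection to build the data model, and published reports then issue comparable SQL against the live IMS databases.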
  74. 74. Operational Business Analytics on IMS Data – Cognos Reporting with IMS 12
Available with Cognos 10.2
  75. 75. Technologies are in Place for Mainframe Application Extensibility
Customers are here today: WAS, MQ, CICS & IMS connectors, data warehouse, XML, asset management, SOA, business processes, compliance, service management.
Technology is in place to go here next: hybrid computing, workload optimization, analytics, business rules, zLinux & zBX.
Application investment protection: ASM, COBOL, PL/I and C alongside Java.
  76. 76. Big Data and IMS Databases
• IMS integration with the BigInsights application connectors
– Merge trusted OLTP data with the Big Data platform
• Integrate IMS with the Big Data Machine Data Accelerator (MDA)
– Correlate log records from off-platform application servers with IMS log records
[Diagram: the traditional approach (structured, analytical, logical) feeds a data warehouse from traditional sources that are structured, repeatable and linear; the new approach (creative, holistic thought, intuition) feeds Hadoop and Streams from new sources that are unstructured, exploratory and iterative, such as web logs, social data, text data (emails), sensor data (images) and RFID; enterprise integration joins the two with IMS operational data – transaction data, internal application data, mainframe data, OLTP system data]
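The correlation idea in the second bullet can be shown with a tiny, hypothetical Java sketch that joins two log streams on a shared correlation id. It says nothing about the real MDA tooling (which parses raw application-server and IMS log formats at Hadoop scale); the ids, timestamps and record types below are invented.

    import java.time.Instant;
    import java.util.HashMap;
    import java.util.Map;

    public class LogCorrelationSketch {
        public static void main(String[] args) {
            // Hypothetical application-server events keyed by a shared transaction/correlation id.
            Map<String, Instant> appServerEntry = new HashMap<>();
            appServerEntry.put("TX1001", Instant.parse("2013-05-14T10:15:02Z"));
            appServerEntry.put("TX1002", Instant.parse("2013-05-14T10:15:05Z"));

            // Hypothetical timestamps taken from IMS log records for the same units of work.
            Map<String, Instant> imsSyncPoint = new HashMap<>();
            imsSyncPoint.put("TX1001", Instant.parse("2013-05-14T10:15:03Z"));
            imsSyncPoint.put("TX1002", Instant.parse("2013-05-14T10:15:06Z"));

            // Correlate the two streams on the shared id and report end-to-end elapsed time.
            for (Map.Entry<String, Instant> e : appServerEntry.entrySet()) {
                Instant imsEnd = imsSyncPoint.get(e.getKey());
                if (imsEnd != null) {
                    long elapsedMs = imsEnd.toEpochMilli() - e.getValue().toEpochMilli();
                    System.out.println(e.getKey() + ": " + elapsedMs
                            + " ms from application-server entry to IMS sync point");
                }
            }
        }
    }

The value of the correlation is exactly this kind of end-to-end view: an event seen off-platform matched against the trusted record of what IMS actually did for the same unit of work.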
  77. 77. An enterprise information hub on a single integrated platform
• Best in OLTP & transactional solutions: industry-recognized leader for mission-critical transactional systems (OLTP transaction processing)
• Best in analytics: industry-recognized leader in business analytics and data warehousing solutions (business analytics, predictive analytics)
• Best in flexibility: ability to start with your most critical business issues, quickly realize business value, and evolve without re-architecting
• zEnterprise: recognized leader in workload management with proven security, availability and recoverability
• DB2 Analytics Accelerator for z/OS, powered by Netezza: recognized leader in cost-effective, high-speed deep analytics
• Data mart consolidation: unprecedented mixed-workload flexibility and virtualization providing the most options for cost-effective consolidation
• Best in batch workload: efficient execution environment close to the data with first-class I/O technology
  78. 78. Why Assess Information Maturity
• Information maturity is typically assessed to provide a snapshot in time of an organisation's ability to manage information, as defined by the maturity model
• This is most often used to benchmark and compare maturity
– Across time within an organisation
– Between organisations
• With the aim of improving the ability to manage information over time
– Reduce the time needed to access information
– Reduce stored information complexity
– Lower costs through an optimized infrastructure
– Gain insight through analysis & discovery
– Leverage information for business transformation
– Gain control over master data
– Manage risk and compliance via a single version of truth
• This implies that an information maturity assessment should be an ongoing activity
  79. 79. A Model for Information Maturity: An Evolution for our Clients
(Business value of information increases with information management maturity.)
Level 5 – Information as a Competitive Differentiator
• Data: all relevant internal and external information seamless and shared; additional sources easily added
• Integration: virtualized information services
• Applications: dynamic application assembly
• Infrastructure: dynamically re-configurable; sense & respond
• Flexible, adaptive business environments across enterprise and extraprise; enablement of strategic business innovation; optimization of business performance and operations; strategic insight
Level 4 – Information to Enable Innovation
• Data: seamless & shared; information separated from process; full integration of structured and unstructured
• Integration: information available as a service
• Applications: process integration via services; in-line business applications
• Infrastructure: resilient SOA; technology neutral
• Role-based work environments commonplace; fully embedded capabilities within workflow, processes & systems; information-enabled process innovation; enhanced business process & operations management; foresight, predictive analytics
Level 3 – Information as a Strategic Asset
• Data: standards based, structured & some unstructured
• Integration: integration of silos; virtualization of information
• Applications: services-based
• Infrastructure: component/emerging SOA, platform specific
• Introduction of contextual, role-based work environments; enhanced levels of automation; enhancement of existing processes and applications; integrated business performance management; single version of truth; insight through analytics, real time
Level 2 – Information to Manage the Business
• Data: structured content; organized
• Integration: some integration; silos still remain
• Applications: component-based applications
• Infrastructure: layered architecture, platform specific
• Basic search, query, reporting and analytics; some automation; disparate work environments; limited enterprise visibility; multiple versions of the truth
Level 1 – Data to Run the Business
• Data: structured content, static
• Integration: disjointed, siloed, non-integrated solutions
• Applications: stand-alone modules; application-dependent
• Infrastructure: monolithic, platform specific
• Basic reporting & spreadsheet-based; manual, ad hoc dependence; information overload; no version of truth; hindsight based
  80. 80. Data Governance Workshop Key Steps
Conduct interviews of key IT/business leaders and the DG council → Assess data governance maturity and target capabilities → Identify the gap to the future state (18 months) → Develop recommendations → Develop a roadmap for delivering capabilities → Next steps.
Sample workshop outputs (thumbnails from a 2011 engagement):
Information Maturity Assessment – Gap Summary: eleven maturity categories (1. Organizational structures and awareness, 2. Data stewardship, 3. Policy, 4. Value creation, 5. Data risk management & compliance, 6. Information security & privacy, 7. Data architecture, 8. Data quality management, 9. Classification & metadata, 10. Information life-cycle management, 11. Audit information logging & reporting) are each rated on a scale from Initial (Level 1) through Managed, Defined and Quantitatively Managed to Optimizing (Level 5). Scope of services: assess the current state, determine the future state (in 12-18 months), and identify the required capabilities and initiatives; the difference is the capability gap.
Implementation Roadmap (2011-2012):
– Organizational structures and awareness: OSA1 communication and DG ownership; OSA2 DGC undertaking critical projects; OSA3 establish COE & execution committee; OSA4 data stewards across business/IT areas.
– Data stewardship: DS1 stewards clearly identified/defined; DS2 pilot program across departments; DS3 data steward accountability.
– Policy: POL1 policy prioritization; POL2 fleshing out policy details; POL3 policy communication, enforcement and compliance.
– Value creation: VC1 develop DG scorecard; VC2 selective LOB projects using DG; VC3 selective cross-LOB projects using DG.
– Data risk management & compliance: assessment for baseline and establish a data-centric security reference architecture; vulnerability assessment; data discovery (structured); activity monitoring of current privileged-user access to systems; verify that Level 4 has started by comparing governance success with assessment findings for people and process; adjust privileged access rights; 'sensitive' data policy; document controls in place mapped to requirements for data security and compliance; risk assessment for current controls.
– Information security & privacy: data-centric security architecture; automated activity monitoring; establish and mandate a de-identification program for non-production systems (test, QA, dev); align perimeter & identity controls with activity monitoring; baseline vulnerability assessment; pre-assessment internal survey.
Pre-Workshop Survey Results – Executive Summary.
Next Steps:
1. Communication of workshop assessment results.
2. Validate the data governance plan and objectives: align current business and IT initiatives with the IBM workshop assessment; prioritize data governance initiatives and integrate them with the planned project sequence, for the short term and the long term.
3. Create a discovery roadmap with prioritized initiatives.
4. Implement a data governance project management office: obtain executive sponsorship; define structure and responsibilities and identify the core team; define quality metrics and reporting.
5. Conduct detailed workshops / execution of prioritized initiatives, e.g. data quality, classification and metadata management: adopt metadata-driven data governance in IT; acquire metadata management, analysis, and quality tools; analyze current data quality; implement process improvement for data quality.
6. Define the metrics that identify how the business realizes returns on its investment in the collection, production, and use of data.
7. Identify areas where additional consulting would accelerate the timeline.
Key Observations and Opportunities (Ref – Observation – Opportunity):
1 – Policy, standards, data definition – Efficiency; data integrity
2 – Organizational awareness and enterprise solutions – Communication; SME availability; level of influence
3 – Value creation process (enterprise and LOB) – Data and analytics optimization for the business; higher ROI / faster payback
4 – Metrics: data quality, business impact – Monitored data quality (early risk identification); quantified risk
5 – Organizational effectiveness and data accountability (stewardship) – Ownership; efficiency; data quality
6 – Internal data access and sensitive data location/control – Risk mitigation & compliance
7 – Unstructured content – Cost avoidance; risk mitigation
  81. 81. Information Governance – Company's Assessment Results
Core disciplines: data quality management/discovery; information life-cycle management; information security and privacy.
Supporting disciplines: data architecture; classification & metadata; audit information logging & reporting.
Enablers: organizational structures & awareness; policy; data stewardship.
Outcomes: data risk management & compliance; business outcomes / reporting; value creation.
(Arrows in the chart mark the Requires, Supports and Enhance relationships among these groups.)
  82. 82. Information Governance Maturity Assessment – current and target mapping
Assessed current state; planned future state; prioritized* domains with a recommended action plan.
*Prioritization was made by the workshop attendees based on:
• The highest gaps between the current and to-be positions
• Evaluation of acceptance capability / feasibility by the organization
  83. 83. IMS and Data Governance
Palisades, New York: May 14-15, 2013
Chicago: May 21-22, 2013
São Paulo, BR: June 3, 2013
Costa Mesa, CA: June 4-5, 2013
Boeblingen, DE: June 5, 2013
Taipei, Taiwan: June 2013
United Kingdom: June 2013
Charlotte, NC: June 25-26, 2013 (tentative)
Messaging & transaction manager: an IMS control program receives a transaction request, stores the transaction on a message queue (in memory or in a shared structure), and then invokes its scheduler to start the business application program in a message processing region. The message processing region retrieves the transaction from the IMS message queue and processes it, reading and updating resources such as IMS databases, DB2 databases and WebSphere MQ queues while ensuring proper management of the transaction scope. The IMS application itself decides whether to send a response message back, to start another IMS transaction asynchronously, or to access a set of services synchronously.
IMS batch manager: a very important advantage of the IMS environment is that it provides an embedded batch container available for both types of configuration, DBCTL and DCCTL. For batch workloads, IMS plays the role of syncpoint manager and provides very important backup/restart capabilities with repositioning of resources at the latest checkpoint. IMS coordinates resource access while protecting data integrity, and allows parallel access between transactional and batch workloads based on efficient locking mechanisms provided by the resource managers. Batch processing regions are called BMPs for non-Java applications or JBPs for Java batch applications. For historical reasons, IMS still supports a standalone batch environment that runs in a single address space for batch applications or utilities; this environment is not covered in this document, even though it is still widely used by customers who over the years have not needed parallel processing between online and batch.
  84. 84. IMS and Data Governance
Data governance
Regulation compliance
Avoiding media embarrassment
Competitive edge
IMS Enterprise Suite V2.2 Explorer
InfoSphere Optim for test data management
InfoSphere Optim for data and application retirement
InfoSphere Guardium for data protection
Encryption of IMS data
S-TAP monitors
Data maturity assessment workshop
Information Governance Wildfire Workshops
  85. 85. Data Governance for System z Workshop (DGSYSZ)
Palisades, NY: May 14-15, 2013
Chicago, IL: May 21-22, 2013
Costa Mesa, CA: June 4-5, 2013
This workshop, like all Wildfire Workshops, is offered at no fee to qualified customers.
IBM Advanced Technical Skills Wildfire Workshop
With the complexity of today's information ecosystems, organizations must improve the level of trust users have in information, ensure consistency of data, and establish safeguards over information. When information is trusted, business can optimize outcomes. Join us for one and a half days at the IBM Data Governance for System z Workshop. Meet with experts to understand the business and IT implications of data governance, real-time analytics, and operational data warehousing, and learn how the IBM System z platform can help you meet your data governance requirements while simplifying and reducing the cost of doing so.
Workshop Topics:
• Drivers of Information Governance
• Data & Information Governance: What Are They?
• Enablers of an Enterprise Data Governance Strategy
− Policy
− Data Stewardship
− Organizational Structure & Awareness
• Pillars of Data Governance
− Data Quality Management
− Information Life Cycle
− Security, Privacy, & Compliance
− Master Data Management
• Enterprise Data Governance on System z
• Data Architecture on System z
• Role of DB2 for z/OS & IMS in Data Governance
• Operational Analytics & Real-Time Analytics on System z
• Data Governance Assessment
Audience: attendance at this workshop is recommended for chief technology officers, architects, data stewards, IT management, owners of business analytics, data warehouse owners, line-of-business application owners, DBA management, and test & development management.
Enrollment: to enroll, please work with your IBM sales representative and enroll together by visiting the following website:
https://www.ibm.com/servers/eserver/zseries/education/topgun/enrollment/esfldedu.nsf/0/0D284179789982B5852578B8004C07B4?EditDocument
For more information on enrollment or for other Wildfire administration questions, contact Judy Vadnais-Keute at judyv@us.ibm.com, and for more information on this Data Governance for System z Workshop, please contact Peter Kohler at pkohler@us.ibm.com
