The document discusses common pitfalls in data integration projects. It describes projects that rush into development without proper planning, creating overly customized data models. This can lead to projects taking much longer than planned and models that are never fully deployed. The document also discusses projects that build solutions without ensuring they meet business needs, resulting in solutions not being adopted. It advocates planning projects better by understanding requirements, using industry data models when possible, and ensuring solutions address real business problems. The document also stresses the importance of data quality, profiling data sources, and performing impact analysis in iterative development cycles.
Netapp Evento Virtual Business Breakfast 20110616 - Bruno Banha
Presentation given by Netapp at the Virtual Business Breakfast event, held on 16 June 2011 in Porto.
The event was held jointly with VMware and NextiraOne Portugal.
This document discusses maximizing returns from a data warehouse. It covers the need for real-time data integration to power business intelligence and enable timely, trusted decisions. It outlines challenges with traditional batch-based approaches and how Oracle's data integration solutions address these through products that enable real-time data capture and delivery, bulk data movement, and data quality profiling to build an enterprise data warehouse.
Matt Kimball, AMD Server Solutions Marketing presentation on "Architecting Cloud Solutions" at Dell World.
What makes the difference between a successful cloud implementation and one that doesn’t live up to expectations? Explore how the underlying architecture can translate into performance gains and cost savings.
Leveraging System z to Turn Information Into Insight - dkang
This document discusses IBM's DB2 10 for z/OS, IMS 11, and System z momentum. Some key points:
- DB2 10 for z/OS has seen the fastest sales upgrade in 20 years with incredible demand and every beta client moving to production, including JP Morgan Chase.
- IMS 11 is running 3.6 billion transactions daily, 15 times more than a year ago, and IMS Tools saw its largest sales year ever.
- System z is seeing momentum from database consolidation projects, adding DB2 warehouses, and application patterns that save costs by keeping applications close to operational data sources.
- The document discusses how IBM offers business analytics and data warehousing solutions on System z.
Cloud Clf 2011 12 Big Things To Know Idc Analysts 2011 - Job Voorhoeve
The document discusses trends in technology presented by analysts from IDC. It covers:
1. How the "third platform" of cloud, mobile, social, analytics and more is transforming the IT industry and enabling millions of new applications and services.
2. How converged IT solutions are reaching limits and shared infrastructure will drive the next data center transformation.
3. How "bring your own license" approaches allow more deployment choices and ease of using public clouds for licensed applications.
4. How IT is reorienting around delivering services instead of applications, with vendors building digital supply chains and more companies sourcing complex cloud services.
e-Com: Activate content in your business process - Vincent Kwon
The document discusses activating content in business processes. It provides an agenda that includes topics on content collection, connecting content to decisions, and using content to enhance business processes. It notes the large and growing amounts of unstructured content organizations face. The document then discusses how the IBM Content Collector can help manage content as part of business processes by classifying, collecting, and activating content to trigger business processes.
The document summarizes a presentation on evolving a new analytical platform. It discusses defining the platform to include tools for the whole research cycle beyond just business intelligence (BI), with SQL Server 2008 R2 as an example of defining the platform. It also discusses what is working with existing platforms and what is still missing, including the need for more scalable data storage and processing.
BI Forum 2010 - High Performance BI: The Future of BI - OKsystem
The document discusses high performance business intelligence and Sybase. It summarizes that data volumes and decision points are expanding due to more devices and data. Sybase delivers analytics solutions like Analytics Server to help organizations manage, analyze and mobilize information for better decisions. The document also discusses how Sybase IQ helped Telstra optimize its mobile network during the 2000 Olympics by providing near real-time analytics on over 200 cells.
Sanjay Mirchandani’s KeyNote – EMC Forum India – Mumbai November 17, 2011 - EMC Forum India
The document discusses EMC's vision and strategy around cloud computing. It outlines EMC's agenda to discuss their cloud vision, how cloud meets big data, and EMC IT's own transformation journey to the cloud and IT-as-a-service strategy. It highlights challenges like budget constraints, increasing data volumes, and security threats that cloud computing can help address. The document advocates for a hybrid cloud approach combining private and public clouds and outlines phases in the journey to establishing a private cloud.
Datacenter transformation - Dion van der Arend - HPDutchWorld
(1) Datacenters are facing increasing demands that many current facilities cannot meet, requiring transformation through consolidation, virtualization, and improved energy efficiency and availability.
(2) Datacenter designs are evolving from small, isolated IT islands to larger, standardized facilities with improved reliability, energy conservation, and reduced costs. Next-generation designs feature modular pods that can be deployed rapidly and offer high power densities up to 20kW/m2.
(3) As datacenter economics have changed, managing costs such as power and cooling have become priorities, driving the need for more energy-efficient computing and facility solutions.
The document discusses the need for a single data and events platform to handle high volumes of data and events. It describes GemStone Systems, which provides a distributed main-memory data management platform using a data fabric/grid. The platform allows applications to process and distribute large amounts of data and events at high speeds and scales linearly. It provides an example of using the platform for electronic trade order management to normalize, validate, aggregate and distribute trading data and events in real-time across clustered applications.
Big Data i CSC's optik (Big Data from CSC's perspective), CSC Representative - IBM Danmark
This document provides an overview of CSC's Netezza case study for a client in Zurich. It discusses how the client was struggling with performance issues on their DB2 database. CSC conducted a proof of concept that showed Netezza and Teradata providing significant performance improvements over DB2. Netezza was ultimately chosen due to cost and compatibility factors. The implementation of Netezza reduced the client's month-end processing time from 9 days to 3 days and improved query performance dramatically. Future plans include migrating more systems to Netezza and taking advantage of upcoming Netezza upgrades.
This document provides an overview of IBM's cloud computing efforts including establishing cloud computing centers worldwide, developing academic initiatives to train students on cloud technologies, and using cloud platforms to enable software development and virtual classrooms. It describes IBM's vision of converged web-centric clouds and enterprise data centers to drive adoption of cloud computing for businesses. Key points covered include IBM's leadership in dynamic enterprise data centers, the characteristics and benefits of cloud computing, and examples of IBM's cloud computing platforms and centers.
This document discusses data management challenges for Oracle Transportation Management (OTM) users. It outlines that mapping user data to OTM business objects and tables can be difficult due to complexity. Formatting data for OTM can also be a pain point. The document recommends getting experience from peers, using standard OTM functionality, working with partners knowledgeable in OTM and user systems, and leveraging commercial accelerators and data maintenance tools to help mitigate these issues.
The document discusses innovations in SAP BusinessObjects 4.0, including:
1) It is lightning fast with in-memory and Sybase IQ technologies, which can make reporting processes run 350 times faster.
2) It provides a trusted 360-degree view of information, including unstructured data and real-time insights.
3) The suite is easier to use with a unified experience across products, improved authoring tools, and one administration platform.
4) Access to information is available whenever and wherever users need it, through self-service mobile and embedded analytics.
Real-time Data Distribution: When Tomorrow is Too Late - Inside Analysis
The document outlines an upcoming webinar series from The Bloor Group called #briefr. Each month will focus on a different technology topic related to enterprise software. The September webinar will focus on data integration, October on databases, November on cloud computing, December on innovators in technology, and January on architecture. The webinars aim to provide in-depth analysis of innovative technologies, give vendors a chance to explain their products, and allow audiences to ask questions. The document also includes information about speakers from Sybase, an SAP company, who will present on Sybase Replication Server in an upcoming webinar.
The document is an introduction to IBM's managed services for datacenter operations in Romania. It discusses the facilities and infrastructure of IBM's Romanian datacenter, including 1000 sqm of server space, separate spaces for telecom equipment, electrical distribution, UPS systems, batteries, generators, fire suppression, and storage. It details the datacenter's reliable power supply including on-site power generation, UPS configuration, and backup generators. It also summarizes the datacenter's cooling and humidity control systems.
Designing A Data Warehouse With SQL 2008 - thomduclos
The document provides an overview of designing a data warehouse with SQL Server 2008. It discusses the Kimball architecture and methodology, including focusing on business requirements, using conformed dimensions, and building dimensional models and BI applications in an iterative process. The document compares different architectural approaches and emphasizes that the dimensional model and Kimball methodology provide the best balance of ease of use, time to market, and ability to scale over time.
This document introduces SQL-H, which enables SQL analytics on Hadoop. It provides a primer on HCatalog and Aster, defines SQL-H, and provides examples of SQL-H usage. SQL-H allows direct access to HCatalog tables from within AsterDB, providing full SQL support and integration with BI tools on data stored in Hadoop. It performs reads from HCatalog in a distributed, native manner without using MapReduce.
The document discusses big data and analytics. It explains that big data refers to extremely large datasets that are difficult to manage with traditional tools due to their size. It also discusses how distributed computing helps address bottlenecks in analyzing big data by allowing inexpensive addition of multiple machines to a computing network. The document also provides an overview of how Splunk can help create a single customer view by ingesting and analyzing structured and unstructured data from various sources in real-time.
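The scale-out idea summarized above can be sketched in a few lines: shard a dataset, aggregate each shard on a separate worker, then merge the partial results, so capacity grows by adding workers rather than a bigger machine. This is a generic Python illustration of the pattern only, not how Splunk or any particular distributed system processes data.

```python
# Minimal sketch of scale-out aggregation: shard the data, let each
# worker aggregate its own shard, then merge the partial results.
# Generic illustration only -- not Splunk's implementation.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(shard):
    # Each worker aggregates only its own slice of the data.
    return sum(shard)

def distributed_sum(data, workers=4):
    # Round-robin sharding: every worker gets an interleaved slice.
    shards = [data[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(partial_sum, shards)
    # Merging the partials gives the same answer as a single pass.
    return sum(partials)

print(distributed_sum(range(1000)))  # 499500, same as sum(range(1000))
```

Adding more workers (or, in a real cluster, more machines) changes only the sharding fan-out; the merge step stays the same, which is why this pattern scales close to linearly for associative aggregations.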
The document discusses a presentation on Dynamic IT for Microsoft. The presentation includes sections on Dynamic IT for Microsoft by Fujitsu Siemens Computers, Microsoft Dynamic IT for optimizing infrastructure by Microsoft Italia, and a disaster recovery solution for Kuwait Petroleum Italia's service-oriented architecture by Atos Origin. The presentation discusses how Dynamic Data Centers can adapt to business needs through automated and shared resource allocation, virtualization, and integration. Dynamic IT for Microsoft is aimed at companies using Microsoft and SAP solutions, those with distributed datacenters requiring high availability, service providers relying on Microsoft and SAP, and companies consolidating or expanding their datacenters.
SolNet - Ministry of Health: Cancer Registry Solution - Vincent Kwon
The document discusses the redevelopment of the New Zealand Cancer Registry (NZCR) system. The NZCR collects and stores cancer diagnosis information. The previous system had issues with usability, data quality, and performance. The redeveloped system addresses these challenges through a composite web services application with a rich client interface. It also implements features for data validation, security, and optimized loading of historical data. The customer has been satisfied with the redeveloped system which streamlines data collection and quality assurance processes.
The document discusses trends in big data and data management. It notes that data volume, velocity, variety, and value are increasing dramatically. This rapid growth is challenging IT to manage and analyze more complex data relationships in real time and at large scale. The document also discusses how new consumption models like cloud computing and storage virtualization can help reduce costs and better manage the explosion of data replication. It introduces Hitachi's accelerated flash storage and new HUS VM entry-level enterprise storage system to address these big data challenges.
The document discusses the components and architecture of data warehouses and data marts. It describes how a data warehouse collects data from multiple operational systems and makes it available for analysis. Data marts contain subsets of data tailored for specific business functions or departments. The document outlines different types of data warehouse architectures including virtual, coarse-grained, central, distributed, and data marts-only. It also discusses challenges like integrating dirty data from multiple sources and prerequisites for a successful data warehouse implementation.
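The warehouse-to-mart relationship described above can be illustrated with a toy sketch: a central store integrating rows from several operational systems, and a mart derived as a department-specific subset. Plain Python, with invented source names and fields, purely for illustration.

```python
# Toy sketch: a warehouse integrates rows from multiple operational
# systems; a data mart is a filtered subset for one business function.
# Source names ("crm", "erp") and fields are invented for illustration.

crm_rows = [{"dept": "sales", "amount": 120}, {"dept": "hr", "amount": 30}]
erp_rows = [{"dept": "sales", "amount": 200}, {"dept": "finance", "amount": 75}]

# The warehouse collects data from multiple operational systems.
warehouse = crm_rows + erp_rows

def build_mart(warehouse, dept):
    # A data mart holds only the subset relevant to one department.
    return [row for row in warehouse if row["dept"] == dept]

sales_mart = build_mart(warehouse, "sales")
print(sum(row["amount"] for row in sales_mart))  # 320
```

In a real implementation the integration step would also have to reconcile the "dirty data" issues the document mentions (conflicting keys, formats, and duplicates across sources) before any mart is derived.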
CDP - Global Outlook for Business Intelligence - Vincent Kwon
The presentation discusses trends in business intelligence, including standardization, mashups, data quality, governance, pre-built analytics, dashboarding, collaboration, competency centers, and leveraging analytics. It provides examples of how these trends are impacting analysts and being adopted by local New Zealand and international companies.
Informatica PowerCenter provides integrated components including a repository to store metadata, a client with tools for designing mappings and workflows, and a server to extract, transform and load data between sources and targets based on the metadata in the repository and workflows created in the client. The designer tool allows importing of sources and targets, creating mappings through connecting them with transformations, and testing mappings.
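The source-to-target mapping idea summarized above can be sketched as a generic extract-transform-load pipeline: rows flow from a source, through one or more transformations, into a target. This is emphatically not Informatica's API; every name below is illustrative.

```python
# Generic ETL sketch of the mapping concept: source -> transformation
# -> target. NOT Informatica PowerCenter's API -- names are invented.

def extract(source):
    # Read rows from the source (here, an in-memory list).
    return iter(source)

def transform(rows):
    # Example transformation: normalize names, cast types, drop
    # rows that fail a simple validity check.
    for row in rows:
        if row.get("name"):
            yield {"name": row["name"].upper(), "qty": int(row["qty"])}

def load(rows, target):
    # Write the transformed rows into the target table.
    target.extend(rows)

source = [{"name": "widget", "qty": "3"}, {"name": "", "qty": "9"}]
target = []
load(transform(extract(source)), target)
print(target)  # [{'name': 'WIDGET', 'qty': 3}]
```

In PowerCenter the equivalent of these three functions is defined declaratively in the Designer and stored as metadata in the repository, and the server executes the mapping; the code above only mirrors the data flow, not the tooling.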
Macleans - NZ Business taking on the world with a world class IT infrastructu... - Vincent Kwon
The document summarizes a presentation given by Adam Zame and Gerhard Richards of Maclean Computing. It describes a project where Maclean Computing virtualized the IT infrastructure of a textile company with locations in the US, New Zealand, and Australia to address risks in the company's aging systems and lack of disaster recovery capabilities. After the new virtualized infrastructure was implemented, a flood damaged the physical servers, but Maclean was able to restore all virtual machines from backups within 5 hours, avoiding significant downtime and costs. The client is now considering further improvements such as physical hosting and additional redundancy.
The document provides an overview of IBM's Big Data platform vision. The platform addresses big data use cases involving high volume, velocity and variety of data. It integrates with existing data warehouse and master data management systems. The platform handles different data types and formats, provides real-time and batch analytics, and has tools to make it easy for developers and users to work with. It is designed with enterprise-grade security, scalability and failure tolerance. The platform allows organizations to analyze big data from various sources to gain insights.
Kiwibank: From Startup to Enterprise in 7 years - Vincent Kwon
Kiwibank was established in 2002 with a low startup budget and leveraged existing infrastructure from NZ Post. It initially used IBM p-Series and x-Series systems running Microsoft applications and SQL databases. Over 7 years it grew strongly through acquisitions and developing its brand, becoming successful with many customers and products. It continues upgrading its infrastructure and adopting new technologies to support further growth as an enterprise.
Yahoo uses Apache Hadoop at a massive scale to power many of its products and services. Hadoop clusters at Yahoo contain tens of thousands of servers and store over 170 petabytes of data. Hadoop is used for data analytics, content optimization, machine learning, advertising products, and more. One example is Yahoo's homepage, where Hadoop enables the personalization of content for each user, increasing engagement on the site.
This document discusses the history of Sphere, a small software company that was acquired by NEC. It proposes converging communications and business applications by integrating tasks, information, decisions and actions across CRM, ERP, SCM and HR systems through a service-oriented communications platform. This would shift focus from traditional telecommunications infrastructure to facilitating user interactions and workflows across the enterprise.
The document discusses DataStreams Corp, a provider of data integration and quality management solutions. It describes DataStreams' product offerings which include ETL, data migration, real-time and deferred data integration, metadata management, and data quality solutions. The document also provides details about DataStreams' customers, market share, and reputation as the leading data integration and quality management solution provider in Korea.
Checkpoint - A Practical Demonstration of Endpoint Security - Vincent Kwon
The document discusses Check Point Endpoint Security. It addresses issues with traditional endpoint security like complexity for administrators and annoyance for users. It then outlines the security features included in Check Point TotalSecurity like personal firewalls, antivirus, VPN clients, and full disk encryption. The rest of the document demonstrates through diagrams how TotalSecurity protects devices when users access the office, move sensitive data, use USB keys, and access networks from home. It also shows the centralized management and reporting capabilities for administrators.
Parallels Server Beta is virtualization software that allows SMBs to optimize their IT infrastructure and realize business goals. It offers an easy to use interface with features like integrated updating, multi-user access to VMs, dynamic memory allocation, extensive virtual device support, and an intuitive GUI to help SMBs increase productivity while reducing pressure on employees and IT resources.
IBM Dynamic Infrastructure - A Telecom Case Study - Vincent Kwon
The document discusses Telecom New Zealand's implementation of a dynamic infrastructure. Garry Johnston, head of technical services at Telecom New Zealand, presents on transforming the company's assets into higher value services through a service-oriented and service-managed approach. This allows Telecom to rapidly and dynamically deliver business and IT services by providing visibility, control, and automation across all infrastructure assets.
Move your desktop to the cloud for $1/day - Desktone
This webinar will explore the reasons for changing traditional desktop computing strategies, why cloud-hosted virtual desktops are a compelling solution for many businesses, and how to leverage cloud-hosted desktops for Windows 7 migrations, mobile and departmental workers, and disaster recovery scenarios.
The document provides information about what a data warehouse is and why it is important. A data warehouse is a relational database designed for querying and analysis that contains historical data from transaction systems and other sources. It allows organizations to access, analyze, and report on integrated information to support business processes and decisions.
A data warehouse is a relational database designed for query and analysis that contains historical data from transaction systems and other sources. It integrates data from various sources like ERP, weblogs, and legacy systems. The data in a data warehouse is nonvolatile, time-variant, and can be organized by subject area. A data warehouse provides a single, consistent view of data from across the organization to support reporting, analysis, and business decisions.
Steve Sams (VP IBM Global Site & Facilities Services) presentation at Gartner Data Center Conference (Dec 2011). Learn more about IBM Smarter Data Center Services: ibm.co/smarterdc
Cisco - Collaboration Enabled Business Transformation - Vincent Kwon
This document discusses how collaboration technologies can enable business transformation. It describes a sales transformation solution that uses collaboration tools to help sales specialists find the right people, interact virtually without compromising, and measure business results. The solution combines technology, process, and culture changes. It has led to increases in external interactions and satisfaction ratings, as well as reductions in time spent and expenses. Companies can save full-time employees and improve work-life balance using these virtual collaboration solutions.
The Briefing Room with Colin White and Composite Software
Live Webcast Feb. 26, 2013
The modern business analyst needs data from all over the place: yes, the data warehouse, but also the Web, big data, production systems, as well as via partners and vendors. In fact, the typical analyst spends more than 50% of the time chasing data, which slows delivery of analytic insights and limits the time available for thorough analysis. Some practitioners refer to this conundrum as "the data problem."
Check out the slides from this episode of The Briefing Room to hear veteran Analyst Colin White of BI Research as he explains why analytical sandboxes and data hubs can be an analyst's best friend. He'll be briefed by Bob Eve of Composite Software who will discuss his company's mature data virtualization platform, which includes a number of capabilities that help organizations leverage agile analytics. He will discuss why time-to-insight is fast becoming the battle cry of analysis-driven organizations.
Visit: http://www.insideanalysis.com
Similar to InfoSphere: Leading from the Front - Accelerating Data Integration through Metadata (20)
The document discusses trends driving changes in education systems towards a "Smarter Nation". Five key trends are identified: 1) technology immersion, 2) personalized learning, 3) knowledge/skills focus, 4) global integration, and 5) economic alignment. These trends form an "Educational Continuum" and have implications for integrating education providers and economic development initiatives to benefit the nation.
Paul Croft discusses four layers of cloud computing offerings from infrastructure-as-a-service to software-as-a-service. He outlines deployment options for cloud computing from private to public models. Croft asks when a cloud is not actually a cloud and says cloud is an opportunity beyond just consumption and delivery of services.
The document discusses barriers to public cloud adoption and options for using cloud computing. It finds that the primary barriers are concerns about data security and privacy. While private clouds are currently preferred over public clouds, those more open to public cloud see it as less of an issue and view application availability and management as important. The document recommends starting with a test/development public cloud to reduce costs and increase speed and flexibility compared to traditional testing environments.
Security solutions for a smarter planet - Vincent Kwon
This document summarizes IBM's security strategy and solutions for enabling a smarter planet. It discusses how security must be built into new technologies from the start to enable innovation while managing risks. IBM's approach focuses on foundational security controls, compliance, and helping customers securely adopt new models like cloud computing and virtualization.
The unprecedented state of web insecurity - Vincent Kwon
The document summarizes security trends from IBM's X-Force research and development team. It discusses the increasing sophistication of cyber attacks, vulnerabilities in web browsers and document readers, the rise of exploit kits and malware creation tools, and challenges in keeping pace with evolving threats through rapid patching and detection techniques.
Capitalising on Complexity - Ross Pearce - Vincent Kwon
A new IBM CEO study on capitalizing on complexity was conducted with over 1,500 CEOs. The results have implications for CIOs in how they can better support CEOs. The study found CEOs focus on creativity, customers, and operational dexterity to manage complexity. CIOs can help enable these priorities through embracing new technologies, simplifying processes, and providing business intelligence and analytics to support smarter decisions. CIOs also need to help reinvent customer relationships by using data and collaboration technologies to better understand customer needs.
VMWare Sponsor Presentation: Accelerating the journey to cloud - Vincent Kwon
Join VMware to find out how businesses of all sizes can benefit from moving beyond tactical consolidation of non-critical systems to a strategic investment in virtualisation and virtualisation management for all applications. This presentation covers what businesses can do now to pave the path to Cloud computing, leveraging the efficient pooling of on-demand, self-managed virtual infrastructure, consumed as a service.
Turn data into intelligence: Uncover insights. Take action - Vincent Kwon
Spreadsheets alone aren’t the answer to your reporting, analysis and planning problems. If you want to compete against big enterprises with big budgets, you need to use proper tools that extend the value of what you have and deliver a more insightful and accurate view of the business. With the right analytics solution, you can do just that. Cognos Express provides a complete reporting, analysis and planning solution for midsized organisations. This session will demonstrate how this solution can integrate into existing infrastructure and provide the dashboards, reports, forecasts and budgets your organisation needs. All of this with minimal implementation and at an affordable cost.
Keynote: intelligence, innovation & best practice - Vincent Kwon
1. The document discusses how organizations can drive growth and profitability through intelligence, innovation, and best practices. It provides examples of world-leading organizations that achieve high returns on shareholder funds and above-average growth through intellectual property, unique culture, and following best practices.
2. It argues that in today's business environment, organizations must plan from an "outside-in" perspective by understanding influential external factors like the world, national, economic, and industry environments. The intelligent organization also sources over a third of its business information externally to support this outside-in planning approach.
3. Innovation and productivity are imperative for success. It shows countries with the highest standards of living invest heavily in research and development and
The document discusses how virtualization can help organizations achieve maximum value through improved power efficiency, reliability, and more integrated systems. It notes that digital data is growing exponentially and many companies will need to modify their data centers to handle this growth. Virtualization can help organizations reduce costs, improve service delivery, and better manage risk by consolidating servers, storage, and networking infrastructure. When combined with integrated service management tools, virtualization provides improved visibility, control, and automation of IT resources.
The document summarizes findings from a global CFO study on the evolving role of finance. It finds that over 70% of CFOs see themselves in an advisory role, and around 60% believe major changes are needed in finance organizations to keep up with industry changes. It also highlights the benefits of achieving both finance efficiency through standards and providing business insight, finding the highest rewards come from excelling in both areas.
Drive business performance with information analytics - Vincent Kwon
The document discusses IBM Cognos Express, an integrated reporting, analysis, and planning solution designed for mid-sized companies. It highlights that mid-sized companies are important drivers of economic growth but often lack tools and resources for effective business intelligence. IBM Cognos Express is presented as an affordable and easy-to-use solution that mid-sized companies can use to gain insights from their data, align resources based on analysis, and capitalize on opportunities.
The document discusses managing and mitigating risk in businesses. It outlines an evolving risk landscape with new technologies, data growth, and regulatory compliance challenges. Different types of risks are described, from frequent low impact issues to infrequent high impact disasters. Key success factors for managing risk include lowering costs, ensuring compliance, protecting data and applications, and securing the data center. IBM is positioned as being able to help businesses fuel innovation, secure data, meet compliance, and secure their data centers from threats to ensure productivity and reputation.
The document discusses how companies are leveraging cloud computing solutions. It provides examples of how IBM has helped various organizations adopt cloud technologies to improve collaboration, access to resources, and IT efficiencies. Key benefits mentioned include reduced costs, improved flexibility, scalability and security. The document also outlines factors to consider when evaluating cloud solutions and ROI areas to investigate.
The document discusses cloud computing as a new IT delivery and consumption model inspired by consumer internet services. It is driven by virtualization, automation, and standardization which enable economies of scale, flexible pricing, and self-service. Adoption of cloud computing will be shaped by analyzing workload characteristics and risks to determine the best delivery models of public, private or hybrid cloud.
Accelerating the journey to cloud computing - Vincent Kwon
VMware provides solutions across four key areas - IT Operations, Applications, Business Continuity, and Desktop - to help customers improve IT quality of service. The solutions leverage VMware products and services to deliver measurable business value such as reduced costs, increased efficiency, and improved service levels. VMware's approach involves an evolutionary journey from IT Production to IT as a Service.
Wellington Business Keynote - Paul Callaghan - Vincent Kwon
The document discusses New Zealand's economy and culture. It notes that New Zealand's per capita GDP is lower than several other developed nations such as Australia, Ireland, and the USA. Closing the income gap with Australia would require an additional $30 billion in annual income for New Zealand. The document also examines New Zealand's exports, greenhouse gas emissions, tourism, and diaspora living abroad.
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
Essentials of Automations: The Art of Triggers and Actions in FME - Safe Software
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
Full-RAG: A modern architecture for hyper-personalization - Zilliz
Mike Del Balso, CEO & Co-Founder at Tecton, presents "Full RAG," a novel approach to AI recommendation systems, aiming to push beyond the limitations of traditional models through a deep integration of contextual insights and real-time data, leveraging the Retrieval-Augmented Generation architecture. This talk will outline Full RAG's potential to significantly enhance personalization, address engineering challenges such as data management and model training, and introduce data enrichment with reranking as a key solution. Attendees will gain crucial insights into the importance of hyperpersonalization in AI, the capabilities of Full RAG for advanced personalization, and strategies for managing complex data integrations for deploying cutting-edge AI solutions.
GraphRAG for Life Science to increase LLM accuracy - Tomaz Bratanic
GraphRAG for life science domain, where you retriever information from biomedical knowledge graphs using LLMs to increase the accuracy and performance of generated answers
Unlocking Productivity: Leveraging the Potential of Copilot in Microsoft 365, a presentation by Christoforos Vlachos, Senior Solutions Manager – Modern Workplace, Uni Systems
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024 - Neo4j
Neha Bajwa, Vice President of Product Marketing, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor... - Neo4j
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
Best 20 SEO Techniques To Improve Website Visibility In SERP - Pixlogix Infotech
Boost your website's visibility with proven SEO techniques! Our latest blog dives into essential strategies to enhance your online presence, increase traffic, and rank higher on search engines. From keyword optimization to quality content creation, learn how to make your site stand out in the crowded digital landscape. Discover actionable tips and expert insights to elevate your SEO game.
Programming Foundation Models with DSPy - Meetup Slides - Zilliz
Prompting language models is hard, while programming language models is easy. In this talk, I will discuss the state-of-the-art framework DSPy for programming foundation models with its powerful optimizers and runtime constraint system.
Infrastructure Challenges in Scaling RAG with Custom AI models - Zilliz
Building Retrieval-Augmented Generation (RAG) systems with open-source and custom AI models is a complex task. This talk explores the challenges in productionizing RAG systems, including retrieval performance, response synthesis, and evaluation. We’ll discuss how to leverage open-source models like text embeddings, language models, and custom fine-tuned models to enhance RAG performance. Additionally, we’ll cover how BentoML can help orchestrate and scale these AI components efficiently, ensuring seamless deployment and management of RAG systems in the cloud.
“An Outlook of the Ongoing and Future Relationship between Blockchain Technologies and Process-aware Information Systems.” Invited talk at the joint workshop on Blockchain for Information Systems (BC4IS) and Blockchain for Trusted Data Sharing (B4TDS), co-located with with the 36th International Conference on Advanced Information Systems Engineering (CAiSE), 3 June 2024, Limassol, Cyprus.
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor... - SOFTTECHHUB
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
Building Production Ready Search Pipelines with Spark and Milvus - Zilliz
Spark is the widely used ETL tool for processing, indexing and ingesting data to serving stack for search. Milvus is the production-ready open-source vector database. In this talk we will show how to use Spark to process unstructured data to extract vector representations, and push the vectors to Milvus vector database for search serving.
Climate Impact of Software Testing at Nordic Testing Days - Kari Kakkonen
My slides at Nordic Testing Days 6.6.2024
Climate impact / sustainability of software testing is discussed in the talk. ICT and testing must carry their part of the global responsibility to help with climate warming. We can minimize the carbon footprint, but we can also have a carbon handprint, a positive impact on the climate. Quality characteristics can be extended with sustainability and then measured continuously. Test environments can be used less, at a smaller scale, and on demand. Test techniques can be used to optimize or minimize the number of tests. Test automation can be used to speed up testing.
UiPath Test Automation using UiPath Test Suite series, part 5 - DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series, part 5. In this session, we will cover CI/CD with DevOps.
Topics covered:
CI/CD within UiPath
End-to-end overview of the CI/CD pipeline with Azure DevOps
Speaker:
Lyndsey Byblow, Test Suite Sales Engineer @ UiPath, Inc.
TrustArc Webinar - 2024 Global Privacy Survey - TrustArc
How does your privacy program stack up against your peers? What challenges are privacy teams tackling and prioritizing in 2024?
In the fifth annual Global Privacy Benchmarks Survey, we asked over 1,800 global privacy professionals and business executives to share their perspectives on the current state of privacy inside and outside of their organizations. This year’s report focused on emerging areas of importance for privacy and compliance professionals, including considerations and implications of Artificial Intelligence (AI) technologies, building brand trust, and different approaches for achieving higher privacy competence scores.
See how organizational priorities and strategic approaches to data security and privacy are evolving around the globe.
This webinar will review:
- The top 10 privacy insights from the fifth annual Global Privacy Benchmarks Survey
- The top challenges for privacy leaders, practitioners, and organizations in 2024
- Key themes to consider in developing and maintaining your privacy program
InfoSphere: Leading from the Front - Accelerating Data Integration through Metadata
1. Leading from the Front
Accelerating Data Integration through Metadata
Scott Abbott
Certified IT Architect, InfoSphere Software
IBM Insight Forum 09 Make change work for you
®
2. Context
3. Are you constantly disappointed by your Data Integration projects?
4. Often it’s because we rush in without thinking what we are doing
5. Typical Data Integration Project
[Diagram: legacy sources (1) feed reference data and master data (2) through a data integration layer (3) into the warehouse and datamarts, which serve OLAP and reports (4). Each stage carries a familiar rationalization: "The custom data model", "of course our data is good", "we’ll work it out in the testing", and "if we build it they will come".]
6. The InfoSphere Software Evolution
[Diagram: acquired products mapped to capabilities —
• DataMirror: Change Data Capture
• LAS: Global Name Enrichment
• DWL: Operational Master Data Management
• Unicorn: Metadata Management
• Ascential: Transformation, Cleansing, Profiling and metadata integration
• SRD: Entity Resolution and Analysis
• Trigo: Product Information Management]
8. Typical Data Integration Project
[Diagram: the same architecture as slide 5 — legacy sources, reference data, master data, data integration, warehouse, datamarts, OLAP and reports — now underpinned by a shared metadata layer.]
9. Pitfall #1: "The Custom Model"
10. DI Pitfall #1: "The custom data model" (WAREHOUSE, step 1)
Common claims: "who knows our industry better than us"; "it will only take a couple of months"
NZ Customer Experience:
• Project duration 24-36 months
• Model never fully deployed
• Complex ETL feeds destabilized the entire BI system
• Users bypass the warehouse to get the information they require
11. DI Pitfall #1 Accelerator
• 80:20 rule (20% customization)
• Months not years
• Fully attributed data models across six industries
• Complete business templates for industry KPIs
• Key accelerators for migration & integration projects
• Act as acceleration templates within Information Server & Cognos 8 BI
12. Typical Data Integration Project
[Diagram: the architecture from slide 5, with industry models feeding the warehouse design and the target state captured in the metadata layer.]
13. Pitfall #2: "if we build it they will come…"
14. DI Pitfall #2: "if we build it they will come" (REPORTS / OLAP, step 4)
Common claims: "it is what the business asked for"; "the users will understand the new system"
NZ Customer Experience:
• Multiple examples of BI solutions not meeting initial business drivers
• Users perceive new BI initiatives as burdens rather than assets
15. Missing the Point: Corporate Chinese Whispers
[Diagram: a requirement mutates as it passes from Business Users through Subject Matter Experts, Architects, Data Analysts, Developers, and DBAs, starting as "Identify High Value Customers to support Call Centre & Web Personalization" and ending as "Monthly Report on Customer Revenue breakdown".]
16. Bridging the Gap: relating the new to the old
[Diagram: the same concept appears as "item", "component", and "part" in different systems, with question marks between them.]
32. Understanding Your Data: InfoSphere Business Glossary
• Captures business taxonomies
• Captures and defines a shared, searchable business glossary
• Assigns stewardship to key business terms
• Links business terms to technical assets
33. InfoSphere Business Glossary
Web-based authoring, managing and sharing of business metadata, used by Subject Matter Experts and Business Users to create and manage business vocabulary and relationships while linking them to physical sources.
• Aligns the efforts of IT with the goals of the business
• Provides business context to information technology assets
• Establishes responsibility and accountability
Example: the technical view (Database = DB2, Schema = NAACCT, Table = DLYTRANS, Column = ACCT_NO, data type = char(11)) maps to the business view: "GL Account Number: the ten digit account number, sometimes referred to as the account ID. This value is of the form L-FIIIIVVVV."
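The linkage the glossary maintains can be sketched in a few lines. This is an illustrative toy only, not the Business Glossary API: the `Term` and `Glossary` classes, the steward address, and the sample data are all invented for the example, using the DB2/NAACCT/DLYTRANS/ACCT_NO case from the slide.

```python
from dataclasses import dataclass, field

@dataclass
class Term:
    """A business term: definition, accountable steward, linked assets."""
    name: str
    definition: str
    steward: str                                 # accountable business owner
    assets: list = field(default_factory=list)   # linked technical assets

class Glossary:
    def __init__(self):
        self._terms = {}

    def add(self, term: Term):
        self._terms[term.name.lower()] = term

    def lookup(self, name: str) -> Term:
        return self._terms[name.lower()]

    def search(self, keyword: str):
        # naive keyword match over names and definitions
        kw = keyword.lower()
        return [t for t in self._terms.values()
                if kw in t.name.lower() or kw in t.definition.lower()]

glossary = Glossary()
glossary.add(Term(
    name="GL Account Number",
    definition="The ten digit account number, sometimes referred to as "
               "the account ID. This value is of the form L-FIIIIVVVV.",
    steward="finance.data.steward@example.com",   # hypothetical steward
    assets=["DB2/NAACCT/DLYTRANS/ACCT_NO (char(11))"],
))

term = glossary.lookup("gl account number")
print(term.steward)     # who is accountable for this term
print(term.assets[0])   # where the data physically lives
```

The point of the structure is the last two lines: from a business term a user reaches both the person accountable for it and the physical column it describes.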
34. Business Glossary Anywhere
Real-time access to the business glossary from any desktop application, for any user.
Features:
• From any desktop application, click on a term and view its business definition in a pop-up window, without any loss of context or focus ("pop the definition")
• Intelligent matching returns the best candidates in a single search
• Search engine for terms and categories
• Direct access to steward contact information
• Security enforced via the Information Server common security layer
Benefits:
• Increased trust and acceptance of information by delivering definitions in context
• Expanded adoption of the enterprise glossary outside of Information Platform technologies
• Improved information availability with multiple access mechanisms for electronically stored information (ESI)
35. Typical Data Integration Project
[Diagram: the architecture again, with the metadata layer now holding glossary terms, data steward assignments, and the target state, so that data is both correct and understood.]
36. Pitfall #3: data quality
37. DI Pitfall #3: "of course our data is good" (LEGACY SOURCES, step 2)
Common claims: "the business owner says the information we need is in there"; "the schemas show they have the same keys"
NZ Customer Experience:
• ETL Proof of Concept: client assured data quality was sufficient, so data cleansing was excluded from scope
• At the end of the 2-week pilot, the project was halted due to unsolvable data quality issues
• Many 15-20 year old systems are still in operation in the NZ market
60. InfoSphere Information Analyzer
Data-centric analysis of application, database and file-based sources, used by Subject Matter Experts and Data Analysts (physical view).
• Secure, detailed profiling of fields, across fields, and across sources
• Analyse source data structures, and monitor adherence to integration and quality rules
• Creation of metadata from profiling results
• Results instantly promotable across IBM InfoSphere Information Server
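A minimal sketch of what column profiling computes makes the "of course our data is good" pitfall concrete: null rate, cardinality, and inferred value patterns. The pattern notation (9 for a digit, A for a letter) is a common profiling idiom; the code and sample data are illustrative, not the Information Analyzer algorithm.

```python
import re
from collections import Counter

def value_pattern(value: str) -> str:
    """Generalize a value: digits become 9, letters become A, rest kept."""
    return re.sub(r"[A-Za-z]", "A", re.sub(r"\d", "9", value))

def profile_column(values):
    """Basic single-column profile: nulls, distincts, dominant formats."""
    non_null = [v for v in values if v not in (None, "", "NULL")]
    patterns = Counter(value_pattern(v) for v in non_null)
    return {
        "count": len(values),
        "null_rate": 1 - len(non_null) / len(values),
        "distinct": len(set(non_null)),
        "patterns": patterns.most_common(3),   # dominant formats first
    }

# Hypothetical legacy account-number column: profiling immediately
# surfaces the mixed formats and nulls the schema alone never shows.
acct_no = ["L-12340001", "L-12340002", "12340003", None, "L-12340001"]
report = profile_column(acct_no)
print(report["patterns"])
```

Two competing patterns for a supposedly uniform key column is exactly the kind of finding that, caught at profiling time rather than mid-pilot, keeps cleansing in scope.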
61. Typical Data Integration Project
[Diagram: the metadata layer now also captures source state and ETL hints from profiling, alongside terms, data stewards, and the target state.]
62. Pitfall #4: Iterative Development
63. DI Pitfall #4: "we’ll work it out in the testing" (DATA INTEGRATION, step 3)
NZ Customer Experience:
• ETL development >75% of total project cost
• Projects taking 2-3x longer than planned
• Some clients spending 70+% of development time on impact analysis
• Impact analysis methods very basic
• Largely iterative development method
• Unreliable forecast completion dates
• Low levels of trust by the business in IT's ability to achieve BI outcomes
• Substantial cost overruns
• Expensive BI maintenance costs
64. How do I Find Out…
A Data Analyst asks: "Where does the data for this report come from?"
…where this data comes from?
…when the job last ran?
…the details for these assets?
65. Pitfall #4: Development (Impact Analysis)
87. What is the InfoSphere Metadata Workbench?
Web-based exploration of information assets generated and used by Information Server applications
(Audience: Developers, Data Integration Managers)
• Out-of-the-box reporting on data movement, data lineage, business meaning, impact of changes, and dependencies
• Tracing the data lineage of Business Intelligence reports to provide a basis for compliance with legislation such as Sarbanes-Oxley and Basel II
InfoSphere Metadata Workbench® provides IT professionals with a tool for exploring and understanding the assets generated and used by the Information Server suite.
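At heart, the lineage and impact-of-change reports described above are traversals of a metadata dependency graph: lineage walks upstream from a report to its sources, impact analysis walks downstream from a changed source to everything it feeds. A minimal sketch of that idea, with an invented graph and asset names (this is not the Metadata Workbench API):

```python
# Illustrative sketch: lineage and impact analysis as graph traversal
# over a metadata store. Asset names and graph shape are invented.
from collections import deque

# edges: producer -> consumers (data flows left to right)
FLOWS = {
    "legacy.orders": ["etl.load_orders"],
    "etl.load_orders": ["dw.fact_orders"],
    "dw.fact_orders": ["mart.sales", "report.revenue"],
    "mart.sales": ["report.regional_sales"],
}

def downstream_impact(asset):
    """Impact analysis: every asset reachable from a changed source."""
    seen, queue = set(), deque([asset])
    while queue:
        for nxt in FLOWS.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def upstream_lineage(asset):
    """Lineage: every asset that feeds a given report."""
    reverse = {}
    for src, dsts in FLOWS.items():
        for d in dsts:
            reverse.setdefault(d, []).append(src)
    seen, queue = set(), deque([asset])
    while queue:
        for prev in reverse.get(queue.popleft(), []):
            if prev not in seen:
                seen.add(prev)
                queue.append(prev)
    return seen

print(downstream_impact("legacy.orders"))        # what a schema change touches
print(upstream_lineage("report.regional_sales"))  # where the report's data comes from
```

Automating this traversal over shared metadata is exactly what replaces the "70+% of dev. time doing impact analysis" by hand that the NZ customer experience slide describes.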
88. Typical Data Integration Project
[Diagram: the same project flow as slide 61 — legacy sources, reference data and master data (1) through data integration (2, 3) into the warehouse and datamarts (4), serving OLAP and reports — with impact analysis now driven from the shared metadata layer (terms, ETL hints, source/target state) alongside the data steward.]
89. Pitfall #4: Development (Iterative cycles)
90. Typical Data Integration Project
[Diagram: the same project flow, now with requirements, impact analysis, and ETL code generation all driven from the shared metadata layer (terms, ETL hints, source/target state) alongside the data steward.]
91. InfoSphere FastTrack
To reduce costs of integration projects through automation
• Business analysts and IT collaborate in context to create the project specification
• Leverages source analysis, target models, and metadata to facilitate the mapping process
• Auto-generation of data transformation jobs and reports: auto-generates DataStage jobs, with flexible reporting
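The idea behind the mapping-to-job automation above is specification-driven code generation: an analyst's source-to-target mapping becomes executable transformation logic without hand coding. A toy sketch of that idea — the spec format, rule names, and generated function are invented for illustration; real FastTrack output would be DataStage jobs, not Python:

```python
# Toy illustration of specification-driven generation of a transform.
# Spec format and rule names are invented for this sketch.
MAPPING_SPEC = [
    # (source column, target column, transformation rule)
    ("CUST_NM", "customer_name", "strip_upper"),
    ("ORD_AMT", "order_amount",  "to_float"),
    ("ORD_DT",  "order_date",    "copy"),
]

RULES = {
    "strip_upper": lambda v: str(v).strip().upper(),
    "to_float":    lambda v: float(v),
    "copy":        lambda v: v,
}

def generate_transform(spec):
    """'Compile' the mapping spec into a row-transformation function."""
    def transform(row):
        return {tgt: RULES[rule](row[src]) for src, tgt, rule in spec}
    return transform

job = generate_transform(MAPPING_SPEC)
print(job({"CUST_NM": "  smith ", "ORD_AMT": "19.95", "ORD_DT": "2009-06-01"}))
# → {'customer_name': 'SMITH', 'order_amount': 19.95, 'order_date': '2009-06-01'}
```

Because the spec, not the code, is the artifact the analyst maintains, a changed mapping regenerates the job rather than triggering another hand-coded iteration.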
92. Typical Data Integration Project
[Diagram: repeat of slide 90 — the full metadata-driven flow with requirements, impact analysis, and ETL code generation.]
93. Information Server: Optimizing Application Development
94. IBM InfoSphere Information Server
Delivering information you can trust
Information Server suite:
• InfoSphere Information Services Director
• InfoSphere Information Analyzer
• InfoSphere Business Glossary
• InfoSphere Federation Server
• InfoSphere QualityStage
• InfoSphere DataStage
• InfoSphere Data Architect
• InfoSphere Replication Server / EVP
• InfoSphere FastTrack
• InfoSphere Change Data Capture
• InfoSphere Metadata Server
• InfoSphere Metadata Workbench
95. Bringing It All Together
(Roles across the suite: Business Users, Subject Matter Experts, Architects, Data Analysts, Developers, DBAs)
Information Server – Common Framework
• Simplify integration
• Increase trust and confidence in information
• Facilitate change
• Increase compliance to standards
• Design, operational management & reuse
96. Leading from the Front
Greater preparation will yield dramatically lower project costs/times.
Typical Work Effort for Migration Activities: 15-30% of total project budget will be spent on migration activities.

Discover (30%) — Understanding source data
Largely manual effort on a small percentage of data; some manual coding can review all data.
Split: 50% Business / 50% IT

Prepare (40%) — Cleaning, standardising, harmonizing, management
This effort is the most unpredictable: the work can vary greatly depending on the condition of the data, but it is always the largest piece of work in the data initiative. Largely manual effort on 100% of the data, which can mean dozens of people cleaning source systems manually to correct and augment data, and manually aligning records to MRD. Some manual coding can reduce the effort.
Split: 75% Business / 25% IT

Deliver (30%) — Conversion, loading, interfaces, connectivity
Coding transformations and loads. Traditionally this effort is plagued with problems related to data quality, and it can easily be pulled by necessity into the cleaning, standardising and harmonising area, causing timing and budget problems.
Split: 25% Business / 75% IT
97. Thank you
Questions?