The blurring of the line between decision support systems and operational systems because of real-time warehousing, the use of Enterprise Information Integration (EII), and closed-loop business processes
Download at http://DavidHubbard.net/powerpoint - This Introduction to Business Intelligence gives an overview of how Business Intelligence fits into business strategy in general. It does not go into the specific technologies of Business Intelligence; it is meant to explain Business Intelligence to those not already familiar with it.
Business Intelligence made easy! This is the first part of a two-part presentation I prepared for one of our customers to help them understand what Business Intelligence is and what it can do...
Creating a clearly articulated data strategy—a roadmap of technology-driven capability investments prioritized to deliver value—helps ensure from the get-go that you are focusing on the right things, so that your work with data has a business impact. In this presentation, the experts at Silicon Valley Data Science share their approach for crafting an actionable and flexible data strategy to maximize business value.
You Need a Data Catalog. Do You Know Why? – Precisely
The data catalog has become a popular discussion topic within data management and data governance circles. A data catalog is a central repository that contains metadata describing data sets: how they are defined and where to find them. TDWI research indicates that implementing a data catalog is a top priority among the organizations we survey. The data catalog can also play an important part in the governance process, providing features that help ensure data quality and compliance and that trusted data is used for analysis. Without in-depth knowledge of data and its associated metadata, organizations cannot truly safeguard and govern their data.
Join this on-demand webinar to learn more about the data catalog and its role in data governance efforts.
Topics include:
· Data management challenges and priorities
· The modern data catalog – what it is and why it is important
· The role of the modern data catalog in your data quality and governance programs
· The kinds of information that should be in your data catalog and why
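The catalog idea described above, a central record of what a data set is, how it is defined, and where to find it, can be sketched in a few lines of Python. The class and field names here are illustrative assumptions, not any vendor's schema:

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """One data-set record in a minimal catalog: what it is,
    how it is defined, and where to find it."""
    name: str
    definition: str
    location: str                       # e.g. a table name or URI
    owner: str = "unassigned"           # stewardship hook for governance
    tags: list = field(default_factory=list)

# The catalog itself is just a searchable index keyed by data-set name.
catalog = {}

def register(entry: CatalogEntry) -> None:
    catalog[entry.name] = entry

register(CatalogEntry(
    name="customer_orders",
    definition="One row per confirmed order, keyed by order_id",
    location="warehouse.sales.orders",
    tags=["sales", "pii-free"],
))

print(catalog["customer_orders"].location)
```

Even this toy version shows why the catalog supports governance: the `owner` and `tags` fields give stewardship and compliance processes something concrete to attach to.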
Building a Data Strategy – Practical Steps for Aligning with Business Goals – DATAVERSITY
Developing a Data Strategy for your organization can seem like a daunting task – but it’s worth the effort. Getting your Data Strategy right can provide significant value, as data drives many of the key initiatives in today’s marketplace, from digital transformation to marketing, customer centricity, population health, and more. This webinar will help demystify Data Strategy and its relationship to Data Architecture and will provide concrete, practical ways to get started.
The Institution's Innovation Council (Ministry of HRD initiative) and the Institution of Electronics and Telecommunication Engineers (IETE) invited me to grace "World Telecommunication & Information Society Day" on 18 May 2020.
The enterprise marketer's playbook: Building an integrated data strategy.
An integrated data strategy can help any business see customer journeys more clearly ― and then give customers more relevant ads and experiences that get results. So why doesn't everyone have such a strategy? We look at what sets the marketing leaders apart.
Let marketing data be your guide
If you've ever felt too swamped by data to find the customer insights you need, you're not alone. But there's a new and better approach to gaining deeper audience insights: building an integrated data strategy.
Read this report to learn more. Key findings include:
86% of senior executives agree that eliminating organizational silos is critical to expanding the use of data and analytics in decision-making.
75% of marketers agree that lack of education and training on data and analytics is the biggest barrier to more business decisions being made based on data insights.
Leading marketers are 59% more likely to use digital analytics to optimize the user experience in real time.
Data Architecture Strategies: Data Architecture for Digital Transformation – DATAVERSITY
Digital transformation requires a number of core data management capabilities, such as MDM, data quality, and data architecture. At the same time, combining these foundational data management approaches with other innovative techniques can help drive organizational change as well as technological transformation. This webinar will provide practical steps for creating a data foundation for effective digital transformation.
Linking Data Governance to Business Goals – Precisely
The importance of data to businesses has increased exponentially in recent years as companies seek benefits such as gains in efficiency, the ability to respond to growing privacy regulations, the ability to scale quickly, and increased customer loyalty.
Despite being a vital part of any Data Transformation, Data Governance has sometimes been misrepresented as a restrictive and controlling process, leaving governance leaders having to continually make the case for business buy-in.
In this on-demand webinar we will explore the concept of business-first Data Governance, an approach that promotes adoption by the organisation, lays the foundation for data integrity and consistently delivers business value in the long term.
BI is the “gathering of data from multiple sources to present it in a way that allows executives to make better business decisions”. I will describe in more detail exactly what BI is, what the Microsoft BI stack encompasses, why it is so popular, and why a BI career pays so much. I will review specific examples from previous projects of mine that show the benefits of BI and its huge return on investment. I'll go into detail on the components of a BI solution, and I will discuss key concepts for successfully implementing BI in your organization.
Presentation about digital transformation driven by big data, and how to navigate from data to insight to action. Presentation given by Hamzah Amin, a Senior Data Scientist & Analytics Consultant at Jordan Business Systems. He is also a Master's student in Data Science at Princess Sumaya University for Technology.
By Hamzah Amin at the JOSA Data Science Meetup on 14/9/2019.
Emerging Trends in Data Architecture – What’s the Next Big Thing? – DATAVERSITY
Digital Transformation is a top priority for many organizations, and a successful digital journey requires a strong data foundation. With technological innovation and change occurring at an ever-increasing rate, it’s hard to keep track of what’s hype and what can provide practical value for your organization. Join this webinar to see the results of a recent DATAVERSITY survey on emerging trends in Data Architecture, along with practical commentary and advice from industry expert Donna Burbank.
You Need a Data Catalog. Do You Know Why? – Precisely
The data catalog has become a popular discussion topic within data management and data governance circles. “What is it?” and “Do I need one?” are two common questions, along with “How does a catalog relate to and support the data governance program?”
The data catalog plays a key role in the governance process: how well information can be managed, aligned with business objectives, and monetized depends in great part on what you know about your data.
In this webinar you will learn about:
- The role of the data catalog
- What kinds of information should be in your data catalog
- The catalog items that can be harvested automatically by the system versus those that require stewardship involvement
- The role of the catalog in your data quality program
We hope you’ll join this on-demand webinar and learn how a data catalog should be part of your governance and data quality program!
Strategic Business Requirements for Master Data Management Systems – Boris Otto
This presentation describes strategic business requirements of master data management (MDM) systems. The requirements were developed in a consortium research approach by the Institute of Information Management at the University of St. Gallen, Switzerland, and 20 multinational enterprises.
The presentation was given at the 17th Americas Conference on Information Systems (AMCIS 2011) in Detroit, MI.
The research paper on which this presentation is based can be found here: http://www.alexandria.unisg.ch/Publikationen/Zitation/Boris_Otto/177697
Metadata is hotter than ever, according to a number of recent DATAVERSITY surveys. More and more organizations are realizing that in order to drive business value from data, robust metadata is needed to gain the necessary context and lineage around key data assets. At the same time, industry regulations are driving the need for better transparency and understanding of information.
While metadata has been managed for decades, new strategies & approaches have been developed to support the ever-evolving data landscape, and provide more innovative ways to drive business value from metadata. This webinar will provide an overview of metadata strategies & technologies available to today’s organization, and provide insights into building successful business strategies for metadata adoption & use.
This introduction to data governance presentation covers the interrelated DM foundational disciplines (Data Integration / DWH, Business Intelligence, and Data Governance), as well as some of the pitfalls and success factors for data governance. Topics include:
• IM Foundational Disciplines
• Cross-functional Workflow Exchange
• Key Objectives of the Data Governance Framework
• Components of a Data Governance Framework
• Key Roles in Data Governance
• Data Governance Committee (DGC)
• 4 Data Governance Policy Areas
• 3 Challenges to Implementing Data Governance
• Data Governance Success Factors
White Paper - The Business Case For Business Intelligence – David Walker
This white paper looks at the business case that should lie behind the decision to build a data warehouse and provide a business intelligence solution.
There are three primary drivers for making the investment in a business intelligence solution:
1. Measurement and management of the business process
2. Analysis of why things change in the business in order to react better in the future
3. Providing information for stakeholders
As a consequence of the investment there will also be a number of secondary benefits that help to justify it, and these are also discussed. Finally, there are a number of ‘anti-drivers’ – reasons for not embarking on a business intelligence programme.
Presentation on Data Mesh: a paradigm shift to a new type of ecosystem architecture, a shift left towards a modern distributed architecture that decentralizes ownership to domain-specific data and views “data as a product,” enabling each domain to handle its own data pipelines.
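As a rough illustration of the “data as a product” idea, each domain might expose its data behind a small published interface it owns, rather than feeding raw data into a central team's pipeline. The class names and the schema below are hypothetical, not from the presentation:

```python
# Sketch of a domain-owned "data product": the domain team publishes
# a contract (schema) and a read interface, and keeps the pipeline
# that produces the data entirely inside its own boundary.
class DataProduct:
    def __init__(self, domain, name, schema):
        self.domain = domain      # owning domain team
        self.name = name
        self.schema = schema      # published contract for consumers

    def read(self):
        # In a real system this would query the domain's own store.
        raise NotImplementedError

class OrdersProduct(DataProduct):
    """The sales domain's product; its internal storage is private."""
    def __init__(self):
        super().__init__("sales", "orders",
                         {"order_id": int, "total": float})
        self._rows = [{"order_id": 1, "total": 9.99}]

    def read(self):
        # Consumers get data through the contract, never the raw store.
        return list(self._rows)

orders = OrdersProduct()
print(orders.read())
```

The design point is the boundary: consumers depend only on `schema` and `read()`, so the sales domain can change its internal pipeline freely.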
Putting the Ops in DataOps: Orchestrate the Flow of Data Across Data Pipelines – DATAVERSITY
With the aid of any number of data management and processing tools, data flows through multiple on-prem and cloud storage locations before it’s delivered to business users. As a result, IT teams — including IT Ops, DataOps, and DevOps — are often overwhelmed by the complexity of creating a reliable data pipeline that includes the automation and observability they require.
The answer to this widespread problem is a centralized data pipeline orchestration solution.
Join Stonebranch’s Scott Davis, Global Vice President, and Ravi Murugesan, Sr. Solution Engineer, to learn how DataOps teams orchestrate their end-to-end data pipelines with a platform approach to managing automation.
Key Learnings:
- Discover how to orchestrate data pipelines across a hybrid IT environment (on-prem and cloud)
- Find out how DataOps teams are empowered with event-based triggers for real-time data flow
- See examples of reports, dashboards, and proactive alerts designed to help you reliably keep data flowing through your business — with the observability you require
- Discover how to replace clunky legacy approaches to streaming data in a multi-cloud environment
- See what’s possible with the Stonebranch Universal Automation Center (UAC)
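The event-based triggering mentioned in the key learnings above can be illustrated with a toy registry: a named event (say, a file arriving) fires the downstream pipeline step automatically instead of waiting for a fixed schedule. This is a generic sketch, not Stonebranch's actual API:

```python
# Minimal event-driven trigger registry: handlers subscribe to an
# event name, and emitting that event runs every subscribed step.
handlers = {}

def on_event(name):
    """Decorator that registers a pipeline step for an event."""
    def wrap(fn):
        handlers.setdefault(name, []).append(fn)
        return fn
    return wrap

def emit(name, payload):
    """Fire an event and run all subscribed steps with the payload."""
    return [fn(payload) for fn in handlers.get(name, [])]

@on_event("file_arrived")
def load_to_warehouse(path):
    # Placeholder for the real load step.
    return f"loaded {path}"

print(emit("file_arrived", "s3://landing/orders.csv"))
```

Real orchestrators add retries, dependencies, and observability on top, but the trigger model is the same: data flows as soon as the event fires.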
This presentation was part of the IDS Webinar on Data Governance. It gives a brief overview of the history on Data Governance, describes how governing data has to be further developed in the era of business and data ecosystems, and outlines the contribution of the International Data Spaces Association on the topic.
Introduction to Modern Data Virtualization 2021 (APAC) – Denodo
Watch full webinar here: https://bit.ly/2XXyc3R
“Through 2022, 60% of all organisations will implement data virtualization as one key delivery style in their data integration architecture,” according to Gartner. What is data virtualization, and why is its adoption growing so quickly? Modern data virtualization accelerates time to insights and data services without copying or moving data.
Watch this on-demand webinar to learn:
- Why organizations across the world are adopting data virtualization
- What is modern data virtualization
- How data virtualization works and how it compares to alternative approaches to data integration and management
- How modern data virtualization can significantly increase agility while reducing costs
Data Virtualization, a Strategic IT Investment to Build Modern Enterprise Dat... – Denodo
This content was presented during the Smart Data Summit Dubai 2015 in the UAE on May 25, 2015, by Jesus Barrasa, Senior Solutions Architect at Denodo Technologies.
In the era of Big Data, IoT, Cloud and Social Media, Information Architects are forced to rethink how to tackle data management and integration in the enterprise. Traditional approaches based on data replication and rigid information models lack the flexibility to deal with this new hybrid reality. New data sources and an increasing variety of consuming applications, like mobile apps and SaaS, add more complexity to the problem of delivering the right data, in the right format, and at the right time to the business. Data Virtualization emerges in this new scenario as the key enabler of agile, maintainable and future-proof data architectures.
Data and Application Modernization in the Age of the Cloud – redmondpulver
Data modernization is key to unlocking the full potential of your IT investments, both on premises and in the cloud. Enterprises and organizations of all sizes rely on their data to power advanced analytics, machine learning, and artificial intelligence.
Yet the path to modernizing legacy data systems for the cloud is full of pitfalls that cost time, money, and resources. These issues include high hardware and staffing costs, difficulty moving data and analytical processes to cloud environments, and inadequate support for real-time use cases. Together, they delay delivery timelines and increase costs, impacting the return on investment for new, cutting-edge applications.
Watch this webinar in which James Kobielus, TDWI senior research director for data management, explores how enterprises are modernizing their mainframe data and application infrastructures in the cloud to sustain innovation and drive efficiencies. Kobielus will engage John de Saint Phalle, senior product manager at Precisely, in a discussion that addresses the following key questions:
- When should enterprises consider migrating and replicating all their data assets to modern public clouds vs. retaining some on-premises in hybrid deployments?
- How should enterprises modernize their legacy data and application infrastructures to unlock innovation and value in the age of cloud computing?
- What are the key investments that enterprises should make to modernize their data pipelines to deliver better AI/ML applications in the cloud?
- What is the optimal data engineering workflow for building, testing, and operationalizing high-quality modern AI/ML applications in the cloud?
- What role does real-time replication play in migrating data and applications to modern cloud data architectures?
- What challenges do enterprises face in ensuring and maintaining the integrity, fitness, and quality of the data that they migrate to modern clouds?
- What tools and methodologies should enterprise application developers use to refactor and transform legacy data applications that have migrated to modern clouds?
A Logical Architecture is Always a Flexible Architecture (ASEAN) – Denodo
Watch full webinar here: https://bit.ly/3joZa0a
The current data landscape is fragmented, not just in location but also in terms of processing paradigms: data lakes, IoT architectures, NoSQL and graph data stores, SaaS applications, etc. coexist with relational databases to fuel the needs of modern analytics, ML, and AI. The physical consolidation of enterprise data into a central repository, although possible, is both expensive and time-consuming. A logical data warehouse is a modern data architecture that allows organizations to leverage all of their data irrespective of where the data is stored, what format it is stored in, and what technologies or protocols are used to store and access it.
Watch this session to understand:
- What a logical data warehouse is and how to architect one
- The benefits of a logical data warehouse – speed with agility
- A customer use case depicting a logical architecture implementation
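The core idea behind the logical (virtualized) layer, combining live sources at query time without copying data into a central repository, can be sketched as follows. The two in-memory "sources" here are stand-ins for real systems (say, a SaaS CRM and a warehouse table), and all names are illustrative:

```python
# Two independent "sources"; in reality these would be live systems
# queried over their own protocols, not Python structures.
crm = {"c1": {"name": "Acme"}}                 # pretend SaaS CRM
orders = [{"cust": "c1", "total": 120.0}]      # pretend warehouse table

def customer_spend():
    """A virtual 'view': joins the two sources on demand.

    Nothing is replicated; each call reads whatever the sources
    currently hold, which is what gives the logical layer its agility.
    """
    spend = {}
    for o in orders:
        spend[o["cust"]] = spend.get(o["cust"], 0.0) + o["total"]
    return {crm[c]["name"]: v for c, v in spend.items() if c in crm}

print(customer_spend())
```

A real data virtualization platform adds query pushdown, caching, and security on top, but the contract is the same: consumers see one integrated view while the data stays where it lives.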
AMD Putting Server Virtualization to Work – James Price
Enterprises have been using virtualization technology on mainframes and RISC-based systems for years to enable better utilization of hardware resources. As x86 servers have become a mainstay in the enterprise, more companies are exploring virtualization with these servers to enable more productive, flexible, and scalable datacenters while reducing costs and boosting data availability. Computing technologies from AMD are providing the foundation for today's and tomorrow's enterprise virtualization solutions.
KASHTECH AND DENODO: ROI and Economic Value of Data Virtualization – Denodo
Watch full webinar here: https://bit.ly/3sumuL5
Join KashTech and Denodo to discover how Data Virtualization can help accelerate your time-to-value from data while reducing the costs at the same time.
Gartner has predicted that organizations using Data Virtualization will spend 40% less on data integration than those using traditional technologies. Denodo customers have experienced time-to-deliver improvements of up to 90% within their data provisioning processes and cost savings of 50% or more. As Rod Tidwell (Cuba Gooding Jr.) said in the movie 'Jerry Maguire', "Show me the money!"
Register to attend and learn how Data Virtualization can:
- Accelerate the delivery of data to users
- Drive digital transformation initiatives
- Reduce project costs and timelines
- Quickly deliver value to your organization
Data Virtualization: The Agile Delivery Platform – Denodo
Watch full webinar here: https://goo.gl/2wNBhg
To grow or compete in today's fast-paced business environment, you need a robust, agile, and cost-effective data-driven decision strategy.
However, many companies are struggling with the growing complexity of data integration projects as they try to manage the increasing volumes and types of data from traditional enterprise sources as well as new sources such as big data, machine data, social media or cloud sources.
Data virtualization is the technology to simplify and reduce the costs of your data integration projects.
Watch this webinar in which we explore:
• How data virtualization lets you provide the business with the information it needs to make better decisions faster.
• How you can connect and combine all your data in real-time, without compromising on scalability, security or governance.
Modernize your Infrastructure and Mobilize Your Data – Precisely
Modernizing your infrastructure can get complicated really fast. The keys to success involve breaking down data silos and moving data to the cloud in real time. But building data pipelines to mobilize your data in the cloud can be time consuming. You need solutions that decrease bandwidth, ensure data consistency, and enable data migration and replication in real-time; solutions that help you build data pipelines in hours, not days.
Watch this on-demand webinar to learn about the trends and pitfalls related to modernizing your infrastructure to cloud, how the pace of on-prem data growth demands accelerating data streaming to analytics platforms, and why mobilizing your data for the cloud improves business outcomes.
Fast Data Strategy Houston Roadshow Presentation – Denodo
Fast Data Strategy Houston Roadshow focused on the next industrial revolution on the horizon, driven by the application of big data, IoT and Cloud technologies.
• Denodo’s innovative customer, Anadarko, elaborated on how data virtualization serves as the key component in their prescriptive and predictive analytics initiatives, driven by multi-structured data ranging from customer data to equipment data.
• Denodo’s session, Unleashing the Power of Data, described the complexity of the modern data ecosystem and how to overcome challenges and successfully harness insights.
• Our Partner Noah Consulting, an expert analytics solutions provider in the energy industry, explained how your peers are innovating using new business models and reducing cost in areas such as Asset Management and Operations by leveraging Data Virtualization and Prescriptive and Predictive Analytics.
For more information on upcoming roadshows near you, follow this link: https://goo.gl/WBDHiE
If you visit Revlon, Inc.’s data center in Oxford, North Carolina, don’t blink, or you’ll miss the infrastructure. Just two racks house roughly 3.6PB and 800 virtual servers that process an average of 14,000 transactions per second (TPS) from systems around the world, with 99.9999% uptime. When people walk into our data center, they ask, “That’s it?” The answer is yes, and it runs everything. Find out more about Revlon and NetApp here: http://www.netapp.com/us/campaigns/builton/?REF_SOURCE=smctwitter-initiative-builton
Big Data LDN 2018: CONNECTING SILOS IN REAL-TIME WITH DATA VIRTUALIZATION – Matt Stubbs
Date: 14th November 2018
Location: Keynote Theatre
Time: 13:50 - 14:20
Speaker: Becky Smith
Organisation: Denodo
About: How many users inside and outside of your organization access your organization’s data? Dozens? Hundreds is probably more like it, each with their own structure and content requirements as well as different access rights. As a result, many organizations have witnessed the formation of “data delivery mills,” in various shapes and sizes. How does one create order and reliability in this world of chaotic data streams? Quite easily, if it’s done with data virtualization.
According to Gartner, "through 2020, 50% of enterprises will implement some form of data virtualization as one enterprise production option for data integration." Data virtualization enables organizations to gain data insights from multiple, distributed data sources without the time-consuming processes of data extraction and loading. This allows for faster insights and fact-based decisions, which help businesses realize value sooner.
Join us to find out more about:
• What data virtualization actually means and how it differs from traditional data integration approaches.
• How you can connect and combine all your data in real-time, without compromising on scalability, security or governance.
• The benefits of data virtualization and its most important use cases.
How manufacturers are evolving toward Industry 4.0 with virt... – Denodo
Watch full webinar here: https://bit.ly/3cbpipB
One of the sectors in which digital transformation is having the most disruptive effect is manufacturing. Manufacturing leaders are betting on Big Data, cloud computing, artificial intelligence and the Internet of Things (IoT), among other technologies, as well as anticipating the arrival of 5G, in order to:
- Automate processes efficiently, enabling greater production in less time
- Create added value in manufactured products
- Connect the factory floor with the point of sale
- Drive real-time analysis of data coming from different production lines
However, to reach these goals and carry out this technological revolution, also known as Industry 4.0, manufacturers face a series of non-negligible challenges. The industrial sector generates more data than any other in the world, and in the digital era the speed, diversity and exponential volume of data can overwhelm traditional IT architectures. In addition, most manufacturers contend with data silos, which makes processing that data slow and costly. They therefore need a reliable IT platform that can integrate, centralize and analyze data from different sources and in different formats in an agile and secure way, putting the information at the service of the business.
The experts at Enki and Denodo offer this online seminar to explain what data virtualization is, and why industry leaders are betting on this innovative technology to optimize their IT strategy and achieve a significant ROI thanks to faster, simpler and more unified access to industrial data.
Similar to Advanced Topics In Business Intelligence (20)
Accelerate your Kubernetes clusters with Varnish Caching – Thijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
Elevating Tactical DDD Patterns Through Object Calisthenics – Dorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdf – Paige Cruz
Monitoring and observability aren’t traditionally found in software curriculums and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is a part of your current company’s observability stack.
While the dev and ops silo continues to crumble….many organizations still relegate monitoring & observability as the purview of ops, infra and SRE teams. This is a mistake - achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party, and will share these foundational concepts to build on:
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024 – Albert Hoitingh
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview. Including the concepts of Customer Key and Double Key Encryption.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo... – James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Securing your Kubernetes cluster: a step-by-step guide to success! – KatiaHIMEUR1
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
GraphRAG is All You need? LLM & Knowledge Graph – Guy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
DevOps and Testing slides at DASA Connect – Kari Kakkonen
Slides from Rik Marselis and me at the DASA Connect conference on 30.5.2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps looks like. We closed with a lovely workshop in which the participants explored different ways to think about quality and testing in different parts of the DevOps infinity loop.
Essentials of Automations: Optimizing FME Workflows with Parameters – Safe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
The Art of the Pitch: WordPress Relationships and Sales – Laura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
Advanced Topics In Business Intelligence
1. ADVANCED TOPICS IN BUSINESS INTELLIGENCE
The blurring of the line between decision support systems and operational systems because of real-time warehousing, the use of Enterprise Information Integration (EII), and closed-loop business processes
3. Topic Understanding
Decision Support Systems (DSS): a class of information systems (including but not limited to computerized systems) that support business and organizational decision-making activities. Categorized by types: communication-driven, data-driven, document-driven, knowledge-driven.
Operational Systems: a term used in data warehousing to refer to a system that is used to process the day-to-day transactions of an organization. These systems are designed so that processing of day-to-day transactions is performed efficiently and the integrity of the transactional data is preserved.
4. Topic Understanding
Real-Time Warehousing: an evolution in how organizations use the warehouse. The warehouse is updated every time an operational system performs a transaction (e.g. an order, a delivery, or a booking).
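The contrast with batch loading can be sketched in a few lines of Python. This is a toy, hypothetical `Warehouse` class (not any vendor's API); the point is only that the warehouse totals change on every transaction instead of waiting for a nightly upload.

```python
# Minimal sketch of real-time warehousing: each operational transaction
# is pushed to the warehouse as it happens, rather than in a nightly batch.
# The Warehouse class and record_transaction name are illustrative.
from collections import defaultdict
from datetime import datetime, timezone

class Warehouse:
    """Toy warehouse that is updated per transaction, not per batch."""
    def __init__(self):
        self.sales_by_product = defaultdict(float)  # running fact totals
        self.last_updated = None

    def record_transaction(self, product, amount):
        # In a real system this would be a trigger, a change-data-capture
        # stream, or a message-queue consumer; here we update in process.
        self.sales_by_product[product] += amount
        self.last_updated = datetime.now(timezone.utc)

wh = Warehouse()
wh.record_transaction("laptop", 1200.0)   # an order
wh.record_transaction("laptop", 800.0)    # another order
wh.record_transaction("monitor", 300.0)   # a delivery billed

print(wh.sales_by_product["laptop"])   # 2000.0
```

Queries against `wh` see each transaction the moment it is recorded, which is exactly the property that blurs the DSS/operational boundary.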
5. Topic Understanding
Enterprise Information Integration (EII): refers to software systems that can take data from a variety of internal and external sources and in different formats and treat them as a single data source. Data access technologies: ADO.NET, JDBC, ODBC, OLE DB, XQuery, and Service Data Objects (SDO) for Java, C++ and .NET clients and any type of data source.
Closed-loop business processes: encompass enterprise-wide processes.
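As a rough illustration of the EII idea (treating heterogeneous sources as one), the sketch below joins rows from an in-memory SQLite table with rows that could have come from a CSV file or web service. The function name and data are invented for illustration; real EII products expose this through the ODBC/JDBC-style interfaces listed above.

```python
# Hedged sketch of EII: query two different sources (a SQL database and
# rows from some external feed) through one unified function, as if they
# were a single data source.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
db.execute("INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex')")

# Second "source": rows that might have come from a CSV or web service.
external_orders = [(1, 250.0), (2, 125.0), (1, 75.0)]

def unified_customer_totals():
    """Join relational data with the external source as one logical view."""
    names = dict(db.execute("SELECT id, name FROM customers"))
    totals = {}
    for cust_id, amount in external_orders:
        totals[names[cust_id]] = totals.get(names[cust_id], 0.0) + amount
    return totals

print(unified_customer_totals())  # {'Acme': 325.0, 'Globex': 125.0}
```

The caller never sees which rows came from the database and which came from the external feed; that abstraction is the value EII sells.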
6. Business value
1970s: Original mission statement of empowering the real-time enterprise — the R in SAP/R3 stands for real-time.
1990s: Real-time order and fulfillment system resulted in a 97%+ customer satisfaction rate and helped to propel Dell to the number one slot in the personal computer industry.
2000s: Average four-day fill rate increased from 96.5% to 98.5%, netting $20 million in savings from reduced safety stock and a $10 million savings in excess transport.
The most significant trend is the creation of tools that provide visibility — of both underlying processes and surface issues — to enable decision makers at all levels of the enterprise to "close the loop" and reduce the time it takes to make and act upon decisions.
Demands to implement real-time solutions:
- Increased access to information
- Better ways to distribute information to the systems and individuals who can process it
- Improved techniques to gain insight from data
7. Business value
Benefits
- Increased productivity due to fewer manual checks for accuracy.
- Reduction in the time and effort required to produce reports thanks to data consolidation.
- Enhanced ability to comply with regulatory requirements and greater – and more confident – audit readiness.
- Enhanced access to highly consistent information, as well as to unstructured data.
- Enhanced ability to transform data into usable and actionable information.
- Reduced cost and effort required for virtually every IT project.
- Reduced IT costs associated with data maintenance.
- Elimination of custom programming to build data extraction and manipulation.
- Incremental revenue from the ability to cross-sell and up-sell related products and services.
- Improved customer service and reduced time required to serve each customer.
Drawbacks
- Only senior-level managerial attention will induce cultural change.
- According to TDWI Research, the average data warehousing project costs $1.1 million and takes 10 months to deliver, while a data mart project costs $544,000 and takes six months to deliver.¹
- Most BI solutions are used by less than 20 percent of employees (if that) and provide only departmental views of data.
- Businesses are not agile enough to deal with real-time information.
- Burdens the production system by polling it continually. ALTERNATIVE: a centralized data warehouse as a repository and distribution engine for online transaction processing data.
¹ From In Search of a Single Version of Truth: Strategies for Consolidating Analytic Silos by Wayne Eckerson, TDWI Best Practices Report, 2004 (www.tdwi.org/research/reportseries). Technically, the numbers are for consolidating data warehouses, but the common approach for consolidation was starting from scratch.
8. Case Study
- The Path Less Taken
- Integration of firm's resource and capability to implement enterprise CRM: A case study of a retail bank in Korea
10. Case Study: The Path Less Taken
Company: Wesco International
- FORTUNE 500 company with $5.3 billion in revenue in 2006
- Electrical and industrial product distributor
- Pittsburgh-based
- More than 6,000 employees
- 370 full-service branches across the U.S. and Canada
- Eight high-tech distribution centers
- More than 100,000 customers worldwide
11. Case Study: The Path Less Taken
Wesco Business
- Its strategy has centered on putting inventory, expertise and services where its customers need them.
- Customers cross most industries and run the gamut from Boeing to Dow Chemical to PepsiCo.
- 370 branches fed by eight distribution centers.
- Distribution center managers are given a high degree of autonomy, including the ability to determine inventory, set prices and negotiate contracts.
12. Case Study: The Path Less Taken
Wesco Situation
- Did not have real-time access to inventory at the branch level, and as a result could not easily shift supplies from one location to another to meet demand.
- Did not have immediate access to sales information from the field; this data was consolidated at headquarters via nightly uploads to an Informix database.
- Management could not quickly drill down to important customer-level information, such as which customers had recorded a dramatic drop in purchases and were perhaps getting their supplies from a competitor.
- The Informix system, installed in 1993, couldn't be tweaked much further. It was overloaded and underpowered.
- A key sales analysis report required 80 hours of processing time.
13. Case Study: The Path Less Taken
- In 2000 began evaluating an Enterprise Resource Planning system to move closer to real-time visibility.
- Looked at systems from SAP and Oracle; cost close to $110 million.
- To achieve the integration, Wesco would have had to scrap WesNet, its distributed point-of-sale system (based on a 20-year-old NCR system called ITEM).
- WesNet was completely paid for, incorporated a high degree of customization, and could still be expanded.
- In the end decided that Wesco didn't need a new ERP system.
- Decided to replace its Informix data warehouse with an Oracle data warehouse; construction began 1999.
- An NCR account representative proposed use of an NCR Teradata system, and the company agreed to a benchmarking exercise.
14. Case Study: The Path Less Taken
[Benchmark chart: time in hours required to process key reports (Month End Sales Analysis, Invoice Detail Loading, Sales & Suppliers Summary) on the Informix legacy system in 1999 and 2000 versus the Oracle warehouse benchmarks and the 2006 system. Reported values fall from 80 hours for the legacy month-end sales analysis down to fractions of an hour (1.25, 0.58, and 0.25 hours) by 2006.]
15. Case Study: The Path Less Taken
- Wesco decided to continue implementing Oracle for some functions: transactional data such as pricing, electronic data interchange (EDI) and the company's e-commerce environment. The Oracle system also feeds information back into Teradata.
- Teradata would now serve as the storage hub for sales analysis, accounts receivable and payable, supplier summaries and customer master records.
- Although Oracle and Teradata are both built on relational database management system (RDBMS) technology, in which data is organized around related tables (rows and columns) of data, the design of the Teradata RDBMS has always revolved around fast analysis and retrieval of data. It incorporates a technology known as massively parallel processing, in which database lookups are broken into smaller sub-tasks that are assigned to different processors on a multi-processor server.
- Oracle grew up around online transaction processing (OLTP) applications, in which the most important thing is to record transactions such as purchases and payments quickly and reliably. It can also be tuned and configured to support more analytical applications such as data warehousing.
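The massively parallel processing idea described above can be illustrated with a small sketch: a large scan is split into sub-tasks, each worker aggregates its own slice, and the partial results are merged. Real MPP systems such as Teradata distribute work across physical nodes; the thread pool here is only a stand-in for that architecture, and all names are invented.

```python
# Illustrative sketch of the MPP pattern: split a big aggregation into
# per-worker partial sums, then merge. Threads stand in for the separate
# processors/nodes a real MPP database would use.
from concurrent.futures import ThreadPoolExecutor

# A "table" of (region, quantity) rows.
rows = [("east", 10), ("west", 5), ("east", 7), ("west", 3)] * 1000

def partial_sum(chunk):
    """Each worker aggregates only its own slice of the table."""
    out = {}
    for region, qty in chunk:
        out[region] = out.get(region, 0) + qty
    return out

def parallel_group_sum(data, workers=4):
    size = len(data) // workers or 1
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    merged = {}
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for part in pool.map(partial_sum, chunks):  # sub-tasks in parallel
            for region, total in part.items():      # merge step
                merged[region] = merged.get(region, 0) + total
    return merged

print(parallel_group_sum(rows))  # {'east': 17000, 'west': 8000}
```

The merge step is why MPP favors analytical queries: each sub-task is independent, so adding processors shortens the scan rather than serializing on one engine.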
16. Case Study: The Path Less Taken
Teradata implications for Wesco:
- $5 million more expensive than Oracle.
- Initially ran parallel to the existing Informix system; in 2002 bought a new model and reassigned the initial one to application development.
- Chose a tool for presenting information and conducting business intelligence queries: initially Cognos, eventually the WebFocus suite.
- Closer to real-time access to data from field operations, and a way of drilling down into the data.
- Tweaked WesNet at the branch level to push inventory updates to head office several times a day.
- Constructed a number of applications more often associated with ERP suites.
- Spent about $10 million on the Teradata implementation, including the WebFocus piece and additional applications written for the Oracle databases.
Results:
- $10 million one-time margin improvement through the use of the system.
- $8 million one-time gain through inventory reduction and better distribution of inventory among branches.
- $4 million savings in the first 24 months through better management of its discount prices.
- $1 million savings.
- Gained an indefinite extension on its WesNet system.
17. Case Study: The Path Less Taken
Links to and Implications for BI projects
"The strategy we took isn't right for every organization, but it's something they should consider."
"Companies have invested a lot of money in developing applications that run their business really well. Why give that up for the cookie-cutter approach of an ERP system?"
John Conte
Chief Information Officer
Wesco International
18. A case study of a retail bank in Korea
Integration of firm's resource and capability to implement enterprise CRM
19. Case Study: Retail Bank in Korea
Introduction
- Find-Equity Bank (a pseudonym), one of the big players in Korea
- Intense competition in the retail bank industry
- Transform from being product- or service-centered into customer-centered
- As a customer-centered IT-driven strategy, Customer Relationship Management (CRM) was implemented enterprise-wide in 2003
Concerns
- Decrease of the interest profit rate on deposits and loans
- Encroachment on the banking business by other industries
- Dichotomized customer management processes caused by the merger and acquisition with Seoul Bank in 2002 were yielding customer dissatisfaction, consequently resulting in customer defections
Enterprise-wide CRM was deemed to be a mission-critical business strategy to ensure the bank could distinguish itself from its competitors, win over new customers, and maintain the loyalty of its existing customers.
20. Case Study: Retail Bank in Korea
The Customer Relationship Management (CRM) implementation was made up of two different phases (not intended from the outset). The bank found that it had been missing another critical factor: the people. CRM is inherently a business strategy driven not by technology but by people.
21. Case Study: Retail Bank in Korea
Critical problems: Technological
- Difficult to synchronize data acquired from various channels; it required plenty of time because every channel operated on its own subsystems.
- The integrity and consistency of customer information were rarely guaranteed.
- Partial and separated analytical functions supported by each subsystem caused redundant targeting, resulting in ineffectiveness of marketing campaigns.
22. Case Study: Retail Bank in Korea
Critical problems: Strategic
- The systems, separated by channels, forced the bank to grade its customers not by their profits but by their deposited amount, and to manage them according to each product and channel.
- The responsibilities of CRM planning and execution activities had been left to each branch, imposing excessive workloads on the employees.
- Redundant and frequent marketing efforts to the same customers increased marketing costs and diminished the response rate, so the clerks felt that the CRM was not effective.
23. Case Study: Retail Bank in Korea
Redesigned integrative data model: the six subject areas, each including 15 to 24 detailed entities, are not physical but logical divisions, connected with each other systematically.
24. Case Study: Retail Bank in Korea
Newly designed analysis framework: every analysis activity would be aligned according to each customer life cycle in banking, spanning from selection/contraction to expiration/termination, and each analytical initiative is guided by systematic procedures consisting of customer understanding, strategy planning and building, execution, and result analysis.
25. Case Study: Retail Bank in Korea
Operation Capability
- The event-based response system and sales force automation (SFA) were the key drivers.
- Provided a function of real-time perception of customer needs in terms of customer events, enabling the so-called immediate responsive system.
- Solved the problem of forwarding a monotonous message, regardless of the customer contexts, to all customers who brought about an identical event.
- Expected not only to support making decisions related to customers efficiently, but also to reduce the operational cost through the automation of preparing the responses to customers' ordinary demands.
26. Case Study: Retail Bank in Korea
Event-based response system
- Before: the system only gathered naive events (e.g., a customer's birthday) daily by a batch process at the end of the daily tasks, and delivered the prepared messages to the customers the next day.
- 28 events had significant influences on profits; many of them had been prepared with no strategic response schemes or with inappropriate responsive activities at that time.
- Now, when the system perceives an important event from a customer, it first derives the most appropriate response strategy for the customer and the event automatically, and then delivers the derived response strategy to the customer through every channel, department, or branch consistently.
Sales force automation (SFA)
- Considered a tool for leveraging the event-based marketing strategy.
- Efforts began to customize Siebel's solution to integrate it with the event-based marketing capability.
- Was designed to provide high-degree customer knowledge and insights for effective and efficient sales activity.
- Provided learning opportunities for the internal resources and capabilities by feeding the voices of customers, such as complaints, praises, and suggestions collected through various channels, back to the internal resources and capabilities.
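The perceive-derive-deliver flow described above can be sketched in a few lines. This is a hypothetical toy, not the bank's actual system: the event names, strategies, and channels are invented, and the point is only the shape of the logic (an important event triggers one derived strategy, fanned out consistently to every channel, while naive events get no strategic response).

```python
# Hedged sketch of an event-based response system. All event names,
# strategies, and channels below are invented for illustration.
STRATEGIES = {
    "large_withdrawal": "offer retention call from branch manager",
    "salary_deposit_stopped": "offer payment-holiday consultation",
}

CHANNELS = ["branch", "call_center", "mobile_app"]

def respond_to_event(customer_id, event):
    """Derive one response strategy for the event and fan it out
    consistently to every channel."""
    strategy = STRATEGIES.get(event)
    if strategy is None:
        return []  # naive events get no strategic response
    return [(channel, customer_id, strategy) for channel in CHANNELS]

msgs = respond_to_event(42, "large_withdrawal")
print(len(msgs))  # 3 — same strategy delivered through all three channels
```

Because the strategy is derived once and reused for every channel, the customer hears a single consistent message instead of the redundant, channel-by-channel campaigns the bank had before.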
28. Case Study: Retail Bank in Korea
Implications
- Development of proper employee compensation schemes: improving its incentive and reward system.
- Making a more customer-oriented organizational structure: reorganizing roles and responsibilities related to CRM jobs.
- Establishing a series of CRM education and training programs: providing a systematic education and training program.
Results
- Best in profitability per customer in Korea.
- Awarded by Euromoney as the best private bank for four consecutive years, from 2005 to 2008.
29. Case Study: Retail Bank in Korea
Links to and Implications for BI projects
- CRM is a continuous learning process rather than an information technology or analytical method²; it should evolve permanently to respond to quickly and continuously changing customer needs.
- CRM would hardly be implemented successfully when it is considered as a technology, and even its successful implementation does not necessarily mean the success of the strategy.
- People play the role of interface between a firm's internal service quality and its external service quality, which is vital for managing customer relationships.
² A. Osarenkhoe and A. Bennani, An exploratory study of implementation of customer relationship management strategy, Business Process Management Journal 13 (1) (2007), pp. 139–164.
31. Looking ahead
- Wesco International: putting inventory, expertise and services where its customers need them.
- Retail Bank in Korea: securing the present level of competency.
- Always keep the primary goal clear.
32. Looking ahead
- Wesco International: decided it did not need an Enterprise Resource Planning (ERP) system.
- Retail Bank in Korea: Phase I (integration of functional resources & capabilities) was unsatisfactory.
- An initial assessment, such as a diagnosis of CRM, is key to an efficient success; the outcome could not satisfy even after spending lots of money and time.
- Need to evaluate each package carefully on its own merits.
33. Looking ahead
- Wesco International: the Teradata data warehouse reduced report processing time by 80% over the last seven years.
- Retail Bank in Korea: the event-based response system and sales force automation (SFA) enabled real-time perception of customer needs.
- Technology will keep growing without interruption.
- Adoption is not trivial and requires a different organization, human integration and process adaptation.