The document summarizes a Gartner report on the Magic Quadrant for Data Quality Tools. It discusses trends in the market such as growth, mergers and acquisitions, and the evolution of tools to offer more full-function suites. It evaluates vendors based on capabilities like profiling, parsing, standardization, and matching. The market is expected to grow significantly due to the importance of data quality initiatives.
The document discusses the rise of analytic platforms as a disruption to the traditional data warehouse market. Analytic platforms integrate data management and analytics generation to provide a complete solution for business analysis. They offer superior price/performance and faster time to value compared to traditional offerings. The adoption of analytic platforms is growing strongly across various industries worldwide as their promises of improved analytics are being realized by users. However, selecting the right analytic platform is essential, as real-world tests can show that some vendors struggle to deliver as needed.
Future of Horizontal Services by Harrick Vin, VP & Chief Scientist, TCS. The two functions of enterprise IT -- run the business (RTB) and change the business (CTB) -- are undergoing significant changes because of automation. In this presentation, we talked about what is fueling this change, and some of the challenges in realizing automation benefits in enterprises.
The Business Value of Commvault Software, IDC 2016 (US40773815), by Charles Uprichard
This document summarizes the results of an IDC survey of 722 Commvault customers. The survey found that Commvault customers reported significant benefits in three categories: simplification, risk reduction, and productivity gains. Customers simplified their IT environments through reduced costs, staffing needs, and complexity; reduced risk through lower downtime and improved data protection and recovery capabilities; and gained productivity through improved IT staff efficiency and better organizational productivity overall. Key findings included 42-52% lower annual costs, 55% less downtime, 66-217% greater data coverage, and 63-73% lower risk of compliance failures or data loss.
This document provides an excerpt from an IDC MarketScape report on smart multifunction peripherals (MFPs) in the US market in 2013. It discusses key factors for success with smart MFPs, including a complete product and services portfolio. It also outlines IDC's vendor assessment methodology and positions major vendors as leaders, major players, contenders or participants based on their strategies and capabilities. The excerpt highlights Lexmark as a leader, noting its broad MFP lineup, software acquisition strategy and emphasis on managed print services.
This document outlines best practices for implementing a Master Data Management (MDM) solution to improve data quality. MDM can help by providing a single source of trusted customer, product, and partner data across systems. When implementing MDM, organizations should follow best practices like establishing formal data governance policies, using architecture consistent with existing IT systems, demonstrating a clear business case and ROI, taking a phased approach while making MDM a long-term program, and ensuring active vendor support. Following these practices can help organizations realize the benefits of MDM like increased revenue, cost savings, and regulatory compliance.
The document discusses the journey organizations take to establish trusted data through effective data management. It outlines key barriers such as a disconnect between business and IT needs as well as a lack of data ownership and governance. The document promotes establishing repeatable data processes through a single data management solution that provides data quality, integration and master data management capabilities. This helps improve business user productivity, reduce costs and risks, and support data-driven decisions.
ASUG 10_27_2016 Entegris PLM-MDM Business Process Optimization, by 3keefe008
This document discusses Entegris' project to optimize their PLM-MDM business processes with LeverX. The project goals are to reduce time to market, enable a sustainable and extendable model, and increase data quality. The project will occur in phases from 2016-2017, starting with quick wins to optimize material master extensions and changes (Release 1.0, 1.5). Future phases will integrate CAD, optimize specifications/EHS, and enable future PLM design enhancements. LeverX tools like BMAX and IPS will be implemented to automate workflows and rules-based processes. The expected benefits include connecting functions, preparing for acquisitions, eliminating data re-entry, and improving the bottom line.
The document discusses the journey organizations take to establish trusted data through effective data management. It outlines common barriers to coordinating data initiatives and how the gap between IT and business needs can be closed. A maturity model is presented showing how organizations evolve their data practices from being IT-driven to enabling personalized customer experiences. The key is establishing repeatable processes through a single data management platform that provides data quality, integration and master data management capabilities.
Master Data Management: Extracting Value from Your Most Important Intangible ..., by FindWhitePapers
This SAP Insight explores the importance of master data and the barriers to achieving sound master data, describes the ideal master data management solution, and explains the value and benefits of effective management of master data.
Driving Value Through Data Analytics: The Path from Raw Data to Informational..., by Cognizant
As organizations gather and process colossal amounts of data, analytics is essential for operational and strategic excellence. We offer a guide to the phases of the data analytics journey, from descriptive to diagnostic to predictive to prescriptive, covering intentions, tools and people considerations.
EAI - Master Data Management - MDM - Use Case, by Sherif Rasmy
KGX is a broker-dealer that was fined by the SEC for regulatory reporting failures and frequent trade errors caused by using multiple inconsistent versions of master data across its systems. To address this, KGX implemented a Master Data Management (MDM) solution to create a single source of security master data. The MDM solution standardized, cleansed, matched and consolidated security data from multiple sources into a consistent version stored in the MDM database. Changes are then synchronized to operational systems in real-time or batch to provide accurate security master data across KGX's systems. Governance policies and processes were also established to manage the quality and usage of security master data going forward.
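The standardize-cleanse-match-consolidate flow described above can be sketched in a few lines of Python. This is an illustrative toy, not KGX's actual implementation; the field names (`cusip`, `name`, `currency`) and the "first non-empty value wins" survivorship rule are invented for the example.

```python
# Toy MDM-style consolidation: records for the same security arrive from
# multiple source systems with inconsistent formatting; we standardize key
# fields, match on a shared identifier, and merge them into one golden record.

def standardize(record):
    """Normalize formatting so records from different systems compare equally."""
    return {
        "cusip": record["cusip"].strip().upper(),
        "name": " ".join(record["name"].split()).title(),
        "currency": record["currency"].strip().upper(),
    }

def consolidate(records):
    """Group standardized records by CUSIP and build one golden record each."""
    masters = {}
    for rec in map(standardize, records):
        golden = masters.setdefault(rec["cusip"], {})
        for field, value in rec.items():
            # Simple survivorship rule: the first non-empty value wins.
            golden.setdefault(field, value)
    return masters

# Two feeds describing the same security with different formatting.
feeds = [
    {"cusip": "037833100 ", "name": "apple  inc", "currency": "usd"},
    {"cusip": "037833100", "name": "Apple Inc", "currency": "USD"},
]
masters = consolidate(feeds)
print(masters["037833100"])
```

In a real MDM deployment the matching step would use fuzzy rules rather than an exact key, and the golden records would then be synchronized back to operational systems in real time or in batch, as the use case describes.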
Master Data Management (MDM) for Mid-Market, by Vivek Mishra
This document discusses master data management (MDM) solutions for mid-sized businesses. It begins by introducing MDM and its benefits, such as a single source of truth and improved data quality. It then outlines some common challenges for implementing MDM in mid-sized companies, such as high costs, maintaining multiple data domains, and the need for organizational change management. The document provides advice on overcoming these challenges by selecting flexible and affordable MDM platforms. It also describes key properties of effective MDM, like ongoing data governance and providing a 360-degree view of customers. Finally, it introduces Compunnel Digital's partnership with Profisee to offer modernized MDM solutions tailored for mid-sized organizations.
Master data management (MDM) & PLM in context of enterprise product management, by Tata Consultancy Services
The presentation discusses the classical features and advantages of a Master Data Management (MDM) system, along with the situations in which it is appropriate. How do companies that design, manufacture, and sell products across several geographies apply MDM when facing challenges in making the right decisions about their investments in the PLM and MDM space?
Another important aspect is the comparison and relationship between an MDM system (or Product Master system) and an enterprise PLM system. How can you maximize ROI on both PLM and MDM investments? With examples from different industries, the key takeaways include whether or not your organization requires an MDM solution.
This document discusses the importance of business intelligence (BI) and provides guidance on how to successfully implement a BI program. It advocates for a "back to basics" approach that ensures prerequisites are in place, understands goals and timelines, and starts small before scaling up. Effective BI requires quality data from multiple sources, strong data governance, and developing an analytical culture. The document outlines key components of a basic BI system including a data repository, data use rules, and analytical tools. It emphasizes understanding an organization's current analytical maturity and business needs to guide tool selection and maximize the chances of BI success.
This document discusses how companies can use information technology for strategic advantage. It covers various competitive strategies like cost leadership, differentiation, innovation, and growth. It also discusses how IT can support these strategies through activities like reducing costs, creating new products/services, promoting business process innovation, locking in customers/suppliers, and raising barriers to entry. Examples are provided of how companies have leveraged IT for strategic benefits like improving efficiency, creating new business opportunities, and maintaining valuable relationships.
Chap02 Competing with Information Technology, by Aqib Syed
James A. O'Brien, and George Marakas. Management Information Systems with MISource 2007, 8th ed. Boston, MA: McGraw-Hill, Inc., 2007. ISBN: 13 9780073323091
Cognitive Integration: How Canonical Models and Controlled Vocabulary Enable ..., by Cognizant
For pharmaceutical companies dealing with multiple partners' systems, employing a canonical model for data communications facilitates point-to-point integration, and applying a controlled vocabulary (CV) in such models alleviates semantic ambiguity and facilitates cognitive and systems integration. We demonstrate how this works with a pharma business scenario involving Contract Research Organizations (CROs).
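The canonical-model-plus-CV idea above can be illustrated with a toy mapping: each partner system translates its local message into one shared canonical shape, so n systems need n mappings instead of n*(n-1) pairwise translations, and a controlled vocabulary collapses synonymous terms onto one preferred term. Every field name and vocabulary term here is invented for illustration, not taken from the Cognizant paper.

```python
# Controlled vocabulary: synonymous local terms map to one preferred term.
CONTROLLED_VOCAB = {
    "subj": "subject", "participant": "subject", "subject": "subject",
    "ae": "adverse_event", "adverse event": "adverse_event",
}

# Per-partner mapping from local field names to the canonical model's names
# (hypothetical fields a CRO's system might use).
CRO_FIELD_MAP = {"PatientID": "subject_id", "EventTerm": "event"}

def to_canonical(message, field_map):
    """Rename local fields to canonical names and normalize term values."""
    canonical = {}
    for local_field, value in message.items():
        field = field_map.get(local_field, local_field)
        if isinstance(value, str):
            # Look the value up in the CV; unknown values pass through as-is.
            value = CONTROLLED_VOCAB.get(value.lower(), value)
        canonical[field] = value
    return canonical

msg = {"PatientID": "S-1042", "EventTerm": "AE"}
print(to_canonical(msg, CRO_FIELD_MAP))
```

Adding a new partner then means writing one field map and extending the CV, rather than building translators to every other system.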
North American Utility Sparks Up its Complaint Handling System, by Cognizant
An electric utility's new complaint handling system reduces resolution times, increases staff productivity, boosts customer satisfaction, and improves regulatory compliance.
This document discusses using soft computing techniques to evaluate coordination in supply chains. It proposes a fuzzy analytic hierarchy process (FAHP) model to measure the extent of coordination (EC) based on four coordination mechanisms: contracts, information sharing, information technology, and joint decision making. The EC is calculated as a weighted sum of the crisp scores of the four mechanisms, where the weights are determined from pairwise comparisons in the AHP. The model provides a way to quantitatively assess coordination across a supply chain and analyze different coordination scenarios. It is demonstrated through a numerical example involving order quantities and performance measures at different supply chain levels.
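The weighted-sum calculation described above can be sketched as follows. The AHP weights are approximated from a pairwise comparison matrix via the row geometric mean (a standard AHP approximation); the matrix entries and the crisp scores below are illustrative stand-ins, not values from the paper.

```python
from math import prod

def ahp_weights(matrix):
    """Approximate AHP priority weights via the row geometric mean."""
    n = len(matrix)
    geo = [prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo)
    return [g / total for g in geo]

# Hypothetical pairwise comparisons among the four coordination mechanisms:
# contracts, information sharing, information technology, joint decision making.
pairwise = [
    [1,   2,   3,   2],
    [1/2, 1,   2,   1],
    [1/3, 1/2, 1,   1/2],
    [1/2, 1,   2,   1],
]

# Hypothetical crisp (defuzzified) scores for each mechanism on a 0-1 scale.
crisp_scores = [0.7, 0.6, 0.8, 0.5]

weights = ahp_weights(pairwise)
# Extent of coordination: weighted sum of the crisp mechanism scores.
ec = sum(w * s for w, s in zip(weights, crisp_scores))
print(f"weights: {[round(w, 3) for w in weights]}, EC: {ec:.3f}")
```

The fuzzy part of FAHP (triangular fuzzy judgments defuzzified into crisp scores) happens upstream of this step; here only the final aggregation is shown.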
IT has become increasingly complex and difficult to manage and scale. Costs are growing at unprecedented rates as enterprises continue to add technologies in response to an explosion in the quantity and types of data. This data explosion, and the patchwork of methods used to store, retrieve, and present usable views of it, has produced inefficiencies in data center architecture, duplicated database licenses, increased data storage and management costs, and inefficient use of data center technicians. Database replication alone has created new challenges and left numerous middleware licenses unconsolidated, further increasing costs.
This document provides an excerpt from an IDC report on the business intelligence tools market in 2007. Some key points:
1) The BI tools market grew 12.1% to $7.05 billion in 2007, with advanced analytics growing faster than query, reporting, and analysis.
2) Several large acquisitions impacted the largest suppliers in 2007, and IDC expects more startups focused on BI niches in the next few years.
3) Business decision makers showed growing awareness of BI benefits, signaling a shift in purchase power from IT to business users.
Manufacturing companies have traditionally had an on-again, off-again relationship with technology. However, the paradigm shift driven by global manufacturing and distribution, combined with rapid digital innovation, is changing this equation. Deloitte's 2016 MHI survey reveals that 83% of manufacturing organizations believe investing in key digital technologies such as IoT, robotics, Big Data, and cloud computing will be key to competitive advantage in the near future.
Streebo is an innovation company that builds technology products and solutions. It has a product suite that includes spend data visualizations, a dispute resolution portal, a consent management system, Open Payments Analytics, and Physi-Engage. Streebo has over 100 users from 40 pharma/biotech companies using its cloud-based SaaS solutions. It also provides custom reports, notifications, and data integration/enrichment capabilities. Streebo has experience providing solutions to major customers across various industries including retail, financial services, and life sciences. It offers services such as data warehousing, analytics, master data management, and enterprise mobility.
Understanding the Information Architecture, Data Management, and Analysis Cha..., by Cognizant
As the Internet of Things (IoT) becomes increasingly prevalent, organizations must build the enterprise information architecture required to gather, manage, and analyze vast troves of rich real-time data. We offer an IoT framework, use cases, and a maturity model that helps enable you to choose an adoption approach.
A technical Introduction to Big Data Analytics, by Pethuru Raj PhD
This presentation details the sources of big data, its value, what to do with it, and the platforms, infrastructures, and architectures for big data analytics.
Presented during the Open Source Conference 2012, organized by Accenture and Red Hat on December 14th, 2012. This presentation discusses an open source Big Data case study.
By Jonathan Bender, Consultant, Accenture Technology Labs
Becoming an analytics-driven organization helps companies reduce costs, increase revenues, and improve competitiveness, which is why business intelligence and analytics continue to be a top priority for CIOs. Many business decisions, however, are still not based on analytics, and CIOs are looking for ways to reduce the time to value of deploying business intelligence solutions so that they can expand the use of analytics to a larger audience of users.
Companies are also interested in leveraging the value of information in so-called big data systems that handle data ranging from high-volume event data to social media textual data. This information is largely untapped by existing business intelligence systems, but organizations are beginning to recognize the value of extending the business intelligence and data warehousing environment to integrate, manage, govern, and analyze this information.
The document discusses how emerging technologies are creating new sources of data and how analyzing this data can provide businesses a competitive advantage. It identifies key trends like cloud computing, social media, mobile devices, and big data that are fueling data growth. To leverage this "nexus of forces", companies need strategies to innovate using new types of information and analytics. This includes assessing business needs, understanding new possibilities, and adopting technologies like analytics, databases, and Hadoop to access diverse data sources and gain insights.
IBM Smarter Business 2012 - PureSystems - PureData, by IBM Sverige
1) IBM's PureSystems are expert integrated systems that simplify IT challenges around big data by capturing built-in expertise and deeply integrating hardware and software.
2) PureSystems deliver greater simplicity, speed, and lower cost across the entire IT lifecycle from design to deployment to management through pre-integration and automation.
3) The PureData System delivers optimized data platforms and services for transactions, analytics, and operational analytics workloads through scale-out clusters of DB2, Netezza, and other technologies.
What is Big Data?
Big Data Laws
Why Big Data?
Industries using Big Data
Current process/SW in SCM
Challenges in SCM industry
How can Big Data solve these problems?
Migration to Big Data for an SCM industry
This document provides a summary of big data analytics and how it can derive meaning from large volumes of structured and unstructured data. It discusses how new analysis tools and abundant processing power through technologies like Hadoop can unlock insights from massive data sets. Examples are given of how big data analytics can help various industries like healthcare, banking, manufacturing, and utilities to optimize processes, predict outcomes, and detect patterns. The integration of structured and unstructured data from various sources into analytical models is also described.
Big data analytics enables organizations to derive meaningful insights from large volumes of structured and unstructured data. New tools can analyze petabytes of data across various formats and identify patterns and trends. This helps optimize processes, reduce risks, and uncover new opportunities. Examples include detecting healthcare treatment patterns that improve outcomes, preventing bank fraud, and predicting consumer demand to inform utility planning. While big data is still emerging, it has potential to enhance business intelligence and integrate diverse internal and external data sources for more powerful analytics.
Karya Technologies provides enterprise services including IT strategy and software applications to improve operational efficiency. They offer solutions for data management, integration platforms, cloud services, and consulting. Their expertise is bolstered by strategic alliances with technology companies. Karya engages clients through comprehensive and cost-effective solutions tailored to their needs. Their enterprise solutions portfolio focuses on data management, ERP/CRM platforms, and cloud services for small and medium enterprises.
This document discusses how big data can provide competitive advantages and describes Google's cloud services for managing big data. It notes that big data is growing faster than companies' ability to leverage it and that scaling traditional business intelligence for big data can be challenging. It then provides examples of how Google's cloud services like BigQuery, Cloud Storage, and Cloud SQL can help store, analyze, and share large datasets interactively and at scale.
A Service Oriented Analytics Framework For Multi-Level Marketing Business (Brandi Gonzales)
This document proposes a service-oriented analytics framework for multi-level marketing businesses. It discusses developing a statistical service engine solution using R to automate analytical processes and improve enterprise knowledge generation and reusability. The solution would involve:
1) A statistical job portal for users to submit predefined or ad-hoc analysis requests via an XML message format.
2) An enterprise service bus to route job requests to GNU-R engines running statistical scripts on distributed servers. Data could be retrieved from databases or file repositories.
3) The GNU-R engines would execute the scripts, retrieve and analyze data, save results to files, and return outcomes to users for decision making. Asynchronous messaging and portability were prior
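The XML job-message format is not specified in the summary above, so the following is a purely hypothetical sketch: it invents a minimal request shape (the element names `job`, `script`, `dataset` and `output` are assumptions, not the framework's actual schema) and shows in plain Python how a statistical job portal might serialize a request for the service bus to route to a GNU-R engine.

```python
# Hypothetical job-request message for the statistical service engine.
# Element and attribute names are invented for illustration only.
import xml.etree.ElementTree as ET

def build_job_request(job_id, script, dataset):
    """Serialize an analysis request the ESB could route to an R engine."""
    job = ET.Element("job", id=job_id, type="predefined")
    ET.SubElement(job, "script").text = script      # R script to execute
    ET.SubElement(job, "dataset").text = dataset    # source table or file
    ET.SubElement(job, "output", format="csv")      # where results land
    return ET.tostring(job, encoding="unicode")

msg = build_job_request("42", "churn_model.R", "db://sales/customers")
parsed = ET.fromstring(msg)
print(parsed.find("script").text)  # churn_model.R
```

A real implementation would add authentication, job status polling, and an asynchronous reply channel, which the summary says were design priorities.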
Datacraft Asia Strategy For Services 2010 (Joshua Hong)
The document provides an overview of the IT market in Asia from 2009-2010 and analyzes Datacraft Asia's growth strategy. It discusses key trends like virtualization and managed services. It analyzes Datacraft's strengths, weaknesses, opportunities and threats. Recommendations include focusing on virtualization, managed services and multisourcing to address challenges like transitioning to a strategic IT partner and adopting new technologies. The strategy involves increasing revenues from existing clients through upselling services and developing offerings in growth areas.
A System Approach For Defining Data Center Value Proposition.pdf (Vernette Whiteside)
This document discusses defining the value proposition of a data center using a systems approach. It introduces a method to measure a data center's value using a set of metrics that capture the behavior and outcomes of a data center as a system. These metrics would provide measures for variables like performance, investments, operations, and services. Analyzing these metrics would provide a system model to define stakeholder value. Current methods for evaluating IT investments often focus only on financial metrics and lack consideration of external factors, behaviors, and qualitative impacts. A balanced, mixed approach is needed to fully capture a data center's true value proposition.
This document provides a summary of 19 vendor briefings from the 2016 Strata Conference in NYC. It includes 3-sentence summaries of presentations by Alation, AllSight, Alpine Data, Basho Technologies, Cambridge Semantics, Continuum Analytics, Dataiku, Dell EMC, GigaSpaces, Logtrust, MapR Technologies, Rocana, and SAP. Each summary highlights the vendor's solution, how it addresses key challenges identified in DEJ research, and a relevant quote from the presentation.
Enterprise Data Management | Getting Meta All The Time (Michael Findling)
Metadata may not be the best word to use to try to get senior management excited about reference data projects. But metadata management is a vital part of enterprise data management (EDM), and as EDM projects are now maturing, metadata is swiftly moving up the agenda. Tine Thoresen explores the best strategies for implementing metadata tools and systems.
Big data comes from a variety of sources and in different formats. It is characterized by its volume, velocity, and variety. Organizations are using big data to gain business insights through analytics. This allows them to increase revenue, reduce costs, optimize processes, and manage risks. Examples of big data uses include marketing campaign analysis, customer segmentation, and fraud detection. Companies must overcome technological and organizational challenges to successfully leverage big data.
Looking at what is driving Big Data: market projections to 2017, plus customer and infrastructure priorities. What drove Big Data in 2013 and what the barriers were. Introduction to business analytics, its types, building an analytics approach, and ten steps to build your analytics platform within your company, plus key takeaways.
Rethink Your 2021 Data Management Strategy with Data Virtualization (ASEAN) (Denodo)
Watch full webinar here: https://bit.ly/2O2r3NP
In the last several decades, BI has evolved from large, monolithic implementations controlled by IT to orchestrated sets of smaller, more agile capabilities that include visual-based data discovery and governance. These new capabilities provide more democratic analytics accessibility that is increasingly being controlled by business users. However, given the rapid advancements in emerging technologies such as cloud and big data systems and the fast changing business requirements, creating a future-proof data management strategy is an incredibly complex task.
Catch this on demand session to understand:
- BI program modernization challenges
- What is data virtualization and why is its adoption growing so quickly?
- How data virtualization works and how it compares to alternative approaches to data integration
- How modern data virtualization can significantly increase agility while reducing costs
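The core idea the webinar points at can be shown with a toy sketch. The Python below (all source names and fields invented, not Denodo's API) exposes one logical view over two live sources without copying the data first, in contrast with ETL, which physically moves and stores it:

```python
# Toy illustration of data virtualization: one on-demand logical view
# over two live sources, computed at query time rather than copied into
# a warehouse. Sources, fields and values are invented for illustration.

crm = [{"id": 1, "name": "Acme"}, {"id": 2, "name": "Globex"}]
billing = {1: 1200, 2: 340}

def customer_view():
    """Virtual view: evaluated lazily, always reflecting source state."""
    for row in crm:
        yield {"name": row["name"], "spend": billing.get(row["id"], 0)}

print(list(customer_view()))
# [{'name': 'Acme', 'spend': 1200}, {'name': 'Globex', 'spend': 340}]
```

Because the view is computed on demand, a change in either source is visible on the next query with no reload step, which is the agility argument the session makes.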
Similar to Gartner Positions Data Flux In The Leaders Quadrant Of The Magic Quadrant For Data Quality Tools, 2008 (20)
Idc Worldwide Business Intelligence Tools 2008 Vendor Shares (Cezar Cursaru)
The document provides an excerpt from an IDC report on the worldwide business intelligence tools market in 2008.
It discusses key trends in the market including continued spending by organizations of all sizes, stronger growth in advanced analytics vs query/reporting, and the evolving definition of BI software. It provides revenue and market share data for 2008 and growth rates for 2006-2008 for major vendors like SAP, SAS, IBM, Oracle and Microsoft. Specialty vendors still occupy 30% of the market.
This document summarizes a business case for strategy management solutions. It finds that strategy management solutions are a priority investment that can help companies effectively define objectives, associated indicators, and the strategies required to meet objectives. It lists benefits for both business and IT processes. The document then provides context by distinguishing strategy management solutions from other types of solutions like investment portfolio management, data integration, risk management, business intelligence, and predictive analysis.
Business Performance Solutions Clash Of The Titans The Market Remains Vibrant... (Cezar Cursaru)
This document provides an executive summary of a Forrester Research report on the business performance solutions (BPS) software market. It finds that the BPS market has seen significant growth and vendor consolidation in recent years. Forrester expects BPS software revenues to grow 12.7% through 2012 to $3.2 billion, despite temporary slowing due to the recession. The market is dominated by six large vendors, but also includes smaller BI, ERP, and pure-play BPS vendors. The report provides an overview of the BPS software category and functional elements.
This document summarizes key points about credit card issuer fraud management from a report by Mercator Advisory Group. It finds that while direct credit card fraud losses for issuers have remained relatively stable at around $1 billion annually, total costs of fraud related to credit cards in the US may exceed $16 billion once indirect costs are accounted for. Purposeful data breaches that steal payment card information present ongoing challenges as they fuel a black market for stolen card data. While enterprise fraud management solutions aim to address fraud across multiple products, organizational barriers remain for issuers in implementing more holistic fraud prevention strategies.
SAS provides business intelligence software and services to help customers make better, faster decisions. Their solutions include the SAS Enterprise Intelligence Platform for data integration, management, and analysis, as well as industry and line-of-business specific intelligence solutions. SAS aims to give customers the power to access relevant information throughout the enterprise and improve sustainable performance. Independent analysts recognize SAS as a leader in the business intelligence market.
Datamonitor Decision Matrix Selecting A Business Intelligence Vendor (Competi... (Cezar Cursaru)
SAS is the clear market leader as it leads technology assessment, dominates user sentiment and exerts considerable market impact. SAS offers a great portfolio of both basic and advanced functionality, backed up by a dependable support capability. Its stable financial footing, superb vision and lead in advanced analytics all imply that SAS is well placed to continue as the Business Intelligence market leader.
The Forrester Wave Enterprise Business Intelligence Platforms, Q3 2008 (Cezar Cursaru)
SAS was among the select companies that Forrester invited to participate in its 2008 Forrester Wave report, The Forrester Wave: Enterprise Business Intelligence Platforms, Q3 2008. In this evaluation, SAS was cited as a leader in Enterprise Business Intelligence Platforms.
- The worldwide data warehouse platform software market grew 14.6% in 2007 to reach $6.7 billion.
- The largest vendors in 2007 were Oracle, IBM, and Microsoft, who combined held over 65% of the market.
- The market is split between data warehouse generation tools and data warehouse management systems, with management systems making up the majority at around 77% of revenues.
Gartner Positions SAS In The Leaders Quadrant Of The Magic Quadrant For Custo... (Cezar Cursaru)
SAS and SPSS remain the leading vendors in the customer data-mining application market. ThinkAnalytics emerges as a visionary vendor focused on embedding predictive analytics into operational deployments. The market definition describes customer data mining as the application of descriptive and predictive analytics to support marketing, sales or service functions through packaged applications or on-demand services. The evaluation criteria assess vendors on their ability to execute, including the functionality of their products, and completeness of vision, such as their market understanding, marketing strategy, and innovation.
Butler Group Technology Audit Sas For Customer Experience Analytics, Septembe... (Cezar Cursaru)
SAS® for Customer Experience Analytics combines dynamic real-time on-line data collection from speed-trap, with SAS' customer intelligence and analytics capabilities. The combination enables organisations that operate in a multi-channel world to understand precisely what customers are doing on-line, and apply that insight across all other customer touch points. SAS for Customer Experience Analytics does away with the need for tagging, network packet sniffers, and Web log analysers, therefore removing some of the expensive pain points of Web analytics.
1. Credit card issuer fraud losses remain well-contained at around $1 billion annually despite rising volumes, though total card-related fraud costs may exceed $16 billion due to additional stakeholders like merchants and consumers.
2. Purposeful data breaches targeting payment card data are a major challenge, driving a thriving secondary market in stolen card information and products.
3. Enterprise fraud management solutions aiming to leverage data across multiple products may provide improved detection, but organizational barriers remain as issuers seek added value from multi-product implementations.
Yphise Software Product Assessment, Strategy Management, December 2008 (Cezar Cursaru)
SAS® Strategic Performance Management demonstrates its commitment to integrating nonfinancial and intangible objectives into enterprise performance management. SAS SPM provides an easy-to-use and easy-to-customize strategy management model that helps manage any kind of objectives. SAS also provides predefined templates according to compliance issues, risks, vertical sectors or business departments; these accelerate strategy definition.
SAS ranks first in two categories of Chartis RiskTech 100 report (Cezar Cursaru)
SAS is a leading provider of risk management software and services. It has established leadership positions in credit risk management and operational risk management. SAS' core strengths lie in its powerful data management, analytics and business intelligence capabilities. It has won RiskTech 100 category awards for Core Technology and Operational Risk & GRC. SAS has a global presence and strong financials, though its implementation capacity has at times constrained its growth. It continues to enhance its solutions and expand into new risk categories and industries.
Driving Business Innovation: Latest Generative AI Advancements & Success Story (Safe Software)
Are you ready to revolutionize how you handle data? Join us for a webinar where we’ll bring you up to speed with the latest advancements in Generative AI technology and discover how leveraging FME with tools from giants like Google Gemini, Amazon, and Microsoft OpenAI can supercharge your workflow efficiency.
During the hour, we’ll take you through:
Guest Speaker Segment with Hannah Barrington: Dive into the world of dynamic real estate marketing with Hannah, the Marketing Manager at Workspace Group. Hear firsthand how their team generates engaging descriptions for thousands of office units by integrating diverse data sources—from PDF floorplans to web pages—using FME transformers, like OpenAIVisionConnector and AnthropicVisionConnector. This use case will show you how GenAI can streamline content creation for marketing across the board.
Ollama Use Case: Learn how Scenario Specialist Dmitri Bagh has utilized Ollama within FME to input data, create custom models, and enhance security protocols. This segment will include demos to illustrate the full capabilities of FME in AI-driven processes.
Custom AI Models: Discover how to leverage FME to build personalized AI models using your data. Whether it’s populating a model with local data for added security or integrating public AI tools, find out how FME facilitates a versatile and secure approach to AI.
We’ll wrap up with a live Q&A session where you can engage with our experts on your specific use cases, and learn more about optimizing your data workflows with AI.
This webinar is ideal for professionals seeking to harness the power of AI within their data management systems while ensuring high levels of customization and security. Whether you're a novice or an expert, gain actionable insights and strategies to elevate your data processes. Join us to see how FME and AI can revolutionize how you work with data!
Main news related to the CCS TSI 2023 (2023/1695) (Jakub Marek)
An English 🇬🇧 translation of the presentation accompanying the speech I gave about the main changes brought by CCS TSI 2023 at the biggest Czech conference on communications and signalling systems on railways, held at the Clarion Hotel Olomouc from 7th to 9th November 2023 (konferenceszt.cz). It was attended by around 500 participants and 200 online followers.
The original Czech 🇨🇿 version of the presentation can be found here: https://www.slideshare.net/slideshow/hlavni-novinky-souvisejici-s-ccs-tsi-2023-2023-1695/269688092 .
The videorecording (in Czech) from the presentation is available here: https://youtu.be/WzjJWm4IyPk?si=SImb06tuXGb30BEH .
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdf (Malak Abu Hammad)
Discover how MongoDB Atlas and vector search technology can revolutionize your application's search capabilities. This comprehensive presentation covers:
* What is Vector Search?
* Importance and benefits of vector search
* Practical use cases across various industries
* Step-by-step implementation guide
* Live demos with code snippets
* Enhancing LLM capabilities with vector search
* Best practices and optimization strategies
Perfect for developers, AI enthusiasts, and tech leaders. Learn how to leverage MongoDB Atlas to deliver highly relevant, context-aware search results, transforming your data retrieval process. Stay ahead in tech innovation and maximize the potential of your applications.
#MongoDB #VectorSearch #AI #SemanticSearch #TechInnovation #DataScience #LLM #MachineLearning #SearchTechnology
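The idea underlying vector search can be illustrated independently of MongoDB Atlas. The Python below is a generic toy sketch (not the Atlas `$vectorSearch` API): it ranks documents by cosine similarity between embedding vectors, where the document names and vector values are invented and a real system would use learned embeddings plus an approximate nearest-neighbor index instead of a brute-force scan.

```python
# Minimal illustration of vector search: rank documents by cosine
# similarity between embedding vectors. Vectors here are toy values;
# real systems use model-generated embeddings and an ANN index.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

docs = {
    "returns policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.2],
    "refund process": [0.8, 0.2, 0.1],
}

def search(query_vec, k=2):
    """Return the k documents whose vectors best match the query."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:k]

print(search([1.0, 0.0, 0.0]))  # ['returns policy', 'refund process']
```

This is why vector search returns "context-aware" results: the two refund-related documents score close together even though they share no keywords.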
leewayhertz.com-AI in predictive maintenance Use cases technologies benefits ... (alexjohnson7307)
Predictive maintenance is a proactive approach that anticipates equipment failures before they happen. At the forefront of this innovative strategy is Artificial Intelligence (AI), which brings unprecedented precision and efficiency. AI in predictive maintenance is transforming industries by reducing downtime, minimizing costs, and enhancing productivity.
Trusted Execution Environment for Decentralized Process Mining (LucaBarbaro3)
Presentation of the paper "Trusted Execution Environment for Decentralized Process Mining" given during the CAiSE 2024 Conference in Cyprus on June 7, 2024.
Letter and Document Automation for Bonterra Impact Management (fka Social Sol... (Jeffrey Haguewood)
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on automated letter generation for Bonterra Impact Management using Google Workspace or Microsoft 365.
Interested in deploying letter generation automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
Building Production Ready Search Pipelines with Spark and Milvus (Zilliz)
Spark is a widely used ETL tool for processing, indexing and ingesting data into the serving stack for search. Milvus is a production-ready open-source vector database. In this talk we will show how to use Spark to process unstructured data, extract vector representations, and push the vectors to the Milvus vector database for search serving.
Introduction of Cybersecurity with OSS at Code Europe 2024 (Hiroshi SHIBATA)
I develop the Ruby programming language, RubyGems, and Bundler, which are package managers for Ruby. Today, I will introduce how to enhance the security of your application using open-source software (OSS) examples from Ruby and RubyGems.
The first topic is CVE (Common Vulnerabilities and Exposures). I have published CVEs many times. But what exactly is a CVE? I'll provide a basic understanding of CVEs and explain how to detect and handle vulnerabilities in OSS.
Next, let's discuss package managers. Package managers play a critical role in the OSS ecosystem. I'll explain how to manage library dependencies in your application.
I'll share insights into how the Ruby and RubyGems core team works to keep our ecosystem safe. By the end of this talk, you'll have a better understanding of how to safeguard your code.
Programming Foundation Models with DSPy - Meetup Slides (Zilliz)
Prompting language models is hard, while programming language models is easy. In this talk, I will discuss the state-of-the-art framework DSPy for programming foundation models with its powerful optimizers and runtime constraint system.
In the rapidly evolving landscape of technologies, XML continues to play a vital role in structuring, storing, and transporting data across diverse systems. The recent advancements in artificial intelligence (AI) present new methodologies for enhancing XML development workflows, introducing efficiency, automation, and intelligent capabilities. This presentation will outline the scope and perspective of utilizing AI in XML development. The potential benefits and the possible pitfalls will be highlighted, providing a balanced view of the subject.
We will explore the capabilities of AI in understanding XML markup languages and autonomously creating structured XML content. Additionally, we will examine the capacity of AI to enrich plain text with appropriate XML markup. Practical examples and methodological guidelines will be provided to elucidate how AI can be effectively prompted to interpret and generate accurate XML markup.
Further emphasis will be placed on the role of AI in developing XSLT, or schemas such as XSD and Schematron. We will address the techniques and strategies adopted to create prompts for generating code, explaining code, or refactoring the code, and the results achieved.
The discussion will extend to how AI can be used to transform XML content. In particular, the focus will be on the use of AI XPath extension functions in XSLT, Schematron, Schematron Quick Fixes, or for XML content refactoring.
The presentation aims to deliver a comprehensive overview of AI usage in XML development, providing attendees with the necessary knowledge to make informed decisions. Whether you’re at the early stages of adopting AI or considering integrating it in advanced XML development, this presentation will cover all levels of expertise.
By highlighting the potential advantages and challenges of integrating AI with XML development tools and languages, the presentation seeks to inspire thoughtful conversation around the future of XML development. We’ll not only delve into the technical aspects of AI-powered XML development but also discuss practical implications and possible future directions.
Let's Integrate MuleSoft RPA, COMPOSER, APM with AWS IDP along with Slack (shyamraj55)
Discover the seamless integration of RPA (Robotic Process Automation), COMPOSER, and APM with AWS IDP enhanced with Slack notifications. Explore how these technologies converge to streamline workflows, optimize performance, and ensure secure access, all while leveraging the power of AWS IDP and real-time communication via Slack notifications.
Your One-Stop Shop for Python Success: Top 10 US Python Development Providers (akankshawande)
Simplify your search for a reliable Python development partner! This list presents the top 10 trusted US providers offering comprehensive Python development services, ensuring your project's success from conception to completion.
Ivanti’s Patch Tuesday breakdown goes beyond patching your applications and brings you the intelligence and guidance needed to prioritize where to focus your attention first. Catch early analysis on our Ivanti blog, then join industry expert Chris Goettl for the Patch Tuesday Webinar Event. There we’ll do a deep dive into each of the bulletins and give guidance on the risks associated with the newly-identified vulnerabilities.
Deep Dive: Getting Funded with Jason (Jason Lemkin, Founder & CEO @ SaaStr)
Gartner Positions Data Flux In The Leaders Quadrant Of The Magic Quadrant For Data Quality Tools, 2008
1. Magic Quadrant for Data Quality Tools
Gartner RAS Core Research Note G00157464, Ted Friedman, Andreas Bitterer, 4 June 2008, R2756 06052009
Growth, innovation and volatility (via mergers and acquisitions)
continue to shape the market for data quality tools. Investment
on the part of buyers and vendors is increasing as organizations
recognize the value of these tools in master data management
and information governance initiatives.
WHAT YOU NEED TO KNOW
The market for data quality tools continues to enjoy significant growth, but experiences
ongoing volatility in the form of acquisitions (both direct acquisitions of stand-alone vendors in
this market, as well as the acquisition of larger vendors for which this market represents one
of many competitive fronts). Most vendors have evolved to full-function data quality tool suites
that address a broad range of data quality requirements. This is a clear indication of the
blending of data profiling, data-cleansing operations and domain-specific management.
Specialist vendors, with a focus on a single functional competence, provide narrow
functionality at a lower cost but are increasingly pressured to expand capabilities as more
consolidation occurs. A macro trend of convergence of the data quality tools market and the
related market for data integration tools continues, as organizations recognize that data
integration activities must provide more than simple data delivery – they must also ensure
the quality of the data being delivered, which enhances the value of data integration
investments.
When evaluating offerings in this market, organizations must consider the breadth of
functional capabilities (for example, data profiling, parsing, standardization, matching,
monitoring and enrichment) relative to their requirements. Other key criteria include the degree
of integration of these capabilities into a single architecture and product – specifically,
integration at the metadata level, for example, a single unified metadata repository or the
ability to apply findings from one toolset to create inference outcomes in another. Finally,
consider nontechnology characteristics, such as the availability of preferable deployment and
pricing models, and the size, viability and partnerships of the vendors.
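The functional categories listed above can be made concrete with a toy example. The sketch below is plain Python with invented field names and rules (it is not any vendor's API, and real tools apply far richer, configurable logic): it profiles a small record set, standardizes one field, and performs a naive match on the standardized key.

```python
# Toy illustration of three core data quality functions: profiling,
# standardization and matching. All rules and field names are invented.

def profile(records, field):
    """Profiling: capture simple statistics (metadata) about one field."""
    values = [r.get(field) for r in records]
    filled = [v for v in values if v]
    return {
        "count": len(values),
        "missing": len(values) - len(filled),
        "distinct": len(set(filled)),
    }

def standardize_phone(value):
    """Standardization: normalize phone numbers to a consistent layout."""
    return "".join(ch for ch in value if ch.isdigit())

def match(records, key_fn):
    """Matching: group records that share a standardized key."""
    groups = {}
    for r in records:
        groups.setdefault(key_fn(r), []).append(r)
    return [g for g in groups.values() if len(g) > 1]

records = [
    {"name": "Ann Lee", "phone": "(919) 555-0100"},
    {"name": "Ann  Lee", "phone": "919-555-0100"},
    {"name": "Bo Chen", "phone": ""},
]

print(profile(records, "phone"))  # {'count': 3, 'missing': 1, 'distinct': 2}
dupes = match(records, lambda r: standardize_phone(r["phone"]))
print(len(dupes))                 # 1 duplicate group (the two Ann Lee rows)
```

Note how matching only finds the duplicate pair because standardization ran first – a small instance of the "integration of these capabilities" criterion, where the output of one function feeds the next.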
MAGIC QUADRANT
Market Overview
Organizations of all sizes and in all industries are recognizing the importance of high-quality
data and the critical role of data quality in information governance and stewardship driven by
broader enterprise information management initiatives. As a result, their interest in the role of
tools and technology for data quality improvement continues to grow. Fueled by a market of
purpose-built, packaged tools for addressing various dimensions of the data quality discipline,
data quality functionality is readily available from a variety of providers, both large and small.
Data quality functionality is also being recognized as a fundamental component of offerings in
many related software markets, such as data integration tools, master data management
(MDM) solutions and business intelligence (BI) platforms.
3.
In time, vendors that do not act on these trends – and instead continue to focus solely on customer data and traditional approaches to data quality improvement – will fall behind their competition and lose market share.

The market for data quality tools is modest in size (approximately $365 million in software revenue) but will grow at a compound annual rate of 17% or more between 2006 and 2011, which is stronger than the growth of many other software markets. Much of the innovation continues to come from outside the United States. As a result, the veteran data quality tool vendors are being challenged by entrants that have a more significant international focus. Many new entrants focus on domain-agnostic data quality services (stand-alone or embedded in applications), based on a centrally managed set of business rules. However, with the increasing trend toward embedding data quality capabilities in business applications, data integration tools and other software offerings from larger vendors, these small competitors will face significant challenges as they attempt to survive and grow. Also, acquisition activity in this and related markets continues to change the competitive landscape for data quality tools. For example, SAP’s acquisition of Business Objects brings significant data quality tools into the SAP portfolio for the first time, while Informatica’s acquisition of Identity Systems continues the trend of small data quality specialists being subsumed into the portfolios of larger players in this market.

Market Definition/Description
The data quality tools market comprises vendors that offer stand-alone software products for addressing the core functional requirements of the data quality discipline:
• Profiling: Analysis of data to capture statistics (metadata) that provide insight into the quality of the data and aid in the identification of data quality issues.
• Parsing and standardization: Decomposition of text fields into component parts and formatting of values into consistent layouts based on industry standards, local standards (for example, postal authority standards for address data), user-defined business rules and knowledge bases of values and patterns.
• Generalized “cleansing”: Modification of data values to meet domain restrictions, integrity constraints or other business rules that define sufficient data quality for the organization.
• Matching: Identification, linking or merging related entries within or across sets of data.
• Monitoring: Deployment of controls to ensure ongoing conformance of data to business rules that define data quality for the organization.
• Enrichment: Enhancing the value of internally held data by appending related attributes from external sources (for example, consumer demographic attributes or geographic descriptors).

In addition to these core functions, tools in this market provide:
• Connectivity/adapters – Ability to interact with a range of different data structure types.
• Subject-area-specific support – Standardization capabilities for specific data subject areas.
• International support – Relevance for data quality operations on a global basis.
• Metadata management – Ability to capture, reconcile and interoperate metadata related to the data quality process.
• Configuration environment – Capabilities for creation, management and deployment of data quality rules.
• Operations and administration – Facilities for supporting, managing and controlling data quality processes.
• Service-enablement – Service-oriented characteristics and support for SOA deployments.

The tools provided by vendors in this market are generally consumed by technology users for internal deployment in their IT infrastructure, although hosted data quality solutions are continuing to evolve and grow in popularity.

Inclusion and Exclusion Criteria
For vendors to be included in the Magic Quadrant, they must meet the following criteria:
• Offer stand-alone (not only embedded in, or dependent on, other products and services) packaged tools that are positioned, marketed and sold specifically for data quality applications.
• Deliver functionality that addresses, at a minimum, profiling, parsing, standardization, cleansing and matching. Vendors offering only narrow functionality (for example, only address cleansing and validation or only matching) are excluded because they do not provide complete data quality tool suites.
• Support this functionality for data in more than one language and specific to more than one country (in the case of address standardization).
• Maintain an installed base of at least 50 production customers for their data quality products.
• Demonstrate, via customer references, use of the tools at an enterprise (cross-departmental or multiapplication) level.

A vendor that does not meet the above criteria may be considered for inclusion if it is a new entrant that is demonstrably different from established vendors and represents a future direction for data quality tools.

There are many data quality tools vendors but most do not meet the above criteria and are, therefore, not included in the Magic Quadrant. Many vendors provide products that address one very specific data quality problem, such as address cleansing and validation, but cannot support other types of applications, or lack the full breadth of functionality expected in today’s data quality solutions. Others provide a range of functionality, but operate only
In addition, these products provide a range of related functional
in a single country or support only narrow, departmental
capabilities that are not unique to this market but are required for
implementations. Others may meet all the functional, deployment
executing many of the data quality core functions, or for specific
and geographic requirements but are at a very early stage in their
data quality applications:
4. 4
“life span” and, therefore, have few, if any, production customers. • InQuera, Tefen, Israel, www.inquera.com – specializes in
The following vendors may be considered by Gartner clients alongside those appearing in the Magic Quadrant when deployment needs are aligned with their specific capabilities, or are newer entrants beginning to gain visibility in the market but lacking a significant customer base:

• AddressDoctor, Maxdorf, Germany, www.addressdoctor.com – specializes in international address standardization and validation, supporting 240 countries and territories.

• AMB Dataminers, Chicago, Illinois, www.payasyougodataquality.com – provides profiling, standardization and cleansing functionality for deployment in Windows environments.

• Anchor Software, Plano, Texas, www.anchorcomputersoftware.com – provides a range of data quality utilities supporting common customer list management operations such as file splitting, deduplication and suppression.

• BackOffice Associates, Harwich, Massachusetts, www.boaweb.com – offers services and technology for governance of master data within SAP applications.

• BCC Software (a division of Bowe Bell + Howell), Rochester, New York, www.bccsoftware.com – provides a range of data quality utilities supporting common customer list management operations such as address validation, change of address, deduplication and suppression.

• Business Data Quality, London, U.K., www.businessdataquality.com – offers products focused on data profiling (BDQ Analysis) and data quality monitoring (BDQ Monitor).

• Certica Solutions, Wakefield, Massachusetts, www.certicasolutions.com – provides products focused on validating data against predefined data quality rules.

• Ciant, Richardson, Texas, www.ciant.com – provides parsing, standardization and matching functionality for customer data, in support of sales and marketing analytics.

• Datras, Munich, Germany, www.datras.de – focuses on the German-speaking markets, providing profiling, standardization and monitoring capabilities.

• Datiris, Lakewood, Colorado, www.datiris.com – provides various data profiling techniques for a range of data sources.

• DQ Global, Fareham, U.K., www.dqglobal.com – provides matching, deduplication and international address standardization and validation functionality.

• FinScore, Renens, Switzerland, www.finscore.com – offers technology for measuring data quality and presenting metrics in a dashboard form.

• helpIT systems, Surrey, U.K., www.helpit.com – provides data quality tools oriented toward customer matching, deduplication and suppression operations.

• Infogix, Naperville, Illinois, www.infogix.com – provides controls-based technology for auditing and validating the integrity of data within and across systems.

• Infosolve Technologies, South Brunswick, New Jersey, www.infosolvetech.com – provides open-source tools (with required service contract) that support profiling, standardization, matching and deduplication operations.

• InQuera, Tefen, Israel, www.inquera.com – specializes in technology for standardization, matching and deduplication, with a specific focus on product data.

• Intelligent Search Technology, White Plains, New York, www.intelligentsearch.com – develops products for profiling, matching, deduplication and U.S. address correction.

• Ixsight, Mumbai, India, www.ixsight.com – offers services for data quality audits, along with products for standardization and deduplication.

• Melissa Data, Rancho Santa Margarita, California, www.melissadata.com – supports standardization of names, addresses and phone numbers, and validation of addresses and phone numbers (both via on-premises software and hosted Web services).

• Omikron, Pforzheim, Germany, global.omikron.net – provides products for standardization and deduplication of customer name and address data.

• QAS (a subsidiary of Experian), London, U.K., www.qas.com – offers global name and address standardization, validation and matching/deduplication functionality.

• Silver Creek Systems, Louisville, Colorado, www.silvercreeksystems.com – provides parsing, standardization and matching functionality, with a focus on product data applications.

• Spad, Paris, France, eng.spadsoft.com – offers a suite of data quality products for data profiling, monitoring and standardization.

• SQL Power, Toronto, Canada, www.sqlpower.ca – provides open-source tools supporting standardization, address validation and deduplication.

• SRC, Orange, California, www.extendthereach.com – provides data cleansing in the context of business intelligence applications with a geographic orientation.

• Stalworth, San Mateo, California, www.stalworth.com – offers a platform for standardization and cleansing of customer data, including international address validation.

• TIQ Solutions, Leipzig, Germany, www.tiq-solutions.de – provides data profiling and data quality dashboards, with a focus on banking, insurance and distribution verticals.

• Utopia, Mundelein, Illinois, www.utopiainc.com – offers services and technology for data quality analysis and data standardization, with a focus on product master data.

• Veda Advantage, Sydney, Australia, www.vedaadvantage.com – provides software to cleanse and update customer addresses, add phone numbers, merge databases into a single customer view and append segmentation data.

• WinPure, Reading, U.K., www.winpure.com – offers low-cost data cleansing, matching and data deduplication software on the Windows platform.

• Zoomix, Jerusalem, Israel, www.zoomix.com – delivers technology for adaptive matching and standardization, with a focus on product data.

Gartner will continue to monitor the status of these vendors for possible inclusion in future updates of the Magic Quadrant for Data Quality Tools.
Dropped

• Fuzzy Informatik – This vendor was acquired by Business Objects in 2007 and no longer exists as an independent entity.

Evaluation Criteria

Ability to Execute

We evaluate vendors' ability to execute in the data quality tools market by using the following criteria:

• Product/Service: How well the vendor supports the range of data quality functionality required by the market, the manner (architecture) in which this functionality is delivered and the overall usability of the tools. Product capabilities are critical to the success of data quality tool deployments and, therefore, receive a high weighting.

• Overall Viability: The magnitude of the vendor's financial resources and the strength of its people and organizational structure.

• Sales Execution/Pricing: The effectiveness of the vendor's pricing model and the effectiveness of its direct and indirect sales channels.

• Market Responsiveness and Track Record: The degree to which the vendor has demonstrated the ability to respond successfully to market demand for data quality capabilities over an extended period.

• Marketing Execution: The overall effectiveness of the vendor's marketing efforts, and the degree of "mind share," market share and account penetration the vendor has achieved as a result.

• Customer Experience: The quality of the vendor's general customer service, implementation service and technical support, and customers' perception of overall value.

Completeness of Vision

We assess vendors' completeness of vision for the data quality tools market by using the following criteria:

• Market Understanding: The degree to which the vendor leads the market in new directions (technology, product, services or otherwise) and its ability to adapt to significant market changes and disruptions. Given the dynamic nature of this market, this item receives a high weighting.

• Marketing Strategy: The degree to which the vendor's marketing approach aligns with and/or exploits emerging trends and the overall direction of the market.

• Sales Strategy: The alignment of the vendor's sales model to the way customers' preferred buying approaches will evolve over time.

• Offering (Product) Strategy: The degree to which the vendor's product road map reflects demand trends in the market and fills current gaps or weaknesses. We also consider the strength of the vendor's strategy regarding delivery models of different types.

• Business Model: The overall approach the vendor takes to execute its strategy for the data quality market. With a reasonably high degree of similarity across the vendors in this market, this item receives a low weighting.

• Vertical/Industry Strategy: The level of emphasis the vendor places on vertical solutions, and the vendor's depth of vertical expertise. Given the broad cross-industry nature of the data quality discipline, vertical strategies are less critical and, therefore, this item receives a low weighting.

• Innovation: The degree to which the vendor has demonstrated a willingness to make new investments to support the strategy and enhance product capabilities, the level of investment in R&D directed toward development of the tools and the extent to which the vendor demonstrates creative energy. With rapidly evolving technology requirements – in the face of trends such as SOA – and increased competition in the market from large vendors, this item receives a high weighting.

• Geographic Strategy: The global presence of the vendor and the manner in which it is achieved (for example, direct local presence, resellers and distributors) in light of the desire of multinational enterprises to exploit common tools worldwide.

Table 1. Ability to Execute Evaluation Criteria

Evaluation Criteria: Weighting
Product/Service: high
Overall Viability (Business Unit, Financial, Strategy, Organization): standard
Sales Execution/Pricing: standard
Market Responsiveness and Track Record: standard
Marketing Execution: standard
Customer Experience: standard
Operations: no rating

Source: Gartner

Table 2. Completeness of Vision Evaluation Criteria

Evaluation Criteria: Weighting
Market Understanding: high
Marketing Strategy: standard
Sales Strategy: standard
Offering (Product) Strategy: standard
Business Model: low
Vertical/Industry Strategy: low
Innovation: high
Geographic Strategy: standard

Source: Gartner
Leaders

Leaders in the market demonstrate strength across a complete range of data quality functionality, including profiling, parsing, standardization, matching, validation and enrichment. They exhibit a clear understanding and vision of where the market is headed, including recognition of noncustomer data quality issues and the delivery of enterprise-level data quality implementations. Leaders have an established market presence, significant size and a multinational presence (directly or as a result of a parent company).

Challengers

Challengers in the market provide strong product capabilities but may not have the same breadth of offering as Leaders. For example, they may lack several functional capabilities of a complete data quality solution. Challengers have an established presence, credibility and viability, but may demonstrate strength only in a specific domain (for example, only customer name and address cleansing) and/or may not demonstrate a significant degree of thought leadership and innovation.

Visionaries

Visionaries in the market demonstrate a strong understanding of current and future market trends and directions, such as the importance of ongoing monitoring of data quality, engagement of business subject matter experts and delivery of data quality services. They exhibit capabilities aligned with these trends, but may lack the market presence, brand recognition, customer base and resources of larger vendors.

Niche Players

Niche Players often have limited breadth of functional capabilities and may lack strength in rapidly evolving functional areas such as data profiling and international support. In addition, they may focus solely on a specific market segment (such as midsize businesses), limited geographic areas or a single domain (such as customer data), as opposed to positioning toward broader use. Niche Players may have good functional breadth but may have an early-stage presence in the market, with a small customer base and limited resources. Niche Players that specialize in a particular geographic area or data domain may have very strong offerings for their chosen focus area and deliver substantial value for their customers in that segment.

Vendor Strengths and Cautions

Business Objects

Strengths

• Business Objects, an SAP company, has a substantial BI platform market presence and a large base of data quality tools customers (the overwhelming majority of which are in North America and were obtained through its 2006 acquisition of Firstlogic). This creates significant cross-sell opportunities for the vendor to increase its data quality tools business. As a part of SAP, the vendor's growth prospects are further expanded via access to the global SAP applications customer base, where data quality challenges are prevalent. In addition, Business Objects' data quality tools will be complementary to SAP MDM, which has been lacking rich data quality functionality.

• Business Objects provides good breadth of functional data quality capabilities, including data profiling (via Data Insight XI) and common data-cleansing operations (via Data Quality XI). The core data quality functionality in Data Quality XI enables the delivery of data quality services in an SOA context, and will be used in the Business Objects Data Services product (which combines data integration and data quality functionality). Business Objects has made progress in the market in the past 12 months, actively selling Data Quality XI alongside the Data Integrator extraction, transformation and loading (ETL) tool.

• Business Objects' strength remains very much in applications of customer data quality, specifically in matching/linking, deduplication, and name and address standardization and validation. The technology is proven for applications of this type and such implementations represent most of the installed base. The acquisition of Fuzzy Informatik in 2007 has provided stronger and additional name and address standardization capabilities and content for Europe, the Middle East and Africa, with a specific emphasis on German-speaking and Eastern European countries.

Cautions

• Very few customer references report using the technology in data domains beyond customer data (and similar "party"-oriented subject areas such as supplier or employee). While this is because of historical optimization of the technology for customer data, the delivery of Universal Data Cleanse (UDC) in 2007 enabled broader use. However, UDC is still new and production implementations remain scarce, and it also represents an additional cost to customers beyond the base Data Quality XI functionality.

• Data profiling remains an area of relative weakness for Business Objects, with the Data Insight product continuing to see slow market adoption and customer references reporting limited use and mixed results. Specifically, reliability of IQ Insight and integration with Data Quality XI (in terms of ease of converting profiling results into rules for data cleansing and monitoring) represent an opportunity for improvement.

• The acquisition of Business Objects by SAP brings both opportunities and risks for the market presence of Business Objects' data quality tools. This technology was not a major factor in SAP's acquisition strategy, and SAP's long-term plans and product road map for the tools, including potential bundling, packaging and pricing with SAP products, are not yet finalized. The vendor must decide how to address product overlaps, such as the matching functionality from Business Objects and the matching functionality in SAP MDM. SAP should focus on using Business Objects' data quality technology to enhance the value of the NetWeaver platform and SAP applications, as well as on growing a stand-alone data quality tools business. The vendor must continue to clarify for its customers its product plans and strategic direction for data quality.
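The matching/linking and deduplication capability that dominates these vendors' installed bases usually rests on fuzzy comparison of name and address fields. The sketch below is a deliberately simplified illustration: the similarity measure (Python's difflib ratio) and the 0.85 threshold are arbitrary choices for the example, not any vendor's algorithm, and real products add blocking, phonetic keys and survivorship rules on top.

```python
# Simplified illustration of record matching/deduplication using fuzzy
# string similarity. Threshold, fields and sample data are example choices.
from difflib import SequenceMatcher

def similarity(a, b):
    """Normalized similarity between two strings, from 0.0 to 1.0."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def is_match(rec_a, rec_b, threshold=0.85):
    """Treat two records as the same entity if both name and address
    are sufficiently similar."""
    return (similarity(rec_a["name"], rec_b["name"]) >= threshold and
            similarity(rec_a["address"], rec_b["address"]) >= threshold)

def deduplicate(records):
    """Keep the first record of every matched group (naive O(n^2) scan)."""
    unique = []
    for rec in records:
        if not any(is_match(rec, kept) for kept in unique):
            unique.append(rec)
    return unique

customers = [
    {"name": "John Smith",  "address": "12 High Street, London"},
    {"name": "Jon Smith",   "address": "12 High St, London"},
    {"name": "Maria Gomez", "address": "4 Park Lane, Leeds"},
]
print(len(deduplicate(customers)))  # 2: the two Smith records are merged
```

The quality differences the report attributes to vendors' matching engines come from exactly these choices: how similarity is computed, how thresholds are tuned per field, and how candidate pairs are selected at scale.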
Datactics

Strengths

• Datactics is a small data quality vendor based in Belfast, Northern Ireland, and operates primarily in Europe, but there are a number of value-added resellers (VARs) in the Americas and Asia. Its software is used in a range of subject areas, not limited to typical name/address verification scenarios. Many references report use of the software beyond cleansing of customer data. Its profiling capabilities are cited as particularly strong.

• The company's flagship product, DataTrawler, is fully 64-bit and Unicode-enabled, supports most European languages, runs on many platforms and supplies broad capabilities in profiling, matching/merging, cleansing and monitoring. Data quality scorecards can be constructed to monitor quality-related metrics. Most of Datactics' reference customers are small and midsize businesses, mainly in the supply chain sector, as well as government agencies.

• Datactics has partnerships with consultancies and system integrators (SIs) that have used the DataTrawler product in some strategic data quality programs – it is quick to implement at a reasonable cost. Datactics has also built an alliance with ETI, a data integration tools vendor, and other software companies that include DataTrawler services.

Cautions

• Datactics recently underwent some management changes, including the recruitment of a CEO, generating some uncertainty about potential changes in the vendor's strategy. However, the company successfully finished a funding round and is negotiating a second tranche.

• With only five sales employees, limited marketing budgets and relatively low-profile partnerships, Datactics is "flying underneath the radar" for most organizations looking for a provider of data quality tools. Datactics needs to build an even stronger OEM channel, with more visible independent software vendor (ISV) partners in the data management or BI markets.

• Although Datactics has signed up VARs in markets such as Brazil, Hong Kong and Turkey, there is no traction in those regions and all major sales or partnering opportunities remain mostly in English-speaking countries. A stronger ISV partner is required to take Datactics to new shores.

DataFlux

Strengths

• DataFlux has firmly established itself as a major brand in the market. It continues on its solid growth path, and has seen good traction as a multipurpose data quality platform beyond customer data and even as an enterprisewide standard in large accounts. The company reinvests one of the highest proportions of its revenue in R&D.

• To speed deployment, the vendor has successfully launched a set of "accelerators" (for example, Customer Data Analysis and Commodity Coding), and is praised by its customers for the ease of use of its tools, including for non-IT staff, and good performance, particularly in profiling and matching.

• The DataFlux platform includes profiling, matching, cleansing and monitoring capabilities in a single platform, supported by a shared metadata repository. DataFlux has leveraged its parent company SAS to expand its geographic presence and has good traction in Europe.

Cautions

• With the convergence of the data integration and data quality tools markets and the ongoing M&A activities in them, DataFlux needs to expand its portfolio and messaging beyond data quality. Despite its efforts in MDM, DataFlux has not been recognized beyond its status as a data quality technology provider, and its expansion into business process integration is still in its infancy.

• Despite the vendor's broad connectivity to commonly used data sources and applications, some customers struggle with the adapter licensing and integration of the platform into other environments, particularly SAP.

• Although DataFlux provides locale support for 36 countries and 18 languages, most customers report using the software in single-language English environments.

DataLever

Strengths

• DataLever focuses on the core requirements of data quality, providing integrated data-profiling and data-cleansing functionality in a single product. All operations can be readily deployed in both batch and real-time modes. The vendor has focused on delivering the fundamental capabilities required in virtually all data quality projects (such as parsing, standardization and cleansing) rather than attempting to expand the scope of the data quality discipline or innovate in new functional areas.

• DataLever takes a domain-agnostic view of data quality issues, enabling its technology to be applied in various data domains, including customer and product. While most of the installed base applies DataLever's technology to customer data quality issues, customer references reflect a solid percentage of implementations in other areas.

• Customers cite overall ease of use, relatively short implementation times and lower cost than alternative offerings as the main selling points of DataLever's products. Increasingly, strong performance in scenarios with large data volumes is helping DataLever to succeed in competitive situations. In addition, the lower complexity of the product enables its use by business subject matter experts in addition to IT personnel.

Cautions

• As one of the smaller and privately held providers in the market, DataLever supports a small customer base of approximately 150, with virtually no presence outside North America. Although it has wisely chosen to focus solely on its home region of North America early in its maturity, the vendor's relative weakness in
international support (the technology is not yet Unicode-compatible) will hinder its adoption by multinational enterprises and its growth in other regions. However, DataLever has begun to address this issue via a partnership that supplies international address standardization and validation functionality, and the vendor's product road map calls for delivery of Unicode support in 3Q08.

• DataLever has very limited runtime platform support (Windows and Linux only), although support for other platforms is planned for future releases. The vendor's lack of significant partnerships with SIs and complementary software vendors will limit its competitive strength – DataLever must begin to look beyond its own intellectual property and capabilities to improve its ability to execute.

• DataLever's technology has traditionally had its greatest adoption by midsize businesses. The vendor is increasingly attracting large enterprises, but these customers tend to deploy the technology within single projects or a limited set of projects rather than enterprisewide.

DataMentors

Strengths

• DataMentors specializes in customer data quality applications, providing matching, linking, standardization and cleansing operations via its DataFuse product, and data profiling capabilities via ValiData. Its partnership with smartFocus enables the vendor to offer campaign management, analytics and mapping capabilities (branded as DataMentors PinPoint). The vendor's roots are in database marketing, with the management team having been involved in large-scale applications of this type for more than 20 years.

• Customer references cite accuracy of matching, ease of use and attractive pricing relative to that of some of the more prominent vendors in the market as key strengths and reasons for their selection of DataMentors' technology. Forthcoming versions of DataFuse will introduce further advancements in ease of use, parallel processing and data quality monitoring functionality. While all the installed base is using the technology in the customer data domain, some customer references indicate use in the product data domain as well.

• The vendor's customer base reflects a higher percentage of hosted (SaaS) implementations than is seen for any other vendor in this market. DataMentors estimates that half its customers are using its technology in a hosted manner, and this is reflected in the vendor's customer references.

Cautions

• With a small installed base (approximately 70 customers, all in North America) and limited resources for marketing, DataMentors will be challenged to gain mind share in a market increasingly populated by much larger providers. Its recent certification for NetWeaver MDM will help, but DataMentors will need to establish additional partnerships to expand its presence and visibility.

• DataMentors' focus on marketing applications and customer data quality issues may place it at a competitive disadvantage when prospects have broader requirements, including quality issues in noncustomer data domains. However, although it is a minority of the usage, the vendor's customer references do reflect use of the technology in product data quality and financial data quality applications.

• From a product functionality perspective, DataMentors has weaknesses in runtime platform support (Windows is the only deployment option, although DataFuse can interact with applications and data sources on other platforms) and international capabilities because of lack of Unicode support. A new partnership with a provider of international address standardization and validation functionality represents a positive first step in making DataFuse suitable for use by global organizations.

Datanomic

Strengths

• Datanomic continues to establish itself in the European data quality tools market. The vendor has just passed the 100-customer mark; most of these customers are in the U.K., with some in mainland Europe and a few in North America and Asia. As a relatively new player, Datanomic has been able to build its dn:Director platform on modern technology, and with an attractive user interface, without any major legacy baggage.

• The new Web services generation capability enables dn:Director users to rapidly deploy data quality components, such as matching or cleansing, into SOA environments; the new Siebel connector benefits from this new function. Datanomic has also enhanced its real-time capabilities, added new data quality processors into the product and continued to improve the presentation functionality in its data quality dashboards.

• Datanomic has a strong focus on the financial industry, with a few clients in the telecommunications and public sectors. Datanomic products are domain-agnostic and not specifically targeted at customer data.

Cautions

• Datanomic has been unable to capitalize on the international reach of its SI partners and has virtually no visibility outside its home market in the U.K.

• While the dn:Director product is built on an SOA, and its database connectivity is expanding to cover access to Oracle, Microsoft, Sybase and others, native adapters for some major database management systems such as DB2 and Teradata are not available. Hardly any references report using the product outside customer name and address cleansing.

• Although the vendor maintains an alliance with Oracle, it does not participate in a sales and marketing "ecosystem" with a number of data integration or BI platform companies, thereby missing out on OEM and channel sales opportunities.
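Several profiles above describe delivering data quality functions as services in an SOA context (the dn:Director Web services capability, for example). The sketch below illustrates the general idea using only Python's standard library; the endpoint shape, field names and standardization rule are invented for the example and do not reflect any vendor's interface:

```python
# Minimal illustration of exposing a data quality operation (postcode
# standardization) as an HTTP service, in the spirit of the SOA-style
# "data quality services" described above. Endpoint and field names are
# hypothetical.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def standardize_postcode(raw):
    """Example standardization rule: strip spaces and uppercase."""
    return raw.replace(" ", "").upper()

class CleanseHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read a JSON record, standardize one field, return the record.
        length = int(self.headers.get("Content-Length", 0))
        record = json.loads(self.rfile.read(length))
        record["postcode"] = standardize_postcode(record.get("postcode", ""))
        body = json.dumps(record).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To serve: HTTPServer(("localhost", 8080), CleanseHandler).serve_forever()
# A caller would then POST {"postcode": "sw1a 1aa"} and receive the
# standardized record back.
```

Embedding the same standardization function in a batch job and behind a service endpoint is what lets a centrally managed rule set be reused across applications, which is the point of the service-enablement capability the report evaluates.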
Human Inference

Strengths

• Human Inference, based in Arnhem, the Netherlands, provides data quality solutions to large customers mostly in the European financial services, telecom and utilities industries, where it has some long-standing relationships with approximately 250 clients. New investors GIMV and Iris Capital will enable the company to extend its reach into other geographic regions.

• The HIquality components include technology for inspection and profiling, name and address cleansing, matching, merging and enrichment. One of the key differentiators for Human Inference is its maintenance of reference datasets, which are available for select countries and serve as a knowledgebase for names, addresses and other specific meanings from a variety of contexts. The vendor has partnered with T-Systems as the hosting provider for a SaaS offering of its software and has signed up its first customers.

• As one of the larger European data quality tools vendors, Human Inference has good mind share in the Netherlands and is increasingly active in other European countries, driven mostly by successful marketing programs and themed events. Reference customers cite the quality of the Dutch and Belgian knowledgebases, address validation and geocoding as particularly strong features.

Cautions

• Reference customers report a reluctance to migrate to the latest version of the product because of high complexity and cost during the migration process, but version 6 of the HIquality Suite is addressing ease-of-use and other shortcomings. A relatively high ratio of customers also indicates issues with access to skilled service personnel, software pricing and value for money.

• Human Inference's partner channel strategy is still at an early stage. The vendor needs OEM partnerships with data integration, process integration and application vendors so it can extend its presence and compete more effectively with the product offerings from large international infrastructure companies that, because of their holistic approach, regularly win deals over small providers.

• The vendor's products are repeatedly described as complex to install and configure, requiring additional service personnel from Human Inference. Despite the possibility of using Web services, integrating the data quality software into other applications is

IBM

Strengths

• Supporting its vision for "information on demand," IBM's Information Analyzer (discovery, profiling and analysis) and QualityStage (parsing, standardization and sophisticated matching) are repeatedly described as the enterprisewide data quality standard and are being used in several departments in customer organizations. IBM's customers have started to use its data quality products in multiple data domains, beyond customer data. IBM has also started to integrate its data quality tools with the Cognos platform, which enables the creation of data quality dashboards.

• The newly architected Information Server includes – apart from data quality tools – ETL tools, federation, replication and metadata management. The standardization and matching functions are praised by reference customers for their accuracy, performance and scalability.

Cautions

• IBM's overarching information-on-demand theme takes away some of the focus on the data integration and data quality products in the Information Server. Despite IBM's continuous efforts through data quality seminars worldwide, mind share in the market grows relatively slowly.

• The adoption rate of the Information Server and customers' willingness to upgrade to the latest versions of the data quality products continues to be somewhat slow. Few references have reported running the latest version of a data quality product in production.

• Although smaller competitors have embarked on a SaaS model for data quality, IBM has not, despite its extensive hosting capabilities, addressed this new market segment.

Informatica

Strengths

• Informatica increased its market presence in the past 12 months, adding a significant number of new customers for data quality tools. Most of these additions were via cross-selling of data quality tools to the existing PowerCenter installed base, a strategy that Informatica is executing very well. The installed base of its core data quality products (Informatica Data Quality and Informatica Data Explorer) is estimated at approximately 400 customers.

• Informatica's data quality tools portfolio includes strong data profiling functionality (Data Explorer) and domain-agnostic parsing, standardization and matching capabilities (Data Quality). The company's recent acquisition of Identity Systems indicates the strategic importance it places on this market. The
described as difficult. acquisition will expand the matching and identity resolution
functionality of Informatica’s data quality offerings.
IBM • Customer references reflect a range of data quality application
Strengths types, with strong indications of multidomain adoption across
customer, product financial and other types of data. Ease of
• IBM continues to push for ubiquitous data quality functionality
use of the products and positive service and support
as a key component of its integration portfolio. The Information
experience are also cited by customer references as significant
Server acts as the host platform for the company’s data quality
strengths.
products, but IBM also uses the components in MDM solutions
with the MDM Server, as well as in data quality assessments
led by IBM Global Business Services. As one of the best-
known brands with worldwide consulting, service and support
functions, IBM is well equipped to sell its vision of data quality
to the largest organizations worldwide.
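Data profiling, a core capability cited for several of the vendors above, boils down to computing summary statistics over each column: completeness, cardinality and the frequency of value patterns. The following is a rough, minimal sketch of that idea in Python; the pattern rule and sample data are invented for illustration and do not represent Data Explorer or any other product named here.

```python
# Minimal sketch of a column-profiling pass: null rate, distinct count and
# dominant value patterns. Illustrative only; not any vendor's implementation.
from collections import Counter

def profile_column(values):
    """Profile one column: null rate, distinct count, top value patterns."""
    non_null = [v for v in values if v not in (None, "")]
    # Map each value to a shape pattern: digits -> 9, letters -> A.
    patterns = Counter(
        "".join("9" if c.isdigit() else "A" if c.isalpha() else c for c in v)
        for v in non_null
    )
    return {
        "null_rate": 1 - len(non_null) / len(values) if values else 0.0,
        "distinct": len(set(non_null)),
        "top_patterns": patterns.most_common(3),
    }

phones = ["412-555-0101", "412-555-0199", "5550199", None]
report = profile_column(phones)
print(report["null_rate"])        # 0.25
print(report["top_patterns"][0])  # ('999-999-9999', 2)
```

A real profiling tool would run this kind of pass over every column and flag columns whose dominant pattern covers too few of the rows.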
Cautions

• Having successfully completed the organizational integration of its major 2006 acquisition to enter the data quality tools market, Informatica now faces the challenge of integrating an even larger entity now that it has completed its purchase of Identity Systems. Key to success will be integration at a technology level and clearly articulating to customers the appropriate use of Identity Systems' matching technology relative to that in the Data Quality product.

• Acquisition activity in related markets continues to degrade the value of Informatica's indirect sales channels for its data quality products, while at the same time increasing competitive pressure on pure-play vendors like Informatica. For example, the recent acquisition of Cognos by IBM negates the data quality tools reseller agreement that Informatica established with Cognos. However, the acquisition of Identity Systems adds existing reseller and OEM partnerships in the CRM and customer data integration hub (customer MDM) markets.

• Customer references reflect extremely limited use in multilanguage, multicountry implementations, as well as relatively low satisfaction with functionality for related operations such as address validation and geocoding. Informatica must continue to improve its competence in these areas and says it is increasing its focus through more dedicated resources and a new leader of its address validation and enrichment team.

Innovative Systems

Strengths

• Innovative Systems has been in this market longer than any other vendor, with a history spanning nearly 35 years. Innovative's i/Lytics platform provides proven capabilities based on its deep experience in customer data-matching and cleansing applications. i/Lytics provides strong support for both mainframe and distributed platforms, and enables data quality functionality to be exposed via service interfaces.

• Innovative's customer base (approximately 200 customers, most of which are large enterprises) reflects the vendor's strong experience in the banking and insurance industries – these verticals include about two-thirds of the vendor's customers. While most of these customers are in North America, Innovative also supports customers in Europe and is experiencing growth in Latin America (a region in which it has significant experience). Customer references report a very favorable service and support experience, and success with enterprisewide deployments.

• Complementary to its financial services experience, Innovative continues to focus on its Fin-Scan compliance watchlist screening offerings, an area showing continued strong demand. In addition, it is placing more emphasis on delivery of i/Lytics functionality in a SaaS model, which is in line with an early-stage trend toward hosted and hybrid (combination of on-premises and hosted) deployments in this market.

Cautions

• With a strong emphasis on customer data quality issues, Innovative will be challenged to win new business or expand its presence in existing accounts when multidomain data quality improvement initiatives are required. Customer references reflect no use of the technology in other data domains, such as product data or financial data.

• Given its long history in the market, Innovative's relatively small installed base indicates limited growth in recent years. It has been generally successful in retaining its long-standing customers, but will need to increase the pace of new customer acquisition to remain competitive. A stronger emphasis on marketing, establishing partnerships with SIs and complementary software vendors, and expanding product functionality toward multidomain capabilities will have a positive impact.

• Innovative's data profiling capabilities appear to have limited market adoption so far, although this is a relatively new and immature offering in the vendor's portfolio. Customer references reflect less adoption of profiling capabilities than those of major competitors in this market. In addition, while Innovative's technology can support multilingual data, the lack of full Unicode capabilities limits Innovative's ability to compete on a global basis.

Netrics

Strengths

• Netrics, a relatively new entrant to the data quality tools market, provides a range of capabilities with a specific focus on matching. The vendor uses a machine learning approach to implementing matching and standardization, based on the customer "teaching" the technology about the characteristics of matches by working through a sample set of data.

• Netrics' technology is essentially an embeddable data quality and matching engine, enabling the deployment of data-quality-related services inside any type of application. This is a significant differentiation from most other vendors in the market, and enables Netrics to focus primarily on an indirect channel strategy with OEM and system integration partners. The most recent release of the technology added a Web services application programming interface for applications to communicate with the engine, as well as the addition of "information theory scoring" capabilities that add to Netrics' repertoire of matching algorithms.

• Customer references claim better accuracy in highly complex matching problems compared with more traditional matching approaches, with a shorter time to implementation because comparatively less "programming" is needed. References also reflect the lack of domain bias in Netrics' technology – customers are working with various types of data, including customer, product and financial – and growing use of the technology in an off-premises hosted delivery model. In addition, references report a very positive experience with ease of use and performance of the technology.

Cautions

• Netrics' strong emphasis on matching comes at the expense of other data quality operations, such as profiling and address validation, in which it has limited capabilities compared with most other vendors in this market. The lack of a user interface, other than a Web-based console for administration of engine operations, means the vendor does not provide "out of the box" functionality for exposing profiling results, matching results or runtime statistics – capabilities that are increasingly important as organizations focus more strongly on ongoing information governance.
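Netrics' actual matching algorithms are proprietary; the "teach by example" approach described above can nevertheless be sketched in miniature: score record pairs by field similarity, then learn a decision threshold from a labeled sample. Everything below (the bigram similarity measure, the training pairs) is an invented illustration of the general technique, not the vendor's method.

```python
# Toy "learn matching from examples": labeled pairs teach the engine which
# similarity score separates matches from non-matches.

def bigrams(s):
    s = s.lower()
    return {s[i:i + 2] for i in range(len(s) - 1)}

def similarity(a, b):
    """Mean Jaccard similarity of character bigrams across record fields."""
    sims = []
    for x, y in zip(a, b):
        ga, gb = bigrams(x), bigrams(y)
        sims.append(len(ga & gb) / len(ga | gb) if ga | gb else 1.0)
    return sum(sims) / len(sims)

def learn_threshold(labeled_pairs):
    """Pick the score threshold that classifies the training sample best."""
    scores = [(similarity(a, b), is_match) for a, b, is_match in labeled_pairs]
    best = max((sum((s >= t) == m for s, m in scores), t) for t, _ in scores)
    return best[1]

training = [
    (("Jon Smith", "Pittsburgh"), ("John Smith", "Pittsburgh"), True),
    (("Mary Jones", "Boston"), ("Mary Jones", "Boston"), True),
    (("Bob O'Neil", "Chicago"), ("Robert O'Neil", "Chicago"), True),
    (("Mary Jones", "Boston"), ("Gary Stone", "Austin"), False),
]
t = learn_threshold(training)
print(similarity(("Jon Smith", "Pittsburgh"), ("John Smith", "Pittsburgh")) >= t)  # True
```

A production engine would use far richer similarity features and a real classifier, but the workflow is the same: the customer labels sample pairs, and the decision rule is fit to those labels rather than hand-coded.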
• Netrics’ product road map includes mostly technical enhancements – additional functionality that will improve the scalability or matching flexibility of the engine. However, the road map is limited in enhancements that would fill critical gaps relative to larger competitors, such as robust data profiling functionality, or support for richer parsing, standardization and validation rules (in particular for the customer data domain, a mainstay of demand in the data quality tools market).

• With a small installed base (approximately 100 customers) and limited resources for marketing, Netrics will be challenged to gain mind share in a market increasingly populated by much larger providers. Customer references are generally midsize organizations, although some of the applications in which Netrics’ tools are embedded (including applications delivered by some of its OEM partners) support very large numbers of users.

Pitney Bowes Software

Strengths

• Pitney Bowes Software, which competes in the data quality tools market as a result of the acquisition of Group 1 Software by Pitney Bowes, continues to focus on its traditional positioning of “customer data quality.” The vendor specializes in global name and address standardization and validation, matching-related capabilities (including linking and deduplication) and geocoding. This functionality is supported on a range of platforms, including the mainframe. Although the vendor’s underlying technology can be considered domain-agnostic, customer data quality applications are its sole focus, as is clear from the Customer Data Quality (CDQ) Platform product naming.

• Pitney Bowes Software retains a large installed base (more than 2,400 customers), making it one of the market-share leaders for data quality tools. Customer references reflect a highly North-American-centric installed base, although the vendor has established a foothold in Asia/Pacific, where it now has several hundred customers.

• With the significant financial resources of Pitney Bowes, the vendor continues to expand its capabilities through acquisitions – such as its 2007 addition of MapInfo, which brings further geospatial and mapping services to the CDQ product. The vendor continues to fund organic development of its core data quality technology, with the latest version of CDQ adding, among other enhancements, an improved user interface for data stewardship activities, integration with MapInfo services for location intelligence and improved visibility into matching results. Its product road map includes additional enhancements, such as monitoring and integration of profiling and cleansing functionality, which represent “must have” capabilities in this market.

Cautions

• Pitney Bowes Software’s focus on customer data will place it at an increasingly significant competitive disadvantage compared with providers with multidomain-capable tools. Customer references rarely report use of the technology in noncustomer data domains, which is consistent with the vendor’s product positioning. The announcement of a partnership with Silver Creek Systems provides the potential for Pitney Bowes Software customers to begin addressing data quality issues in the product data domain. In addition, the recent delivery of a data profiling product has enabled Pitney Bowes Software to expand its functional capabilities; however, customer references reflect minimal uptake of this offering.

• Lack of clarity about the product road map and migration paths from older Group 1 Software and Pitney Bowes data quality products to CDQ has created frustration on the part of customers. References report an inconsistent experience with customer service and support. With the creation of the Pitney Bowes Software business unit and a stronger focus on aligning the various software assets of Pitney Bowes, the vendor is beginning to solidify the product road map and has the opportunity to rationalize and strengthen its interactions with customers.

• While Pitney Bowes Software offers a range of pricing models and options, mainframe-based customers (which represent the core of its customer base) continue to report challenges in negotiating the cost of upgrades and ongoing support/maintenance, and working through renegotiations of enterprise licenses, including mainframe products.

Trillium Software

Strengths

• Harte-Hanks Trillium Software provides a broad data quality tool suite, including data profiling (TS Discovery), core data quality components (TS Quality) and a data quality dashboard offering (TS Insight). Its data enrichment capabilities are focused on customer data (addresses, geocoding and watchlist compliance). Trillium is attempting to expand its positioning and capabilities beyond core data quality toward what it calls “data intelligence,” with a product road map calling for richer metadata discovery and management, semantic understanding and business user interaction.

• Trillium continues to enjoy strong brand recognition and customer retention, and remains a market-share leader with a large installed base of approximately 700 customers, most of which are in North America. Customer references report a high level of satisfaction with the performance and scalability of Trillium’s tools, and a very positive service and support experience. The company has a high-profile partnership with Oracle – with Trillium’s data quality functionality an option in the Oracle Data Integration Suite – that represents Trillium’s most significant channel opportunity in recent times. A new reseller partnership with Teradata further expands the size and quality of Trillium’s indirect channels.

• Trillium has disbanded its Diamond Data offering, which provided the TS Quality functionality in a hosted model for customers of SaaS application providers such as salesforce.com. Trillium is redirecting the resources allocated to that service into its own SaaS offering, TS On-Demand, and increasing its focus on hosted deployments. However, customer references do not yet reflect use of these capabilities.
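Standardization together with linking and deduplication, the capabilities at the heart of the offerings described above, can be illustrated with a deliberately small sketch: normalize each address with an abbreviation table, then keep one record per normalized key. The abbreviation table and sample addresses are hypothetical, not any vendor's reference data.

```python
# Toy address standardization and deduplication: normalize, then link
# records that share a standardized key. Illustration only.

ABBREV = {"st": "street", "ave": "avenue", "rd": "road", "n": "north"}

def standardize(address):
    """Lowercase, strip punctuation, expand common abbreviations."""
    tokens = address.lower().replace(",", " ").replace(".", " ").split()
    return " ".join(ABBREV.get(tok, tok) for tok in tokens)

def deduplicate(records):
    """Keep one record per standardized address (first occurrence wins)."""
    seen = {}
    for rec in records:
        seen.setdefault(standardize(rec), rec)
    return list(seen.values())

records = [
    "123 N. Main St.",
    "123 North Main Street",
    "456 Oak Ave",
]
print(deduplicate(records))  # ['123 N. Main St.', '456 Oak Ave']
```

Commercial tools replace the exact-key step with fuzzy matching and country-specific postal reference data, but the normalize-then-link pipeline is the same shape.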
Cautions

• Trillium’s functionality, marketing and product road map have historically been largely geared toward data quality issues in customer data. A minority of customer references indicate that they are applying TS Quality in other data domains, although with the introduction of Universal Data Libraries (prebuilt functionality for common data attributes, including units of measure, currencies and package types) in the v.11 release, this is beginning to change. Trillium must continue to expand its capabilities and experience in this direction to remain competitive with vendors delivering multidomain functionality.

• To ensure long-term market leadership, the vendor will need to continue to generate significant growth in other regions in response to competition from larger and more globally visible vendors. It is beginning to do this, with 28% of its customer base now outside North America. In addition, while it has relationships with a number of high-profile SIs, it must continue to expand the depth of these partnerships to generate more traction on a global basis.

• Harte-Hanks’ acquisition of U.K.-based address-cleansing specialist Global Address has created some redundancy of functionality with TS Quality, specifically in the area of address standardization for certain countries. Harte-Hanks faces the challenge of rationalizing the product sets, but also has the opportunity to upsell Global Address customers to TS Quality as their needs expand beyond address-cleansing activities.

Uniserv

Strengths

• Uniserv has been a provider of data quality solutions for more than 30 years. The vendor – which has its headquarters in Pforzheim, Germany – focuses almost exclusively on customer data, name and address verification, and geocoding. About 75% of Uniserv’s revenue and customers are in Germany and France, but the vendor has also sold in other European countries and the U.S.

• Uniserv is one of only a few data quality vendors adopting a SaaS delivery model. This amounts to only a small slice of the vendor’s revenue – and no reference customers have indicated this deployment model – but the SaaS portion has enjoyed growth of more than 40% in the past 24 months. Uniserv’s installed base is growing faster internationally than in its domestic market.

• Uniserv has more than 60 employees in technical roles, such as product development, professional services and technical support. A relatively low average software license fee per implementation makes it easy for customers to deploy Uniserv software.

Cautions

• As many organizations start to view data quality as a domain-agnostic issue, Uniserv’s strong customer data focus will put it at a disadvantage compared with other providers that market themselves with a broader data quality view toward, for example, product data or financial data.

• Uniserv is an established brand for matching, merging, cleansing, and address and bank data verification technologies, but it does not serve increasingly popular areas such as data quality dashboards and quality monitoring. However, the vendor recently signed an OEM agreement to fill the data profiling gap in Uniserv’s portfolio. Only an English version of the profiling tool is currently available, but German and French user interfaces are planned for later in 2008.

• Uniserv’s strong concentration on its direct sales force, and its lack of large international alliances with SIs and ISVs that use Uniserv technology as OEMs, put the vendor under increasing pressure from larger international competitors. In addition, both of its partners SAP and Oracle have either acquired or embedded data quality technology from Uniserv’s competitors.

Vendors Added or Dropped

We review and adjust our inclusion criteria for Magic Quadrants and MarketScopes as markets change. As a result of these adjustments, the mix of vendors in any Magic Quadrant or MarketScope may change over time. A vendor appearing in a Magic Quadrant or MarketScope one year and not the next does not necessarily indicate that we have changed our opinion of that vendor. This may be a reflection of a change in the market and, therefore, changed evaluation criteria, or a change of focus by a vendor.

Acronym Key and Glossary Terms

BI business intelligence
CDQ Customer Data Quality
ETL extraction, transformation and loading
ISV independent software vendor
MDM master data management
SaaS software as a service
SI system integrator
SOA service-oriented architecture
UDC Universal Data Cleanse
VAR value-added reseller
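Data quality dashboards and monitoring, mentioned above as increasingly popular capabilities, ultimately render a handful of rates tracked over time. The following is a minimal sketch of the metric computation underneath such a dashboard; the metric names and the validity rule are illustrative assumptions, not those of TS Insight or any product named here.

```python
# Toy dashboard feed: completeness, validity and uniqueness rates for a
# record set. Illustrative rules only.
import re

def quality_metrics(rows, key_field, email_field):
    """Compute simple completeness, validity and uniqueness rates."""
    n = len(rows)
    filled = sum(all(v not in (None, "") for v in r.values()) for r in rows)
    valid = sum(
        bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", r[email_field] or ""))
        for r in rows
    )
    unique = len({r[key_field] for r in rows})
    return {
        "completeness": filled / n,   # share of fully populated records
        "validity": valid / n,        # share with a well-formed email
        "uniqueness": unique / n,     # share of distinct key values
    }

rows = [
    {"id": 1, "email": "ann@example.com"},
    {"id": 2, "email": "bad-address"},
    {"id": 2, "email": ""},
]
m = quality_metrics(rows, "id", "email")
print(round(m["completeness"], 2))  # 0.67
```

A monitoring product would persist these rates per run and alert when one drifts below a threshold; the computation per snapshot stays this simple.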
Evaluation Criteria Definitions
Ability to Execute
Product/Service: Core goods and services offered by the vendor that compete in/serve the defined market. This includes current
product/service capabilities, quality, feature sets and skills, whether offered natively or through OEM agreements/partnerships as
defined in the market definition and detailed in the subcriteria.
Overall Viability (Business Unit, Financial, Strategy, Organization): Viability includes an assessment of the overall organization’s
financial health, the financial and practical success of the business unit, and the likelihood of the individual business unit to continue
investing in the product, to continue offering the product and to advance the state of the art within the organization’s portfolio of products.
Sales Execution/Pricing: The vendor’s capabilities in all presales activities and the structure that supports them. This includes deal
management, pricing and negotiation, presales support and the overall effectiveness of the sales channel.
Market Responsiveness and Track Record: Ability to respond, change direction, be flexible and achieve competitive success as
opportunities develop, competitors act, customer needs evolve and market dynamics change. This criterion also considers the
vendor’s history of responsiveness.
Marketing Execution: The clarity, quality, creativity and efficacy of programs designed to deliver the organization’s message to
influence the market, promote the brand and business, increase awareness of the products, and establish a positive identification
with the product/brand and organization in the minds of buyers. This mind share can be driven by a combination of publicity,
promotional, thought leadership, word-of-mouth and sales activities.
Customer Experience: Relationships, products and services/programs that enable clients to be successful with the products
evaluated. Specifically, this includes the ways customers receive technical support or account support. This can also include ancillary
tools, customer support programs (and the quality thereof), availability of user groups and service-level agreements.
Operations: The ability of the organization to meet its goals and commitments. Factors include the quality of the organizational
structure including skills, experiences, programs, systems and other vehicles that enable the organization to operate effectively and
efficiently on an ongoing basis.
Completeness of Vision
Market Understanding: Ability of the vendor to understand buyers’ wants and needs and to translate those into products and
services. Vendors that show the highest degree of vision listen and understand buyers’ wants and needs, and can shape or enhance
those with their added vision.
Marketing Strategy: A clear, differentiated set of messages consistently communicated throughout the organization and
externalized through the Web site, advertising, customer programs and positioning statements.
Sales Strategy: The strategy for selling product that uses the appropriate network of direct and indirect sales, marketing, service and
communication affiliates that extend the scope and depth of market reach, skills, expertise, technologies, services and the customer base.
Offering (Product) Strategy: The vendor’s approach to product development and delivery that emphasizes differentiation,
functionality, methodology and feature set as they map to current and future requirements.
Business Model: The soundness and logic of the vendor’s underlying business proposition.
Vertical/Industry Strategy: The vendor’s strategy to direct resources, skills and offerings to meet the specific needs of individual
market segments, including verticals.
Innovation: Direct, related, complementary and synergistic layouts of resources, expertise or capital for investment, consolidation,
defensive or pre-emptive purposes.
Geographic Strategy: The vendor’s strategy to direct resources, skills and offerings to meet the specific needs of geographies
outside the “home” or native geography, either directly or through partners, channels and subsidiaries as appropriate for that
geography and market.