The document discusses product information management (PIM) for HP Printing and Personal Systems. It outlines the challenges of managing vast amounts of product data across departments and systems. It then describes how a PIM solution could address these challenges by providing a single source of truth for product information through capabilities like data integration, governance and a centralized repository. The paper also provides details on how HP could implement a PIM architecture using a transactional hub model to manage master product data.
The University of Texas at Dallas
Naveen Jindal School of Management

Whitepaper

Master Data Management
Product Information Management for HP Printing and Personal Systems

By
Atul Jena
Abhrajit Ghosh
Jagruti Dwibedi
Contents
1. Executive Summary
2. Introduction
3. Liabilities of Bad Data
4. PIM Capabilities
5. PIM Architecture
6. PIM Implementation at HP
7. Data Governance
8. PIM Vendors
9. Conclusion
References
Executive Summary
Working at Hewlett Packard, a company that is both product and service based, one thing noticed over the years is the amount of time and money lost to poor-quality data. An organization like HP operates across multiple departments, managing vast amounts of data about its customers, products, suppliers, locations, and more. With multiple departments managing so much data, anomalies arise, leaving no single consolidated version of the truth about the business. It is an expensive problem.

Master Data Management (MDM) is a framework that realigns business processes to present master data to business users in a consistent and contextual manner. Presenting accurate data in this way helps business users make smarter and more economical decisions. Broadly, two separate domain-specific streams emerged as part of MDM: Customer Data Integration (CDI) and Product Information Management (PIM).

This paper discusses Product Information Management for HP Printing and Personal Systems. From stating the liabilities of bad data quality to building a PIM architecture for product solutions, it highlights an end-to-end solution that merges and centralizes product information across the enterprise.

Disclaimer: This paper is a case study for HP PPS Global and is presented as a viewpoint on handling the data quality challenge. No internal product information of the company has been used.
Introduction
The product data quality challenge at HP is formidable because of the complexity of managing product information across numerous departments, hundreds to thousands of suppliers, and thousands to millions of individual product items. Poor data quality leads to inefficient internal processes and missed sales revenue. But cleansing product data alone isn't the answer: retailers, distributors, and manufacturers need a comprehensive solution that provides much more.

"With numerous manual data entry processes across multiple applications, product data errors are pervasive and result in purchase order discrepancies, longer lead times and inefficient use of human resources," said Andrew White, enterprise and supply chain management research director at technology consultancy Gartner.

To meet this challenge, HP needs a system that combines product information management with robust capabilities in data integration and governance. As a single repository for all product data destined for every sales channel, the PIM should provide a cohesive, centralized platform for all channel commerce.

While everyone chases the customer insight part of the equation (the 360° view of the customer), realizing the power and potential of product information (the single view of products) should be the goal for HP, so that it can recommend and promote the exact products its customers are likely to buy.
Liabilities of Bad Data
As a retailer, HP needs to know:

- All about its customers: their profiles, histories, preferences, and behaviors across all channels (web, mobile, social, call centers, in-store, customer service, etc.)
- All about its products: so that a personal insight into the things each customer is most likely to buy can be mapped

If product information is managed poorly, the business may become unsustainable in the market. The problems typically faced with product information are:

- It's incomplete: shoppers aren't sure and click away.
- It's out of date: updating each channel takes a long time.
- It's inconsistent: different channels show different images or descriptions.
- It's boring: it relies on generic data instead of on-brand descriptions, images, and video.
- The databases are inconsistent: for example, the mobile team has a different database from the web team and the store team.
- It takes ages to get to market: this causes "shelf lag" that eats up sales and margin.
PIM Capabilities
With a wide range of products, from printers to servers, HP leads the market in delivering the best experience through its products. With the capabilities a PIM provides, the enterprise stands to benefit substantially.

A PIM solution will allow HP to do the following (a short sketch of a personalized catalog view follows this list):

- Locate and use appropriate data from heterogeneous sources.
- Access structured product data, which consists of such things as model name, product number, technical description, and feature set. (Unstructured data, such as warranty documents in PDF form and videos about the product, are not as easily modeled into a PIM repository.)
- Cleanse data and related content.
- Identify and create missing product information.
- Connect and transmit data.
- Unify and relate a single product instance to multiple types of content. By collecting, validating, and approving the product-related content, the PIM provides one synthetic representation that is available for different purposes.
- Enable cross-media publishing of product catalogs.
- Distribute disparate product information from a single source.
- Enable multilingual catalog creation and deployment.
- Create personalized catalog views of the product information. Such a view contains only the product information that the specific user cares about.
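To make the personalized-view idea concrete, here is a minimal Python sketch. It is an illustration only: the `ProductRecord` class, its fields, and the sample values are hypothetical, not taken from any HP system.

```python
from dataclasses import dataclass, field

@dataclass
class ProductRecord:
    """One synthetic representation of a product in the PIM repository."""
    product_number: str
    model_name: str
    technical_description: str
    feature_set: dict = field(default_factory=dict)   # structured attributes
    media_links: dict = field(default_factory=dict)   # unstructured content kept by reference

def catalog_view(product: ProductRecord, wanted: set) -> dict:
    """Personalized catalog view: keep only the attributes this user cares about."""
    full = {
        "product_number": product.product_number,
        "model_name": product.model_name,
        "technical_description": product.technical_description,
        **product.feature_set,
    }
    return {key: value for key, value in full.items() if key in wanted}

# A hypothetical consumer laptop, and a view for a channel that needs only two fields.
laptop = ProductRecord(
    product_number="HP-0001",
    model_name="Example Pavilion 15",
    technical_description="15-inch consumer laptop",
    feature_set={"ram_gb": 16, "storage_gb": 512},
    media_links={"warranty_pdf": "https://example.com/warranty.pdf"},
)
print(catalog_view(laptop, {"model_name", "ram_gb"}))
```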
PIM Architecture
There are four PIM solution architectures: External Reference, Registry, Reconciliation Engine, and Transaction Hub.

External Reference (or Consolidation) is the low-end PIM solution architecture: a reference database that points to all data but does not actually contain any data. It does not define, create, or manage a centralized platform where master data is integrated to create a "single version of the truth."

The Registry architecture consists of a registry of unique master entity identifiers. An entity resolution service identifies the master entity records, and the links to the data sources used to maintain the attributes are kept by the Data Hub.

The Reconciliation Engine (or Coexistence) architecture is a step up from the Registry architecture. It harmonizes product master data across databases and acts as a central reference point. This architecture provides synchronization between itself and legacy systems; retailers often implement it as an intermediate architecture (i.e., after they have outgrown the Registry architecture).

The Transaction Hub architecture stores the up-to-date product master data with its associated enriched attribute data. It also supports new and legacy transactional and analytical applications, and includes a business service and data integration layer. This architecture is well suited to companies that need to collect information, cleanse it, build it on the fly, and serve it to other destinations. Hence, it is a perfect fit for HP PPS.
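The practical difference between the Registry and Transaction Hub styles can be seen in a small sketch. The structures below are simplified assumptions for illustration, not a vendor design: the registry holds only identifiers and links into source systems, while the transaction hub physically stores the consolidated "golden record."

```python
# Registry style: the hub keeps only master IDs and links to source-system records.
registry = {
    "MASTER-001": [("ERP", "erp-rec-77"), ("WEB", "web-rec-12")],  # pointers, no attributes
}

# Transaction Hub style: the hub physically stores the up-to-date golden record.
transaction_hub = {
    "MASTER-001": {
        "model_name": "Example Pavilion 15",
        "technical_description": "15-inch consumer laptop",
        "enriched": {"marketing_copy": "An on-brand product description."},
    },
}

def registry_lookup(master_id: str) -> list:
    """Returns links only; attribute values must still be fetched from the sources."""
    return registry[master_id]

def hub_lookup(master_id: str) -> dict:
    """Returns the full consolidated record directly from the hub."""
    return transaction_hub[master_id]

print(registry_lookup("MASTER-001"))
print(hub_lookup("MASTER-001"))
```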
The following figure illustrates the general PIM architecture. The PIM hub contains the MDM data storage, the validation engine, the workflow engine, references, and the metadata. This information is made available through the security and access layer, which ensures that content is presented only to persons entitled to see it, even as authorized persons are allowed to modify that content.
Figure: The PIM Solution Architecture

The Enterprise Service Bus makes the information available both upstream and downstream using mechanisms such as publish/subscribe, web services, or batch FTP, allowing HP to collect the information or publish it to its consumers, whether they are supply chain, e-commerce, publishing, or stores.
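A hedged sketch of the publish/subscribe mechanism mentioned above: the topic name, the message payload, and the toy in-process bus are illustrative assumptions, not a real ESB product. The hub publishes a product change once, and every subscribed downstream channel receives its own copy.

```python
from collections import defaultdict

class SimpleBus:
    """A toy in-process stand-in for an enterprise service bus topic."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic: str, callback) -> None:
        """Register a downstream consumer for a topic."""
        self.subscribers[topic].append(callback)

    def publish(self, topic: str, message: dict) -> None:
        """Deliver one message to every subscriber of the topic."""
        for callback in self.subscribers[topic]:
            callback(message)

bus = SimpleBus()
bus.subscribe("product.updated", lambda m: print("e-commerce received:", m))
bus.subscribe("product.updated", lambda m: print("stores received:", m))

# The PIM hub publishes once; all downstream channels are notified.
bus.publish("product.updated", {"product_number": "HP-0001", "field": "price", "value": 499})
```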
PIM Implementation at HP
HP PPS broadly divides its products into two categories of units: Consumer (Pavilion, Envy, Omen, DeskJet, etc.) and Commercial (ProBook, EliteBook, Z Workstations, OfficeJet, etc.). Features of both series of units are modified over time, new products are introduced in each segment, and outdated products are retired. Spare parts lists are maintained for all these units. Consumers constantly look for products online or seek tech support based on the information they see, so immaculate data presentation is an absolute need.

The MDM services should be robust enough to manage the master data and data quality, and to provide services such as authorization and the introduction of new products. A PIM allows HP to create a great deal of metadata, including descriptions of product categories, descriptions of the information that needs to be collected, the rules about that information, and the exceptions to those rules.

HP therefore needs a model robust enough to handle problems of duplication, wrong information, authorization, and so on. The Transaction Hub model shown in the following figure handles all of these.
Figure: Working Model of a Transaction Hub MDM Architecture
The MDM services component is composed of the following parts, shown in the figure:

- Interface services: These provide a consistent entry point for invoking MDM services through a variety of technologies, regardless of how a service is called. In addition, the interface services can accept multiple request message formats through support for pluggable parsers.
- Lifecycle management services: These provide business and information services for all master data domains, such as customer, product, account, or location, to create, access, and manage master data held within the master data repository.
- Data quality management services: The services in this group fall into two subgroups (a minimal sketch of these services appears after this list):
  - Data validation and cleansing services provide capabilities to specify and enforce data integrity rules.
  - Reconciliation services provide matching services, which check whether a new product is a duplicate of an existing product; conflict resolution services; and merge, collapse, and split services, which data stewards use to reconcile duplicates.
- Master data event management services: These provide the ability to create business rules that react to certain changes in master data and trigger notification events.
- Hierarchy and relationship management services: Hierarchy services create and maintain hierarchies.
- Authoring services: These are used to define or extend the definition of master data entities, hierarchies, relationships, and groupings.
- Base services: The base services component provides services in the following four groups:
  - Privacy and security services implement authorization on four different levels:
    o Service level: determines who is allowed to use the service
    o Entity level: determines who is allowed to read/write a particular entity
    o Attribute level: determines who can read/write which attribute
    o Record level: determines who can update which particular records
  - Audit logging services write a complete history of all transactions and events, giving a full trace of what happened in the MDM system; this history can also be used for problem determination or to comply with certain legal requirements.
  - Workflow services support collaborative authoring of master data in processes such as New Product Introduction, and enable business rules and the delegation of tasks to external components.
  - Search services allow users to look up and retrieve master data.
- Master data repository: The master data repository has the following parts:
  - The metadata: This part of the repository stores all relevant metadata, such as a description of the data model for the master data.
  - The master data: This part of the repository is where the master data is physically stored.
  - The history data: A complete history of all master data entity changes in the repository; this enables point-in-time queries against the MDM data.
  - The reference data: Lookup tables such as country codes, measurement units for products, marital status, and the like.
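The data quality and base services above can be sketched in a few lines of Python. Everything here is a hypothetical illustration under simple assumptions: the role table, the fuzzy name-matching threshold, and the ID scheme are inventions for the example, not any vendor's API.

```python
import hashlib
from difflib import SequenceMatcher

AUDIT_LOG = []  # audit logging service: a history of events in the MDM system

# Attribute-level authorization: which role may read which attribute (hypothetical).
READ_RIGHTS = {"steward": {"model_name", "cost"}, "web": {"model_name"}}

def authorize_read(role: str, attribute: str) -> bool:
    """Privacy/security service, attribute level: who can read which attribute."""
    return attribute in READ_RIGHTS.get(role, set())

def validate(record: dict) -> None:
    """Data validation service: enforce a simple integrity rule."""
    if not record.get("product_number"):
        raise ValueError("product_number is required")

def is_duplicate(new: dict, existing: dict, threshold: float = 0.9) -> bool:
    """Reconciliation/matching service: fuzzy comparison of model names."""
    ratio = SequenceMatcher(None, new["model_name"].lower(),
                            existing["model_name"].lower()).ratio()
    return ratio >= threshold

def create_product(repository: dict, record: dict, user: str) -> str:
    """Lifecycle service: create master data after validation and a dedup check."""
    validate(record)
    for existing in repository.values():
        if is_duplicate(record, existing):
            raise ValueError("possible duplicate of an existing product")
    master_id = hashlib.sha1(record["product_number"].encode()).hexdigest()[:8]
    repository[master_id] = record
    AUDIT_LOG.append((user, "create", master_id))  # event recorded for the audit trail
    return master_id

repo: dict = {}
mid = create_product(repo, {"product_number": "HP-0001",
                            "model_name": "Example Pavilion 15"}, "steward-1")
print(mid, authorize_read("web", "cost"), AUDIT_LOG)  # cost is hidden from the web role
```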
How it works
Before MDM is applied, master data is scattered across applications. Thus, in the figure (1), the master data (for both HP consumer and commercial units) from the source application systems has to be extracted, cleansed, standardized, de-duplicated, transformed, and loaded into the MDM system (2). These steps are performed in the master data integration phase. For HP, once the MDM system built on the Transaction Hub MDM pattern is complete, all redundant copies of the master data in the source application systems can be deleted, as indicated by the white color of the master data parts of the persistence layer in the figure. Furthermore, the source applications are "MDM enabled."

This means that whenever a transaction (3) is invoked on a source application system that affects both transactional data (for example, billing data for a Pavilion laptop) and master data, the master data portion of the transaction invokes a master data service of the MDM system for processing. Only the transactional part is processed locally.

Customers and consumers access applications (UIs) that consume master data by (4) invoking the MDM services to retrieve master data in a read-only way.

An MDM UI (5) at the enterprise level is used to create and change master data; it can be part of an enterprise portal implementation, for example. The key imperative is that all changes to master data by any source system are performed only through services of the MDM system. This guarantees the required level of master data consistency at all times and enables customers to find the correct, desired products.
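The split between local transactional processing and MDM service calls can be illustrated with a short sketch. The function names and the returned record are assumptions for the example, standing in for real MDM service calls.

```python
def mdm_get_product(master_id: str) -> dict:
    """Stand-in for a read-only MDM service call (step 4 in the figure)."""
    return {"master_id": master_id, "model_name": "Example Pavilion 15"}

def process_billing_transaction(master_id: str, amount: float) -> dict:
    """Source application: the transactional part is processed locally (step 3),
    while the master data portion is fetched through the MDM service."""
    product = mdm_get_product(master_id)           # master data via the MDM service
    invoice = {"product": product["model_name"],   # transactional data handled locally
               "amount": amount}
    return invoice

print(process_billing_transaction("MASTER-001", 499.0))
```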
Data Governance
Data Governance can be defined as the mechanism by which we ensure that the right corporate data is available to the right people at the right time, in the right format, with the right context, through the right channels.

With the unprecedented growth in the amount of data in recent years, data must be controlled and understood in order to process it effectively and securely. Data governance isn't a definable solution; rather, it's a journey toward transparency, offering a clearer understanding of what information you have, how to manage it, and how it can be used to advance the enterprise.

A product information governance project may appear to be a daunting effort when one begins to structure the data rules. The best practice is to develop a data roadmap that provides a clear and precise understanding of the data and its use within HP. The roadmap should detail how data is acquired and submitted for use within the enterprise, account for the multiple uses of the data (purchasing, engineering, marketing, and maintenance), and specify the data elements and structure needed to accommodate each software system.
Benefits as a Result of Data Governance
There are many benefits to implementing an innovative data governance and master data management system. The basic benefits, in both process and cost, include:

- Reduced inventory through identification of duplicate items
- Facilitation of inventory sharing and internal purchasing programs
- Reduced employee time spent searching for items
- Common spare part usage strategies
- Reduced downtime of manufacturing equipment due to lack of information availability
- The ability to manage inventory using a just-in-time model

Data governance supports both indirect and direct cost savings. Businesses can begin to embrace the definition of operational data as an asset of the corporation, ensuring improved data accuracy and greater confidence among data users.
PIM Vendors
It has been said that data outlasts applications: an organization's business data survives the changing application landscape. Technology advancements drive periodic application reengineering, but the business's products, suppliers, assets, and customers remain.

The dominant PIM solutions are IBM InfoSphere, Oracle Product Hub, and SAP NetWeaver. All three vendors, IBM, Oracle, and SAP, have been involved with MDM for the past 10 years, and they have reached their positions of dominance through multiple acquisitions. All three offer a full MDM ecosystem, including data integration, data quality, databases, messaging, and sometimes hardware.

Gartner's Magic Quadrant provides insight into the segment of the constantly evolving packaged MDM system market that focuses on managing product data to support supply chain management (SCM), CRM, and other customer-related strategies. It positions relevant technology providers on the basis of their Completeness of Vision relative to the market and their Ability to Execute on that vision.
Conclusion
PIM is Master Data Management applied to the product space. PIM is enabled through business process improvements, organizational improvements, and the alignment of multiple information technologies.

This paper has shown how HP could lose business because of poor data. Customer insight alone is not enough. As a retailer, HP relies heavily on its products, and in the product domain it is all about knowing the afflictions, meeting the challenges, and delivering on the promise of a personalized customer experience.

Retail business is all about staying ahead of the competition. A PIM-integrated system will provide just that to HP: staying ahead by delivering a wonderful customer experience. As HP says, "If you are going to do something, make it matter."
References
http://www.informatica.com/us/products/master-data-management/product-information-management/#fbid=sBHsv9WFkAk

Product Information Management: Definition, Purpose, and Offering, by Christophe Marcant, Senior Specialist at Sapient

http://h30507.www3.hp.com/t5/Journey-through-Enterprise-IT/Data-Governance-It-is-the-data-stupid-govern-it/ba-p/125983#.VIZrSTHF_d2

http://www8.hp.com/h20195/V2/GetDocument.aspx?docname=4AA4-9093ENW&cc=us&lc=en

Product Information Management (PIM) Data Governance, by Jackie Roberts, VP at DATAFORGE™

http://www.gartner.com/technology/reprints.do?id=1-1QTLTLC&ct=140214&st=sb