The document is a newsletter article that discusses the requirements for an enterprise-class configuration management database (CMDB) software solution. It explains that an enterprise CMDB must be based on dimensional modeling rather than a traditional relational database. It also must support federation of multiple data sources, reconciliation of data inconsistencies, synchronization of changes, and dynamic modeling and visualization of configuration information. Few existing CMDB products meet all of these requirements for a true enterprise solution.
DITY™ Newsletter
Vol. 3.4 • January 24, 2007
Enterprise CMDB
A CMDB is a nebulous thing; it can exist in the minds of the organization, on 3x5 cards, or in any
other medium. However, an enterprise CMDB is different. For large organizations only software
will do, but not traditional relational database software. What we need is a new breed of
software products...and they might be coming...
By Hank Marquis
I usually tell smaller and less mature IT organizations that they don't need to invest
in expensive and complicated CMDB software. I have applied Configuration
Management's knowledge management principles in different IT departments
using nothing more than basic office tools like Excel.
A CMDB is a system for managing knowledge more than it is a product, and it is more
of an index than a database. This means that as long as the amount of data is small, you can get the
benefits of a CMDB any number of ways: paper, Visio diagrams, Access databases, people's heads,
etc. And this is exactly what you see in most organizations.
However, an enterprise IT organization spans large geographies with hundreds or thousands of
users and locations, and the sheer number of configuration variations requires a specialized CMDB
solution. For the record, I don't think such a product is available yet. The requirements lie just
outside of today's technology and protocols.
But this is changing. First to change was the realization that a CMDB is not simply an IT Asset
Manager (ITAM) "on steroids," but rather that a true enterprise CMDB requires some unique
features. Chief among these is that a true enterprise CMDB solution has to dust off an old database
concept that is a bit out of style—dimensional modeling. An enterprise CMDB solution has to be
based on a dimensional database; a relational database simply can't do the job.
In what follows, I explain what a dimensional database is, and what an enterprise-class CMDB
software solution requires: Federation, Reconciliation, Synchronization, and Modeling.
Many vendors are starting to talk this talk, but few understand what it really means, and fewer
still deliver products that walk the talk.
Relational vs. Dimensional
Data stored in a relational database is easy to look up when you know in advance what you will
want to see. The common "row and column" approach of relational databases is ideal for On Line
Transaction Processing (OLTP).
But a CMDB does not store most of its data; it references data stored externally in other, perhaps
relational, databases. And a CMDB is used to provide contextual awareness over non-obviously
related bits of data. For example, a common inquiry posed to configuration management might be:
"How many users, in sales, use SAP during the last week of the month?"
This kind of query is not well suited to a centralized relational database with pre-built SQL
queries. There are just too many possible combinations of data. This type of query has to pull
information from many systems, and the data it needs is probably not all nicely lined up in rows
and columns ready to query. No, an enterprise CMDB has to be dimensional—a technology that
represents data as different dimensions or planes.
The dimensions of a CMDB often include locale (e.g., city, state, floor, etc.), work group like sales,
marketing and so on, IT service like SAP or Email, date ranges, and others. Instead of an Excel
spreadsheet with rows and columns, think about a Rubik's cube and you begin to get the idea. The
logic required is not new; it's been around for years in the form of On Line Analytical Processing
(OLAP).
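To make the idea concrete, here is a rough Python sketch (not from any real CMDB or OLAP product; all names and data are invented) of how a dimensional slice answers the ad hoc question posed earlier, treating each dimension as just another axis to filter on:

```python
# A minimal sketch, not a real OLAP engine: each fact carries a value
# per dimension, and an ad hoc query is a slice across those dimensions.
# Users, groups, services, and weeks here are purely hypothetical.

usage_facts = [
    {"user": "alice", "group": "sales",     "service": "SAP",   "week_of_month": 4},
    {"user": "bob",   "group": "sales",     "service": "SAP",   "week_of_month": 4},
    {"user": "carol", "group": "marketing", "service": "SAP",   "week_of_month": 4},
    {"user": "dave",  "group": "sales",     "service": "Email", "week_of_month": 4},
    {"user": "alice", "group": "sales",     "service": "SAP",   "week_of_month": 1},
]

def slice_cube(facts, **criteria):
    """Return the facts matching every supplied dimension value."""
    return [f for f in facts if all(f[dim] == val for dim, val in criteria.items())]

# "How many users, in sales, use SAP during the last week of the month?"
matches = slice_cube(usage_facts, group="sales", service="SAP", week_of_month=4)
distinct_users = {f["user"] for f in matches}
print(len(distinct_users))  # 2
```

The point is that no pre-built query anticipated this exact combination; the dimensions themselves make any combination answerable.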
Don't get too excited—simply having OLAP does not give you a CMDB, for a couple of very special
reasons. First, most CMDB data resides outside of the CMDB system. Pulling this data from many
sources requires federation—a new CMDB buzzword you will begin to hear about more and more.
By way of an example of federation and what it requires, let's consider an IT service for project
management. Composing this service are human resource information residing in SAP, project
management data in Microsoft Project Server, IT asset information in CA Unicenter, and
networking hardware resource data discovered and stored in CiscoWorks.
Federation
The first major requirement for dimensional modeling is federation, or referencing data from
several sources instead of replicating it. The CMDB is a meta-database, that is, it is a database
that references other databases. The issue that drove federation the first time around was data
validity. If you make a copy of something, then what is definitive? The original or the copy? And
how do you know if the copy is the same as the original?
The issues around federation are how to connect to heterogeneous data sources, resolve which bits
of data are definitive, and then create and store keys with unique data not found in any external
data source but still required. For example, data not found in any of these systems might be the
name of the IT service and which workers use it.
There also has to be a method to store awareness of the types of data to be found in each federated
data source in order to process ad hoc queries.
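As an illustration only, here is a minimal Python sketch of the federation idea: the CMDB stores references plus its own unique data (such as the service name), while the external systems remain the definitive stores. All source names, keys, and records are hypothetical:

```python
# Hypothetical federation sketch: the CMDB is a meta-database that keeps
# only references (source + key), resolving CIs from the definitive
# external systems on demand instead of copying their data.

class FederatedCMDB:
    def __init__(self):
        self.sources = {}      # source name -> lookup function
        self.references = {}   # CI name -> (source, external key)
        self.local = {}        # data found in no external source

    def register_source(self, name, lookup):
        self.sources[name] = lookup

    def add_reference(self, ci_name, source, key):
        self.references[ci_name] = (source, key)

    def get(self, ci_name):
        """Resolve a CI by fetching it from its definitive source."""
        source, key = self.references[ci_name]
        return self.sources[source](key)

# The external systems stay definitive; the CMDB only points at them.
sap_hr = {"E100": {"name": "Jane Doe", "role": "project manager"}}
ciscoworks = {"128.10.0.1": {"type": "router", "model": "2504"}}

cmdb = FederatedCMDB()
cmdb.register_source("SAP", sap_hr.get)
cmdb.register_source("CiscoWorks", ciscoworks.get)
cmdb.add_reference("PM-Service-Owner", "SAP", "E100")
cmdb.add_reference("CISCO01", "CiscoWorks", "128.10.0.1")
cmdb.local["service"] = "Project Management"  # exists only in the CMDB

print(cmdb.get("CISCO01")["model"])  # 2504
```

Note that the service name lives only in the CMDB's own store, exactly the kind of key data found in no federated source.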
As you can see, the idea of federation is easy to state as "connecting to multiple data sources", but
having a CMDB system that can actually federate is a very tall order indeed. And federation is just
one of four equally complicated CMDB technical requirements. Consider this: what if two data
stores reference the same data? Which data store is definitive then, and more importantly, how
would you determine which is definitive? This is the issue of reconciliation.
Reconciliation
Aside from the issues of simply connecting to heterogeneous and possibly competitive data sources,
the big problem with federation for a CMDB is conflicting data. During creation and maintenance
of the contextual information in the CMDB meta-database, key bits of data transfer from federated
data sources to the CMDB data store. Since it is common to have multiple applications and systems
that overlap and monitor the same IT assets or store similar data, data inconsistencies and
redundancies can arise.
For example, in our sample project management service, perhaps CA and CiscoWorks both store
hardware data: Unicenter may refer to a router by name, perhaps “CISCO01”. CiscoWorks may be
aware of the same router, but not by the name “CISCO01”, but rather by its IP address
"128.10.0.1"—this is a real problem since there are not two routers but one. How does the CMDB
system discover that "CISCO01" and "128.10.0.1" refer to the same single router? Further, how
would the CMDB system know it's a router at all? This is the domain of reconciliation.
Reconciliation means adjusting data derived from more than one source to eliminate duplicates
and maintain consistency of data. Federation is useless without reconciliation. Adding even more
complexity to a CMDB system is the need to handle any changes arising from successful
reconciliation, and this leads to synchronization.
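Continuing the router example, here is an illustrative Python sketch of reconciliation. It assumes, purely for demonstration, that both sources happen to expose a shared serial number; a real reconciliation engine would use whatever identifying attributes and matching rules are actually available:

```python
# Hypothetical reconciliation sketch: Unicenter knows the router by name,
# CiscoWorks by IP address. A shared identifying attribute (an invented
# serial number here) lets the CMDB merge them into a single CI.

unicenter = [
    {"source": "Unicenter", "id": "CISCO01", "serial": "SN-4471"},
]
ciscoworks = [
    {"source": "CiscoWorks", "id": "128.10.0.1", "serial": "SN-4471", "type": "router"},
]

def reconcile(*record_sets):
    """Merge records that share a serial number into single CIs."""
    merged = {}
    for records in record_sets:
        for rec in records:
            ci = merged.setdefault(rec["serial"], {"aliases": []})
            ci["aliases"].append((rec["source"], rec["id"]))
            for field, value in rec.items():
                if field not in ("source", "id"):
                    ci.setdefault(field, value)  # first source wins
    return merged

cis = reconcile(unicenter, ciscoworks)
print(len(cis))               # 1 -- one router, not two
print(cis["SN-4471"]["type"])  # router
```

The "first source wins" rule in the sketch is the simplest possible answer to "which store is definitive"; real products make that rule configurable per attribute.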
Synchronization
Most data stored in federated CMDB systems changes—sometimes slowly, sometimes swiftly. For
example, considering our example project management IT service, the name of the project manager
(e.g., the "user") might change, or the type of router hardware might change from a Cisco 2504 to a
Cisco 7502. Reconciliation has to be able to resolve these differences to maintain the integrity
of the CMDB. But this is IT, not "simple" data warehousing. Nothing in a CMDB should change
without a Request for Change (RFC). Thus, a CMDB system that can successfully federate and
reconcile data must also be able to alert when it detects unauthorized changes.
This makes the CMDB system require an awareness of approved changes. Then, when the
reconciliation engine detects and resolves a change in infrastructure or data, it has to compare this
change to an expected list of approved changes and generate an alert if the change is unapproved
(e.g., not planned). This alert brings CMDB data to the attention of its administrators, who need
help visualizing, mapping, and displaying data—modeling.
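The synchronization check described above can be sketched roughly as follows; the RFC records and field names are invented for illustration, and the example reuses the router upgrade mentioned earlier:

```python
# Hypothetical synchronization sketch: every change the reconciliation
# engine detects is checked against the approved-RFC list; anything not
# covered should raise an alert for Change Management.

approved_rfcs = [
    {"rfc": "RFC-1042", "ci": "CISCO01", "field": "model", "new_value": "7502"},
]

def check_change(ci, field, new_value, rfcs):
    """Return the authorizing RFC number, or None if the change is unapproved."""
    for rfc in rfcs:
        if (rfc["ci"], rfc["field"], rfc["new_value"]) == (ci, field, new_value):
            return rfc["rfc"]
    return None

# The router upgrade from a 2504 to a 7502 was planned...
print(check_change("CISCO01", "model", "7502", approved_rfcs))  # RFC-1042
# ...but nobody approved renaming the device, so this should alert.
print(check_change("CISCO01", "name", "CISCO99", approved_rfcs))  # None
```

A `None` result is the trigger for the unauthorized-change alert that brings the data to the administrators' attention.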
Modeling
Modeling means mapping and visualizing the synthesized relationships that define IT services.
Modeling is more than reporting or displaying lists of resource trees and forks. The CMDB has to be
able to visibly display its data in ways that let humans use the information in impact assessments
for Change Management, privilege determination at the Service Desk, troubleshooting by Incident
or Problem Management, and dozens of other ad hoc inquiries from all over IT.
This requirement goes way beyond the simple "directory tree" listings so commonly found in most alleged
CMDB products today. Federation, reconciliation, and synchronization are worthless if a user in IT
cannot get a definitive, understandable answer to their complex questions quickly. This requires
representing complex relationships between CIs graphically, on demand.
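As a hint of what such modeling involves under the hood, here is a small Python sketch that stores CI relationships as a graph and answers an impact question by traversal. The service composition is the hypothetical project management example from earlier, simplified:

```python
# Hypothetical modeling sketch: CI relationships form a graph, so an
# impact-assessment question ("what breaks if this router fails?")
# becomes a simple upstream traversal. Edges are illustrative only.

from collections import deque

# "depends on" edges: service -> components -> hardware
depends_on = {
    "Project Management": ["SAP HR", "MS Project Server"],
    "MS Project Server": ["CISCO01"],
    "SAP HR": [],
    "CISCO01": [],
}

def impacted_by(failed_ci, graph):
    """Walk the graph upstream to find everything depending on failed_ci."""
    reverse = {}
    for node, deps in graph.items():
        for dep in deps:
            reverse.setdefault(dep, []).append(node)
    impacted, queue = set(), deque([failed_ci])
    while queue:
        node = queue.popleft()
        for parent in reverse.get(node, []):
            if parent not in impacted:
                impacted.add(parent)
                queue.append(parent)
    return impacted

print(sorted(impacted_by("CISCO01", depends_on)))
# ['MS Project Server', 'Project Management']
```

Rendering that traversal result graphically, on demand, for any CI and any relationship type, is what separates real modeling from directory-tree listings.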
Summary
An enterprise CMDB is not a database; it is a complex software system that has to federate other
data stores, reconcile alternate views of the same data, detect unauthorized changes, synchronize
approved changes with its own metadata store, and be able to dynamically represent
configurations graphically on demand. This is no small task, and also the reason there are so few
true CMDB solutions available today.
As you go forward with your own CMDB plans, keep the concepts of federation, reconciliation,
synchronization, and modeling in clear view. Dig into them to understand them. If you are in the
throes of purchasing a CMDB product, ask your vendors how they handle these issues. If you are
building your own CMDB solution, ask your developers how they plan to accomplish these tasks.
In all cases make sure you create processes to monitor and ensure that federation, reconciliation,
synchronization, and modeling occur. Failure to manage these critical issues can quickly convert
your CMDB project from an asset to a liability.
Forewarned is forearmed! Now you can at least "talk the talk as you walk the walk!"
Where to go from here:
- digg (discuss or comment) on this article. Show your support for DITY!
- Subscribe to our newsletter and get new skills delivered right to your Inbox, click here.
- Download this article in PDF format for use at your own convenience, click here.
- Use your favorite RSS reader to stay up to date, click here.
Related articles:
- Configuration Management for the Rest of Us for more on establishing Configuration Management.
- Browse back-issues of the DITY Newsletter, click here.