An Introduction to Quantum Lifecycle Management (QLM)
A White Paper Published by The Open Group (www.opengroup.org)
Table of Contents
Why Quantum Lifecycle Management?
Evolution of QLM
QLM Technical Architecture
  QLM Messaging Interface
  QLM Data Model
Business Use-Cases
About The Open Group
Boundaryless Information Flow™
achieved through global interoperability
in a secure, reliable, and timely manner
Quantum Lifecycle Management (QLM) will enable the “Internet of Things” to
have an impact on business and the world at large in a way similar to the Internet
itself. This White Paper introduces QLM, explains why it is necessary, its emergence,
and also how its development and acceptance will be assured under the auspices of
The Open Group.
QLM is of interest to business leaders, system planners, and technology providers
seeking to manage dispersed lifecycle information from disparate physical and
technology objects that transcend enterprise boundaries.
As the description above conveys, The Open Group vision is
Boundaryless Information Flow™, achieved through global interoperability in a
secure, reliable, and timely manner. QLM will extend Boundaryless Information
Flow, not only to the trillions of autonomous objects that will make up the ever-
increasing “Internet of Things”, but also seamlessly integrate them in an open yet
trustworthy manner with existing information sources to create a true System of Systems.
Why Quantum Lifecycle Management?
Quantum Lifecycle Management (QLM) is a major leap beyond Product Lifecycle Management (PLM), and
the name has been chosen to highlight a clear differentiation between the two.
Probably the most significant obstacle to effective, whole-of-life lifecycle management is that valuable
information is all too often locked into vertical applications, sometimes called “silos”. This information is not
readily shared with other interested parties across the Beginning-of-Life (BOL), Middle-of-Life (MOL), and
End-of-Life (EOL) lifecycle phases.
Common, open, and trustworthy information exchange standards for QLM will enable the closing of
information loops, allowing information to be shared across the whole spectrum of lifecycles: product,
human, food and beverage, pharmaceutical, healthcare, supply chain and logistics, pedigree and traceability,
and data governance, among many others.
QLM will allow information from any single lifecycle phase to affect processes and decision-making in the
other phases. For example, information about the condition of products at end-of-life may be fed back and
used to affect the maintenance of similar products during middle-of-life or to improve the design and
production of future product series at the beginning-of-life. Closed loops ensure that valuable information is
available to all lifecycle phases.
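The feedback loop described above can be sketched in code. The following is a minimal, purely illustrative example (function names, wear figures, and the scaling rule are all invented for this sketch): averaged end-of-life wear reports for a product family are fed back to tighten the scheduled maintenance interval of similar products still in their middle-of-life phase.

```python
from statistics import mean

def revised_maintenance_interval(design_interval_h, eol_wear_reports,
                                 design_wear=0.5):
    """Scale the scheduled maintenance interval (hours) by the ratio of
    assumed to observed wear, where wear is measured 0.0-1.0 at EOL.
    If products wear no faster than the design assumed, keep the
    original interval; otherwise shorten it proportionally."""
    observed = mean(eol_wear_reports)
    if observed <= design_wear:
        return design_interval_h        # wearing as expected or better
    return design_interval_h * design_wear / observed

# EOL inspections found 70% average wear where the design assumed 50%,
# so the 5000-hour service interval is shortened to ~3571 hours:
print(revised_maintenance_interval(5000, [0.6, 0.75, 0.75]))
```

The point is not the arithmetic but the direction of flow: data captured at one end of the lifecycle changes a decision made at another, which is exactly what closed information loops enable.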
Figure 1: Closing Information Loops
Closing information loops across all phases of all kinds of lifecycles will make Boundaryless Information
Flow™ a reality, harnessing and harmonizing the technologies and direction of the “Internet of Things”,
embracing its many trillions of additional, often autonomous entities.
Historically, the PLM acronym is usually associated with the Beginning-of-Life (BOL) design and
manufacturing phase, and is often misunderstood to be synonymous with CAD/CAM. Hitherto, PLM has
been mainly focused on information about product types and their versions.
In contrast, QLM extends PLM to include detailed information not only about each individual product
instance – i.e., physical products – but also their usage in the Middle-of-Life (MOL) and End-of-Life (EOL) phases.
Furthermore, QLM standards and infrastructure may also be applied to other kinds of lifecycles such as
supply chain, food and beverage, human, services, etc., thus allowing the aggregation of information about
each single product instance from different applications throughout its lifecycle.
Figure 2: Scope of QLM
Why call it Quantum Lifecycle Management? It’s curious how a single word can come to have almost
opposite meanings. As a noun, in physics the word “quantum” can mean “the smallest amount of a physical
quantity that can exist independently”, whereas in more general use it can mean the opposite: “a large
quantity; bulk”. As an adjective, it means “sudden or significant: a quantum increase in productivity”. In the context of QLM, its usage encompasses both meanings: the quantum leap made possible by
harnessing the technologies and multiplicity of the “Internet of Things”, while embracing trillions of discrete
entities that will be added to the scope of lifecycle management.
Whether we like it or not, this explosion of “things” will have an impact on PLM.
However, to be sustainable, this explosion requires considerable and well-coordinated effort, both in standards and product development, in order to address some quite significant challenges.
The wise know that the “Internet of Things” can only truly become reality if important societal, political, and
cultural concerns are properly overcome. In order to achieve its full potential, the “Internet of Things”
requires a trusted and secure, open, and unified infrastructure for true interoperability. Without this,
continued lack of trust and parallel development of disparate solutions, technologies, and standards will lead
it to become an ever-increasing web of organization and domain-specific intranets. The QLM standards will
provide this infrastructure.
The open interoperability which this enables increases competitiveness of existing products and services
through all lifecycle phases while also creating new business opportunities for manufacturers, distributors,
service providers, consultants, regulators, and government.
Evolution of QLM
QLM has its roots in product planning. The early concept of production planning or Manufacturing
Requirement Planning (MRP) mostly concerned itself with workflow: the coordination of orders, manufacturing to order or for stock, and procurement. MRP evolved into manufacturing “resource”
planning when production capacities and human resources were brought into the equation; and Distribution
Resource Planning (DRP) brought the supply chain and external partners into the network as a natural extension.
When Finance entered the arena, MRPII was born, which also served to quantify and highlight the
significance of interaction between different hitherto often segregated and isolated departments. It might be
said that in an effort to “bring it all together”, Enterprise Resource Planning (ERP) was born, federating
projects and encompassing the enterprise at large.
PLM followed a similar evolutionary path, driven mainly by technological advances in areas of Computer-
Aided Design (CAD), Product Data Management (PDM), and the realization of collaborative processes
between design and manufacturing organizations. Significantly, however, feedback from the field still
remains very limited and almost entirely failure-driven.
In a global marketplace where product quality alone is no longer a guarantee of success, production cycles and lead times are becoming shorter and mass customization is becoming the standard. Producers are therefore looking for additional means of differentiation in order to survive.
Servitization is the evolution from services to support the product, through services to differentiate the
product, to services becoming the product; e.g., car leasing rather than purchase, flight-hours rather than aero
engines, or a warm home rather than a boiler (furnace). Sustained competitiveness in such a service
environment demands extensive and continual feedback on product use and performance compared to cost.
Advances in areas of microelectronics, nanotechnologies, sensors, and Information and Communication
Technologies (ICT) have opened new opportunities with proven benefits, combining multiple sensor
technologies with automated identification and data capture technologies. Item-specific data can be combined
with information from existing enterprise systems and converted into knowledge. This aggregated knowledge
may be used to drive decision support systems in a virtually unlimited variety of application areas.
Highly selective information can be delivered or subscribed to across organizational boundaries to fuel new
service opportunities, with increased efficiency and sustainability.
Social networking has highlighted the potential of interconnected people. Interconnecting objects that are
able to communicate with each other (the “Internet of Things”) requires structured and auditable protocols to
deal with trust and security of information.
QLM Technical Architecture
Underlying QLM is a robust Technical Architecture which will provide an open, secure, and trustworthy
infrastructure for the exchange and processing of lifecycle management information throughout all lifecycle
phases, but in particular the Middle-Of-Life (MOL) and End-Of-Life (EOL) phases.
The foundation of the QLM Technical Architecture was originally developed during the EU PROMISE
Project in the specific context of Product Lifecycle Management (PLM). However, it became clear that
the applicability of the architecture extends beyond PLM.
In QLM, the Internet is the main medium for communication between the different information sources, no
matter whether they are embedded in all kinds of products, running on back-end servers, in the Cloud, or
anything in between.
The QLM Technical Architecture will enable the open realization of the “Internet of Things” by defining
standards, interfaces, and components that allow the creation of a QLM implementation in a flexible manner.
It supports the development of innovative new technology components, yet at the same time allows the
integration of existing technologies and systems to form a consolidated, flexible infrastructure for the
collection, processing, and exchange of lifecycle data.
The QLM Technical Architecture is designed to support and encourage the flow of lifecycle data between
multiple enterprises throughout the life of an entity and its components.
Figure 3 gives a conceptual impression of the variety of systems, technologies, and products that, using the QLM Technical Architecture interfaces and technologies, can participate in QLM and exchange lifecycle data, thus closing the lifecycle information loop.
Figure 3: QLM Conceptual Connectivity
The QLM Technical Architecture can be applied to many other fields besides simply managing product
lifecycles; for example, healthcare lifecycle management, supply chain, and tracking and tracing of
foodstuffs – indeed, any field that needs data for Internet-connected “things”.
The QLM Work Group was formed as a working group under the auspices of The Open Group in order to
formalize the results of the EU PROMISE Project and convert them to published open standards.
It has initially adopted two of the components that comprise the QLM Technical Architecture, which will be
further developed and formalized through The Open Group:
• The QLM Messaging Interface
• The QLM Data Model
QLM Messaging Interface
The QLM connectivity model is similar to that of the Internet itself. Where the World Wide Web uses the HTTP protocol for transmitting HTML-coded information mainly intended for human users, QLM uses the QLM Messaging Interface for transmitting XML-coded information mainly intended for automated processing by machines.
The QLM Messaging Interface provides a flexible interface for making and responding to requests for
instance-specific information. A defining characteristic of the QLM Messaging Interface is that nodes do not
have predefined roles, as it follows a “peer-to-peer” communications model. This means that products can
communicate directly with each other or with back-end servers, but the QLM Messaging Interface can also
be used for server-to-server information exchange of sensor data, events, and other information.
The QLM Messaging Interface allows one-off information requests or standing subscriptions to be made.
Subscriptions can be made for receiving updates at regular intervals or on an event basis – when the value or
status changes for the information subscribed to. The QLM Messaging Interface also supports read and write
operations of the value of information items.
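The subscription mechanism described above can be illustrated with a short sketch. Note that the element and attribute names below are invented for this illustration and are not the normative QLM Messaging Interface schema: the sketch merely shows the shape of an XML request for an information item, delivered either at a regular interval or on change of value.

```python
import xml.etree.ElementTree as ET

def build_subscription(target_id, info_item, interval_s=None):
    """Build an illustrative XML subscription request for one
    information item of one product instance. With interval_s set,
    updates arrive at that interval; with interval_s=None the
    (invented) convention interval="-1" requests event-based
    delivery, i.e., whenever the value changes."""
    req = ET.Element("read")
    req.set("interval", str(interval_s) if interval_s is not None else "-1")
    obj = ET.SubElement(req, "object")
    ET.SubElement(obj, "id").text = target_id
    ET.SubElement(obj, "infoItem").text = info_item
    return ET.tostring(req, encoding="unicode")

# Subscribe to bearing-temperature updates from one product instance
# every 60 seconds (identifiers are hypothetical):
print(build_subscription("pump-SN-001234", "BearingTemperature", 60))
```

Because nodes have no predefined roles in the peer-to-peer model, the same request could be sent product-to-product, product-to-server, or server-to-server.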
QLM Data Model
The QLM Data Model was initially conceived as the basis for the information model for the Product Data
Knowledge Management/Decision Support System (PDKM/DSS), one of the most important components of
the overall PLM system developed by the former EU PROMISE Project.
It enabled detailed information about each and every instance of a product to be augmented with “field data”;
i.e., detailed information about the usage and changes to each instance during its life.
It also allowed the aggregation of instance-specific data from many different software systems; e.g., CAD,
CRM, and/or SCM and other legacy systems as part of a company’s IT infrastructure in order to allow
specific decision support information to be generated and made available through the PDKM system.
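A minimal sketch can make the idea of instance-level field data concrete. The class and field names below are invented for this illustration and do not reproduce the actual QLM Data Model: they simply show a physical product instance that links back to its type-level (BOL) data while accumulating usage events contributed by different source systems during its life.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class FieldEvent:
    """One item of field data recorded against a product instance."""
    timestamp: str   # ISO 8601 timestamp of the observation
    source: str      # originating system, e.g., "sensor", "CRM", "SCM"
    name: str
    value: str

@dataclass
class ProductInstance:
    serial_number: str
    product_type: str                      # link to type-level (BOL) data
    field_data: List[FieldEvent] = field(default_factory=list)

    def from_source(self, source):
        """Aggregate the events contributed by one source system."""
        return [e for e in self.field_data if e.source == source]

pump = ProductInstance("SN-001234", "PumpModelA")
pump.field_data.append(
    FieldEvent("2012-05-01T10:00:00Z", "sensor", "vibration", "high"))
pump.field_data.append(
    FieldEvent("2012-06-02T09:00:00Z", "CRM", "complaint", "noise"))
print(len(pump.from_source("sensor")))  # 1
```

Aggregating such per-instance events from CAD, CRM, SCM, and legacy systems is what allows a PDKM-style system to generate decision support information specific to each physical product.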
The first step undertaken by the PROMISE Project was to analyze the relevant industrial standards in the
field of product lifecycle data modeling. This study also provided many useful ideas for the development of
the model. The most relevant standards that were analyzed are:
• STEP (ISO 10303)
• STEP NC (ISO 10303-238, also known as AP238)
• PLCS (ISO 10303-239, also known as AP239)
• MANDATE (ISO 15531)
• PLM XML
• ANSI/ISA-95 (IEC 62264-1)
These standards have some properties and features in common, but are also distinguishable by many
remarkable differences. First of all, they were designed by different organizations, with different scopes and
for different targets. STEP, STEP NC, PLCS, and MANDATE can at first sight be grouped together, because
they are all ISO (International Organization for Standardization) standards; furthermore, PLCS is an
application protocol of STEP.
STEP is an industry standard for product data representation and it is composed of several parts (application
protocols) whose aim is to focus on a specific industrial context. There are application protocols for product
design, for mechanical and electrical engineering, for sheet-metal manufacturing, for product assembly, for
the automotive industry, etc.
PLM XML is an open standard, developed mainly by EDS (Electronic Data Systems Corporation) and later
by Siemens PLM Software, dealing with the product design phase.
ISA-95 is an ANSI (American National Standards Institute) standard, except for its first part, ANSI/ISA-95.00.01, which is also an international standard (IEC 62264-1). Together, ANSI/ISA-95 Parts I, II, and III describe the interfaces and activities between an enterprise’s business systems and its manufacturing control systems; it thus focuses mainly on the area corresponding to the production phase of a product.
Another interesting standard, which is not included in the list given above because it focuses on the exchange
of product “type” lifecycle data, is PLM Services. PLM Services is mainly focused on the design phase. The
underlying data model itself is compliant with STEP AP214.
While PLCS does address some areas required by the QLM Data Model, a much simpler, more “lightweight”
model is required for the majority of application domains. However, interoperability with PLCS and STEP
remains an important objective.
Business Use-Cases
The Open Group QLM Work Group is also preparing a QLM Business Guide which contains brief overviews
of several different industrial application use-cases that reveal the broad applicability of QLM and some of
the diverse business advantages and opportunities enabled by using QLM to close lifecycle information loops
in a consistent manner.
References
• “Internet of Things”, Wikipedia (http://en.wikipedia.org/wiki/Internet_of_Things)
• Definition of “quantum” (noun), Dictionary.com (http://dictionary.reference.com/browse/quantum)
• Definition of “quantum” (adjective), Dictionary.com (http://dictionary.reference.com/browse/quantum)
• The PROMISE Project (2004-2008): a European Union research project funded under the 6th Framework Program (FP6) which focused on information systems for whole-of-life product lifecycle management
The Open Group acknowledges the contributing members of the QLM Work Group:
• Kary Främling, Aalto University
• Julian Shelbourne, Capgemini Financial Services
• Jacopo Cassina, Holonix Srl
• David Potter*, Promise Innovation (QLM Work Group Chair)
* Primary Contributor
Open Group Staff acknowledgements:
• Mike Hickey, Director, Collaboration Services
• Martin Kirk, Director, The Open Group QLM Work Group
• Dave Lounsbury, CTO
About The Open Group
The Open Group is a global consortium that enables the achievement of business objectives through IT
standards. With more than 400 member organizations, The Open Group has a diverse membership that spans
all sectors of the IT community – customers, systems and solutions suppliers, tool vendors, integrators, and
consultants, as well as academics and researchers – to:
• Capture, understand, and address current and emerging requirements, and establish policies and share best practices
• Facilitate interoperability, develop consensus, and evolve and integrate specifications and open source technologies
• Offer a comprehensive set of services to enhance the operational efficiency of consortia
• Operate the industry’s premier certification service
Further information on The Open Group is available at www.opengroup.org.