Coordinating SGML Projects to Maximize Corporate Benefits was the original title of this 1995 article. Although it hails from the past, its lessons for markup technologies, the management of standards, and the handling of corporate politics still ring true. It also shows how common forces drove the emergence of practices that we now see in the Darwin Information Typing Architecture (DITA).
Coordinating SGML Projects to Maximize Corporate Benefits
CALS Expo International '95
Conference Proceedings [pp 247-253]
National Security Industrial Association
Long Beach, California – 1995
Joseph Gollner – Euclid Consulting Group
Ernest O'Dell – Department of National Defence CALS Office
Abstract
The Canadian Department of National Defence (DND) has made reducing the
cost of published information a major priority. The Department recognizes that a
sizable portion of potential savings will result from improved information quality and
usability. The challenge lies in ensuring that all areas of potential savings and quality
improvements are addressed, and addressed in a fashion that facilitates a
constructive synergy between modernization activities. DND is looking at a number of
technologies and management disciplines as tools for meeting these objectives.
Emphasis is being placed on object orientation in the design, development and
deployment of reusable Standard Generalized Markup Language (SGML) components
and the application of Configuration Management (CM) practices to these objects.
DND is currently designing an environment that will allow projects to share SGML
objects and continuously coordinate the implementation of new capabilities without
undermining the corporate benefits of an integrated approach.
1: Maximizing Corporate Benefits
The kind of reductions and cost savings currently being demanded of DND
can only be realized through department-wide improvements in efficiency. This is
widely accepted. There is even a general management consensus that many of these
efficiencies will be the result of better resource sharing, with resources understood to
be any type of asset – data, infrastructure, personnel or systems. This environment has
been conducive to the growth of SGML in DND. SGML is seen as a mechanism for
improving the quality, timeliness and cost-effectiveness of documentation through the
use of standard document structures and corporately-shared systems for creating,
publishing and maintaining information.
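
The "standard document structures" referred to here would, in practice, be shared SGML document type definitions (DTDs). A minimal sketch of what one such corporately shared structure might look like follows; the procedure document type and its element and attribute names are hypothetical illustrations, not drawn from the paper:

```sgml
<!-- Hypothetical fragment of a corporately shared DTD for a
     maintenance procedure. Element and attribute names are
     illustrative only. The "- -" markers state that neither
     the start tag nor the end tag of the element may be omitted. -->
<!ELEMENT procedure - - (title, step+)>
<!ATTLIST procedure
          id        ID     #REQUIRED
          version   CDATA  #IMPLIED>
<!ELEMENT title     - - (#PCDATA)>
<!ELEMENT step      - - (para+)>
<!ELEMENT para      - - (#PCDATA)>
```

Any document instance conforming to such a shared DTD could be created, validated, and published by the same corporately shared applications, which is precisely the kind of resource sharing the paper has in view.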
Following the line of thinking that stresses the importance of sharing resources,
there are some who would conclude, erroneously, that the surest way to realize these
corporate benefits lies in establishing standards and mandating compliance. This is an
erroneous conclusion in part because it has been tried before and has failed. In fact, it has
been tried many times before and has always failed. The main reason why this is an
erroneous conclusion, and why attempts to base implementations on this conclusion have
always failed, stems from the fact that a single, static solution cannot successfully address
any problem where variability and change are significant factors. Unfortunately, variability
and change characterize most of the problems confronting large organizations like DND.
This is not to say there are no solutions, only that there are no easy or fixed solutions.
Let's translate this into the SGML domain. Say that it is one of our primary
objectives, and it is, to ensure that the many SGML projects ongoing within DND
ultimately result in a set of corporately-shared applications to facilitate the creation,
publication and maintenance of information. The development cost of these
applications would stabilize after their completion while the benefits associated with
their use would grow for as long as these applications remained operational. We might
find this so compelling a prospect that we establish a management infrastructure for
SGML initiatives that would ensure that all initiatives and Programs use these
applications, and the document structures that support them. Through various policy
instruments, we might even attempt to strip Programs of the ability to deviate from
the mandated standards and practices. To guard against the possibility that the
corporately-shared applications may not yet be perfect, we would establish a change
management process whereby modifications would be proposed by Programs or SGML
initiatives and approved by a suitable control board after an exhaustive analysis of the
impact on all corporately-shared applications. Having done all this, we would feel that
we had succeeded and had made a sizable contribution to the efficiency objectives of
our organization. We would be wrong.
What we would have in fact done is commit the logical error outlined above.
Our desire to establish corporately-shared SGML applications would have been valid.
Even our desire to promote uniformity among implementations to safeguard the
benefits of corporately-shared applications would have been essentially valid. Where
we made our error was in hoping, even believing, that once established the
corporately-shared applications would be basically sufficient to address unforeseen or
changing requirements. We also erred in thinking that a single, centrally-managed
process for change control could successfully handle the rising tide of change requests
that were driven, not by minor deficiencies in the corporately-shared applications, but
by new requirements from new Programs and new initiatives. This latter error was the
fatal one. What we should have done was make the change management environment
itself a corporately-shared application that was designed to ensure that all corporately-
shared applications would be adaptive to the variability of Program requirements and
the inevitability of change.
2: SGML in DND
Within DND, SGML has been enjoying a steady rise in popularity. Many policy
documents, including the Information Management Directives that establish the role of
CALS (Continuous Acquisition and Lifecycle Support) within the Department, are
currently authored and published using SGML. These documents now find their way, in
electronic form, into every unit and organization within DND. In the domain of
technical information, the DND CALS Office has made significant progress in
developing, validating, implementing and modifying a Document Type Definition (DTD)
to describe and manage all operations and support information currently published for
DND equipment. With this expanding base of SGML-encoded information, the
opportunity presents itself to develop, and introduce into service, corporately-shared
applications designed to leverage the investment in SGML structures and improve
documentation quality and usability.
3: Departmental Initiatives and the Role of SGML
3.1: Automated Translation
Bilingualism is a reality in Canada and the cost of translation between the two
official languages is something DND must continually address. One area of ongoing effort
is automated translation whereby an application attempts to establish a sufficiently
accurate "understanding" of a text to be able to render a tolerable translation. This can
be sufficient for "ad hoc" translations (i.e. to quickly determine the general content of a
text) but, depending upon the clarity of the source text, there can remain
insurmountable ambiguities. Obviously, this is not acceptable for official documentation
which must always be available in both official languages. This applies, significantly, to all
technical documentation. In these cases where accuracy counts, automated translation
can be used to perform the initial translation with the intent that this will reduce the
amount of time and effort required by highly skilled translators to complete the
translation. It is for this reason that SGML DTDs within DND associate English and French
content as consecutive elements typically within a container element. High volume
automated translation applications will be able to "fill" one language element based on
the content of its twinned counterpart. This approach permits translators to view both
elements (English and French) simultaneously, saving on cross-referencing time, and
facilitates verification that everything that requires translation has been translated.
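The twinning of language elements lends itself to straightforward tooling. The following Python sketch illustrates the idea using XML in place of SGML; the `bilingual`, `english` and `french` element names are hypothetical stand-ins, since the actual DND DTDs define their own container and language elements. It flags containers whose French twin is still empty and therefore awaits (automated) translation:

```python
import xml.etree.ElementTree as ET

# Hypothetical element names; the DND DTDs define their own container
# and twinned language elements for bilingual content.
INSTANCE = """
<manual>
  <bilingual>
    <english>Remove the access panel.</english>
    <french>Retirer le panneau d'acces.</french>
  </bilingual>
  <bilingual>
    <english>Inspect the cable harness.</english>
    <french/>
  </bilingual>
</manual>
"""

def untranslated_pairs(xml_text):
    """Return the English source text of every container whose French
    twin element is still empty and awaits translation."""
    root = ET.fromstring(xml_text)
    pending = []
    for pair in root.iter("bilingual"):
        en = pair.findtext("english", default="").strip()
        fr = pair.findtext("french", default="").strip()
        if en and not fr:
            pending.append(en)
    return pending

print(untranslated_pairs(INSTANCE))  # ['Inspect the cable harness.']
```

An automated translation application would iterate over exactly this list, writing its output into the empty twin element for a human translator to verify side by side.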
3.2: Simplified English
The language of composition for most documentation in DND is English and
therefore the typical translation process takes content from English and translates it into
French. As mentioned previously, ambiguities in the source text make translation
(automated and manual) more difficult and therefore more expensive and error-prone.
English is notoriously prone to ambiguities and it is generally conceded that reducing the
cost of translation from English will be a function of reducing the ambiguity of the English
source. This is, in part, the rationale for Simplified English (SE). SE also realizes benefits in
making the English text more readable for native English speakers. In effect, SE
establishes a constrained vocabulary and restricted syntax that are applied to the
content of official documentation. Different SE rules can be applied to different
document types (e.g. a Policy document versus a technical manual) and to different parts
of the same document (e.g. theory of operations versus a safety precaution). The link to
SGML is obvious in that SE rules can be invoked according to the appropriate document
type and element. For example, the SE rules may be very explicit about the vocabulary
and syntax of a maintenance step requiring that the sentence describing a step always
begin with an action verb (themselves selected from a restricted list of possible action
verbs). This type of quality control application would be directly dependent upon the
available SGML structures and element names (e.g. <Mainttask> <Step>).
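A single SE rule of this kind is easy to sketch. The Python fragment below checks that a maintenance step opens with a verb drawn from a restricted action-verb list; the verb list is illustrative only, not the official SE dictionary:

```python
# A minimal sketch of one Simplified English rule: a maintenance step
# must begin with an approved action verb. The list is illustrative.
APPROVED_VERBS = {"remove", "install", "inspect", "connect",
                  "disconnect", "torque"}

def check_step(step_text):
    """Return None if the step passes the rule, else a diagnostic."""
    words = step_text.strip().split()
    if not words:
        return "step is empty"
    first = words[0].strip(".,").lower()
    if first in APPROVED_VERBS:
        return None
    return f"step must begin with an approved action verb, got '{first}'"

print(check_step("Remove the four retaining bolts."))   # passes: None
print(check_step("The four retaining bolts are removed."))
```

In a full implementation, the rule set invoked would be selected by document type and by the element (e.g. a <Step> inside a <Mainttask>) in which the text occurs.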
3.3: Defence Terminology Management System
Implementations of SE must make specific exceptions for technical terminology
which obviously will not comply with the requirements of the constrained vocabulary.
The management of this terminology is the function of the Defence Terminology
Management System (DTMS). As with SE, the DTMS seeks to enforce a consistent
usage of terminology across the Department such that texts are unambiguous and
easily translated. SGML will be used to explicitly identify occurrences of terminology
(e.g. <term>) within the text and associate each occurrence of the term with the
appropriate acronym and definition, themselves defined within the text. The strategy
of the DTMS is to execute terminology control during the authoring process as
opposed to using a downstream review activity. Terms will be defined, delineated with
markup, and validated by authors with the support of intelligent applications that can
locate terminology occurrences and ensure that usage is consistent across the
document and reconciles with the departmental term database. The departmental
term database is itself conceived as a "live" database based upon document instances
that incorporate approved terminology. The population of this database from active
documents will rely very heavily on the use of the appropriate SGML elements.
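The reconciliation step can be pictured as follows. This Python sketch scans a text for <term> markup and reports occurrences that cannot be matched against the term database; the database contents here are purely illustrative:

```python
import re

# Hypothetical term database; in the DTMS this is a "live" database
# populated from document instances carrying approved terminology.
TERM_DB = {"line replaceable unit": "LRU",
           "ground support equipment": "GSE"}
# Accept either the full term or its approved acronym.
KNOWN = set(TERM_DB) | {v.lower() for v in TERM_DB.values()}

def audit_terms(sgml_text):
    """Report every <term> occurrence that cannot be reconciled with
    the departmental term database."""
    return [m.group(1).strip()
            for m in re.finditer(r"<term>(.*?)</term>", sgml_text)
            if m.group(1).strip().lower() not in KNOWN]

doc = ("Replace the <term>line replaceable unit</term> "
       "and stow the <term>tool kit</term>.")
print(audit_terms(doc))  # ['tool kit']
```

Run at authoring time, such a check lets the author resolve the flagged term (or propose it for approval) before the document ever reaches a downstream review.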
3.4: Publishing Services
The most immediate, and probably most significant, cost savings to be realized
through a corporately-shared system will be in the publishing domain. Standardized
formatting instructions are being developed that will apply to SGML instances created
according to known (and managed) DTDs and implement departmental formatting
guidelines. There is also a requirement for shared publishing systems that will process
formatting instructions and instances for the production of all required output formats.
Finally, there is a need for a department-wide capability in high-volume on-demand
printing, and electronic dissemination and retrieval. DND has initiatives underway to
provide all of the above. The objective is to establish a robust and adaptable SGML
publishing capability within the department. Establishing this type of capability has
consequences for many SGML components and, in particular, those that are directly
processed by the publishing system and that must provide sufficient information to
allow formatting instructions to be applied successfully. Base text elements (e.g.
paragraphs), tables and graphics must all manifest a measure of standardization and
completeness for formatting purposes. Once in place, however, the benefits to be
realized through a corporately-shared publishing system are not only significant in
terms of cost but also in terms of reduced complexity. Programs can conceivably
reallocate resources to content preparation and the specific objectives of the Program
as opposed to engaging in expensive publishing projects.
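The essence of corporately-shared formatting instructions is a single rule set, keyed by element, applied to any instance from a managed DTD. The Python sketch below renders a toy instance into one output format (trivial HTML); the element names and rules are illustrative, not the departmental guidelines:

```python
# Illustrative shared formatting instructions: one rule set, keyed by
# element name, applied uniformly to any instance from a managed DTD.
RULES = {
    "title":   ("<h1>", "</h1>"),
    "para":    ("<p>", "</p>"),
    "warning": ("<p><b>WARNING: ", "</b></p>"),
}

def publish(sgml_text):
    """Apply the shared formatting rules to produce one output format."""
    for elem, (open_out, close_out) in RULES.items():
        sgml_text = (sgml_text
                     .replace(f"<{elem}>", open_out)
                     .replace(f"</{elem}>", close_out))
    return sgml_text

print(publish("<title>Fuel System</title>"
              "<warning>Vent before servicing.</warning>"))
```

The same instance, processed through a different rule set, would yield print-ready or on-demand output; the Program supplies content, the shared system supplies formatting.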
3.5: Integrated Equipment Management Environment
It is understood that technical publications do not exist in a vacuum and that a
high proportion of the information typically bundled in a "technical manual" in fact
emerges from other sources. Behind the entire push toward electronic SGML-based
technical information is the desire to improve the currency of accessed information
(e.g. configuration status on the equipment being serviced), the level of reuse of
information created during established processes (e.g. logistics data) and the ability of
maintainers to execute actions when required in the context of a larger procedure (e.g.
ordering a replacement part). It comes as no surprise, then, that DND is proceeding
with efforts to establish an integrated environment for managing equipment
information. The following are a sample of initiatives underway to ensure that SGML-
encoded information fits neatly within a larger data management regime:
Logistics Support Analysis Report (LSAR) / DND CALS DTD integration is being
prototyped to ensure that all information created during the Logistic Support
Analysis (LSA) process that will be reused within published technical information
has corresponding destination elements in SGML document structures;
Equipment Program Management Information System is an initiative, under the
Operation Excelerate re-engineering project for the Materiel Group, that will
establish an integrated working environment for the management of equipment
information with the consequent impact upon all information extracted from
this environment and published in SGML;
Material Support Data Model (MSDM) is a conceptual data model being
prepared for the information requirements of materiel support. Data definitions
established here will have a broad influence on all applications, including SGML-
based applications (e.g. DND CALS DTD), within the Materiel Group;
Management of Government Information Holdings (MGIH) is a government-
wide policy directing the manner in which departments, such as DND, manage
their information holdings. This impacts the way identification and management
information is described by departmental DTDs and implemented in the
resulting SGML instances.
4: Design, development and maintenance of shared SGML objects
The variety of initiatives within DND, each impacting upon different aspects of
SGML implementation, forces us to consider the issue of establishing, enforcing and
managing corporately-shared systems and standards. There are obvious potential
benefits offered by quality improvements, reduced processing costs (e.g. publishing
and translation) and better integration between published information and the
management processes for that information. However, it is also immediately apparent
that the complexities of implementing the required corporately-shared applications,
and coordinating the impact upon in-service SGML components, pose serious risks to
ever realizing these benefits. Having overcome these risks, the temptation will be great
to force closure on SGML component development and modification as soon as these
corporately-shared applications have been successfully implemented. As we have seen
before, surrendering to this temptation would be a mistake. How then will we bring
into service corporately-shared applications for translation, Simplified English,
terminology management, publishing and information management, bearing the
separate costs of each implementation, and ensure that new or changing requirements
can be accommodated without incurring an unacceptable cost in modifying each
application to suit each deviation? The answer lies in the way we design, develop and
maintain SGML objects.
4.1: Object-Orientation
To date, we have designed SGML DTDs, and associated components (e.g. output
specifications), strictly on an application by application basis. As soon as we must
simultaneously address the requirements of existing corporately-shared applications in
addition to the specific application being considered, then, as DTD designers, we are
constrained in the way we address certain functionality. What is important to
determine is where, and in what manner, we are constrained if we plan to produce an
SGML application that can also exploit the corporately-shared support applications.
Once this has been determined, we can assess what functional trade-offs may be
necessary due to conflicts between corporate and local requirements. To arrive at this
point, we must be able to access SGML sub-components, or objects, that provide the
building blocks for incorporating corporately-shared functionality into new SGML
constructs. Following this process also has a general impact in that our DTD will be a
modular construct rather than being a single, self-defining entity.
In this context, SGML objects are understood to be coherently defined groups of
inter-dependent SGML components. On the first level, an SGML object is comprised of
a logically integral DTD fragment together with the directly associated processing
instructions or requirements (e.g. formatting specification instances) for all
applications (corporate and local) to process information complying with the DTD
fragment. On a second level of detail, an SGML object contains all the information
required to understand and manage the functionality provided, or supported, by the
object. The SGML object includes all documentation for elements/attributes of the DTD
fragment, their associated processing requirements, the associated application
functionality, the user community and usage scenarios, current implementations, test
and demonstration instances, and management information about the object.
Management information would include identification, domain, version, revision,
Office of Primary Interest (OPI) or sponsor, Offices of Collateral Interest (OCI) or
stakeholders, creation and revision dates, and registration status (e.g. in development
or approved for use). By taking this approach we have effectively created "functional
objects" that encapsulate the data definitions (DTD fragment) together with the
associated methods (functionality). What we have, as a result, are meaningful SGML
building blocks that make it as straightforward as possible for DTD developers to
identify and understand functionality that is currently provided by corporately-shared
applications.
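The management information attached to an SGML object can be pictured as a simple record. In the Python sketch below, the field names follow the article (OPI, OCI, registration status), while the sample values are purely illustrative:

```python
from dataclasses import dataclass, field

# A sketch of a registered SGML object: data definitions (DTD fragment)
# encapsulated with the associated functionality and management data.
@dataclass
class SGMLObject:
    identifier: str
    domain: str
    version: int
    revision: int
    opi: str                        # Office of Primary Interest / sponsor
    oci: list = field(default_factory=list)   # Offices of Collateral Interest
    status: str = "in development"  # or "approved for use"
    dtd_fragment: str = ""          # the logically integral DTD fragment
    applications: list = field(default_factory=list)  # dependent shared apps

base_text = SGMLObject(
    identifier="BASE-TEXT", domain="common structures",
    version=2, revision=1, opi="DND CALS Office",
    oci=["Publishing Services", "DTMS"],
    status="approved for use",
    dtd_fragment="<!ELEMENT para - - (#PCDATA | emph | term)*>",
    applications=["publishing", "automated translation", "SE validation"],
)
print(base_text.status)  # approved for use
```

The point of the record is the `applications` field: a DTD developer browsing the registry sees at once which corporately-shared functionality rides on the fragment being considered.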
Recalling the corporate applications being developed within DND, we can begin
to see how this approach would operate in practice. Support for automated translation,
Simplified English validation, terminology management, publishing services and
information management guidelines has a direct impact upon the SGML object that
defines base text elements as well as common structures such as tables. In establishing
this logical grouping as an SGML object, the close link between the SGML structures
and the associated functionality is made explicit. Changes to base text elements can
potentially have dramatic implications for the associated functionality. Support for
Simplified English validation and information management guidelines (in particular,
data integration) depends on aspects of content-specific deployments of SGML (e.g.
<Mainttask>). In the case of supporting the requirements of departmental information
management guidelines (MGIH), there is a very specific association of content elements
(e.g. <subject>) with desired functionality (e.g. archival storage and retrieval).
4.2: Configuration Management
Once we have created a set of SGML objects, what remains to be done is
establish the overall discipline of Configuration Management (CM). The intent, of
course, is not to deploy CM as an impediment to change but rather as a facilitator of
orderly, effective and timely change. With SGML objects that have been defined with
the intent of managing "functional objects", it becomes a natural extension to
implement the change review and status accounting practices that derive from
applying CM to these objects. Because data structures and functionality are
encapsulated together in the SGML objects, it is possible to discretely identify the
variability tolerance of different SGML elements within an object. As an example, a
proposed change to the mixed content model for English text (e.g. changing the use of
attributes within the emphasis element) may introduce processing complexities that
would very quickly indicate that the cost of altering the corporately-shared applications
outweighs the perceived benefits of the proposed change. Due to the number of
applications that depend directly upon a predictable structure within the base text
elements of an instance, the variability tolerance of the SGML object encompassing
these base elements may be, in general, determined to be low. Other SGML objects,
however, may be far more tolerant of variations and in fact support a large number of
implementation-specific element variants. This is the critical point about the
application of CM to SGML objects. With the proposed approach, requested changes
are reviewed in the context of the real impacts of that change. If a proposed change
(such as the introduction of a Program-specific content element) requires only minor
changes to a single shared application then the change could be rapidly approved with
the proviso that the requesting organization fund the minor changes so as to retain the
benefits of the impacted shared system. This is the type of interaction between local
implementers and corporate standards that encourages participation in the corporate
change management process rather than forcing Program offices, with set budgets and
schedules, to "drop out" of the process due to unresponsiveness or unreasonable (i.e.
unfounded) constraints. This is in distinct contrast to some change management
philosophies that treat any change request as a threat to the free world.
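Impact-driven review of this kind reduces to a simple rule. In the Python sketch below, each SGML object records the shared applications that depend on it, and a proposed change is classified by how many of those applications it actually touches; the object names, dependency lists and thresholds are illustrative:

```python
# Sketch of impact-driven change review: classify a proposed change by
# its real impact on shared applications, not by the mere fact that it
# is a change. Object names and dependencies are illustrative.
DEPENDENTS = {
    "base-text": ["publishing", "translation", "se-validation", "dtms"],
    "program-content": ["publishing"],
}

def review_change(object_name, touched_applications):
    """Return a disposition proportional to the change's real impact."""
    impacted = [a for a in touched_applications
                if a in DEPENDENTS[object_name]]
    if not impacted:
        return "approve"
    if len(impacted) == 1:
        return "approve if requester funds the change to " + impacted[0]
    return "refer to control board: impacts " + ", ".join(impacted)

print(review_change("program-content", ["publishing"]))
print(review_change("base-text", ["publishing", "translation"]))
```

An object with a long dependency list (such as base text elements) will naturally score as low-tolerance, while a Program-specific object with one dependent application can absorb change quickly and cheaply.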
4.3: Shared Development Environment (SDE)
Responsiveness becomes a key aspect in determining whether or not a
corporate change management process for SGML objects is ultimately successful in
accommodating necessary changes while protecting the benefits of corporately-shared
applications. The desirability of responsiveness to local implementers attempting to
address real requirements has the effect of making the change management process
itself a corporately-shared application. It is for this reason that DND is proceeding with
plans to implement a comprehensive SGML Registry and Repository (SRR) system
where implementers will directly access SGML objects and, as necessary, initiate the
change review processes associated with these SGML objects. It would be a deliberate
move to name this shared application a Shared Development Environment (SDE) as
opposed to a central change management process. This would make it clear that the
SDE is a resource to be used by Programs in implementing higher quality SGML
applications. In effect, this would recognize that the value of corporately-shared
applications is the product of effective usage which is itself the product of the
responsiveness and effectiveness of the SDE. Understanding the SRR as an SDE would
also recognize the reality that local implementers within Programs will deviate from
using corporate SGML objects as soon as the price of compliance and foregone
functionality outweighs the benefits of the corporate applications. The operation of
the SDE becomes a service to SGML implementers.
4.4: A Typical SDE Process
An example will serve to illustrate how the SDE would function. Suppose that we
have been asked to form a DTD development team to address the publication
requirements of a new piece of communications equipment. To date, we have had
little or no exposure to the departmental information management and publications
bureaus. After a few inquiries, we learn of the existence of the SGML Registry and
Repository and its availability as a Shared Development Environment for use by the
team. Accessing the SDE, we encounter a vast repository of information regarding
corporately-shared applications that can be used to address our requirements. We also
discover that we are not alone and that there are other Program initiatives that, from
the documentation available, appear to share a remarkable similarity to our own.
Once the appropriate resources have been added to our DTD team, we begin
the obvious task of assessing how many of our requirements are addressed by the
available SGML objects. We discover that the publication specifications called up in our
supplier contract are automatically supported by in-house publishing systems if we
adopt the corporate SGML object for base text elements including tables. This makes
our first decision, that of adopting this particular object, an easy one. Further analysis
indicates, however, that we do have a requirement for a set of content tags that are
very specific to the target piece of equipment although many of the available content
tags, such as those addressing standard maintenance procedures, will suit our
purposes perfectly.
The DTD team directs its energies to modelling these unique features. The DTD
team makes extensive use of the online reference material provided by the SDE. The
team finds industry-adopted DTDs for comparable communications equipment and
borrows heavily to address our requirements. The team also finds a DND SGML object
that embodies content elements that are closely related to those we seek to develop.
The team decides to address our requirements by customizing (through augmentation)
this SGML object. Once this modelling is complete, we assemble the building blocks into
what will become our DTD. We are ready to invoke some of the services of the SDE.
The DTD team electronically submits a request for a specific range of services
from the SDE administrators. Firstly, we would like a technical review to be performed
on our DTD. This review will test our DTD for validity, using multiple parsers,
compatibility with multiple commercial applications used in the Department, and for
processability by corporately-shared applications such as publishing services.
Simultaneously, we request that the variant of the SGML object that we have created
by adding new elements to an existing SGML object be considered for certification as
either the new official version of the SGML object or as an approved variant.
The results of the technical review are returned to us electronically after many
of the tests have been performed programmatically based on our request. The review
includes the observations and recommendations on our content model as made by the
SDE administrator and one subject matter specialist (from another project) whose
input was solicited by the SDE administrator. One section of the review highlights
where the approach that we have taken will pose some difficulties for shared
applications. The report specifies what applications are affected and in what way.
Many of the recommendations received in fact identify changes in our content model
that would resolve the difficulties while still meeting our intended objectives.
As a separate event, we receive a detailed response to our request for
certification. The certification response includes input from a variety of stakeholders,
all of whom are currently using the SGML object that we are proposing to modify. The
certification response indicates that all critical issues raised by the stakeholders or by
incompatibilities with corporately-shared applications will have to be resolved if the
request is to result in a new version of the SGML object being certified. If, after further
analysis, unresolved issues remain related to specific stakeholders but all
incompatibilities with corporately-shared applications are resolved, then certification
as an approved variant would be forthcoming.
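The certification decision just described is fully mechanical, which is what makes a responsive process possible. A Python sketch of the rule: shared-application compatibility is mandatory, while unresolved stakeholder issues demote the request from a new official version to an approved variant:

```python
def certify(stakeholder_issues_open, app_incompatibilities_open):
    """Certification rule: incompatibilities with corporately-shared
    applications block certification outright; remaining stakeholder
    issues yield an approved variant rather than a new version."""
    if app_incompatibilities_open:
        return "not certified: resolve shared-application incompatibilities"
    if stakeholder_issues_open:
        return "certified as approved variant"
    return "certified as new version of the SGML object"

print(certify(False, False))  # certified as new version of the SGML object
print(certify(True, False))   # certified as approved variant
print(certify(True, True))    # not certified: resolve shared-application...
```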
From the perspective of the DTD development team, it is perfectly clear where
we stand in terms of the change management process. The apparent responsiveness of
that process to our requests has given us confidence that we will be able to address
our requirements while also participating in, and benefiting from, corporately-shared
SGML applications.
4.5: SDE Performance Metrics
There are two critical features of the Shared Development Environment (SDE)
for SGML objects. Firstly, the reusable SGML objects, reference information, and
support services must add enough value that Programs and new SGML
initiatives will, by accessing the SDE, automatically reduce the cost and risk of their
own SGML implementations. Secondly, the responsiveness of the SDE to requests for
information, services or certification must be optimized. Time is one of the most
precious commodities in any Program. Unless implementers can determine that the
entire SGML SDE and change management process is supporting them and their
requirements, and doing so in a timely fashion, then the willing participation of those
implementers will not be forthcoming. The process, as a result, will break down. All
corporately-shared applications that rely upon some measure of uniformity in SGML
implementations within the department will be at risk.
The ability of the SDE to gather performance metrics will be an important
feature. The SDE must monitor response times for service requests, the development
times for new SGML objects, and the process cycle times for certification and change
request review. The SDE will be designed to be continually adapting and improving.
The administrators of the SDE must be oriented toward introducing improved
automation such as automated workflow to distribute input requests to stakeholders
of an impacted SGML object. Performance metrics will provide the best mechanism to
determine the effectiveness of the SDE and the departmental SGML strategy. It is the
best way to ensure, through proactive corrections to the process, that the benefits
realized today through corporately-shared applications are realized tomorrow.
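Such monitoring needs nothing elaborate. The Python sketch below keeps a cycle-time log per SDE service category and flags any service whose average exceeds its target, prompting a proactive correction; the service names, times and targets are illustrative:

```python
from statistics import mean

# Illustrative cycle-time log (in days) per SDE service category.
log = {
    "technical review": [4, 6, 5],
    "certification": [12, 15],
    "information request": [1, 1, 2],
}

def report(log, target_days):
    """Return, per service, its mean cycle time and whether it meets
    the target; misses prompt a correction to the process."""
    return {service: (mean(times), mean(times) <= target_days[service])
            for service, times in log.items()}

targets = {"technical review": 5, "certification": 10,
           "information request": 2}
for service, (avg, ok) in report(log, targets).items():
    print(f"{service}: {avg:.1f} days, "
          f"{'on target' if ok else 'needs correction'}")
```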
5: Conclusion
Systems, like life forms, must adapt to survive. This truism underlies the basic
approach being taken in DND to managing investments in SGML objects and shared
applications. The value to the enterprise of the savings and efficiency gains offered by
just the SGML initiatives underway today makes it imperative that we get the
management of departmental SGML objects and applications right. It becomes
apparent that the most important of the corporately-shared applications is the change
management system itself. A successful change management system will ensure that
SGML objects, shared applications, and Program and user requirements evolve in a
coordinated manner. To be successful, the change management system must be
responsive to new requirements as well as being highly attuned to its own
performance. Given the general objective of coordinating SGML projects to maximize
corporate benefits, the change management system must be fully "adaptive", adjusting
its operations to match its performance to the demands of the environment. It must be
granted that establishing a management regime that strives to protect corporate
benefits by effectively, and proactively, supporting local initiatives is both technically
and culturally challenging. Unfortunately, it is also the only approach that has any
chance of success. To return to our guiding truism, where change and variability are
part of the environment, systems, like life forms, must adapt to survive.
5.1: Cultural Issues
As a final note, we should emphasize that there are two major cultural issues
that present themselves when we consider this approach. The first issue pertains to
Programs. Programs must be encouraged to actively participate in the established
corporate processes rather than budgeting and planning for maximum self-sufficiency.
Within the typical Program office, the general desire to reduce risk by limiting
dependencies on matrix organizations often manifests itself as a genuine belief that
their requirements are unique, and therefore cannot be supported by a corporately-
shared application. This inclination must be replaced by the desire to reuse what is
available and specifically address what is actually unique.
The second issue pertains to the central agencies that are typically called upon
to administer standards and shared applications. These organizations have existed for
many years and have survived the continuous pressure to reduce overhead by
establishing their indispensability as guardians of standards and policies. For many of
these organizations it is the protracted dialogue on waivers and change requests that
gives them organizational life. In the near future, however, this type of role will no
longer be tolerated within streamlined enterprises. What will be tolerated, even
demanded, will be proactive coordination services such as those described for the SDE.
The cultural shift for these organizations will be a change in how we understand the
value of standardization. Effective standardization will no longer be measured by
effective enforcement. Effective standardization will be measured by effective usage
and by the benefits achieved thereby. It is a very new world for many who are familiar
with older operational paradigms. However, to re-use our truism in a slightly different
way, life forms, like systems, must adapt to survive.
Epilogue
Looking back on this, from the vantage point of 2016, we can see several noteworthy
things in this article. For example, we see how a large enterprise came to view
document content in terms of modular content models and element variants. We also
see the vital importance of associating these constructs with application behavior
without which it is impossible to meaningfully assess or manage those constructs
(following the logic of formal functionality-grounded Configuration Management).
It is probably worth noting that armed with these best practices, the DND CALS Office
was able to effectively address complex multilateral information challenges across
NATO where other markup standards (specifically the Department of Defense’s library
of CALS markup standards) had proven unsustainable.
It is also worth pointing out that the markup standard that evolved as a result of this
effort, the DND CALS Document Type Definition (DTD) and shared application suite,
prefigured many of the core tenets now seen in the Darwin Information Typing
Architecture (DITA). This should be taken as a corroboration and validation of DITA as it
shows how similar ideas will emerge to address similar problems and how those
problems are widely experienced by organizations large and small.
- Joe Gollner (2016)