The document discusses strengths and weaknesses of adopting the Darwin Information Typing Architecture (DITA). It outlines some of DITA's benefits such as supporting single sourcing, reuse, and semantic markup. However, it also notes challenges with DITA being both too restrictive and flexible for some users. The document provides context on what DITA is and is not, and manages expectations for how DITA should be viewed and implemented.
Converting and Integrating Legacy Data and Documents When Implementing a New CMSdclsocialmedia
If you are in the Insurance and Financial industries, attend this webinar and learn the roadmap for implementing a content management system with a customized conversion process.
In this webinar, I will showcase scenarios in which content analysis and design were more collaborative endeavors, and advocate for getting designers and content experts in conversation early on. The result is a better product and less stressful releases.
Attend this webinar as DCL & Comtech Services review the results of the 2016 Industry Trends survey. Learn innovative approaches to development/delivery and more.
This session, targeted at decision makers, consultants, and information professionals, introduces the concepts behind structured content and discusses the benefits and challenges to adoption.
Developing and Implementing a QA Plan During Your Legacy Data to S1000Ddclsocialmedia
This document discusses developing and implementing a quality assurance (QA) plan when converting legacy data. It recommends planning the conversion by asking important initial questions, learning from others, and preparing for the next steps. The document outlines DCL's project startup methodology, including inventorying and assessing the content to convert, prioritizing what to convert and when, analyzing content reuse, creating a conversion specification, normalizing the data, and viewing converted data during quality control. The overall message is to thoroughly plan the conversion by involving stakeholders, understanding the content, and establishing a solid process.
In this DCL Webinar, long-time DITA champion Don Day will talk about the basic principles of lightweight structured authoring and the current work of the OASIS Lightweight DITA Subcommittee along those lines. And since this is a work in progress, Don will lay out some practical steps you can take today to start taking advantage of some of these principles as we anticipate the Subcommittee's eventual recommendations.
Managing Deliverable-Specific Link Anchors: New Suggested Best Practice for Keysdclsocialmedia
1) The document discusses using keys to define and maintain publicly linkable anchors in deliverables produced from DITA source.
2) It recommends putting unique keys on each navigation topicref that should be publicly linkable or cross-referenced, and using navigation keys to determine deliverable anchors.
3) The keys ensure anchors are reliably persistent and do not change from release to release for the same logical component.
Converting and Integrating Legacy Data and Documents When Implementing a New CMSdclsocialmedia
If you are in the Insurance and Financial industries, attend this webinar and learn the roadmap for implementing a content management system with a customized conversion process.
In this webinar, I will showcase scenarios in which content analysis and design were more collaborative endeavors, and advocate for getting designers and content experts in conversation early on. The result is a better product and less stressful releases.
Attend this webinar as DCL & Comtech Services review the results of the 2016 Industry Trends survey. Learn innovative approaches to development/delivery and more.
This session, targeted at decision makers, consultants, and information professionals, introduces the concepts behind structured content and discusses the benefits and challenges to adoption.
Developing and Implementing a QA Plan During Your Legacy Data to S1000Ddclsocialmedia
This document discusses developing and implementing a quality assurance (QA) plan when converting legacy data. It recommends planning the conversion by asking important initial questions, learning from others, and preparing for the next steps. The document outlines DCL's project startup methodology, including inventorying and assessing the content to convert, prioritizing what to convert and when, analyzing content reuse, creating a conversion specification, normalizing the data, and viewing converted data during quality control. The overall message is to thoroughly plan the conversion by involving stakeholders, understanding the content, and establishing a solid process.
In this DCL Webinar, long-time DITA champion Don Day will talk about the basic principles of lightweight structured authoring and the current work of the OASIS Lightweight DITA Subcommittee along those lines. And since this is a work in progress, Don will lay out some practical steps you can take today to start taking advantage of some of these principles as we anticipate the Subcommittee's eventual recommendations.
Managing Deliverable-Specific Link Anchors: New Suggested Best Practice for Keysdclsocialmedia
1) The document discusses using keys to define and maintain publicly linkable anchors in deliverables produced from DITA source.
2) It recommends putting unique keys on each navigation topicref that should be publicly linkable or cross-referenced, and using navigation keys to determine deliverable anchors.
3) The keys ensure anchors are reliably persistent and do not change from release to release for the same logical component.
Is Your Enterprise “fire-fighting” translation issues? Optimize the process w...dclsocialmedia
Join Scott Carothers, Senior Globalization Executive at Kinetic the Technology Agency for an overview of specific translation metrics that will assist your enterprise in optimizing the translation process, and assist you in leading your organization as an advocate of continual process improvement.
Content Engineering and The Internet of “Smart” Thingsdclsocialmedia
The Smart Ass™ Fan is the latest ceiling fan from Big Ass Fans®. Smart products are everywhere now, and they’re connected. Imagine a family of smart products and how much content could be/should be shared. These products can include mechanical, electrical and software parts AND content.
How will you deal with this explosive content requirement? This webinar takes a tour of the problem and explains what content engineering is …and how it can be used to create a sustainable content life cycle. Smart products need smart content.
Preparing Your Legacy Data for Automation in S1000Ddclsocialmedia
This document discusses preparing legacy data for automation in S1000D. It outlines the challenges of converting traditional linear documents into the modular structure required by S1000D. These challenges include identifying reusable content, assigning data modules and codes, and structuring information across publications. The document recommends planning thoroughly for a conversion project, including assessing source materials, analyzing content reuse, specifying the conversion, and normalizing data. It describes setting up the conversion project, performing document analysis, and developing a detailed specification to guide the conversion process.
Converting and Integrating Content When Implementing a New CMSdclsocialmedia
This document discusses converting content when moving to a new content management system (CMS). It highlights key considerations for the conversion like choosing an appropriate XML schema and addressing legacy content. The document also shares lessons learned from surveying 12 companies that implemented DITA, including common business drivers, implementation timelines, and maximizing benefits of content reuse. Overall, the document provides guidance on planning a successful content conversion project when adopting a new CMS.
Structured authoring involves writing content in a modular, reusable way. It allows information to be:
1) Assembled and published in different contexts like various documents, on websites, or as help files.
2) Easily updated and maintained through single sourcing where content is written once and reused many times.
3) Accessed and analyzed using semantic markup which labels content with metadata about its meaning rather than just presentation.
New Directions 2015 – Changes in Content Best Practicesdclsocialmedia
The Center for Information-Development Management (CIDM) and Data Conversion Laboratories (DCL) announce the results of our 2015 Industry Trends Survey. Comparisons with these surveys in previous years provides you with a comprehensive view of what is the same and what is changing in technical information best practices.
10 Million Dita Topics Can't Be Wrong, December 6th, 2016, Webinar by Keith Schengili-Roberts, IXIASOFT DITA Specialist, Hosted by Scott Abel at The Content Wrangler Virtual Summit
Using Markdown and Lightweight DITA in a Collaborative EnvironmentIXIASOFT
Using Markdown and Lightweight DITA in a Collaborative Environment, by Keith Schengili-Roberts, IXIASOFT DITA Evangelist and Market Researcher and Leigh W. White, IXIASOFT DITA Specialist, at the CIDM CMS DITA North America, April 2017
Enabling Telco to Build and Run Modern Applications Tugdual Grall
This document discusses how MongoDB can help enable businesses to build and run modern applications. It begins with an overview of Tugdual Grall and his background. It then discusses how industries and data have changed, driving the need for a next generation database. The rest of the document provides an overview of MongoDB, including the company, technology, and community. Examples are given of how MongoDB has helped companies in the telecommunications industry achieve a single customer view, improve product catalogs and personalization, and build mobile and open data APIs.
DCL offers data-driven user experience services including document digitization, XML and HTML conversion, eBook production, and hosted solutions. They blend years of conversion experience with cutting-edge technology and infrastructure to make the content transformation process easy and efficient. DCL serves a broad client base across many industries, including aerospace, defense, education, government, libraries, publishing, and technology. They provide world-class services, leading technology, unparalleled infrastructure, and US-based management for complex content projects.
451 Research + NuoDB: What It Means to be a Container-Native SQL DatabaseNuoDB
This document discusses how traditional SQL databases anchor enterprises to the past and hinder digital transformation efforts. It introduces NuoDB as a container-native SQL database that can be fully deployed within container platforms. NuoDB addresses limitations of traditional and NoSQL databases by providing elastic SQL, ACID compliance, zero downtime, and horizontal scalability while running in containers on commodity hardware and clouds.
GraphTalks Stuttgart - Einführung in Graphdatenbanken und Neo4jNeo4j
This document provides an agenda for the Neo4j GraphTalks event. The agenda includes:
- Breakfast and networking from 09:00-09:30.
- An introduction to graph databases and Neo4j from 09:30-10:00 by Bruno Ungermann from Neo4j.
- A presentation on semantic data management from 10:00-11:00 by Dr. Andreas Weber from semantic PDM.
- A presentation on how to make graph database projects successful from 11:00-11:30 by Stefan Kolmar from Neo4j.
- An open discussion from 11:30 onward moderated by Alexander Erdl from Neo4j
How to Make your Graph DB Project Successful with Neo4j ServicesNeo4j
Neo4j is widely used across many industries to tackle a multitude of modern-day business challenges. From powering Walmart’s retail recommendation system, to detecting fraud at Fortune 500 financial institutions, to optimizing delivery service routing at eBay, the Neo4j team has helped implement projects across a wide spectrum of industries and use-cases. Using this breadth of experience combined with a unique expertise in the application of graph databases, the Neo4j Consulting team offers a number of services ranging from product training, PoC evaluations and early data modelling, to getting projects into production on the Neo4j graph database.
Attend this webinar to hear how other top organisations have quickly and successfully launched their graph database projects by leveraging Neo4j Consulting Services and learn more about the different offerings available.
Slides: NoSQL Data Modeling Using JSON Documents – A Practical ApproachDATAVERSITY
After three decades of relational data modeling, everyone’s pretty comfortable with schemas, tables, and entity-relationships. As more and more Global 2000 companies choose NoSQL databases to power their Digital Economy applications, they need to think about how to best model their data. How do they move from a constrained, table-driven model to an agile, flexible data model based on JSON documents?
This webinar is intended for architects and application developers who want to learn about new JSON document data modeling approaches, techniques, and best practices. This webinar will show you how to get started building a JSON document data model, how to migrate a table-based data model to JSON documents, and how to optimize your design to enable fast query performance.
This webinar will provide practical, experience-based advice and best practices for modeling JSON documents, including:
- When to embed or not embed objects in your JSON document
- Data modeling using a practical data access pattern approach
- Indexing your JSON documents
- Querying your data using N1QL (SQL for JSON)
Neo4j GraphTalks - Einführung in GraphdatenbankenNeo4j
The document announces a Neo4j GraphTalks event in October 2016 in Berlin. It includes an agenda with presentations on ADAMA's use of Neo4j for data sharing and knowledge management, and their experiences implementing and demoing Neo4j. There will also be an open networking session with NeoTechnology and PRODYNA representatives.
DITA for Small Teams: An Open Source Approach to DITA Content Managementdclsocialmedia
Eliot Kimbler describes a general approach to using common and easily-available open-source tools to provision an authoring and production support system suitable for small teams of authors.
Minimalism Revisited — Let’s Stop Developing Content that No One Wantsdclsocialmedia
Dr. JoAnn Hackos, Comtech President and Director of the Center for Information-Development Management (CIDM), demonstrates how using a minimalist approach in developing content is more relevant today than ever before. Busy customers simply want simple help on performing a task and getting a job done. Learn what minimalism really feels like. Learn about designing minimalist information that gets your customers coming back for more.
Is Your Enterprise “fire-fighting” translation issues? Optimize the process w...dclsocialmedia
Join Scott Carothers, Senior Globalization Executive at Kinetic the Technology Agency for an overview of specific translation metrics that will assist your enterprise in optimizing the translation process, and assist you in leading your organization as an advocate of continual process improvement.
Content Engineering and The Internet of “Smart” Thingsdclsocialmedia
The Smart Ass™ Fan is the latest ceiling fan from Big Ass Fans®. Smart products are everywhere now, and they’re connected. Imagine a family of smart products and how much content could be/should be shared. These products can include mechanical, electrical and software parts AND content.
How will you deal with this explosive content requirement? This webinar takes a tour of the problem and explains what content engineering is …and how it can be used to create a sustainable content life cycle. Smart products need smart content.
Preparing Your Legacy Data for Automation in S1000Ddclsocialmedia
This document discusses preparing legacy data for automation in S1000D. It outlines the challenges of converting traditional linear documents into the modular structure required by S1000D. These challenges include identifying reusable content, assigning data modules and codes, and structuring information across publications. The document recommends planning thoroughly for a conversion project, including assessing source materials, analyzing content reuse, specifying the conversion, and normalizing data. It describes setting up the conversion project, performing document analysis, and developing a detailed specification to guide the conversion process.
Converting and Integrating Content When Implementing a New CMSdclsocialmedia
This document discusses converting content when moving to a new content management system (CMS). It highlights key considerations for the conversion like choosing an appropriate XML schema and addressing legacy content. The document also shares lessons learned from surveying 12 companies that implemented DITA, including common business drivers, implementation timelines, and maximizing benefits of content reuse. Overall, the document provides guidance on planning a successful content conversion project when adopting a new CMS.
Structured authoring involves writing content in a modular, reusable way. It allows information to be:
1) Assembled and published in different contexts like various documents, on websites, or as help files.
2) Easily updated and maintained through single sourcing where content is written once and reused many times.
3) Accessed and analyzed using semantic markup which labels content with metadata about its meaning rather than just presentation.
New Directions 2015 – Changes in Content Best Practicesdclsocialmedia
The Center for Information-Development Management (CIDM) and Data Conversion Laboratories (DCL) announce the results of our 2015 Industry Trends Survey. Comparisons with these surveys in previous years provides you with a comprehensive view of what is the same and what is changing in technical information best practices.
10 Million Dita Topics Can't Be Wrong, December 6th, 2016, Webinar by Keith Schengili-Roberts, IXIASOFT DITA Specialist, Hosted by Scott Abel at The Content Wrangler Virtual Summit
Using Markdown and Lightweight DITA in a Collaborative EnvironmentIXIASOFT
Using Markdown and Lightweight DITA in a Collaborative Environment, by Keith Schengili-Roberts, IXIASOFT DITA Evangelist and Market Researcher and Leigh W. White, IXIASOFT DITA Specialist, at the CIDM CMS DITA North America, April 2017
Enabling Telco to Build and Run Modern Applications Tugdual Grall
This document discusses how MongoDB can help enable businesses to build and run modern applications. It begins with an overview of Tugdual Grall and his background. It then discusses how industries and data have changed, driving the need for a next generation database. The rest of the document provides an overview of MongoDB, including the company, technology, and community. Examples are given of how MongoDB has helped companies in the telecommunications industry achieve a single customer view, improve product catalogs and personalization, and build mobile and open data APIs.
DCL offers data-driven user experience services including document digitization, XML and HTML conversion, eBook production, and hosted solutions. They blend years of conversion experience with cutting-edge technology and infrastructure to make the content transformation process easy and efficient. DCL serves a broad client base across many industries, including aerospace, defense, education, government, libraries, publishing, and technology. They provide world-class services, leading technology, unparalleled infrastructure, and US-based management for complex content projects.
451 Research + NuoDB: What It Means to be a Container-Native SQL DatabaseNuoDB
This document discusses how traditional SQL databases anchor enterprises to the past and hinder digital transformation efforts. It introduces NuoDB as a container-native SQL database that can be fully deployed within container platforms. NuoDB addresses limitations of traditional and NoSQL databases by providing elastic SQL, ACID compliance, zero downtime, and horizontal scalability while running in containers on commodity hardware and clouds.
GraphTalks Stuttgart - Einführung in Graphdatenbanken und Neo4jNeo4j
This document provides an agenda for the Neo4j GraphTalks event. The agenda includes:
- Breakfast and networking from 09:00-09:30.
- An introduction to graph databases and Neo4j from 09:30-10:00 by Bruno Ungermann from Neo4j.
- A presentation on semantic data management from 10:00-11:00 by Dr. Andreas Weber from semantic PDM.
- A presentation on how to make graph database projects successful from 11:00-11:30 by Stefan Kolmar from Neo4j.
- An open discussion from 11:30 onward moderated by Alexander Erdl from Neo4j
How to Make your Graph DB Project Successful with Neo4j ServicesNeo4j
Neo4j is widely used across many industries to tackle a multitude of modern-day business challenges. From powering Walmart’s retail recommendation system, to detecting fraud at Fortune 500 financial institutions, to optimizing delivery service routing at eBay, the Neo4j team has helped implement projects across a wide spectrum of industries and use-cases. Using this breadth of experience combined with a unique expertise in the application of graph databases, the Neo4j Consulting team offers a number of services ranging from product training, PoC evaluations and early data modelling, to getting projects into production on the Neo4j graph database.
Attend this webinar to hear how other top organisations have quickly and successfully launched their graph database projects by leveraging Neo4j Consulting Services and learn more about the different offerings available.
Slides: NoSQL Data Modeling Using JSON Documents – A Practical ApproachDATAVERSITY
After three decades of relational data modeling, everyone’s pretty comfortable with schemas, tables, and entity-relationships. As more and more Global 2000 companies choose NoSQL databases to power their Digital Economy applications, they need to think about how to best model their data. How do they move from a constrained, table-driven model to an agile, flexible data model based on JSON documents?
This webinar is intended for architects and application developers who want to learn about new JSON document data modeling approaches, techniques, and best practices. This webinar will show you how to get started building a JSON document data model, how to migrate a table-based data model to JSON documents, and how to optimize your design to enable fast query performance.
This webinar will provide practical, experience-based advice and best practices for modeling JSON documents, including:
- When to embed or not embed objects in your JSON document
- Data modeling using a practical data access pattern approach
- Indexing your JSON documents
- Querying your data using N1QL (SQL for JSON)
Neo4j GraphTalks - Einführung in GraphdatenbankenNeo4j
The document announces a Neo4j GraphTalks event in October 2016 in Berlin. It includes an agenda with presentations on ADAMA's use of Neo4j for data sharing and knowledge management, and their experiences implementing and demoing Neo4j. There will also be an open networking session with NeoTechnology and PRODYNA representatives.
DITA for Small Teams: An Open Source Approach to DITA Content Managementdclsocialmedia
Eliot Kimbler describes a general approach to using common and easily-available open-source tools to provision an authoring and production support system suitable for small teams of authors.
Minimalism Revisited — Let’s Stop Developing Content that No One Wantsdclsocialmedia
Dr. JoAnn Hackos, Comtech President and Director of the Center for Information-Development Management (CIDM), demonstrates how using a minimalist approach in developing content is more relevant today than ever before. Busy customers simply want simple help on performing a task and getting a job done. Learn what minimalism really feels like. Learn about designing minimalist information that gets your customers coming back for more.
This session will specifically address the analysis phase including considerations such as where the inconsistencies lie, how the content is currently being reused or not, how translation services are applied as a measure of quality, what channels does the content need to support, what issues each channel may have in using the content, does task-based authoring make sense and more in order to achieve the maximum ROI.
10 Mistakes When Moving to Topic-Based Authoringdclsocialmedia
But moving to topic-based authoring can be one of the most expensive things you've ever done. In this talk, Sharon Burton will show you the top 10 mistakes made by companies and how you can avoid them. These mistakes can include missing deadlines, delivering poor quality content, or not integrating this content development strategy into the rest of the product development strategy.
Content Conversion Done Right Saves More Than Moneydclsocialmedia
Can you significantly reduce your conversion costs – by 25% or more – without sacrificing quality? The answer is a resounding yes, and this webinar will review the proven methods and best practices for achieving that goal.
Precision Content™ Tools, Techniques, and Technologydclsocialmedia
This webinar will explore fundamental principles for writing and structuring content for the enterprise. Attendees will learn how to approach information typing for structured authoring for more concise and reusable content.
Attend this session and explore the unseen world of metadata. Learn essential concepts about metadata and taxonomies used to organize metadata. Discuss the role standards play in the design of metadata and controlled vocabularies. Start to formulate strategies and tactics to take control of your metadata.
Using HTML5 to Deliver and Monetize Your Mobile Contentdclsocialmedia
This document discusses how HTML5 can be used to deliver and monetize mobile content. It provides an overview of Data Conversion Laboratory (DCL) and their services in converting content. The document then discusses how mobile content consumption continues to grow, especially on smartphones and tablets rather than desktop. It analyzes different routes for delivering HTML5 applications and the results of a survey on HTML5 adoption. The document concludes that HTML5 is the best approach for future-proofing mobile content and that its adoption should increase, though some browser and API limitations remain.
This document provides an overview and update on DITA, EPUB, and HTML5 standards. It discusses the current state of EPUB3 and HTML5, how DITA 1.3 aligns with these standards, and tools for generating EPUB3 and HTML5 outputs from DITA. It also includes screenshots of real EPUB and HTML5 outputs generated from DITA using various open-source and commercial tools.
Join this webinar to learn:
• What SPL is
• How it affects medical devices
• The relationship between SPL and UDI
• What medical device manufacturers can learn from the pharmaceutical industry
• How you can automatically create SPL documents with your standard labeling content
While open-source solutions may have no purchase cost, total costs including configuration, customization and support can equal proprietary solutions. DITA provides benefits like reuse and translation but has limitations in areas like graphics, equations, custom output and legacy content migration. PDF publishing from DITA is especially challenging due to the complexity of XSL-FO. DITA works best for organizations with significant reuse across contexts and languages, while smaller groups may find its limitations easier to overcome.
Improve your Chances for Documentation Success with DITA and a CCMS LavaCon L...IXIASOFT
This document discusses how adopting DITA and a content management system (CCMS) can improve documentation success. It outlines key features of DITA including content reuse. Four main reasons for adopting DITA and a CCMS are discussed: needing more efficiency, outgrowing current tools, rising localization costs, and needing content verification. Four things that can be done with DITA and a CCMS are also presented: versioning content, implementing workflows, measuring documentation metrics, and improving localization. The presenter is then available for questions.
Supercharge Your Authoring - ASTC Conference 2018Gareth Oakes
This document summarizes Gareth Oakes' presentation on supercharging authoring at the ASTC Conference 2018. It discusses the evolution of publishing technology from early desktop publishing to modern multi-channel publishing systems using XML. It outlines challenges clients face with authoring and managing large technical documents across multiple channels. The presentation reviews trends in software-as-a-service and evaluates several structured authoring tools based on criteria like usability, features, and integration with content repositories and publishing systems. Top recommendations include easyDITA, FontoXML, and the Quark Publishing Platform. A live demonstration of the Quark system is provided.
Keith Schengili-Roberts: Improve Your Chances for Documentation Success with ...Jack Molisani
Keith Schengili-Roberts presented on how moving to DITA and a content management system (CCMS) can improve chances of documentation success. He outlined key features of DITA like content reuse and separation of form from content. Four chief reasons for wanting to move included needing more efficiency, outgrowing current tools, rising localization costs, and content verification needs. A CCMS allows for versioning, workflow, metrics on production and reuse, and improved localization. It was argued the benefits outweigh upfront costs over time through opportunities for process improvement.
Session at tcworld 2016. Organized by Kristen James Eberlein (Eberlein Consulting LLC); other participants were Joe Gollner (Gnostyx), George Bina (SyncroSoft), Jean-François Ameye (IXIASOFT), and Eliot Kimber (Contrext).
DITA and Information Architecture for Responsive Web Designdclsocialmedia
Increasingly people are reading your technical content using a mobile device. How can you ensure that your DITA-based content can be read equally-well by a lineman using his weatherproofed tablet 18ft above the street, or an electronics engineer using her smartphone in a clean-room environment? The answer: responsive content. But designing effective responsive content is not just a matter of picking an HTML template and hoping for the best: you need to think about how your content will be presented, its priority to the user and how they can navigate through it. In this presentation Keith Schengili-Roberts and Phil Kneer from Yellow Pencil will talk about the information architecture considerations behind the creation of effective responsive design for technical content.
Painless XML Authoring?: How DITA Simplifies XMLScott Abel
Presented at DocTrain East 2007 by Bob Doyle, DITA Users -- This introduction to XML Authoring will acquaint you with over fifty tools aimed at structuring content with DITA. They are not just DITA-compliant authoring tools (editors) for writers. They also include content management systems (CMS), translation management systems (TMS), and dynamic publishing engines that fully support DITA. You will also need to know about tools that convert legacy documents to DITA and help to design stylesheets for DITA deliverables. The best DITA tools for technical communicators implement the DITA standard while hiding all the complexity of the underlying XML (eXtensible Markup Language).
As a tech writer and not a tech, you should be able to forget about XML - except to know that you are using it (DITA is XML) and that it consists of named content elements (or components) with attributes. You need to know enough about the content elements so you can reference (conref) them for reuse. You need to know about their attributes so you can filter on them for conditional processing. And you should appreciate that because components are uniquely identifiable they lend themselves perfectly to automated dynamic assembly using a publishing engine.
We will describe how you can get started with structured writing without knowing XML or installing anything.
The promise of topic-based structured authoring is not simply better documentation. It is the creation of mission-critical information for your organization, written with a deep understanding of your most important audiences, that can be repurposed to multiple delivery channels and localized for multilingual global markets. You are not just writing content, you are preparing the information deliverables that enhance the value of your organization in all its markets.
To do that well, you must understand the latest tools in structured writing that are revolutionizing corporate information systems - today in documentation but tomorrow throughout the enterprise, from external marketing to internal human resources. Whether you are trying to push a new product into a new market or are “onboarding” a new employee, the need for high quality information to educate the customer or train the new salesperson is a challenge for technical communicators. You need to think outside the docs!
The key idea behind Darwin Information Typing Architecture is to create content in small chunks or modules called topics. A topic is the right size when it can stand alone as meaningful information. Topics are then assembled into documents using DITA maps, which are hierarchical lists of pointers or links to topics. The pointers are called “topicrefs” (for topic references).
Think of documents as assembled from single-source component parts. Assembly can be conditional, dependent on properties or metadata “tags” you attach to a topic. For example, the “audience” property might be “beginner” or “advanced.”
At a still finer level of granularity, individual elements of a topic can also be assigned property tags for conditional assembly. More importantly, a topic element can be assigned a unique ID that makes it a content component reusable in other topics.
As you will learn, DITA is a leading technology for “component content management,” which multiplies the value of your work. You need to leverage DITA and structured content to multiply your income.
A Brief Look at DITA in Current Technical Communication Practices_SIGDOC 2017IXIASOFT
A Brief Look at DITA in Current Technical Communication Practices, by Keith Schengili-Roberts, IXIASOFT Market Researcher, at SIGDOC, Halifax, August 2017
Is your technical content development organization considering a move to structured authoring and/or DITA (Darwin Information Typing Architecture)? This presentation provides a high-level introduction to what DITA is--and what the benefits of moving to DITA are. DITA is an excellent solution for many--but not all--organizations and projects. This introduction can help you begin to understand why DITA may or may not be a good solution for you.
Localization and DITA: What you Need to Know - LocWorld32IXIASOFT
The document discusses localization best practices when using DITA (Darwin Information Typing Architecture). It provides an overview of key DITA features like content reuse and separation of form and content. It also looks at current adoption of DITA, with over 650 companies using it worldwide across many sectors. Localization considerations with DITA are examined, including challenges around incomplete translation packages, content reuse with conrefs and conditions, and ensuring proper context for translation. Best practices are suggested for localization teams and LSPs (language service providers) working with DITA content.
(Almost) Four Years On: Metrics, ROI, and Other Stories from a Mature DITA CM...Keith Schengili-Roberts
The document summarizes the experience of implementing a DITA content management system (CMS) at AMD's graphics division over almost four years. Key points include:
1) Productivity increased 2.3-3x through content reuse, automation, and fewer formatting issues in localization. Output increased while staff decreased.
2) Localization costs dropped to less than half of pre-CMS levels due to greater content reuse and streamlined processes.
3) Tracking metadata allows comprehensive measurement of productivity, including topics created/modified, translation auto-matches, and topic reuse rates. This data aligns with product release cycles.
DITA Quick Start: System Architecture of a Basic DITA ToolsetSuite Solutions
Presenter: Joe Gelb, President, Suite Solutions
Abstract: In this webinar, you will learn about the software, integration and customization which enable you to effectively author, manage, localize, publish and share your DITA XML content. We will review how each tool fits into the content lifecycle and discuss options for an incremental DITA XML implementation using a basic toolset as the starting point.
DITA and Agile are Made for Each Other by Keith Schengili-Roberts, IXIASOFT DITA Specialist. Presented at CMS/DITA North America 2016 in Reston, Virginia.
Agile software development makes specific demands on documentation teams, whose content creators now need to be more nimble, describe features in a piece-meal fashion, and report on their progress in an effective way. The topic-based structure of DITA is ideally suited to these needs. Keith Schengili-Roberts (also known as “DITAWriter”) focuses on how DITA-based content is the optimal way of working in an Agile environment, enabling content creators to effectively meet the demands of short sprint cycles, measure content output for Scrum meetings, and how to become a pig rather than staying a chicken (yes, seriously). Keith also looks at several case studies of DITA-using documentation groups working within an Agile environment. If you are wondering about what the impacts are of working with Agile or are simply looking to optimize your DITA-based documentation processes, come to this presentation!
What can the audience expect to learn?
Keith expands upon the material that was touched upon during the Best Practices conference on the same subject, including information based on subsequent interviews with clients and other content creators who are using DITA in an Agile environment. He provides information on how others are using DITA in this scenario and emerging best practices within it. Keith has found that many content creators using DITA are looking to move to an Agile environment—particularly if they work for a software firm. The ideas presented here serve as an introduction on what to expect. Even those who do not fit this scenario may find some of the ways and processes used by DITA-using doc groups in an Agile team to be beneficial.
Dominoapplikationen im Wandel der Zeit: Alles neu mit HCL Nomad Webpanagenda
This document discusses preparing a Domino application landscape for migration to HCL Nomad Web. It recommends first consolidating the landscape by identifying unused databases that can be archived. Key questions to answer include which databases have few active users, incompatible code like hardcoded dependencies, and standard templates that are easier to migrate. Tools like iDNA and MarvelClient can help by analyzing application usage, detecting duplicate code, and finding code that is not supported in Nomad Web to speed up the migration process. The goal is to understand what applications can be modernized with Nomad Web versus left behind or re-written.
The enterprise software landscape has gone through several changes over the last few years. One of key changes has been the shift from large monolithic "on-premises" software to modular services (or microservices) served via the cloud. This is a fundamental shift in the way we build and release software, and it has necessitated a change in how technical writing teams manage and deliver documentation.
This is our story of transformation, of how we adapted and responded to changes coming our way from multiple directions, and what we learnt through the process.
XML Drafting Discussion - PCC IT Conference 2013Gareth Oakes
This document discusses using XML for legislative drafting. It notes that legislative drafting requires precision and accuracy, but current processes can be costly and slow. XML tagging can make documents "machine friendly" by adding semantics and metadata, allowing automation and improved collaboration. While no single XML standard exists, legislative documents have an intrinsic hierarchical structure that could be represented. XML offers benefits like re-use, reduced costs, and preparing documents for future needs like open data initiatives. Success requires clear goals, a phased implementation plan, and choosing the right technologies and partners to meet objectives.
Keith Schengili-Roberts - The Rise of SME within Technical CommunicationsLavaConConference
In this session attendees will learn:
Technical options for going mobile, including responsive design, converting traditional online help to an app, and creating a “true” app using RMAD (Rapid Mobile App Development) tools. The pros and cons of each approach and some of the tools available for creating each option.
Anticipated changes in content creation practices and workflows including the elimination of local formatting, adoption of a “mobile first” philosophy, rethinking the role of tables, and more.
How company issues like terminology standardization, strategic benefit, politics, and the development of metrics and standards can help or hinder a move to mobile.
Jarod Sickler and Morley Tooke - DITA Support Portals: A One Stop Shop to Giv...LavaConConference
Jorsek LLC created an easyDITA content management system to help companies adopt DITA for authoring, organizing, and managing documentation. Their DITA support portals act as a one-stop shop for users to access content in different formats. DITA portals are more than just a place for users to consume information - they are part of the larger content creation and management stack, allowing content to be enriched with metadata and reused across authoring and delivery channels. Thinking of DITA content as data allows portals to leverage element structures and attributes to dynamically assemble and deliver information to users in various forms like search results, chatbots, and augmented reality.
Similar to What are the Strengths and Weaknesses of DITA Adoption? (20)
Automating Complex High-Volume Technical Paper and Journal Article Page Compo...dclsocialmedia
SAE International is a global association of more than 138,000 engineers and related technical experts in the aerospace, automotive and commercial-vehicle industries. Annually, SAE organizes and manages an industry conference, its World Congress and Exhibition, where thousands of technical papers and journal articles are presented as part of the conference program. Leading up to the Word Congress, the technical papers and journal articles are reviewed for compliance to SAE publishing requirements and published for print and made available online in a very short time-frame. This paper describes how SAE evolved the production cycle from a less than efficient XSL-FO based process to a highly automated process leveraging NLM XML, XSLT and Adobe InDesign resulting in productivity gains and higher quality output. This paper will take you through the evolution of this project and talk to future enhancements aimed at driving additional benefits.
If everyone write their documents with the intent that they be standardized and converted, conversion to S1000D would be easy. But the reality is that most legacy data lacks the details needed for a full conversion or contains anomalies and irrelevant text. This leads us to the question one must ask: should I convert, rewrite, or manually convert the legacy data? In this presentation, we will attempt to answer this question by reviewing:
o A very quick introduction to S1000D conversions
o What the technical headaches are
o Whether to convert or rewrite
o Planning for a good conversion experience
o What the timeline looks like
o Some tools to help
Marketing and Strategy and Bears... oh my!dclsocialmedia
It's a big scary world out there, filled with content strategists, content marketers, content creators, content managers... it never ends! In this talk, we'll talk about the care and feeding of a content whatever, and answer the question: why does it matter what we call ourselves?
The document discusses the roles of various professionals involved in the user experience design process. It begins by describing the jobs of a UI designer, information architect, usability expert, content strategist, visual designer, and front end developer. It then provides more details on the responsibilities of each role, such as a content strategist being responsible for developing content schemas and attributes. The document emphasizes that these roles should not work in silos and stresses the importance of collaboration between professionals to deliver a cohesive user experience.
Managing Documentation Projects in Nearly Any Environmentdclsocialmedia
The document discusses managing documentation projects. It introduces Sharon Burton as an expert in communication and content strategy who has 20 years of experience. The document then discusses Data Conversion Labs (DCL), who is hosting the webinar, and their services related to document digitization, conversion, and publishing. Finally, it covers best practices for planning documentation projects, including defining success, estimating timelines, and preparing content for multiple delivery channels.
Coming Up to Speed with XML Authoring in Adobe FrameMakerdclsocialmedia
This document discusses Adobe FrameMaker and XML authoring. It provides an overview of FrameMaker's capabilities for both structured XML/DITA authoring and unstructured authoring. It also briefly demonstrates FrameMaker's tools for working with XML documents and publishing XML content to multiple formats. Additionally, it introduces FrameMaker XML Author, a separate product focused only on XML authoring. The document provides resources for learning more about FrameMaker and XML authoring.
Maruthi Prithivirajan, Head of ASEAN & IN Solution Architecture, Neo4j
Get an inside look at the latest Neo4j innovations that enable relationship-driven intelligence at scale. Learn more about the newest cloud integrations and product enhancements that make Neo4j an essential choice for developers building apps with interconnected data and generative AI.
Full-RAG: A modern architecture for hyper-personalizationZilliz
Mike Del Balso, CEO & Co-Founder at Tecton, presents "Full RAG," a novel approach to AI recommendation systems, aiming to push beyond the limitations of traditional models through a deep integration of contextual insights and real-time data, leveraging the Retrieval-Augmented Generation architecture. This talk will outline Full RAG's potential to significantly enhance personalization, address engineering challenges such as data management and model training, and introduce data enrichment with reranking as a key solution. Attendees will gain crucial insights into the importance of hyperpersonalization in AI, the capabilities of Full RAG for advanced personalization, and strategies for managing complex data integrations for deploying cutting-edge AI solutions.
HCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAUpanagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-und-domino-lizenzkostenreduzierung-in-der-welt-von-dlau/
DLAU und die Lizenzen nach dem CCB- und CCX-Modell sind für viele in der HCL-Community seit letztem Jahr ein heißes Thema. Als Notes- oder Domino-Kunde haben Sie vielleicht mit unerwartet hohen Benutzerzahlen und Lizenzgebühren zu kämpfen. Sie fragen sich vielleicht, wie diese neue Art der Lizenzierung funktioniert und welchen Nutzen sie Ihnen bringt. Vor allem wollen Sie sicherlich Ihr Budget einhalten und Kosten sparen, wo immer möglich. Das verstehen wir und wir möchten Ihnen dabei helfen!
Wir erklären Ihnen, wie Sie häufige Konfigurationsprobleme lösen können, die dazu führen können, dass mehr Benutzer gezählt werden als nötig, und wie Sie überflüssige oder ungenutzte Konten identifizieren und entfernen können, um Geld zu sparen. Es gibt auch einige Ansätze, die zu unnötigen Ausgaben führen können, z. B. wenn ein Personendokument anstelle eines Mail-Ins für geteilte Mailboxen verwendet wird. Wir zeigen Ihnen solche Fälle und deren Lösungen. Und natürlich erklären wir Ihnen das neue Lizenzmodell.
Nehmen Sie an diesem Webinar teil, bei dem HCL-Ambassador Marc Thomas und Gastredner Franz Walder Ihnen diese neue Welt näherbringen. Es vermittelt Ihnen die Tools und das Know-how, um den Überblick zu bewahren. Sie werden in der Lage sein, Ihre Kosten durch eine optimierte Domino-Konfiguration zu reduzieren und auch in Zukunft gering zu halten.
Diese Themen werden behandelt
- Reduzierung der Lizenzkosten durch Auffinden und Beheben von Fehlkonfigurationen und überflüssigen Konten
- Wie funktionieren CCB- und CCX-Lizenzen wirklich?
- Verstehen des DLAU-Tools und wie man es am besten nutzt
- Tipps für häufige Problembereiche, wie z. B. Team-Postfächer, Funktions-/Testbenutzer usw.
- Praxisbeispiele und Best Practices zum sofortigen Umsetzen
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdfPaige Cruz
Monitoring and observability aren’t traditionally found in software curriculums and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is a part of your current company’s observability stack.
While the dev and ops silo continues to crumble….many organizations still relegate monitoring & observability as the purview of ops, infra and SRE teams. This is a mistake - achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party will share these foundational concepts to build on:
Let's Integrate MuleSoft RPA, COMPOSER, APM with AWS IDP along with Slackshyamraj55
Discover the seamless integration of RPA (Robotic Process Automation), COMPOSER, and APM with AWS IDP enhanced with Slack notifications. Explore how these technologies converge to streamline workflows, optimize performance, and ensure secure access, all while leveraging the power of AWS IDP and real-time communication via Slack notifications.
Essentials of Automations: The Art of Triggers and Actions in FMESafe Software
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
Threats to mobile devices are more prevalent and increasing in scope and complexity. Users of mobile devices desire to take full advantage of the features
available on those devices, but many of the features provide convenience and capability but sacrifice security. This best practices guide outlines steps the users can take to better protect personal devices and information.
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!SOFTTECHHUB
As the digital landscape continually evolves, operating systems play a critical role in shaping user experiences and productivity. The launch of Nitrux Linux 3.5.0 marks a significant milestone, offering a robust alternative to traditional systems such as Windows 11. This article delves into the essence of Nitrux Linux 3.5.0, exploring its unique features, advantages, and how it stands as a compelling choice for both casual users and tech enthusiasts.
GraphRAG for Life Science to increase LLM accuracyTomaz Bratanic
GraphRAG for life science domain, where you retriever information from biomedical knowledge graphs using LLMs to increase the accuracy and performance of generated answers
HCL Notes and Domino License Cost Reduction in the World of DLAUpanagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-and-domino-license-cost-reduction-in-the-world-of-dlau/
The introduction of DLAU and the CCB & CCX licensing model caused quite a stir in the HCL community. As a Notes and Domino customer, you may have faced challenges with unexpected user counts and license costs. You probably have questions on how this new licensing approach works and how to benefit from it. Most importantly, you likely have budget constraints and want to save money where possible. Don’t worry, we can help with all of this!
We’ll show you how to fix common misconfigurations that cause higher-than-expected user counts, and how to identify accounts which you can deactivate to save money. There are also frequent patterns that can cause unnecessary cost, like using a person document instead of a mail-in for shared mailboxes. We’ll provide examples and solutions for those as well. And naturally we’ll explain the new licensing model.
Join HCL Ambassador Marc Thomas in this webinar with a special guest appearance from Franz Walder. It will give you the tools and know-how to stay on top of what is going on with Domino licensing. You will be able lower your cost through an optimized configuration and keep it low going forward.
These topics will be covered
- Reducing license cost by finding and fixing misconfigurations and superfluous accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how to best utilize it
- Tips for common problem areas, like team mailboxes, functional/test users, etc
- Practical examples and best practices to implement right away
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...SOFTTECHHUB
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
Programming Foundation Models with DSPy - Meetup SlidesZilliz
Prompting language models is hard, while programming language models is easy. In this talk, I will discuss the state-of-the-art framework DSPy for programming foundation models with its powerful optimizers and runtime constraint system.
Best 20 SEO Techniques To Improve Website Visibility In SERPPixlogix Infotech
Boost your website's visibility with proven SEO techniques! Our latest blog dives into essential strategies to enhance your online presence, increase traffic, and rank higher on search engines. From keyword optimization to quality content creation, learn how to make your site stand out in the crowded digital landscape. Discover actionable tips and expert insights to elevate your SEO game.
Infrastructure Challenges in Scaling RAG with Custom AI modelsZilliz
Building Retrieval-Augmented Generation (RAG) systems with open-source and custom AI models is a complex task. This talk explores the challenges in productionizing RAG systems, including retrieval performance, response synthesis, and evaluation. We’ll discuss how to leverage open-source models like text embeddings, language models, and custom fine-tuned models to enhance RAG performance. Additionally, we’ll cover how BentoML can help orchestrate and scale these AI components efficiently, ensuring seamless deployment and management of RAG systems in the cloud.
Unlocking Productivity: Leveraging the Potential of Copilot in Microsoft 365, a presentation by Christoforos Vlachos, Senior Solutions Manager – Modern Workplace, Uni Systems
Stan will introduce – JoAnn will comment – academics weighing in
Thinking about tools rather than standards. Community unfamiliar with standards unless they are in regulated industries. Conceptual framework – comes experience by many organizations, managers, and writers. And research in areas like minimalism and information mapping. Best practices were removed from the specification. Now facing vendors
What about the focus on formatting? Hard page breaks wherever they want – for example. Crafting every publication. Beautiful page no longer exists.