Understanding the value of the Darwin Information Typing Architecture (DITA) is easy, but the DITA adoption process can feel overwhelming. Implementing the DITA standard is not easy by any means, but with careful planning and execution the adoption process will go much more smoothly.
DITA Workflow 101 covers the basic steps you need to follow in order to succeed in your DITA implementation efforts. Topics include data analysis, architecture strategy development, technology evaluation, training, conversion, authoring, style sheet development, and output/publishing.
The document discusses past, present, and future research trends in enterprise architecture (EA). Past research before 2004 focused mainly on EA frameworks and adoption, but lacked focus on core EA aspects. Current research examines adoption, tools, and the business-IT connection, but lacks consistent terminology and focus on strategy. Future research areas identified include a multi-disciplinary viewpoint, case studies to build knowledge, evaluation techniques, basic research, standardization of language and symbols, and modeling. This will help advance EA maturity and growth.
Structuring Content with WordPress - My Talk at WordCamp Kanpur #WCKanpur - Abhishek Deshpande
In this talk, I shared my experiences with small and medium-sized business (SMB) websites, the content-related issues I faced, the different types of content that make the web experience better, and how WordPress makes it easier to manage those different types of content.
DMI – Three tools that will make your business visible on the internet - PalmaActiva
The document presents three key tools for making a business visible on the internet: search positioning, social networks, and email marketing campaigns. It briefly explains each of these tools and gives examples of how to use them, such as SEO, sharing content on networks like Pinterest and Facebook, and sending newsletters by email with Mailchimp.
Energy-Balanced Dispatch of Mobile Sensors in a Hybrid Wireless Sensor Network - ambitlick
The document discusses energy-efficient dispatch of mobile sensors in a hybrid wireless sensor network. It formulates the problem as a multiround sensor dispatch problem to schedule mobile sensors' traveling paths in an energy-balanced way, maximizing system lifetime. It proves the problem is NP-complete. It then proposes a centralized heuristic that minimizes mobile sensors' moving energy while balancing energy consumption, and a distributed heuristic that utilizes a grid structure for event locations to bid for mobile sensors. Simulations show the effectiveness of the proposed schemes at extending system lifetime.
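The paper's actual heuristics are not reproduced here; as a rough illustration of the energy-balanced idea, the sketch below greedily assigns each event location to the mobile sensor that would have the most energy left after traveling there. All names and the linear move-cost model are assumptions for the example, not details from the paper:

```python
import math

def dispatch(sensors, events, cost_per_unit=1.0):
    """Greedy energy-balanced assignment of mobile sensors to events.

    sensors: dict name -> {"pos": (x, y), "energy": float}
    events:  list of (x, y) event locations, handled in order
    Returns a dict mapping each served event to a sensor name,
    updating each sensor's position and remaining energy as it moves.
    """
    plan = {}
    for ev in events:
        best, best_left = None, -math.inf
        for name, s in sensors.items():
            move = math.dist(s["pos"], ev) * cost_per_unit
            left = s["energy"] - move           # energy remaining after the move
            if left >= 0 and left > best_left:  # favor the most energy-rich option
                best, best_left = name, left
        if best is None:
            continue                            # no sensor can afford this trip
        sensors[best]["energy"] = best_left
        sensors[best]["pos"] = ev
        plan[ev] = best
    return plan
```

Choosing the sensor with the highest post-move residual energy is what balances consumption: a nearby but depleted sensor loses out to a farther but fresher one, which tends to extend the time before any single sensor dies.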
Trackables are unique trading items that geocachers can trade, including travel bugs, geocoins, and custom items. Travel bugs are random items with tracking tags, while geocoins are coins with designs. Custom trackables come in various forms. The main rules are that trackables are not for keeping and should be moved to new locations. Logging involves entering the tracking number on the website to record when an item is picked up or dropped off. Proper logging helps keep trackables alive as they travel between geocachers.
This document provides information about the city of Toledo, Spain. It gives details on Toledo's population, history, and location, as well as on some of its most important monuments, such as the Cathedral, the Alcázar, and the Monastery of San Juan de los Reyes. It also mentions some curiosities about Toledo, such as that its cathedral is the second richest after the Vatican's and that the painter El Greco lived and died there.
This document summarizes a presentation about implementing Confluence at Harvard University to enable enterprise collaboration across its various schools, departments, and organizations. The implementation would need to support over 30,000 users across 600 buildings. Key requirements included security, user privacy, external collaborator access, and permissioning. The solution was to simplify integration by mapping users to a table and creating services for Confluence to consume for user and group information, rather than overhauling Confluence's native user and permissioning systems. This would allow Confluence to handle its core functions while integrating with Harvard's centralized authentication and user directory.
Catalyst for transforming CO2 into methanol - Jorge Ozuna
The document describes a student group's project to develop a catalyst that converts carbon dioxide into methanol, a useful fuel. The rise in CO2 emissions since the industrial revolution and the eventual depletion of fossil fuels motivate the project. The document explains the methodology of the conversion process and the results obtained, with the goal of helping the planet and creating an alternative fuel.
Collins Kipkoech Togom has over 10 years of experience in community development work. He holds a Bachelor of Arts in Community Development from Daystar University in Nairobi, Kenya. Currently, he is the Program Coordinator at A-Kili Foundation where he develops community projects to encourage participation. He is also working on a mobile application to provide mentorship and resources to young entrepreneurs. His objective is to work in a competitive environment where he can learn, share his experience, and impact lives in the community.
National open Internet exam, OMI 2001 - MaryRomero77
This document presents the answers and explanations to 22 problems or questions from a national open Internet exam. It covers a variety of mathematical and logical topics such as integers, arithmetic operations, geometry, probability, and deductive reasoning.
Next Education India provides K-12 educational technology solutions to over 7,000 schools across India, impacting 7 million students. It offers various products and services to transform school education, including online learning platforms, adaptive learning solutions, teacher training, school management software, and digital content. Next Education's consultancy arm, NextDeeksha, provides training and advisory services to schools to improve the quality of education. These include CBSE-mandated teacher training, school audits and consulting, and customized workshops for teachers, principals, and parents on various topics to support student learning. NextDeeksha draws on Next Education's research experience with over 150,000 teachers to offer tailored solutions and guidance to help schools achieve their educational goals.
This document is an introduction to the protagonist's story about his encounter with a vampire during the spring vacation between his second and third years of high school. It describes how he was wandering the school alone after a ceremony when he crossed paths with a popular classmate named Hanekawa Tsubasa. A gust of wind lifted Hanekawa's skirt, exposing her white panties with an embroidered flower pattern. This unexpected incident marked the beginning of a chain of events that dragged…
Two federal agents who handled the Silk Road case are accused of stealing bitco... - Anatol Alizar
The criminal complaint charges Carl Force IV and Shaun Bridges, both former members of the Baltimore Silk Road Task Force, with wire fraud, money laundering, theft of government property, and conflict of interest. The complaint alleges that Force and Bridges abused their positions and engaged in schemes to steal bitcoin from the Silk Road investigation and laundered the proceeds into personal accounts. The investigation revealed evidence that Force extorted the Silk Road administrator and stole bitcoin, while Bridges received hundreds of thousands of dollars in bitcoin from an exchange that was later seized by the government.
Official Certification and Master's Programs 2013, 2nd semester - Cas Training
The document presents the catalog of certification and postgraduate programs from the company CAS-training for 2013. It includes information on courses in Cisco, Microsoft, Java, PHP, and MySQL, among others. It details the code, description, location, hours, and dates of the courses scheduled between July and December 2013.
This document presents basic geometry concepts such as point, line, angle, triangle, trapezoid, convex polygon, circle, sphere, and perimeter. It briefly explains each of these fundamental geometric concepts. It also includes credits for the sources of the images used.
This document describes the evolution of energy models throughout human history, from the pre-agricultural era to the advanced industrial era. It highlights the main energy sources in each period, such as human and animal traction in the pre-agricultural model, and the growing use of wood, coal, and finally oil in later models. The document analyzes the socioeconomic changes associated with the transition between each energy model.
The document discusses the mercy of God. It explains that God's mercy is a fundamental characteristic of his nature and that it is rich and limitless. It also emphasizes that Jesus forgives with mercy rather than condemning with the law, and that God forgives our sins through his grace and mercy.
This document describes the Labor Mediation System in Portugal, explaining:
1) How it was created in 2006 to resolve labor disputes out of court.
2) The main objectives and stages of the mediation process, including the request, appointment of the mediator, sessions, and a possible agreement.
3) The current legal requirements in the Labor Code and the Labor Procedure Code that partially regulate labor mediation.
This document provides an overview of key concepts in computer networking and data communication. It defines a computer network as two or more computers connected by transmission media, allowing users to communicate and share applications and data. The document outlines different types of networks, including local area networks (LANs), metropolitan area networks (MANs), and wide area networks (WANs). It also discusses common network topologies like star, ring, and bus, as well as client-server and peer-to-peer network models.
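The client-server model mentioned in that overview can be sketched in a few lines: one endpoint plays the server (receiving a request and replying), the other the client. The echo exchange below is a hypothetical illustration, with a local socket pair standing in for a real network connection:

```python
import socket

def echo_once(server, client, payload):
    """Client sends a payload; the server replies with an uppercased echo."""
    client.sendall(payload)       # client -> server
    data = server.recv(1024)      # server receives the request
    server.sendall(data.upper())  # server sends its response
    return client.recv(1024)      # client reads the reply

# socket.socketpair() gives two connected endpoints, standing in
# for a client and server talking over a LAN or WAN.
srv, cli = socket.socketpair()
reply = echo_once(srv, cli, b"hello network")
srv.close()
cli.close()
```

The same request/response pattern scales up to real client-server systems; only the transport (TCP over an actual network) and the protocol on top change.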
The document summarizes the key principles and concepts of Family Constellations, a therapeutic method developed by Bert Hellinger that recognizes the intergenerational transmission of family conflicts. Some of the central principles are that all members of a family have an equal right to belong, that whoever came first has priority, and that there is a "group conscience" that guides individuals' unconscious decisions for the good of the family system. The method involves identifying systemic entanglements such as the ident…
ChaCha is a question answering platform that has answered over 1 billion questions from its large database. It allows users to get definitive answers about anything from millions of Guides available 24/7. ChaCha sees high traffic with over 25 million monthly users and answers nearly 3 million questions per day across its website and mobile apps. It offers brand marketers opportunities to engage with its diverse audience segments through various online and mobile advertising options.
This teaching unit addresses the origin of life through different subject areas such as art education, environmental studies, English, and physical education. Students will work in teams to investigate celestial bodies such as Mars, Titan, Enceladus, and Europa, following the scientific method. They will carry out activities such as card and mineral hunts. They will create drawings of extraterrestrials and develop hypotheses about the origin of life on Earth, collaborating…
Information architecture (IA) involves organizing and labeling information to support usability across various mediums. An IA defines relationships between content through taxonomy, metadata schemas, and site maps. Common IA roles include content analysis, developing controlled vocabularies and content models, and ensuring information is consistently structured and labeled. While backgrounds vary, many IAs come from fields like technical writing, usability, and library science.
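A controlled vocabulary of the kind an IA maintains can be modeled as canonical terms with preferred labels and known synonyms, so authors' free-form tags resolve to one consistent label. The terms below are invented purely for illustration:

```python
# A minimal controlled vocabulary: canonical terms, each with a
# preferred display label and accepted synonyms.
VOCAB = {
    "howto":   {"label": "How-To Guide", "synonyms": {"tutorial", "walkthrough"}},
    "concept": {"label": "Concept",      "synonyms": {"overview", "explainer"}},
}

def canonical_term(word):
    """Map an author-supplied tag to its canonical vocabulary term, if any."""
    w = word.lower()
    for term, entry in VOCAB.items():
        if w == term or w in entry["synonyms"]:
            return term
    return None
```

Routing every tag through a lookup like this is what keeps labeling consistent across authors: "tutorial" and "walkthrough" both surface as the single "How-To Guide" category instead of fragmenting the navigation.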
Keith Schengili-Roberts: Improve Your Chances for Documentation Success with ... - Jack Molisani
Keith Schengili-Roberts presented on how moving to DITA and a component content management system (CCMS) can improve the chances of documentation success. He outlined key features of DITA like content reuse and separation of form from content. Four chief reasons for wanting to move included needing more efficiency, outgrowing current tools, rising localization costs, and content verification needs. A CCMS allows for versioning, workflow, metrics on production and reuse, and improved localization. It was argued that the benefits outweigh the upfront costs over time through opportunities for process improvement.
DITA Quick Start Webinar Series: Building a Project Plan - Suite Solutions
Presenters: Joe Gelb, President, Suite Solutions and Yehudit Lindblom, Project Manager, Suite Solutions
Abstract:
Migrating to DITA XML-based authoring and publishing promises rich rewards in terms of lower costs and faster time to publication. But DITA migration also requires a well-planned process that will lead you through all the steps of a successful implementation. In this webinar, experienced project manager Yehudit Lindblom and Joe Gelb will review a process that covers all the bases, helping you build your game plan for a winning DITA implementation.
Visit us at http://www.suite-sol.com
Follow us on LinkedIn http://www.linkedin.com/company/527916
This document summarizes a presentation about optimizing DITA-based content for search engine optimization. The presentation discusses how DITA content is transformed and published on the web, and what search engines like Google prioritize, such as descriptive titles, effective short descriptions, and relationship tables. It emphasizes writing content with users in mind by understanding their needs and scenarios. While techniques like keywords and Dublin Core metadata don't significantly impact rankings, focusing on user experience through topics like tasks and troubleshooting is important as search evolves to understand natural language queries.
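When DITA topics are published to the web, the topic <title> and <shortdesc> elements typically become the page title and meta description that search engines display. The element names below are standard DITA, but the transform itself is a simplified stand-in for a real DITA-OT publishing plugin:

```python
import xml.etree.ElementTree as ET

# A toy DITA concept topic; <shortdesc> is the element search-focused
# authors polish, since it often becomes the meta description.
TOPIC = """<concept id="c1">
  <title>Installing the widget</title>
  <shortdesc>Step-by-step widget installation for first-time users.</shortdesc>
</concept>"""

def seo_head(topic_xml):
    """Render a DITA topic's title and shortdesc as SEO-relevant head tags."""
    root = ET.fromstring(topic_xml)
    title = root.findtext("title").strip()
    desc = root.findtext("shortdesc").strip()
    return (f"<title>{title}</title>\n"
            f'<meta name="description" content="{desc}">')
```

This makes the presentation's advice concrete: a descriptive title and an effective short description are not just good writing, they are literally the text Google shows in results.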
Lean Analytics is a set of rules to make data science more streamlined and productive. It touches on many aspects of what a data scientist should be and how a data science project should be defined to be successful. During this presentation Richard will present where data science projects go wrong, how you should think of data science projects, what constitutes success in data science, and how you can measure progress. This session will be loaded with terms, stories, and descriptions of project successes and failures. If you're wondering whether you're getting value out of data science, how to get more value out of it, or even whether you need it at all, then this talk is for you!
What you will take away from this session
Learn how to make your data science projects successful
Evaluate how to track progress and report on the efficacy of data science solutions
Understand the role of engineering and data scientists
Understand your options for processes and software
The document provides information about pursuing a career in SharePoint. It discusses common SharePoint job roles like administrator, developer, business analyst, and project manager. It also provides details on the typical responsibilities and skills required for each role. The document aims to help readers understand the different paths they could take in a SharePoint career and determine if this field is a good fit for them. It emphasizes that gaining hands-on experience through training, user groups, and test environments is important for career development in SharePoint.
What You Need to Know Before Upgrading to SharePoint 2013 - Perficient, Inc.
Ready to join the SharePoint 2013 revolution but not sure what is involved? Are you in the middle of a migration that is behind schedule? This presentation walks you through general guidelines and common pitfalls to avoid so your transition to SharePoint 2013 will be successful.
Speaker Suzanne George discusses tips and tricks to ensure a successful SharePoint 2013 implementation and describe common mistakes that organizations make during the transition.
Whether you are in the middle of migrating to SharePoint 2013 or you are just thinking about implementation, this session will give you tools that will help you successfully deploy SharePoint within your organization.
Presenter Suzanne George, MCTS, is a Senior Technical Architect at Perficient. She has developed, administered, and architected website applications since 1995 and has worked with top 100 companies such as Netscape, AOL, Sun Microsystems, and Verio. Her experience includes custom applications and SharePoint integration with applications such as ESRI, Deltek Accounting Software, and SAP. Suzanne sits on the MSL IT Manager Advisory Council, was a contributing author for SharePoint 2010 Administrators, and presents at SharePoint Saturdays around the country.
Improve your Chances for Documentation Success with DITA and a CCMS LavaCon L... - IXIASOFT
This document discusses how adopting DITA and a component content management system (CCMS) can improve documentation success. It outlines key features of DITA, including content reuse. Four main reasons for adopting DITA and a CCMS are discussed: needing more efficiency, outgrowing current tools, rising localization costs, and needing content verification. Four things that can be done with DITA and a CCMS are also presented: versioning content, implementing workflows, measuring documentation metrics, and improving localization. The presenter is then available for questions.
Ruth Preston has over 15 years of experience in writing, editing, instructional design, and project coordination. She has worked as an instructional designer, technical editor, and content manager at companies including Sublime Media, Microsoft, and Charles Schwab. Preston has expertise in Microsoft Office, SharePoint, and HTML and is proficient in developing training materials, editing documents, and managing websites and SharePoint sites.
The document discusses metadata strategies for DITA content at an enterprise scale. It introduces the [A] Content Intelligence Framework, which separates structure and semantics using a Master Content Model and Master Semantic Model. The framework maximizes investments in DITA by enabling metadata-enriched, structured content to be delivered across multiple channels. The document also reviews DITA's built-in metadata and semantic mechanisms and their strengths and weaknesses for implementing metadata at scale.
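DITA's built-in metadata lives in a topic's <prolog>, nested under <metadata> and <keywords>; these element names are standard DITA, while the small extraction helper below is an assumption for illustration, not part of the [A] framework the presentation describes:

```python
import xml.etree.ElementTree as ET

# A toy DITA topic carrying metadata in its prolog, as the
# DITA standard defines it.
TOPIC = """<topic id="t1">
  <title>Pump maintenance</title>
  <prolog>
    <metadata>
      <keywords><keyword>pump</keyword><keyword>maintenance</keyword></keywords>
    </metadata>
  </prolog>
</topic>"""

def topic_keywords(topic_xml):
    """Collect the <keyword> values from a DITA topic's prolog metadata."""
    root = ET.fromstring(topic_xml)
    return [k.text for k in root.iter("keyword")]
```

Harvesting metadata this way is what makes an enterprise-scale strategy workable: once keywords and other prolog values are machine-readable, they can drive faceted search, channel routing, and reuse reporting across the whole content set.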
The Senior Systems Analyst role is responsible for strategically designing and implementing applications and systems in consultation with senior leadership. This includes business intelligence integration, application rationalization, architecture, design, and development. As a driver of innovation, the Senior Systems Analyst identifies opportunities for improvement and promotes best practices. Key responsibilities include designing systems, leading implementation projects, performing data analysis, and contributing to strategic and automation plans to align with organizational goals. The role requires advanced technical skills and experience in areas such as database design, application development, and project management.
Northern New England Tableau User Group (TUG) May 2024patrickdtherriault
Join us live in Portland or over the wire for networking and two fantastic presentations! Data viz freelancer Desireé Abbott will demonstrate how adding interactivity to your dashboards will delight and spark curiosity in your users. Then, Charlotte Taft & Laurie Rugemer will reprise their TC24 presentation on the keys to building a successful analytics team.
Data-Ed Slides: Data Modeling Strategies - Getting Your Data Ready for the Ca...DATAVERSITY
Because every organization produces and propagates data as part of their day-to-day operations, data trends are becoming more and more important in the mainstream business world’s consciousness. For many organizations in various industries, though, comprehension of this development begins and ends with buzzwords: “Big Data”, “NoSQL”, “data scientist”, and so on. Few realize that any and all solutions to their business problems, regardless of platform or relevant technology, rely to a critical extent on the data model supporting them. As such, data modeling is not an optional task for an organization’s data effort, but rather a vital activity that facilitates the solutions driving your business.
Instead of the technical minutiae of data modeling, this webinar will focus on its value and practicality for your organization. In doing so, we will:
- Address fundamental data modeling methodologies, their differences and various practical applications, and trends around the practice of data modeling itself
- Discuss abstract models and entity frameworks, as well as some basic tenets for application development
- Examine the general shift from segmented data modeling to more business-integrated practices
Hand In Glove Content Strategy & SEO 10 2008 FinalLynn Leitte
A capabilities deck that promotes why Content Strategy and SEO efforts are most effective for a project when there is overlap between the two disciplines.
The document outlines a 9 step process for developing a successful technology plan for nonprofit river groups. It discusses identifying stakeholders, needs, assets, potential solutions, creating a living document and budget, fundraising strategy, timelines, and taking an iterative approach. The goal is to help pull together the right people, focus on goals, and create a compelling story for funding. Key steps include identifying needs, exploring off-the-shelf and custom solutions, creating a total cost of ownership budget, and developing a fundraising strategy focused on problems solved rather than just technology.
Building SharePoint Enterprise Platforms - Off the beaten path - SharePoint S...Andy Talbot
This document contains notes from a presentation on building and managing enterprise SharePoint platforms. It includes considerations around governance, roles and responsibilities, capacity planning, hardware, monitoring, backups, and change management. The presentation emphasizes having the right people, processes, and documentation in place to support a large SharePoint deployment. It also stresses understanding technical requirements and having a plan to scale the platform over time.
Join Concept Searching and partner C/D/H for this thought-provoking webinar on what intelligent enterprise search should be.
Our solution is unique in the marketplace, and overcomes the limitations of other enterprise search engines. It was originally deployed as an enterprise search solution for engineers and support staff.
This webinar will focus on how one unified view of all unstructured, semi-structured, and structured data assets, including 2D and 3D images, can be integrated into the search interface, with previewers and navigational aids.
Both business and technical professionals will benefit from this session:
• Understand how the technology works, and how it can be set up with a platform and search engine of choice
• See how search returns results, and provides visual and navigational aids for all information retrieved
• Watch how to select an image based on color, size, or shape
• Learn how any business or artificial intelligence applications can benefit from the multi-term metadata created
• Find out why the search framework provides a responsive user interface for any tablet, PC or mobile device
In this session we explore the different elements that you need to oversee when you are planning an intranet for your company. We will discuss the different intranet approaches we can follow based on the way you want to engage with your end users. Learn the difference between a communication portal and a collaboration team site in order to establish an intranet framework able to scale with business needs.
Similar to DITA Workflow 101- An Action Plan for DITA Implementation (20)
Have you ever been confused by the myriad of choices offered by AWS for hosting a website or an API?
Lambda, Elastic Beanstalk, Lightsail, Amplify, S3 (and more!) can each host websites + APIs. But which one should we choose?
Which one is cheapest? Which one is fastest? Which one will scale to meet our needs?
Join me in this session as we dive into each AWS hosting service to determine which one is best for your scenario and explain why!
Generating privacy-protected synthetic data using Secludy and MilvusZilliz
During this demo, the founders of Secludy will demonstrate how their system utilizes Milvus to store and manipulate embeddings for generating privacy-protected synthetic data. Their approach not only maintains the confidentiality of the original data but also enhances the utility and scalability of LLMs under privacy constraints. Attendees, including machine learning engineers, data scientists, and data managers, will witness first-hand how Secludy's integration with Milvus empowers organizations to harness the power of LLMs securely and efficiently.
Programming Foundation Models with DSPy - Meetup SlidesZilliz
Prompting language models is hard, while programming language models is easy. In this talk, I will discuss the state-of-the-art framework DSPy for programming foundation models with its powerful optimizers and runtime constraint system.
Let's Integrate MuleSoft RPA, COMPOSER, APM with AWS IDP along with Slackshyamraj55
Discover the seamless integration of RPA (Robotic Process Automation), COMPOSER, and APM with AWS IDP enhanced with Slack notifications. Explore how these technologies converge to streamline workflows, optimize performance, and ensure secure access, all while leveraging the power of AWS IDP and real-time communication via Slack notifications.
Driving Business Innovation: Latest Generative AI Advancements & Success StorySafe Software
Are you ready to revolutionize how you handle data? Join us for a webinar where we’ll bring you up to speed with the latest advancements in Generative AI technology and discover how leveraging FME with tools from giants like Google Gemini, Amazon, and Microsoft OpenAI can supercharge your workflow efficiency.
During the hour, we’ll take you through:
Guest Speaker Segment with Hannah Barrington: Dive into the world of dynamic real estate marketing with Hannah, the Marketing Manager at Workspace Group. Hear firsthand how their team generates engaging descriptions for thousands of office units by integrating diverse data sources—from PDF floorplans to web pages—using FME transformers, like OpenAIVisionConnector and AnthropicVisionConnector. This use case will show you how GenAI can streamline content creation for marketing across the board.
Ollama Use Case: Learn how Scenario Specialist Dmitri Bagh has utilized Ollama within FME to input data, create custom models, and enhance security protocols. This segment will include demos to illustrate the full capabilities of FME in AI-driven processes.
Custom AI Models: Discover how to leverage FME to build personalized AI models using your data. Whether it’s populating a model with local data for added security or integrating public AI tools, find out how FME facilitates a versatile and secure approach to AI.
We’ll wrap up with a live Q&A session where you can engage with our experts on your specific use cases, and learn more about optimizing your data workflows with AI.
This webinar is ideal for professionals seeking to harness the power of AI within their data management systems while ensuring high levels of customization and security. Whether you're a novice or an expert, gain actionable insights and strategies to elevate your data processes. Join us to see how FME and AI can revolutionize how you work with data!
How to Get CNIC Information System with Paksim Ga.pptxdanishmna97
Pakdata Cf is a groundbreaking system designed to streamline and facilitate access to CNIC information. This innovative platform leverages advanced technology to provide users with efficient and secure access to their CNIC details.
What do a Lego brick and the XZ backdoor have in common?Speck&Tech
ABSTRACT: At first glance, a Lego brick and the XZ backdoor might seem to have in common only the fact that both are building blocks, or dependencies, of creative and software projects. In reality, a Lego brick and the XZ backdoor case have much more in common than that.
Join the presentation to dive into a story of interoperability, standards, and open formats, and then discuss the important role that contributors play in a sustainable open-source community.
BIO: An advocate for free software and for standard, open formats. She has been an active member of the Fedora and openSUSE projects and co-founded the LibreItalia Association, where she was involved in several LibreOffice-related events, migrations, and training efforts. She previously worked on LibreOffice migrations and training courses for several public administrations and private companies. Since January 2020 she has worked at SUSE as a Software Release Engineer for Uyuni and SUSE Manager, and when she is not following her passion for computers and for Geeko, she cultivates her curiosity about astronomy (which is where her nickname, deneb_alpha, comes from).
Infrastructure Challenges in Scaling RAG with Custom AI modelsZilliz
Building Retrieval-Augmented Generation (RAG) systems with open-source and custom AI models is a complex task. This talk explores the challenges in productionizing RAG systems, including retrieval performance, response synthesis, and evaluation. We’ll discuss how to leverage open-source models like text embeddings, language models, and custom fine-tuned models to enhance RAG performance. Additionally, we’ll cover how BentoML can help orchestrate and scale these AI components efficiently, ensuring seamless deployment and management of RAG systems in the cloud.
HCL Notes and Domino License Cost Reduction in the World of DLAUpanagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-and-domino-license-cost-reduction-in-the-world-of-dlau/
The introduction of DLAU and the CCB & CCX licensing model caused quite a stir in the HCL community. As a Notes and Domino customer, you may have faced challenges with unexpected user counts and license costs. You probably have questions on how this new licensing approach works and how to benefit from it. Most importantly, you likely have budget constraints and want to save money where possible. Don’t worry, we can help with all of this!
We’ll show you how to fix common misconfigurations that cause higher-than-expected user counts, and how to identify accounts which you can deactivate to save money. There are also frequent patterns that can cause unnecessary cost, like using a person document instead of a mail-in for shared mailboxes. We’ll provide examples and solutions for those as well. And naturally we’ll explain the new licensing model.
Join HCL Ambassador Marc Thomas in this webinar, with a special guest appearance from Franz Walder. It will give you the tools and know-how to stay on top of what is going on with Domino licensing. You will be able to lower your cost through an optimized configuration and keep it low going forward.
These topics will be covered
- Reducing license cost by finding and fixing misconfigurations and superfluous accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how to best utilize it
- Tips for common problem areas, like team mailboxes, functional/test users, etc.
- Practical examples and best practices to implement right away
CAKE: Sharing Slices of Confidential Data on BlockchainClaudio Di Ciccio
Presented at the CAiSE 2024 Forum, Intelligent Information Systems, June 6th, Limassol, Cyprus.
Synopsis: Cooperative information systems typically involve various entities in a collaborative process within a distributed environment. Blockchain technology offers a mechanism for automating such processes, even when only partial trust exists among participants. The data stored on the blockchain is replicated across all nodes in the network, ensuring accessibility to all participants. While this aspect facilitates traceability, integrity, and persistence, it poses challenges for adopting public blockchains in enterprise settings due to confidentiality issues. In this paper, we present a software tool named Control Access via Key Encryption (CAKE), designed to ensure data confidentiality in scenarios involving public blockchains. After outlining its core components and functionalities, we showcase the application of CAKE in the context of a real-world cyber-security project within the logistics domain.
Paper: https://doi.org/10.1007/978-3-031-61000-4_16
Essentials of Automations: The Art of Triggers and Actions in FMESafe Software
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
In the rapidly evolving landscape of technologies, XML continues to play a vital role in structuring, storing, and transporting data across diverse systems. The recent advancements in artificial intelligence (AI) present new methodologies for enhancing XML development workflows, introducing efficiency, automation, and intelligent capabilities. This presentation will outline the scope and perspective of utilizing AI in XML development. The potential benefits and the possible pitfalls will be highlighted, providing a balanced view of the subject.
We will explore the capabilities of AI in understanding XML markup languages and autonomously creating structured XML content. Additionally, we will examine the capacity of AI to enrich plain text with appropriate XML markup. Practical examples and methodological guidelines will be provided to elucidate how AI can be effectively prompted to interpret and generate accurate XML markup.
Further emphasis will be placed on the role of AI in developing XSLT, or schemas such as XSD and Schematron. We will address the techniques and strategies adopted to create prompts for generating code, explaining code, or refactoring the code, and the results achieved.
The discussion will extend to how AI can be used to transform XML content. In particular, the focus will be on the use of AI XPath extension functions in XSLT, Schematron, Schematron Quick Fixes, or for XML content refactoring.
The presentation aims to deliver a comprehensive overview of AI usage in XML development, providing attendees with the necessary knowledge to make informed decisions. Whether you’re at the early stages of adopting AI or considering integrating it in advanced XML development, this presentation will cover all levels of expertise.
By highlighting the potential advantages and challenges of integrating AI with XML development tools and languages, the presentation seeks to inspire thoughtful conversation around the future of XML development. We’ll not only delve into the technical aspects of AI-powered XML development but also discuss practical implications and possible future directions.
Things to Consider When Choosing a Website Developer for your Website | FODUUFODUU
Choosing the right website developer is crucial for your business. This article covers essential factors to consider, including experience, portfolio, technical skills, communication, pricing, reputation & reviews, cost and budget considerations and post-launch support. Make an informed decision to ensure your website meets your business goals.
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
AI 101: An Introduction to the Basics and Impact of Artificial IntelligenceIndexBug
Imagine a world where machines not only perform tasks but also learn, adapt, and make decisions. This is the promise of Artificial Intelligence (AI), a technology that's not just enhancing our lives but revolutionizing entire industries.
DITA Workflow 101- An Action Plan for DITA Implementation
1.
2. • Founded in 1973 in San Antonio, TX
• Vast experience in all types of XML-based authoring and conversion (SGML, XML, DITA, S1000D)
• Over 10 years of experience helping companies integrate the DITA standard
• Equipped with the resources to tackle a project of any size from beginning to end
3. Today’s Speakers
Joe Storbeck
Senior Structured Data Analyst
Joe has been working closely with the DITA specification since its release at IBM in 2001, and is a highly regarded expert in its use and application.
Matthew Cassi
Account Executive - DITA
Account Executive at JANA, Inc. with a focus on structured data methodologies (DITA, S1000D) and the content management industry and communities.
4. The DITA Source
Joe Storbeck’s DITA Source blog: http://www.janacorp.com/blog
10. Examples of questions to ask for a successful DITA implementation:
• Is the source material in a different structured language, or are you converting from a desktop publisher?
• Is your source data consistent, or even well-organized?
• With your content in its present form, is it even possible to categorize the data into the various topic type options?
Data Analysis
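As a rough illustration of the data-analysis step, the Python sketch below inventories a legacy documentation folder by file extension, a quick first signal of what conversion paths a project will need. The folder name `legacy_docs` is a hypothetical example, not something from the presentation.

```python
# Illustrative sketch for the data-analysis step: count legacy source
# files by extension to gauge conversion scope. Folder name is assumed.
from collections import Counter
from pathlib import Path

def inventory(root: str) -> Counter:
    """Count files under `root` by extension (e.g. .fm, .doc, .xml)."""
    return Counter(p.suffix.lower() for p in Path(root).rglob("*") if p.is_file())

if __name__ == "__main__":
    root = Path("legacy_docs")  # hypothetical source folder
    if root.is_dir():
        for ext, n in inventory(str(root)).most_common():
            print(f"{ext or '(no extension)'}: {n} file(s)")
```

A mostly-FrameMaker inventory points toward scripted conversion, while a mixed bag of desktop-publishing formats usually means several different conversion paths.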
11. Architecture Strategy Development
Data Analysis
Conversion & Authoring Effort
Style Sheet Development
Final Deliverables
Training
Technology Evaluation & Selection
Architecture Strategy Development
12. • How are you going to move existing data into the various elements?
• What are you going to move?
• How are you going to repurpose existing DITA elements to fit your needs, while always trying to stay within the framework of the DITA standard?
• How are you going to identify material that can be reused?
• What will your file directory look like?
Architecture Strategy Development
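To make the reuse question above concrete, here is a minimal, illustrative Python sketch of DITA-style `conref` reuse: a "warehouse" topic holds a shared note, and a second topic pulls it in by reference. The file name in the conref value is hypothetical, and the resolver is a toy; in practice resolution is handled by your toolchain (e.g. the DITA Open Toolkit).

```python
# Toy demonstration of DITA conref reuse, not a real DITA processor.
import xml.etree.ElementTree as ET

WAREHOUSE = """<topic id="shared"><body>
  <note id="safety">Disconnect power before servicing.</note>
</body></topic>"""

TASK = """<topic id="replace-fan"><body>
  <note conref="warehouse.dita#shared/safety"/>
</body></topic>"""

def resolve_conrefs(doc: ET.Element, warehouse: ET.Element) -> ET.Element:
    """Fill each conref element with the text of the referenced element."""
    for el in doc.iter():
        ref = el.attrib.pop("conref", None)
        if ref:
            target_id = ref.rsplit("/", 1)[-1]       # id after the last "/"
            src = warehouse.find(f".//*[@id='{target_id}']")
            if src is not None:
                el.text = src.text                   # copy the reused content
    return doc

task = resolve_conrefs(ET.fromstring(TASK), ET.fromstring(WAREHOUSE))
print(task.find(".//note").text)  # -> Disconnect power before servicing.
```

Writing the safety note once and referencing it everywhere is exactly the kind of repurposing strategy the bullets above ask you to plan for, while staying inside the DITA standard.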
13. Technology Evaluation & Selection
Data Analysis
Conversion & Authoring Effort
Style Sheet Development
Final Deliverables
Training
Architecture Strategy Development
Technology Evaluation & Selection
14. • Take the time to understand your data and how it fits into your overall architecture strategies
• Use the information that you gather to choose the right authoring software and/or CMS for your organization
• The CMS can be the most critical decision you make. Don’t rush it!
Technology Evaluation & Selection
15. Training
Data Analysis
Conversion & Authoring Effort
Style Sheet Development
Final Deliverables
Architecture Strategy Development
Training
Technology Evaluation & Selection
16. • Give your editor or tools support person the training task
• Provide training as close to the beginning of the conversion process as possible
• Don’t throw DITA at your staff and hope for the best
Training
17. You may need to convince writers that topic-based writing works.
Show them the DITA difference.
Training
19. Conversion & Authoring Effort
Before:
• Clean up tags
• Create a topic-based outline
• Look for duplicate information
After:
• Fix any formatting problems
• Fix broken links
• Verify topics have been typed correctly
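Parts of the post-conversion cleanup, fixing broken links in particular, can be partially automated. Below is a hedged Python sketch, assuming converted topics sit together in one folder as `.dita` files; a real check would also verify fragment identifiers and conref targets.

```python
# Illustrative post-conversion check: report hrefs whose local target
# file is missing from the converted topic set. Layout is an assumption.
import xml.etree.ElementTree as ET
from pathlib import Path

def broken_links(topic_dir: str) -> list[tuple[str, str]]:
    """Return (topic file, href) pairs whose local target does not exist."""
    root = Path(topic_dir)
    missing = []
    for path in root.glob("*.dita"):
        for el in ET.parse(path).iter():
            href = el.get("href", "")
            if href and "://" not in href:        # skip external links
                target = href.split("#", 1)[0]    # drop the fragment part
                if target and not (root / target).exists():
                    missing.append((path.name, href))
    return missing
```

Running a check like this after every conversion batch turns "fix broken links" from a manual hunt into a reviewable to-do list.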
22. Data Analysis
Technology Evaluation & Selection
Training
Conversion & Authoring Effort
Style Sheet Development
Architecture Strategy Development
Final Deliverables
Final Deliverables
29. JANA KNOWS DATA.
Joe Storbeck
(210) 616-0083 x215
jstorbeck@janacorp.com
JANA Main
(210) 616-0083
solutions@janacorp.com
We thank you for participating!
Matthew Cassi
(210) 309-5944
mcassi@janacorp.com
www.janacorp.com
Editor's Notes
Hello and welcome to the JANA Inc. “DITA Workflow 101” webinar, we thank you for your interest and attendance today. We know you’re all very busy people, so making the time to attend today’s webinar is much appreciated. My name is Matthew Cassi and I’m joined today by one of JANA’s DITA experts Joe Storbeck.
Before we go into Joe’s background, let us first give you some JANA history. (click)
For those of you who might not be familiar with JANA, we are a technical documentation services company established in San Antonio, Texas, in 1973. For 43 years, our core business has been the development, management and delivery of quality technical documentation. We have the expertise and the resources necessary to support a project of any size in the authoring, conversion, management, and publishing of quality content, with over 10 years of experience implementing the DITA standard.
But first, you might just be wondering who your presenters are – (click)
As I stated earlier my name is Matthew Cassi and I’ll be your presenter today. As an account executive for JANA, I represent a point of contact for our clients, but I’m sure you’d rather hear about JANA’s DITA consultation and development leader – Mr. Joe Storbeck.
Welcome, Joe. (Thanks, Matt.)
Joe has been a technical communication professional for over 30 years with an extensive background in the development of structured data and data application methodologies, leading technical documentation teams for some of the world’s largest and most successful companies, including IBM, Citibank, and AIG. He is also recognized as an expert in the development and application of the DITA standard.
(click)
Joe’s Dialogue - Thank you, I’m excited to be here today! In addition to consulting for JANA, I regularly teach classes on topic-based authoring and I publish a DITA-related blog called The DITA Source, which you can view on the JANA website. To give you a little more background, during my time at IBM, I worked directly with the team responsible for developing the DITA architecture.
MATT - Thanks Joe. (CLICK) So you can check out Joe’s DITA Source blog at janacorp.com/blog. Don’t worry, if you missed it, we’ll provide a link to Joe’s blog in our Webinar follow up email. So, now that we’ve got that out of the way, let’s walk through today’s agenda. (click)
Here’s a quick overview of today’s webinar agenda.
While we expect this webinar to last around 30 minutes, we will try to keep the last 5 minutes or so free for answering any questions that you might have. You will notice that there is a box in the attendee panel that you can use anytime to submit a question. Our production team here will collect the questions as we go, and Joe and I will try to answer as many of those as we can before we close.
You can see from the agenda that we’re really going to focus on the overall workflow, but we want to start the discussion with an overview of the types of people who will be involved. So let’s take a quick look at some of the different roles that are important to the process, and some of the responsibilities each member has to ensure a successful DITA project.
As an organization investigates moving to a DITA writing environment, new roles, responsibilities, and skills must be identified as many individuals must work together to develop tools, procedures, and processes for the new environment. We have identified four basic roles that are necessary for a successful DITA implementation, although the number and different types of roles might change based on the complexity of your conversion needs. The roles can also vary significantly when moving past the conversion phase and into the authoring phase.
In any case, the best DITA adoption results occur when key roles collaborate closely together.
Matt - Let’s take a closer look at each of the four roles that we have identified. Note that we’re describing each role as an individual, but there may be a number of people within your organization who perform each role. Joe, why don’t you first tell us what functions the Information Architect would perform as part of the DITA team.
Joe - OK. The Information Architect is very familiar with the organization’s content and understands its life cycle. These individuals often act as the key DITA evangelists within your organization, explaining what to expect in terms of changes to procedures and processes, and getting your IT team on board. They also manage the conversion process, define milestones, and evaluate when and how documents will be converted or authored. The IA also designs the overall navigation, tracks progress, reports status, and determines where and how topics should be presented. Also note that there may be multiple people in this or any other role.
Matt – Okay, and how about the Author? (click)
Joe - Authors are the individuals who, upon completion of training, begin writing content in DITA, often learning the tools and processes as they go. They can also create pre-conversion and new topic outlines, and track IDs and filenames (for fixing broken links later). They are usually also responsible for authoring new material, running the conversion scripts, and performing the post-migration file clean-up.
Matt – Next we have the role of Editor (CLICK) Joe can you elaborate on their responsibilities?
Joe – Sure, this individual ensures that all authors are using the DITA elements and attribute values consistently, making sure that language and structure meet requirements for usability and clarity, and that all possible reuse is being employed. Additionally, they review document outlines and provide guidance to writers during the conversion and authoring process.
Matt – and lastly we have the role of the person that we’re going to refer to as the Tech Tools Guru - (click)
Joe – Yes, the Tech Tools Guru is important because you will need someone on your team who really understands your new toolset and the vendors you will be working with – someone who will understand your customizations and how everything works together. They will also help writers with tagging issues, debugging build and stylesheet problems and they can also create a conversion script.
Matt – Thanks Joe. Please also remember that these are just a few examples of the key roles you need to identify for a successful DITA conversion. Your specific needs might dictate additional roles, like a DITA sponsor or a translation manager.
Up next is a depiction of the sample Implementation Roadmap we’ll be using throughout this presentation. (click)
Matt - On your screen now is a typical DITA implementation roadmap: From Data Analysis all the way through the publishing of the Final Deliverables.
The roadmap before you breaks down the steps by phase: Data Analysis, Architecture Strategy Development, Technology Evaluation & Selection, Training, Conversion & Authoring Effort, Style Sheet Development and Final Deliverables.
This scenario assumes that you will be converting (or at least considering converting) portions of your existing product library. If you’re developing content from scratch, some of these steps may not apply to you. Also, these steps do not have to be in this particular order; keep in mind that Data Analysis should always be your first step, but the five intermediate steps can be reordered to fit your specific data conversion needs.
But first, as a quick and dirty attempt to collect some information that will guide us through the rest of the webinar, we’d like to know which of these steps our attendees are having the most trouble with. You’ll see a survey pop up on your screen, so please take a few seconds to answer this question for us. (launch poll) We will use the data to customize some of the more topic-flexible portions of this presentation, and to give us an idea of what topics we might want to cover in upcoming webinars.
(show poll results)
Joe - OK. It looks like the majority of you are having trouble with (largest poll results). We will make sure that we cover it in more detail when we get to that portion of the webinar.
(hide results)
(click next screen)
Matt - Now let’s start by taking a look at the Data Analysis phase.
In the decade-plus that we’ve been performing services in the DITA world, we’ve talked to company after company who did not take the time to evaluate their data prior to performing the other steps. Analyzing your data first will allow you to make an informed decision when performing each of the later steps, especially architecture strategy development and technology selection.
If you have legacy data that you need to convert, it is especially important to take the time to review your existing content. There are a number of critical questions that have to be asked and answered prior to starting a DITA conversion and authoring effort, and you have to get to know your data before any of these questions can be answered.
(click)
Matt – Questions like those shown here will help to identify the variables and define the information you will need later on, especially in the Conversion and Authoring and Technology Evaluation stages. Additionally, questions such as: Will you be able to create scripts to do the conversion? or Will you need to select tools that will help automate portions of the conversion? The entire scope of your project can change depending on the answer to a single question, and the answer may not be applicable to the entire data set. For instance, if your existing library is in a structured environment, your project will look very different from one which is converting data written using a desktop publishing package.
The steps that you need to take will be determined by the structure of your existing data. You may find that a customized set of conversion scripts is the best method to convert your data, or you may be able to get away with a simple cut and paste using a piece of DITA-smart authoring software.
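To make the script question concrete, here is a minimal sketch of one automated conversion step, assuming a simple, well-formed legacy fragment; the element mapping and the sample input are illustrative assumptions, not a complete conversion pipeline:

```python
# Minimal sketch: wrap a flat HTML-like fragment into a DITA concept topic.
# The input markup and the h1/p mapping are illustrative assumptions.
import xml.etree.ElementTree as ET

def html_to_concept(html_fragment, topic_id):
    """Convert a well-formed <div><h1>...</h1><p>...</p>...</div>
    fragment into a minimal DITA concept topic."""
    src = ET.fromstring(html_fragment)
    concept = ET.Element("concept", id=topic_id)
    title = ET.SubElement(concept, "title")
    title.text = src.findtext("h1", default="Untitled")
    body = ET.SubElement(concept, "conbody")
    for p in src.findall("p"):          # map each legacy <p> straight across
        para = ET.SubElement(body, "p")
        para.text = p.text
    return ET.tostring(concept, encoding="unicode")

legacy = "<div><h1>Installing the widget</h1><p>Unpack the archive.</p></div>"
print(html_to_concept(legacy, "install_widget"))
```

A real conversion script would also handle nested lists, tables, images, and malformed source, which is exactly the rework and cleanup we talk about later.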
Data analysis is also where you can determine which portions of your documents and library can be marked for reuse. This stage will also tell you how much rewriting you will need to do when moving your documentation into DITA topic types.
Joe – BTW, keep in mind that a typical roadmap for a DITA conversion would progress from data analysis to architecture strategy development, which is how we’re going to progress in this presentation, but there may be characteristics of your data that require performing the steps in a different order. Trust your data to show you the way.
CLICK
Matt - Architecture Strategy Development: After thoroughly analyzing your legacy data, but before converting data or authoring any new content, you should begin planning the strategies that you are going to employ when you begin to build out your DITA project. Take the time to develop intelligent strategies - strategies that involve moving existing data into the various elements, or maybe repurposing existing DITA elements to fit your needs - while always trying to stay within the framework of the DITA standard.
JOE - Your information architect should be the final arbiter for the structure of your document. The IA should work closely with writers to create pre-conversion outlines and new topic outlines. The IA should also design the overall navigation and determine where and how topics will be presented. Writers should keep track of filenames, as well as topic and element IDs, which will be used later for fixing broken links.
By organizing content into small chunks, it is much easier to re-use content for multiple outputs. When content is structured this way, it becomes possible to freely mix-and-match the content so that it can suit the needs of different users.
DITA also allows the writer to set conditions for the audience, products, version, and platforms. The writer can then use these conditions to produce different outputs for different types of users.
For example, you may need to write a help system for a software application that is available in both Windows and UNIX versions. The functionality of the application is similar in both operating systems, but there are a few differences between the Windows and UNIX versions. By using the platform attribute, you can indicate which operating system the topic covers. When you publish your files, you can use the platform attribute to produce documentation for both operating systems.
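As a sketch of how that platform flagging might look inside a topic (the element names are standard DITA; the content itself is invented for illustration):

```xml
<task id="start_app">
  <title>Starting the application</title>
  <taskbody>
    <steps>
      <!-- Flagged per operating system; filtered at publish time. -->
      <step platform="windows"><cmd>Double-click app.exe.</cmd></step>
      <step platform="unix"><cmd>Run ./app from a shell prompt.</cmd></step>
    </steps>
  </taskbody>
</task>
```

At publish time, the platform attribute lets this one source file produce both a Windows-only and a UNIX-only deliverable.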
We’ll focus on these attributes later when we discuss the final deliverables phase. (CLICK)
Matt – Technology Evaluation Selection: So once you have taken the time to understand your data and how it fits into your overall architecture strategies, you can begin the search for the software that best fits your company’s requirements. Choosing the right authoring software and CMS for your organization is a critical step in ensuring the success of the overall project. (CLICK)
Matt - Take your time with this step. This is one of the biggest potential stumbling blocks in a successful and efficient DITA implementation. We have talked to many companies that, once they got approval and the money for a DITA implementation, immediately invested a good portion of their budget in technology that didn’t fit their needs. The folks doing the buying either didn’t talk to the folks that analyzed the data, or the Information Architects weren’t consulted regarding the library structure and output, or they skipped the data analysis altogether. One company bought technology they didn’t fully understand and after a full year of dedicated conversion effort they had converted only one manual to DITA - a small fraction of the several hundred titles within their library of legacy data.
JOE - A few quick tips to make sure that you don’t fall prey to the potential pitfalls in this step:
Do your homework in the two previous steps. Once you understand your data and how it fits into your overall architecture strategies, you can begin the search for the software that best fits your company’s requirements.
Choosing the right authoring software and/or Content Management System for your organization is perhaps the most important step in ensuring the success of the overall project. Make sure that you leverage the information that you have gathered to make an informed decision for both.
Don’t be afraid to move on to another step before making a final decision on which CMS to use. The CMS can be the most critical decision you make. Don’t rush it!
Also, I think that it’s important to note here that DITA does not necessarily require the use of a CMS. After looking at all of the options and project variables you may decide that it’s best for your organization to move forward without a CMS. This is a perfectly legitimate alternative, although what you’ll typically find is that the larger the scale of your DITA project, the more benefits you will get from using a CMS to manage the use and storage of your DITA topic files.
(CLICK)
Matt - Training: Authoring and converting DITA topics is about more than just knowing what information goes where. It requires an understanding of the benefits of using the DITA standard, as well as the benefits of topic-based authoring. Writers who understand both are going to be a major asset to the authoring and conversion team.
JOE – Keep in mind that training can happen at any time prior to authoring and conversion, but the training should happen as close to the start of the authoring and conversion effort as possible.
CLICK
Matt - Also, you might want to give your editor or tech tools guru the training task. That person can act as a counselor for writers as they move through the DITA implementation. Writers are always tempted to find shortcuts. They are creative! And they will find ways to achieve goals that are outside of the DITA standard. The training person should review DITA tags to ensure they are used correctly.
We also want to point out that writers need training to avoid the many pitfalls that are possible. DITA introduces changes that will affect writers and developers differently. Plan for the change and allow time for real learning. Don’t just throw DITA at your staff and hope for the best.
Joe – DITA can seem intimidating to writers at first. As mentioned in one of our previous webinars, when I was teaching DITA to writers at IBM, one woman told me that she wasn’t going to learn DITA; she said she would rather retire than go through that. And…… she did! That’s when we realized that in addition to educating our writers on HOW to use DITA, we had to also educate them on WHY we were using DITA.
CLICK
You may need to convince writers that topic-based writing works. Here’s how you can show them the differences:
JOE – If you look at this chart, it shows the major differences between book-based and topic-based structure. In its most basic form, topic-based documentation provides the individual building blocks for quality documentation. Topic-based documentation gets to the point. Each topic answers one question: How do I…? What is…? What went wrong…? Topics should only be a few paragraphs long, should stick to one subject, and should be able to stand on their own.
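A hedged sketch of what one such stand-alone topic might look like as a DITA concept (the content is invented for illustration):

```xml
<concept id="what_is_a_topic">
  <title>What is a topic?</title>
  <shortdesc>A topic is a self-contained unit of content that answers
    a single question.</shortdesc>
  <conbody>
    <p>Because each topic sticks to one subject and stands on its own,
      it can be reused in any deliverable that needs it.</p>
  </conbody>
</concept>
```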
CLICK
Next we have the Conversion & Authoring Efforts: This is the step where all of the work that you have done in each of the preceding steps pays off for you, in the form of efficient automated conversions and intelligent topic-based authoring.
Everyone should be involved in this effort. And it is important to remember that you might need to… no, you WILL need to do rework. Automated conversion tools can’t and won’t achieve a perfect conversion on their own. You are always going to have to do rework and cleanup.
Before you start your conversion:
Clean up tags
Create a topic-based outline
Look for duplicate information
If your writing staff is large, you might want to create a regularly scheduled meeting in which the staff can discuss problems or questions they are having with DITA tagging and formatting. You could use these sessions to create a Best Practices document for future projects and staff members. When your writers realize that EVERYONE is learning, the transition to topic-based authoring is easier.
After your conversion:
Fix any formatting problems
Fix broken links
Verify that topics have been typed correctly
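As an illustration of the link-fixing step, here is a minimal sketch of a post-conversion check that flags href values pointing at files missing from the topic set; the filenames and the in-memory layout are assumptions made for the example:

```python
# Minimal sketch of a post-conversion link check: scan DITA topics for
# href values whose target file is missing from the converted set.
# Filenames and the in-memory layout are illustrative assumptions.
import xml.etree.ElementTree as ET

def broken_hrefs(topics):
    """topics maps filename -> topic XML string. Returns (filename, href)
    pairs whose href targets a local file not present in the set."""
    broken = []
    for fname, xml_text in topics.items():
        root = ET.fromstring(xml_text)
        for el in root.iter():
            href = el.get("href")
            if href and "://" not in href:       # skip external links
                target = href.split("#")[0]      # drop any fragment id
                if target and target not in topics:
                    broken.append((fname, href))
    return broken

topics = {
    "install.dita": '<concept id="i"><title>t</title>'
                    '<conbody><p><xref href="missing.dita"/></p></conbody></concept>',
    "intro.dita":   '<concept id="j"><title>t</title>'
                    '<conbody><p><xref href="install.dita"/></p></conbody></concept>',
}
print(broken_hrefs(topics))
```

A production check would read the files from disk and also validate topic typing, but even a small script like this catches the bulk of broken links early.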
CLICK
And now we move to the development of stylesheets: While stylesheets are needed to transform XML source into output, most DITA-smart software titles come with a set of out-of-the-box stylesheets you can use if you want to publish your data with no bells or whistles. But if you want to customize the output, such as highlighting certain elements, you can modify the existing stylesheets to do so. If you create a new specialization, you’ll need to create a new stylesheet to match. However, we strongly suggest you not create new specializations unless you’re well versed in the different customization features. Because each rendering application is different, each distinct output type requires a different transform and a different formatting stylesheet.
CLICK
Output options include, but are not limited to, PDF, WebHelp, CHM, EPUB, JavaHelp, Eclipse Help, XHTML, and other structured models - each requiring its own stylesheet(s).
Stylesheets may be written in a variety of languages. For example, XSL, CSS, CSS3, DSSSL, and FOSI. Additionally, there are many tools that can process your stylesheets.
Please note that not every processing tool renders identically. Be sure to choose a language and tool set that suits your needs and is sustainable.
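For instance, a small XHTML-output customization that bolds command names might look like the following; the class-token match follows common DITA Open Toolkit override conventions, but treat the details as a sketch rather than a drop-in plugin:

```xml
<xsl:stylesheet version="2.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <!-- Override: render <cmdname> elements in bold in XHTML output. -->
  <xsl:template match="*[contains(@class, ' sw-d/cmdname ')]">
    <b class="cmdname"><xsl:apply-templates/></b>
  </xsl:template>
</xsl:stylesheet>
```

Note that matching on the @class token rather than the element name is the standard way to keep an override working for specialized elements too.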
CLICK
Final Deliverables: Once your data has been properly converted and authored and your style sheets have been developed, outputting and publishing your data should be as easy as pushing a button - that’s the true beauty of DITA.
CLICK
Your final delivery may be a PDF and an online document. You only need one dataset with DITA!
JOE - One of the most useful features in DITA is conditional processing (or profiling), which allows you to use the same set of source files to create different versions of your documentation. Content is marked in a way that lets you filter out entire topics, paragraphs, sentences, and even words. DITA offers a set of standard, self-explanatory profiling attributes that are applicable to most elements - for example, the audience attribute. (CLICK)
Suppose you are writing a topic for an audience that includes both advanced and novice users. A novice will need more explanation than an advanced user, so if you look at the outlined paragraphs, the paragraph tags have the audience attribute set to novice. An advanced user would know what this information means without having to read it. So when you produce the documentation for the novice user, your ditaval file would read “Include Audience = Novice”, and all of those paragraphs would show up in your final documentation.
If you’re producing documentation for an advanced user, in the ditaval file you would indicate “Exclude Audience = Novice”. The resulting transform would exclude the paragraphs intended for the novice audience.
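In DITAVAL syntax, that spoken “Exclude Audience = Novice” corresponds to a single prop rule; everything around the one rule here is illustrative:

```xml
<val>
  <!-- Build the advanced manual: drop novice-only content. -->
  <prop att="audience" val="novice" action="exclude"/>
</val>
```

Swapping action="exclude" for action="include" gives you the novice build from the same source files.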
In this next example we have commands listed in both Unix and Windows styles. In the outlined tag you’ll see there is a platform attribute that says either windows or unix.
To produce documentation for the windows format your DITAVAL file would indicate “Include platform=windows Exclude platform=Unix”
To create the unix only documentation your DITAVAL file would indicate “Exclude platform=windows Include platform=Unix”
Our last attribute is “PROPS”.
In the example, you’ll see it indicates that the database can be installed on a desktop or laptop device. Suppose you have a customer that thinks desktops are history and only wants references to the laptop, and the offending word is in your short description. Do you want to create two short descriptions and label them with the appropriate attributes? Instead, you can attack the offending words directly using the ph tag and the props attribute, as seen in the example.
Note that the word “or” is also tagged; if it isn’t excluded, the sentence will not make sense.
To exclude the desktop reference your DITAVAL file will indicate “exclude Props = Desktop”.
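Put together, the flagged short description and the matching DITAVAL rule might look like this; the attribute and the value “desktop” follow the slide’s wording and are illustrative:

```xml
<shortdesc>Install the database on a
  <ph props="desktop">desktop</ph><ph props="desktop"> or </ph>laptop
  device.</shortdesc>

<!-- DITAVAL rule for the laptop-only customer: -->
<val>
  <prop att="props" val="desktop" action="exclude"/>
</val>
```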
These three attributes are just a sample of the attributes available within DITA.
We have now laid out for you a basic roadmap to a successful DITA implementation, but we want to restate that the path that we have laid out is simply a depiction of a typical path from start to finish. The specific steps and the order of the steps will change for every company. If you are unsure of the order that makes sense for your company, or if your DITA project is stuck in one of these steps, give us a call. This is what we do. JANA has helped many companies in their efforts to build a DITA library by providing the expertise and the manpower required to see a project through to completion. As a respected provider of technical documentation services for over 40 years, JANA has extensive resources that it can bring to bear on your project – no matter the size.
If you have questions for the panel, get them ready now because we are about to move on to the Q & A session, but before we do, we also want to mention the importance of conducting a DITA pilot project.
A small-scale DITA pilot project will provide course direction and help you decide how to focus your implementation plan. It will also help you determine where your strengths and weaknesses lie, and will increase your odds of a successful large-scale DITA implementation.
CLICK
At this time we’ll answer the questions submitted during today’s webinar and if we have time remaining we’ll answer any additional questions as well.
(CLICK when q/a is over)
Great, thanks Joe. So, that concludes our webinar today, we thank you for your questions and participation. If you would like to speak to us in more detail or know more about JANA’s DITA consulting solutions please don’t hesitate to reach out to us. Remember, JANA has the expertise and the resources necessary to support a project of any size in the authoring, conversion, management, and publishing of quality content, with over 10 years of experience implementing the DITA standard.
We hope that each of you have a great remainder of the day and thank you again for attending this Webinar. Look for a follow-up email later today.