This document discusses Gunnar Krause's presentation on using an agile approach to specialized DITA-based authoring at Qimonda. It describes Qimonda's motivation to migrate its documentation to a topic-based structure using DITA, including workflow improvements and potential cost savings. It also outlines Krause's blend of strategies for chunking content into topics, the specialized DITA schemas developed at Qimonda, and why an agile project management approach was taken for the migration.
This document summarizes a presentation given by G.H. Krause of Qimonda on chunking content with confidence. Qimonda is a top 3 global DRAM manufacturer with over 10,000 employees globally. Their technical documentation department produces over 1,000 documents and 150 releases per month. Krause discusses introducing a DITA-based single-sourcing system to improve reuse and parallel work. He covers strategies for determining the right chunk size, including reading literature, balancing top-down and bottom-up approaches, and allowing variations while maintaining consistency. The goal is to optimize reuse, flexibility and reduce redundancy and complexity in documentation.
Shigeyuki Kobayashi has over 21 years of experience in software development and 12 years in storage product marketing. He has a proven track record of technical marketing skills like new product launches and customer presentations. His most recent experience includes less than one year working in data center operations transition and ITIL-based operations for a global pharmaceutical company in Japan.
The main focus of my talk is how well DITA works in an Agile environment for technical publication, producing simple, crisp, and lean user documentation per sprint. Just as programmers employ Agile techniques to improve their deliverables, task-oriented documentation using DITA helps technical writers create user deliverables that allow for continuous feedback and improve the documentation's velocity and adaptability to change, even extreme change.
The State of the Technical Communication Industry: tcworld India 2013 Keynote, by Scott Abel
This keynote address was delivered February 21, 2013, at the tcworld India conference in Bangalore, India. The presentation provides a summary of the 2012 Technical Communication Industry Benchmarking Survey produced by The Content Wrangler. The survey aimed to capture the current methods, standards, and tools used by, as well as the future plans of, the technical documentation and training departments of large, global, content-heavy organizations.
Faster than Agile - Proposal for LavaCon 2015, by Jang F.M. Graat
This is my second proposal for the LavaCon conference to be held in New Orleans in October 2015. The first version of this talk will be delivered at DITA/CMS NA in Chicago in April.
Agile Content Development and the IXIASOFT DITA CMS, by IXIASOFT
Keith Schengili-Roberts, IXIASOFT DITA Information Architect, reviews the benefits of working with agile content development and the IXIASOFT DITA CMS.
The document discusses adopting the DITA framework for technical documentation at Qimonda, a DRAM manufacturer. It describes Qimonda's current documentation process and tools, and the increasing demands that would require more resources if unchanged. The author argues that moving to a DITA-based system using single-sourcing and a content management system could help by allowing topic-based editing and automatic delivery building. However, success requires understanding adopter categories and marketing the new system to colleagues across the "technology adoption chasm".
In an ideal world, all documentation content would come in one format (and that format should be DITA). But let's face it, content produced in a company is diverse and comes in many forms and sizes.
So how can we single-source everything? Can we integrate contributors who use formats like language-specific API documentation, HTML, Markdown, or even Excel spreadsheets or database tables into a DITA-based workflow? Could we convert everything to DITA on the fly? Could we use a magic glass to perceive various data sources as DITA?
We may try to convince everybody to produce DITA content, but this is not always possible. Instead, we can accept these diverse data formats but look at them as different ways of encoding DITA. If we put the right decoder in place, we get our DITA content back.
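The "decoder" idea can be sketched in a few lines: treat a foreign format as encoded DITA and emit topic XML from it. Below is a minimal, hypothetical Python example for Markdown; the function name and the mapping rules (first `#` heading becomes the title, remaining lines become paragraphs) are illustrative assumptions, not part of the presentation, and a real decoder would handle lists, code blocks, links, and so on.

```python
import re
from xml.sax.saxutils import escape

def markdown_to_dita_concept(md_text: str) -> str:
    """Decode a simple Markdown snippet into a DITA concept topic.

    Only handles a leading '# ' title and plain paragraphs; this is a
    sketch of the 'decoder' idea, not a complete converter.
    """
    title = "Untitled"
    body_paras = []
    for line in md_text.strip().splitlines():
        m = re.match(r"#\s+(.*)", line)
        if m and title == "Untitled":
            title = m.group(1)          # first heading becomes the topic title
        elif line.strip():
            body_paras.append(f"<p>{escape(line.strip())}</p>")
    # Derive an XML id from the title (non-word characters -> underscores)
    topic_id = re.sub(r"\W+", "_", title.lower()).strip("_")
    return (
        f'<concept id="{topic_id}">'
        f"<title>{escape(title)}</title>"
        f"<conbody>{''.join(body_paras)}</conbody>"
        f"</concept>"
    )
```

With a decoder like this per source format, the DITA toolchain downstream never needs to know the content started life as Markdown.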
The document discusses the major changes in TOGAF 9.2 including restructuring the framework and introducing the TOGAF Library. Key changes include enhancements to business architecture with new artifacts related to value streams and business capabilities, updated terms and definitions, and additional details in the ADM. Security architecture was also enhanced with its own guide. The presentation provides an overview of ITpreneurs' TOGAF training offerings and pathways for architects to become certified or take additional courses in related frameworks like DevOps and CCC.
This document discusses considerations for converting legacy data to the S1000D specification. It notes that a conversion involves changing content from one format to another to enable new capabilities. The document outlines analyzing source formats and content, designing a conversion system, planning the project, understanding specifications and customer needs, and considering return on investment. It emphasizes understanding resources, dependencies, outputs, and limitations when planning a conversion project.
The document summarizes a case study where a medical device manufacturer used Adobe FrameMaker 9 and DITA to reduce costs of multilingual documentation and translation. They created a single structured template that could publish documents in 27 languages by controlling formatting with XML attributes for elements. This allowed automatic publishing of manuals through processing instructions and saved chapters in native XML format. It reduced project time and costs by 25-50% compared to previous processes.
Sustainable XML for Publishing Applications: DITA Makes It Possible, by Scott Abel
Presented by Eliot Kimber at Documentation and Training East 2008, October 29-November 1, 2008, in Burlington, MA.
XML applications for publishers have largely failed to realize the full potential inherent in the technology. While larger publishers could make the investment necessary to realize a significant return on the use of XML technology, smaller enterprises simply could not, for a number of reasons, but fundamentally because the startup costs and ongoing costs of ownership were simply too high. The DITA standard fundamentally changes the equation, bringing several unique features that, together, serve to lower both the startup and ongoing costs, making the use of XML much more affordable for publishers than it has ever been before. At the same time, advances in supporting technologies important to publishers, such as improved support for XML in Adobe Creative Suite and Microsoft Office, powerful new XML search and retrieval systems such as MarkLogic, and a new generation of lower-cost XML editors, all serve to make the use of XML for publishing applications more attractive than it has ever been before.
Data Science in Production: Technologies That Drive Adoption of Data Science ..., by Nir Yungster
Critical to a data science team’s ability to drive impact is its effectiveness in incorporating its solutions into new or existing products. When collaborating with other engineering teams, and especially when solutions must operate at scale, technological choices can be critical factors in determining what type of outcome you'll have. We walk through strategies and specific technologies - Airflow, Docker, Kubernetes - that can help promote successful collaboration between data science and engineering.
DITA is an OASIS standard for modular content that can be assembled and published in many different ways. The full DITA standard provides powerful features for single-sourcing and structured authoring but can be intimidating for new adopters who require only a subset of those features.
The OASIS DITA Technical Committee is planning to define a lightweight DITA architecture to allow a broader range of authoring and publishing tools to support a useful subset of the full DITA standard.
This presentation provides a preview of the lightweight DITA proposal for DITA 1.3, including some example markup and possible architectural approaches.
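For orientation, a topic in a lightweight subset would look much like a pared-down DITA topic. The fragment below is an illustrative sketch only; the element choices shown are assumptions for the sake of example, not the committee's final design.

```xml
<topic id="install_overview">
  <title>Installing the driver</title>
  <body>
    <p>Download the package and run the installer.</p>
    <ul>
      <li>Windows: run <ph>setup.exe</ph></li>
      <li>Linux: run <ph>install.sh</ph></li>
    </ul>
  </body>
</topic>
```

The appeal of such a subset is that a plain XML editor, or even a form-based tool, can author it without knowing the full DITA grammar.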
Case Study: ABAP Development Life Cycle and Governance at THE GLOBE AND MAIL ..., by Virtual Forge
Check out this much-noticed presentation held at the 2013 SAP TechEd Conference in Las Vegas. Attendees were pleased and excited by the content that was presented.
Drupal for Enterprises - Reducing the Total Cost of Software Ownership, by AiTi Education
Tom has experience in marketing, content strategy, and web development. He discusses the total cost of ownership (TCO) model, noting that upfront software costs are small compared to hidden lifetime costs like maintenance and replacement. TCO should be considered over the full product lifecycle. Drupal reduces enterprise TCO through its large community, modular architecture, and proven deployments at organizations like governments and universities. Drupal aims to provide flexibility, security, and integration at a lower overall cost than alternatives.
Drupal Enterprise Solutions Reduce Total Cost of Ownership (TCO), by Tom T
Tom has experience in marketing, content strategy, and web development. He discusses the total cost of ownership (TCO) model, noting that upfront software costs are small compared to hidden lifetime costs like maintenance and replacement. TCO should be considered over the full product lifecycle. Drupal reduces enterprise TCO through its large community, modular architecture, and proven deployments at governments and large organizations. Drupal helps control costs and adapt to changing needs over the long term.
This document discusses trends in engineering collaboration for heavy equipment machinery. It highlights challenges like globalization, complexity, and short time to market. A key trend is emerging global demand driving cost and quality pressures. This impacts businesses requiring global design, local manufacturing, and more innovative products. The document suggests collaborative, global engineering addressing local customer needs as a solution. It promotes product data management, document management, and change management tools for enabling effective engineering collaboration. Case studies show these tools increasing productivity, efficiency, and flexibility for customers.
Mark Niebergall presented on reducing technical debt by avoiding adding new debt, paying off existing debt through a repeating process, and tips on when to refactor code versus initiating a rewrite. He defined technical debt as the consequences of poor design and architecture, incurred either knowingly or inadvertently. Examples of debt include unused code, outdated technology, and overcomplicated code. Sources of debt include changes to libraries and frameworks, and developer inexperience. Paying off debt requires identifying issues, prioritizing work, and making improvements over time through consistent effort.
Always Be Deploying. How to make R great for machine learning in (not only) E..., by Wit Jakuczun
The document discusses best practices for deploying machine learning models and R code in a production environment. It advocates treating ML deployment like software deployment by using version control, continuous integration, standardized projects and dependency management. It also recommends abstracting ML processes with concepts like a feature store, model factory, model repository and scoring engine to support productionization. The goal is an "Always Be Deploying" policy where deployment starts early and ML solutions can be easily maintained and reproduced.
Ten Reasons Why Netezza Professionals Should Consider Greenplum, by VMware Tanzu
This webinar is for IT professionals who have devoted considerable time and effort growing their careers in and around the Netezza platform.
We’ll explore the architectural similarities and technical specifics of what makes the open source Greenplum Database a logical next step for those IT professionals wishing to leverage their MPP experience with a PostgreSQL-based database.
As the Netezza DBMS faces a significant end-of-support milestone, leveraging an open source, infrastructure-agnostic replacement that has a similar architecture will help avoid a costly migration to either a different architecture or another proprietary alternative.
Presenters:
Jacque Istok, Head of Data, Pivotal
Kelly Carrigan, Principal Consultant, EON Collective
Managing Big Data projects in a constantly changing environment - Rafał Zalew..., by GetInData
Watch the full talk given by our team at the Big Data Technology Warsaw Summit: https://www.youtube.com/watch?v=CBrq7z8ikaM
Big Data projects are nowadays one of a kind: they are not like the data warehousing initiatives of the old days, nor like cloud-native application projects, at least not yet. A variety of technologies, complicated architectures, and a rapidly changing landscape are just a few of the challenges that an IT department faces in such projects. Add the number of stakeholders from different departments involved, and the fact that a Big Data project is sometimes more like R&D with an unpredictable outcome, and you get a mix where the objectives can easily be lost. It is no surprise that up to 85% of Big Data projects were pure failures (Gartner 2016).
In this talk we will share our experience in planning and executing Big Data initiatives in organisations, with some use cases and good practices in mind.
Speakers:
Rafał Małanij
Rafał Zalewski
LinkedIn: https://www.linkedin.com/in/rafalzalewski/
___
Company:
GetInData is a company founded in 2014 by ex-Spotify data engineers. From day one our focus has been on Big Data projects. We bring together a group of the best and most experienced experts in Poland, working with cloud and open-source Big Data technologies to help companies build scalable data architectures and implement advanced analytics over large data sets.
Our experts have vast production experience in implementing Big Data projects for Polish as well as foreign companies, including, among others, Spotify, Play, Truecaller, Kcell, Acast, Allegro, ING, Agora, Synerise, StepStone, iZettle, and many others from the pharmaceutical, media, finance, and FMCG industries.
https://getindata.com
[Case Study] - Nuclear Power, DITA and FrameMaker: The How's and Why's, by Scott Abel
Presented by Thomas Aldous at Documentation and Training East 2008, October 29-November 1, 2008, in Burlington, MA.
This session is for anyone interested in learning how to manage a transition to specialized DITA, including content management system, editor, and publishing server issues and resolutions. As an added bonus, we will also convert a Word document to specialized DITA and edit the content in FrameMaker 8. There will be a question-and-answer period at the end of the session for both technical and project management issues.
Talk at GitLab Commit 2020: Join us to learn how we helped one of the largest financial services institutions in the world shape their cloud strategy using GitLab and Terraform. Starting on a cloud journey raises many questions around resource provisioning and management, security, compliance, how to give the team easy access to definitions, and how to keep everyone updated. As we know, the most reliable source of truth is the code, so the use of infrastructure as code paired with an inner-source process is a solid foundation.
Building Production Ready Search Pipelines with Spark and Milvus, by Zilliz
Spark is a widely used ETL tool for processing, indexing, and ingesting data into the serving stack for search. Milvus is a production-ready open-source vector database. In this talk we will show how to use Spark to process unstructured data to extract vector representations, and how to push the vectors to the Milvus vector database for search serving.
Programming Foundation Models with DSPy - Meetup Slides, by Zilliz
Prompting language models is hard, while programming language models is easy. In this talk, I will discuss the state-of-the-art framework DSPy for programming foundation models with its powerful optimizers and runtime constraint system.
More Related Content
Similar to Buzzword at Work - An Agile Approach to Spezialized DITA-based Authoring
The document discusses adopting the DITA framework for technical documentation at Qimonda, a DRAM manufacturer. It describes Qimonda's current documentation process and tools, and the increasing demands that would require more resources if unchanged. The author argues that moving to a DITA-based system using single-sourcing and a content management system could help by allowing topic-based editing and automatic delivery building. However, success requires understanding adopter categories and marketing the new system to colleagues across the "technology adoption chasm".
In an ideal world, all documentation content would come in one format (and that format should be DITA). But let's face it, content produced in a company is diverse and comes in many forms and sizes.
So how can we single source everything? Can we integrate contributors who use formats like language-specific API documentation, HTML, MarkDown or even Excel spreadsheets or database tables in a DITA-based workflow? Could we convert everything to DITA on the fly? Could we use a magic glass to perceive various data sources as DITA?
We may try to convince everybody to produce DITA content but this may not be always possible. Instead of that we can accept these diverse data formats but look at them as different ways of encoding DITA. So if we put in place the right decoder we will get back our DITA content.
The document discusses the major changes in TOGAF 9.2 including restructuring the framework and introducing the TOGAF Library. Key changes include enhancements to business architecture with new artifacts related to value streams and business capabilities, updated terms and definitions, and additional details in the ADM. Security architecture was also enhanced with its own guide. The presentation provides an overview of ITpreneurs' TOGAF training offerings and pathways for architects to become certified or take additional courses in related frameworks like DevOps and CCC.
This document discusses considerations for converting legacy data to the S1000D specification. It notes that a conversion involves changing content from one format to another to enable new capabilities. The document outlines analyzing source formats and content, designing a conversion system, planning the project, understanding specifications and customer needs, and considering return on investment. It emphasizes understanding resources, dependencies, outputs, and limitations when planning a conversion project.
The document summarizes a case study where a medical device manufacturer used Adobe FrameMaker 9 and DITA to reduce costs of multilingual documentation and translation. They created a single structured template that could publish documents in 27 languages by controlling formatting with XML attributes for elements. This allowed automatic publishing of manuals through processing instructions and saved chapters in native XML format. It reduced project time and costs by 25-50% compared to previous processes.
Sustainable XML for Publishing Applications: DITA Makes It PossibleScott Abel
Presented by Eliot Kimber at Documentation and Training East 2008,
October 29-November 1, 2008 in Burlington, MA.
XML applications for publishers have largely failed to realize the
full potential inherent in the technology. While larger publishers
could make the investment necessary to realize significant return on
the use of XML technology, smaller enterprises simply could not, for a
number of reasons, but fundamentally because the startup costs and
ongoing costs of ownership were simply too high. The DITA standard
fundamentally changes the equation, bringing several unique features
that, together, serve to lower both the startup cost and ongoing
costs, making the use of XML for publishers much more affordable than
it ever has before. At the same time, advances in supporting
technologies important to Publishers, such as improved support for XML
in Adobe Creative Suite and Microsoft Office, powerful new XML search
and retrieval systems such as MarkLogic, and a new generation of lower-
cost XML editors, as serve to make the use of XML for Publishing
applications more attractive than it ever has been before.
Data Science in Production: Technologies That Drive Adoption of Data Science ...Nir Yungster
Critical to a data science team’s ability to drive impact is its effectiveness in incorporating its solutions into new or existing products. When collaborating with other engineering teams, and especially when solutions must operate at scale, technological choices can be critical factors in determining what type of outcome you'll have. We walk through strategies and specific technologies - Airflow, Docker, Kubernetes - that can help promote successful collaboration between data science and engineering.
DITA is an OASIS standard for modular content that can be assembled and published in many different ways. The full DITA standard provides powerful features for single-sourcing and structured authoring but can be intimidating for new adopters who require only a subset of those features.
The OASIS DITA Technical Committee is planning to define a lightweight DITA architecture to allow a broader range of authoring and publishing tools to support a useful subset of the full DITA standard.
This presentation provides a preview of the lightweight DITA proposal for DITA 1.3, including some example markup and possible architectural approaches.
Case Study: ABAP Development Life Cycle and Governance at THE GLOBE AND MAIL ...Virtual Forge
Check out this much-noticed presentation held at the 2013 SAPTechEd Conference in Las Vegas. Attendees were pleased and excited by the content that was presented.
Drupal cho doanh nghiệp - cắt giảm tổng chi phí sở hữu phần mềmAiTi Education
Tom has experience in marketing, content strategy, and web development. He discusses the total cost of ownership (TCO) model, noting that upfront software costs are small compared to hidden lifetime costs like maintenance and replacement. TCO should be considered over the full product lifecycle. Drupal reduces enterprise TCO through its large community, modular architecture, and proven deployments at organizations like governments and universities. Drupal aims to provide flexibility, security, and integration at a lower overall cost than alternatives.
Drupal enterprise solutions reduce total cost of ownership (tco)Tom T
Tom has experience in marketing, content strategy, and web development. He discusses the total cost of ownership (TCO) model, noting that upfront software costs are small compared to hidden lifetime costs like maintenance and replacement. TCO should be considered over the full product lifecycle. Drupal reduces enterprise TCO through its large community, modular architecture, and proven deployments at governments and large organizations. Drupal helps control costs and adapt to changing needs over the long term.
This document discusses trends in engineering collaboration for heavy equipment machinery. It highlights challenges like globalization, complexity, and short time to market. A key trend is emerging global demand driving cost and quality pressures. This impacts businesses requiring global design, local manufacturing, and more innovative products. The document suggests collaborative, global engineering addressing local customer needs as a solution. It promotes product data management, document management, and change management tools for enabling effective engineering collaboration. Case studies show these tools increasing productivity, efficiency, and flexibility for customers.
Mark Niebergall presented on reducing technical debt by avoiding adding new debt, paying off existing debt through a repeating process, and tips on when to refactor code versus initiating a rewrite. He defined technical debt as consequences of poor design and architecture that are incurred knowingly and inadvertently. Examples of debt include unused code, outdated technology, and overcomplicated code. Sources of debt include changes to libraries, frameworks, and inexperience. Paying off debt requires identifying issues, prioritizing work, and making improvements over time through consistent effort.
Always Be Deploying. How to make R great for machine learning in (not only) E...Wit Jakuczun
The document discusses best practices for deploying machine learning models and R code in a production environment. It advocates treating ML deployment like software deployment by using version control, continuous integration, standardized projects and dependency management. It also recommends abstracting ML processes with concepts like a feature store, model factory, model repository and scoring engine to support productionization. The goal is an "Always Be Deploying" policy where deployment starts early and ML solutions can be easily maintained and reproduced.
Ten Reasons Why Netezza Professionals Should Consider GreenplumVMware Tanzu
This webinar is for IT professionals who have devoted considerable time and effort growing their careers in and around the Netezza platform.
We’ll explore the architectural similarities and technical specifics of what makes the open source Greenplum Database a logical next step for those IT professionals wishing to leverage their MPP experience with a PostgreSQL-based database.
As the Netezza DBMS faces a significant end-of-support milestone, leveraging an open source, infrastructure-agnostic replacement that has a similar architecture will help avoid a costly migration to either a different architecture or another proprietary alternative.
Presenters :
Jacque Istok, Head of Data, Pivotal
Kelly Carrigan, Principal Consultant, EON Collective
Managing Big Data projects in a constantly changing environment - Rafał Zalew...GetInData
Watch our full performance given by our team during the Big Data Technology Warsaw Summit: https://www.youtube.com/watch?v=CBrq7z8ikaM
The nature of Big Data projects are nowadays one of its kind - they are not like the data warehousing initiatives in the old days, nor like cloud native applications projects, at least not yet. Variety of technologies, complicated architectures and rapidly changing landscape are just a few challenges that the IT Department is facing in such projects. When you add the number of stakeholders from different departments involved and that Big Data project is sometimes more like an R&D with unpredictable outcome, this makes a mix where the objectives can be easily lost. It is not a surprise that up to 85% of Big Data projects were pure failures (Gartner 2016).
In this talk we will share our experience in planning and executing Big Data initiatives in the organisations, with some use cases and good practices in mind
Watch our webinar here: https://www.youtube.com/watch?v=CBrq7z8ikaM
Speakers:
Rafał Małanij
Rafał Zalewski
Linkedin: https://www.linkedin.com/in/rafalzalewski/
___
Company:
Getindata is a company founded in 2014 by ex-Spotify data engineers. From day one our focus has been on Big Data projects. We bring together a group of best and most experienced experts in Poland, working with cloud and open-source Big Data technologies to help companies build scalable data architectures and implement advanced analytics over large data sets.
Our experts have vast production experience in implementing Big Data projects for Polish as well as foreign companies including i.a. Spotify, Play, Truecaller, Kcell, Acast, Allegro, ING, Agora, Synerise, StepStone, iZettle and many others from the pharmaceutical, media, finance and FMCG industries.
https://getindata.com
[Case Study] - Nuclear Power, DITA and FrameMaker: The How's and Why'sScott Abel
Presented by Thomas Aldous at Documentation and Training East 2008,
October 29-November 1 in Burlington, MA.
This session is for anyone that is interested in learning how to
manage a transition to Specialized DITA including Content Management
Systems, Editors and Publishing Server issues and resolutions. As a
added bonus, we will also convert an Word Document To Specialized DITA
and edit the content is FrameMaker 8. There will be a question and
answer period at the end of the session for both technical and project
management issues.
Talk on the GitLab Commit 2020: Join us to learn how we helped one of the largest financial services institutions in the world shape their cloud strategy using GitLab and Terraform. Starting on a cloud journey brings so many questions around resource provisioning & management, security, compliance, how to enable the team with easy access to definitions, and keep everyone updated. As we know, the most reliable source of truth is the code, so the use of infrastructure as code paired with an inner-source process is a solid foundation.
Similar to Buzzword at Work - An Agile Approach to Spezialized DITA-based Authoring (20)
Building Production Ready Search Pipelines with Spark and MilvusZilliz
Spark is a widely used ETL tool for processing, indexing, and ingesting data into a serving stack for search. Milvus is a production-ready open-source vector database. In this talk we will show how to use Spark to process unstructured data, extract vector representations, and push the vectors to the Milvus vector database for search serving.
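To make the serving side of such a pipeline concrete, here is a toy sketch of what vector search does once embeddings have been extracted and pushed to the database. The brute-force cosine scan, the three-dimensional vectors, and the `doc_*` names are all illustrative; in production, Milvus replaces this scan with approximate nearest-neighbor indexes over millions of vectors.

```python
import math

def cosine(a, b):
    """Cosine similarity between two dense vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def search(index, query, top_k=2):
    # Brute-force scan over every stored vector; a vector database
    # does this with ANN indexes instead of a full scan.
    scored = sorted(index.items(), key=lambda kv: cosine(kv[1], query), reverse=True)
    return [doc_id for doc_id, _ in scored[:top_k]]

# Pretend these embeddings were produced by a Spark job upstream.
index = {
    "doc_a": [1.0, 0.0, 0.0],
    "doc_b": [0.9, 0.1, 0.0],
    "doc_c": [0.0, 1.0, 0.0],
}
print(search(index, [1.0, 0.05, 0.0]))  # doc_a and doc_b rank highest
```

The split of responsibilities the talk describes maps onto this sketch: Spark owns everything above the `index` dict (extraction and embedding), the database owns `search`.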
Programming Foundation Models with DSPy - Meetup Slides (Zilliz)
Prompting language models is hard, while programming language models is easy. In this talk, I will discuss DSPy, a state-of-the-art framework for programming foundation models with powerful optimizers and a runtime constraint system.
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024 (Neo4j)
Neha Bajwa, Vice President of Product Marketing, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
TrustArc Webinar - 2024 Global Privacy Survey (TrustArc)
How does your privacy program stack up against your peers? What challenges are privacy teams tackling and prioritizing in 2024?
In the fifth annual Global Privacy Benchmarks Survey, we asked over 1,800 global privacy professionals and business executives to share their perspectives on the current state of privacy inside and outside of their organizations. This year’s report focused on emerging areas of importance for privacy and compliance professionals, including considerations and implications of Artificial Intelligence (AI) technologies, building brand trust, and different approaches for achieving higher privacy competence scores.
See how organizational priorities and strategic approaches to data security and privacy are evolving around the globe.
This webinar will review:
- The top 10 privacy insights from the fifth annual Global Privacy Benchmarks Survey
- The top challenges for privacy leaders, practitioners, and organizations in 2024
- Key themes to consider in developing and maintaining your privacy program
Let's Integrate MuleSoft RPA, COMPOSER, APM with AWS IDP along with Slack (shyamraj55)
Discover the seamless integration of RPA (Robotic Process Automation), COMPOSER, and APM with AWS IDP enhanced with Slack notifications. Explore how these technologies converge to streamline workflows, optimize performance, and ensure secure access, all while leveraging the power of AWS IDP and real-time communication via Slack notifications.
UiPath Test Automation using UiPath Test Suite series, part 5 (DianaGray10)
Welcome to UiPath Test Automation using UiPath Test Suite series, part 5. In this session, we will cover CI/CD with DevOps.
Topics covered:
CI/CD within UiPath
End-to-end overview of the CI/CD pipeline with Azure DevOps
Speaker:
Lyndsey Byblow, Test Suite Sales Engineer @ UiPath, Inc.
GraphRAG for Life Science to increase LLM accuracy (Tomaz Bratanic)
GraphRAG for the life science domain, where you retrieve information from biomedical knowledge graphs using LLMs to increase the accuracy and performance of generated answers.
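The retrieval half of this idea can be sketched in a few lines: gather the facts within a few hops of an entity in the graph and turn them into grounding sentences for the LLM prompt. The triples, the entity names, and the hop-based traversal below are all a toy stand-in; a real deployment would run Cypher queries against a biomedical graph database rather than scan an in-memory list.

```python
# Toy biomedical triples; a real system would query a graph database.
TRIPLES = [
    ("aspirin", "inhibits", "COX-1"),
    ("aspirin", "treats", "inflammation"),
    ("ibuprofen", "inhibits", "COX-2"),
    ("COX-1", "produces", "prostaglandins"),
]

def neighbors(entity, triples, hops=1):
    """Collect facts within `hops` edges of the seed entity."""
    seen, frontier, facts = {entity}, {entity}, []
    for _ in range(hops):
        nxt = set()
        for s, p, o in triples:
            if s in frontier or o in frontier:
                facts.append((s, p, o))
                nxt.update({s, o} - seen)
        seen |= nxt
        frontier = nxt
    return facts

def build_context(entity):
    # These sentences would be prepended to the LLM prompt as grounding.
    return [f"{s} {p} {o}." for s, p, o in neighbors(entity, TRIPLES)]

print(build_context("aspirin"))
```

Because the answer is generated from explicitly retrieved graph facts rather than the model's parametric memory, the LLM has less room to hallucinate relationships, which is the accuracy gain the talk is about.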
“An Outlook of the Ongoing and Future Relationship between Blockchain Technologies and Process-aware Information Systems.” Invited talk at the joint workshop on Blockchain for Information Systems (BC4IS) and Blockchain for Trusted Data Sharing (B4TDS), co-located with the 36th International Conference on Advanced Information Systems Engineering (CAiSE), 3 June 2024, Limassol, Cyprus.
Full-RAG: A modern architecture for hyper-personalization (Zilliz)
Mike Del Balso, CEO & Co-Founder at Tecton, presents "Full RAG," a novel approach to AI recommendation systems, aiming to push beyond the limitations of traditional models through a deep integration of contextual insights and real-time data, leveraging the Retrieval-Augmented Generation architecture. This talk will outline Full RAG's potential to significantly enhance personalization, address engineering challenges such as data management and model training, and introduce data enrichment with reranking as a key solution. Attendees will gain crucial insights into the importance of hyperpersonalization in AI, the capabilities of Full RAG for advanced personalization, and strategies for managing complex data integrations for deploying cutting-edge AI solutions.
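The "data enrichment with reranking" step the abstract mentions can be illustrated with a minimal sketch: blend a static retrieval score with a real-time context signal before ordering results. The linear blend, the `weight` parameter, and the item/score names below are assumptions for illustration, not Tecton's actual Full-RAG implementation.

```python
def rerank(candidates, context, weight=0.5):
    """Blend a static retrieval score with a real-time context signal.

    `candidates` maps item -> retrieval score; `context` maps item -> a
    freshness/affinity score derived from recent user events. The simple
    linear blend is illustrative only.
    """
    blended = {
        item: (1 - weight) * score + weight * context.get(item, 0.0)
        for item, score in candidates.items()
    }
    return sorted(blended, key=blended.get, reverse=True)

retrieved = {"item_a": 0.9, "item_b": 0.7, "item_c": 0.6}
recent = {"item_c": 1.0}  # the user just interacted with something similar
print(rerank(retrieved, recent))  # item_c jumps to the front
```

The point of the design is that retrieval stays cheap and cacheable while personalization happens in the rerank pass, where fresh behavioral data can be applied per request.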
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor... (SOFTTECHHUB)
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdf (Paige Cruz)
Monitoring and observability aren’t traditionally found in software curriculums, and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is part of our current company’s observability stack.
While the dev and ops silo continues to crumble, many organizations still relegate monitoring and observability to the purview of ops, infra, and SRE teams. This is a mistake: achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party, and will share these foundational concepts to build on.
Unlocking Productivity: Leveraging the Potential of Copilot in Microsoft 365, a presentation by Christoforos Vlachos, Senior Solutions Manager – Modern Workplace, Uni Systems
HCL Notes and Domino License Cost Reduction in the World of DLAU (panagenda)
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-und-domino-lizenzkostenreduzierung-in-der-welt-von-dlau/
DLAU and the licenses under the CCB and CCX models have been a hot topic in the HCL community since last year. As a Notes or Domino customer, you may be struggling with unexpectedly high user counts and license fees. You may be wondering how this new kind of licensing works and what benefits it brings you. Above all, you surely want to stay within your budget and save costs wherever possible. We understand that, and we want to help!
We explain how to resolve common configuration problems that can cause more users to be counted than necessary, and how to identify and remove superfluous or unused accounts to save money. There are also some approaches that can lead to unnecessary expenses, for example when a person document is used instead of a mail-in database for shared mailboxes. We will show you such cases and their solutions. And of course we will explain the new license model.
Join this webinar, in which HCL Ambassador Marc Thomas and guest speaker Franz Walder introduce you to this new world. It gives you the tools and the know-how to keep an overview. You will be able to reduce your costs through an optimized Domino configuration and keep them low in the future.
Topics covered:
- Reducing license costs by finding and fixing misconfigurations and superfluous accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how best to use it
- Tips for common problem areas, such as team mailboxes, functional/test users, etc.
- Real-world examples and best practices to apply immediately
Best 20 SEO Techniques To Improve Website Visibility In SERP (Pixlogix Infotech)
Boost your website's visibility with proven SEO techniques! Our latest blog dives into essential strategies to enhance your online presence, increase traffic, and rank higher on search engines. From keyword optimization to quality content creation, learn how to make your site stand out in the crowded digital landscape. Discover actionable tips and expert insights to elevate your SEO game.
Driving Business Innovation: Latest Generative AI Advancements & Success Story (Safe Software)
Are you ready to revolutionize how you handle data? Join us for a webinar where we’ll bring you up to speed with the latest advancements in Generative AI technology and discover how leveraging FME with tools from giants like Google Gemini, Amazon, and Microsoft OpenAI can supercharge your workflow efficiency.
During the hour, we’ll take you through:
Guest Speaker Segment with Hannah Barrington: Dive into the world of dynamic real estate marketing with Hannah, the Marketing Manager at Workspace Group. Hear firsthand how their team generates engaging descriptions for thousands of office units by integrating diverse data sources—from PDF floorplans to web pages—using FME transformers, like OpenAIVisionConnector and AnthropicVisionConnector. This use case will show you how GenAI can streamline content creation for marketing across the board.
Ollama Use Case: Learn how Scenario Specialist Dmitri Bagh has utilized Ollama within FME to input data, create custom models, and enhance security protocols. This segment will include demos to illustrate the full capabilities of FME in AI-driven processes.
Custom AI Models: Discover how to leverage FME to build personalized AI models using your data. Whether it’s populating a model with local data for added security or integrating public AI tools, find out how FME facilitates a versatile and secure approach to AI.
We’ll wrap up with a live Q&A session where you can engage with our experts on your specific use cases, and learn more about optimizing your data workflows with AI.
This webinar is ideal for professionals seeking to harness the power of AI within their data management systems while ensuring high levels of customization and security. Whether you're a novice or an expert, gain actionable insights and strategies to elevate your data processes. Join us to see how FME and AI can revolutionize how you work with data!