The document discusses the National Library of Australia's approach to digital preservation. It addresses the various types of digital materials in the library's collections, preservation responsibilities, required preservation processes, and approaches to prioritizing preservation treatment. It describes how understanding these areas led the library to develop systems for preservation assessment and reporting to help manage risks to digital content over time. The goal is to maintain long-term access to content while addressing different levels of complexity, formats, and preservation needs across collections.
Presented by Sarah Grimm (Wisconsin Historical Society) and Emily Pfotenhauer (WiLS) for the Wisconsin Association of Academic Librarians (WAAL) conference, Elkhart Lake, Wisconsin, April 25, 2013. Content based on Modules 1 & 2 of the Digital Preservation Outreach and Education (DPOE) Baseline Digital Preservation Curriculum developed by the Library of Congress.
Presented by Sarah Grimm (Wisconsin Historical Society) and Emily Pfotenhauer (WiLS) for the WiLSWorld conference, Madison, Wisconsin, July 24, 2013. Content based on Modules 1 & 2 of the Digital Preservation Outreach and Education (DPOE) Baseline Digital Preservation Curriculum developed by the Library of Congress.
This presentation, delivered at CNI 2012, summarizes lessons learned from trial audits of several production distributed digital preservation networks. These audits were conducted using the open source SafeArchive system, which enables automated auditing of a selection of TRAC criteria related to replication and storage. An analysis of the trial audits demonstrates the complexities of auditing modern replicated storage networks and reveals common gaps between archival policy and practice. Recommendations for closing these gaps are discussed, as are extensions that have been added to the SafeArchive system to mitigate risks in distributed digital preservation (DDP).
Digital Preservation Planning: Just Do It! (AkLA 2014)
Presenters: Valarie Kingsland, Kristine Bunnell, Lisa C. Krynicki, Neva Reece, and Rachel Seale
(Organized and moderated by Valarie Kingsland)
A Library of Congress Digital Preservation Outreach and Education (DPOE) Train-the-Trainer workshop was held at the Elmer E. Rasmuson Library at UAF, where participants from around the State of Alaska completed an intensive training program to learn how to present the Library of Congress curriculum in order to inform archives, libraries, museums, and other institutions or organizations about how to develop a digital preservation plan. Join us to discover what Alaska DPOE Trainers have to offer in this fast-paced introduction to concepts and stages of digital preservation that can be applied to your organization, workplace, or your personal digital environment. Start planning today!
http://akla.org/anchorage2014/presentation/digital-preservation-planning-just-do-it/
Integrity of assets and metadata affects asset analytics, production, usage rights, and the story of the asset from creation to dissemination. Because of the transient and fragile nature of electronic records, without Digital Preservation, your metadata, your database, your digital asset management system and your assets mean very little.
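Fixity checking is one concrete way to guard the integrity described above: record a cryptographic checksum when an asset enters the system, then periodically recompute and compare it. A minimal sketch in Python (the function names here are illustrative, not drawn from any particular digital asset management system):

```python
import hashlib
from pathlib import Path

def sha256_fixity(path: Path, chunk_size: int = 65536) -> str:
    """Compute a SHA-256 fixity value, reading the file in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_fixity(path: Path, recorded: str) -> bool:
    """Re-compute the checksum and compare it against the recorded value."""
    return sha256_fixity(path) == recorded
```

Recording such values alongside descriptive metadata lets later audits detect silent corruption before the story of the asset is lost.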
Preserving repository content: practical steps for repository managers, by Mig... (JISC KeepIt project)
The JISC-funded KeepIt project is working with a series of different types of digital repository to enable the participating repository managers to formulate practical and achievable preservation plans. From the point of view of the repository manager, this presentation summarises the activities of the KeepIt project, describes the impact that the project has had on the participating repositories, and suggests seven steps to preservation readiness that other repository managers might take. The presentation was first given at the international Open Repositories 2010 conference, held in Madrid in July 2010. For more updates see the project blog http://blogs.ecs.soton.ac.uk/keepit/
Track 4. New ways of publishing and scientific communication: electronic editions, digital educational resources
Authors: Laerte Silva Júnior; Maria Manuel Borges
https://youtu.be/nRRCFo2NLoM
This presentation discusses what digital ‘stuff’ the National Library of Australia is responsible for and explores some of the main issues regarding digital preservation of this ‘stuff’. It was delivered at the New South Wales State Library on February 15, 2011, by David Pearson.
This presentation provides an overview of issues in digital preservation. It was delivered during the joint DPE/Planets/CASPAR/nestor training event, ‘The Preservation challenge: basic concepts and practical applications’ (Barcelona, March 2009).
ArchivesSpace: Building a Next-Generation Archives Management Tool (Mark Matienzo)
Presentation at the Digital Library Federation Forum, October 31, 2011, by Katherine Kott, ArchivesSpace Development Manager, and Mark A. Matienzo, ArchivesSpace Technical Architect.
Digital Preservation is the focus of a three-part webinar series that will help you preserve your digital content. Sponsored by the Nebraska State Historical Society and the Nebraska Library Commission, these webinars will connect you to Library of Congress training modules. The LC’s Digital Preservation Outreach and Education (DPOE) program simplifies the complex world of digital preservation into six task modules: inventory, select, storage, protect, manage, and provide.
The February 6 webinar will focus on the Inventory and Select Modules: The first step in digital preservation is identifying what types of digital content need to be preserved. Learn the importance of conducting and maintaining an inventory of your digital content and how that inventory will assist you in setting priorities and selecting what should be preserved.
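As a rough illustration, the inventory step can be as simple as walking a directory tree and recording each file's path, size, modification date, and checksum. The CSV layout below is an assumption chosen for this sketch, not part of the DPOE curriculum:

```python
import csv
import hashlib
from datetime import datetime, timezone
from pathlib import Path

def inventory_directory(root: Path, out_csv: Path) -> int:
    """Write a CSV inventory (path, size, modified date, SHA-256)
    of every file under root; return the number of files recorded."""
    count = 0
    with out_csv.open("w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["path", "bytes", "modified_utc", "sha256"])
        for p in sorted(root.rglob("*")):
            if p.is_file():
                stat = p.stat()
                modified = datetime.fromtimestamp(stat.st_mtime, tz=timezone.utc)
                digest = hashlib.sha256(p.read_bytes()).hexdigest()
                writer.writerow([str(p.relative_to(root)), stat.st_size,
                                 modified.isoformat(), digest])
                count += 1
    return count
```

Even a simple listing like this gives an institution the raw material for setting priorities and selecting what should be preserved.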
All three webinars will be presented by Karen Keehr, Curator of Photographs at the Nebraska State Historical Society. Karen represented Nebraska at an intensive week-long DPOE training workshop this summer. These webinars are the first in a series of training opportunities for libraries, archives and museums that will be presented in 2013-14 as part of the newly-formed Husker Heritage NEtwork, funded in part by the Institute of Museum and Library Services. To find out more about future offerings, Nebraska’s statewide collections preservation plan and more resources, go to www.nebraskahistory.org/connect.
NCompass Live - February 6, 2013
http://nlc.nebraska.gov/ncompasslive/
Workshop presented at the Wisconsin Conference for Local History and Historic Preservation, Wisconsin Rapids, October 11, 2013. Presenters: Sarah Grimm, Electronic Records Archivist, Wisconsin Historical Society and Emily Pfotenhauer, Recollection Wisconsin Program Manager, WiLS.
Rebecca Grant - Archiving and Digital Preservation (Figshare Fest)
Presentation given by Rebecca Grant, Digital Archivist with the Digital Repository of Ireland, as part of a workshop on Digital Archiving and Digital Preservation held during Figshare Fest in London, May 12, 2016. Figshare is an online digital repository where researchers can preserve and share their research outputs, including figures, datasets, images, and videos. Its annual Figshare Fest is a chance to gather institutional clients, advocates and friends to talk about open research.
The March 6 webinar presents the Manage and Provide Modules: Managing your digital content is an active and ongoing process. Learn how planning and policies are keys to digital preservation. With your digital content safely stored and preserved, how do you provide access to your patrons? This final module will address the issues of delivering your content in user-friendly, long-term ways.
NCompass Live - March 6, 2013.
http://nlc.nebraska.gov/ncompasslive/
What goes where? Bringing a new repository online at the Ohio State Universit... (Emily Frieda Shaw)
A presentation delivered on June 28, 2015, to the Digital Preservation Interest Group, part of the Preservation and Reformatting Section of the Association for Library Collections and Technical Services, which is in turn part of the American Library Association.
Like most libraries, the Ohio State University Libraries did not enter the digital library sphere with clear policies and a unified, interoperable infrastructure for managing all of our digital collections. The Libraries has a long-standing commitment to making our unique collections accessible to the campus and global communities and maintains an expertly managed and curated Institutional Repository (the Knowledge Bank). But for more than a decade, OSU’s digital collections developed in response to the requirements of specific projects, resulting in a fragmented infrastructure that is difficult to maintain and is ultimately ill-suited to long-term preservation and sharing on the global scale to which we aspire.
Thus, for the past several years, the OSU Libraries has been investing heavily in planning and developing a robust repository infrastructure to enhance access, management and preservation of digital collections of all types. As our Fedora repository comes online, a team of colleagues from across the organization is developing policies and decision-making criteria for reappraising digital assets that currently exist in a variety of legacy systems and servers with widely variable metadata, and creating prioritized workflows for preparing and ingesting content into the new repository infrastructure. This presentation will give an overview of our planning process and share some of the workflow documentation currently under development.
1. The Adventures of Digi:
Ideas, Requirements
and Reality
David Pearson
National Library of
Australia
Future Perfect 2012
Digi
By Imogene Pearson (7 years)
(March 2012)
3. From a preservation point of view, the Library’s digital collections present:
• A mix of materials needing to be kept in perpetuity, along with materials that can be
discarded after specified periods or events;
• Mixed levels of complexity in terms of object structure, relationships and dependencies;
• Mixed levels of intellectual control;
• A wide range of file formats (and carrier formats);
• Different levels of complexity in preservation planning and processing;
• Different timetables for preservation action;
• A need for different preservation approaches, often at different scales; and
• A need for recurring (and possibly changing) preservation action cycles over time, using a
changing suite of tools.
6. Ecology
Ecology or Layers of consciousness for the need for digital preservation intervention
(Given some need to access content over time)
Unaware:
• I am unaware if I have any digital content; or
• I am unaware if I may have a problem accessing any of my digital content.
Aware - no response:
• I don’t think that I have a problem accessing any of my digital content;
• I recognise that I have a problem accessing some of my digital content;
• I recognise that I have a problem accessing some of my digital content. However, the
problem is not my problem; or
• I recognise that I have a problem, but have no response in place - not even a limited one.
Aware – taking some action:
• I accept that I may have a problem accessing some of my digital content. I am taking limited
actions to manage this problem; or
• I accept that I may have a problem accessing some of my digital content. The preservation
mandate is a part of my enterprise or system ecology.
7. Another way of looking at it might be:
David Pearson 2012
8. 3.) What we have come to understand over
time.
http://www.motifake.com/79532 via Google Images
9. Preservation responsibilities:
Preservation of the Library's digital collections involves three main goals:
• Maintaining access to reliable data at bit-stream level;
• Maintaining access to content encoded in the bit streams; and
• Maintaining access to the intended and available meaning of the content.
While specific preservation activities may focus on one or more of these goals, the Library’s
preservation responsibility is only fulfilled when all three goals have been adequately
addressed.
This responsibility applies across all digital collections, subject to curatorial and policy decisions
for specific groups of digital objects.
10. Mission: The primary objective of preservation activities within the NLA is to maintain the
ability to meaningfully access digital collection content over time.
[Diagram: two models of ‘Logical on Physical Stuff’ over time. In model A, nothing is done and access degrades. In model B, contextual information (about the content, formats, etc.), dependency information, systems to ingest, manage, report and take actions, and systems to access master or derivative copies maintain access over time. Caption: ‘Stuffed?’ David Pearson 2012; images via Google Images.]
11. Required preservation processes
The Library must be able to:
• Understand what it holds in its collections;
• Understand what its preservation intentions are for every digital object and what it is
entitled to do to realise its intentions;
• Understand what is required to provide access, existing inhibitors to access, and the current
level of support the Library is able to provide;
• Evaluate and monitor the degree of risk arising from collection composition, preservation
intentions and available level of support within the Library for digital collection content, and
monitor for risk conditions arising during general Digital Library operations;
• Anticipate the effects of changes in support;
• Recognise planning triggers, and plan and take appropriate action on a scale appropriate to
the size of the target; and
• Audit the effectiveness of its preservation arrangements and modify the arrangements if
necessary.
12. Risk or ‘Risk-on’ (are you a splitter or a lumper?)
• ‘parameter-based’ risks: a match against a criterion defined by Library staff to indicate a
preservation risk – for example, video encoded with a codec considered to be problematic;
• ‘exception’ risks: the value of a monitored parameter is outside a set of acceptable values;
• ‘change’ risks: there has been a change in status for a monitored parameter for content – for
example, the confidence in format identification for a particular file has changed;
• ‘conflict’ risks: conflicting values for the parameter are reported by one or more tools – for
example, file format identification returns conflicting values;
• ‘unknown value’ risks: undetermined values for defined parameters – for example,
undetermined values for file format and version;
• ‘access support’ risks: changes in level of support which affect the Library’s ability to access
content in accordance with preservation intent and significance – for example, reduction
below an acceptable threshold in the availability of supporting software for a particular file
format; and
• ‘content-based’ risks: characteristics of content that may not be identifiable from metadata –
for example, presence of deprecated HTML tags.
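The risk categories above could, hypothetically, be implemented as simple rule checks over per-file metadata records. The following Python sketch is purely illustrative – the field names, criteria sets and the `assess_risks` function are assumptions, not the Library's actual implementation (the 'content-based' category, which requires inspecting content rather than metadata, is omitted):

```python
# Illustrative sketch: the Library's risk categories as rule checks over
# per-file metadata records. All names and criteria are hypothetical.

PROBLEM_CODECS = {"indeo", "cinepak"}     # 'parameter-based' criteria set by staff
ACCEPTABLE_BIT_DEPTHS = {8, 16}           # acceptable values for an 'exception' check

def assess_risks(record, previous=None):
    """Return the risk types raised for one file's metadata record."""
    risks = []
    if record.get("video_codec") in PROBLEM_CODECS:
        risks.append("parameter-based")
    if record.get("bit_depth") not in ACCEPTABLE_BIT_DEPTHS:
        risks.append("exception")
    if previous and record.get("format_id_confidence") != previous.get("format_id_confidence"):
        risks.append("change")
    if len(set(record.get("format_ids", []))) > 1:   # identification tools disagree
        risks.append("conflict")
    if record.get("format") is None:                 # format undetermined
        risks.append("unknown value")
    return risks

record = {"video_codec": "indeo", "bit_depth": 8,
          "format_ids": ["TIFF", "PDF"], "format": None,
          "format_id_confidence": "tentative"}
print(assess_risks(record))  # → ['parameter-based', 'conflict', 'unknown value']
```

In practice each category would be driven by staff-maintained criteria in the knowledgebases rather than hard-coded constants.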
13. Likely preservation treatment actions
Broad preservation action approaches that are likely to be required will include:
• Format migration at the point of collecting;
• Format migration on recognition of risks;
• Format migration at the point of delivery;
• Emulation of various levels of software and hardware environments;
• Maintenance or supply of appropriate software or hardware;
• Documenting known problems for which no other action can be taken; and
• Deaccessioning or deletion.
14. Prioritising Preservation Treatment:
The Library expects to take into account indicators of ‘preservation intent’, ‘significance’, and
‘level of support’ within monitoring and reporting activities, and in evaluations of risk and
prioritisation for preservation planning and action.
http://callmemilo.deviantart.com/art/Thunderbirds-are-GO-20717927
15. Preservation intent – indicates the expectations for preservation for content:
• whether content is to be preserved;
• who is responsible for preservation of the content;
• the period over which content must be preserved;
• the required level of support for access to the content over time; for example, that the
Library intends to actively maintain the ability to both present and modify content, or only to
present content, or does not intend to actively maintain access to content beyond its
expected useful life.
• Preservation intent may also extend to include more specific characteristics to be supported,
based on curatorial input or constraints imposed by rights policies or agreements with rights
holders.
16. Significance – indicates the relative priority required for taking preservation action to maintain
access to content, as determined by collection curators; for example, content rated as highly
significant would be prioritised for preservation planning and action before content of lower
significance.
Level of support – indicates how well a digital collection object is supported within the Library,
based on a combination of how much is known about the object and its components (including
their file formats), and the degree to which supporting software or hardware environments are
available.
NLA Image
19. Preservation assessment and reporting
The Library must be able to review the composition and characteristics of its digital collections to
assess trends that may affect preservation management, to aid setup of preservation
monitoring, planning and action, and to report on specific aspects of content when
necessary.
A solution must enable staff to define and request, on both an ad hoc and scheduled basis:
• summary reports of content, metadata characteristics and risks across collections or defined
sets of managed content;
• detailed metadata reports for individual items or sets of items; and
• audit trail history reports for individual items or sets of items.
20. Reference knowledgebases (General)
Enable staff to create, update and maintain reference information
knowledgebases on:
• File formats and versions
• Software and hardware components that support access to
file formats and versions, for maintaining access to managed
content; and
• The level of support available for particular file formats and
versions:
– i. sets of software or hardware components available to
support access to formats;
– ii. functions supported, both for providing access to
content and for use in preservation action – for example,
presentation, modification, batch processing;
– iii. fidelity of support – how well functions are
supported; and
– iv. known risks, including potential inhibitors to
preservation, associated with formats or supporting
software or hardware.
• Preservation intent descriptions and parameters for sets of
content.
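A minimal sketch of how these knowledgebase entries and format-to-software relationships might be represented. The vocabulary terms ('open', 'open/edit', 'transcode') follow the presentation; the data structures and field names are assumptions for illustration only:

```python
# Hypothetical sketch of the format and software knowledgebases and the
# relationships between them. Entries and field names are illustrative.

formats = {
    "TIFF 6.0": {"family": "TIFF", "known_risks": []},
    "PDF 1.4":  {"family": "PDF",  "known_risks": []},
}

software = {
    "Photoshop CS5": {"nla_support": "full", "vendor_support": "current"},
    "ImageMagick":   {"nla_support": "full", "vendor_support": "current"},
}

# Relationship records: which software can access which format, which
# functions it supports, and its proximity (native vs. generic tool).
access = [
    {"format": "TIFF 6.0", "software": "Photoshop CS5",
     "functions": ["open", "open/edit"], "proximity": "native"},
    {"format": "TIFF 6.0", "software": "ImageMagick",
     "functions": ["open", "transcode"], "proximity": "generic"},
]

def support_for(fmt):
    """Summarise the supporting software for a format from the relationships."""
    return {r["software"]: r["functions"] for r in access if r["format"] == fmt}

print(support_for("TIFF 6.0"))
# → {'Photoshop CS5': ['open', 'open/edit'], 'ImageMagick': ['open', 'transcode']}
```

Reports on level of support, migration paths and ‘what if’ scenarios could then be generated by querying these relationship records.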
21. Other systems are also required to interrelate in this
ecosystem such as:
•Preservation monitoring, reporting and prioritisation
•Preservation options and preservation action planning
•Preservation action evaluation
39. Prioritising preservation treatment based on level of support
In evaluations of risk and prioritisation for preservation planning and action, we must take into
account the Level of Support/Access Risks and:
• Any constraints imposed by rights policies or agreements; and
• The amount of resources available.
Based on these factors, the Library (Management, Collections and Digi Pres) should be able to
prioritise material to be preserved.
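One hypothetical way to combine these factors is a simple scoring function over level of support, significance, preservation intent and rights constraints. The weights below are purely illustrative and not the Library's actual formula:

```python
# Illustrative sketch: combining level of support, significance, preservation
# intent and rights clearance into a rough treatment priority. The weighting
# is an assumption for demonstration only.

def priority(item):
    """Higher score = earlier preservation treatment (illustrative only)."""
    score = 0
    score += {"none": 3, "partial": 2, "full": 0}[item["level_of_support"]]
    score += {"high": 2, "medium": 1, "low": 0}[item["significance"]]
    if item["preserve"]:          # preservation intent says keep it
        score += 2
    if item["rights_cleared"]:    # no rights constraints blocking action
        score += 1
    return score

items = [
    {"id": "web-archive-001", "level_of_support": "none", "significance": "high",
     "preserve": True, "rights_cleared": True},
    {"id": "digitised-news-042", "level_of_support": "full", "significance": "medium",
     "preserve": True, "rights_cleared": True},
]
queue = sorted(items, key=priority, reverse=True)
print([i["id"] for i in queue])  # → ['web-archive-001', 'digitised-news-042']
```

Available resources would then determine how far down the resulting queue treatment can proceed.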
41. Options for preservation actions
We would like to be able to enable staff to:
•define types of preservation actions for use within preservation planning and evaluation.
•update and delete reference information on options for preservation action, both in general and
for particular formats or format types.
•link to information available from the software KB, which provides information on what actions
specific software might be useful for and the proximity of the software to the format.
•link to other linked data sources.
42. Pres action options generation
The Library must be able to test and evaluate preservation action plans to determine if they
satisfactorily achieve the preservation intent for managed content. For example, a solution
should:
•enable staff to develop and test executable preservation action plans for sets of managed
content, including:
– Single and multiple step actions (combining manual and automated workflows)
– Replacing file/s and linkages in complex objects
– Linking to a specific emulation environment (if available)
– Replacing access software
– Specifying that no action is required
•Support simulations or testing of preservation actions against a content Testbed. For example,
enable staff to perform 'what if' simulations to determine impact of changes to availability of
support for access, including:
– a. Removal of software or hardware sets supporting access, to assess risks or impacts on access; and
– b. Addition or revision of software or hardware sets supporting access, to assess proposed remedial preservation
action plans.
•enable staff to define quality assurance criteria for preservation action plan outcomes.
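The ‘what if’ simulation described above can be sketched as a small function over the format-to-software relationships: remove a set of software from the knowledgebase and report which formats would be left with no access support. The data and names are illustrative assumptions:

```python
# Hypothetical sketch of a 'what if' simulation: which formats lose all
# access support if given software sets become unavailable? Illustrative data.

access = {
    "TIFF 6.0":  {"Photoshop CS5", "ImageMagick"},
    "PDF 1.4":   {"Adobe Reader"},
    "RealMedia": {"RealPlayer"},
}

def formats_at_risk(access_map, removed_software):
    """Formats left with no supporting software after a simulated removal."""
    return sorted(
        fmt for fmt, sw in access_map.items()
        if not (sw - removed_software)   # nothing left that can open the format
    )

# What if we could no longer run RealPlayer and Adobe Reader?
print(formats_at_risk(access, {"RealPlayer", "Adobe Reader"}))
# → ['PDF 1.4', 'RealMedia']
```

The same function run with proposed additions (rather than removals) could assess remedial preservation action plans before they are executed.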
44. Preservation options evaluation
• support import and integration of preservation-treated content and metadata, from either
internal or external processes, including:
– a. Verifying that preservation-treated digital content conforms to acceptance criteria for
preservation outcomes for designated sets of digital content;
– b. Enabling staff to quality assure and approve preservation-treated digital content for
incorporation into the collection; and
– c. After approval, sending content to the preservation action scheduler for treatment of
file/s, metadata and associated relationships.
• support ‘rollback’ of updated versions of content, metadata and associated relationships to
restore previous versions, if necessary.
• enable staff to define and approve acceptance criteria for preservation action outcomes for
sets of managed content.
45. 10.) So what!
Currently, these ideas and requirements
have become ‘partially real’. They still
need to be implemented.
They formed the basis for the preservation
requirements in a subsequent:
• RFP (Request for Proposal) process;
and
• RFT (Request for Tender) process.
http://www.wildsound-filmmaking-feedback-events.com/images/austin_powers_dr_evil.jpg
46. RFP
So all of these ideas were consolidated as
requirements for a Request for Proposal, which
went to the market in July 2011.
A number of responses were received for:
• Core systems
• Preservation
• Digitisation
• Other Workflows
http://www.melbournesumos.com.au/pics/twister/Twister078.jpg
These were evaluated and some of the vendors
were invited to participate in the next stage.
47. RFT
Based on the RFP, the NLA clarified the
requirements for the next process.
A select group from the RFP process were
invited to participate in a Request for
Tender, which closed in late December
2011.
http://simpro.co/wp-content/uploads/2010/10/paperwork2.jpg
48. What version of reality
have we decided upon?
Everything, for Everyone
Forever
Digi
By Imogene Pearson (7 years)
(March 2012)
http://www.flickr.com/photos/ricksmit/15671245/
Editor's Notes
I was asked to give a presentation on some of the ideas which the Digital Preservation team at the NLA has been working on over the last year. These ideas have formed the basis for requirements and a subsequent tender to replace key components of our Digital Library Infrastructure. The NLA wants to either: source a product to provide this functionality; work with a product to extend this functionality; or build this functionality ourselves.
Like many libraries, the NLA has a very diverse and complex ecosystem. In relation to preservation requirements we have to consider: lots of stuff (around 1.5 PB of total data); lots of relationships (especially in our domain harvests); mixed levels of intellectual control (catalogued at the file level and the box level); and many different format types – requiring possibly different and recurring actions at different times in the life of a digital object. Because we do not mandate formats that are accepted into the ecosystem, the NLA will have many formats that we are unable to identify and support.
A general breakdown of the collections is shown. The largest proportion of the content in the repository is digitised materials (primarily newspapers; however, digitised materials can be found in almost all other collections). More problematic for us is born-digital material, which is also found in most collections – but in lesser volumes. Arguably the most problematic collection area is web archives (domain harvest and selectively harvested) because of their size, the fact that they can contain potentially anything, and their complex relationships.
First, a caveat.
Pete and Jay from the NLNZ and I have been talking for some time about an ecology, or layers of consciousness, for the need for digital preservation intervention. What is presented to you is not a perfect representation of reality. However, it is useful when we are trying to explore whether our aims, goals and expectations of preservation are even remotely compatible. At the NLA we have been trying to change the perception of the library over the last 5-10 years to steer us towards integrated preservation systems. We are currently in the process of trying to achieve this last state. If you in the audience are somewhere else in this ecology, perhaps the rest of my talk will be gobbledegook.
We can see this ecology in another way: high vs. low resources; high vs. low awareness; long-term vs. short-term retention.
The following is a mixture of observations, common practice and some new ideas.
In order to preserve content over time we believe that we need to do the following: maintain access to the bit-streams; maintain access to content encoded in the bit streams; and maintain access to the meaning of the content. We need to have all of these components covered in order to have a chance to preserve content over time.
Thus, the primary mission of the digital preservation section at the NLA is to maintain the ability to meaningfully access digital collection content over time. For example, two models are presented. In model A, doing nothing means that over time we will lose access to not only the content but also the bits. In model B, through managed systems we are more likely to maintain access to the bits, and through preservation actions, the content. However, if we don’t have the context then we will be literally ‘preserving in the dark’.
Furthermore, our preservation processes need to allow us to: understand what is in the collection; understand why we have it and what we want to do with it; understand how to access it; understand when access is going to become, or already is, problematic; continue to take steps to maintain access; and audit our arrangements.
In preservation we talk a lot about risk. The concept of risk is itself risky because we have potentially many different types of risk. Also, obsolescence information is subjective and relative; much of what we know is a best guess. The only real concrete information that we can get is: can ‘we’ access the format; what is our level, and the vendor’s level, of support, and when is it likely that we won’t be able to access it; and does a format have characteristics which are problematic, and may therefore be more risky. Also, when we do use risk metrics in some kind of meaningful way we tend to lump them into one bucket. However, there are different kinds of risk which are useful for different circumstances in a repository. We have started to refine risk based on high-level use-case classifications. For example: parameter-based risk – e.g. specific metadata which we consider good or bad; exception risk – e.g. the format is not valid or did not validate; change risk – e.g. a number of files have failed fixity checks; conflict risk – e.g. tool X says it is a TIFF, tool Y says that it is a PDF; unknown value risk – e.g. our tools cannot identify 100,000 files in this transfer; access support risk – e.g. we can no longer access 1,000,000 files in format X.
We are going to need many different types of preservation actions. We want to be both proactive and reactive: be able to see the current state of the repository as well as run ‘what if’ scenarios; understand if we need to take any action on a file; and whether we need to take actions on all files in the repository of a particular type, or only those that belong to a particular group (e.g. TIFFs in a particular collection). These actions could be as simple as replacing the access software (not touching the file), or as complex as replacing a file and links inside a complex web object, or even building and maintaining emulation environments over time. We also want to be able to get rid of stuff we don’t want (e.g. material that may not be our responsibility or should not have been taken into the collection in the first place). If we are going to take an action on a file, we want to know what about the file is important to the collection owners.
Based on the last point we think that the system at the NLA must take into account: Preservation Intent; Significance; and Level of support of formats and therefore access to the content.
We need a system that can express preservation intent – does the content need to be preserved? If so, who is responsible for it, for how long, and what aspects? As we don’t believe in the ‘it is impossible to define significant properties for digital objects’ school of thought, and because we find significant properties so problematic, we have adopted a middle position, expressing preservation intent at a fairly high level (e.g. wanting to view, edit, navigate or manipulate the content). This is a collaborative process of defining and articulating how the collections see their content and their required level of support for access to the content over time, including which specific aspects they think are important.
Also, we need to build a system in which an intellectual entity at any given level of granularity can be recorded as being significant. We also need to be able to know what the ‘level of support’ for any given digital object within our ecosystem is at a given time – e.g. given that we can identify what it is, how well (or not) do we maintain access to the content in this file? This will help us to work out priorities and what preservation action/s we will need to take.
So, some of these ideas are expressed on this early painting that we found on a cave wall.
Then we took most of the fun out of it. On your right are knowledgebases and systems that deal with: Formats Software Level of Support Pres Intent Priority Pres Actions Pres Options Pres Evaluation
This model is based on being able to access both human- and machine-accessible information, and consistent preservation metadata which has been recorded and maintained for every digital object in the repository. There will also need to be consistent specific metadata for particular format types (if identified). A summary of this information, which can be grouped into defined sets of managed content (e.g. collections), needs to be readily available.
To start to build a system which looks at preservation intent, significance, level of support and other risk metrics, we need to be careful about the level of detail in our system – is it best to have relative indicators, or will we drown in the detail? Having said this, we require: relevant information on formats and versions in our system; relevant information on software, versions and dependencies that can access particular formats; and the ability to build relationships between formats and software – specifically what can open, or open and edit, a format. We need to know: what software is available? Do I have it? What is the external and internal support? What is the proximity of the software to the format – e.g. was this software made for this format, or is it generic software? We must also take into account preservation intent and significance (does it matter?) and use other risk indicators carefully, in a measured and meaningful way. Reporting on level of support based on these relationships, we can determine if we can maintain access to the format and what priority (if any) should be given to its treatment.
There are other parts of this system which we have not prototyped. However, these will need to be built as a part of the new system. These are: Preservation monitoring, reporting and prioritisation Preservation options and preservation action planning Preservation action evaluation
So I will describe this part of the ecology within the red box.
These are the preservation intent statements which we have currently compiled.
We started with agreed statement of Pres Intent for each collection: This was divided into a number of parts: Context of the collection and what they collect; The Preservation Intent of the collection for their identified material; Identified collecting issues/limitations in how the material is collected; Other issues which may effect preservation.
We then started to look at how we might systematise this info at a high level. This raised some very interesting questions about vocabulary and granularity. This partially worked but not to my satisfaction.
This table summarises the previous screen For example, the fields can be characterised as: Owner Description of material Intent: preserve (yes/no), time, what aspects (e.g. view, edit, navigate, manipulate content) Responsibility/Authorisation Detailed Notes Interestingly, the collections tended to view their material based on ingest workflows and catalogue level records. Not files or formats.
We have a slightly different view on how to describe their material: we tend to think about it in terms of files and formats, not workflows. However, resolving this in a systematic way will be a job for our next system.
Now I will describe this part of the ecology within the red box. We have prototyped some of these systems which I will briefly show you
This is the File Format Home page It contains Formats that are relevant to our environment The Levels of abstraction are - format family followed by versions
If we take a look at the entry for TIFF: we spent a lot of time thinking about description headings for the free text – what makes a sustainable format – however, this is subjective information, but helpful. We also have a controlled vocab which is integrated with the text field. We currently have a staff member working full time populating these fields for 6 months, and hopefully longer.
On the version page we have listed the software that can be used to identify this format (we could have many). This info will be linked back to format metadata summaries from the repository. You can also see the relationship to our software registered in the system – this is expressed by the vocab (open, open/edit and transcode). We could list other info as required.
This is the Software home page. It lists software relevant to our environment. We have not concentrated on this.
If we choose software like Photoshop CS5, we have a vocab and free text descriptions. We see: version releases; plug-ins; support levels at the NLA; support levels from the vendor; software and hardware requirements; etc. We came to the conclusion that, in a relative system, major releases were what we record.
And the most important aspect for establishing the level of support is the format-to-software relationship. The summary list created by these two knowledgebases shows the list of formats that this software can access. These relationships can be used in other knowledgebases and to run reports on access configurations, possible migration paths, and ‘what if’ scenarios.
Another important part of our future system will be the level of support and prioritisation KBs
As stated, the key to level of support is the relationships between a format and many software instances. However, we could only build that part of this system as, at this time, we: currently cannot get consistent info for all files from the repository (except the number of files by collection); can only connect to text fields on the preservation intent screen; have no consistent significance info in the system; and have no systematised risk metrics in our current system. This will change soon!
Another way of looking at the level of support, to give us access risk, could be: the overall level of support by format (including vendor support, internal support and proximity); other risk metrics (e.g. has it been deemed obsolete in the outside world); the number of files affected; the preservation intent – by collection; and any significance info.
Prioritisation of treatment could be based on a summary of all the previous fields, including: any constraints imposed by rights policies or agreements; and the amount of resources available. This summary could give the NLA collections and management the information that they require to prioritise what they want preserved.
We have a number of other modules in this model. For example, Pres options & action planning
We would like to know what options we can support. For example: report on relationships between specific options which have been linked to specific formats (e.g. migration); report on specific software in our KB which is noted as being relevant for specific preservation actions – for example, tell us all the software we have which can access X and is registered as a migration path; and link to other information available through other linked data sources.
The part of the system that looks at generating options should also enable staff to define, approve and prioritise preservation action plans for sets of managed content, and support preservation action plans which include: multiple steps, combining manual and automated workflows; replacing files and linkages within a complex object; linking to a specific emulation environment; replacing existing software to change the level of support; and specifying that no action is required. It should also be able to support simulating changes to the environment.
And finally, pres options evaluation
Ultimately, we want to be able to tell if what we have planned is any good – before we start any processing in the repository that could take some time.
Currently, these ideas and requirements have become ‘partially real’ (almost like ‘Mostly Dead’ from the movie Princess Bride). They still need to be implemented. They formed the basis for the preservation requirements in a subsequent: RFP (Request for Proposal) process; and RFT (Request for Tender) process.
RFP: went to market in July 2011. A number of responses were received for: core systems; preservation; digitisation; and other workflows. Select vendors were invited to participate in the next stage.
RFT Closed at the end of Dec 2011.
So which version of reality have we decided upon? The evaluation report has recommended that the Library proceed to contract negotiations with selected tenderers for each scope of work. Currently the Library is preparing a submission for ministerial approval prior to commencement of contract negotiations with vendors. Thanks for your time.