The document discusses file formats and compression for digitized images. It reviews perspectives on different formats like TIFF, JPEG, and JPEG 2000. While older views discouraged compression, current views acknowledge that compression can be appropriate, especially for large-scale digitization projects, where it provides significant storage and cost savings without unacceptable preservation risk when done properly. JPEG 2000 in particular has gained acceptance as a robust compressed format suitable for digital preservation.
Building a Scalable Digital Asset Management Platform in the Cloud (MED402) | ... (Amazon Web Services)
With the breadth of AWS services relevant to digital media, organizations can readily build out complete content/asset management (DAM/MAM/CMS) solutions in the cloud. This session provides a detailed walkthrough for implementing a scalable, rich-media asset management platform capable of supporting a variety of industry use cases. It includes a code-level walkthrough, AWS architecture strategies, and integration best practices for content storage, metadata processing, discovery, and overall library management functionality, with particular focus on Amazon S3, Amazon Elastic Transcoder, Amazon DynamoDB, and Amazon CloudSearch. A customer case study highlights PBS's successful use of Amazon CloudSearch to enable rich discovery of programming content across the breadth of their network catalog.
Personal Digital Archiving Initiatives at the Library of Congress (lljohnston)
Introduction to personal digital archiving issues and advice from the Library of Congress National Digital Information Infrastructure and Preservation Program
This short presentation is intended as a light interlude linking two practical sessions in a workshop connecting preservation planning with tools provided for use with EPrints repository software.
Preparation, Proceed and Review of Preservation of Digital Library (Asheesh Kamal)
This paper focuses on preserving information for future use in a user-friendly environment, and on digital preservation methods and strategy and the life cycle of digital media, especially in the digital library.
Digital Presentation Best Practices: Lessons Learned From Across the Pond (ULB - Bibliothèques)
Digital Presentation Best Practices: Lessons Learned From Across the Pond. Slavko Manojlovich (Associate University Librarian (IT) / Manager, Digital Archives Initiative, Memorial University, St. John's, Canada) and Benoit Pauwels (Head, Library Automation Team, Université libre de Bruxelles, Belgium)
Digital Preservation Best Practices: Lessons Learned From Across the Pond (Benoit Pauwels)
Digital Preservation Best Practices: Lessons Learned From Across the Pond. Slavko Manojlovich (Associate University Librarian (IT) / Manager, Digital Archives Initiative, Memorial University, St. John's, Canada) and Benoit Pauwels (Head, Library Automation Team, Université libre de Bruxelles, Belgium)
Big Data Handling Technologies ICCCS 2014 (Love Arora, GNDU)
Big data came into existence when traditional relational database systems could not handle the unstructured data (weblogs, videos, photos, social updates, human behaviour) generated today by organisations, social media, and other data-generating sources. Data that is so large in volume, so diverse in variety, or moving with such velocity is called big data. Analyzing big data is a challenging task, as it involves large distributed file systems that must be fault-tolerant, flexible, and scalable. The technologies used by big data applications to handle massive data include Hadoop, MapReduce, Apache Hive, NoSQL, and HPCC, which operate at scales from kilobytes and megabytes up to terabytes, petabytes, zettabytes, and yottabytes.
This paper discusses various technologies for handling big data, along with the advantages and disadvantages of each technology for addressing the problems of dealing with massive data.
Slides and notes from a presentation that I gave as part of a masterclass for library managers in April 2008. Some slides contain links, and the slides are best read in conjunction with the notes that appear at the bottom of the SlideShare screen.
BFC: High-Performance Distributed Big-File Cloud Storage Based On Key-Value S... (dbpublications)
Cloud-based storage services are growing rapidly and becoming an emerging trend in the data storage field. Designing an efficient storage engine for cloud-based systems raises many problems, with requirements such as big-file processing, lightweight metadata, low latency, parallel I/O, deduplication, distribution, and high scalability. Key-value stores have played an important role and shown many advantages in solving those problems. This paper presents Big File Cloud (BFC), its algorithms, and its architecture for handling most of the problems of a big-file cloud storage system built on a key-value store. It proposes a low-complexity, fixed-size metadata design that supports fast, highly concurrent, distributed file I/O; several algorithms for resumable upload and download; and a simple data deduplication method for static data. The research applies the advantages of ZDB, an in-house key-value store optimized with auto-increment integer keys, to solve big-file storage problems efficiently. The results can be used to build scalable distributed cloud storage that supports files up to several terabytes in size.
An Introduction to Digital Preservation at the Library of Congress (lljohnston)
Introduction to digital preservation initiatives at the Library of Congress and the National Digital Information Infrastructure and Preservation Program
Presentation on electronic records management and archival issues. Originally presented at the Fall 2008 meeting of the Southeastern Wisconsin Archivists Group
This presentation discusses what digital ‘stuff’ the National Library of Australia is responsible for and explores some of the main issues regarding digital preservation of this ‘stuff’. It was delivered at the New South Wales State Library on February 15, 2011 by David Pearson
Gilbane 2009 -- How Can Content Management Software Keep Pace? (weisinger)
The amount of data stored is growing at a phenomenal rate. This paper documents the growth and suggests that a new standard, CMIS, may be useful in getting better control over data and data repositories.
Business Process Analysis for Your Records Management Program (MARAC Bethlehem PC)
A presentation given by Geof Huth on 27 January 2016 to the Long Island Chapter of ARMA. The presentation gives an introduction to how to use BPA in a records management environment.
About experiences creating subject guides for archival research using the LibGuides platform. Recommendations for topics to include and brief review of LibGuides.
PowerPoint for the MARAC fall conference 2011 on digital asset management systems. Focused on the implementation of a DAMS at the Historical Society of Pennsylvania. Presentation by Matt Shoemaker, Director of Digital Collections and Systems at HSP.
"Creating and Maintaining Web Archives"
Presented by Joanne Archer (University of Maryland), Tessa Fallon (Columbia University), Abbie Grotke (Library of Congress), and Kate Odell (Internet Archive)
CLIR staff present the results of a 2011 survey of student engagement with projects funded through the Cataloging Hidden Special Collections and Archives program. See also: http://www.clir.org/hiddencollections/student_survey_results.html
Puglia marac-file formats-20111020
1. Revisiting File Formats for Digitization
Steven T. Puglia, Digital Conversion Services Manager
Office of Strategic Initiatives, Library of Congress
101 Independence Ave SE, Washington, DC 20540, USA
Phone: 202-707-5726 | Email: spug@loc.gov
3. In general, within the digital library community, format and compression recommendations for master and derivative image files remain based on older perspectives regarding digitization, digital preservation, and IT/network/web technologies.
8. Recommended Data Formats for Preservation Purposes in the Florida Digital Archive http://fclaweb.fcla.edu/uploads/Lydia%20Motyka/FDA_documentation/recFormats.pdf
15. 2. Further resolved, that such images are of sufficient quality to serve as preservation images for books which are: found in the stacks, not in the rare book room; and likely to remain available somewhere in physical form.
3. Further resolved, that such images are of comparable or superior quality to accepted preservation approaches such as microfilm.
4. Further resolved, that cost matters in digital library image conversion projects, even though it is other people's money.
With a final nod to the improvements proposed for JPEG 2000, Sharpe argued that at a minimum, the library and archival community should not close the door on the use of visually lossless compression.
18. Rise of Information in the Digital Age http://www.washingtonpost.com/wp-dyn/content/graphic/2011/02/11/GR2011021100614.html?sid=ST2011021100514
19. Really big data: The challenges of managing mountains of information, by John Brandon, October 18, 2011, http://www.computerworld.com/s/article/9220504/Really_big_data_The_challenges_of_managing_mountains_of_information The Library of Congress processes 2.5 petabytes of data each year, which amounts to 40TB per week. Thomas Youkel, group chief of enterprise systems engineering at the Library, estimates the data load will quadruple in the next few years as the Library continues to carry out its dual mandates to serve up data for historians and preserve information in all its forms.
21. Andy Jackson, The British Library http://www.openplanetsfoundation.org/blogs/2011-01-12-format-obsolescence-and-sustainable-access This means that the long-term cost of preserving our collection scales not only with the size of the files, but also rises as the number of formats we are required to support is increased.
23. David Rosenthal, Stanford University http://blog.dshr.org/2011/03/how-few-copies.html Compression reduces the redundancy within a single copy and increases the risk of damage. There are also techniques that increase the redundancy within a single copy and reduce the risk.
25. Erik Hetzner, California Digital Library http://groups.google.com/group/digital-curation/msg/b487a1b0188f9c0c I see no reason to store, as a matter of policy, uncompressed files on our disks. In fact, I think we should be more aggressive about compressing files. (Hetzner focuses on lossless compression.)
26. Erik Hetzner, California Digital Library http://groups.google.com/group/digital-curation/msg/b487a1b0188f9c0c Even without error correcting codes, I don’t think the arguments for storing uncompressed data only as a matter of policy are strong at all. When we take error correcting codes into account, not compressing your data as a policy in order to keep a higher level of redundancy seems like the worst way to increase the redundancy of the data. Smart people have figured out how to make codes which can reliably correct limited errors in bytestreams. Why not use them?
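To make the argument concrete, here is a minimal standard-library sketch (not from Hetzner's post; the file name and chunk count are illustrative assumptions) that compresses a file losslessly and then adds back a controlled amount of redundancy as a RAID-4-style XOR parity chunk, so any single lost chunk can be rebuilt:

```python
import functools
import zlib

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def add_parity(data: bytes, nchunks: int = 8):
    """Split data into nchunks equal pieces plus one XOR parity piece."""
    size = -(-len(data) // nchunks)                    # ceiling division
    padded = data.ljust(size * nchunks, b"\x00")
    chunks = [padded[i * size:(i + 1) * size] for i in range(nchunks)]
    return chunks, functools.reduce(xor, chunks)

def rebuild(chunks, parity, lost):
    """Reconstruct the chunk at index `lost` by XOR-ing the survivors."""
    survivors = [c for i, c in enumerate(chunks) if i != lost]
    return functools.reduce(xor, survivors, parity)

master = open("master.tif", "rb").read()               # hypothetical master scan
packed = zlib.compress(master, level=9)                # lossless compression

chunks, parity = add_parity(packed)
assert rebuild(chunks, parity, lost=3) == chunks[3]    # simulated loss, repaired

# Redundancy here costs 1/8 of the *compressed* size -- typically far less
# than the space surrendered by refusing to compress in the first place.
print(len(master), len(packed) + len(parity))
```

A real repository would use a proper error-correcting code such as Reed-Solomon rather than one parity block, but the trade-off is the same: redundancy you choose and budget for, rather than redundancy inherited by declining to compress.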
27. Data corruption is and will remain a problem. An active part of digital preservation will be to overcome this problem. The LOCKSS concept includes one approach for dealing with the problem – “…the bits and bytes are continually audited and repaired…to protect fragile digital content for the very long time.” http://www.eecs.harvard.edu/~mema/publications/SOSP2003.pdf LOCKSS now has a 12 year track record.
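In spirit, such an audit-and-repair loop can be as simple as the following sketch (illustrative only: the paths and manifest are hypothetical, and LOCKSS itself uses a peer-to-peer polling-and-voting protocol rather than this naive two-copy comparison):

```python
import hashlib
import pathlib
import shutil

def sha256(path: pathlib.Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):  # 1 MiB at a time
            h.update(block)
    return h.hexdigest()

def audit_and_repair(primary: pathlib.Path, replica: pathlib.Path, manifest: dict):
    """Check every file against the checksum recorded at ingest; restore bad copies."""
    for name, expected in manifest.items():
        copy = primary / name
        if copy.exists() and sha256(copy) == expected:
            continue                                      # fixity intact
        good = replica / name
        if good.exists() and sha256(good) == expected:    # trust only a verified copy
            shutil.copy2(good, copy)
            print(f"repaired {name}")
        else:
            print(f"UNRECOVERABLE: {name}")               # every copy is damaged

# Hypothetical usage: manifest maps file names to SHA-256 digests taken at ingest.
# audit_and_repair(pathlib.Path("/archive/a"), pathlib.Path("/archive/b"), manifest)
```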
29. If image files are being brought into a managed environment, compression, particularly lossless compression, is much less of a concern. Conversely, if images are being stored on DVDs on a shelf, then compression raises the risks significantly.
30. One option for file format and compression (lossless and lossy) - JPEG 2000
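As a concrete illustration of what adopting the format involves, the sketch below uses the Pillow imaging library (assuming a build with OpenJPEG support; the file names are placeholders, and this is not the presenter's workflow) to produce both a lossless and a visually lossless JPEG 2000 master from a TIFF:

```python
from PIL import Image  # Pillow, built with OpenJPEG for JPEG 2000 support

# Lossless (reversible 5/3 wavelet): pixels survive the round trip exactly.
with Image.open("master.tif") as im:                 # hypothetical master scan
    im.save("master_lossless.jp2", irreversible=False)

# Visually lossless (irreversible 9/7 wavelet) at a target ratio of ~10:1.
with Image.open("master.tif") as im:
    im.save("master_visually_lossless.jp2",
            irreversible=True,
            quality_mode="rates",                    # interpret layers as ratios
            quality_layers=[10])                     # one quality layer at 10:1

# Verify that the reversible file really is pixel-identical to the TIFF.
with Image.open("master.tif") as a, Image.open("master_lossless.jp2") as b:
    assert list(a.getdata()) == list(b.getdata())
```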
31. Barriers to adopting JPEG 2000 remain for many organizations (limited open-source tools), along with concerns about potential risks (corruption and possible legal issues). These issues have been acknowledged within the broader cultural heritage digitization community.
32. A number of research studies have been conducted on the robustness of JPEG 2000, and they report similar results in terms of susceptibility to corruption. Nevertheless, organizations have concluded that JPEG 2000 is an appropriate file format choice from a robustness perspective, and "conclude that JPEG 2000 is a good current solution for our digital repositories." (A Format for Digital Preservation of Images, by Buonora and Liberati, http://www.dlib.org/dlib/july08/buonora/07buonora.html)
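The style of robustness experiment behind such studies is easy to reproduce in miniature. The sketch below (illustrative only; published studies flip bits systematically and score visual damage, not mere decodability) corrupts one byte at a time and checks whether the file still decodes:

```python
import random
from io import BytesIO
from PIL import Image

def decode_survival_rate(path: str, trials: int = 100) -> float:
    """Fraction of single-byte corruptions after which the file still decodes.

    Note: decoding without an exception does not mean the image is visually
    intact -- a damaged file often decodes with wrong pixels, which is why
    real studies also score the visual extent of the damage.
    """
    blob = bytearray(open(path, "rb").read())
    ok = 0
    for _ in range(trials):
        damaged = bytearray(blob)
        damaged[random.randrange(1024, len(blob))] ^= 0xFF   # spare the header
        try:
            Image.open(BytesIO(bytes(damaged))).load()       # force full decode
            ok += 1
        except Exception:
            pass
    return ok / trials

print(decode_survival_rate("master_lossless.jp2"))           # hypothetical file
```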
33. It is worth noting the format includes some "resiliency" elements that add robustness and thereby counteract some effects of data loss. These resiliency elements are described in the notes at the bottom of the Sustainability of Digital Formats – Planning for Library of Congress web page (http://www.digitalpreservation.gov/formats/fdd/fdd000138.shtml).
34. Wellcome Library http://jpeg2000wellcomelibrary.blogspot.com/2010/06/we-need-how-much-storage.html In 2009, the Wellcome Library set out an ambitious vision to digitise a large proportion of its historic collections. This would take the annual digitisation activities of the Library from hundreds, or at most, thousands of images per year to several million images per year. … we realised this could see the generation of up to 30m images over 5 years. Exciting, but perhaps slightly daunting, considering we didn't yet have an infrastructure to fully support such a large collection of digital assets.
35. Wellcome Library (continued): Anyone reading this blog will understand why the scale of the programme is key to the blog topic. When we asked our IT department to tell us how much it would cost to store 30m TIFF files - our de facto standard for the couple hundred thousand images in our existing picture library - we were stunned. Two petabytes of online, spinning disk storage with a top-of-the-line enterprise management system and remote backup would cost how much? We learned that the cost would be something like a fifth of our total budget for the entire digitisation programme.
36. Wellcome Library (continued): Should we consider a lower-cost storage solution? Even tape back-up was quite expensive for that scale, and you can't serve images up online from tape anyway. We revised our image sizes, factoring in smaller and smaller resolutions and/or bit depths for material like the printed books, which didn't need full colour, high resolution images. We still couldn't afford the storage costs. Finally, we saw the light and started looking into a relatively new image format called JPEG 2000.
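The storage arithmetic behind that decision is easy to reproduce. The per-image sizes below are illustrative assumptions rather than Wellcome's published figures, but they land on the same order of magnitude:

```python
images = 30_000_000        # the programme's five-year target

tiff_mb = 70               # assumed size of an uncompressed TIFF master
jp2_mb = tiff_mb / 10      # assumed visually lossless JPEG 2000 at ~10:1

def to_pb(mb: float) -> float:
    return mb / 1_000_000_000   # decimal megabytes -> petabytes

print(f"TIFF masters:      {to_pb(images * tiff_mb):.1f} PB")   # ~2.1 PB
print(f"JPEG 2000 masters: {to_pb(images * jp2_mb):.2f} PB")    # ~0.21 PB
```

A roughly ten-to-one reduction is what turns a storage bill consuming a fifth of the programme budget into something affordable.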
39. It is very possible that more digital images produced by mass digitization efforts are saved as JPEG 2000 than as any other file format. Despite the concerns, and the clear need for organizational support when implementing JPEG 2000, far more cultural heritage organizations are using it for digitization than most people realize.
41. Conclusions: There is not a single answer to the question of file format for raster image files produced by digitization projects. There are a number of file formats worthy of consideration – suitable from technical, sustainability, fiscal, and other perspectives. Compression can represent a reasonable risk for appropriate efforts, and is likely a practical reality as digitization and digital preservation efforts scale. Not using compression likely represents a real risk, particularly given the dramatic and continued growth in digital data.