The document describes experiments conducted with the JPSearch metadata standard and tools developed to test interoperability. Metadata instances from standards such as MPEG-7, Dublin Core, and EBUCore were mapped to JPSearch using a rule-generator tool, and a JPSearch Editor was developed to embed metadata in JPEG files. The experiments found that some elements were not trivial to map, and they evaluated the capabilities and limitations of JPSearch for metadata interoperability.
This document provides information on a course titled "Computer Networks". The course is 3 credit hours and includes both theory and lab components. It introduces concepts of computer networking and discusses the different layers of the networking model. The course content covers topics such as types of networking techniques, the Internet, IP addressing, routing, transport layer protocols, and local area networks. The goal is to provide students an understanding of computer networking fundamentals.
Mapping cross-domain metadata to the Europeana Data Model (EDM) - EDM introd... (Valentine Charles)
- The document introduces the Europeana Data Model (EDM), which was created to allow Europeana to ingest metadata from various sources and domains while maintaining granularity and semantics.
- EDM uses standards like Dublin Core, CIDOC-CRM, and RDF to distinguish cultural heritage objects from their representations and metadata, and to represent relationships between objects and contextual information.
- EDM profiles allow communities to build on EDM to meet their specific needs while maintaining interoperability, and it has been adopted by projects beyond Europeana seeking interoperable metadata.
Open Archives Initiative for Metadata Harvesting (Nikesh Narayanan)
The Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH) provides a simple but effective mechanism for metadata harvesting. It allows service providers to aggregate content from data providers to build value-added services. The OAI-PMH uses HTTP and XML to share metadata in any agreed format, with Dublin Core as a baseline. It defines a set of verbs and standards for harvesting metadata from repositories in a consistent way. This interoperability has helped surface resources and build services across independently developed digital libraries.
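The harvesting flow the summary describes can be sketched in a few lines. This is a minimal illustration, not a full harvester: the repository URL is a made-up placeholder, the XML response is hand-written rather than fetched, and only the required `oai_dc` (Dublin Core) baseline format is shown.

```python
# Sketch of an OAI-PMH ListRecords harvest. The endpoint URL is hypothetical.
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

BASE = "https://repository.example.org/oai"  # placeholder data provider

def list_records_url(base, metadata_prefix="oai_dc", resumption_token=None):
    """Build a ListRecords request; oai_dc (Dublin Core) is the required baseline."""
    params = {"verb": "ListRecords"}
    if resumption_token:                     # continue a partial result set
        params["resumptionToken"] = resumption_token
    else:
        params["metadataPrefix"] = metadata_prefix
    return base + "?" + urlencode(params)

# Parsing a truncated, hand-written response: titles live in the dc namespace.
SAMPLE = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords><record><metadata>
    <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
               xmlns:dc="http://purl.org/dc/elements/1.1/">
      <dc:title>An example record</dc:title>
    </oai_dc:dc>
  </metadata></record></ListRecords>
</OAI-PMH>"""

DC = "{http://purl.org/dc/elements/1.1/}"
titles = [t.text for t in ET.fromstring(SAMPLE).iter(DC + "title")]
```

A real harvester would loop, following the `resumptionToken` the repository returns until the result set is exhausted; that token mechanism is what lets large repositories be harvested incrementally.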
Revolutionizing Laboratory Instrument Data for the Pharmaceutical Industry:... (OSTHUS)
The Allotrope Foundation is a consortium of major pharmaceutical companies and a partner network whose goal is to address challenges in the pharmaceutical industry by providing a set of public, non-proprietary standards for using and integrating analytical laboratory data. Current challenges in data management within the pharmaceutical industry often center around inconsistent or incomplete data and metadata and proprietary data formats. Because of a lack of standardization, several operations (e.g. integration of instruments/applications, transfer of methods or results, archiving for regulatory purposes) require unnecessary efforts. Further, higher level aggregation of data, e.g. regulatory filings, that are derived from multiple sources of laboratory data are costly to create. These unnecessary costs impact operations within a company’s laboratories, between partnering companies, and between a company and contract research organizations (CROs). Finally, the accelerating transition of laboratories from hybrid (paper + electronic) to purely electronic data streams, coupled with an ever-increasing regulatory scrutiny of electronic data management practices, further require a comprehensive solution. This talk will discuss how The Allotrope Foundation is providing a new framework for data standards through collaboration between numerous stakeholders.
Denodo DataFest 2016: What’s New in Denodo Platform – Demo and Roadmap (Denodo)
Watch the full session: Denodo DataFest 2016 sessions: https://goo.gl/ptGwp7
Curious about product roadmap? In this session, we will review some of the new key features introduced this year in the Denodo Platform in areas such as performance, self-service, security and monitoring. We will also take a sneak peek at the most exciting features in the roadmap for Denodo 7.0.
In this session, you will learn:
• New performance-related features in big data scenarios
• New governance and self-service features
• New connectivity, data transformation, and enterprise-wide deployment features
This session is part of the Denodo DataFest 2016 event. You can also watch more Denodo DataFest sessions on demand here: https://goo.gl/VXb6M6
BIDS (Brain Imaging Data Structure) is a standard for organizing brain imaging and behavioral data. It facilitates sharing and reuse of neuroscience data by standardizing file organization and metadata. The standard uses a simple directory structure and "sidecar" files to describe datasets in a machine-readable format. While BIDS does not require any specific software or file formats, it helps data to be findable, accessible, interoperable and reusable.
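The naming-plus-sidecar convention is the heart of BIDS, and it can be illustrated with a toy sketch. The entity prefixes (`sub-`, `task-`) follow the BIDS convention, but the values, the crude validity check, and the helper function are all invented for illustration; the real check is done by the bids-validator tool.

```python
# Minimal sketch of BIDS-style naming: data files plus a JSON "sidecar"
# carrying machine-readable metadata alongside the data file.
import json
import re

def bids_name(sub, suffix, ext, **entities):
    """Compose a BIDS-style filename like sub-01_task-rest_bold.nii.gz."""
    parts = [f"sub-{sub}"] + [f"{k}-{v}" for k, v in entities.items()]
    return "_".join(parts) + f"_{suffix}{ext}"

data_file = bids_name("01", "bold", ".nii.gz", task="rest")
sidecar   = bids_name("01", "bold", ".json",   task="rest")

# The sidecar describes the acquisition next to the data, in machine-readable form.
sidecar_body = json.dumps({"RepetitionTime": 2.0, "TaskName": "rest"}, indent=2)

# A crude naming check in the spirit of the bids-validator (not the real tool).
BIDS_PATTERN = re.compile(r"^sub-[0-9A-Za-z]+(_[a-z]+-[0-9A-Za-z]+)*_[A-Za-z0-9]+\.[a-z.]+$")
```

Because the structure is purely conventional (directories, filenames, JSON), any language or tool can read a BIDS dataset, which is what makes the data findable and interoperable without mandating specific software.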
VRE Cancer Imaging BL RIC Workshop 22032011 (djmichael156)
The document discusses the Virtual Research Environment for Cancer Imaging (VRE-CI) project which aims to provide a framework for researchers and clinicians to share cancer imaging information, images, and algorithms. It describes using Business Connectivity Services and managed metadata to organize and search image metadata, and building a reusable SharePoint site definition to manage DICOM files and extract metadata for search. Key aspects covered include mapping folders, issues with document library names, including external code, and adapting the DICOM field model.
Query aware determinization of uncertain objects (ieeepondy)
This paper addresses the problem of storing probabilistic data generated by automated data analysis techniques in legacy systems that only accept deterministic data. It proposes a query-aware strategy for determinizing probabilistic data by minimizing the expected cost of answering queries to generate an optimal deterministic representation. An algorithm is developed to approximate the near-optimal solution to the NP-hard determinization problem. The paper shows the advantages of this query-aware approach over traditional methods like thresholding through empirical evaluation on real and synthetic datasets.
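To make the contrast with plain thresholding concrete, here is a toy sketch of the query-aware idea. This is my simplification for illustration, not the paper's actual algorithm: per candidate tag, include it in the deterministic record only if the expected cost of missing queries for it (false negative) exceeds the expected cost of wrongly answering them (false positive), weighted by how often the tag is queried.

```python
# Toy query-aware determinization: choose deterministic tags by expected
# query-answering cost rather than by a fixed probability threshold.

def determinize(tag_probs, query_freq, c_fp=1.0, c_fn=1.0):
    """tag_probs: P(tag truly applies); query_freq: how often each tag is queried."""
    chosen = set()
    for tag, p in tag_probs.items():
        q = query_freq.get(tag, 0.0)
        cost_include = q * (1 - p) * c_fp   # answered a query, but tag was wrong
        cost_exclude = q * p * c_fn         # missed a query the tag should answer
        if cost_exclude > cost_include:
            chosen.add(tag)
    return chosen

# Two tags with identical probability: the one nobody queries contributes no
# expected cost either way and is dropped, unlike under plain thresholding.
tags = determinize({"beach": 0.6, "sunset": 0.6},
                   query_freq={"beach": 0.9, "sunset": 0.0})
```

The paper's harder problem is that choices interact across tags and queries, which is what makes exact determinization NP-hard and motivates its approximation algorithm; this per-tag decision ignores those interactions.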
The document proposes a query expansion technique called Nereau that combines traditional query expansion, data from social websites, semantic metadata, and basic user personalization. Experimental evaluations on TREC, ODP, and real user web search sessions show that Nereau outperforms traditional query expansion and search engines. Future work could incorporate more social data, address the dynamic nature of folksonomies, and automatically assign tags when no social data is available.
The MIDESS Project explored sharing digital content like images between university repositories. It tested standards like OAI-PMH and METS for exchanging metadata and objects. While these standards allow some interoperability, repositories implemented them differently, preventing full sharing. The project highlighted ongoing issues around information architecture, repository functionality for multimedia, and integrating repositories into broader systems.
What Impact Will Entity Framework Have On Architecture (Eric Nelson)
This document discusses the impact that adopting the Entity Framework and Entity Data Model will have on application architecture. It provides an overview of object-relational mapping (ORM) technologies and how they help address the impedance mismatch between object-oriented programming and relational databases. The document outlines several key features and improvements in Entity Framework versions 1.0, 2.0, 3.0 and 4.0, such as better code generation tools, a model-first approach, support for stored procedures and persistence ignorance. It argues that adopting an ORM like Entity Framework can improve developer productivity, code quality and database independence.
A framework for visual search in broadcasting companies' multimedia archives (FIAT/IFTA)
This document proposes a framework for visual search of broadcast archives using content-based image retrieval. It describes Rai's large archive of video, images, and documents that is growing by 130,000 hours per year. Only 46% of the content is annotated with metadata. The framework uses open source tools like LIRe and Apache Solr for scalable visual search without relying solely on metadata. It extracts features from keyframes of video for indexing and retrieval. Preliminary evaluations show it can match shots within videos but struggles with semantic search across different videos. Future work includes creating an annotated dataset and improving feature extraction.
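The retrieval idea behind tools like LIRe can be sketched conceptually: reduce each keyframe to a global feature vector and rank archive frames by similarity to the query. The toy grayscale histogram and the pixel values below are invented stand-ins for the much richer descriptors a real system extracts.

```python
# Conceptual content-based retrieval: toy histogram features + cosine ranking.
import math

def histogram(pixels, bins=4):
    """Toy global feature: normalized histogram of 0-255 grayscale values."""
    h = [0.0] * bins
    for p in pixels:
        h[min(p * bins // 256, bins - 1)] += 1
    total = sum(h) or 1.0
    return [v / total for v in h]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# "Archive" of keyframe features, and a query frame resembling shot1.
archive = {"shot1": histogram([10, 20, 30, 240]),
           "shot2": histogram([200, 210, 220, 230])}
query = histogram([15, 25, 35, 250])
ranked = sorted(archive, key=lambda k: cosine(archive[k], query), reverse=True)
```

This also hints at the limitation the summary reports: globally similar pixel statistics find near-duplicate shots within a video, but say nothing about what the frame depicts, so semantic search across different videos needs stronger features.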
The document discusses testing the use of the SERONTO ontology for semantic data integration of distributed ecological databases from ALTER-Net and LTER Europe. Five databases were independently mapped to SERONTO concepts and queries could be run across the integrated data without knowledge of the underlying database structures. However, the effort required for mapping was significant and maintaining reference lists will be crucial. More use cases are needed to fully evaluate SERONTO's potential for LTER data integration.
The document discusses the need for a new open source database management system called SciDB to address the challenges of storing and analyzing extremely large scientific datasets. SciDB is being designed to handle petabyte-scale multidimensional array data with native support for features important to science like provenance tracking, uncertainty handling, and integration with statistical tools. An international partnership involving scientists, database experts, and a nonprofit company is developing SciDB with initial funding and use cases coming from astronomy, industry, genomics and other domains.
Tim Malthus: Towards standards for the exchange of field spectral datasets (TERN Australia)
This document discusses the development of standards for the exchange of field spectral datasets. It notes the importance of metadata for determining the quality and representativeness of spectral data obtained in the field. A workshop was held in 2012 to discuss best practices for data collection and exchange and key conclusions included the need for standards to facilitate accurate comparison across studies and the role of thorough metadata. Work is ongoing to enhance the SPECCHIO system for hosting spectral libraries and metadata and establishing it as the international tool for storage and exchange of spectral datasets.
Metadata can play a vital role in enabling the effective management, discovery, and re-usability of digital information. Digital preservation metadata provides provenance information, supports and documents preservation activity, identifies technical features, and aids in verifying the authenticity of a digital object. This presentation gives an introduction to digital preservation metadata and preservation metadata in practice. The presentation was delivered during the joint DPE/Planets/CAPAR/nestor training event, ‘The Preservation challenge: basic concepts and practical applications’ (Barcelona, March 2009).
The 3TU.Datacentrum repository of research data hosts datasets as well as other objects representing measuring devices, locations, time periods, and the like. Virtually all metadata is in RDF, so the repository can be approached as an RDF graph. We will show how this is implemented with Fedora Commons, leaning heavily on RDF queries and XSLT 2.0. As a result of this architecture, it is relatively easy to make the repository linked-data-enabled by generating OAI/ORE resource maps.
While most of the metadata is RDF, most of the data is in NetCDF. Although not well known in the library world, this is a very popular format in various fields of science and engineering. It comes with its own data server, OPeNDAP, which offers a rich API for interacting with the data. Our repository is therefore a hybrid Fedora + OPeNDAP setup, and we will show how the two are integrated into a unified view and how they are kept in sync on ingest.
This was presented at the ELAG conference, Palma de Mallorca 2012.
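As a rough illustration of what "approaching the repository as an RDF graph" buys you, here is a toy in-memory triple store with wildcard pattern matching. The dataset and device URIs are invented; a real setup would run SPARQL against the Fedora-backed triple index rather than this list comprehension.

```python
# Tiny in-memory triple store standing in for an RDF graph of the repository.
TRIPLES = [
    ("ex:dataset1", "dc:title",      "Wave measurements"),
    ("ex:dataset1", "ex:measuredBy", "ex:device7"),
    ("ex:device7",  "dc:title",      "Pressure sensor"),
]

def match(s=None, p=None, o=None):
    """Pattern query: None acts as a wildcard, like a SPARQL variable."""
    return [t for t in TRIPLES
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# "Which device produced dataset1, and what is it called?" -- a two-hop walk
# over the graph, with no knowledge of how the store lays out its records.
device = match("ex:dataset1", "ex:measuredBy")[0][2]
device_title = match(device, "dc:title")[0][2]
```

Because devices, locations, and time periods are first-class objects in the graph rather than fields inside a dataset record, queries like this compose naturally, and exposing the same graph as OAI/ORE resource maps is mostly a serialization exercise.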
To Get any Project for CSE, IT ECE, EEE Contact Me @ 09666155510, 09849539085 or mail us - ieeefinalsemprojects@gmail.com-Visit Our Website: www.finalyearprojects.org
A brief introduction to deep learning, providing rough interpretation to deep neural networks and simple implementations with Keras for deep learning beginners.
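The moving parts that Keras abstracts away can be shown by hand. The sketch below trains a single sigmoid neuron by gradient descent in pure Python (no framework, no NumPy) on two invented 1-D points; it is the smallest possible instance of the forward-pass/backprop loop that deep networks repeat across many layers.

```python
# One sigmoid neuron trained by gradient descent -- the loop Keras hides.
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

random.seed(0)
w, b, lr = random.random(), 0.0, 0.5
data = [(-2.0, 0.0), (2.0, 1.0)]          # x -> label, linearly separable

for _ in range(200):                      # epochs
    for x, y in data:
        y_hat = sigmoid(w * x + b)        # forward pass
        err = y_hat - y                   # dL/dz for cross-entropy loss
        w -= lr * err * x                 # gradient step on the one weight
        b -= lr * err                     # and on the bias

preds = [round(sigmoid(w * x + b)) for x, _ in data]
```

A Keras `Dense(1, activation="sigmoid")` layer compiled with binary cross-entropy does exactly this, just vectorized and stacked; seeing the scalar version first makes the framework calls less opaque.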
This document discusses the challenges of collecting, storing, and analyzing large volumes of internet measurement data. It examines issues such as distributed and resilient data collection, handling multi-timescale and heterogeneous data from various sources, and developing standardized tools and formats. The paper proposes the "datapository" - an internet data repository designed to address these challenges through a collaborative framework for data sharing, storage, and analysis tools. The goal is to help both network operators and researchers more effectively harness the wealth of data available.
Machine Learning for automated diagnosis of distributed ... (AEbutest)
The document discusses challenges in using machine learning for automated diagnosis of performance issues in distributed systems. It describes 4 key challenges: 1) transforming large amounts of metrics data into useful information, 2) adapting models to changing systems, 3) leveraging historical diagnosis to retrieve similar issues, and 4) combining metrics data with unstructured log data from multiple sources. The author proposes approaches for each challenge including Bayesian network classifiers, adaptive ensembles of models, defining issue signatures, and information extraction from logs.
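The "issue signature" idea from challenge 3 can be sketched simply: summarize an incident as the set of metrics flagged abnormal, then retrieve the most similar past diagnosis. The metric names, baselines, and historical cases below are invented, and Jaccard similarity is one simple choice of retrieval measure, not necessarily the author's.

```python
# Issue signatures: flag abnormal metrics, retrieve the nearest past diagnosis.

def signature(metrics, baselines, tol=0.25):
    """Flag metrics deviating more than tol (25%) from their baseline."""
    return frozenset(m for m, v in metrics.items()
                     if abs(v - baselines[m]) > tol * baselines[m])

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 1.0

BASE = {"cpu": 50.0, "latency_ms": 20.0, "gc_pause_ms": 5.0}
HISTORY = {
    "db connection leak": signature({"cpu": 55, "latency_ms": 90, "gc_pause_ms": 5}, BASE),
    "gc thrashing":       signature({"cpu": 95, "latency_ms": 60, "gc_pause_ms": 40}, BASE),
}

# A new incident: only latency is abnormal, so the leak case is the best match.
incident = signature({"cpu": 52, "latency_ms": 80, "gc_pause_ms": 6}, BASE)
diagnosis = max(HISTORY, key=lambda k: jaccard(HISTORY[k], incident))
```

The summary's other challenges show up even in this toy: the baselines must adapt as the system changes, and unstructured log text would have to be distilled into comparable features before it could join the signature.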
It describes the bony anatomy, including the femoral head, acetabulum, and labrum, and also discusses the capsule and ligaments. The muscles that act on the hip joint and its range of motion are outlined, and factors affecting hip joint stability and weight transmission through the joint are summarized.
Similar to Metadata interoperability With JPSearch (20)
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In... (Dr. Vinod Kumar Kanvaria)
Exploiting Artificial Intelligence for Empowering Researchers and Faculty,
International FDP on Fundamentals of Research in Social Sciences
at Integral University, Lucknow, 06.06.2024
By Dr. Vinod Kumar Kanvaria
Thinking of getting a dog? Be aware that breeds like Pit Bulls, Rottweilers, and German Shepherds can be loyal and dangerous. Proper training and socialization are crucial to preventing aggressive behaviors. Ensure safety by understanding their needs and always supervising interactions. Stay safe, and enjoy your furry friends!
This presentation covers the basics of PCOS, its pathology and treatment, as well as the Ayurvedic correlation of PCOS and the Ayurvedic line of treatment mentioned in the classics.
ISO/IEC 27001, ISO/IEC 42001, and GDPR: Best Practices for Implementation and... (PECB)
Denis is a dynamic and results-driven Chief Information Officer (CIO) with a distinguished career spanning information systems analysis and technical project management. With a proven track record of spearheading the design and delivery of cutting-edge Information Management solutions, he has consistently elevated business operations, streamlined reporting functions, and maximized process efficiency.
Certified as an ISO/IEC 27001: Information Security Management Systems (ISMS) Lead Implementer, Data Protection Officer, and Cyber Risks Analyst, Denis brings a heightened focus on data security, privacy, and cyber resilience to every endeavor.
His expertise extends across a diverse spectrum of reporting, database, and web development applications, underpinned by an exceptional grasp of data storage and virtualization technologies. His proficiency in application testing, database administration, and data cleansing ensures seamless execution of complex projects.
What sets Denis apart is his comprehensive understanding of Business and Systems Analysis technologies, honed through involvement in all phases of the Software Development Lifecycle (SDLC). From meticulous requirements gathering to precise analysis, innovative design, rigorous development, thorough testing, and successful implementation, he has consistently delivered exceptional results.
Throughout his career, he has taken on multifaceted roles, from leading technical project management teams to owning solutions that drive operational excellence. His conscientious and proactive approach is unwavering, whether he is working independently or collaboratively within a team. His ability to connect with colleagues on a personal level underscores his commitment to fostering a harmonious and productive workplace environment.
Date: May 29, 2024
Tags: Information Security, ISO/IEC 27001, ISO/IEC 42001, Artificial Intelligence, GDPR
This presentation was provided by Steph Pollock of The American Psychological Association’s Journals Program, and Damita Snow, of The American Society of Civil Engineers (ASCE), for the initial session of NISO's 2024 Training Series "DEIA in the Scholarly Landscape." Session One: 'Setting Expectations: a DEIA Primer,' was held June 6, 2024.
Executive Directors Chat: Leveraging AI for Diversity, Equity, and Inclusion - TechSoup
Let’s explore the intersection of technology and equity in the final session of our DEI series. Discover how AI tools, like ChatGPT, can be used to support and enhance your nonprofit's DEI initiatives. Participants will gain insights into practical AI applications and get tips for leveraging technology to advance their DEI goals.
How to Build a Module in Odoo 17 Using the Scaffold Method - Celine George
Odoo provides an option for creating a module by using a single line command. By using this command the user can make a whole structure of a module. It is very easy for a beginner to make a module. There is no need to make each file manually. This slide will show how to create a module using the scaffold method.
Physiology and chemistry of skin and pigmentation, hairs, scalp, lips and nails; Cleansing cream, Lotions, Face powders, Face packs, Lipsticks, Bath products, soaps and baby products.
Preparation and standardization of the following: Tonic, Bleaches, Dentifrices and Mouth washes & Tooth Pastes, Cosmetics for Nails.
Introduction to AI for Nonprofits with Tapp Network - TechSoup
Dive into the world of AI! Experts Jon Hill and Tareq Monaur will guide you through AI's role in enhancing nonprofit websites and basic marketing strategies, making it easy to understand and apply.
A review of the growth of the Israel Genealogy Research Association Database Collection over the last 12 months. Our collection has now passed the 3 million mark and is still growing. See which archives have contributed the most, the different types of records we have, and which years have had records added. You can also see what we have planned for the future.
1. FIB - Barcelona School of Informatics
Universitat Politècnica de Catalunya (UPC)
Barcelona, Spain
Metadata Interoperability
with JPSearch
Nicos Demetriou
Master in Computer Architecture, Networks and Systems
Barcelona, Spain July 3, 2013
2. Outline
1. Introduction
2. State of the art
3. Problem definition
4. Resolving the problem
5. Experiments
6. Conclusion
Nicos Demetriou: Metadata Interoperability with JPSearch 2
4. Introduction
Source: http://grabworthy.com/300-million-photos-uploaded-to-facebook-daily/
January 2013
5. Introduction
The increasing number of digital images causes:
Organization issues
Search and retrieval difficulties
The need to semantically describe them (annotations)
Portability problems
Metadata: “Data about data”
Plenty of metadata standards (e.g. MPEG-7)
JPSearch
New standard from JPEG
Tries to solve the aforementioned problems
Still improving
6. Introduction
Goal
Embed JPSearch metadata in image files
Easily create translation rules
Evaluate the JPSearch standard
Possible gaps, improvements
Does JPSearch resolve the problem?
Research and application work
8. State of the art
Metadata
Describes the characteristics of a resource
Distinguished from the main content of the resource
E.g. Image {Content = Pixels, Metadata = Properties}
Improves resource organization
Search mechanism
Embedded in the resource or stored externally
Digital images
Many file formats, such as JPEG and PNG
Can contain metadata; the method differs
Native metadata (e.g. EXIF)
Metadata in XML format (e.g. XMP)
Handful of metadata standards
9. State of the art
Joint Photographic Experts Group (JPEG)
Lossy compression with adjustable compression ratio
Standards: JFIF, EXIF & ICC profiles (color space)
JPEG Interchange Format (JIF) byte layout
Segments of JPEG: Application Markers (0xFFEn)
JFIF – APP0
EXIF – APP1
XMP – APP1
ICC – APP2
JPSearch – APP3
Photoshop – APP13
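The segment layout above can be sketched with a small parser. This is a minimal illustration, not part of the JPSearch reference software; the synthetic byte string and its segment payloads are invented for the demo.

```python
import struct

def list_app_segments(data: bytes):
    """Scan a JPEG byte stream and list its APPn segments (0xFFE0-0xFFEF)."""
    segments = []
    assert data[:2] == b"\xFF\xD8", "not a JPEG (missing SOI marker)"
    pos = 2
    while pos + 4 <= len(data):
        if data[pos] != 0xFF:
            break
        marker = data[pos + 1]
        if marker == 0xDA:          # SOS: compressed image data follows, stop
            break
        # Segment length is big-endian and includes the two length bytes.
        length = struct.unpack(">H", data[pos + 2:pos + 4])[0]
        if 0xE0 <= marker <= 0xEF:  # APP0 .. APP15
            segments.append((f"APP{marker - 0xE0}", length))
        pos += 2 + length
    return segments

# Synthetic JPEG: SOI, a JFIF stub in APP0, a 4-byte stub in APP3, then SOS.
demo = (b"\xFF\xD8"
        + b"\xFF\xE0\x00\x10" + b"JFIF\x00" + b"\x00" * 9
        + b"\xFF\xE3\x00\x06" + b"JPS\x00"
        + b"\xFF\xDA")
print(list_app_segments(demo))  # [('APP0', 16), ('APP3', 6)]
```

A real embedder would follow the same walk to find (or insert) the APP3 segment that Part 4 reserves for JPSearch metadata.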
10. State of the art
JPSearch
Suite of specifications
Enrichment with metadata in JPEG, JPEG 2000
Abstract framework
Modular
Flexible search architecture
Six parts:
1. System framework and components
2. Schema and ontology
3. Query format
4. File format for embedded metadata
5. Data interchange format between repositories
6. Reference software
12. Problem definition
Metadata Interoperability
Metadata exchanged without loss of information
Different processes express metadata in their own way
Distinct services exchange query messages
13. Problem definition
Challenges
Manipulation of image collections’ metadata
Image search and retrieval
Image repository maintenance and synchronization
Metadata storage
Reuse of metadata without regenerating it
Transferability between various image formats
Semantic meaning differs among formats
Approach to the solution: JPSearch
What is missing?
Lack of practical approaches and applications
14. Resolving the problem
4.1 Approach
4.2 JPSearch Part 2
4.3 JPSearch Part 4
4.4 Tools Developed
15. Resolving the problem
4.1 Approach
Implement tools
JPSearch Editor for JPEG files
Translation Rule generator
Analyze transformation rules for different metadata sets
Evaluate JPSearch
Identify gaps
Suggest improvements
16. Resolving the problem
4.2 JPSearch Part 2
JPSearch Core Metadata Schema
Rules for machine readable translation
17. Resolving the problem
4.2 JPSearch Part 2
JPSearch Core Metadata Schema
19 Basic elements
XML syntax
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<ImageDescription xmlns="JPSearch:schema:coremetadata">
  <Creators>
    <GivenName>Leonardo</GivenName>
    <FamilyName>DaVinci</FamilyName>
  </Creators>
  <Publisher>
    <PersonName>
      <GivenName>Paris</GivenName>
      <FamilyName></FamilyName>
    </PersonName>
    <OrganizationInformation>
      <Name>Museum of Louvre</Name>
      <Address>
        <Name>Louvre, Paris</Name>
      </Address>
    </OrganizationInformation>
  </Publisher>
  <CreationDate>1503-01-01T00:00:00.0Z</CreationDate>
  <ModifiedDate>2013-06-24T13:30:41.395+03:00</ModifiedDate>
  <Description>The portrait of Mona Lisa</Description>
  <Keyword>Image</Keyword>
  <Keyword>fr</Keyword>
  <Title>Mona Lisa</Title>
  <CollectionLabel>Painting</CollectionLabel>
  <Width>677</Width>
  <Height>1024</Height>
</ImageDescription>
Elements (19):
Identifier, Modifiers, Creators, Publisher, CreationDate, ModifiedDate, Description, RightsDescription, Source, Keyword, Title, CollectionLabel, PreferenceValue, Rating, OriginalImageIdentifier, GPSPositioning, RegionOfInterest, Width, Height
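As a rough illustration of consuming the XML syntax above, a trimmed copy of the Mona Lisa instance can be read with a standard XML parser; the namespace string is taken from the example itself.

```python
import xml.etree.ElementTree as ET

# Trimmed version of the JPSearch Core example above.
XML = """
<ImageDescription xmlns="JPSearch:schema:coremetadata">
  <Creators>
    <GivenName>Leonardo</GivenName>
    <FamilyName>DaVinci</FamilyName>
  </Creators>
  <Title>Mona Lisa</Title>
  <Width>677</Width>
  <Height>1024</Height>
</ImageDescription>
"""

NS = {"jps": "JPSearch:schema:coremetadata"}
root = ET.fromstring(XML)
title = root.findtext("jps:Title", namespaces=NS)
given = root.findtext("jps:Creators/jps:GivenName", namespaces=NS)
width = int(root.findtext("jps:Width", namespaces=NS))
print(title, given, width)  # Mona Lisa Leonardo 677
```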
18. Resolving the problem
4.2 JPSearch Part 2
Rules for machine readable translation
JPSearch Translation Rules Declaration Language
Rule types: OneToOne, OneToMany, ManyToOne
Dublin Core to JPSearch example
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<TranslationRules xmlns="JPSearch:schema:translation"
    fromFormat="http://purl.org/dc/terms/"
    toFormat="JPSearch:schema:coremetadata">
  <TranslationRule xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xsi:type="OneToManyFieldTranslationType">
    <FromField xsi:type="FilteredSourceFieldType">
      <XPathExpression>/creator</XPathExpression>
      <FilterWithRegExpr>(\S+) (\S+)</FilterWithRegExpr>
      <VariableBinding>
        <ExplicitPrefixBinding>$name</ExplicitPrefixBinding>
        <ExplicitPostfixBinding>$lastname</ExplicitPostfixBinding>
      </VariableBinding>
    </FromField>
    <ToField xsi:type="FormattedTargetFieldType">
      <XPathExpression>/JPSearchCore/Creators/GivenName</XPathExpression>
      <ReplaceWithRegExpr>$name</ReplaceWithRegExpr>
    </ToField>
    <ToField xsi:type="FormattedTargetFieldType">
      <XPathExpression>/JPSearchCore/Creators/FamilyName</XPathExpression>
      <ReplaceWithRegExpr>$lastname</ReplaceWithRegExpr>
    </ToField>
  </TranslationRule>
</TranslationRules>
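The OneToMany rule above can be paraphrased in code. This is a hypothetical sketch of what a rule engine does with the /creator rule, assuming the filter expression was originally (\S+) (\S+) with backslashes lost in extraction.

```python
import re

# Assumed filter pattern from the rule above, with backslashes restored.
FILTER = r"(\S+) (\S+)"

def apply_creator_rule(dc_creator: str):
    """Apply the OneToMany /creator rule: bind $name and $lastname,
    then write them to the two JPSearch target fields."""
    m = re.fullmatch(FILTER, dc_creator)
    if m is None:
        return None  # value fails the filter, the rule does not fire
    name, lastname = m.groups()  # $name (prefix), $lastname (postfix)
    return {
        "/JPSearchCore/Creators/GivenName": name,
        "/JPSearchCore/Creators/FamilyName": lastname,
    }

print(apply_creator_rule("Leonardo DaVinci"))
```

A single-word creator fails the filter, which is exactly the kind of mapping gap the experiments later report.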
19. Resolving the problem
4.3 JPSearch Part 4
File format – extension of JPEG-1/JPEG 2000
Additional metadata can co-exist
JPSearchMetadata: ElementaryMetadata, Data blocks
20. Resolving the problem
4.4 Tools developed
JPSearch Editor
Features:
Opens .jpeg files
Shows JPSearch metadata
Embeds JPSearch
Alters JPSearch metadata
Saves JPSearch to XML
Saves metadata to DB
Imports external XML instance with its translation rules
Region Tagging
22. Resolving the problem
4.4 Tools developed
Rule generator
Features
Opens XML instance
Shows XML elements
Element selection
All three rule types supported
Rule type details
Show saved rules
Update/deletion of saved rules
Save rules in XML file
24. Experiments
Methodology
Creating mappings of multiple XML instances
Use rule generator tool
Based on a practical approach
Standards experimented with:
MPEG-7
Dublin Core
EBUcore
XMP
Other formats (DeviantArt)
Evaluate JPSearch
Identify gaps and possible tool improvements
25. Experiments
MPEG-7
Large schema
Favors moving pictures and audio
Impossible to map all elements
Work on a subset of the schema
Valid sample documents found online
Conclusions
Some elements are not trivial to map
Title of a person, e.g. Doctor, Professor
A Version element exists
Audio and video metadata encountered
Issues selecting elements via attributes
27. Experiments
Dublin Core
Simple schema
15 basic elements
Qualified Dublin Core adds 6 more elements
Online samples and manually created
Conclusions
Not always clear how some elements should be mapped
Qualified Dublin Core elements
Elements can appear multiple times
Multiple elements map to Creators/Modifiers/Region of Interest
28. Experiments
Dublin Core to JPSearch mappings
Dublin Core element JPSearch Core element
contributor Publisher/OrganizationInformation/Name
creator Creators/GivenName, FamilyName
rights RightsDescription/Description
source Source/SourceElementType
relation OriginalImageIdentifier/Identifier
coverage Publisher/OrganizationInformation/Address/Name
rightsHolder RightsDescription/ActualRightsDescription
accrualPeriodicity Source/SourceElement/SourceElementDescription
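The one-to-one rows of this table can be read as a simple lookup. A hypothetical sketch follows; the dict covers only single-target rows, since creator needs the regex-splitting OneToMany rule from Part 2.

```python
# One-to-one rows of the Dublin Core table above as a lookup
# (hypothetical helper; creator is handled by a OneToMany rule).
DC_TO_JPSEARCH = {
    "contributor": "Publisher/OrganizationInformation/Name",
    "rights": "RightsDescription/Description",
    "source": "Source/SourceElementType",
    "relation": "OriginalImageIdentifier/Identifier",
    "coverage": "Publisher/OrganizationInformation/Address/Name",
}

def translate(dc_record: dict) -> dict:
    """Map the translatable Dublin Core fields of a record to JPSearch paths."""
    return {DC_TO_JPSEARCH[k]: v
            for k, v in dc_record.items() if k in DC_TO_JPSEARCH}

record = {"rights": "Public domain", "format": "image/jpeg"}
print(translate(record))  # {'RightsDescription/Description': 'Public domain'}
```

Fields without a mapping (such as format here) are silently dropped, mirroring the information loss the experiments observed.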
29. Experiments
EBUcore
Extension of Dublin Core
Additional elements for video, audio and image
Extra contact details
Largely based on attributes
Manually created samples
Conclusions
Similar mapping problems as Dublin Core
Version element exists
Optional elements can be ignored
Special repeated attributes
typeGroup, typeLabel, typeLink and typeDefinition
30. Experiments
EBUcore to JPSearch mappings
EBUcore element JPSearch element
coreMetadata/publisher/entity/contactDetails/name Publisher/PersonName/GivenName, FamilyName
coreMetadata/publisher/entity/contactDetails/organisationDetails/details Publisher/OrganizationInformation/Name
coreMetadata/format/imageFormat/width Width
coreMetadata/coverage/spatial/location RegionOfInterest/ContentDescription/Place/Description
coreMetadata/version RegionOfInterest/ContentDescription/Object/Name
coreMetadata/rating/ratingValue Rating/LabelValue
31. Experiments
XMP
Different from other standards
Incorporates multiple standards in the same schema
Mixture
Namespace required
EXIF, Dublin Core, Photoshop tags
Heavily based on Resource Description Framework (RDF)
Consists of Description XMP packets
Stored in APP1 of JPEG
External file .xmp
Valid samples found online
Conclusions
Large set of elements from different standards
Hard to map certain elements (camera raw metadata)
32. Experiments
Other formats
Third party web services use their own format
DeviantArt, Flickr and YouTube use oEmbed
12 basic elements
Most of them are optional
Use API to obtain metadata
No attributes
Samples obtained via the API
Conclusions
URL elements
Thumbnail sizes
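A sketch of the gap noted above: a hypothetical oEmbed photo response (field names from the oEmbed specification, values invented) maps to only a few JPSearch Core elements, leaving the URL and thumbnail fields without direct counterparts.

```python
import json

# Hypothetical oEmbed "photo" response; field names follow the oEmbed
# specification, values are invented for illustration.
oembed = json.loads("""{
  "version": "1.0", "type": "photo",
  "title": "Mona Lisa",
  "url": "https://example.org/mona-lisa.jpg",
  "width": 677, "height": 1024,
  "thumbnail_url": "https://example.org/thumb.jpg",
  "thumbnail_width": 150, "thumbnail_height": 227
}""")

# Only a few fields line up with JPSearch Core elements; the url and
# thumbnail_* fields have no direct counterpart.
jpsearch = {
    "Title": oembed["title"],
    "Width": oembed["width"],
    "Height": oembed["height"],
}
print(jpsearch)  # {'Title': 'Mona Lisa', 'Width': 677, 'Height': 1024}
```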
48. Conclusion
JPSearch is a powerful international standard
Well documented/defined and still improving
Two tools were developed
JPSearch metadata Editor
Translation Rule generator
Evaluation of the standard through experimentation
JPSearch
provides storage of a large set of metadata
allows many metadata sets to be mapped
gives a solution to the metadata interoperability issue
49. Conclusion
Future work
Tools
Complete database support
Support JPSearch native metadata
JPEG 2000 file type support
Load XML schema (xsd files) to rule generator
Load XML rules to rule generator
Read and translate EXIF to JPSearch
50. Conclusion
Future work
JPSearch
Attribute support
Additional helpful elements
Extended Contact details (profession title, email, suffix)
Version
URLs and Thumbnail details
Dictionary-based values, e.g. "Perfect" in Rating